
1 Introduction

Despite the presence of the word engineering in “Tissue Engineering” (TE), the interdisciplinary field combining biomedical and engineering sciences in the search for replacements of diseased or non-functional organs (or organ parts) by manufactured living implants that support functional tissue regeneration [1], the engineering side of the field has not yet fulfilled its true potential. Ever since the official introduction of the field in 1987, the research focus has been predominantly on the biology of the tissue construct and on biomaterial development. Apart from the latter, the engineering input has remained limited to technical aspects such as bioreactor and biosensor development, and automation. Although these aspects are needed when bringing the biological processes from bench to bedside, they cover only a small part of what the engineering sciences have to offer the TE field.

One key issue the TE field in general is struggling with is the lack of quantity and quality of the generated products [2, 3]. Protocols and procedures followed in the lab are mainly established by trial and error, requiring a huge amount of manual intervention and lacking clear early time-point quality criteria to guide the process. This also makes these processes very hard to scale up to industrial manufacturing levels, as can be appreciated from the limited number of companies that have survived the first decade of TE [3]. Overall, there is a lack of intelligent process design in the current TE field. Over the last few years, a number of leading labs [4–8] in the TE field have realized that the trial-and-error approach is not a good way to obtain products that can meet the quality standards of international regulatory bodies such as the EMA or FDA. It was proposed to return to an approach inspired by nature’s own regenerative and developmental processes, called developmental engineering [4–8]. Much as is aspired to in any manufacturing process, developmental processes are robust, multistage, observable, controllable, path-dependent and autonomous. A common engineering approach when designing any kind of manufacturing process, from the food and chemical industries to the automotive industry, is to use in silico models of the product and/or the manufacturing process, based on physical, mechanical or (bio)chemical laws/equations and/or experimental data, in order to minimize variability, increase quality and optimize the overall process. In silico models can, amongst others, help to identify key regulating parameters of the manufacturing process and extrapolate early time-point information to predict final product behaviour.

This chapter will briefly introduce a number of modelling techniques and applications related to the design of TE products and processes. It will start by introducing the novel paradigm of developmental engineering that has recently emerged in the TE field. Subsequently various examples of how computational tools can assist in the development of robust and reliable products and processes are discussed, many of which will come back in the other chapters of this book.

2 Developmental Engineering: A New Paradigm for Tissue Engineering

Over the last few years, several leading labs [4–8] in the TE field have proposed to return to an approach inspired by nature’s own regenerative and developmental processes, called developmental engineering [4–8], as many of the characteristics of developmental processes are desirable in process design. Firstly, the developing embryo has the ability to cope with a wide variety of external perturbations, i.e. it is a robust system. This robustness of developmental processes would allow the in vitro process to be impervious to a wide range of external perturbations, which are often unavoidable in an artificial environment. Additionally, developmental processes exhibit a multistage character, meaning that cells differentiate through characteristic stages, each with a distinct morphology and set of marker genes. Accordingly, the in vitro process could be divided into a series of sequential subprocesses, each corresponding to a specific stage in developmental biology. This would make the process highly observable, for example by determining the expression of certain marker genes, and highly controllable, since growth factors could be added when the cells are at a stage where they are competent to respond to them. Another concept advocating the use of in vitro developmental processes is path-dependence, or the dependence of one developmental stage on the previous ones. This means that the optimal conditions of each successive stage are provided by its predecessors. These conditions consequently do not have to be incorporated in the process design, which will make the process more autonomous. An example of path-dependence can be found in endochondral ossification, where first a cartilage anlage is formed which creates optimal conditions for the invading ossification front [9]. Furthermore, some intermediate tissue forms have a great robustness thanks to intrinsic factors, allowing them to be treated as individual modules. The regulative, self-controlled behaviour exhibited by these modules will likely lead to a high product consistency. Several modular forms appear during development, including cellular modules like cartilage condensations and multicellular modules with a spatially extended and heterogeneous cell population, like the growth plate.

The growth plate is a developmental centre that integrates many signalling pathways in order to regulate the patterning and growth of the skeleton. As a cell progresses throughout the growth plate, going from the long bone’s epiphysis towards the diaphysis, its shape and function change drastically [10]. At the epiphysis, a pool of small round chondrocytes makes up the resting zone. These cells differentiate into more rapidly proliferating flat chondrocytes, forming proliferative columns. The resting and proliferating chondrocytes secrete structural proteins, such as collagen type II, that form a hyaline cartilage matrix. Towards the diaphysis, chondrocytes differentiate further into prehypertrophic chondrocytes, secreting Indian Hedgehog (Ihh), and thereafter into hypertrophic chondrocytes [11]. Hypertrophic chondrocytes remodel the cartilage matrix into a calcifying matrix comprising primarily collagen type X (Col-X). At terminal differentiation, the cells will induce invasion and resorption of the hypertrophic cartilage as well as the start of vascularisation by excreting proteins like Matrix Metallopeptidase 13 (MMP13) and Vascular Endothelial Growth Factor (VEGF) [12]. The evolution the chondrocytes undergo is reminiscent of the developmental process of endochondral ossification, indicating that these events can be recapitulated using adult stem cells [13–17]. Indeed, implantation of articular chondrocytes (mixed with osteoblasts) in mice has been shown to result in the formation of a structure similar to that of the growth plate [18].

Lenas et al. [4–8] extensively discuss how the growth plate developmental process can be used as a template for robust and reliable bone TE processes. Proof of concept for the basic idea that aggregates or constructs of stem cells (of embryonic as well as postnatal bone-marrow-derived origin), after in vitro differentiation along the chondrogenic lineage (all the way up to hypertrophy), will result in bone formation after implantation in vivo has been delivered by several groups [19–21].

3 Computational Tools for Product and Process Design in Tissue Engineering

Engineered products will only be a viable and competitive alternative to upcoming off-the-shelf innovations in regenerative medicine if they are manufactured with reproducible properties, a prerequisite for consistent clinical outcomes. This important target is mainly challenged by the intrinsic variability in the behaviour of human cells from different batches or donors as well as by the sensitivity of cells to perturbations in the culture environment [22].

Although research-oriented systems are generally too complex, user-unfriendly, unsafe and expensive for direct use in clinical applications, their underlying principles could nevertheless lay a solid foundation for more clinically compliant manufacturing systems. This will require not only the optimization of the TE product itself but also the identification of only those essential processes, culture parameters and construct parameters that must be monitored and controlled to standardise production and provide meaningful quality and traceability data, while at the same time minimising risks, costs and user complexity [22].

Below we describe a number of computational tools that are being used to optimize the design of both TE products and processes as well as the in vivo result. These tools use a variety of modelling techniques building on physical, mechanical or (bio)chemical laws/equations and/or experimental data. The models can range from mechanistic (hypothesis-driven, white box) to phenomenological (data-driven, black box) and, depending on the specific application, from the gene/protein level over the cellular level up to the tissue/organ level. Figure 1 shows an overview of the different models that have been developed in the field of bone tissue engineering by the research group of the author. The overview in this chapter is by no means exhaustive but serves to illustrate different aspects of product and process design. The reader is referred to the other chapters of this book, which provide a more in-depth review of a number of the aspects mentioned here.

Fig. 1

Overview of the different models developed in the research group of the author of this chapter. Models can be classified according to the origin of their development (phenomenological vs. mechanistic) or according to the length and time scales of the processes they describe (gene/protein, cell, tissue/organ). a Roberts et al. [23]; b Kerkhofs et al. [24]; c Geris et al. [25]; d Peiffer et al. [26]; e Carlier et al. [27]; f Geris et al. [28]; g Geris et al. [29, 30]

3.1 Computational Tools for Product Design

There are a number of aspects of the design of TE products where computational models can make, and already have made, contributions: not only in the way the carrier structure is designed (when dealing with combination products or biomaterial-only products) but also in the way cells are processed prior to implantation.

Obvious aspects of scaffold design include their structural, mass transport and mechanical properties. Several chapters in this book describe the design and characterisation of hydrogels, frequently used as carrier structures in tissue engineering, each focusing on a different aspect. Israelowitz et al. [31] argue that in order to define the correct position of e.g. collagen in the fibre network arrangement of the extracellular matrix, which is important to determine its tensile strength, an optimized tertiary structure of the protein needs to be characterized. They provide an introduction to the different methods that are currently used to determine protein conformation in silico. On a higher length scale, Nekouzadeh et al. [32] describe the development of a mechanical model to design and evaluate engineered tissues and/or carrier structures (such as hydrogels) that serve a mechanical role. An important component of such models is often viscoelasticity, or the dependence of the mechanical response on loading rate and loading history. In a great number of biological and bio-artificial tissues, the passive tissue force (or stress) relates to changes in tissue length (or strain) in a nonlinear viscoelastic manner. Choosing and fitting nonlinear viscoelastic models to data for a specific tissue can be a computational challenge. Nekouzadeh et al. [32] describe the range of such models (focusing in particular on adaptive quasi-linear viscoelastic models), criteria for selecting amongst them, and the computational and experimental techniques needed to fit these to uniaxial data. In addition to structural mechanics, mass transport is an important property of hydrogels, which can influence the behaviour of cells encapsulated in these hydrogels in various ways. Lambrechts et al. [33] provide a thorough overview of how consumption and production of soluble medium components give rise to gradients inside hydrogels and how mass-transport-related phenomena can shape these gradients. The authors focus on the combined use of experiments and mathematical modelling and describe not only how the simulation results play an important role in generating information that can help in unravelling the mechanisms that drive solute transport, but also the genuine efforts that have been made to translate this information into real TE set-ups.
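To give a flavour of the kind of fitting problem Nekouzadeh et al. [32] discuss, the sketch below fits a single-term Prony series (standard linear solid) to synthetic uniaxial stress-relaxation data. The model form, parameter values and data are illustrative assumptions chosen for this sketch; they are a linear stand-in for, not a reproduction of, the adaptive quasi-linear models of [32].

```python
import numpy as np
from scipy.optimize import curve_fit

# Single-term Prony series for stress relaxation at constant strain:
# sigma(t) = E_inf + (E_0 - E_inf) * exp(-t / tau) (stress normalised by strain).
def relaxation(t, E_inf, E_0, tau):
    return E_inf + (E_0 - E_inf) * np.exp(-t / tau)

# Synthetic "measured" relaxation data (kPa) with noise -- a stand-in for a
# uniaxial stress-relaxation experiment on a hydrogel.
rng = np.random.default_rng(0)
t_data = np.linspace(0.0, 100.0, 50)  # s
sigma_data = relaxation(t_data, 2.0, 10.0, 15.0) + rng.normal(0.0, 0.1, t_data.size)

# Least-squares fit; bounds keep the parameters physically meaningful.
popt, pcov = curve_fit(relaxation, t_data, sigma_data,
                       p0=[1.0, 5.0, 10.0], bounds=(0.0, np.inf))
E_inf, E_0, tau = popt
print(f"E_inf = {E_inf:.2f} kPa, E_0 = {E_0:.2f} kPa, tau = {tau:.1f} s")
```

A quasi-linear variant would replace the constant moduli with strain-dependent functions and fit data from several strain levels simultaneously, which is where the model-selection criteria discussed in [32] become essential.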

Another type of carrier material that is often used, mainly in musculoskeletal tissue engineering, is a (macro-)porous scaffold. Similar to the hydrogels discussed above, these scaffolds need to support mechanical loading and permit mass transport. Olivares and Lacroix [34] review the computational methods applied to characterize scaffold morphology and to simulate different biological processes in and around these scaffolds. These processes include cell seeding, cell migration, cell proliferation, cell differentiation, vascularisation, oxygen consumption, mass transport and/or scaffold degradation. Song et al. [35] describe how a combination of computational fluid dynamics and finite element analysis makes it possible to predict flow regimes within scaffolds and to optimize the flow rates that deliver mechanical cues during cell seeding and influence subsequent cell behaviour. They furthermore demonstrate how computational modelling can be used to optimize spatiotemporal mechanical cue delivery and mechanically modulated biochemical gradients through optimization of scaffold geometry, material behaviour and mechanical properties.
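At a back-of-the-envelope level, the mechanical cue most often extracted from such CFD simulations, the wall shear stress, can be estimated by idealising a scaffold pore as a straight cylindrical channel under Poiseuille flow, for which τ_w = 4μQ/(πR³). The geometry, flow split and parameter values below are crude assumptions of this sketch; a full CFD model of the kind reviewed in [34, 35] would replace them with the real pore architecture.

```python
import numpy as np

# Idealise scaffold pores as n parallel cylindrical channels and assume the
# perfusion flow divides evenly over them (strong simplification).
mu = 1.0e-3        # viscosity of culture medium, Pa.s (assumed ~water)
Q_total = 5.6e-10  # total perfusion flow rate, m^3/s (~2 mL/h, illustrative)
n_pores = 500      # number of parallel pore channels (assumed)
R = 150e-6         # pore radius, m (assumed 300 um pore diameter)

Q_pore = Q_total / n_pores
# Poiseuille flow in a circular tube: wall shear stress tau_w = 4*mu*Q/(pi*R^3)
tau_wall = 4.0 * mu * Q_pore / (np.pi * R**3)
print(f"estimated wall shear stress: {tau_wall * 1e3:.3f} mPa")
```

Even this crude estimate shows why flow rate and pore size are the dominant design knobs: the shear stress scales linearly with flow rate but with the inverse cube of the pore radius.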

Besides the scaffold’s physical properties, its chemical properties (e.g. its release properties) can also have a substantial influence on the overall behaviour of the TE construct. Chemicals released from the scaffold can either be dissolved components of the scaffold material itself (e.g. the release of soluble calcium from calcium-phosphate-collagen scaffolds) or substances that were added to the scaffold structure for delivery in vivo and have a specific biological function (e.g. controlled release of growth factors). Mathematical models have been developed describing these release processes and have been applied to determine in silico optimal scaffolds for a variety of biomedical problems, e.g. by Carlier et al. [27] and many others.
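A common phenomenological description of such release processes is the Korsmeyer-Peppas power law, M_t/M_∞ = k·t^n, where the exponent n distinguishes Fickian diffusion from anomalous transport. The sketch below fits this generic textbook law to hypothetical cumulative-release data; the data are invented for illustration and the model is not the specific release formulation used by Carlier et al. [27].

```python
import numpy as np
from scipy.optimize import curve_fit

# Korsmeyer-Peppas power law for the early phase of release (Mt/Minf < 0.6):
# Mt/Minf = k * t**n. For a thin film, n ~ 0.5 indicates Fickian diffusion.
def korsmeyer_peppas(t, k, n):
    return k * t**n

# Hypothetical cumulative growth-factor release fractions (illustrative data).
t = np.array([1.0, 2.0, 4.0, 8.0, 16.0, 24.0])             # h
released = np.array([0.08, 0.12, 0.17, 0.24, 0.35, 0.43])  # fraction of load

popt, _ = curve_fit(korsmeyer_peppas, t, released, p0=[0.1, 0.5])
k, n = popt
regime = "Fickian diffusion" if n < 0.55 else "anomalous transport"
print(f"k = {k:.3f} 1/h^n, n = {n:.2f} -> consistent with {regime}")
```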

For TE products that include (or consist solely of) a cellular component, models have been developed to investigate aspects ranging from storage over proliferation and selection to implantation strategies. Cincotti and Fadda [36] describe a model of the cryopreservation process of cell suspensions, a critical step in tissue engineering. The model is based on biophysical properties and takes into account the size distribution of the cell population. After validation, the authors have used their model to investigate the effect of cell size distribution on system behaviour under various operating conditions, showing that under commonly used operating conditions, intracellular ice formation may be lethal for the largest cells in the population. In addition to cell size, cell populations are also heterogeneous in various other functional and molecular aspects. Galle et al. [37] review the most recent results on heterogeneity in mesenchymal stem cells (MSCs) and introduce a mathematical framework that approaches MSC heterogeneity on the single-cell level. This framework is capable of describing the impact of MSC heterogeneity on in vitro expansion and differentiation and can be used to investigate MSC adaptation to changing environments and the cell’s intrinsic control of state fluctuations. Prior to implantation, the quality of the cells needs to be assessed in order to guarantee a safe and effective therapy. As this quality check should preferably be non-invasive, complete (meaning all cells, and not only a sample, should be checked), real-time and predictive of the subsequent clinical therapeutic effect, conventional cell biology techniques are ruled out. Sasaki et al. [38] discuss the potential of image-based quality assessment, implementing machine learning models to connect biological phenomena with the measurements. After storage and quality assurance of the cell population, the timely administration of the appropriate concentrations of cells in the correct location is another crucial point where computational modelling can be an interesting tool. Geris et al. [39] have investigated the administration of MSCs in and at the fracture site of atrophic non-unions by means of a computational model and have corroborated their simulation results through comparison with the results of a pilot experiment (which was based on the in silico predictions).
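To make the image-based quality assessment idea of Sasaki et al. [38] more concrete, the sketch below trains a standard classifier on per-cell morphology features. The feature names, data-generating rule and quality labels are all invented for illustration and bear no relation to the actual models of [38]; they merely show the shape of such a pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical training set: one row per cell image, columns are simple
# morphology descriptors; labels mark "good"/"bad" quality as judged by a
# destructive reference assay. All values are synthetic.
rng = np.random.default_rng(1)
n_cells = 400
area       = rng.normal(1.0, 0.2, n_cells)   # normalised projected area
elongation = rng.normal(2.0, 0.5, n_cells)   # major/minor axis ratio
motility   = rng.normal(1.0, 0.3, n_cells)   # normalised displacement rate
X = np.column_stack([area, elongation, motility])
# Synthetic rule: large, elongated, motile cells labelled "good" (illustrative).
y = (0.5 * area + 0.3 * elongation + 0.4 * motility +
     rng.normal(0.0, 0.2, n_cells)) > 1.5

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # non-invasive, all-cell screening
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```

Because the features come from label-free imaging, such a model can in principle be applied to every cell in the culture without sacrificing any of them, which is exactly the property that rules out conventional destructive assays.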

3.2 Computational Tools for Process Design

Nature uses a very complex system of regulatory mechanisms compounded by a huge amount of redundancy. Systems biology and bioinformatics are just beginning to unlock the vast amount of information that is hidden within the human genome. From this wealth of information, a limited number of functional regulators (targets or markers) needs to be distilled that are indicative of the progress of the biological process in vitro and can hence be used to control the TE manufacturing process. These regulators are not necessarily restricted to biological parameters but can also be properties of the carrier structure and the culture environment. They are part of an intricate network that is too complex to be interpreted without the help of in silico modelling.
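As a toy illustration of why even small regulatory networks resist purely verbal reasoning, the sketch below simulates a classic two-gene mutual-inhibition motif: depending on the initial condition, the same equations settle into one of two stable expression states. The parameter values are generic textbook choices, not measurements of any specific TE-relevant pathway.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Mutual inhibition ("toggle switch"): each gene represses the other through
# a Hill function; degradation is first order.
alpha, n = 4.0, 3.0  # maximal production rate and Hill coefficient (generic)

def toggle(t, z):
    x, y = z
    return [alpha / (1.0 + y**n) - x,
            alpha / (1.0 + x**n) - y]

# Two nearly identical initial conditions end up in opposite stable states --
# a bistability that is hard to predict from the wiring diagram alone.
for x0, y0 in [(1.1, 1.0), (1.0, 1.1)]:
    sol = solve_ivp(toggle, (0.0, 50.0), [x0, y0], rtol=1e-8)
    x_end, y_end = sol.y[:, -1]
    print(f"start ({x0}, {y0}) -> steady state ({x_end:.2f}, {y_end:.2f})")
```

If a two-node motif can already hide bistability, the marker networks relevant to TE manufacturing, with dozens of interacting regulators, clearly require simulation to be interpreted.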

An additional challenge when dealing with biological processes is how to extract knowledge on these regulators from the relatively few process states that can be measured on-line [40]. In this context, the monitoring and control of bioreactor systems will be crucial at the research stage of product development, in order to identify these key regulators and to establish standardised production methods [41]. A mathematical model of the process is a cornerstone for modern control approaches such as model-based predictive control (MPC). Therefore, a complete design of an automatic bioreactor system should include the development of a good model, which should be complete enough to fully capture the process dynamics of interest and should allow the predictions to be calculated efficiently, while at the same time remaining intuitive and permitting theoretical analysis [42].
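A minimal receding-horizon sketch of the MPC idea is given below for a single controlled culture variable (say, a nutrient concentration tracked to a setpoint). The scalar first-order model, its coefficients and the cost weights are invented for illustration; a real bioreactor controller would be built on an identified multivariable model as discussed in [42].

```python
import numpy as np

# Illustrative discrete-time plant: x[k+1] = a*x[k] + b*u[k]
a, b = 0.9, 0.1    # assumed identified model coefficients
N, lam = 10, 0.01  # prediction horizon and input penalty weight
x_ref = 1.0        # setpoint for the controlled culture variable

# Prediction over the horizon: X = F*x0 + G*U (stacked state predictions).
F = np.array([a**(i + 1) for i in range(N)])
G = np.zeros((N, N))
for i in range(N):
    for j in range(i + 1):
        G[i, j] = a**(i - j) * b

x = 0.0  # initial state
for k in range(30):
    # Unconstrained MPC: minimise ||X - x_ref||^2 + lam*||U||^2 over U,
    # then apply only the first input (receding horizon).
    rhs = G.T @ (x_ref * np.ones(N) - F * x)
    U = np.linalg.solve(G.T @ G + lam * np.eye(N), rhs)
    u = U[0]
    x = a * x + b * u  # plant update (here: the model plays the plant)
    if k % 10 == 0:
        print(f"step {k:2d}: u = {u:.3f}, x = {x:.3f}")
```

The compactness requirement stated above is visible here: everything the controller needs at run time is the small set of prediction matrices built from the identified model.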

Various types of models can be used as long as they allow accurate predictions of the most important process output(s) and are compact enough to be implemented in the bioreactor system. In many control applications, black box models are used (e.g. impulse response models, step response models, transfer function models, state space models, neural networks, etc.) that describe the process under consideration based on data from dynamic experiments (dynamic data-based models). They have the advantage that they are compact, allow accurate predictions of the process behaviour and are easy to implement in a model-based control framework. However, an important drawback of these models is that they are not based on knowledge of the system and as such are difficult to interpret in a biological way. At the other end of the spectrum, there are mechanistic (white box) models that are knowledge-based and as a result often (much) more complex than data-based models. A hybrid (grey box) approach has been developed that combines the advantages of both the dynamic data-based modelling approach and the mechanistic modelling approach into a so-called data-based mechanistic (DBM) modelling approach [43]. DBM models can be developed in different ways, but one commonly used approach is to start from available mechanistic models that are then reduced in complexity by applying sensitivity analysis and principal component approaches (e.g. [44, 45]). Parameters of the reduced-order model structure can be estimated in a time-varying way by using, e.g., a recursive instrumental variable estimation method on data from dynamic experiments with the bioreactor system to be controlled.
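The recursive estimation step can be illustrated with plain recursive least squares (RLS) and a forgetting factor, a simpler relative of the recursive instrumental variable method mentioned above (the instrumental-variable refinement, which replaces the regressor by an instrument to handle correlated noise, is omitted here). The first-order model and its data are invented for this sketch.

```python
import numpy as np

# Identify y[k] = theta1*y[k-1] + theta2*u[k-1] + noise with recursive least
# squares (RLS). A forgetting factor < 1 lets the estimates track the slow
# parameter drift typical of living cultures.
rng = np.random.default_rng(2)
theta_true = np.array([0.8, 0.5])  # "unknown" plant parameters
lam = 0.98                         # forgetting factor

theta = np.zeros(2)                # parameter estimate
P = 1e3 * np.eye(2)                # estimate covariance (large = uninformed)
y_prev, u_prev = 0.0, 0.0
for k in range(200):
    u = rng.normal()                                 # exciting input signal
    y = theta_true @ [y_prev, u_prev] + 0.05 * rng.normal()
    phi = np.array([y_prev, u_prev])                 # regressor
    # Standard RLS update with forgetting:
    K = P @ phi / (lam + phi @ P @ phi)
    theta = theta + K * (y - phi @ theta)
    P = (P - np.outer(K, phi) @ P) / lam
    y_prev, u_prev = y, u
print("estimated parameters:", np.round(theta, 3))   # ~ [0.8, 0.5]
```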

Mechanistic (white box) models of in vitro bioreactor processes have been repeatedly proposed in the literature. O’Dea et al. [46] provide an overview of the models that use continuum modelling techniques to investigate how, in cell-seeded porous scaffolds, the different underlying processes interact to produce functional tissues for implantation. They aim to demonstrate how a combination of mathematical modelling, analysis and in silico computation, undertaken in collaboration with experimental studies, may lead to significant advances in the understanding of the fundamental processes regulating biological tissue growth and to the optimal design of in vitro methods for generating replacement tissues that are fully functional. Raimondi et al. [47] discuss, also for cell-seeded porous scaffolds, the need for and the advances in the use of multiphysics and multiscale mathematical models. They describe various possible approaches to couple biomass growth, medium flow and mass transport in a single model. Furthermore, they discuss recent advances in the scientific computing techniques that are needed to implement these multiscale/multiphysics models, as well as new tools that can be used to experimentally validate the computational results.

Besides the control of the bioreactor process, the design of the bioreactor set-up itself can play a major role in obtaining the desired results. Bjork et al. [48] use computational models focusing on dissolved oxygen transport to design bioreactor set-ups for engineered vascular tissues that improve transport, particularly by perfusion of medium through the interstitial space by transmural flow. Their computational models, supported by empirical data, specifically investigated designs that would eliminate the nutrient gradients evident during static culture, in order to develop more uniform engineered vascular tissues with improved mechanical properties of the resulting construct.
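The core calculation behind such oxygen-transport designs can be sketched as a one-dimensional steady-state balance between diffusion and Michaelis-Menten consumption, D·c'' = V_max·c/(K_m + c), with the medium-facing surface held at the bulk concentration and a zero-flux condition at depth. The parameter values below are rough order-of-magnitude assumptions, not those used by Bjork et al. [48].

```python
import numpy as np
from scipy.integrate import solve_bvp

# 1D steady state in a construct of thickness L:
#   D * c'' = Vmax * c / (Km + c),  c(0) = c_bulk,  c'(L) = 0
D = 2.0e-9     # oxygen diffusivity in tissue, m^2/s (assumed)
Vmax = 5.0e-4  # volumetric uptake, mol/(m^3 s) (depends on cell density)
Km = 1.0e-2    # Michaelis constant, mol/m^3 (assumed)
c_bulk = 0.2   # medium oxygen concentration, mol/m^3 (~air-saturated)
L = 1.0e-3     # construct thickness, m

def rhs(x, y):            # y[0] = c, y[1] = c'
    return np.vstack([y[1], Vmax * y[0] / (Km + y[0]) / D])

def bc(ya, yb):           # fixed surface concentration, sealed bottom
    return np.array([ya[0] - c_bulk, yb[1]])

x = np.linspace(0.0, L, 50)
y_guess = np.vstack([np.full_like(x, c_bulk), np.zeros_like(x)])
sol = solve_bvp(rhs, bc, x, y_guess)
print(f"oxygen at depth L: {sol.sol(L)[0]:.3f} mol/m^3 "
      f"({100 * sol.sol(L)[0] / c_bulk:.0f}% of surface value)")
```

Running this kind of balance for different thicknesses and perfusion scenarios is what allows a designer to decide, before building hardware, whether transmural flow is needed to keep the construct core above a hypoxia threshold.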

3.3 Computational Tools for the Study of the In Vivo Process

Although according to the developmental engineering concept [4–8] the establishment of robust modular tissue intermediates in vitro should lead to the desired high-quality outcome in vivo, the effect of the in vivo environment is an important unknown that requires thorough experimental investigation. From the obtained experimental results, mechanisms of action can be proposed, and mathematical models can subsequently be used to translate these mechanisms of action into a coherent set of mathematical equations. These equations form a quantitative spatio-temporal framework of interrelated biological variables and sub-processes, providing a dynamic and comprehensive overview of the entire repair process. As such, the mathematical models can help in interpreting the in vivo data (by establishing causal relations) on the one hand and in generating new hypotheses on in vivo outcome (by running in silico experiments) on the other, in this way adding to the design and optimization of TE products and processes.

A myriad of models has been proposed in the literature describing various pathologies and in vivo regenerative processes. Watton et al. [49] have developed a fluid-solid-growth model to simulate the evolution of abdominal aortic aneurysms. The model uses a realistic constitutive model of the arterial wall accounting for a wide range of lower-scale structures and processes. With the help of this model they were able to predict, e.g., the development of the tortuosity that accompanies abdominal aortic aneurysm enlargement. Besides providing a basis for further investigation and elucidation of the aetiology of aneurysm formation, the computational framework can also be applied to aid the design and optimisation of tissue-engineered vascular constructs. In the field of bone regeneration, Geris et al. [50] have reviewed the existing models of fracture healing, dividing these models into bioregulatory (fracture healing guided by biological stimuli), mechanoregulatory (fracture healing guided by mechanical stimuli) and mechanobioregulatory models (fracture healing guided by both mechanical and biological stimuli). Nagel and Kelly [51] adapted a well-known mechanoregulatory model to explicitly account for the influence of oxygen tension on tissue differentiation. They furthermore discuss the effects of incorporating the tissue architecture during skeletal regeneration as well as the variability of the process. Reina-Romo et al. [52] discuss the importance of angiogenesis for both bone regeneration and TE. They describe the role of the vascular network in these processes as well as the most recent in silico models simulating the vascular network within bone constructs. They analyse discrete as well as continuum approaches from a computational perspective.

As mentioned above, simulation of the behaviour of a TE construct after implantation is another crucial aspect in the optimisation of TE products and processes. Lemon et al. [53] have developed a mathematical model of the regeneration of a tissue-engineered trachea seeded with cells in situ, in order to study the biological processes (e.g. stenosis) taking place after implantation for various designs of the TE construct (different cell seeding strategies). They provide an in-depth discussion of the obstacles that are encountered when trying to formulate a faithful model of (any kind of) biological product or process. Furthermore, they investigate how a simplified mathematical model that omits much of the biological detail can be of use for studying the regeneration of a TE construct, using their model of a tissue-engineered trachea as an example.

4 Discussion

As shown above, computational tools have already been used to investigate a wide variety of products and processes in the tissue engineering field. Whereas in the early models the distance between the computer and the bench was quite substantial, the integration of (biological) experiments and simulation efforts is increasing. It has become evident that imaginative and refined experimental strategies based on genetics, imaging, and quantitative and biophysical approaches, combined with the exploration of the fullest potential of mathematical modelling, are necessary to understand cellular and developmental biology. The increased attention for this integrative approach can be appreciated from the initiatives that have been and are being taken by large funding agencies to promote this research, e.g. the Quantissue network [54] (funded by ESF-RNP) and the Physiome [55]/Virtual Physiological Human initiatives [56] (funded by agencies worldwide). The potential of this integrative research has already been demonstrated in a number of biomedical fields [57–62]. For example, Faratian et al. [57] successfully used a systems biology approach to stratify patients for personalized therapy in cancer and provided further compelling evidence that a particular biomarker, appropriately measured in the clinical setting, could refine clinical decision making in patients treated with a specific therapy. In developmental biology, von Dassow and co-workers [59, 60] showed by means of a computational model that the Drosophila segment polarity genes constitute a robust developmental module. The simulation results provided important insights into the overall dynamics of the gene network and highlighted mechanistic details that require further experimental research.

With the increasing demand for more quantitative models, there is also increasing attention to the determination of relevant parameter sets [63–65]. Precise measurement of the different parameter values is in almost all cases impossible, either because not all parameters represent physical processes (even when dealing with mechanistic or white box models) or because the physical property cannot be measured without altering the process. An example of the latter is the use of in vitro experiments to determine properties of in vivo processes. Classical system identification techniques, typically used in grey and black box approaches, will determine the parameter values so as to fit the model to the system it is intended to describe. Depending on the system at hand and on the available experimental information, estimation theory or neural networks are commonly used concepts. Additionally, engineering concepts such as design of experiments and optimal experimental design are finding their way into the biomedical sciences to increase the amount of information that can be retrieved from experiments while reducing the number of experimental runs required to obtain this information. Alternatively to, or better yet concomitantly with, finding appropriate parameter values based on experimental results, many modellers apply techniques to investigate the impact of the chosen parameter values on their simulation results by means of sensitivity analyses. Sensitivity analyses appear in many different forms. The most frequently used technique is the one-at-a-time (OAT) analysis, where only one parameter is altered at a time (e.g. [66]). This provides information on the main effects of that parameter, but not on the combined effects or the interactions between different parameters. Design-of-experiments techniques have been successfully applied to mathematical models to overcome this limitation of the OAT technique [27, 67].
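As a minimal example of the OAT technique, the sketch below perturbs each parameter of a logistic cell-growth model by ±10% and records the change in a scalar output. As noted above, this exposes main effects only and misses parameter interactions; the model and its baseline values are illustrative.

```python
import numpy as np

# Logistic growth of a cell population: dX/dt = r*X*(1 - X/K).
# Output of interest: population after 10 days (closed-form solution used).
def final_population(r, K, X0, t_end=10.0):
    return K / (1.0 + (K / X0 - 1.0) * np.exp(-r * t_end))

params = {"r": 0.8, "K": 1.0e6, "X0": 1.0e4}  # illustrative baseline values
baseline = final_population(**params)

# One-at-a-time (OAT) sensitivity: vary each parameter by +/-10%,
# keeping all others at baseline; report the relative output change.
for name in params:
    for factor in (0.9, 1.1):
        perturbed = dict(params)
        perturbed[name] = params[name] * factor
        out = final_population(**perturbed)
        print(f"{name} x {factor}: output change "
              f"{100 * (out - baseline) / baseline:+.1f}%")
```

A factorial or Morris-type design would instead vary several parameters simultaneously, which is precisely what the design-of-experiments approaches cited above add on top of OAT.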

In the above discussion on the optimal way to determine parameter values for quantitative models, a completely different point of view is taken by a number of researchers. Gutenkunst and co-workers argue against the focus on optimizing experimental design to best constrain model parameters with collective fits as discussed above, particularly in cases where the understanding of a system is tentative and incomplete. An important consideration underlying their point of view is the question of how we should deal with uncertainties in the data [68], in the fitting of parameters, and in the resulting predictions. Brown et al. rigorously explored one source of uncertainty in their model of growth-factor signalling in PC12 cells; their analysis considered not just the set of parameters that best fit the data but a statistical sampling of all parameter sets that fit the data [69, 70]. As in many other systems [71], the space of parameter sets that could fit the data was vast. Perhaps surprisingly, some predictions were still very well constrained even in the face of this enormous parameter uncertainty. Brown et al. found a striking ‘sloppy’ pattern in the sensitivity of their model to parameter changes: when plotted on a logarithmic scale, the sensitivity eigenvalues were roughly evenly spaced over many decades. This sloppy nature was then further investigated by Gutenkunst and others [72–74]. Even though sloppiness is not unique to biological systems, it is particularly relevant to biology [75] because the collective behaviour of most biological systems is much easier to measure in vivo than the values of individual parameters. Using sloppy parameter analysis, concrete predictions can be extracted from models long before their parameters are even roughly known [70], and when a system is not already well understood, it can be more profitable to design experiments to directly improve predictions of interesting system behaviour [76] rather than to improve estimates of parameters.
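The hallmark described above can be reproduced in a few lines: for a toy sum-of-two-exponentials model (a textbook sloppy system), the eigenvalues of the Gauss-Newton Hessian J^T J with respect to log-parameters span several decades. The toy model and its values are mine, chosen for illustration; this is not the PC12 signalling model of Brown et al.

```python
import numpy as np

# Toy sloppy model: y(t) = A1*exp(-k1*t) + A2*exp(-k2*t), observed at times t.
t = np.linspace(0.1, 5.0, 40)
p = np.array([1.0, 1.0, 1.0, 0.3])  # A1, k1, A2, k2 (illustrative values)

def model(p):
    A1, k1, A2, k2 = p
    return A1 * np.exp(-k1 * t) + A2 * np.exp(-k2 * t)

# Jacobian with respect to log-parameters (central finite differences),
# so the eigenvalues compare relative parameter changes.
eps = 1e-6
J = np.empty((t.size, p.size))
for i in range(p.size):
    dp = np.zeros_like(p)
    dp[i] = eps * p[i]              # d p_i = p_i * d log p_i
    J[:, i] = (model(p + dp) - model(p - dp)) / (2.0 * eps)

eigvals = np.linalg.eigvalsh(J.T @ J)  # Gauss-Newton Hessian spectrum
print("log10 eigenvalues:", np.round(np.log10(eigvals[::-1]), 2))
```

The output shows eigenvalues spread roughly evenly over many orders of magnitude: a few stiff parameter combinations are tightly constrained by the data, while the remaining soft combinations are nearly free, which is exactly why collective fits can predict well even when individual parameters stay uncertain.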

5 Conclusion

In conclusion, this chapter has provided an overview of how computational modelling can contribute to advancing the tissue engineering field. Regardless of whether the models focus on the product, the process or the in vivo result, the aim is always to understand the biological process and to design strategies in silico to enhance the desired in vitro or in vivo behaviour. Finally, if models are to be applied in a quantitative way, experiments need to be designed so as to feed the models in the most intelligent way. Here, too, computational tools and models can play an important role.