Abstract
Metabolic pathways are enzyme-catalyzed reactions that can be described by the Michaelis-Menten equation. The inherent problem with Michaelis-Menten kinetics is that the model is not reversible. Reverse catalysis, which is common in metabolic pathways, can be better described by Cell Communication Protocol©, which offers a Self-Organizing Map (SOM) for the visualization of the enzyme-catalyzed reactions. Neural gas, an algorithm for finding optimal data representations, is added to the SOM and allows for an increase in the weight space to handle the potential complexity of enzyme catalysis. With the use of back propagation, the system allows for the modeling of reverse reactions.
Keywords
- Michaelis-Menten kinetics
- Cell communication protocol
- Self-organizing map
- Neural gas
- Back propagation
- Metabolic pathways
1 Introduction
Analytic models have limitations when applied to biological applications, because of both the simplicity and the complexity of the biological systems [1, 2]. Neural networks have provided a functional alternative, since they allow for processing large data systems [3]. Neural networks can be considered as descriptive modeling, much like more traditional analytic tools, but they can reduce the computing power needed for every parameter that is introduced by analytic models [4]. In this way they can be used for undertaking enormous challenges such as artificial modeling of the brain [5, 6].
Metabolic pathways can be considered as a series of enzyme-catalyzed reactions where the product of one reaction becomes the substrate for subsequent reactions and where all reactions are branched and interconnected as a graph network. Metabolic pathways have traditionally been modeled by using Michaelis-Menten kinetics [7]. Although the equation's stable predictions work well in a homogeneous system, the model becomes very limited when the system is complex, especially for reverse reactions, and hence Michaelis-Menten kinetics does not necessarily describe enzyme biology accurately. To overcome this limitation, particularly for complex system interactions in metabolic pathways [8], we propose to use neural networks via Cell Communication Protocol© by using:
1. Self-Organizing Map (SOM) or Self-Organizing Feature Map (SOFM): a type of artificial neural network that is trained using unsupervised learning to produce a low-dimensional (typically two-dimensional), discretized representation of the input space of the training samples, called a map. Self-organizing maps differ from other artificial neural networks in that they use a neighborhood function to preserve the topological properties of the input space [9, 10].
2. Neural Gas: a biologically inspired adaptive algorithm. It sorts the input signals according to how far away they are; a certain number of them are selected by distance in order, and then the number of adaptation units and the adaptation strength are decreased according to a fixed schedule. Neural gas is a special kind of SOM, but without the fixed neighborhood of the SOM. As such, even high-dimensional feature spaces can be analyzed or clustered; this could only be done by SOMs in special cases, as they are two-dimensional [11].
3. Back Propagation (BP): backward propagation of error is a common method of teaching artificial neural networks how to perform a given task. It is a supervised learning method and an implementation of the delta rule. It requires a teacher that knows, or can calculate, the desired output for any given input. It is most useful for feed-forward networks (networks that have no feedback, or simply, that have no connections that loop). Back propagation requires that the activation function used by the artificial neurons (or "nodes") is differentiable [12, 13].
The model can be tested against Kyoto Encyclopaedia of Genes and Genomes (KEGG) [14, 15], BioCyc database collection, MetaboLights, Molecular Ancestry Network (MANET) [16, 17] and Reactome [18].
2 Model
The neural network model (Fig. 1a) can be applied to metabolic pathways [19] and to diffusion models. We considered three approaches to maximize the effectiveness of prediction by the neural network via Cell Communication Protocol.
The first approach is training the self-organizing map (SOM) by competitive learning. When a training example is fed into the network, its Euclidean distance to all weight vectors is computed. The neuron whose weight vector is most similar to the input is the best matching unit (BMU); the weights of the BMU and of the neurons close to it in the SOM lattice are adjusted towards the input vector. The update is given by [20]:

\(W_{v}\left( {s + 1} \right) = W_{v}\left( s \right) + \theta \left( {u,v,s} \right)\,\alpha \left( s \right)\left[ {D\left( i \right) - W_{v}\left( s \right)} \right]\)

where \(s\) is the step index, \(i\) is the index of the training sample, \(u\) is the index of the BMU for \(D\left( i \right)\), \(\theta \left( {u,v,s} \right)\) is the neighborhood function depending on the lattice distance between \(u\) and \(v\), \(\alpha \left( s \right)\) is a monotonically decreasing learning coefficient, \(D\left( i \right)\) is the input vector, and \(v\) runs over all neurons. The SOM option shows the developed network [21].
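As an illustration, this competitive-learning update can be sketched as follows, assuming a one-dimensional lattice with a Gaussian neighborhood function and exponentially decaying schedules (these particular schedule choices and parameter names are ours, not specified in the text):

```python
import numpy as np

def som_update(weights, x, s, n_steps, alpha0=0.5, sigma0=2.0):
    """One competitive-learning step on a 1-D SOM lattice.

    weights : (N, d) array of weight vectors W_v
    x       : (d,) input vector D(i)
    s       : current step index, n_steps : total training steps
    """
    alpha = alpha0 * np.exp(-s / n_steps)   # monotonically decreasing learning coefficient
    sigma = sigma0 * np.exp(-s / n_steps)   # shrinking neighborhood radius
    # best matching unit (BMU): neuron whose weight vector is closest to x
    u = np.argmin(np.linalg.norm(weights - x, axis=1))
    v = np.arange(len(weights))
    # Gaussian neighborhood function theta(u, v, s) on the lattice
    theta = np.exp(-((v - u) ** 2) / (2.0 * sigma ** 2))
    # move BMU and its lattice neighbors towards the input vector
    return weights + alpha * theta[:, None] * (x - weights)
```

Calling this repeatedly over the training set, with `s` increasing, reproduces the shrinking-neighborhood behavior the text describes.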
The second approach uses neural gas, which considers a probability distribution \(P\left( i \right)\) of data vectors \(i\). At every step \(s\) a data vector \(i\) is drawn at random according to \(P\left( i \right)\), and the vectors \(V_{j}\), \(j = 1, \ldots ,N\) are adapted according to their distance rank with respect to \(i\):

\(V_{j}\left( {s + 1} \right) = V_{j}\left( s \right) + \varepsilon \left( s \right)\,e^{{ - k_{j} /\lambda \left( s \right)}} \left[ {i - V_{j}\left( s \right)} \right]\)

where \(k_{j}\) is the rank of \(V_{j}\) when all vectors are sorted by distance to \(i\), and \(\varepsilon \left( s \right)\) and \(\lambda \left( s \right)\) are the decaying adaptation strength and neighborhood range [11]. The second approach is useful to limit the pathway distances, i.e. it tells us the limitations of the metabolic pathway [22, 23].
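A minimal sketch of this rank-based neural gas adaptation in the sense of [11] (the exponential decay schedules and their parameters are our own assumptions):

```python
import numpy as np

def neural_gas_step(prototypes, x, s, n_steps, eps0=0.5, lam0=10.0):
    """One neural-gas adaptation step (rank-based, no fixed lattice).

    prototypes : (N, d) array of vectors V_j
    x          : (d,) data vector drawn at random according to P(i)
    """
    eps = eps0 * np.exp(-s / n_steps)   # decaying adaptation strength epsilon(s)
    lam = lam0 * np.exp(-s / n_steps)   # decaying neighborhood range lambda(s)
    d = np.linalg.norm(prototypes - x, axis=1)
    k = np.argsort(np.argsort(d))       # distance rank k_j of each prototype
    h = np.exp(-k / lam)                # rank-based neighborhood weighting
    return prototypes + eps * h[:, None] * (x - prototypes)
```

Unlike the SOM step, the neighborhood here is defined by distance rank in the data space, not by a predefined lattice, which is exactly what frees neural gas from the two-dimensional constraint.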
The third approach is back propagation based on the delta rule (Fig. 1b), where the initial weights are random. During training, the weight updates are propagated back from the output towards the input layers.
The input is a set of tuples \(\left( {i_{1} ,i_{2} , \ldots ,i_{n} } \right)\), and \(t\) is the correct (target) output. Because the initial weights are random, the output from the training set is \(y = i_{1} w_{1} + i_{2} w_{2}\) (NB because back propagation treats each neuron in isolation, the gradient calculation considers only one input at a time); therefore the squared error is \(E = \frac{1}{2}\left( {t - y} \right)^{2}\) [24].
The neuron activation is given by a differentiable function; a typical choice is the logistic sigmoid \(\varphi \left( x \right) = 1/\left( {1 + e^{ - x} } \right)\).
The third approach allows handling of reverse catalysis which is one of the limitations of the Michaelis-Menten Kinetics [25].
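The delta rule above can be sketched as follows for a single linear neuron (a minimal illustration; the function name and learning rate `eta` are our own choices, not taken from the text):

```python
def delta_rule_step(w, i, t, eta=0.1):
    """One delta-rule step for a linear neuron y = sum_k i_k * w_k,
    minimizing E = 0.5 * (t - y)^2 by gradient descent.

    w : list of weights, i : matching list of inputs, t : target output
    Returns the updated weights and the error before the update.
    """
    y = sum(ik * wk for ik, wk in zip(i, w))
    error = 0.5 * (t - y) ** 2
    delta = t - y  # negative gradient of E with respect to y
    new_w = [wk + eta * delta * ik for wk, ik in zip(w, i)]
    return new_w, error
```

Iterating this step shrinks the squared error towards zero; in a multi-layer network the same delta is propagated backwards through the differentiable activations.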
3 Method
Metabolic pathways are a series of reactions where one reaction produces the substrate for the subsequent reactions; a neural network can match this model. The initial random weight entry incorporates the constant parameters and the particle density from an enzyme assay. Specifically, the input vector is given by the weight obtained from the density of the enzyme assay. To develop the chemical pathway, the number of clusters representing an assumed network for the metabolic pathway, a maximal range of cluster nodes, and the possible nodes for the pathway are entered. Figure 2 shows the input vectors for the training set that sets up the actual training of the network.
4 Results
The diffusion across the training set is shown in Fig. 3a, where the upper cells are the initial vectors from the weight given by the density of the enzyme. The reference vector is in the upper part; in this example the simulation considered 100 input neurons. The input vector is obtained from the weights of the nodes (neurons) close to the SOM lattice, with adjustment [26].
Figure 3c shows the activity card, where the lattices are followed in real time. The second screen shows the development of the network and the data points of the nodes, which are also evolving in time: the red points are clusters or nodes representing products of enzyme catalysis, and the green lights are the reference neurons where the pathway is being developed. The yellow is the reference input vector. In Cell Communication Protocol there is an option for changing the positions to alter the graph, the direction and the catalysis, all in real time.
Figure 3d shows the data representing the evolving network in two dimensions; each cell is represented by the intensity of the color of the nodes, clusters and chemical reactions in the development of the metabolic pathway. In three dimensions, the increased data improves the rate of accuracy. The 3-D visualization in Cell Communication Protocol is a torus, and the change of color during the simulation gives a better handle on the data points of the metabolic pathway [27]. When the edges of the cells are joined, the topology is that of the torus.
For the neural gas approach, an input vector is introduced to the Cell Communication Protocol, and, similar to SOMs, the learning parameters are calculated from the input vector (see Fig. 3a) [28]. The input vector parameter is the density of the enzyme, as in the previous approach. SOMs are under the constraints [29] of a predefined neighborhood between the neurons, and the distances between the vectors are not directly represented in the map [9, 30]. In neural gas the output space is not predefined; the neural gas algorithm is based on stochastic gradient descent of a cost function and is connected to a Hebbian competitive rule which forms the connections between the neighborhoods [31].
Figure 3b shows the neural gas data representation in two dimensions. At the base, the neural gas prototype clusters move around in the data space similar to the Brownian movement of gas molecules in a closed container. The visualization shows the data structures, clusters, and outliers in two dimensions as a line and in three dimensions as an iris [32,33,34].
The back propagation is included in Cell Communication Protocol© to better mimic the reverse catalysis of real biological metabolic pathways, which is not considered by Michaelis-Menten kinetics [35, 36].
Figure 4a shows the learning rule, which considers the desired output for any given input, and the momentum term, a component that dampens the oscillations of the learning rule to provide direction. Figure 4b shows the overall error, where the rate of learning is given by the probability (y-axis) and the iterations are on the x-axis.
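The momentum term mentioned above can be sketched as a scalar gradient step (a minimal illustration; the function name, learning rate and momentum coefficient are our own assumptions, not values from Fig. 4a):

```python
def momentum_step(w, grad, velocity, eta=0.1, mu=0.9):
    """Gradient-descent step with a momentum term.

    The accumulated velocity mu * velocity dampens oscillations of the
    plain learning rule and keeps the update moving in a consistent
    direction across iterations.
    """
    velocity = mu * velocity - eta * grad
    return w + velocity, velocity
```

When successive gradients point the same way, the velocity accumulates and accelerates learning; when they alternate sign, the momentum averages out the oscillation.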
5 Discussion
The particle diffusion (i.e. mobility) impacts the rates of the enzyme kinetics. Collision theory describes the enzyme-catalysis reaction, and the bimolecular reaction can be expressed as a collision process [37]. The original Stokes-Einstein diffusion equation was developed for small particles, but can also be used to predict diffusion for sizes up to a micrometer [38,39,40]. The diffusion equation is a partial differential equation and is used at concentrations where viscous forces dominate, i.e. at low Reynolds numbers [41]. At low Reynolds number, the particle moves at a velocity proportional to the force applied:

\(v = \sigma F\)
where \(\sigma\) is the proportionality constant (\(\sigma \propto 1/\left( {r\eta } \right)\)). The Reynolds number is

\(\text{Re} = \frac{\rho v r}{\eta }\)

where \(\text{Re}\) is the Reynolds number, \(\eta\) is the dynamical viscosity, \(\rho\) is the density, and \(r\) is the radius of the particle.
The diffusion coefficient is given by the Einstein relation:

\(D = \sigma k_{B} T\)

where \(k_{B}\) is the Boltzmann constant and \(T\) is the temperature.
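As an illustration, the Stokes-Einstein form of the diffusion coefficient, \(D = k_{B} T/\left( {6\pi \eta r} \right)\) (obtained when the mobility of a spherical particle, \(\sigma = 1/\left( {6\pi \eta r} \right)\), is inserted into the Einstein relation), can be evaluated directly; the function and variable names below are ours:

```python
import math

def diffusion_coefficient(T, eta, r):
    """Stokes-Einstein diffusion coefficient D = k_B * T / (6 * pi * eta * r)
    for a spherical particle of radius r (m) in a fluid of dynamic
    viscosity eta (Pa s) at temperature T (K). Returns D in m^2/s."""
    k_B = 1.380649e-23  # Boltzmann constant, J/K
    return k_B * T / (6.0 * math.pi * eta * r)

# Example: a 1 nm particle in water at room temperature
D = diffusion_coefficient(298.0, 8.9e-4, 1e-9)
```

For these values D comes out on the order of 1e-10 m^2/s, the expected scale for small molecules in water.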
Collision theory can thus be related to Michaelis-Menten kinetics.
The limitation is given by the Michaelis-Menten kinetics:

\(v = \frac{{V_{\max } S}}{{K_{m} + S}}\)

where \(V_{\max }\) is the maximum rate achieved by the system, \(S\) is the concentration of the substrate and \(K_{m}\) is the Michaelis constant, i.e. the substrate concentration at which the rate is half of \(V_{\max }\).
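The Michaelis-Menten rate law can be evaluated directly; a minimal sketch (function and variable names are ours):

```python
def michaelis_menten_rate(S, v_max, K_m):
    """Forward reaction rate v = V_max * S / (K_m + S).

    S     : substrate concentration
    v_max : maximum rate of the system
    K_m   : Michaelis constant (substrate concentration at half V_max)
    """
    return v_max * S / (K_m + S)
```

At `S == K_m` the rate is exactly half of `v_max`, and as `S` grows the rate saturates towards `v_max`; the model has no term for the reverse reaction, which is the limitation the neural network approach addresses.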
The initial weights (e.g. from Michaelis-Menten kinetics) for the SOM produce a graph for a particular network, where each cell shows a catalysis reaction (Fig. 5a).
Figure 5b shows the winner neuron and its distance (i.e. the initial catalysis reaction), and also shows how the SOM has learnt, as well as how far the last reaction is from the initial catalysis reaction [42]. Figure 5c shows the actual reaction, with the red being the initial input vector and the white marks indicating the number and arrangement of the neurons neighboring the reference neuron.
Figure 5d is the same representation for neural gas, where X-Pos is the reference neuron and where the weight comes from Michaelis-Menten kinetics. The winner cells took 13 iterations from the initial catalysis.
Figure 5e shows the iteration and the probability error of the back propagation. The Michaelis-Menten kinetics weights are derived from the relationship between the probability of diffusion and Michaelis-Menten kinetics. The probability is obtained from Fick's first law.
To test the theoretical assumptions above we used pathway data sets from the Kyoto Encyclopaedia of Genes and Genomes (KEGG) [43, 44] and trained the SOM with these data. We selected two different features of the pathway data. The first feature vector coded the number and distances of the pathways [45] and the second vector the metabolism, characterized by the chemical substances involved and their concentrations. In both cases we expect that the trained neural classifier would be able to separate/identify different pathway constellations if data sets from different experiments were presented to the neural net [46]. According to the theory of neural nets (and therefore of every kind of SOM), every constellation of measured data can be combined to form an input or training vector [47].
Figure 6a shows an untrained SOM with 100 neurons (green crosses) in a 2-dimensional feature space which was normalized to the [0, 1] interval. This normalization is used very often to prevent numerical overflows in the calculation during the training phase of the SOM. The 10 red points shown in Fig. 6a represent the constellation of a pathway metabolism.
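The [0, 1] normalization mentioned above can be sketched as a simple min-max scaling (a hypothetical helper, not part of Cell Communication Protocol; the guard for constant columns is our own addition):

```python
import numpy as np

def normalize01(X):
    """Min-max normalize each feature column to the [0, 1] interval,
    guarding against constant columns (zero range) to avoid division
    by zero during the SOM training phase."""
    X = np.asarray(X, dtype=float)
    lo = X.min(axis=0)
    rng = X.max(axis=0) - lo
    return (X - lo) / np.where(rng > 0, rng, 1.0)
```

Applying this to the raw feature vectors before training keeps all distance computations on a comparable scale and prevents numerical overflow.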
After 15,000 training iterations (requiring about 4 min of computational time) the green neurons adapted to the model, as noted by the localization of the green neurons near the red pathway model feature points shown in Fig. 6b.
In Fig. 6c similar data sets from other measurements have been presented to the net. As can be seen, only one of them met the classifier's structure (the yellow neuron in the middle of Fig. 6c); the other data sets, represented by the 2 white and the 10 purple neurons, present similar but not the same structure as the trained ones.
One of the major advantages of neural nets is that the chosen method allows defining (e.g. out of theoretical considerations) an arbitrary input vector, which can then be validated by the SOM regarding its fitness [48]. Occasionally, especially if the feature vector and/or the model is more or less complicated, it may be necessary to enlarge the number of neurons of the SOM and/or the number of training cycles [49]. This case is shown in Fig. 6d and e.
Figure 6d shows that, because 44 features (the red model feature points in Fig. 6d and e) now represent the model, the SOM-based classifier bears 400 neurons; but even after 5,426 training cycles the net structure has not adapted to the red points (the system forming the classifier).
It took 7,400 more training cycles until the net structure fit the system structure, as shown in Fig. 6e.
Neural nets should not be relied upon for automatic supervision and for deciding [50] whether a net structure fits or whether the model or system will adapt. But this model provides a visualisation routine to help the user find and reach a satisfactory decision on whether a neural net has met a desired classification or not [51].
Figure 6f shows the common visualisation of SOM activity. Here the so-called winner neuron (the yellow neuron, on the left-hand side of the SOM activity card) is represented in orange, bounded by a black square at the top. The activity of the other neurons, ranging from light orange over yellow to dark green, surrounds the winner; but, and this is most important for our discussion, no information about similarity or identity can be conveyed by this visualisation. So it is the special way of visualisation discussed here which enables a deeper model analysis and/or comparison of measured data sets.
6 Conclusion
Metabolic pathways are difficult to simulate with Michaelis-Menten kinetics, which does not consider the reverse catalysis. By using a neural network, Self-Organizing Maps, and back propagation, this methodology can overcome the Michaelis-Menten limitations. Self-Organizing Maps are used to graphically describe the catalysis reaction, with the constraint of the weight space being overcome by the neural gas algorithm [52,53,54,55,56]; doing so achieves a visual display of the catalysis and expands the weight space to improve the accuracy of the modeled reaction [57,58,59,60]. With the use of back propagation the model can overcome the inherent limitation of Michaelis-Menten kinetics, and the Cell Communication Protocol© provides a set of solutions for a descriptive and analytical analysis of metabolic pathways.
References
Israelowitz, M., Rizvi, S.W., Kramer, J., von Schroeder H.P.: Computational modeling of type I collagen fibers to, determine the extracellular matrix structure of connective tissues. Protein Eng. Des. Sel. 18, 329–335 (2007)
Israelowitz, M., Weyand, B., Rizvi, S.W., Gille, C., von Schroeder, H.P.: Protein Modeling and Surface Folding by Limiting the Degree of Freedom. Springer, New York Dordrecht London (2013)
Sole, R., Delgado, J.: Universal computation in fluid neural networks. Complexity 2, 49–55 (1997)
Bahiraei, M., Hosseinalipur, S., Zabihi, K., Taheran, E.: Using neural network for determination of viscosity in water-TiO2 nanofluid. Adv. Mech. Eng. (2012)
Reuter, M., Lenk, K., Schroeder, O., Gramowski, A., Jügelt, K., Priwitzer, B.: Information extraction from biphasic concentration-response curves for data obtained from neuronal activity of networks cultivated on multielectrode-array-neurochips. Presented at the BMC Neuroscience January (2010)
Lodhi, H., Muggleton, S.: Modeling metabolic pathways using stochastic logic Programs-Base Ensemble methods. Comput. Methods Syst. Biol. 119–133 (2005)
Duggleby, R.G., Clarke, R.B.: Experimental designs for estimating the parameters of the Michaelis-Menten equation from progress curves of enzyme-catalyzed reactions. Biochim. Biophys. Acta 1080, 231–236 (1991)
Keener, J., Sneyd, S.: Mathematical Physiology I: Cellular Biology. Springer
Kohonen, T.: Self-organized formation of topologically correct feature maps. Biol. Cybern. 43, 59–69 (1982)
Reuter, M.: Of the Stability of Closed Self Organising Maps (gSOMs) for Predictive Control. Presented at the, Lyon, France (2008)
Martinetz, T., Berkovich, S., Schulten, K.: Neural-gas network for vector quantization and its application to time-series prediction. IEEE Trans. Neural Netw. 4, 558–569 (1993)
Rumelhart, D.E., Hinton, G.E., Williams, R.J.: Learning representations by back-propagating errors. Nature 323, 533–536 (1986)
Gorban, A.N., Zinovyed, A.: Principal Manifolds and graphs in practice: from molecular biology to dynamical systems. Int. J. Neural Syst. 20, 219–232 (2010)
Kanehisa, M., Goto, S., Kawashima, S., Okuno, Y., Hattori, M.: The KEGG resource for deciphering the genome. Nucleic Acids Res. 32(Database issue), D277–80 (2004)
Kanehisa, M., Goto, S., Hattori, M., Aoki-Kinoshita, K.F., Itoh, M., Kawashima, M.: From genomics to chemical genomics: new developments in KEGG. Nucleic Acids Res. 34 (Database issue), D354–7
Salek, R.M., Haug, K., Conesa, P., Hastings, J., Williams, M., Mahendraker, T., Maguire, E., González-Beltrán, A.N., Rocca-Serra, P., Sansone, S.A., Steinbeck, C.: The MetaboLights repository: curation challenges in metabolomics, 1. Database. 2013, bat029 (2013)
Kim, H.S., Mittenthal, J.E., Caetano-Anolles, G.: MANET: tracing evolution of protein architecture in metabolic networks. BMC Bioinf. 7
Croft, D., O’Kelly, G., Wu, G., Haw, R., Gillespie, M., Matthews, L., Caudy, M., Garapati, P., Gopinath, G., Jassal, B., Jupe, P., Kalatskaya, I., Mahajan, S., May, B., Ndegwa, N., Schmidt, E., Shamovsky, V., Yung, V., Birney, E., Hermjakob,, H., d’ Eustachio, P., Stein, L.: Reactome: A database of reactions, pathways and biological processes. Nucleic Acids Res. 39 (Database issue), D691–D697
Gorban, A.N.: Principal Manifold for Data Visualization and Dimension Reduction. Springer
Shah-Hosseni, H., Safabakhsh, R.: TASOM: A new time adaptive self-organization map. IEEE Trans. Syst. Man Cybern.-Part B Cybern. 271–282 (2003)
Reuter, M., Bohlmann, S.: Supervising MultiCut Aggregates by Special Neural Nets. Presented at the WAC 2012, Puerto Vallarta, Mexico (2012)
Canales, F., Chacon, M.: Modification of the growing neural gas algorithm for cluster analysis. Presented at the International Association for Pattern Recognition, Image Analysis and Applications 12th Iberoamerican Congress on Pattern Recognition, CIARP 2007, Vińa del Mar-Valparaiso, Chile, November 113–17 (2007)
Reuter, M.: Supervising cathodic protected gas nets with CI-based methods. Presented at the ISC’2013, 11th Annual Industrial Simulation Conference, Ghent, Belgium, 22 May 2013
Rojas, R.: Neural Networks- A Systematic Introduction. Springer-Verlag, Berlin, New York (1996)
Goudar, C.T., Sonnad, J.R., Duggley, J.R.: Parameter estimation using a direct solution of integrated Michaelis-Menten equation. Biochim. Biophys. Acta- Protein Struct. Mol. Enzymol. 1424, 377–383
Kagans, I.: Time-dependent self-organization maps, (1994)
Kurasova, O., Molytė, A.: Quality of quantization and visualization of vectors obtained by neural gas and self-organizing map. Informatica 22, 115–134 (2011)
Estẻvez, P.A., Figueroa, C.J.: Online data visualization using neural gas network. Neural Netw. 19, 923–934
Kolbe, L., Tünnerman, R., Hermman, T.: Growing neural gas sonification model for interactive surfaces. In: Proceeding ISon 2010 3rd Interactive Sonification, KTH, Stockholm,Sweden (2010)
Kaski, S.: Data Exploration using self-organizing maps. Acta Polythecnica Scand. Math. Comput. Manag. Publ. Finn. Acad. Technol. 57–60 (1997)
Martinetz, T., Schulten, K.: Topology representing networks. Neural Netw. 7, 507–522 (1994)
Liu, K., Liu, P.: Color model based 3-D self-organizing map, information visualization. Inf. Vis. IV pp. 403–408 (2004)
Xinzhi, L.J.: Visualization of high-dimensional data with relational perspective map. Inf. Vis. 3, 49–59 (2004)
Ito, M., Myoshi, M.: The characteristic of torus self-organizing map. Presented at the In Proceeding 16th Fuzzy Systems Symposium Akita, Japan Society for Fuzzy and Systems, Japan (2000)
Riedmiller, M., Braum, H.: A direct adaptive method for faster backpropagation learning: The RPROP Algorithm, Neural Networks. Presented at the IEEE International Conference (1993)
Riedmiller, M.: Advanced surpervised learning in multi-layer perceptrons—from backpropagation to adaptive learning algorithms. Comput. Stand. Interfaces. 16, 265–278
Blanch, B.H., Clark, D.S.: Biochemical Engineering. Marcel Dekker, New York (1997)
Bremmell, K.E., Wissenden, N., Dustan, D.E.: Diffusing probe measurements in newtonian and elastic solutions. Adv. Colloid Interfaces Sci. 89–90, 141–154
Norris, D.A., Sinko, P.A.: Effect of size, surface charge, and hydrophobicity on the translocation of polystyrene microspheres through gastrointestinal mucin. Appl. Polym. Sci. 63, 1481–1492 (1997)
Crocker, C.J.: Measurement of hydrodynamic correction to the Brownian motion of two colloid spheres. J. Chem. Phys. 106, 2837–2840 (1997)
Weyand, B., Israelowitz, M., von Schroeder, H.P., Vogt, P.: Fluid dynamics in bioreactor design: considerations for the theoretical and practical approach. Bioreactor Systems for Tissue Engineering. pp. 521–268. Springer, Berlin, Heidelberg, New York (2009)
Reuter, M.: Computing with Activities V. Experimental proof of the stability of closed self organizing maps (gSOMs) and the potential formulation of neural nets. Presented at the WAC 2008,, Waikoloa, Hawaii, USA (2008)
Caetano-Anollés, G.: Database was retrieved from SCOP 1.67, KEGG Enzyme, and Phylogenetic tree of Protein Fold Architecture, (2004)
Retrieved September 2014 from http://www.genome.jp/kegg/pathway.html#metabolism
Lotz K., Bilonu L., Roska T., Hamori J.: A cellular neural network model of the time coding pathway of sound localization-hyperacuity in time, Neural Networks, 1996., IEEE International Conference on (Volume:2). IEEE (1996)
Lampert, C.H., Nikisch, H., Harmeling, S.: Attributed-base classification zero-shot-visual object categorization. IEEE Trans. Pattern Anal. Mach. Intell. 36, 453–465 (2014)
Grunz, A., Memmert, D., Perl, J.: Tactical Pattern Recognition in soccer games by means of self-organizing maps. Human Move. Sci. 31, 334–343 (2012)
Komendantskaya, E.: Unification neural networks: unification by error-correction learning. Logical J. IGPL 19, 821–847 (2011)
Kröger, B.J., Kannampuzha, J., Kaufmann, E.: Associative learning and self-organization as basic principles for simulating speech acquisition, speech production, and speech perception. EPJ Nonlinear Biomed. Phys. (2014). https://doi.org/10.1140/epjnbp15
Stein, G., Chen, B., Wu, A.S., Hua, K.A.: Decision tree classifier for network intrusion detection with GA-based feature selection. ACM-SE 43(2), 136–141 (2005)
Twomey, J.M., Smith, A.E.: Performances measures consistency, power for artificial neural networks models. Mathl. Comput. Model. 21, 243–258 (1995)
Reuter, M., Bohlmann, S.: Automatic Detection of Buried Utilities in Georeferenced Multi-Sensor Data with Neural Networks. Presented at the TOK, Izmir, Turkey August (2011)
Goodwin, C.R., Sherrod, S.D., Marasco, C.C., Bachmann, B.O., Schramm-Sapyta, N., Wikswo, J.P., McLean, J.A.: Phenotypic mapping of metabolic profiles using self-organizing maps of high-dimensional mass spectrometry data. Anal. Chem. 86, 6563–6571 (2014)
Ueno K., Mineta K., Ito K., Endo T.: Exploring functionally related enzymes using radially distributed properties of active sites around the reacting points of bound ligands. Struct. Biol. (2012). http://www.biomedcentral.com/content/pdf/1472-6807-12-5.pdf
Vikas, Chaudhary V., Ahlawat, A.K., Bhatia, R.S.: Growing neural networks using soft competitive learning. Int. J. Comput. Appl. 21, 1–6 (2011)
Bishop, G.M., Svense, M., Williams, K.J.: GTM: The generative topographic mapping.”. Neural Comput. 10, 215–234 (1998)
Psichogios, D.A., Ungar, L.H.: A hybrid neural network-first principles approach to process models. AIChE J. 38, 1499–1511 (1992)
Fernandes, F.A.N., Lona, M.F.L.: Neural network applications in polymerization. Braz. J. Chem. Eng processes. (2005). https://doi.org/10.1590/S0104-66322005000300009
Khan S., Xekalakis P., Cavazos J., Cintra M.: Using predictive modeling for cross-program design space exploration in multicore systems. In PACT. (2007) http://homepages.inf.ed.ac.uk/mc/Publications/pact07.pdf
Hessine, M.B., Saber, S.B.: Accurate fault classifier and locator for EHV transmission lines based on artificial neural networks. Math. Problems Eng. 2014, 1–19 (2014)
© 2021 Springer Nature Switzerland AG

Israelowitz, M. et al. (2021). Neural Networks for Modeling Metabolic Pathways. In: Israelowitz, M., Weyand, B., von Schroeder, H., Vogt, P., Reuter, M., Reimers, K. (eds) Biomimetics and Bionic Applications with Clinical Applications. Series in BioEngineering. Springer, Cham. https://doi.org/10.1007/978-3-319-53214-1_12

Print ISBN: 978-3-319-53212-7
Online ISBN: 978-3-319-53214-1