
1 Dynamic Instability and Natural Order

In 1963 an event took place that would soon help to change significantly the way we look at reality in all the sciences, including physics: Lorenz discovered deterministic chaos, whose foundations Poincaré had laid in 1889 with the three-body problem. Lorenz showed that a very simple model of nonlinear differential equations suffices to produce chaotic behavior in a dynamical system. In this case, despite the strict determinism of Newton's laws, one is faced with chaotic behavior caused by the extreme sensitivity of the solutions of the equations to the initial conditions: two states, as similar to each other as one likes, will move apart exponentially over time. From the impossibility, in principle and not merely in practice, of defining the initial conditions with infinite precision there follows a substantial unpredictability of the system's state, one which grows with the time elapsed from the initial instant. This is the concept of deterministic chaos: a kind of chaos in which an exponential increase in knowledge of the present is required to maintain significant contact with the past and future evolution of the system [30]. The root of randomness and unpredictability lies not in external reality or in the subject, treated as separate domains, but in the persistent relationship between the evolution times of the source and of the cognitive agent.

From an effective point of view, chaos essentially mixes the trajectories of state space. The process of "stretching and folding" disperses points over the whole attractor, rendering any kind of forecast impossible. Chaos, then, is randomness, but a randomness with a deterministic basis, one linked to a very precise setup that has elements of regularity and is characterized by a coupled interplay of external constraints and fluctuations [8]. Chaos, in other words, is the systematic construction of a collective of Bernoullian stamp within which, under specific conditions, principles of invariance and pure irregularities are expressed at the same time; it is a random process that comes to articulate itself with respect to one or more specific contextual constraints intervening on a random process already underway [9].

Thus in all the disciplines of the 1960s and 1970s (in parallel with the studies of Monod) new languages were born, suited to representing the properties of systems whose functional and structural complexity prevents one from deducing what they are from the properties of their constituents. These languages rest on the insufficiency of reductionism as the only valid scientific method, accepting the irreducibility of the various levels of organization of such systems and the impossibility of finding comprehensive explanations of their properties without resorting to historical and evolutionary categories (biological organisms, mind, social organization, economies). According to the school of Brussels, the phenomena of irreversibility and self-organization rest upon a well-defined microscopic basis. The basic idea underlying the work of this school is that irreversibility is closely linked to the notion of dynamic instability.
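The sensitive dependence on initial conditions described above can be made concrete with a minimal numerical sketch of the Lorenz system itself. The Python toy below integrates two trajectories whose initial conditions differ by one part in a billion and prints their growing separation; the classic parameter values σ = 10, ρ = 28, β = 8/3 are the standard ones, while the step size, horizon and starting point are purely illustrative choices.

```python
import numpy as np

def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz equations by one forward-Euler step."""
    x, y, z = state
    return state + dt * np.array([
        sigma * (y - x),    # dx/dt
        x * (rho - z) - y,  # dy/dt
        x * y - beta * z,   # dz/dt
    ])

# Two initial conditions differing by one part in a billion.
a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-9, 0.0, 0.0])

for step in range(40001):
    if step % 10000 == 0:
        print(f"t = {step * 0.001:4.0f}   separation = {np.linalg.norm(a - b):.3e}")
    a, b = lorenz_step(a), lorenz_step(b)
```

The separation grows roughly exponentially until it saturates at the diameter of the attractor, after which the two trajectories are effectively uncorrelated, exactly the loss of predictability with elapsed time that the passage describes.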
In the prediction of the behavior of unstable systems, in fact, it is not our lack of knowledge that is at play, but the dynamic nature of the system itself [31]. Dynamic instability, therefore, is at the origin of the concept of probability, and not vice versa. To clarify the meaning of this statement it suffices to recall that, for Prigogine [33], by subjecting a particular type of system to a given constraint we can obtain, as a result, an increase in entropy that is related, at the same time, to the emergence of a phenomenon of order [34]. The mechanism underlying this type of phenomenon is essentially a mechanism of amplification of fluctuations: far from equilibrium, fluctuations are amplified, opening the way to a range of varied possibilities. Non-equilibrium thermodynamics deals with systems that have exchanges with the environment, systems in which the change in entropy is related not only to processes occurring within the system but also to the flows of matter and energy between system and environment. In this type of system the decisive quantity is no longer the entropy but the production of entropy, the entropy change per unit time due to the processes taking place within the system [8]. As is well known, since 1967 Prigogine has called these systems "dissipative structures", structures that are a form of supramolecular organization. In these systems, therefore, unlike in equilibrium thermodynamics, where equilibrium is associated with the fall towards the most probable and least ordered state, the flow of matter and energy constitutes a driving force that generates order.
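The formal core of this point can be stated compactly in Prigogine's standard notation, in which the entropy change of an open system splits into an exchange term and an internal production term; this is the textbook formulation, recalled here only to fix the quantities the passage refers to.

```latex
% Entropy balance of an open system: exchange term (d_e S) plus
% internal production term (d_i S); the second law requires the
% production term to be non-negative.
dS = d_{e}S + d_{i}S, \qquad \frac{d_{i}S}{dt} \ge 0

% A dissipative structure is maintained in a steady state (dS = 0)
% only by exporting entropy to the environment:
\frac{d_{e}S}{dt} = -\,\frac{d_{i}S}{dt} < 0
```

It is the second relation that expresses the text's "driving force that generates order": the flow of matter and energy lets the system pay for its internal entropy production and so hold itself far from equilibrium.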

Applying this theory to biology, and following the thought of the Russian scholar, one can infer that structures adapt to external conditions, displaying a kind of pre-adaptation mechanism, and that, in conditions far from equilibrium, matter begins to be able to perceive differences in the outside world (such as gravitational or electrical fields) that would make no sense whatsoever at equilibrium, because at equilibrium matter is blind [35]. From this perspective life appears much less opposed to the normal laws of physics, much less engaged in a fight against them to stave off its normal fate, destruction. On the contrary, life seems somehow to express precisely the conditions in which our biosphere is immersed, once account is taken of all the nonlinearities of chemical reactions and of the distance from equilibrium that solar radiation imposes on the biosphere [36].

2 The Systemic Vision

In their work on the laws of chaos, Prigogine and others showed that Boolean networks are (albeit limited) logical-mathematical models of a large class of nonlinear dynamical systems [32]. The attractors of these networks can simulate the natural object of interest. From a biological point of view, following Kauffman [22], one can hypothesize that these attractors correspond to cell types, while from a cognitive point of view the attractors can be interpreted as the natural classification that the network makes of the outside world. These findings represent a prudent widening of the results obtained in non-equilibrium thermodynamics. In particular, it is important to point out that this enlargement concerns, first of all, the nature and dynamics of differentiation processes and, prospectively, the link between these processes and the subsequent formation of particular basins of attraction. In such circumstances, therefore, in my opinion, it is possible to affirm that Kauffman, using the languages of dynamics to interpret biological phenomena, develops a mathematical model (plausible from a biological point of view) that places the mystery of ontogeny within a broader theoretical framework, one in which biology suddenly finds itself in "dialogue" with other disciplines such as mathematics, physics, chaos theory, computer science and systems theory. In this context, through the theory of randomly constructed Boolean networks, the American biochemist responds definitively to the need of the Theoretical Biology Club (TBC, represented by the line of research begun by Waddington [38] with studies on canalization and genetic assimilation) to build a new organicist, non-vitalist paradigm of development, one in which biology is endowed with the power of logical and mathematical explanation that the physical sciences have always had. With this in mind, then, Kauffman, by giving biology a mathematical code and a new method of approach, offers at the same time an effective model able to give body to the original theorizing that overcame the classical dichotomy between mechanism and vitalism and which Waddington, in the 1940s, defined as the third, systemic way. It can be summarized schematically in the following points: (a) life is a phenomenon that is not solely determined by the physical and chemical laws to which it is nevertheless bound; (b) neither is there a special property of life, an intangible ingredient that directs its course; (c) the secret of the functioning of living systems is the layering of evolutionary levels, irreducible to one another yet interacting; (d) the passage from one level to another corresponds to the succession of emergent properties, produced by interactions between the different evolutionary units of each level; (e) the living object in its entirety is given by its morphological and functional organization; (f) this organization stands in a vital relationship of simultaneous continuity and autonomy with respect to physical principles [39].
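Kauffman's randomly built Boolean networks are simple enough to sketch directly. The Python toy below builds an N-K network (every node reads K randomly chosen inputs through a random Boolean lookup table), updates it synchronously, and measures the length of the attractor cycle it falls into; the values N = 12, K = 2 and the seeds are illustrative choices, not parameters drawn from Kauffman's own studies.

```python
import random

def random_boolean_network(n=12, k=2, seed=0):
    """Build a random N-K Boolean network: each of the n nodes reads k
    randomly chosen inputs and applies a random Boolean lookup table."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]

    def step(state):
        # Synchronous update: every node fires at once.
        return tuple(
            tables[i][sum(state[j] << b for b, j in enumerate(inputs[i]))]
            for i in range(n)
        )

    return step

def attractor_length(step, state):
    """Iterate until a state repeats; the repeating segment is the attractor."""
    seen = {}
    while state not in seen:
        seen[state] = len(seen)
        state = step(state)
    return len(seen) - seen[state]

step = random_boolean_network()
state = tuple(random.Random(1).randint(0, 1) for _ in range(12))
print("cycle length of the attractor reached:", attractor_length(step, state))
```

In Kauffman's reading, each such attractor would stand for a stable pattern of gene activity, that is, a cell type; different initial states falling into different basins of attraction correspond to different types.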

In light of all this, then, the systemic vision of Waddington can, in my opinion, be summed up, in agreement with the results obtained by Prigogine and Kauffman, as follows: life is an emergent phenomenon that develops when the molecular diversity of a prebiotic chemical system exceeds a certain level of complexity. If this is true, then life does not reside in any property of the individual molecules, but is a collective property of systems of molecules interacting with each other. From this perspective life emerged as a whole and has always remained a whole. In the whole that emerges and self-reproduces there is no vital force or foreign substance.

But the collective system does possess a stunning property not possessed by any of its parts. It is able to reproduce itself and to evolve. The collective system is alive. Its parts are just chemicals [23, p. 24].

In this way, therefore, in the transition from the theory of dissipative structures to the theory of self-organized Boolean networks we can actually perceive the development of a coherent and continuous line of research aimed at identifying the general principles that characterize the deep reality of that mysterious self-organization which marks the complexity of the bios.
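The threshold claim above, that a collective, self-reproducing whole appears once molecular diversity passes a critical level, is usually grounded in a connectivity phase transition of the random-graph kind. The Python toy below illustrates only that combinatorial intuition, not Kauffman's actual chemistry: nodes stand in for molecular species, random edges for catalyzed couplings, and a giant connected cluster appears abruptly once the number of edges per node passes roughly one half (mean degree one); all sizes and seeds are illustrative.

```python
import random

def largest_component(n, edges):
    """Size of the largest connected component, via union-find."""
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for a, b in edges:
        parent[find(a)] = find(b)

    sizes = {}
    for i in range(n):
        root = find(i)
        sizes[root] = sizes.get(root, 0) + 1
    return max(sizes.values())

rng = random.Random(0)
n = 2000  # "molecular species"
for edges_per_node in (0.25, 0.5, 1.0, 2.0):
    edges = [(rng.randrange(n), rng.randrange(n))
             for _ in range(int(edges_per_node * n))]
    print(f"edges/node = {edges_per_node:4.2f}  "
          f"largest cluster = {largest_component(n, edges)}")
```

Below the threshold the largest cluster stays tiny; just above it, a single cluster suddenly spans a large fraction of the whole system, which mirrors, in toy form, the idea of a collective property appearing all at once.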

3 Meaningful Complexity, Self-organization and Biological Information

Understanding the deep processes of self-organization in biological systems today requires a systemic and interdisciplinary approach. As highlighted by recent studies in the framework of an extended theory of Meaningful Complexity [9, 17, 20], one goes beyond the simple examination of Markovian-style dissipative phenomena and comes to consider the coupled processing and transformation of information present at the level of the subsequent constitution of a biological system that is itself characterized by information processing. In this context, since the end of the 1980s, some studies have tried, first of all, to illuminate the inner articulation, at the biological and cognitive level, of that particular process constituted by the emergence of the teleonomic and intentional structures underlying the processes of life in an evolutionary and co-evolutionary framework [28, 40]. This emergence is essentially tied to precise procedures for the transmission and assimilation of in-depth information, procedures that arise at the semantic and probabilistic level and that determine the subsequent articulation of a specific biological "code", considered as an operative synthesis of function and meaning and arising as an effective support for the dynamic constitution of precise teleological structures [4, 5, 10]. Here we can recognize with accuracy that particular interweaving of complexity, self-organization, intentionality and emergence of meaning that characterizes the natural forms of cognitive activity of any living system.

In the analysis of the results obtained by the Human Genome Project, in fact, both reductionism and naive holism (the latter already refuted in 1953 with the discovery of the DNA double helix) are definitively superseded by a new theoretical synthesis that deals with the emergence of meaning, one in which the parts interact with each other and with the whole, giving rise to a systemic circularity wider than that outlined, for example, by Waddington and the early Kauffman. What characterizes the bios, therefore, is no longer only teleonomy as conceived by Monod (design without intention): there begins to emerge the conception according to which life is inextricably linked to the ideas of meaning, intentionality and memory (from the cell, to the immune system, to the organ systems, all the way to the mind). According to this new interpretative framework, which addresses emergent qualities, life appears tied not only to a program written in the double helix but, above all, to the circularity of distributed programs related to self-programming, that is, to the idea of biological meaning [14, 15]. The genetic information of the organism does not reside in the initial conditions of the dynamic process of ontogeny, but in distributed programs that govern new information and make it impossible, given the initial conditions, to predict with certainty the final state of the organism in question. Today we are aware of the hidden mechanism that allows DNA, through the genetic code, to control the synthesis of proteins: the dynamics of self-programming is, in fact, the very functioning of the genome that creates the genetic information [27]. The secret of self-organization that escaped Monod, which we have identified in the concept of biological meaning, can be identified, in other words, in that creative function which generates the syntax (the genetic information of the DNA nucleotides) and which is the basis of life [3, 18].
Biological meaning, in my opinion, is the "hidden face" of genetic information: the creating and organizing function that responds to a mathematics which is, in many respects, as yet unknown, a mathematics, for example, of the infinite that goes beyond Cantor's theorem and Kolmogorov's complexity theory, and which could explain those highly complex, currently not fully explained phenomena that are unpredictable and not measurable by human reason through statistical rarity or computational incompressibility alone. We have given the name deep reality to this foundation of life that exists but is not (at the moment) understood, a reality that escapes biology, mathematics, physics and chemistry and that nevertheless allows us to study life as an emergent phenomenon, as a free and unyielding order [13]. The concept of Monodian invariance is now being revisited in the light of the emergence of meaning which, surpassing genetic determinism, completes it. The mathematical modeling (Markovian processes and Boolean algebra) that allowed the early Kauffman [22] to interpret the evolution of dynamical systems and the stochastic processes of gene expression is therefore no longer sufficient: we no longer find ourselves before a simple Markovian stochastic automaton, but before non-standard models of complex cellular automata able to channel the flow of energy in such a way that the potential hidden in this same stream is progressively revealed in ever-different ways.

In fact, as Atlan [3, 4] correctly notes, in a natural system that self-organizes the end is not established from the outside; what self-organizes is the function itself, with its meaning. The origin of meaning in the organization of the system is, therefore, an emergent property. In addition, the origin of meaning is closely related to precise options of observation. If we plan to build a complex cellular network in order to simulate the activities of a biological (cognitive) system, we must take account of the fact that the behavior of the network acquires significance not only to the extent that the result is autonomous, but also insofar as it is observed and intentionally tied to the continuous production of new possible interpretations [5]. So, for an information source to be able to show independent behavior that self-organizes, we must add to the processes of mutation, selection and special differentiation the capacities of observation, self-observation, simulation and interpretation.

From an objective point of view, it must be noted, first, that the boundary between order and chaos seems able to offer much more sophisticated tools to selection [2, 8]. In particular, it is able to offer, rather than point mutations, a wide variability, able to lead the environment to deploy itself by manifesting its hidden potential at a deep level. With reference to this particular "landscape", the constraints imposed by selective pressures on the dynamics of a dissipative cellular automaton can actually allow a more complex channeling of the inbound information flow [11]. Through the development of the process just outlined, the genome therefore determines the progressive construction of a specific channel for a further partial expression of the deep informational content and for the revelation of new forms of incompressibility. Thus the information source, in order to achieve a stable form of expression (a new order), must encapsulate itself in specific generative properties.
These properties must be inscribed in the physical matter of the system so as to give rise to the possibility of generating the varied complexity produced by the properties themselves [12].
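To make "dynamics at the boundary between order and chaos" less abstract, the Python sketch below runs an elementary cellular automaton under rule 110, a conventional textbook stand-in for edge-of-chaos behavior (neither frozen nor purely random, and even computationally universal). It is offered purely as an illustration of the class of models mentioned, not as the dissipative cellular automata the text has in mind; the lattice size and seed position are arbitrary.

```python
def eca_step(cells, rule=110):
    """One synchronous update of an elementary cellular automaton:
    each cell's next value is the rule's bit indexed by the 3-bit
    neighborhood (left, self, right), with periodic boundaries."""
    n = len(cells)
    return [
        (rule >> ((cells[(i - 1) % n] << 2)
                  | (cells[i] << 1)
                  | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

cells = [0] * 40
cells[20] = 1  # a single seed cell
for _ in range(20):
    print("".join(".#"[c] for c in cells))
    cells = eca_step(cells)
```

Each printed row is one synchronous update; the irregular yet structured triangles that develop are the visual signature of this intermediate regime between order and chaos.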

4 Towards a Semantics of Biological Processes

In this theoretical framework, therefore, it seems clear that, in order to realize the old TBC project of constructing a theoretical biology independent of chemistry and physics, the observations, remodulations and abstract designs regarding a statistical mechanics of a renewed nature, identified and pursued by Kauffman in The Origins of Order and repeatedly revisited in his later texts, do not suffice. To build at the biological level a statistical mechanics concerning genes and macromolecules (in action), it is necessary to reckon fully with deep information: information, namely, that is not measurable with the tools offered by traditional Shannon information theory, which is based on a mathematics too simple and thus "incompatible" with the complexity of vital phenomena. We must, in other words, define, as previously mentioned, the principles of a new algorithmic information theory (i.e. a new complexity theory), not exclusively linked to a propositional basis but articulated at the level of a logical dimension of a predicative and stratified character. Such a theory of complexity should be able, among other things, to show us how it is possible to speak, without any contradiction, of the non-existence of finite algorithms for problems that are well posed in terms of existence and uniqueness (such non-existence is a necessary point of departure, just as, on the physical side, in agreement with Prigogine, the existence of randomness rooted in dynamics is a primitive given). This also implies the elaboration of an intensional and hyper-intensional semantics for recurrent processes of self-organization, and the construction of automaton simulation models endowed with intensional bases and with reflexive and interpretive functions. In this sense, the first stage of so vast a project should be to refer, at least from an abstract point of view, to the attempts under way to delineate new conceptual principles defining an appropriate logical background for a correct semantics of biological processes [7, 11, 37]. These efforts have so far focused on the provisional definition of at least two new central concepts: truth considered not as invariance but as emergence, and the model that self-organizes. As regards the first, we are no longer faced with a notion of truth as a simple form of invariant propagation within the frame of a monotonic logical structure. Truth now seems definable, as Carsetti correctly notes, only by reference to non-monotonic procedures, to the second-order level, to the actions of coupled systems, to the existence of specific intensional functions. In this sense the emergence of truth seems specifically linked not only to the processes of revelation and articulation of the original information source, but above all to the preliminary distinction between surface information and depth information [12, 21]. What characterizes even better, however, that particular paradigm revolution in the semantics of our time, represented by the actual delineation of an adequate semantics of processes, is perhaps the general concept of the inherently self-organizing model, associated with the dynamic aspects of the concept of meaning.
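To fix the contrast the passage draws, it may help to recall the two standard measures side by side; these are the textbook definitions, not the "new" theory the text calls for. Shannon entropy is a statistical, ensemble-level (propositional) quantity, while Kolmogorov complexity assigns a value to each individual string, is well defined for every string, and yet is provably not computable: exactly the kind of principled non-existence of a finite algorithm for a well-posed question that the passage invokes.

```latex
% Shannon entropy of a source with symbol probabilities p(x):
% a statistical, ensemble-level measure.
H(X) = -\sum_{x} p(x)\, \log_2 p(x)

% Kolmogorov complexity of an individual string x: the length of
% the shortest program p that outputs x on a universal machine U.
% K(x) is well defined for every x, but no algorithm computes it.
K(x) = \min \{\, \lvert p \rvert : U(p) = x \,\}
```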
At this level we are no longer tied only to the existence of individuals and of invariant logical forms, as in the theoretical constructions of Russell, Tarski and Henkin; we are, on the contrary, tied, first of all, to the articulation of specific and complex generative attributes, that is to say, to the existence of attractors, operational closures, recurring flows of information and, in general, a multi-tiered architecture of self-organization [24, 25]. And it is precisely with regard to this particular type of logic that the domains of individual objects can then begin to articulate their existence, appearing, on the logical level, as the end result of the effective articulation of certain specific processes, in particular an internal building process and an effective functional partition process, where information flows represent "the true fibre" of the structure of the dynamic model itself [12]. We are therefore once again faced with some of Prigogine's brilliant insights. According to the Russian scholar, in fact, to explain irreversibility (and stochasticity) one must consider states exhibiting a temporal symmetry breaking, propagated through laws which are themselves due to a breaking of symmetry [26]. Temporal symmetry breaking, in that context, thus represents an essential tool for developing a new level of understanding in which rationality is no longer identified with the idea of certainty. Similarly, it is possible to say that in the semantics of processes we witness the gradual introduction of concepts related to particular conditions of symmetry breaking occurring at the logical and informational level, such as, for instance, the concepts of partition and of self-organizing models. However, if on the one hand it is also by virtue of an intuition of Prigogine that today we can penetrate new territories of semantics, a terrain that appears strictly determined by the progressive expansion of evolutionary processes [6, 29], on the other hand it should be noted once again that at the current level of semantics, as at the level of a theory of multidimensional information, we are no longer confined within the limits of simple Markovian frames. This fact constitutes a real dividing line with respect to the formal apparatus developed by Nicolis and Prigogine in their exploration of the mathematical basis of the theory of complexity [1]. Within the framework of process semantics, therefore, we really need to resort to the delineation of new and more complex informational spaces, of new measures (new axioms) of complexity that can express themselves not only at the propositional level (as, for example, Shannon entropy does), but also at the level of the first and second order [8]. It is only with reference to these more complex informational spaces that the functional partition processes and the processes of dynamic self-organization can be defined according to mathematical models more adequate to describe the phenomena of life [11, 16, 19]. From this perspective, then, we are no longer dealing with only two conceptions of time, time as repetition (invariance) and time as disintegration (dissolution): we find ourselves before a third concept of time, able to overcome this dualism, time as construction, a construction that appears to our eyes simultaneously as creation and as rediscovery, even though this same construction passes through specific states of degradation and invariance.
This weft presents itself at the same time as creation and as revelation: as the creation of new forms of autonomy and, simultaneously, as the continuing revelation of new levels of generative power, an emergence of ever new meanings that successively and ever more closely shape the determinations of time, which in turn form, on the basis of precise mathematical forms, the variegated and constrained expression of the language of life.