1 Introduction

To our common sense, information is the occurrence of something unexpected. We all agree with this definition, although it lacks clarity. Unexpected to whom? On what grounds? And about what? On second thought, the use of “informative” as a predicate seems to be more appropriate than the hypostatized use of “information” as an abstract noun. After all, we find surprisingly informative the occurrences that sensitize our perceptual apparatus, and we do so through synthetic judgments. We do not intuit “information” as an intellectual object per se, without experience, as we can do with the aprioristic predicates of “space” and “time.” The intrinsic semantic dynamism of information, always contextual, should exclude a purely syntactic and structural definition. Conversely, information seems to be naturally pragmatic and a consequence of semantic processes linked to psychical phenomena. However, the concept of information has lost its meaning and has even been reified as an elemental component of the universe. From a general predicate under the logic domain, information has come to be considered a measurable and countable quantity under the domain of physics and mathematics.

It is not my purpose here to elucidate how such a metamorphosis happened. Here I will only acknowledge that the result of this process is that for contemporary Western science, the concept of information has been reduced to that of entropy and algorithmic complexity. I guess that, to a great extent, this is a consequence of the hegemony of cybernetics in scientific culture after the union of the theories of Alan Turing and Claude Shannon. The infamous anecdote is that Shannon was advised by John von Neumann, one of the founding fathers of cybernetics, to equate his logarithmic–probabilistic concept of information with entropy to reduce the possibility of critical questioning of his seminal work on the mathematical theory of communication, “since no one understands what entropy means.” However, we may doubt whether this clever stratagem, which seems to have worked so far, did not betray Boltzmann’s noble intention of seeking a mathematical–scientific foundation for the empirical discoveries of thermodynamics.

I will proceed backward through the story, providing short accounts of Shannon Information (the currently dominant scientific paradigm) and Fisher Information (its close predecessor). I will then introduce Peirce’s definition of information as a logical quantity of symbols and show how it relates to perception and experience. I will also point out some similarities with the other two previous descriptions through logarithmic calculations and conclude by introducing Peirce’s extreme realism as the grounds for scientific metaphysics. Here final causation is the core idea to bring information and meaning under the same frame. Pragmatism might be defined as a method to extract real information and infuse it into our shared beliefs, as our representations evolve (albeit never achieving a definitive, ultimate state) toward a genuine final opinion.

Let us start with Shannon Information, which is commonly described as a measure of the number of binary distinctions that would be necessary to bring a disordered system into an orderly state or to describe it completely. Since it ultimately depends on the system’s physical disorder, it can indeed be reduced to entropy. As the entropy of a system necessarily grows in time because of the second law of thermodynamics, the amount of information necessary to describe it increases accordingly. Moreover, since entropy is a statistical average taken over the logarithms of the probabilities of a system’s possible states, so is Shannon Information (Shannon 1948). The larger the number of possible, probable states, the larger the ignorance about which state is the current actual one, and the larger the amount of information needed to describe the system by a sequence of binary distinctions. When entropy is maximal, Shannon Information is simply log₂ n (the base-2 logarithm of n), where n is the number of probable states. Conversely, if a well-known system is in a state so well ordered that it does not change at all in time (totally redundant), then no Shannon Information is needed to describe it.
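As a minimal numerical sketch of the claim above (my illustration, not the author’s), the following snippet computes Shannon entropy in bits: for n equally probable states it returns log₂ n, and for a system locked in a single known state it returns zero.

```python
# Illustrative sketch: Shannon entropy H = sum(p_i * log2(1/p_i)) in bits.
# For a uniform distribution over n states (maximal entropy), H = log2(n);
# for a totally redundant system (one certain state), H = 0.
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

n = 8
print(shannon_entropy([1 / n] * n))      # 3.0 == log2(8): maximal ignorance
print(shannon_entropy([1.0, 0, 0, 0]))   # 0.0: a fully ordered, unchanging state
```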

The contemporary hegemony of Shannon’s definition tends to cloud other earlier and equally interesting mathematical formulations. For instance, in the 1920s, the British statistician Ronald Fisher (1922) had already proposed a logarithmic–mathematical definition of the concept of information, with the advantage that it preserved an observer’s experience as ballast. Fisher developed his concept of information as a way to measure the degree of confidence in observational experience. Mathematically, Fisher Information is always about observing data concerning a parameter a (Frieden 2004; Frieden and Romanini 2008). The data x collected during any observation varies randomly because no real measurement is free of noise or imperfection in the measuring device. Fisher Information can be expressed as I = ⟨[(d/dx) log p(x)]²⟩, where ⟨ ⟩ denotes the average, p(x) is the probability density of the variable x, and d/dx is its variation. The variable x is defined as data carrying information about a. Fisher Information is then a measure of the “width” of the probability density p(x). That is why it can also be expressed through a probability amplitude curve. A wide amplitude curve means less information, and a narrow amplitude curve means more information. Another important feature is that the variation of x expressed by d/dx can be interpreted as the expression of dynamics in time, a property that is lacking in Shannon Information.
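To make the “width” remark concrete, here is a small numerical sketch of my own (not from the text), applying the expression just given to a Gaussian probability density: the computation returns roughly 1/σ², so halving the width of the curve quadruples the Fisher Information.

```python
# Illustrative sketch of I = <[(d/dx) log p(x)]^2> for a Gaussian density,
# estimated numerically. Narrower curves carry more Fisher Information.
import numpy as np

def fisher_information(sigma, half_width=10.0, steps=100_001):
    x = np.linspace(-half_width * sigma, half_width * sigma, steps)
    dx = x[1] - x[0]
    p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    score = np.gradient(np.log(p), x)      # (d/dx) log p(x)
    return np.sum(score**2 * p) * dx       # average of the squared score

for sigma in (2.0, 1.0, 0.5):
    print(sigma, round(fisher_information(sigma), 3))   # ~0.25, ~1.0, ~4.0 = 1/sigma^2
```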

Fisher Information is therefore local and measured from the logarithm of the density of registrations made on an observable parameter. Fisher offered a mathematical way to measure our experiments’ effectiveness in the physical world, while Shannon proved that everything solid melts into the air, so to speak. If Fisher had a ground wire connected to physical existence, Shannon offered us bits that navigate freely in the virtuality of the real but without any sense. Crushed by the uni-dimensionality of the binary digital code, framed by restricted Boolean logic (in which the principles of identity and the excluded middle are essential), and cluttered with the numerical language of 0s and 1s that allows computers to perform algorithmic tasks, information has gained ontological accuracy but lost its meaning (Machado and Romanini 2011). A non-local and systemic quantity has been transformed and wholly dissociated from the conditions of possibility of an observer’s/interpreter’s experience.

The story could have been different if the semiotic concept of information developed by Charles Sanders Peirce had been more widely studied and understood before Fisher and Shannon started their studies. To our knowledge, Peirce was arguably the first scientist ever to deal with the concept of information scientifically and systematically. As early as 1865, he defined information as a third logical quantity of symbols, in addition to denotation (extension, or also breadth) and connotation (comprehension, or also depth). During his half-century of philosophical production, Peirce’s understanding of symbols evolved from the Kantian and nominalistic approach of his youth to an Aristotelian and extreme realist position in his late writings. In this mature view, the class of symbols covers everything capable of transmitting a form from an antecedent to a consequent in the process of triadic relations that Peirce calls semiosis, the action of signs—be it a communicative assertion, the work of bees, the action of a complex molecule in our immune system, or even the growth of crystals. In the limit, every conceivable universe where chance events are subsumed in new creative patterns is analogous to a developing symbol. The corollary is that information is the outcome of universal semiosis, which is pervasive in the cosmos.

2 Peirce’s Definition of Information

From 1883 to 1891, as Peirce tried to make ends meet after a series of professional failures and personal setbacks, he worked as a freelance lexicographer for the Century Dictionary, contributing as a writer or editor of more than 15,000 definitions. One of his entries was the definition of “Information” reproduced below.

[Figure a: Peirce’s entry for “Information” in the Century Dictionary]

As the entry reveals, Peirce’s concept of information is at the same time:

1. Logical in the traditional sense (“the aggregate of characters predicated of it over and above what is implied in its definition,” as was already found in Abelard’s nominalism).

2. Semiotic, understood as the expanded doctrine of logic (“the sum of synthetical propositions in which the symbol is either subject or predicate”). Here Peirce quotes himself, a clear indication that he recognizes the originality and uniqueness of his contribution to the concept.

3. Metaphysical (“the imparting of form to matter”), which resonates with Aristotle’s hylomorphism.

Let us start with the first sense of information, the logical one. As mentioned above, Peirce introduced information as the third quantity of symbols as early as 1865, when he was 26 years old. Symbols can be concepts or the terms of propositions (such as the subject and predicate), but also any other general representation used to communicate, no matter how complex:

Symbols or general representations … connote attributes and so connote them as to determine what they denote. To this class belong all words and all conceptions. Most combinations of words are also symbols. A proposition, an argument, even a whole book may be, and should be, a single symbol. (W1: 467, 1866)

Peirce then explains his “Abelardian” definition of information:

The information of a term is the measure of its superfluous comprehension. That is to say that the proper office of the comprehension is to determine the extension of the term. For instance, you and I are men because we possess those attributes—having two legs, being rational, &c.—which make up the comprehension of man. Every addition to the comprehension of a term lessens its extension up to a certain point, after that further additions increase the information instead. (W1: 467, 1866)

The quotation above exposes Peirce’s originality. Although he adopts the Abelardian definition of information, Peirce fits it into the traditional scheme of inverse proportionality of comprehension and extension, which was formulated by the logicians of Port Royal. For them, comprehension and extension (also known as depth and breadth, and sometimes even connotation and denotation) were taken to be related in such a way that the growth of one would diminish the other. Instead of treating the product “Comprehension × Extension” as fixed, Peirce writes a new formula: “Comprehension × Extension = Information.” He defines information as the quantity that can increase comprehension without diminishing extension or can increase extension without diminishing comprehension, or can even simultaneously increase both. So if I say the complex term “Caucasian man,” I have lessened the extension while increasing the comprehension in comparison with saying the simple term “man” only. However, if I say “non-black Caucasian man,” the addition of “non-black” as an increase in comprehension is superfluous and does not lessen any amount of extension, for “Caucasian” implies being “non-black.”

So Peirce concludes, “(i)f we learn that S is P, then, as a general rule, the depth of S is increased without any decrease of breadth, and the breadth of P is increased without any decrease of depth” (CP 2.420). In sum, in the 1860s, information for Peirce is a correlated quantity of a triadic relation, which also includes extension (all the real things of which a symbol can be predicated) and comprehension (all the general predicates involved in the definition of a symbol). Since this process of conceptual synthesis can be carried on indefinitely without ever coming to an end, Peirce’s quantity of information implies a fallible state of knowledge, “which may range from total ignorance of everything except the meanings of words up to omniscience.”
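To make this breadth/depth bookkeeping visible, here is a deliberately crude sketch of my own (the toy universe and traits are invented for illustration): a term is modeled as a set of predicates, its extension as whatever in the universe satisfies them all, and a predicate already implied by the others adds depth without subtracting anything from breadth.

```python
# Toy illustration (not Peirce's own) of Comprehension x Extension = Information.
# A term's comprehension is a set of predicates; its extension is whatever in a
# small universe satisfies all of them.
universe = {
    "Alice": {"human", "two-legged", "rational", "caucasian", "non-black"},
    "Bob":   {"human", "two-legged", "rational", "black"},
    "Fido":  {"dog", "four-legged"},
}

def extension(comprehension):
    return {name for name, traits in universe.items() if comprehension <= traits}

man           = {"human", "two-legged", "rational"}
caucasian_man = man | {"caucasian"}
redundant     = caucasian_man | {"non-black"}   # implied here by "caucasian"

print(extension(man))            # {'Alice', 'Bob'}
print(extension(caucasian_man))  # {'Alice'}: more depth, less breadth
print(extension(redundant))      # {'Alice'}: more depth, breadth untouched
```

In Peirce’s terms, the last step is “superfluous comprehension”: the added predicate no longer trades off against extension, and that surplus is precisely what he counts as information.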

3 Information in Perception, Cognition, and Experience

Beginning in the 1870s (when he wrote his famous cognitive series of articles), Peirce’s concept of information became an essential part of his theory of perception and cognition, which relates to the second aspect of his entry in the Century Dictionary. It was continuously generalized to cover the whole of experience, from perception to scientific inferences. From this point on, we will see Peirce’s understanding that not only symbols but also indices (quantifiers) and then icons (diagrams) are important for a complete theory of information. Information is grounded on habitual forms that are not available to reason, as in the case of instinct. Peirce explains that a state of information is directly dependent on the knowledge we gather in experience, since perception, the starting point of experience, is the only door we have to let in new information. In CP 7.587, he explains that “perception is the possibility of acquiring information, of meaning more,” pointing out that the reality of his three categories and corresponding types of signs (icons, indices, and symbols) should be accounted for to produce a complete theory of information as a cognitive process of learning from experience—of meaning more.

Whenever we have an informative synthetic judgment such as the proposition “The orange in my hand is sweet,” we see that the predicate “sweet” gains extension, for the class of known sweet things now has a new element, while the subject “the orange in my hand” gains comprehension. In turn, the object “orange” denoted by me now has a new and increased comprehension—i.e., besides being known to be a tropical, round, juicy, citric, orange-colored fruit, it is now known to be sweet. All scientific reasoning proceeds in the same manner: we note some interesting facts in our experience and then search for an explanatory hypothesis capable of subsuming these novelties under a larger and more comprehensive logical whole.

While keeping information as the ampliative result of a synthetic judgment, as an interpretant of a symbol produced by experience, Peirce is now advocating a much fuller view of information as connected to non-conscious feelings, volitive actions, and pragmatic consequences. The meaning of a concept, word, term, symbol, etc., is the sum of all general effects that would naturally precipitate from its adoption by a community of interpreters, and the meaning of symbols may grow as novel general effects are incorporated above their usually known (i.e., familiar) definitions, which are always merely verbal:

By information, I mean all that knowledge that we collect from the experience of ourselves and of others. Now I call any acquisition of Knowledge “information,” which has logically required any other experience than the experience of the meanings of words. I do not call the knowledge that a person known to be a woman is an adult nor the knowledge that a corpse is not a woman, by the name of “Information,” because the word “woman” means a living adult human being having, or having had, female sexuality. Knowledge that is not Informational may be termed “verbal.” (MS 664, 19, 1910)

3.1 Collateral Information

At this point, we must introduce the concept of collateral experience, or collateral information, which covers precisely the amount of information that interpreting minds must possess as a repertoire before engaging in the process of interpretation.

By collateral observation, I mean previous acquaintance with what the sign denotes. Thus, if the Sign be the sentence “Hamlet was mad,” to understand what this means one must know that men are sometimes in that strange state; one must have seen madmen or read about them; and it will be all the better if one specifically knows (and need not be driven to presume) what Shakespeare’s notion of insanity was. All that is collateral observation and is no part of the Interpretant. But that which the writer aimed to point out to you, presuming you to have all the requisite collateral information, that is to say just the quality of the sympathetic element of the situation, generally a very familiar one—a something you probably never did so clearly realize before—that is the Interpretant of the Sign—its “significance.” (CP 8.179)

To sum up, collateral information is the previous knowledge about the object of representation that provides adequate comprehension of a symbol. As a familiarity, it depends on a “mental habit,” or a belief that grounds the meaning of all predicates implied in a symbol’s definition. In short, collateral information is what we already know about the object, while semiotic information is what we newly learn about it.

3.2 Is Collateral Information Logarithmic?

The familiarity of collateral information relates to the redundancy of mental habits (our beliefs) as much as the novelty of information relates to the unexpectedness of perception. There can even be a logarithmic account of this relation in Peirce’s concept of chance. Suppose the inquiry task is to break wrong old habits (usually grounded on unscientific methods such as blind reliance on authority, stubbornness, and aprioristic misconceptions) and build new and more accurate ones. In that case, the research work seems to involve a minimax optimization equation quite similar to logarithmic relations. In Peirce’s own words:

[Knowledge has a “money value” that] increases with the fullness and precision of the information, but plainly it increases slower and slower as the knowledge becomes fuller and more precise. The cost of the information also increases with its fullness and accuracy, and increases faster and faster the more accurate and full it is. It therefore may be the case that it does not pay to get any information on a given subject; but, at any rate, it must be true that it does not pay (in any given state of science) to push the investigation beyond a certain point in fullness or precision. (CP 1.122)
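The passage lends itself to a simple reading in cost–benefit terms. The following sketch is my own illustration (the particular curves are assumptions, not Peirce’s): let the value of knowledge grow ever more slowly with precision while its cost grows ever faster; there is then a definite point beyond which further inquiry does not pay.

```python
# Illustrative only: a saturating (logarithmic) value of information against an
# accelerating (quadratic) cost of obtaining it, and the precision that
# maximizes the net worth of the inquiry.
import math

def net_worth(precision, value_scale=10.0, cost_scale=0.5):
    value = value_scale * math.log(1 + precision)   # grows slower and slower
    cost = cost_scale * precision ** 2              # grows faster and faster
    return value - cost

best = max(range(21), key=net_worth)
print(best, round(net_worth(best), 2))   # the point beyond which it "does not pay"
```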

In situations such as this, what we see is a game-like dispute between the inquirer and the object of inquiry, which we may consider as Nature in general. The best strategy for the inquirer is to formulate a hypothesis as close as possible to an even choice and eliminate (as quickly and as much as possible) wrong answers to a question:

The game of twenty questions is instructive. In this game, one party thinks of some individual object, real or fictitious, which is well-known to all educated people. The other party is entitled to answers to any twenty interrogatories they propound which can be answered by Yes or No, and are then to guess what was thought of, if they can. If the questioning is skillful, the object will invariably be guessed; but if the questioners allow themselves to be led astray by the will-o’-the-wisp of any prepossession, they will almost as infallibly come to grief. The uniform success of good questioners is based upon the circumstance that the entire collection of individual objects well-known to all the world does not amount to a million. If, therefore, each question could exactly bisect the possibilities, so that yes and no were equally probable, the right object would be identified among a collection numbering 2²⁰. Now the logarithm of 2 being 0.30103, that of its twentieth power is 6.0206, which is the logarithm of about 1,000,000 (1 + .02 × 2.3) (1 + .0006 × 2.3) or over one million and forty-seven thousand, or more than the entire number of objects from which the selection has been made. Thus, twenty skillful hypotheses will ascertain what two hundred thousand stupid ones might fail to do. The secret of the business lies in the caution which breaks a hypothesis up into its smallest logical components, and only risks one of them at a time. (CP 7.220)
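A quick check of the arithmetic in the quotation (my verification, not part of the text) confirms the point: twenty perfectly bisecting yes/no questions distinguish 2²⁰ objects, more than the million or so objects Peirce estimates to be well known to everyone.

```python
# Verifying the twenty-questions arithmetic quoted above.
import math

print(2 ** 20)                 # 1048576: "over one million and forty-seven thousand"
print(20 * math.log10(2))      # 6.0206...: the logarithm Peirce computes
print(math.log2(1_000_000))    # ~19.93: fewer than twenty ideal bisections suffice
```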

For Peirce, the natural choice is to consider the logarithm of chance as the best measure of our attachment to a belief. He wants to ground his theory of information on experience, and chance is defined as the simple ratio of the number of successes to the number of failures in a given experiment:

Any quantity which varies with the chance might, therefore, it would seem, serve as a thermometer for the proper intensity of belief. Among all such quantities there is one which is peculiarly appropriate. When there is a very great chance, the feeling of belief ought to be very intense. Absolute certainty, or an infinite chance, can never be attained by mortals, and this may be represented appropriately by an infinite belief. As the chance diminishes the feeling of believing should diminish, until an even chance is reached, where it should completely vanish and not incline either toward or away from the proposition. When the chance becomes less, then a contrary belief should spring up and should increase in intensity as the chance diminishes, and as the chance almost vanishes (which it can never quite do) the contrary belief should tend toward an infinite intensity. Now, there is one quantity which, more simply than any other, fulfills these conditions; it is the logarithm of the chance. (CP 2.676)

Amazingly, Peirce concludes that an even chance (1/1) means a probability of (1/2), which he calls “randomness.” It is not hard to imagine how chance can be translated into probabilistic statistics in such a way as to place the concept of total randomness as maximum entropy, where a maximum number of binary-choice experiments would have to be performed by a scientific mind. Both Fisher Information and Shannon Information would have a place in this translation, and this is work waiting to be accomplished by information theorists.
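Read this way, Peirce’s “logarithm of the chance” is what statisticians now call the log-odds. A small sketch of my own makes the behavior he describes explicit: the quantity vanishes at an even chance, grows with confidence, and diverges only in the unattainable limit of absolute certainty.

```python
# Illustrative: belief intensity as the logarithm of the chance (odds p/(1-p)).
import math

def belief_intensity(p):
    return math.log(p / (1 - p))    # the modern "logit", or log-odds

for p in (0.01, 0.25, 0.5, 0.75, 0.99):
    print(p, round(belief_intensity(p), 3))
# p = 0.5 -> 0.0: even chance, no inclination either way
# p -> 1 or 0: intensity tends toward +/- infinity, never reached by mortals
```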

4 Sensations, Non-conscious Abductions, and Instinct

In our usual cognitions, the most fundamental level of information is given by qualities of feeling, which are purely iconic and yield only emotional effects. They are the cradle of all sorts of information and the origin of all sensations that, ultimately, ground our predicates. We will not deviate too much into Peirce’s pragmatism, but we must recall that abductions or hypotheses are the starting point of all knowledge. During perception, abductions are responsible for synthesizing new predicates, which come first as hypothetical sensations and develop as accepted general terms. In perception, Peirce explains, non-conscious synthetic inferences reduce a multitude of complex feelings and outward irritations into one generalized sensation, which becomes the general primary state of information of a particular mind. This embodied iconic information turns out to be the primordial quantity of Peirce’s logic system, which grew to become his general semiotics.

From our perceptual judgments to our most complex scientific inferences, the whole process can be described as a semiotic endeavor to capture the real qualitative forms and present them as assertions (synthetic judgments) composed of separated predicates and subjects. This means that concepts are not produced by any transcendental synthesis that welds together sense impressions, as Kant stated. On the contrary, they begin with a feeble qualitative hypothesis that springs from unconscious inferences as undifferentiated wholes—or “percepts.” They only become assertions much later in the pipeline of experience, when subject and predicate are dissociated as different quantities (breadth and depth) and then reunited by reasoning.

The judgement [sic], “This chair appears yellow,” separates the color from the chair, making the one predicate and the other subject. The percept, on the other hand, presents the chair in its entirety and makes no analysis whatever. (CP 7.631)

This reductio ad unum is conjectural, non-conscious, and immediately connected to reality, which explains how “wild guesses” given in instinct are usually more reliable in practical matters than piecemeal intellectual reasoning:

Instinct is capable of development and growth—though by a movement which is slow in the proportion in which it is vital; and this development takes place upon lines which are altogether parallel to those of reasoning. And just as reasoning springs from experience, so the development of sentiment arises from the soul’s Inward and Outward Experiences. Not only is it of the same nature as the development of cognition; but it chiefly takes place through the instrumentality of cognition. (CP 1.637)

5 Information in Its Semiotic Minute Aspects

Peirce’s theory of signs involves a complex network of sign types connected with an intricacy of philosophical and metaphysical positions regarding the nature of the real and our ability to know from experience, which could not be tackled unless we took a long detour into Peirce’s method of pragmatism. However, that is not the aim of this chapter. For our limited purposes, it suffices to say that its most important feature is the triadic character of its relations, bearing directly on the problem of information we are dealing with. In Peirce’s doctrine, the sign is a logical entity that could be broadly defined as a medium that conveys information from mind to mind (or quasi-mind, for the mind here is also defined as a logical entity).

The sign’s task is to create in the mind of an interpreter a representation of something other than itself. “Object” is what the sign represents, the aspect of the sign relation that provides the sign with its “aboutness.” The term “object” here is not necessarily an existent thing or fact, such as a chair or the falling of a specific rock, although it can certainly be any of these too. The sign’s “object” is defined as a logical position and can be translated as the “subject” of representation without losing much logical meaning. More psychological approaches would prefer “subject” as the best term, but Peirce stresses that once the Copernican revolution of putting thought before the thinker is done (since we are in thoughts and not the other way around), “object” and “subject” become fundamentally identical in semiosis.

The receiver is the interpreter, or, if we get rid of the psychological component, we might call it the interpretant, which can be understood as the general tendency, or power, that any symbol possesses to determine specific kinds of effects, even if this tendency is never actualized, let alone developed to its full consequences. A message in a bottle is a symbol even if it is buried forever in the deep ocean. Nevertheless, Peirce’s late extreme realism goes further and professes that a hidden fossil fish in a rock is also a potential symbol, albeit in a latent state until a proper reasoning mind eventually actualizes its possible general interpretant:

If, for example, there be a certain fossil fish, certain observations upon which, made by a skilled paleontologist, and taken in connection with chemical analyses of the bones and of the rock in which they were embedded, will one day furnish that paleontologist with the keystone of an argumentative arch upon which he will securely erect a solid proof of a conclusion of great importance, then, in my view, in the true logical sense, that thought has already all the reality it ever will have, although as yet the quarries have not been opened that will enable human minds to perform that reasoning. For the fish is there, and the actual composition of the stone already in fact determines what the chemist and the paleontologists will one day read in them. … It is, therefore, true, in the logician’s sense of the words, although not in that of the psychologist’s, that the thought is already expressed there. (EPII: 455, 1911)

Since the object must be logically absent to be represented in the sign, the efficient cause of this representation must be a “real form,” or general potentiality, that in semiosis becomes an actual dynamic agent—or utterer of the communicated form. The third aspect of the sign relation is the interpretant, which is a consequence (possible, actual, or eventual) of the object’s representation by the sign. As the effect of signification, the interpretant is taken to be the final cause of the process of semiosis. In a more direct phrase, the sign’s function is to transmit the form of the object to the interpretant. In this sense, the sign is the medium of communication of a form par excellence:

That which is communicated from the object through the sign to the interpretant is a Form. It is not a singular thing; for if a singular thing were first in the object and afterward in the interpretant outside the object, it must thereby cease to be in the object. The Form that is communicated does not necessarily cease to be in one thing when it comes to be in a different thing, because its being is a being of the predicate. The Being of a Form consists in the truth of a conditional proposition. Under given circumstances, something would be true. The Form is in the object, entitatively we may say, meaning that that conditional relation, or following of consequent upon reason, which constitutes the Form, is literally true of the object. In the sign the Form may or may not be embodied entitatively, but it must be embodied representatively, that is, in respect to the Form communicated, the sign produces upon the interpretant an effect similar to that which the object itself would under favorable circumstances. (MS 283; partially reprinted in EPII: 371–397, 1906)

The form present in the semiotic object is not so hidebound by strict habits as to be considered dead and then easily coded into bits, but is an active one that influences and guides the whole semiosis process. A dead form might be useful for the automatic regulation of machines. However, the form that runs in the artery of semiosis is closer to Leibniz’s “vis viva,” although with a significant difference: it is not conservative, nor does it strictly obey the principle of least action; on the contrary, it is in continuous development and evolution due to the very process of semiosis in which it takes part—a property that could well be called continuous self-organization, or self-formation. In-formation is thus the internalization of novel forms by the sign in its process of continuous transformation.

This objective account of the form as an emanation from a dynamic object, which is ultimately the very essence of reality (its “species,” as the scholastic philosophers would put it), is the basis for Peirce’s so-called objective idealism. This objective idealism, in turn, is an interpretation of Plato’s philosophy that accepts most of Aristotle’s contribution as well as a particular reading of Darwinism, by which chance variations in the characteristics of a “species” are fortuitous (i.e., undetermined) as well as “finious,” or directed to general end states; this is the basis for the third aspect of his definition of “Information” for the Century Dictionary, as shown above.

Final causation, or the law of association to produce more generalized predicates, is taken by Peirce to be as much a critical logical rule in semiotic processes as the pure chance that Darwinists take to be the sole rule of natural selection. This radically new interpretation of Darwinism, introduced in the core of Platonism, can account for the growth of symbols, their continuous development, and their self-organization toward more complex forms that, ultimately, grant the increase of concrete reasonableness in the real.

Contrary to the traditionally accepted account of the increase of entropy in the universe leading to a complete dissolution of all forms, Peirce advocates a balance between dissolution and evolution that, although in the short run it condemns every particular form of life to an ultimate end, in the long run would favor intelligence by the multiplication of possible types of life. This means that semiosis and mind, as well as growth and development, are all-pervasive features of our universe. Without the influx of a symbol, Peirce states, the universe would be unintelligible and devoid of life. The evolutionary laws of nature favor life and intelligibility, and every thought, even if mistaken, must represent some aspect of the real and participate in this teleological process:

A symbol is an embryonic reality endowed with the power of growth into the very truth, the very entelechy of reality. This appears mystical and mysterious simply because we insist on remaining blind to what is plain, that there can be no reality which has not the life of a symbol. (EPII: 323–324)

6 The Symbol as the Vehicle of Information

By symbol, I understand, first, a general sign. Symbols do not exist per se but depend on instantiations in replicas to gain physical embodiment. The typical example is a linguistic word, which does not depend on being written or spoken by some individual to be real, although only these existential instantiations can make it effective. This notwithstanding, a symbol must also represent some general aspects of the object or complex of objects represented, the latter being generally the case. This general aspect, which is similar to the Platonic notion of the “idea,” is also the general form that the symbol conveys as an interpreting power.

In a late manuscript, Peirce explains what he understands by the word “symbol,” leaving no doubt that he is not classifying his semiotics as a branch of social psychology or even anthropology at large. In this undated but certainly late manuscript fragment, Peirce brings up the essential features of the nominalistic definition of a symbol (a human-made dyadic match-word—arbitrary, conventional) but then explains how he departs from and expands on it to embrace his realistic account of signification:

Symbols: The word “symbol” has already many meanings; and I shall ask leave to add a new one. Among its early significations, perhaps the original one, is that of a match-word, a somewhat arbitrarily adopted word or phrase, by which persons of one party recognize one another. It is nearly in this sense that the church creeds are called symbols. So, a flag is a symbol, etc. I think, then, that I shall not wrench the word too much if I use it to mean a sign to which a general idea is attached by virtue of a habit, which may have been deliberately instituted, or may have grown up in a natural way, and perhaps have been acquired with one’s mother’s milk, or even by heredity. (MS 797)

The central point is that every symbol must be a sign with “a general idea attached to it by virtue of a habit.” As we have seen, this general idea grounding the symbol is its form or “species” in the mode of the conditional future—the real “would-be” not only capable of evolution but the very leitmotif of every evolutionary process. This “would-be” is what Peirce calls a habit.

A piece of colored cloth hanging on a pole cannot be a flag if it is not cognizable by an interpreter or community of interpreters. By spatial instantiation, replicas of the same type of flag can be found in many places throughout a nation. Another example is the sun, which is a singular thing but nevertheless a logical general, for its repetitive appearances in time, due to astronomical revolutions, grant the necessary cognitive generality for it to be considered what Peirce calls a “legisign.” All physical laws of nature are legisigns because their final interpretants are necessarily habitual. Similarly, all natural classes fall into the same typology, such as chemical elements and biological taxonomies. They are intrinsically true and thus independent of any thinking about them.

However, not every thought, albeit habitual to some degree, has the same necessary nature. The nature of thought is that of being conditionally true. Its condition rests on the normativeness of logic: correct reasoning would in the long run produce habitual final interpretants, but at any stage of inquiry, the provisory dynamic interpretants of any thought are all imperfect to some degree. What is undeniable is their potentiality for being developed to a final true opinion, a power that is expressed by their habitual immediate interpretants. The distinctiveness of thought is that it has the power of information, the power of forming itself. It does so because the idea attached to it is an aspect of the object it professes to represent. Peirce calls this idea internal to the sign “immediate object.”

These characteristics explain why the habitual “would-bes” of symbols are always modally vague and non-determined in a certain measure (not so much as to render them useless, but sufficiently to let iconic possibilities creep in and produce novelty), depending on the context to produce particular determinations. These determinations, as they rule out possibilities and define some range of possible types of outcomes, produce variety under invariance that can be mathematically expressed by differential equations. The symbol is then analogous to the double-faced Greek goddess Tyche (Luck). One of her faces captures information coming from the past, embodied by icons inhabiting the indices of experience. In fact, what determines the symbol is the index that it must involve, which materially connects it with the concrete existential context, the hic et nunc of reality. The other face envisions the future and creates conjectures trying to bring the perceived icons and indices into the unity of an evolving concept. A living symbol must be explained as a continuously evolving sign functioning as the vehicle or medium for a flow of information from the wholly determined past toward a vague and undetermined future—the core of semiosis.

This is why the growth of symbols takes place by the always defective but also self-corrective embodiment of the forms of the dynamic objects they represent. If and when this embodiment is completed, the symbol would reach its entelechy, or the perfect final interpretant. This entelechy would be its “ultimate final interpretant,” which might be defined as a habit in perfect harmony with the super-order, or super-habit, that rules the laws of nature. Since even the physical laws and biological classes are in evolution, we must accept that the ultimate final interpretant of symbols in an evolutionary universe is to develop alongside them, breaking old habits and embracing new ones as reality itself unfolds toward complexity and concrete reasonableness.

We see then that information is a process intrinsic to the symbol, although the other types of representation also play important roles as symbols semiotically involve them. Icons are essential to embody the form or idea to be communicated by the symbol, while indices are needed to point out the objects to which this idea might be applied. Peirce defines the denoted object as the source of information, which occupies the position of the emitter (Peirce uses “utterer” for the source and “utterance” for the message). Since our main concern here is information in communication, we must consider what turns a symbol into a conveyor of information, such as an assertion (the expression of a particular belief in a definite context) or a proposition (the general form of an informative symbol, usually diagrammatic, which can be asserted in different syntaxes). Peirce calls these informative symbols “dicent” signs or “dicisigns,” such as propositions. One of the examples most worked out by Peirce is the weather-vane (or weather-cock), a device capable of informing us of the wind’s direction.

The reference of a sign to its object is brought into special prominence in a kind of sign whose fitness to be a sign is due to its being in a real reactive relation—generally, a physical and dynamical relation—with the object. Such a sign I term an index. As an example, take a weathercock. This is a sign of the wind because the wind actively moves it. It faces in the very direction from which the wind blows. In so far as it does that, it involves an icon. The wind forces it to be an icon. A photograph which is compelled by optical laws to be an icon of its object which is before the camera is another example. It is in this way that these indices convey information. They are propositions. That is, they separately indicate their objects; the weathercock because it turns with the wind and is known by its interpretant to do so; the photograph for a like reason. If the weathercock sticks and fails to turn, or if the camera lens is bad, the one or the other will be false. But if this is known to be the case, they sink at once to mere icons, at best. It is not essential to an index that it should thus involve an icon. Only, if it does not, it will convey no information. (MS 7, 17–18)

The pieces of metal or wood that compose the weather-cock convey no information if the device is unassembled, rotten, or broken. The weather-cock can convey information about the wind only if it receives the influx of a symbol given by the interpreters’ community that recognizes it as an informative device about wind conditions. A particular weather-cock then becomes the replica of an active symbol, and its pointing in a particular direction becomes an index—not a purely blind one, but one more appropriately understood as selective, or as a quantifier, capable of denoting the real direction of the wind. Its logical validity is grounded on the fact that it is part of a greater whole, which is the real direction in which the wind is blowing.

If the weather-cock is functioning correctly, this index also involves an icon that represents the real form of the blowing of the wind. And the symbol would be true if the habitual “would-be” that accompanies it correctly represents this iconic information presented by the weather-cock. In logical terms, the symbol connotes truly what it truly denotes. Moreover, the wind that blows the weather-cock is in the past of any conceivable observer that would collect the information, while the pragmatic consequences of the assertion made by the apparatus are always in its future. There must then be a continuous schema, or syntax, linking the real possibilities of the icon at the perceptive level to the icon of the logical consequences. The former enters knowledge through perceptual judgments, and the latter becomes conscious information by diagrammatic reasoning, where relations are represented in the form of thinking. This flow of information from the real form of the object to the general form of the interpretant in the symbol must then be continuous in time, and the logical schema of time must account for the being of a proposition. (MS 664, 10–13, 1910)

The result is that living symbols, capable of growth and development, must be in a dynamic process of embodying the form of their object in particular situations through experience. The information this process generates then also becomes a continuous transformation of their intrinsic forms. This is the kernel of Peirce’s synechism, the doctrine that continuity and generality are the bases of reality. Without the real continuous predicates that weld together every mind in a “commens,” or commind, no communication would be possible, because there would be no common ground among the minds of the community of interpreters that is created by the very work of symbols, in such a way that:

No object can be denoted unless it be put into relation to the object of the commens. A man, tramping along a weary and solitary road, meets an individual of strange mien, who says, “There was a fire in Megara.” If this should happen in the Middle United States, there might very likely be some village in the neighborhood called Megara. Or it may refer to one of the ancient cities of Megara, or to some romance. And the time is wholly indefinite. In short, nothing at all is conveyed, until the person addressed asks, “Where?”—“Oh about half a mile along there” pointing to whence he came. “And when?” “As I passed.” Now an item of information has been conveyed, because it has been stated relatively to a well-understood common experience. Thus, the Form conveyed is always a determination of the dynamical object of the commind. (EPII: 478, 1906)

The assumption of a continuous flow of information that ultimately welds and fuses every intelligent mind that participates in and shares the same experience also opens what Peirce affirms to be the “lock” that Kant put in the door of philosophy when he asked his most important question: How are a priori synthetic judgments possible? By “a priori,” Peirce explains, Kant meant universal, and by synthetic judgments, Kant meant judgments based on experience. Communication, or Rhetoric (as Peirce classified the third branch of Semiotics, after Speculative Grammar and Critical Logic), is to be understood not as a mere application of semiotics to real communicative situations but as a core discipline to understand how knowledge can be gathered from experience and shared by a community. This knowledge can be shared not only by a community of conscious rational minds but even by a community of uni-cellular living beings co-evolving through Darwinian reproduction and natural selection. Peirce says:

Analogous to the increase of information in us, there is a phenomenon of nature—development—by which a multitude of things come to have a multitude of characters, which have been involved in few characters in few things. (CP 2.420)

However, we can further generalize to reach his cosmological view of information as a natural phenomenon connected as much to matter as it is to mind, which is not surprising since his metaphysical account of reality does not separate mind and matter like we cut logs with an ax. Peirce’s universal evolutionism is much more universal than contemporaneous proposals because Peirce sees some degree of life and iconic feelings whenever chance is involved in breaking old habits and establishing new ones, even when we talk about laws of nature and universal constants—all taken by him as a result of evolution:

Consider the life of an individual animal or plant; or of a mind. Glance at the history of states, of institutions, of language, of ideas. Examine the successions of forms shown by paleontology, the history of the globe as set forth in geology, of what the astronomer is able to make out concerning the changes of stellar systems. Everywhere the main fact is growth and increasing complexity. (EPI: 307–308)

7 Summary and Conclusion

In natural languages, traditional logic, and human culture, semiotic information is a property of symbols. These symbols can be terms, propositions, or any complex arguments of any length (a book, or even a whole library) used to communicate the form of a cognizable object (taken as anything capable of being represented by generally known predicates) to an interpretant, which is the general consequence of such representation. However, in a more universal and (extreme) realistic approach to the nature of symbols, semiotic information can result from the communication of a general form from past to future, allowing for the growth of complexity. Information is generated as an interpretant of a symbol as it learns from experience—i.e., as it develops, evolves, and gains complexity in time. Ultimately, a portion of space-time can be seen as a medium of communication where information flows from past to future, and living organisms can be seen as very specialized portions of space-time where information flows at a very high rate, escalating quickly to create systems far from thermodynamic equilibrium (or pure randomness). Since every symbol also has both an iconic and an indexical part, so does information. In the iconic part, information is embodied as a novel predicate—unexpected and often fortuitous variations given by continuous perception or experience. The creative novelty in chemical reactions and the fortuitous variations in biological reproduction are good examples of iconic embodiments outside our human species’ cultural realm. In the indexical part, information is expressed as subjects to which novelty is assigned, such as a unique crystal or living individual where novelty manifests. In the action of symbols as fully fledged propositions (a sentence, a codified gesture, a traffic sign, etc.), information is generalized and shared with all participants of a community in which it is adopted and used. In the action of symbols as living beings or developing universes, information is realized when new species appear in biological evolution and in the irreversibility of events in space-time as the future becomes the past. In all of these continuous symbolic processes, new general attributes become predicates of known general objects, allowing for permanence, survival, and being—and that is how information becomes meaningful.