
Negentropy

author’s manuscript, work in progress (Vera Bühlmann).

“Thought interferes with the probability of events, and, in the long run, therefore, with entropy”.[1] The term “negentropy” is born from this very situation. It was introduced (as “negative entropy”) by Erwin Schrödinger to distinguish biological systems from physical systems, and then generalized by Léon Brillouin into the domain of information theory. The perspective from which it will be discussed here situates the term within a particular problematics: the central paradigm of empirical experiments for science, and how the algebraic encryption of quanta in quantum physics interferes with the non-probabilistic practices of measuring and counting in this paradigm. It is only by means of computations performed upon ciphers – the mathematical way of articulating naught, nothing – and the equational manners of balancing and completing, which attach an encrypted negativity to all countable and measurable positivity, that probabilistic procedures become applicable: for them, the total amount of possible cases that are said to happen with probabilistically determinable likelihood must be finite and countable. Negentropy, in accordance with this, means negative entropy; it quantifies and makes countable the symmetrical negative of what the term entropy quantifies and makes countable. And entropy was introduced by Rudolf Clausius in want of “a word for measuring a quantity that is related to energy, but that is not energy”.[2]
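
The symmetry invoked here can be given a minimal formal sketch (standard Boltzmann/Planck notation, not the manuscript’s own): entropy counts a finite and countable totality of possible cases, and negentropy is its sign-inverted counterpart, taken relative to the maximal entropy.

```latex
% Boltzmann/Planck: entropy counts the finite number W of possible cases
S = k \ln W
% Negentropy as the symmetrical negative, written relative to maximal entropy:
N = S_{\max} - S
```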

The problem that triggered postulations of an interference between thought and this particular quantity, entropy, must be understood against the background of the modern assumption that thought cannot affect the nature of its object, that it only affects the subjective understanding of this nature. While subjectivity depends upon will or intent, natural forces work determinately and gratuitously. We can formulate the analog of this situation in terms of thermodynamics, insofar as the conversion of heat into energy, or energy into work, leaves the amount of heat (in this analogy playing the role corresponding to the nature of thought’s object) unaffected: the total heat in a system remains constant, it merely passes from a hotter body to a colder one. At the same time, the 2nd law of thermodynamics states that we cannot maintain an identity between heat and energy: “No matter how much energy a closed system contains, when everything is the same temperature, no work can be done. It is the unavailability of this energy that Clausius wanted to measure”.[3] If heat is regarded as the manifestation of energy in a system, what is needed is a distinction between energy that is available for work and energy that is not: the total amount of heat in a system may be constant, but it cannot be transmitted (from a warmer to a colder body) without some work being executed. Thus, the work done by natural forces in the thermodynamic setup cannot be regarded, after all, as “gratuitous” in the same manner as it is in classical physics. Thermodynamic processes introduce a certain irreversibility into how we think of the conversion of energy from one form to another – “Time flows on and never comes back”, as Léon Brillouin put it in 1948 – which does not exist in the classical formulation of natural laws. If the modern paradigm for experimental science builds on the assumption that thought leaves the natural object it tries to conceive untouched, we can see now that it is exactly this assumption which appeared to break down: if physical processes involve a certain irreversibility, then the thinking that guides experiments plays a part that cannot so easily be disregarded.[4] In thermodynamic processes, energy is not lost, but it dissipates, it becomes “useless”. This is the so-called “expense problem” related to the irreversibility that applies to thermodynamics: the total amount of entropy (unavailability of energy for work) in all physical systems that can be studied empirically, experimentally, necessarily seems to increase. There is hence a source of disorder at work in such systems, one that “seemed strangely unphysical,” that even “implied that a part of the equation must be something like knowledge, or intelligence, or judgement”, as James Gleick puts it in his recent study The Information: A History, a Theory, a Flood (2011). He continues: “Dissipated energy is energy we [emphasis added] cannot lay hold of and direct at pleasure, such as the confused agitation of molecules which we call heat”.[5] Heat, it became clear, could be regarded neither as a force nor as a substance; it was not equivalent to energy.
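
The unavailability at stake can be stated in one standard worked instance (textbook notation, not the manuscript’s own): when heat passes from a hotter to a colder body, total entropy necessarily grows, and this growth is precisely what cannot be undone.

```latex
% 2nd law for a closed system: entropy never decreases
\Delta S_{\mathrm{total}} \geq 0
% Worked instance: heat Q passes from a hotter body (T_h) to a colder one (T_c):
\Delta S = \frac{Q}{T_c} - \frac{Q}{T_h} > 0 \qquad \text{since } T_c < T_h
% Reversing the process would require \Delta S < 0; hence the irreversibility.
```
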
In the course of these developments, order – as the epitome of objectivity – acquired a certain amount of subjectivity; it entailed the eye of an observer: “It seemed impossible to talk about order or disorder without involving an agent or an observer – without talking about the mind.” The formulation used above, “necessarily seems to increase”, expresses the controversial status of the 2nd law as a law proper: it is based entirely on observation. Its philosophical or even cosmological implications, if it indeed is a “law”, are immense: it introduces the inevitable (however distant) doomedness of all life on earth. Lord Kelvin was not the only well-established scientist who began to consider the consequences for science of the Universe’s “heat death”, as this doomedness was often referred to. Resolutions to this problem began to be discussed in terms of the possibility of a perpetual kind of motion, which became linked with an interest in a “perfect experiment” – one liberated from reasoning biased by the imperfect human faculties and their limitations (–> Maxwell’s Demon) – an interest that arguably still haunts today’s discourse on Artificial Intelligence.[6]

Let us jump now to the introduction of the negative entropy term. Erwin Schrödinger introduced it in What is Life? (1944) as a term that allows him to expand the thermodynamic view from physics to biology, and thus also to relativize the implications of the physical view of the entropic universe. His point of departure is that animate systems are capable of metabolizing – of binding and incorporating temporarily – a kind of energy which he called ‘free’ in the sense of ‘available’, or ‘unbound’. Negentropy thus became for Schrödinger a term that allows for quantifying life (without being life), much as entropy had become for Clausius a term that allows for quantifying energy (without being energy). What used to be the energy-expense problem of work for Maxwell henceforth turned into a veritable economy of import and export at work in the biosphere-world of thermodynamics: organisms import negentropy (quanta of life), as Schrödinger put it, and the more they do so, the more they rid themselves of entropy (quanta of physical entropy, now conceived as disorder vis-à-vis an organism’s temporary order/organization). The biological paradigm hence seems to contradict the 2nd law of thermodynamics, and to suggest instead that the metabolisms that make up the biosphere are in fact capable of decreasing rather than increasing the universe’s entropy (the amount of work unavailable in the thermodynamic universe). The competing paradigms contrast as follows: while thermodynamic physics relates the notion of the universal to the universe (as, ultimately, one generic nature), biology relates universality to the specific natures of life forms. The physicalist notion of entropy, which in physics started out as denoting not the absence of order but the virtual presence of order in any of its possible variations, appeared, in the light of how biology’s operational term of negative entropy can quantify life, as the relative absence of possible variations of order, or as the relative absence of order – in short, as “disorder”.
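
Schrödinger himself gives this a compact shorthand in What is Life? (with D a quantitative measure of disorder and k Boltzmann’s constant); the following restates it in standard notation:

```latex
% Schrödinger's shorthand: entropy as a measure of disorder D
S = k \log D
% hence negative entropy as a measure of order:
-S = k \log \frac{1}{D}
```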

It is this dilemmatic impasse between a certain monism and its pluralist counterpoint that the introduction of “information” into the thinking about thermodynamic processes managed to abstract from, and to open up. I can only point briefly here to how this conversion between information and energy works.[7] My core reference is the quantum physicist Léon Brillouin’s adoption of Schrödinger’s term of negative entropy in a manner that adds an algebraically quantized (cryptographic, –> equation) notion of information to this competition (between physics and biology). Brillouin conceived of information as a kind of currency that circulates in energetic expenditure (the import and export between systems), such that “all these [macrological, quantum physical, VB] unknown quantities make it possible for the system to take a large variety of quantized structures, the so-called Planck’s complexions.”[8] With this, he began to postulate information science as the proper domain for quantizing how physical entropy (the virtual presence of any-order) and biological entropy (the absence of order, disorder) relate to one another, without subjecting one to the other. Familiar with Turing’s[9], Shannon’s[10] and Wiener’s[11] work on a mathematical notion of information, and with their dispute over whether information can be measured in terms of the experimental entropy notion applied to physical systems (Shannon), or whether it needs to be accounted for in Schrödinger’s terms of negentropy import in biological systems[12], Brillouin foregrounded the role of “code” in such “intelligent” computation and applied a double notion of negentropy and entropy – one to energy, one to information – under the assumption that both are linked by code: free (entropic) information, to him, is the maximum amount of a priori cases formulated in a code (any finite system of ordered elements, like the Morse code, the Roman alphabet, the periodic table in chemistry, or the DNA in molecular biology); the a priori cases can be computed by combinatorics, and in entropic information each of them must be regarded as equally likely to be actualized. Bound (negentropic) information is empirically measured information (in experiments with any particular manifestation of such a code). This distinction in the measurement of information allows for thinking of information as a kind of currency – an operator capable of establishing general equivalence, equivalence between observation and object – that circulates in the physical expenditure of energy in executed work as well as in the economy of import and export of a biological system’s metabolism. “We cannot get anything for nothing, not even an observation”, Dennis Gabor famously maintained.[13] This very important law, Brillouin elaborates, is a direct result of our general principle of negentropy of information, and “[i]t is very surprising that such a general law escaped attention until very recently”.[14] The acquisition of information in measurement not only has a price, it also yields something: an increase in operational power – an idea that lends itself to developing a theory of how to quantize, and hence quantify, in like manner to energy (Clausius) and life (Schrödinger), something like a “power of abstraction” (–> invariance). It is owing to this very idea – that information and energy articulate each other in an evolutionary dynamics and in a mutually reciprocal manner – that the assumption of a perpetual motion is no longer needed in order to proceed with the experimental paradigm in science.
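
Brillouin’s distinction between free and bound information can be made concrete in a minimal computational sketch (an illustration under the stated assumptions, not Brillouin’s own formalism; the function names are mine): the free information of a code is computed combinatorially from its equally likely a priori cases, while bound information is the Shannon measure over the empirically observed frequencies of one particular manifestation.

```python
import math
from collections import Counter

def free_information(alphabet_size: int, message_length: int) -> float:
    """Free (entropic) information in bits: the logarithm of the number of
    a priori cases a code can formulate, each case taken as equally likely."""
    # Combinatorics: alphabet_size ** message_length possible messages,
    # so log2 of that count is message_length * log2(alphabet_size).
    return message_length * math.log2(alphabet_size)

def bound_information(observed: str) -> float:
    """Bound (negentropic) information in bits per symbol: the Shannon
    measure over the empirically observed frequencies of one manifestation."""
    counts = Counter(observed)
    total = len(observed)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# Example: the 26-letter Roman alphabet, messages of length 10.
print(free_information(26, 10))          # ~47.0 bits: all cases equally likely
print(bound_information("abracadabra"))  # ~2.04 bits/symbol, measured empirically
```
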
With Brillouin’s quantum-cryptographical theory of information, information can be transformed into energy (as electric current), and the other way around (through studying distributions of heat).
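
The quantitative hinge of this convertibility can be stated in one line (the standard form of Brillouin’s negentropy principle of information; k is Boltzmann’s constant, T the absolute temperature):

```latex
% Acquiring one bit of information costs at least this much entropy:
\Delta S \geq k \ln 2 \quad \text{per bit}
% equivalently, at temperature T, a minimal energy expenditure of
E_{\min} = k\,T \ln 2
```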

[1] David L. Watson, “Entropy and Organization”, Science, 29 August 1930, pp. 220–222; cited in James Gleick, The Information: A History, a Theory, a Flood, Harper Collins, 2011, here from the Kindle edition, position 4306.

[2] James Gleick, ibid., position 4313.

[3] Ibid., position 4323.

[4] Cf. Léon Brillouin, Science and Information Theory, Dover, New York 2013 [1956], here referred to in the Kindle edition, position 2766. In the measurement of any physical system, there are macroscopic and microscopic variables to be taken into account. The former refer to those quantities that can be measured in the laboratory, but they do not suffice to completely define the state of a system under consideration. Once a system is also considered in quantum terms of its radiation and absorption, there is an enormously large number of microscopic variables to be taken into account as well – and these one is unable to measure with accuracy, as they concern the positions and velocities of all the individual atoms, the quantum states of these atoms or of the molecular structures, etc. “Radiation is emitted when a physical system loses energy,” Brillouin explains, “and absorbed when the system gains energy,” ibid., position 2776.

[5] Gleick, Information, ibid., position 4355.

[6] In other words, as Serres asks, can we maintain that the second law of thermodynamics, which states the necessary increase of entropy, is itself universal – even though it is only a “law” based on experience? His answer: “Yes, but not quite in the manner of Newton. It [the second law] is [universal], if I may say so, in a non-continuous manner, from region to region. There are archipelagos, and here and there, between them, islands of negentropy. In the limit case we have to deal with an antinomy in the Kantian sense, when one assumes, for that instance, the universe to be either open or closed. In any case, it is universal in its negation, or better: in that which it excludes: perpetual motion.” Michel Serres, “Leben, Information und der zweite Hauptsatz der Thermodynamik”, in Hermes III. Übersetzung, Merve, Berlin 1992 [1974], pp. 53–96, here p. 80. My own translation.

[7] Cf. Brillouin, ibid., for an extensive and detailed discussion.

[8] Brillouin, ibid., position 2762: “There is no continuity at the atomic level but only discrete stable (or metastable) structures, and the atomic system suddenly jumps from one structure to another one, while absorbing or emitting energy. Each of these discrete configurations of the quantized physical system was called a ‘complexion’ by Planck.”

[9] Alan M. Turing, “The Chemical Basis of Morphogenesis”, Philosophical Transactions of the Royal Society of London, series B, 237, no. 641, 1952, pp. 37–72; and “On Computable Numbers, with an Application to the Entscheidungsproblem”, Proceedings of the London Mathematical Society, 42, 1937, pp. 230–265.

[10] Claude E. Shannon, “A Mathematical Theory of Communication”, Bell System Technical Journal, 27, no. 3, 1948, pp. 379–423.

[11] Norbert Wiener, Cybernetics: Or Control and Communication in the Animal and the Machine, MIT Press, Cambridge MA, 1948.

[12] Shannon discusses the term negative entropy, but considers its distinction negligible for information as a mathematical quantity. It was Norbert Wiener who, via the work of John von Neumann, Alan Turing, Claude Shannon and Leo Szilard, maintained against Shannon that negentropy is in fact crucial, rather than negligible, for a mathematical theory of information. It is largely due to this dispute that, until today, different notions of mathematical information are in use: (1) information as a measure for order in terms of entropy, and (2) information as a measure for order as negentropy. While both speak of information as a measure, and hence as capable of establishing order, the two concepts of order are actually inverse to each other: order as negentropy means minimal entropy (maximal amount of bound energy, minimal amount of free or available energy, in Schrödinger’s terms), while order as entropy means minimal negentropy (maximal amount of free and available energy, minimal amount of bound energy, in Schrödinger’s terms). Much confusion in the understanding of “information” arises from this still today. Cf. James Gleick, ibid., around position 3956, although, it must be said critically, Gleick does not seem to be aware of the implications of the issue at stake.
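
Stated compactly (a sketch in standard notation, not taken from the cited texts), the inversion amounts to:

```latex
% With negentropy defined against maximal entropy:
N = S_{\max} - S
% order as negentropy: N maximal, i.e. S minimal;
% order as entropy:    S maximal, i.e. N minimal.
```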

[13] Dennis Gabor, MIT Lectures, 1951, cited in Brillouin, ibid., position 3805.

[14] Brillouin, ibid., position 3805.
