
Generic Mediality – A Response to Mark B. N. Hansen’s “Speculative Phenomenology of Micro-Temporal Operations”

Hansen’s lecture and my response were given at the joint annual conference of the Society for European Philosophy and the Forum for European Philosophy, September 3–5, 2014, at Utrecht University, under the annual theme Philosophy after Nature.

http://philosophyafternature.org

 

Generic mediality, and the Real as the physical substance of technical criticality (Gk. krinein, Ger. Ermessen).

by Vera Bühlmann

„Algebra is the art of subsuming givens under a rule“
(Immanuel Kant)

There are phenomena that are to be considered as genuinely simulacral but nevertheless real, Mark Hansen has just maintained in his talk. And this, he suggests, is not a gesture of capitulation but one of intellectual reclamation: if phenomena are mediagenic, as I would call them – i.e. if it is technical instruments rather than natural bodies directly that render them apparent and perceivable in the first place – then they can be approached within the framework of what Hansen calls speculative phenomenology. My paper will focus on the peculiar role that channels play in such instrumentally augmented perception.

In my understanding, Hansen’s phenomena resemble the role of the Deleuzian dark precursors – neither predicative nor directly anticipative, they are also not quite premonitions, because it is not a message with sinister content that they have to deliver; nor are these phenomena, despite their simulacral nature, apparitions that merely pretend to be what they in reality cannot display and effectuate. Mediagenic phenomena, in Mark Hansen’s project, are credited with a reality that is genuinely natural, and hence can be approached in the registers of physics, because they correspond to real magnitudes which manifest in nothing else but the discretable and registrable, physical actuality of their apparent appearance – i.e., their mode of appearing. I call their reality genuine because these phenomena need not be legitimated and authorized as „substantial“ by testing and determining what may in fact – that is, in a linearly preceding past – have caused them. All the abundant reasons that converge to cause the effects they display are to be looked for „entirely within the present“, as Hansen insists. It is not despite but rather because of a peculiar kind of non-signifying autonomy – we could say their impredicativity – which Hansen attributes to these phenomena by treating them within the quantitative registers of physics, that simulacral phenomena, like global warming in his example, can help us pre-hend that which we do not (yet) know how to assess, how to measure and relate. Such phenomena are like speculative integrals, meant to embody rational, calculable links between the global and the local, between the predicative and the predicated – in our case, between climate and weather. Hence the reality Hansen claims for his simulacral phenomena, and the magnitudes in which they manifest themselves, is not a representational one, but a performative, or rather: an operative one. There is a Real whose potentiality is referent beyond the manifestation of it as a particular fact, as Hansen puts it.

In order to gain knowledge from such an operative reality, one needs to look for a physical approach. Following Wolfgang Ernst’s approach of theorizing media in their operative dimension[1], which he calls their time-basedness and their timecriticality, Hansen sympathizes with locating this „physics“ in the electromagnetic domain that comprehends all the radiation of physical particles in wave form. But Hansen’s own approach seems to distinguish itself from that of Wolfgang Ernst in an important manner. With his speculative phenomenology, Hansen seems less interested in demarcating one horizon of „simultaneous origination“ (Gleichursprünglichkeit), which is Ernst’s declared interest for his media archaeology[2]. Hansen’s focus is less on such a horizon or, as I would call it, „a master integral“ that would register objectively the past, and the potential future, and that can be „recorded only by media themselves“, by their „superior wisdom“, as Hansen quotes from Ernst[3]. Instead, it is the bodies of media – those of both senders and receivers – that Hansen seems to be interested in. With that, his project seems to be the development of a veritable physics of mediated communication, rather than a physicalist’s theory of the communicative activity of media. A physics of mediated communication includes a phenomenological notion of embodiment in its account. At the risk of overdrawing it a little, let me dramatize the implications of this difference further: one way of expressing this distinction, it seems to me, would be to say that on Ernst’s view, Hansen’s Real – whose potentiality is referent beyond the historical manifestation of it as a particular fact – figures as a white spectrum, in which the embodiment of media, which from a phenomenological perspective is always singular, is to be purified and normalized into a mathematical ideality of a transcendent order. What Ernst refers to when he speaks of „technomathematics“ seems to be exactly this. For Hansen, on the other hand, such a Real figures as a dark spectrum, whose knowledge resides in the essential darkness of the manifest embodiment of things themselves, and shimmers through only in speculative renderings of an integral of which all we can specify, speculatively, is that it is to comprehend actual links between a now and here of the manifest body or fact, and the belonging of this now and here to an insisting anywhere and anytime.

In other words, Hansen’s phenomenological view seems to suggest that we should think of this Real, which takes the electromagnetic domain as a dark spectrum, as an active state of latently vibrant radiation, more probable than factual, which insists in embodied things. In my response I would like to suggest how the program of a speculative phenomenology thus laid out could perhaps be complemented by a further aspect – namely, a distinction between what I call functional technology and equational technics, manifesting respectively as dispositional apparatuses and as encrypted applications. The crucial difference is that the one is dependent upon a stable framework of coordination, whereas the other encrypts manners of coordination symbolically – a distinction that somehow escapes Ernst’s important identification and exposition of electronic media’s timecriticality. But first I would like to try to disentangle some of the implications involved in assuming such an initial state of activeness, before going on to a more technical part which discusses the importance, and at the same time the philosophical insufficiency, of timecriticality for a physics of mediated communication that considers the Real as a dark spectrum. There are seven strings I would like to distinguish and expose, so that they can resonate through the more technical discussions that will follow.

  • We begin with the assumption of a Real as an active state which is virtually „pregnant“ in an indefinite manner, such that it allows for the speculative interplay between discreting (Ermessen) and pre-hending (Vorwegnehmen). This interplay can be seen as a kind of technical criticality that applies to simulacral phenomena whose magnitudes are real despite being simulacral – real in a sense that is purely operational.
  • This assumed activeness, if it is to be approached speculatively and physically, i.e. non-hermeneutically, requires that fluctuating ratios (quantitative Verhältnisse) be considered to make up a peculiar relationality, a kind of rational fabric that constitutes this activeness’s latently vibrant radiation. This rational fabric must precede and provide abundant rather than sufficient reason for whatever mensural order of being or having one might come to ascribe to its appearances.
  • These ratios are to be dealt with as analytic points, rather than as representations of geometric points. This distinguishes operativity from functionality.
  • We can call dealings with ratios-as-analytical-points computations. Computations are themselves purely rational, but they are so in a reckoning, numbering, calculating manner that does not respond to strict, arithmetically predicative necessities. There are strategic and tactical levels involved which place computations in an agoratic setup – not unlike what Lyotard characterized as „the state of knowledge in computerized societies“[4] – rather than in a historically dialectical one. Such an activeness never yields neutral recording; its recording always elects according to a direction that is, within certain constraints, arbitrarily imposed.
  • From the point of view of a speculative phenomenology of media’s micro-temporal operations, the computations these operations perform do not at all legitimate and autonomize thought in a disembodied, non-corporeal manner; quite the inverse: so understood, computations place considerable weight on the role of our bodies in whatever-it-may-be that we call „thinking“: abstractions amount to nothing much of value if there are no lived experiences that correspond to them. But this same point of view also seems to insist that it is only with the employment of abstractions that the body’s affectivity is capable of opening up a mediate Real that in principle does and forever will continue to be elusive with regard to how we can pinpoint facts by words that name, concepts that comprehend and delimit, forms that manifest regularities, or numbers that count predicatively.
  • Such a stance of lived abstractions „phenomenalizes“ the very quantities that are being processed in technical instrumentality;
  • These phenomenalized quantities are speculated to characterize real magnitudes that are, so to speak, genuinely simulacral magnitudes – like global warming. As kinds of dark precursors, these simulacral phenomena can help us pre-hend that which we do not (yet) know how to assess, how to measure and relate.

Hansen’s speculation as to wherefrom and how the simulacral phenomena of such a physics of mediated communication might be decrypted follows Wolfgang Ernst, and the latter’s distinction of „measuring media“ from „mass media“. We must perhaps specify that „media“ here refers to „technical media“ in the narrower sense of communications engineering.[5] Within these restrictions, mass media figure in the time domain of waves propagating in space, and measuring media figure in the time-critical domain, which is the frequency spectrum of how waves propagate in space. That is why „mass media“ are called time-based, whereas measuring media are called timecritical. Both operate within the electromagnetic continuum. But the frequency domain regards it analytically, and represents it as a spectrum, a technical image, whereas the time domain regards the electromagnetic continuum as mechanical, and hence pictures it as a field. Now, what Ernst calls a timecritical event features as an analytical point in the electromagnetic field regarded as a spectrum. Let us bear in mind that an analytical point, unlike a geometrical one, is a split point, a ratio, the encapsulation of a quantitative relation. This is important because it demarcates where Ernst’s timecriticality remains silent about the aspect of digital computation which is perhaps the most powerful of all its aspects: namely, that analytical points need to be integrated, and that this can be done in myriad ways by encryption.
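
To give the distinction between the time domain and the frequency domain a concrete handle, here is a minimal sketch in Python, assuming only numpy (the signal and all its parameters are invented for illustration, not taken from the lecture): the same signal that appears in the time domain as amplitude unfolding over time appears in the frequency domain as a spectrum of complex coefficients, each of which pairs an amplitude with a phase – an „analytical point“ in the above sense, a ratio rather than a simple point.

```python
import numpy as np

# Time domain ("time-based" media): a signal as amplitude over time.
sample_rate = 1000                      # samples per second (illustrative)
t = np.arange(0, 1, 1 / sample_rate)    # one second of signal
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

# Frequency domain ("timecritical" media): the same signal as a spectrum.
spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), 1 / sample_rate)

# Each spectral bin is an "analytical point": a complex ratio that
# encapsulates both an amplitude and a phase at that frequency.
for f in (50, 120):
    k = np.argmin(np.abs(freqs - f))
    amplitude = 2 * np.abs(spectrum[k]) / len(signal)
    phase = np.angle(spectrum[k])
    print(f"{f} Hz: amplitude ~ {amplitude:.2f}, phase ~ {phase:.2f} rad")
```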

If we consider this aspect of how the electromagnetic continuum needs to be encrypted before it can be taken into account, then we can more clearly characterize three distinct levels that are involved in what Ernst calls „measuring media“: (1) a geometric and mechanical level of a wave propagating in the electromagnetic continuum, the physical substrate of telecommunications; (2) a dynamical and analytical level where a propagating wave is singled out of the field, and where it is attributed a particular frequency number, as a kind of identity tag within the larger spectrum. Through this singling out, a particular wave is dynamized, i.e. it is identified as a particular temporality that can be differentiated and integrated; and (3) a level of encrypting manners of how to integrate and differentiate this temporality – what we can call its sequencing. This third level is the level of coding. It is mechanical again, yet algebraically so: it subsumes the ratios, the analytic points, under an encrypted, symbolic form. I will come back to what we can understand by such a „symbolic form“. What is important now is that this third level is mechanical, like the first, but on a different level of abstraction than the wave level – it is only here, as far as I understand, that we might be in the realm of quantizing dynamical systems through „encryptive probabilistic procedures“, and where we might face what in quantum mechanics is called the measurement problem. Hansen is interested in media’s timecriticality on this third level, as I understand him, because it is here that a notion of media’s embodiment, insofar as it is not normalized and idealized, can be seen to play a role at all.

So how far can Ernst’s distinction between „mass media“ and „measuring media“ carry us with regard to this third level of algebraic mechanics, or quantum mechanics? Technically speaking, each frequency can itself be treated as a field for other frequencies. A field of fields of waves is called a spectrum, a technical image. It is by way of manipulating this technical image and rendering its manipulations back into the physical continuum that Ernst can speak of media’s measuring timecriticality. In this indirect manner, the amplitudes of waves are encoded in terms of distinguished phases. As a consequence, where we have one mass-media channel per frequency that can broadcast the program from one particular source, we can have n, i.e. an indefinite number of discreted channels per frequency in measuring media. With them, it is not one source which broadcasts, but distributed populations of sources that send messages in parallel. The time-based manner of broadcasting is now being coded in the strict sense of the term – it is being encrypted according to probabilistic alphabets – and like this, it can serve to host not simply one channel, but myriads of channels. In such probabilistic set-ups we have, in the most extreme form of peer-to-peer file sharing, one channel for each „message“ sent. Many channels can be encoded onto one and the same physical carrier (a wave). Just to recall the level of artistry and sophistication we are talking about: in this modulatory manner, one Telecom cable, for example the one which supplies our household in Zurich with phone and internet connection, is capable of maintaining more than 10 million distinct channels „within“ – or rather: „with the carrier of“ – one single frequency. This is of course an extraordinarily large number because we are talking about a cable, and a cable allows waves to propagate with the fewest disturbances (as opposed to air, light or water, for example), but in principle this explosion of sustainable channels also applies to services without manifest cables, like mobile cellular services or Bluetooth. Now this encryption, which relies on probabilistic procedures, may well work „mechanically“ – but that does not mean that it does not involve incredible diligence and sophistication on the side of the engineers! The mechanical work they perform is algebraic before it is functional; it has to make different protocols compatible. This is why I think that the emphasis on „measuring media“, for what is actually an entire compound of both symbolic encryption and performed timecriticality, is somewhat obscuring. On this technical level of telecommunication, which Ernst addresses, it seems more productive to speak of generic mediality rather than of reified „measuring media“.[6]
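
As a toy illustration of how more than one channel can ride on a single carrier frequency, consider the following sketch of quadrature modulation in Python, assuming only numpy (a drastically simplified stand-in for the probabilistic coding schemes actual telecommunication equipment employs; all parameters are invented): two independent bit streams are written onto the phase components of one and the same wave, and recovered again by coherent demodulation.

```python
import numpy as np

# Two independent bit streams ("channels") that will share one carrier.
channel_a = np.array([1, 0, 1, 1, 0, 0, 1, 0])
channel_b = np.array([0, 1, 1, 0, 1, 0, 0, 1])

carrier_freq = 10.0               # Hz; illustrative choice
sample_rate = 1000.0              # samples per second; illustrative choice
samples_per_symbol = 200          # 0.2 s per symbol = 2 full carrier cycles
t = np.arange(samples_per_symbol) / sample_rate

# Quadrature modulation: channel A rides on the in-phase (cosine) component,
# channel B on the quadrature (sine) component of the very same carrier.
symbols = []
for a, b in zip(channel_a, channel_b):
    i, q = 2 * a - 1, 2 * b - 1   # map bits {0, 1} to amplitudes {-1, +1}
    symbols.append(i * np.cos(2 * np.pi * carrier_freq * t)
                   - q * np.sin(2 * np.pi * carrier_freq * t))
signal = np.concatenate(symbols)  # one physical wave carrying two channels

# Coherent demodulation separates the two channels again.
decoded_a, decoded_b = [], []
for k in range(len(channel_a)):
    chunk = signal[k * samples_per_symbol:(k + 1) * samples_per_symbol]
    i_est = 2 * np.mean(chunk * np.cos(2 * np.pi * carrier_freq * t))
    q_est = -2 * np.mean(chunk * np.sin(2 * np.pi * carrier_freq * t))
    decoded_a.append(int(i_est > 0))
    decoded_b.append(int(q_est > 0))

assert decoded_a == channel_a.tolist() and decoded_b == channel_b.tolist()
```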

Let me try to illustrate what is actually happening in such encryptive coding, with an example that is perhaps easier to grasp. Leon Battista Alberti, the Italian architect and polymath in Renaissance Florence, famous for his legendary ten books on architecture, wrote a book entitled De Componendis Cifris. It is a code of practice for how to encrypt texts in a manner that is augmented by a mechanical device, the so-called cipher disk (which he is said to have invented). Such a disk consists of two concentric circular plates mounted one on top of the other. The larger plate is called the „stationary“ and the smaller one the „moveable“, since the smaller one could be rotated on top of the „stationary“. The first incarnation of the disk had plates made of copper and featured an alphabet, in order, inscribed along the outer edge of each plate and coordinated in cells split evenly along the circumference of the circle. This enabled the two alphabets to move relative to each other, and thus to create an easy-to-use key – one could give orders like „shift one unit to the right after every fifth turn of the moveable disk“ in order to reconstruct the right letters of the text message in the right sequence.
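
A minimal sketch of such a cipher disk in Python may help to make the mechanism palpable (the shift rule below merely echoes the kind of order just quoted; Alberti’s historical device was more intricate, mixing alphabets and embedding key changes in the ciphertext itself):

```python
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def cipher_disk(text, shift_rule):
    """Rotate the 'moveable' plate against the 'stationary' one.

    shift_rule(i) gives the rotation of the moveable plate at the i-th
    letter -- the key is the agreed-upon manner of turning the disk.
    """
    out = []
    for i, ch in enumerate(c for c in text.upper() if c in ALPHABET):
        rotation = shift_rule(i)
        out.append(ALPHABET[(ALPHABET.index(ch) + rotation) % 26])
    return "".join(out)

# Example rule: a base shift of three, plus one extra unit after every
# fifth letter.
rule = lambda i: 3 + i // 5

ciphertext = cipher_disk("RENAISSANCE", rule)
plaintext = cipher_disk(ciphertext, lambda i: -rule(i))  # turn the disk back
print(ciphertext, plaintext)  # prints the encrypted text, then RENAISSANCE
```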

Communication engineers today are of course not dealing with cipher disks anymore when they organize the coexistence of 10 million distinct channels within one frequency. But they are still providing channels for communication through just such encryption. The frequency would be the „stationary“ plate, and each modulation of its amplitude in phases would be a „moveable“ plate. Obviously, such plates can be stacked and set relative to one another in an indefinite number of manners. With digital channels, every channel is one such key, crafted for every single message that is to be transmitted. Alberti’s cipher disk is still the best illustration of the peculiarly rational, yet not reasonable, „laws“ which technical telecommunication media obey when they enframe how messages can be stored, processed and transmitted. It is algebraic laws of equations that enframe, in particular notations (code-systems), a particular calculus of variations. In encrypted mediation, that which circulates remains invariant in the algebraic sense of the word: algebraic because we are on the level of equations, not their derivatives, which would be that of functions.[7]

This is important to realize: every act of coding spells out a code-system that is, in fact, a measured nothingness – a system of rationality entirely decoupled from any reasonable ground. That is why it can be a system (unlike Saussure’s semiology, for example): precisely because it introduces a notion of zero upon which it operates – zero was indeed the name attributed to the cipher’s character once it was introduced from Indian and Arabic mathematics into Europe. A cipher, as far as algebra and operability are concerned, is genuinely neutral and vacuous, neither positive nor negative. Empty, as Kant and especially Hegel insisted. Gleichursprünglich, as Ernst says today. Coding, because it is algebraic, operates outside of historical time. That is why a quantum-logical approach to information and data seems so promising. The set-up of a code-system is formulaic, equational. It literally represents nothing, or in other words, it constitutes a cipher: a notational body of reciprocal transformability that is transcendent to the distinction between positive and negative. Programming languages are algebraic, and they are heterogeneous with respect to each other.[8] It is epistemological concerns that, within the program of providing logical foundations for knowledge, try to systematize them in one globally consistent symbolic order.

Within the mathematical domain itself, to determine the solvability of an equation, all the terms on both sides of the equals sign must be arranged such that they cancel each other out and sum up to 0 (for example, 3x + 5 = 2x + 9 becomes x − 4 = 0). Thus they literally and actually describe nothing. In literally describing nothing, they can conserve what is contained in the givens (the data). This is different from a function. A function is derivative to an equation, and it does not concentrate on nothing, like the formula it is derived from. Unlike an equation, it is concerned with something: namely with determining that one variation of the invariant conforms to another variation of it. A function is always directed, while an equation rests in itself – although it never really „rests“. I would like to suggest that the character of a function may be considered as dynamic, and that of an equation as active. Or in other words, functional technology comes in the form of apparatus (with strictly controlled dispositions that are fixed, such that they support variations of one and the same behavior), while equational technics comes in the form of applications. Applications thrive on the opposite of centrally controlled dispositions; they open up their own zones of exchange by encrypting the domains in which they operate.

On the basis of this idea – that equations, while resting in themselves, do actively nothing, rather than represent and stand in for something – we can modulate and actualize their proper „domains of activity“ by endowing them with particular dispositions. This may sound far-fetched and hard to picture, but it describes, for example, how solar cells work. In the case of photovoltaics, a semiconductor is dispositioned such that it is capable of capturing photons from the light to which it is exposed. With solar cells, this disposition is a certain balance between the atomic weights of boron and phosphorus. Once exposed to sunlight, the electrons begin to jump in order to keep the balance of the initial saturation, and eventually spill over the framework of the cell, and hence produce electric current, garnered from sunlight. Their character cannot be captured in terms of functions (dynamics); it is equational (active). Solar cells do not need an overall framework; they tap into streaming radiation and encapsulate some of it by „imposing“ upon it their own „rationality“ (by capturing it in code). This is what „measuring media“ do, too: they engender the domains in which they operate through partitioning. We can regard not only internet apps as instances of such equational technics, but also any kind of computer simulation. Their activity is operational, and hence strictly technical: it produces what it is set up to produce. If you set up a simulation which computes global warming, you will get values that account for global warming. If you set up a simulation which computes the limits to population growth, you will get limits to growth. I do not mean to ridicule these simulations, or the seriousness and urgency of the themes they address, but if we were to set up a simulation that computes the end of the world, we would also get a result that must be considered valid within the constraints embodied by the parametric model, by the equation on which the simulation runs. Equational technics is strictly rational, and yet it is entirely decoupled from logic and reason. Which is, indeed, why it can support speculative reasoning so well.
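
The contrast between an equation that „rests in itself“ and a function that is directed can be made palpable in a few lines of Python, assuming the sympy library (the equation repeats the small example given above and is arbitrary, chosen for illustration only): the equation is first gathered into its canonical form, in which all terms sum to zero, and only then is a directed function derived from it.

```python
import sympy as sp

x = sp.symbols("x")

# An equation rests in itself: gathering all terms on one side makes
# them cancel out and sum to zero -- it literally "describes nothing".
equation = sp.Eq(3 * x + 5, 2 * x + 9)
canonical = sp.simplify(equation.lhs - equation.rhs)  # x - 4, set equal to 0

# A function is derivative to the equation, and it is directed: it takes
# an input and returns an output, one variation conforming to another.
f = sp.lambdify(x, canonical)

print(canonical)               # x - 4
print(sp.solve(equation, x))   # [4]  -- the equation's resting point
print(f(10))                   # 6    -- the function, evaluated directionally
```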

Thus, I would like to suggest that the „physics“ of mediagenic phenomena can best be explored experimentally through cryptographic sophistication. The real magnitudes of this communicational physics – real in Hansen’s sense of bearing a potentiality that is referent beyond the manifestation of it as a particular fact – are not given in natural kinds; they are continuously being engendered from within the order of operativity within which we choose to address problems. They are magnitudes, and they are real, because they are objective – yet objective in all the manners in which relations of equivalence can be formulated for them. Such real magnitudes are real in a strictly object-oriented order – and I use this word in the sense of computer programming, i.e. with deliberate distance to the term’s appropriations by OOO (object-oriented ontology) or OOP (object-oriented philosophy). Because in programming, it is clear that one operates within a space of encryption. Questions of denotation never even arise here. In these languages, one behaves in either a declarative or an imperative manner. In object-oriented programming, as opposed to procedural manners of programming, a coded description depends on no given classification system, but is set up as virtual in kind, in a shamelessly wasteful and a-pragmatic manner, with the simple aim of providing the greatest possible generality, i.e. in the form of algebraic kinds of to-be-specified-later types.
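
To illustrate what such „to-be-specified-later types“ look like in practice, here is a minimal sketch in Python using its standard typing module (the class name Channel and the whole example are mine, chosen to echo this paper’s theme, not taken from any particular codebase): the type parameter T names no given classification; it is a slot of greatest possible generality, specified only at the point of use.

```python
from typing import Generic, TypeVar

T = TypeVar("T")  # an algebraic kind: a type to be specified later

class Channel(Generic[T]):
    """A container declared with the greatest possible generality:
    it presupposes no classification of what it will carry."""

    def __init__(self) -> None:
        self._messages: list[T] = []

    def send(self, message: T) -> None:
        self._messages.append(message)

    def receive(self) -> T:
        return self._messages.pop(0)

# T is specified only at the point of use -- no prior taxonomy required.
numeric: Channel[float] = Channel()
numeric.send(3.14)

textual: Channel[str] = Channel()
textual.send("speculative")
print(numeric.receive(), textual.receive())
```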

******************************************************************

[1] Wolfgang Ernst, “Experimenting with Media Temporality: Pythagoras, Hertz, Turing,” in Digital Memory and the Archive, ed. Jussi Parikka (Minneapolis: University of Minnesota Press, 2013).

[2] Wolfgang Ernst, Gleichursprünglichkeit: Zeitwesen und Zeitgegebenheit technischer Medien (Berlin: Kadmos, 2013).

[3] Wolfgang Ernst, “Experimenting with Media Temporality.”

[4] Jean-François Lyotard, La condition postmoderne: rapport sur le savoir (Paris: Minuit, 1979).

[5] My following and very brief discussion of the physics of communication engineering builds upon basic knowledge in this field. For an elaborate and detailed account cf. for example Leon W. Couch, Digital and Analog Communication Systems, 8th ed. (Pearson, 2013).

[6] Cf. Vera Bühlmann, Die Nachricht, ein Medium: Generische Medialität, städtische Architektonik (Vienna: Ambra, 2014), especially the Coda to the book.

[7] It is important to realize that the invariant quantity whose fractions circulate in the transformability space which an equation constitutes features neither as a variable nor as a constant (coefficient) within the equation. A calculus of variations is today referred to as obeying laws of conservation. These find their perhaps most important application in physics, where energy is treated as the invariant quantity (its total amount in the universe can neither be expanded nor diminished) on whose basis we can modulate its „partitioning“ or even, to put it a bit drastically, its „communication“ (Mitteilung) by the electrons which „commute“ or jump between particles. This is for example how a photovoltaic cell works: it is a material disposition rendered such that it captures photons from the light to which it is exposed, thereby moving electrons to jump and „spill over“ the bounds of the chemical saturation of the cell, thus producing electric power. Such technics, like that of photovoltaic cells, I suggest calling equational technics. Cf. for example Dwight E. Neuenschwander, Emmy Noether’s Wonderful Theorem (Johns Hopkins University Press, 2010).

This aspect – that „content“ is treated as that which can be conserved throughout transformations within a reciprocal space constituted by signal horizons, in short: probabilistic encryption – seems to me the main characteristic distinguishing digital media categorically from analog media. A further context can help us to better comprehend this aspect. In a simplifying manner, we can think of analogicity as the idea that the words which can be articulated by the alphabet make up, all together, an inventory naming all things existing – in other words, a kind of Adamic or Original Language which represents a (or rather: the) conceptual order. We can easily find this idea at work in our intuitive but naive idea of the measurement system, with all its normalizations based on prototypical material artefacts – the Original Meter in Paris, for example, or the Original Kilogram, in France as well, and so on. Now, just as language has been studied from a structural and systematic point of view since the end of the 19th century, the International System of Units also began to rid itself of these material prototypical artefacts. The units are defined today within a structural system of conversion – an idea already propagated by Maxwell in the 19th century – where all the units must cohere; that is, exact values must be formalizable for some base units, and all the other units must be derivable from these base units. In the form that is authoritative today, all the definitions of the base units are precise algebraic formulations of possible conversions that can be applied to the base unit as an invariant (the meter for length, the ampere for electric current, the kelvin for thermodynamic temperature, the second for time, the mole for the amount of substance, the candela for luminous intensity) – except for the kilogram. It too is a base unit, but its definition is still a prototypical artefact. Thus it is the declared goal of recent meetings, in 2007 and 2010, to eventually set up a New International System of Units in which the structure of the system shifts from giving explicit precise definitions of the base units themselves to giving explicit precise definitions of the natural constants involved, such as the speed of light. In this way, so the ambition goes, it will be possible to do away with the kilogram artefact as well and find a formulaic definition for it. In order to come up with such a coherent system, it is necessary to assume „natural constants“ – as of today, the speed of light, the elementary charge, et cetera. I owe my thanks to Nathan Brown for drawing my attention to this in his talk „Hegel’s Kilogram“, given at the conference Quantity and Quality: The Problem of Measurement in Philosophy and Science, which he organized in April 2014 at UC Davis, California. For further information, the Wikipedia entry on the International System of Units provides a valid starting point.
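
To make the idea of such a coherent system of conversion palpable, here is a minimal sketch in Python (an illustration of the principle only, not the official SI definitions): derived units figure as algebraic combinations of exponents over the base units, so that they „cohere“ by construction.

```python
# Represent a unit by its exponents over four base units (m, kg, s, A).
BASE = ("m", "kg", "s", "A")

def unit(**exps):
    return tuple(exps.get(b, 0) for b in BASE)

def mul(u, v):
    return tuple(a + b for a, b in zip(u, v))

def div(u, v):
    return tuple(a - b for a, b in zip(u, v))

meter, kilogram, second, ampere = unit(m=1), unit(kg=1), unit(s=1), unit(A=1)

# Derived units cohere as algebraic combinations of the base units.
newton = div(mul(kilogram, meter), mul(second, second))  # kg*m/s^2
joule = mul(newton, meter)                               # N*m
watt = div(joule, second)                                # J/s
volt = div(watt, ampere)                                 # W/A

print(dict(zip(BASE, volt)))  # {'m': 2, 'kg': 1, 's': -3, 'A': -1}
```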

[8] Cf. the manuscript of my talk at the Universal–Specific, from Analysis to Intervention conference at ETH Zurich in November 2013: „The Question of ‚Signature‘ and the Computational Notion of Genericness“, available at www.academia.edu/5117590/The_question_of_signature_and_the_computational_notion_of_genericness.
