“…linguistics has just provided the death of the author with a precious analytical tool, by showing that the complete utterance is an empty process that functions perfectly without the need for filling it with its individual interlocutors: linguistically speaking, the author is never anything more than he or she who writes, in the same way as the self is none other than the person who says I; language knows a ‘subject’, not a ‘person’, and that subject, empty except in the utterance itself, which is what defines it, is sufficient to keep language ‘on its feet’, that is, to completely exhaust it”. (Roland Barthes, „The Death of the Author“, in The Rustle of Language)
0 „Who“ says the enunciation of the universal is untenable?
„The paradox of the enunciation of the universal. Historical experience and the history of philosophy have made us highly sceptical towards the very possibility of enunciating the universal, yet the universal can be said to have become a fact of contemporary life, and the attempt at enunciating the universal remains an inescapable demand, in politics and notably in practice. Not to enunciate the universal is impossible, but to enunciate it is untenable.“ (Etienne Balibar, Construction and Deconstruction of the Universal, Critical Horizons, 2006)
„As individual production, utterance can be defined, in relation to language, as a process of ‘appropriation’. The speaker appropriates the formal apparatus of language and utters their position as speaker by means of specific signs, on the one hand, and by using secondary procedures, on the other. […] The individual act of appropriation of language places the speaker in their own speech. This is a constituent fact of utterance. The presence of the speaker in their utterance means that each instance of discourse constitutes an internal point of reference.“ (Emile Benveniste, Problems in General Linguistics)
By raising the issue of „signature“ in terms of a postulated „literacy“ in computation, this lecture will focus on the philosophical backgrounds against which what the theme of this conference suggests as an opposition – between universality and specificity – need not be taken as an opposition at all. These backgrounds concern the status of algebra for mathematics, in the comprehensively philosophical sense of mathematics as „the art of learning“ in general, from the Greek mathema, „that which is learnt“. We tend to forget this legacy of thinking about mathematics today, but Heidegger has certainly given it new relevance in his lectures on Kant entitled Die Frage nach dem Ding. Mathematics, he says, is giving to oneself what one already has.
Algebra provides ways of managing the infinite – this is what we can read in the introductions to textbooks on the subject. Yet in practice, the common assumption today is to regard the status of algebra, for that which can be learnt, as functional and instrumental. In disagreement with that, I would like to make a case for regarding it, instead, as constitutional. What this shift of perspective results in is that algebraic enunciation of the universal means, to put it in an Aristotelian way, to raise the wealth of that in which the specific is „richer“ than its genus: namely, differences.
Let us begin by remembering that algebra has evolved around developing what are often called „auxiliary constructions“. Such constructions are capable of supporting proportional reasoning. An algebraic equation establishes how two things, A and B, may be regarded as equivalent. Over the millennia, algebra has developed ever more general forms and procedures for formulating general equations at ever higher levels of abstraction: that is, equations raised to their quadratic, cubic, quartic, quintic powers, and higher. This „generality“ that is thereby established, I would like to suggest, relates not to the form of a thing, but to a thing's powers. What changes over time follows a simple principle: the level of abstractness at which an equation's terms can be handled with general procedures is proportional to the number of ways in which such equivalence can be reasoned and maintained: the higher the abstraction, the more ways of resolving a postulated equivalence.
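One way to make this proliferation of resolutions tangible is a small numerical sketch (an illustration added here, not part of the historical material): by the fundamental theorem of algebra, an equation of degree n admits n solutions in the complex numbers, so raising the degree literally multiplies the ways in which a postulated equivalence can be resolved. The routine below is one assumed implementation, the Durand-Kerner iteration, which approximates all roots of a polynomial at once.

```python
def all_roots(coeffs, iterations=200):
    """Durand-Kerner iteration: approximate all complex roots of the
    polynomial coeffs[0]*x^n + ... + coeffs[n] simultaneously."""
    n = len(coeffs) - 1
    monic = [c / coeffs[0] for c in coeffs]   # normalize the leading term

    def p(z):                                 # Horner evaluation of the polynomial
        acc = 0j
        for c in monic:
            acc = acc * z + c
        return acc

    # distinct, non-real starting guesses (the customary choice)
    zs = [(0.4 + 0.9j) ** k for k in range(1, n + 1)]
    for _ in range(iterations):
        new = []
        for i, zi in enumerate(zs):
            denom = 1 + 0j
            for j, zj in enumerate(zs):
                if i != j:
                    denom *= (zi - zj)
            new.append(zi - p(zi) / denom)
        zs = new
    return zs
```

The quadratic x² = 2 admits two resolutions; raising the power to a quartic, x⁴ = 1, already admits four, two of which are imaginary.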
We might well ask whether this development is necessarily a good thing. But we should be aware that to raise this question is to touch upon the question of humanism. Stances towards what it means to be human need not only to clarify „natural dispositions“; they also need to embed intellectuality within a „virtuality of morals“. This is also at stake in Balibar's citation: enunciating the universal does not only establish the conditions for equality; at the same time, it also animates intellectuality.
However one might feel inclined to relate to this, what we need to state is that the increase in the power of algebraic genericness provides the conditions for the co-existence of what, in the generality of a lower level, must count as an irresolvable contradiction. Mathematically, we would then speak of an unsolvable equation and call for more realism in our speculations. But this would be to disregard that algebra has developed precisely along a vector of abstraction which has allowed ever more equations to be rendered into a general form such that they provide possible solution spaces, and hence the conditions to settle conflicts into co-existence, such that they need not be decided.
But how can we make sense of this „vector of abstraction“? Is „to abstract“ not to extract from what is given, and hence necessarily to reduce that which is given concretely to some deficient general form? In philosophy, this is the very hot spot around which „realist“ and „nominalist or conceptualist“ stances towards the universal debate fiercely. From my point of view, if we speak about abstraction as engendering algebraic genericness, we can simply address enunciations of the universal in terms of literacy. Is literacy real? Of course. Is it conceptual? Of course.
To sum up the argument so far, we can say more clearly what algebraic literacy refers to. The higher the capacity to treat equations that are raised in their power – i.e. the more abstract the level of algebraic genericness – the vaster the solution spaces yielded thereby. If we remember that the level of genericness determines the degree of specific differentiation in which the terms that make up an equation can be treated, we can easily relate the situation to language. With an active vocabulary of 40'000 words rather than 10'000 words, the world in which one lives is simply more differentiated. To express, share, and communicate this differentiation, however, it is not enough to make grammatically correct sentences. This is why an utterance – different from the logical form of a statement – always bears the signature both of the person who uttered it and of the one who received it. With computed objects, and this is my main point, the situation is not much different.
1 algebraic extension
The perspective that algebra is a kind of language was perhaps most prominently pursued by Leibniz and Spinoza in the 17th century, and then again by the algebraists in the 19th century. At issue for this perspective was, then as today, how we could make sense of objects, after the rise of Cartesian space and of the infinitesimal calculus in science, if their extension is analytic. Analytical extension is, in its algebraic constitution, of a higher level of abstraction than (Euclidean) geometric extension. To illustrate as quickly as possible what this involves, we can recall the Cartesian distinction between two substances, the Res Extensa and the Res Cogitans. This distinction can hardly be overestimated if we consider it as the crucial one for modern science at large: science that is rational, experimental, empirical. Well, it is this distinction which gets into trouble with analytic extension.
I will come back to that in a moment, but before we begin with a proper and much slower consideration of these issues, I would like to name a few of the main protagonists, as a point of entry, so that those of you who happen to be a little familiar with the history of mathematics and computing will understand better what I am arguing for.
We have George Boole and his theory of a general procedure for reasoning in terms of probabilities (The Laws of Thought); we have Hermann Grassmann and his theory of algebraic extension as a means for geometrical, instead of numerical, analysis (exterior algebra; vector spaces were introduced here); and we have Richard Dedekind with his procedure for providing proper concepts of numbers, and with his contributions towards a categorial treatment of numerical concepts. In this, Dedekind was perhaps the most algebraic algebraist among those named, because his procedure, the so-called Dedekind Cut, really did provide a whole new universe with possible ways of „managing the infinite“ – with it, we have the entire strange world of abstract quantities that allows us to think in terms of very abstract distinctions in programming. For example, if we were to picture what it means to think without algebraic extension, we are perhaps likely to say: I know this thing here can vary in its color. A cow can be more or less dark, let's say. If we consider this with the mindset of the 18th century, that of (thermo)dynamics and probabilistics, on the other hand, we will say: there is a variety of manners in which the cow can be colored differently. So we deal with sets. But in the mindset introduced by Richard Dedekind, and if we are geometricians like Bernhard Riemann, we will consider not one variety but several of them in terms of the entirety of their possible interplay – say, color, temper, age, and so on – and hence focus on their variability in particular contexts. The mathematical concepts that characterize varieties and variabilities are called Groups and Fields, Modules and Rings, and they are already pretty abstract; but they are all 19th century. From the point of view of today's mathematics, which peaks in all the branches made possible by category theory, they have long given way to an entirely new manner of doing analysis.
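Since the Dedekind Cut can sound forbiddingly abstract, a hedged sketch in Python (my illustration, not Dedekind's notation) may help: an irrational number such as the square root of 2 is constituted entirely by a predicate on the rationals, and any desired approximation can be computed from that predicate alone.

```python
from fractions import Fraction

def lower_cut_sqrt2(q):
    """Dedekind's cut for the square root of 2: a rational q lies in the
    lower set iff q is negative or q*q < 2. The cut, this predicate on
    the rationals, is the number; no decimal expansion is presupposed."""
    return q < 0 or q * q < 2

def approximate(cut, lo=Fraction(0), hi=Fraction(2), steps=40):
    """Bisect between a rational inside the cut and one outside it,
    narrowing in on the 'gap' in the rationals that the cut names."""
    for _ in range(steps):
        mid = (lo + hi) / 2
        if cut(mid):
            lo = mid
        else:
            hi = mid
    return lo
```

Forty bisection steps already pin the cut down to within about 10⁻¹², using nothing but exact rational arithmetic.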
Category theory is considered a mathematical language even by mathematicians, and analysis no longer seeks „elements“, in the sense of the Cartesian Rules for the Direction of the Mind, as the „smallest possible units“; it approaches everything in terms of symmetry relations and invariance, and correspondingly it identifies Sheaves instead of Elements, Bundles instead of Sets, Toposes that are coordinated specifically by Abstract Categories, not Regions within a Homogeneous Space.
Despite these developments, when working with modeling software in CAAD we can easily get the impression that we are still handling objects in a formal space that represents – rather than hosts articulations of – geometrical extension, as if in a framework that has not, in essence, changed since Descartes. But certainly, Descartes did not think about the „smallest possible units“ on which his analytical methods build as units of a language. To him, they were quantities that extend in geometrical space. They constitute Res Extensa. The „elements“ of algebraic extension, on the other hand, find their extension in abstract metrical spaces – spaces that are engendered through conception, and not extracted in any direct way from things as they are given in their physically manifest form. Such conceptually abstract spaces are vector spaces, matrix spaces, and manifolds that extend in n dimensions. Bertrand Russell thematized this shift in his 1897 dissertation An Essay on the Foundations of Geometry, and he made it very clear that what is at stake in regarding algebra as a kind of language cannot be accommodated in any philosophically non-problematical way. It conflicts strongly with epistemological interests in the logical foundations of mathematics, because in these mathematical spaces logic is instrumental, not external and descriptive. So the view of algebra as language has been much disputed, most fiercely perhaps around the turn of the 19th to the 20th century. But what I would like to focus on here is not a discussion of this conflict, but the beauty of the perspective that views algebra as a language.
2 Programming languages – computational utterances in a literal number space
Within theoretical information science – by which I mean the field where programming languages are invented and developed – the interest in language lies in the sheer transformability of whatever can be „articulated“ by means of „generative grammars“. The interest in language is purely structural, and the articulations such languages make possible are structural too (abstract objects). We obviously see an appropriation of linguistic insights for genuinely operational ends. In what sense, then, can we understand programming languages in terms of linguistics at all? Or in other words: if linguistics studies how sound and meaning are related, then what would be the corresponding pair for characterizing the study of programming languages and their abstract structures?
The answer I would like to suggest is to regard these „structural objects“ in terms of an „integrity“ that is proper to them, symmetrical to how „meaning“ is assumed to be proper to words and sentences in linguistics. Such „integrity“ of structural objects can be studied via the relation between the „literacy“ and the „signature“ that constitute an object‘s formulation and expressive power.
But let us first look closer at the backgrounds of programming languages. Ada Lovelace (1815-1852), the daughter of the somewhat scandalous poet (and freedom fighter) Lord Byron (1788-1824), is famous for perhaps the major leap in thinking which stands behind the paradigm of language for computation: she considered that Babbage's Difference Engine, and its successor, the Analytical Engine, incorporate an abstract space in „manifest“ (symbolical) form, such that it could be coded. But let's first look at Babbage:
„Charles Babbage came up with the idea about the time the Analytical Society was founded in 1812. He was sitting in front of a set of logarithms that he knew to have errors. At that time there were people, called ‘computers’, that would compute parts of logarithms in a sort of mass productive enterprise. Babbage had the thought that if people could break down bits of a complicated mathematical procedure into smaller parts that were easily computable, that there must be a way to program a machine to work from these smaller bits and compute large mathematical computations, and to do so more quickly without human error.“ (European Graduate School library entry on Babbage)
Ada Lovelace was a mathematician, but her interest in these engines lay precisely not in the fact that they operated mechanically by bundling arithmetic sequences into handy bits and pieces, but in the fact that the numbers actually open up an entirely different kind of space to think in. She was the first to consider that the numerical space, as it is „manifest“ in such an engine, could actually have memory, and hence be structured in much more complex ways than the ideas of non-striated number spaces on which arithmetic usually relies. Much more than that, she thought, a numerical realm with memory and differential, heterogeneous coordination can be structured such that it can host activities, not unlike the way verbs are hosted by the grammatical structures of nouns, prepositions, and adverbs in language. That is, in different temporal forms that allow for story-telling or, as we are more used to saying, for encoding several activities into a complex which we call procedures. From our perspective today we could say that she attended to the mediality of numbers, not only to their instrumentality – much as, since the linguistic turn, we attend to the mediality of language, not only to its supposedly neutral instrumentality. Ada Lovelace has been called „the Enchantress of Numbers“ because she thought about the numbers in these engines as notational codes, and on this assumption she could invent the first theory of how to program.
With Ada Lovelace's leap still in mind, let us look briefly at the much more recent development of such thinking, which situates itself in a literal number space that can host something like grammars for formulating computational utterances, and at what we can imagine as those ‚abstract' activities which Lovelace envisioned could be staged and dramatized, through programming, in a number space that is, peculiarly so, literal.
Two very strong paradigms can be distinguished in programming throughout the last decades. Early languages such as Fortran, Ada, or C started out with a procedural paradigm. The main interest was to make available for easy application, as a kind of toolbox of “instruments” in coded “form,” the precise way in which a certain organizational procedure needs to be set up in order to function well. Think of SAP – I'm sure almost everyone has had his or her encounter with it. The developments in this paradigm are driven by the fact that every step of decision can thereby be “dispersed” into constitutive procedures, and hence an infinitesimal limberness can be introduced into organizational forms. The paradigm subsequent to the procedural one pursued a much less directly hands-on approach, and instead became more didactical. With languages like Smalltalk, Java, and C++, an object-oriented paradigm followed the procedural one, and it strictly kept apart the levels of what (described by procedures) and how (the specification of this what). Through this distinction, negotiation began to be supplied by “computational augmentation” about what is to be reached, and about how systems can be devised that allow the instantiation of procedures (whats) in much wider variations. Object-oriented programming allows devising entire “libraries” of “abstract objects” that depend on no statically specified order or classification system. Such abstract objects are called generic, and if we consider algebraic genericness as the levels of abstraction at which things are treated in their powers, we can understand that they are not really “objects” at all – it would be more adequate to say that they incorporate entire “objectivities“: they allow for one-of-a-kind particulars to “concretize” singularly and to be fitted optimally to the local and contextual requirements of a task – precisely because they are specified instances of universal enunciation, in the manner of algebra.
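The what/how distinction just described can be made concrete in a minimal Python sketch (names such as Store and MemoryStore are hypothetical, chosen only for illustration): the abstract class enunciates the what, each subclass supplies a how, and procedures are written against the generic level alone.

```python
from abc import ABC, abstractmethod

class Store(ABC):
    """The 'what': a generic objectivity, enunciated without committing
    to any particular realization."""
    @abstractmethod
    def put(self, key, value): ...
    @abstractmethod
    def get(self, key): ...

class MemoryStore(Store):
    """One 'how': a concretization fitted to its local context
    (here, an in-memory dictionary)."""
    def __init__(self):
        self._data = {}
    def put(self, key, value):
        self._data[key] = value
    def get(self, key):
        return self._data[key]

def label(store: Store, key, value):
    """A procedure formulated at the generic level: it runs against any
    instantiation of the 'what', present or future."""
    store.put(key, value)
    return store.get(key)
```

The procedure label never mentions the dictionary; any future Store, backed by a file or a network, instantiates the same universal formulation.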
3 The amphibolic status of algebraic conception
So let us look at algebra more slowly, by following its discussion in a dedicated article in the Stanford Encyclopedia of Philosophy. Algebra is „a branch of mathematics sibling to geometry, analysis (calculus), number theory, combinatorics etc“, we are told, although, as the article continues, „in its full generality it differs from its siblings in serving no specific mathematical domain. Whereas geometry treats spatial entities, analysis continuous variation, number theory integer arithmetic, and combinatorics discrete structures“, the introductory paragraph continues, „algebra is equally applicable to all these and other mathematical domains.“
What we can immediately see from this is twofold: (1) it is customary to regard algebra as on a par with the other mathematical disciplines, in a manner that is „instrumental“ rather than „constitutive“, as I would like to argue – it is presented as a brother or sister to them, not their parent; (2) yet we find support for the non-instrumental perspective immediately: what is unique about algebra among its siblings is, we are told, that it is independent of any particular domain. A bit later on, when it comes to why algebra is of philosophical interest, the implications of this become even more explicit: „Algebra is of philosophical interest for at least two reasons. From the perspective of foundations of mathematics, algebra is strikingly different from other branches of mathematics in both its domain independence and its close affinity to formal logic.“ – so here we seem to be at the kernel of the problem at stake in conceiving of mathematics as language: there appears to be a competition about whether we should think of it as governed and organized by algebra or by logic. And yet, isn't it rather strange to see them in competition, if we follow how the article continues?
„Algebra has also played a significant role in clarifying and highlighting notions of logic, at the core of exact philosophy for millennia. The first step away from the Aristotelian logic of syllogisms towards a more algebraic form of logic was taken by Boole in an 1847 pamphlet and subsequently in a more detailed treatise, The Laws of Thought, in 1854. The dichotomy between elementary algebra and modern algebra then started to appear in the subsequent development of logic, with logicians strongly divided between the formalistic approach as espoused by Frege, Peano, and Russell, and the algebraic approach followed by C. S. Peirce, Schroeder, and Tarski.“
This observation, that algebra has played a crucial role in the development of logic over the millennia, provides the actual structure the encyclopedia article follows. On its basis, it distinguishes three „generations“ of algebra: elementary, abstract, and universal. The article makes no suggestion as to how these three „generations“ are to be related to each other. This is rather confusing, because the separation into „elementariness“, „abstractness“ and „universality“ seems to suggest that they all unfold within one common scale, within which they gradually, and in a kind of bottom-up manner, extend their scope. This invokes a narrative of progressive approximation of a final goal – universality, the most recent generation of algebra, supposedly being the place to be reached. If we assumed instead that the generations correspond to different levels of abstractness, to each of which correspond notions of elementarity, abstractness and universality specific to that level, we could rely on such a generational model of algebra in order to compare how these notions can be formulated in different manners. But for now, and just to get more familiar with this difficult relation between logic and algebra, we will stick close to the generational distinction as proposed in the Stanford Encyclopedia article. Let us recall, perhaps, that algebra provides „finite ways of managing the infinite“, as the article states, by elaborating general procedures for enumerating and counting the possible solutions that can be found for a problem insofar as it is formulated in general terms.
(1) The article speaks about elementary algebra as having provided, for many centuries if not millennia, finite ways of managing the infinite. It elaborates: a formula such as πr² for the area of a circle of radius r describes infinitely many possible computations, one for each possible valuation of its variables. A universally true law expresses infinitely many cases; for example, the single equation x+y = y+x summarises the infinitely many facts 1+2 = 2+1, 3+7 = 7+3, etc. Each of its methods is also applicable to many nonnumeric domains, such as the subsets of a given set under the operations of union and intersection, the words over a given alphabet under the operations of concatenation and reversal, the permutations of a given set under the operations of composition and inverse, etc. Each such corpus of application is called „an“ algebra, and it consists of the set of its elements and the operations on those elements obeying the laws holding in that domain. Here, each algebra is treated in a fixed and closed-off manner. We can say that what is provided in them are distinct inventories of coding. These inventories allow us to encode particular situations (events) in ways that let them appear as a case, that is, as an instance of a general form for which the inventory provides the means for computing possible deviations, conjugations, and so on.
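The point that „an“ algebra is a carrier set together with operations obeying laws can itself be sketched in a few lines of Python (my illustration, with finite sample carriers): the same formal law, x∘y = y∘x, can be tested uniformly across numeric and nonnumeric domains, holding in some of them and failing in others.

```python
from itertools import product

def commutes(op, elements):
    """Check the law op(x, y) == op(y, x) over a finite sample of an
    algebra's carrier set."""
    return all(op(x, y) == op(y, x) for x, y in product(elements, repeat=2))

numbers = range(-3, 4)                                       # integers under addition
subsets = [frozenset(), frozenset({1}), frozenset({1, 2})]   # sets under union
words = ["ab", "c", "de"]                                    # words under concatenation
```

Addition and set union commute on these samples; concatenation of words does not ("ab" + "c" differs from "c" + "ab"), showing that each inventory of coding comes with its own laws.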
We can imagine the relevance of these inventories for science by considering that their symbolic constitution was, for example, crucial for learning to deal with quantities that must appear, in any intuitive sense, as genuinely „unreal“ – negative values, infinitesimals, imaginary units. By dealing with them purely symbolically, instead of intuitively, elementary algebra allowed, for example, the passage from mechanics to dynamics, and opened up, with that, a whole wealth of new possibilities that could now be realized – thermodynamics, the clocking and control of processes in systems with the steam engine, the translation of this systemic view to working conditions with the shift from manufacture to industrial fabrication in the factories, the harnessing of electricity, and so on. Algebra deals with symbols whose referents may be left arcane – in this way, it can work with assumed quantities that, strangely, are not really (physically) there: an infinitesimal is an infinitesimal exactly because it has no extension in space whatsoever, and the imaginary unit not only proportionalizes „complex“ quantities but, strictly speaking, proportionalizes „virtual“ quantities – virtual in the sense that if we try to picture them, they have an extension in time without having one in space. In his recent History of Abstract Algebra, Israel Kleiner writes illustratively: “Bombelli [(1526-1572)] had given meaning to the ‚meaningless‘ by thinking the ‚unthinkable,‘ namely that square roots of negative numbers could be manipulated in a meaningful way to yield significant results. This was a very bold move on his part. As he put it: ‘it was a wild thought in the judgment of many; and I too was for a long time of the same opinion. The whole matter seemed to rest on sophistry rather than on truth.
Yet I sought so long until I actually proved this to be the case.’“ Kleiner describes what Bombelli achieved thereby: Bombelli developed a “calculus”, he explains, for how to manipulate these impossible quantities, and this was the birth of complex numbers. „But birth“, he points out, „did not entail legitimacy.” This question of legitimacy arises because, in elementary algebra, computing with such arcane symbols has added a new dimension to mathematics, with striking consequences: the input of certain values into a formula may now not only turn out to be unsolvable for lack of solutions; it may also yield a solution space so vast in options that none of the possible solutions seems more necessary than any other.
(2) The next generation of algebra is called abstract algebra. Whereas elementary algebra is conducted in a fixed algebra, abstract algebra treats classes of algebras having certain properties in common, typically those expressible as equations. In this generation, which emerged only in the course of the 19th century and is introduced via the classes of groups, rings, and fields, the inventories of elementary coding are comprehended within larger frameworks that allow them to be generalized. With this, the central interest was no longer to find a particular solution, but to modulate and synthesize entire solution spaces by exploring the symmetry structures among them. Abstract algebra establishes, we might say, on the basis of elementary inventories for coding, generic spaces of potentiality. Within these generic spaces, the main goal is to expand the vastness of generically formulated solution spaces.
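What it means to treat „classes of algebras having certain properties in common, typically those expressible as equations“ can be sketched mechanically (a hedged illustration of mine, restricted to finite carriers): the defining equations of a group can be checked on any finite carrier whatsoever, so one and the same test ranges over an entire class of algebras rather than a single fixed one.

```python
from itertools import product

def is_group(elements, op, identity):
    """Test the group equations on a finite carrier: closure,
    associativity, identity, and existence of inverses."""
    elems = list(elements)
    closed = all(op(a, b) in elems for a, b in product(elems, repeat=2))
    assoc = all(op(op(a, b), c) == op(a, op(b, c))
                for a, b, c in product(elems, repeat=3))
    ident = all(op(identity, a) == a == op(a, identity) for a in elems)
    invert = all(any(op(a, b) == identity for b in elems) for a in elems)
    return closed and assoc and ident and invert
```

The integers modulo 5 under addition pass the test; the same carrier under multiplication fails it, since 0 has no inverse. The test itself belongs to no particular algebra.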
(3) With this, we are in the third generation of algebra – universal algebra. In universal algebra, the movement of analysis no longer departs from cases and seeks to find a generalization of them. Analysis in universal algebra is inverse: it assumes a generalization speculatively, and computes in order to see whether one might indeed, i.e. empirically, find cases that correspond to this generalization. Whereas elementary algebra treats equational reasoning in a particular algebra (an inventory for coding), and abstract algebra studies particular classes of algebras (generic solution spaces), universal algebra studies classes of classes of algebras, by attending to their categoricity. It begins to explore the problematicity proper to the abstract and generic solution spaces, we might say. Universal algebra does not deal with inventories of coding, nor with their generalization into classes and sets; on the basis of universal code, it explores any way of modeling that may be formulated. This inversion is challenging for the link between mathematical formalization and empirical falsification, because it treats any solution that can be computed as an arbitrary case. Logic comes to be asked to introduce criteria for identifying necessities. Without the intervention of logic, we get no clue whether a particular solution is actually the best possible one, or even in which regard it is a good or an insufficient one. In short, problems are still – in full conformity with modern experimental science – dealt with as that which is to determine scientific reasoning by guiding the course of analysis; but at the same time, any one formulation of a problem is regarded as problematical in turn, that is, as genuinely indeterminate and yet resolvable – and it is the way in which it is resolved that effectively determines scientific reasoning.
Let us work out the contrast more strikingly: Abstract algebra operates within a notion of fully determined general nature where each correctly computed solution counts as a necessity, and within the confines of which it allows for gradual variation; universal algebra on the other hand operates within the impredicative horizon of a determinable universality, within which solutions can vary not only gradually, but also categorically – the values of its formulations can be predicated within varieties that may differ in kind.
This was indeed the key critique of George Boole's algebraic logic, and it is illustratively expressed in an open letter by one of his contemporaries in the mid-19th century:
„The disadvantage of Professor Boole’s method is […] he takes a general indeterminate problem, applies to it particular assumptions […] and with these assumptions solves it; that is to say, he solves a particular determinate case of an indeterminate problem, while his book may mislead the reader by making him suppose that it is the general problem which is being treated of. The question arises, is the particular case thus solved a peculiarly valuable one, or one more worthy than any other of being solved? It is clearly not an assumption that must in all cases be true; nor is it one which, without knowing the connexion among the simple events, we can suppose more likely than any other to represent that connexion.“
Boole's methods were not shown to be faulty or inconsistent; the reason why they had been disliked or even spurned by so many was the immense depth of horizon they had opened up. The openness of this horizon results from regarding intuition not as based on a sensible notion of quantity, referring to something that extends in both time and space, but as referring to an intellectual notion of quantity. It is a distinction which affects the very heart of critical philosophy. Immanuel Kant himself had considered this option before discarding it. In a short appendix to his Critique of Pure Reason, entitled „The amphiboly of concepts of reflection“, Kant criticized Leibniz for having departed, in his thoughts on a universal characteristic, from an intellectual notion of intuition instead of a sensible one; he rightly observed that, in consequence of this, judgements about a thing in general – i.e. about an object – can never be possible in an unproblematical manner. With this development, mathematics is opening up an abstract domain for developing and raising our faculties of judgement – yet daringly decoupled from all grounds that could, unproblematically, be considered grounded in a reason that simply counts as natural. This is why, as I want to argue here, we ought to begin considering our abilities to compute in terms of a literacy that does not, in itself, answer to demands of consistency, necessity, or even truth.
It is surely due to these reservations that Boole’s algebra, like the contributions of Hermann Grassmann, Bernhard Riemann and others, was met with the greatest possible suspicion by its contemporaries. It is hardly an exaggeration to say that within philosophy, the view of algebra as a language capable of articulating the universal in the form of particular cases fell nearly into oblivion, except among a few enthusiasts like Charles Sanders Peirce and Alfred North Whitehead, until Claude Shannon realized that Boole’s logic could actually be applied to electrical circuits. On this basis he later developed his Mathematical Theory of Communication. The revival of the view of algebra as language, and as constitutional rather than instrumental for mathematics at large, is very recent (category theory has developed roughly since the 1960s) – and it is still regarded as „too abstract to be useful“ by many.
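Shannon’s insight can be illustrated with a deliberately anachronistic sketch (the function names and notation here are mine, not Shannon’s): in Boole’s two-valued algebra, multiplication behaves like two switches wired in series, and the operation x + y − xy like two switches wired in parallel – so algebraic identities become statements about circuits.

```python
# Illustrative sketch: Boole's two-valued algebra read as switching circuits.
# 0 stands for an open switch (no current), 1 for a closed switch (current).

def series(x: int, y: int) -> int:
    """Switches in series conduct only if both are closed: Boolean product x*y."""
    return x * y

def parallel(x: int, y: int) -> int:
    """Switches in parallel conduct if at least one is closed: x + y - x*y."""
    return x + y - x * y

# Truth table: every combination of the two Boolean values.
for x in (0, 1):
    for y in (0, 1):
        print(f"x={x} y={y}  series={series(x, y)}  parallel={parallel(x, y)}")
```

The point of the sketch is only that a particular physical arrangement (a wiring of relays) instantiates a general algebraic law – the inverse relation between the universal and the particular that is at issue here.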
And yet, in what kind of world would we find ourselves if we began to consider that, through information technology, universal algebra is de facto constitutive for nearly all the domains in which we organize our living environments today?
4 Signing the Natural Contract
I can do no more than exemplify the beauty I see in this perspective by finally attending to Michel Serres’ notion of world-objects, which I placed so prominently in the abstract to this talk.
“By world-objects I mean tools with a dimension that is commensurable with one of the dimensions of the world. A satellite for speed, an atomic bomb for energy, the Internet for space, and nuclear waste for time […] these are four examples of world-objects.”
In his 1990 book The Natural Contract, Michel Serres proposes a shift in perspective on how to address the fragility of the Earth and our responsibility for it. What he suggests is to treat in an inverse manner, corresponding to the universal algebra just described, the question central to humanism, anthropology and ecology alike: the assumption of a universal nature of mankind and of the earth. Instead of attempting to settle on a certain definition of such a universal nature, he suggests giving primacy to a notion of „collectivity“ as a natural and universal horizon of what it might mean to be human, on a par with considering what it might mean to be anything else. It is within the range of universality as horizon, as the subject of collectivity, that he places responsibility for his world-objects.
His universal horizon of collectivity is at once object and subject. It allows us to think a new subject-object distribution, such that we need not, as he puts it drastically, „become the victims of our victories, the passivity of our activities“. What we currently experience, as we learn about the fragility of the Earth, he suggests, is that „The global object becomes subject because it reacts to our actions like a partner“. Now, everyone who knows the writings of Michel Serres even slightly knows that agitation is certainly not to his taste as an intellectual. But then why does he put things so drastically, and what does he attempt to relax by doing so?
As Serres puts it in a retrospective lecture from 2006: „The Natural Contract does not use the term ecology once. Why not? Because it deals with the philosophy and the history of Law, and in particular with the question of who has the right to become a legal subject.“ It is in these terms that he radically reframes the issue of identity.
Everyone is aware of the role of that status for modern values. With the famous Declaration of the Rights of Man and of the Citizen decreed during the French Revolution, the compass of who counts as a legal subject gradually began to open up beyond the scope of a few rich white males. But only with a similar yet dedicatedly Universal Declaration, published by UNESCO after the Second World War, can we say that everyone is a legal subject today (although this has become an issue once again, with the peculiar status, or rather non-status, of the so-called sans-papiers). But Serres’ interest is a principled one. In short, he declares: „My book argues that this Declaration [the UNESCO one] is not yet universal as long as it does not determine that all living beings and all inert objects, in short, all of Nature have in turn become legal subjects.“
Of course, the main objection that has been raised against Serres’ book is to ask: if we are to replace the Enlightenment paradigm that „puts nature on trial“ – this too is meant in a juridical sense by Kant – and instead seek to negotiate a contract with nature, then „who will sign such a Contract, since Nature does not have a hand with which to write nor an understanding capable of any such intention?“ I must count on your goodwill in dismissing – or at least postponing – the almost instinctive suspicion that Serres might naively or animistically think Nature is a person.
If nature is the distributed universal subject of collectivity, we can see how, for Serres, every enunciation of a thing’s universal nature would be such an act of signature. For Aristotle, to speak meant to „lend our voice to the inarticulate elegance of nature“; in a similar manner, we can think of what we do in doing science as lending our voice to express nature’s inarticulate agreement or disagreement. We can view algebra as a language for learning about nature. Such a language would be capable of constituting the corpus of what we might call „General Literacy“. With this, I wish to mark out a turn in perspective regarding the symbols of algebra similar to the one achieved by Saussure when he abstained from studying the „original“, pure or Adamic language, and instead began to focus on General Linguistics.
November 14th 2013
My manuscript for the Universal – specific. From analysis to intervention? conference, organized by ETH Zürich, D-ARCH Department of Architecture, Institute for the History and Theory of Architecture (gta), Prodoc Art and Science.
Keywords: Michel Serres; the algebraic quantity notion; computability; literacy
you can find the abstract here.