Here are the programs, materials and documentation of the research courses I teach. These courses are possible because of an extraordinary slot for *meta-disciplinary* *research in theory* at the Institute where I am located. The doctoral students have very diverse backgrounds, often more applied than theoretical. The courses reflect our ongoing research on how to theorize technology in a way that helps to characterize, sort, relate, differentiate and organize the implications involved in computation as a practice, such that theory is capable of inciting experimental perspectives. Any feedback is highly welcome.

**cf also my posts:**

*How to theorize technology today?*

*Scopes of analysis and synthesis enabled in different ways by different “number spaces”*

*Towards an algebraic understanding of architectonic interpretation (according to J. Vuillemin)*

*Projective theory on technology: an emphatic plan*

********************************************************************************************************************

# Winter 2013/14 PhD Kolloquium on ‘computing symbols as literacy and ability’

The next PhD Kolloquium (Winter 2013/14), ‘Computing symbols as literacy and ability’, starts next Tuesday, September 24th.

download the flyer here: PHD_KolloquiumWS13_flyer

**Signification | Communication: theory and applications of glossematic coding as method for pre-specific modeling**

«The entities of linguistic form are of “algebraic” nature and have no natural designation; they can therefore be designated arbitrarily in many different ways.»

(Louis Hjelmslev)

Since Claude Shannon’s Mathematical Theory of Communication (1948), the notion of information in its technically treatable sense is often distinguished from its linguistic sense by ascribing to the former, as opposed to the latter, a purely quantitative treatment. Yet since the founding documents of a general linguistics in the late 19th century, it has been clear to every linguist who affirms the break with the traditional study of language as philology that the notion of the sign is to be treated purely quantitatively as well. Ferdinand de Saussure’s structuralist paradigm for understanding processes of signification views the linguistic sign as a quantitative value, yet as a negative one which cannot, in itself, be positivized. As a negative value, it can only be specified by ‘profiling’ it through infinitary lists and their net of contrasts: a ‘this’ can never be signified directly, Saussure held, but only by labelling it as ‘not-that’ and ‘not-that’ and ‘not-that’, and so on. In short, a linguistic sign can only be determined structurally and differentially, within a framework of place-value distributions.

From a logical point of view, de Saussure’s paradigm of negative determination obviously entails problems of methodological feasibility, since it holds, in principle, that the necessary infinitary lists can never be made exhaustively explicit. This is the decisive reason why de Saussure himself considered his own structural approach, which attempted to conceive of language as a system, as having ultimately failed. Surely the post-structuralist critiques of such a notion of general linguistics are well known; yet from the point of view of algebraic computability (rather than that of logics), the situation looks different and is hardly explored today. Louis Hjelmslev is one of the very few linguists who continued the ‘differentiability within negativity’ approach initiated by de Saussure, by extending it mathematically. He considered Saussure’s ‘negative values’ in a generalized sense as ‘algebraic invariants’. In this way, the structuralist paradigm is open to taking into account probabilistic procedures like Markov chains and the other algorithms with which the diverse programming languages ordinarily work today. From the logical point of view, this can hardly count as a forward-pointing path, since it does not clarify how a notion of system could be objectified. Yet with regard to logistic networks, such fixation is (arguably) neither necessary nor desirable. Here, Hjelmslev’s algebraic approach offers a powerful alternative to the predominant approaches in terms of semantic or object-oriented (informational) database logic and ontologies, because it is capable of abstracting from the distinction between natural vs artificial/formal language and need not subject one to the other: communication and signification can be treated as mutually complementary aspects.
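As a minimal illustration of the probabilistic procedures mentioned above, a first-order Markov chain over characters can be sketched in a few lines; the function names and the toy corpus are our own, hypothetical choices, not taken from any of the texts discussed here.

```python
import random
from collections import defaultdict

def build_chain(text):
    """Count, for each character, which characters follow it and how often."""
    chain = defaultdict(lambda: defaultdict(int))
    for cur, nxt in zip(text, text[1:]):
        chain[cur][nxt] += 1
    return chain

def generate(chain, start, length, rng=None):
    """Walk the chain: draw each next character in proportion to its observed frequency."""
    rng = rng or random.Random(0)
    out = [start]
    for _ in range(length - 1):
        followers = chain[out[-1]]
        if not followers:
            break
        chars, weights = zip(*followers.items())
        out.append(rng.choices(chars, weights=weights)[0])
    return "".join(out)

chain = build_chain("abracadabra")
print(generate(chain, "a", 10))
```

Each generated bigram is, by construction, one that occurs in the corpus: the sign is determined purely differentially, by its position in a net of observed contrasts rather than by any positive content.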

In this kolloquium we will work through Hjelmslev’s Prolegomena to a Theory of Language (1943), and appropriate its methods in practice. We want to explore if and how structural linguistics as glossematics (in the sense of Hjelmslev) can be extended towards an alphabet of things that would be capable of integrating the operability of generative linguistics (Chomsky etc.), and hence could provide a powerful method of pre-specific modeling.

**Primary Readings:**

Louis Hjelmslev, *Prolegomena to a Theory of Language* (1943).

Umberto Eco, *A Theory of Semiotics* (1976)

**Complementary Readings:**

[1] Gilles Deleuze, “How Do We Recognise Structuralism?” in: Desert Islands and Other Texts 1953-1974.

[2] Alfred North Whitehead, “Preface” in: A Treatise on Universal Algebra (1898)

[3] Jean Baudrillard, The System of Objects (1968)

[4] Gilbert Simondon, On the Mode of Existence of Technical Objects (1980)

**SCHEDULE & TASKS**

**Tuesday September 24 2013**

Introduction: the rise of linguistics amidst the competition between logics and universal algebra for a hegemonic position within an architectonics of communication: the structural paradigm for studying language. Mandatory readings: [1] & [2]. *Format*: Lecture Vera Bühlmann

**Tuesday October 1 2013**

The General Criteria for a Theory of Language (part I): Hjelmslev chapters 1-6. These chapters present the ‘specification’ of glossematics. *Task for this meeting*: extract its parameters, principles, and hierarchy of principles. List them as if you were devising the description for a design task. *Format*: Presentations by students and discussion.

**Tuesday October 8 2013**

The General Criteria for a Theory of Language (part II): Hjelmslev chapters 7-10. These chapters present the horizon of application of the approach specified as glossematics. *Task for this meeting*: extract its outlooks, summarize them, make a list of possible objects of study. *Format*: Presentations by students and discussion

**Tuesday October 15 2013**

Glossematic Coding: Considering any artefact as ‘text’ in the glossematic sense (in terms of ‘articulation’ and ‘partition’). *Format*: Lecture Vera Bühlmann. *Task to prepare for the next meeting*: choose an artefact with which you will work in the following sessions. Begin to organize your understanding of it as a ‘text’.

**Tuesday October 22 2013**

no meeting (Seminarweek)

**Tuesday October 29 2013**

The specifics of glossematic coding: Hjelmslev chapter 11. This chapter introduces the role of «functions» in glossematics. *Task for this meeting*: think about how you can apply this distinction in the codification of your artefact-as-glossematic text. *Format*: Presentations by students and discussion

**Tuesday November 5 2013**

The specifics of glossematic coding: Hjelmslev chapter 12. This chapter introduces the role of «signs and figurae» in glossematics. *Task for this meeting*: think about how you can apply this distinction in the codification of your artefact-as-glossematic text. *Format*: Presentations by students and discussion

**Tuesday November 12 2013**

The specifics of glossematic coding: Hjelmslev chapter 13. This chapter introduces the role of «expression and content» in glossematics. *Task for this meeting*: think about how you can apply this distinction in the codification of your artefact-as-glossematic text. *Format*: Presentations by students and discussion

**Tuesday November 19 2013**

The specifics of glossematic coding: Hjelmslev chapter 14. This chapter introduces the role of «invariants and variants» in glossematics. *Task for this meeting*: think about how you can apply this distinction in the codification of your artefact-as-glossematic text. *Format*: Presentations by students and discussion

**Tuesday November 26 2013**

The specifics of glossematic coding: Hjelmslev chapter 15. This chapter introduces the role of «linguistic schema and linguistic usage» in glossematics. *Task for this meeting:* think about how you can apply this distinction in the codification of your artefact-as-glossematic text. *Format*: Presentations by students and discussion

**Tuesday December 3 2013**

The specifics of glossematic coding: Hjelmslev chapter 16. This chapter introduces the role of «variants in the linguistic schema» in glossematics. *Task for this meeting*: think about how you can apply this distinction in the codification of your artefact-as-glossematic text. *Format*: Presentations by students and discussion

**Tuesday December 10 2013**

The specifics of glossematic coding: Hjelmslev chapter 17. This chapter introduces the role of «function and sum» in glossematics. *Task for this meeting*: think about how you can apply this distinction in the codification of your artefact-as-glossematic text. *Format*: Presentations by students and discussion

**Tuesday December 17 2013**

The specifics of glossematic coding: Hjelmslev chapter 18. This chapter introduces the role of «syncretism» in glossematics. *Task for this meeting*: think about how you can apply this distinction in the codification of your artefact-as-glossematic text. *Format*: Presentations by students and discussion

**Tuesday January 7 2014**

The specifics of glossematic coding: Hjelmslev chapter 19. This chapter introduces the role of «catalysis» in glossematics. *Task for this meeting*: think about how you can apply this distinction in the codification of your artefact-as-glossematic text. *Format*: Presentations by students and discussion

**Tuesday January 14 2014**

The specifics of glossematic coding: Hjelmslev chapter 20. This chapter introduces the role of «entities of the analysis» in glossematics. *Task for this meeting*: think about how you can apply this distinction in the codification of your artefact-as-glossematic text. *Format*: Presentations by students and discussion

**Tuesday January 21 2014**

The specifics of glossematic coding: Hjelmslev chapters 21-23. These chapters reflect, from the point of view of glossematics, on the «language-non-language» distinction. *Task for this meeting*: think about what Hjelmslev discusses from the point of view of replacing the sound unit (phoneme) with information as a constitutive unit. *Format*: Presentations by students and discussion

********************************************************************************************************************

# Summer 2013 PhD Kolloquium on ‘computing symbols as literacy and ability’

download PHD_KolloquiumSS13_flyer as pdf.

**Information – in the light of the strange theory of light and matter (quantum electrodynamics)**

According to Shannon and Weaver’s mathematical theory of information, information is, strictly speaking, neither a value (number) nor a magnitude (quantity), but it can be treated symbolically in terms of so-called random variables: values governed by chance. But how, then, can we have a mathematical theory of information? To treat it mathematically, they postulated a proportionality framework within which information is to be measured: the more information in a system, they assume, the less strictly organized the system is, and the more uncertainly its behavior unfolds. With their notion of information entropy, Shannon and Weaver set up a framework analogous to the dynamic paradigm of thermal machines. Within this paradigm, the goal of communication processing is clear: to help reduce uncertainty by organizing a system’s flows, transformations and exchanges more strictly.
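The proportionality just described is captured by Shannon’s entropy formula H = −Σ pᵢ log₂ pᵢ. A minimal sketch of this measure (the example distributions are our own, for illustration only):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: one full bit per symbol.
print(entropy([0.5, 0.5]))   # → 1.0
# A biased coin is more strictly "organized" and carries less information.
print(entropy([0.9, 0.1]))   # ≈ 0.47
# A certain outcome carries none at all.
print(entropy([1.0]))        # → 0.0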

What if we tried to view a mathematical theory of information within a quantum paradigm, rather than within a thermodynamic paradigm? The most important change, as opposed to the thermodynamic paradigm, is that the formalism’s capacity is not bounded in principle by restriction to the real number space: instead of focusing on the (representative) *determination of random variables*, of interest here is the (operative) *articulation of path integrals*. According to a quantum paradigm, we can deal with a decoupling and open-ended paralleling of what (within the thermodynamic paradigm) needs to be nested within one comprehensive system: in a quantum paradigm we can deal with *a stream of ‘data’* (1), *a* *formalism* that captures quanta from this stream (proportional to its individual capacity to integrate) (2), and *an act* necessary for deciding when and with regard to what the formalism is to capture and integrate ‘stuff’ from the live stream (3). The goal in this paradigm is to increase the tolerance of a model to cope with uncertainty, not to decrease it. The outlook promised by this paradigm is that the model’s capacity to integrate probable behavior can be developed by training (e.g. with self-organizing maps, SOMs).
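Feynman’s rule for such path integrals, as QED presents it, is to add one complex ‘arrow’ (probability amplitude) per path and square the length of the total. A minimal sketch of that rule (the two-path example is our own, hypothetical setup, not from the book):

```python
import cmath

def amplitude(phase):
    """A unit 'arrow' for one path: a complex number of length 1 at the given angle."""
    return cmath.exp(1j * phase)

def probability(phases):
    """Feynman's rule: add one arrow per path, then square the length of the sum."""
    total = sum(amplitude(p) for p in phases)
    return abs(total) ** 2

# Two paths in phase interfere constructively ...
print(probability([0.0, 0.0]))         # → 4.0
# ... while two paths half a turn apart cancel out.
print(probability([0.0, cmath.pi]))    # ≈ 0.0
```

Note that probabilities are attached only to the squared total, never to the individual arrows: this is the sense in which the articulation of paths is operative rather than representative.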

The main reading of this Kolloquium is Richard P. Feynman’s *QED: The Strange Theory of Light and Matter* (Princeton University Press 1985).

**Key notions to develop a clearer understanding:**

*integrals, path integrals, complex numbers, probability, probability amplitude, random variable, quantization, encoding, representation, rendering, live streams of data, mathematical structure, categories, classes*

**First meeting is on June 18th 2013**

Meetings are held on Tuesdays, 9 am Swiss time, at the CAAD Chair in Zürich. If anyone is interested in joining via Skype, you are welcome to email me.

**Program & readings**

**Tuesday June 18 2013**: *Introduction to the program, the generic and the pre-specific*

*reading*: Plato, Timaeus [1]

**Tuesday June 25 2013**: *computability and probability*

*readings*:

Stanley Burris, The Laws of Boole’s Thought. [2]

Walter Carnielli, Polynomizing: Logic Inference in Polynomial Format and the Legacy of Boole [3]

Claude Shannon, Mathematical Theory of Communication [4]
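The ‘polynomizing’ of logic discussed by Burris and Carnielli can be illustrated over the two-element field GF(2), where the Boolean connectives become polynomials modulo 2; the following rendering is our own sketch, not taken from either text.

```python
# Boolean connectives as polynomials over GF(2) = {0, 1}, arithmetic modulo 2.
def NOT(x):    return (1 + x) % 2            # not-x      =  1 + x
def AND(x, y): return (x * y) % 2            # x and y    =  xy
def XOR(x, y): return (x + y) % 2            # x xor y    =  x + y
def OR(x, y):  return (x + y + x * y) % 2    # x or y     =  x + y + xy

# Logical inference reduces to polynomial identity: e.g. De Morgan's law
# not(x and y) = (not x) or (not y) holds as an equality of polynomial values.
for x in (0, 1):
    for y in (0, 1):
        assert NOT(AND(x, y)) == OR(NOT(x), NOT(y))
print("De Morgan verified over GF(2)")
```

Checking a logical law thus becomes a small exercise in computing with polynomials, which is the sense in which Boole’s laws of thought anticipate algebraic computability.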

**Tuesday July 2 2013**

*reading*: Feynman, chapter 1 (Introduction) [5]

**Tuesday July 16 2013**

*reading*: Feynman, chapter 2 (Photons, particles of light) [5]

**Tuesday July 23 2013**

*reading*: Feynman, chapter 3 (Electrons and their interactions) [5]

**Tuesday July 30 2013**

*reading*: Feynman, chapter 4 (Loose Ends) [5]

**Tuesday August 6 2013**: *The quantum view vs the thermodynamic view*

**References and links to online texts:**

[1] Plato, *Timaeus*.

[2] Stanley Burris, “The Laws of Boole’s Thought”

[3] Walter Carnielli “Polynomizing: Logic Inference in Polynomial Format and the Legacy of Boole”

[4] Claude Shannon “Mathematical Theory of Communication”

[5] Richard Feynman, *QED: The Strange Theory of Light and Matter*, Princeton University Press 1985.

# Winter 2012/13 PhD Kolloquium on ‘computing symbols as literacy and ability’

*PhD Kolloquium in Projective Theory on Technology, Wintersemester 2012, LAV CAAD Swiss Federal Institute of Technology ETH Zürich, Switzerland*


**Computability – considered in the light of the Master Argument**

Computation is largely treated today as the procedure to »mechanize« »logics«. Our interest with a projective theory on technology is not to reject (negate) or affirm (analyse) the assumptions involved, but to sort them out strategically. Our interest is to complement the scientistic paradigm of »control« for theorizing technology with a humanistic dimension of ability and artistic mastership. This interest has a long tradition in philosophy, and crystallizes in the so-called Master Argument. The Master Argument concerns whether and how we can meaningfully and methodically involve temporality and self-referentiality in logical/formal considerations. Its inferential structure was first articulated by Diodorus Cronus in the 3rd century BC, and tries to formalize a paradox which has preoccupied all the main steps of development in systematical thought ever since. This is why the many attempts to formalize this paradox provide, for our projective-theory interest, a rich and differentiated reflecting surface that allows us to investigate, comparatively, how these questions have been treated over time.

While the philosophical interest in the Master Argument was mainly in questions of legitimation and foundation, our interest in it is operational. We will not take, allegorically speaking, the position of the Despotic Priest, the Philosopher King, the Statesman or the Assigned Administrator, but that of the Symbolical Metallurgist. In short: we will seek to extract from the Master Argument and its history a template that allows us to cultivate computing as an ability, namely the template of a mechanism for learning how to learn when equipped with the generic methods of algebra.

We will read Jules Vuillemin’s book *Necessity and Contingency: The Master Argument* (Center for the Study of Language and Information, Stanford University Press 1996). The historical account he gives is framed by the rôle of probabilistics for information science and computing, and is thereby especially relevant.

If you are interested in participating, please email by September 28 to: buehlmann at arch.ethz.ch. The texts are available as PDFs upon request.

MAIN READING

Jules Vuillemin, *Necessity and Contingency. The Master Argument*, Center for the Study of Language and Information, Stanford University Press 1996.

COMPLEMENTARY READINGS FROM

E.T.Bell, *The Development of Mathematics*, Dover, 1940. [1]

Israel Kleiner, *A History of Abstract Algebra*, Birkhäuser Basel/Boston 2007. [2]

Leo Corry, *Modern Algebra and the Rise of Mathematical Structures*, Springer 2004. [3]

Yvette Kosmann-Schwarzbach, *The Noether Theorems: Invariance and Conservation Laws in the Twentieth Century*, Springer 2011. [4]

Werner M. Seiler, *Involution. The Formal Theory of Differential Equations and its Applications in Computer Algebra*, Springer 2010. [5]

PROGRAMME

**Tuesday October 9 2012**

Generic Methods and The Master Argument (Vera Bühlmann)

Reading: E.T.Bell: From Mechanics to Generalized Variables ([1] p. 370-382)

**Tuesday October 16 2012**

Reading:

Jules Vuillemin: Systems of Necessity – A System of Logical Fatalism: Diodorus Cronus.

Israel Kleiner: History of Classical Algebra ([2] p.1-14) & History of Group Theory ([2] p. 17-38)

**Tuesday October 23 2012**

Jules Vuillemin: Systems of Necessity – Eternal Return and Cyclical Time: Cleanthes’ solution.

Leo Corry: Richard Dedekind – Numbers and Ideals ([3] p. 64–135)

**Tuesday October 30 2012**

Jules Vuillemin: Systems of Necessity – Freedom as an Element of Fate: Chrysippus.

Leo Corry: Emmy Noether – Ideals and Structures ([3] p. 220-252)

**Tuesday November 13 2012**

Israel Kleiner: History of Ring Theory ([2] p.41-60) & History of Field Theory ([2] p. 63-77)

**Tuesday November 20 2012**

Jules Vuillemin: Systems of Contingency – Towards Rehabilitating Opinion as Probable Knowledge of Contingent Things. Aristotle.

Israel Kleiner: History of linear Algebra ([2] p.79-89) & Emmy Noether and the Advent of Abstract Algebra ([2] p. 91-101)

download slides Aristotle and Probable Knowledge

**Tuesday December 4 2012**

E.T.Bell: Invariance ([1] p. 420-468)

download slides E.T.Bell on Invariance Theory

**Tuesday December 11 2012**

Jules Vuillemin: Systems of Contingency – Epicurus and Intuitionism.

download slides Intuitionism, Atomism and Amphiboly

**Tuesday January 7 2013**

Jules Vuillemin: Systems of Contingency – Carneades and the Skeptical Nominalism of the Modalities.

Yvette Kosmann-Schwarzbach: The Inception of the Noether Theorems ([4] p. 25-53) &

The Noether Theorems ([4] p. 55-65)

**Tuesday January 14 2013**

Jules Vuillemin: Platonism and Conditional Necessity.

Werner M. Seiler: Involution – Overdetermined Systems ([5] p.1-8) &

Involution 1: Algebraic Theory ([5] p.63-104)

**Tuesday January 21 2013**

Jules Vuillemin: Epilogue

Review Computability in the Light of the Master Argument.

download slides epilogue COMPUTABILITY AND THE MASTER ARGUMENT

********************************************************************************************************************

# Summer 2012 PhD Kolloquium on ‘computing symbols as literacy and ability’

*PhD Kolloquium in Projective Theory on Technology, Summer semester 2012, LAV CAAD Swiss Federal Institute of Technology ETH Zürich, Switzerland*

**Computation, and the question of the applicability of arithmetics**

Computation is treated today as an art, just as mechanics had been in the Renaissance and Baroque periods. This basically means that its actual performance is widely recognized and welcomed, striking in effect, unexpected, fascinating and also convincing-by-fact, while at the same time the actual methods and procedures are applied rather like recipes. Over time, this gives rise to: 1) a lot of the same, and boredom; and 2) vast disputes around ancient questions on the rôle of technics in the nature of reasoning, intelligence, and science.

We want to gain better insight into the modern theoretical context of the topoi involved, and will start by reading Michael Potter’s introductory book on the main stances, *Reason’s Nearest Kin: Philosophies of Arithmetic from Kant to Carnap*, Oxford University Press 2002.

Meetings are held on Wednesdays, 11 am (Swiss Time) via skype, between the CAAD Chair in Zürich and the NUS / ETHZ Future Cities Laboratory in Singapore.

**Compulsory reading**

Michael Potter, *Reason’s Nearest Kin: Philosophies of Arithmetic from Kant to Carnap*, Oxford University Press 2002.

**Program**

*Wednesday February 22 2012*

Chapter 0 – Introduction

*Wednesday February 29 2012*

Chapter 1 – Kant

*Wednesday March 7 2012*

Chapter 2 – Grundlagen

*Wednesday March 14 2012*

Chapter 3 – Dedekind

*Wednesday April 4 2012*

Chapter 4 – Frege‘s Account of Classes

*Wednesday April 11 2012*

Chapter 5 – Russell‘s Account of Classes

*Wednesday April 18 2012*

Chapter 6 – The Tractatus

*Wednesday May 2 2012*

Chapter 7 – The Second Edition of Principia

*Wednesday May 9 2012*

RECAP chapters 1-7

*Wednesday May 16 2012*

Chapter 8 – Ramsey

*Tuesday May 22 2012*

Chapter 9 – Hilbert‘s Programme

*Tuesday May 29 2012*

Chapter 10 – Gödel

*Tuesday June 5 2012*

Chapter 11 – Carnap

*Tuesday June 12 2012*

Chapter 12 – Conclusion

**Elective reading suggestions**

*on Dedekind‘s notion of numerical ideality and the rôle of abstraction therein*

Richard Dedekind, *Essays on the Theory of Numbers*, transl. by Wooster Woodruff Beman, Project Gutenberg, released 2007.

Erich H. Reck, “Dedekind’s Contributions to the Foundations of Mathematics”, *The Stanford Encyclopedia of Philosophy* (Fall 2011 Edition), edited by Edward N. Zalta.

Erich H. Reck, “Dedekind, Structural Reasoning, and Mathematical Understanding” in: *New Perspectives on Mathematical Practices*, B. van Kerkhove, ed., Singapore: WSPC Press, 2009, pp. 150-173.

W.W. Tait, “Frege versus Cantor and Dedekind: On the Concept of Number” in *Frege: Importance and Legacy*, M. Schirn, ed., de Gruyter: Berlin, pp. 70-113.

Sephorah Mangin, “Dedekind Abstraction and the ‘Free Creation’ of the Natural Numbers”, online: http://www.sephorahmangin.info/selected_essays/Dedekind_Abstraction.pdf

Vera Bühlmann, “Continuing the Dedekind Legacy today: Some ideas concerning architectonic computation” paper delivered at Turing 2012: International Conference on Philosophy, Artificial Intelligence and Cognitive Science at the De la Salle University in Manila, Philippines, March 27-28 2012.
