The assumptions I am trying to organize and structure in this post are thought-in-action (careful: none of this should be taken uncritically as fact):

**1 “analogy” and “proportionality”**

**the identity of a term expressing a “quantity” is comprehended as determinable only within a relational order. The problems are formulated in words, not yet in (algebraic) symbolic notation. We are dealing with “kinds of individuals”.**

*Analysis and synthesis over the range of magnitudes* (geometrical quantity, measurable)

natural numbers – proportionality of a term in relation to other terms – x

* (without zero and negative numbers: the “identities” of the terms are treated as “substantial” (magnitudes), as positive, affirmable things within an analogical setup a : b = c : d. The “identities” of the terms are “specious” (geometrical): they involve measuring (comparing), and a particular *unit* as the mark of distinction)
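The analogical setup a : b = c : d can be sketched as the classical “rule of three”: three terms are given, and the fourth is determined purely by its place within the relational order. A minimal sketch of my own; the function name and the example magnitudes are illustrative assumptions, and in keeping with the paradigm described above, only positive magnitudes are admitted.

```python
# Rule of three: given a : b = c : d with a, b, c known, determine d.
# Exact rational arithmetic keeps the "specious" (ratio-like) character
# of the terms, rather than collapsing them into decimal approximations.
from fractions import Fraction

def fourth_proportional(a, b, c):
    """Return d such that a : b = c : d."""
    if a <= 0 or b <= 0 or c <= 0:
        # the paradigm knows no zero and no negative numbers
        raise ValueError("terms must be positive magnitudes")
    return Fraction(b, a) * c

# 3 : 4 = 6 : d  ->  d = 8
print(fourth_proportional(3, 4, 6))
```

The identity of d is here nothing but its position within the relation; change any of a, b, c and d changes with them.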

**2 From “proportionality” to “incrementality”**

**the identity of a term expressing a “quantity” is comprehended as a series, to be approximately defined such that it corresponds to idealized geometrical magnitudes. The problems are formulated from now on in (algebraic) symbolic notation. The proportionally determinable quantities are studied in their incrementality.**

*Analysis and synthesis over the range of multitudes* (numbers *indexing* the geometrical quantities in a space of representation, countable).

integers – incrementality of a value within a process – x/y

* (with zero and negative numbers; works with *finite series* (including recursive (“self-terminating”) ones), where the identification of a term x regards x as a variable under transformation in a given process. This allows for a positional placing of the values within the coordinated representational space (Cartesian). The “identity” of a term x is strictly relational, and can be determined by comparing experimental mappings of its dis-locations within this relation (along the axes). It allows one to do geometry by means of arithmetic. In other words: analysis over the incrementality of x within a process, e.g. a spatial displacement, or its position and value within a time series in the mathematics of interest and rates (from L. *incrementum* “growth, increase; an addition,” from the stem of *increscere* “to grow in or upon”; *incremental*: “act or process of increasing”). The magnitudes are no longer treated “substantially” (as measurable), but as purely “countable” (arithmetical).)
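The “incrementality of a value within a process” mentioned above can be sketched with the very example the text names, the mathematics of interest and rates: a value is not measured as a magnitude but traced as a finite, recursively generated series of positions. The function name and the numbers are illustrative assumptions of mine.

```python
# A finite, "self-terminating" recursive series: each term is engendered
# from the previous one by a uniform incremental transformation (here, a
# rate of growth, as in compound interest).
def increments(principal, rate, periods):
    """Return the series of values produced by the incremental process."""
    value = principal
    series = [value]                  # position 0 in the time series
    for _ in range(periods):
        value = value + value * rate  # increment: x_{t+1} engendered from x_t
        series.append(value)
    return series

print(increments(100, 0.05, 3))
```

Each entry is identified strictly relationally: by its index (its position along the axis of the time series) and by the transformation that links it to its neighbors.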

**3 From “incrementality” to “infinitesimality”**

**the identity of a term expressing a “quantity” is comprehended as a series, involving a dubious infinitesimal “magnitude” (a helpful fiction, according to Leibniz) necessary to support the assumptions of matter’s homogeneity and uniformity. The incrementality of proportional quantities is infinitesimalized.**

Analysis and synthesis range over the infinitesimal values (x_1, x_2, …, x_n) proper to the variable (x), comprehended as the sum of the terms of a sequence (series).

real numbers – functional mappings within a system – dx

* with irrational numbers, involving *finite and infinite series* as a variable with different values for establishing a derivational order. Infinite processes can be modeled, analyzed and used in synthesis in *schematically* idealized form: processes that would continue in uniform manner to infinity if no exterior cause interrupted their course. Trigonometric series involving sines and cosines allow one to calculate rotational processes. Analysis and synthesis here – though their “constitution” is algebraic (involving symbols for the variable and constant values, etc.) – are treated as strictly arithmetic (assuming the universally uniform applicability of the basic operations of addition, subtraction, multiplication and division, involving the problematic assumption of “infinitesimally continuous magnitudes”).
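A term comprehended “as the sum of the terms of a sequence” can be sketched with the power series for the sine, a value proper to rotational processes: the uniform applicability of the four basic arithmetic operations does all the work, and the infinite process is truncated in schematically idealized form. The function name and the cut-off are illustrative assumptions.

```python
# Partial sum of the series sin(x) = x - x^3/3! + x^5/5! - ...
# The "infinite process" is idealized schematically: we cut it off after
# n_terms terms, trusting that it would continue uniformly to infinity.
import math

def sin_series(x, n_terms=10):
    """Approximate sin(x) as the sum of the first n_terms of its series."""
    total = 0.0
    for k in range(n_terms):
        total += (-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
    return total

print(sin_series(math.pi / 6))  # approaches 0.5 as n_terms grows
```

The “identity” of sin(π/6) is here nothing given at once; it is approximated ever more closely by the accumulating sum, exactly the serial comprehension of quantity described above.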

**4 From “infinitesimality” to “virtuality”**

**the identity of a term expressing a “quantity” is comprehended by treating the series as a polynomial expressing a product rather than the sum of the term’s variable parts. The assumption of matter’s homogeneity and uniformity is not taken as an axiom anymore, but treated experimentally. The infinitesimalized incrementality is brought back into a proportional order – yet on a higher level of abstraction than previously. It is not a proportional order among “kinds of individuals”, but one among “kinds of populations or systems”.**

Analysis and synthesis range over the symbolic indexes of infinitesimal values (x_1, x_2, …, x_n) proper to a term as variable (x), as a series within the domain of complex numbers – *a locus in quo of imaginary points and figures*, as Arthur Cayley called it in his presidential address to the British Association for the Advancement of Science in 1883.

complex numbers – within derivational orders of systems-in-act – ∂x/∂y

* may involve the *imaginary unit* *i* (also called the *impossible quantity*) and the domain of complex numbers in general. This makes it possible to treat “infinitesimally continuous magnitudes” symbolically, i.e. *discontinuously*, by indexing them as countable (multitudes). Experimental analysis and synthesis apply to the level of the algebraic symbolism of quantities, not to that of geometrical (magnitude) or arithmetical (multitude) quantities. On the level of symbolic quantities, a calculus of variations can be established which treats systems of differentials constructed *within* the derivational order acquired by the infinitesimal calculus. By symbolizing the schematic ideality of infinitesimal processes, it can, through partial integration, also involve series that are potentially non-convergent. It was developed in mechanics by Lagrange and Euler (among others) in order to do analytical mechanics, which can abstract from the distinction between statics and dynamics. This was achieved by generalizing the Cartesian coordinates, because this allowed one to formulate conditional equations of the constraints affecting a schematically idealized system, and thus to study systems-in-act (whether static or dynamic, cf. the heap problem) by placing “virtual displacements” within the projectively representational space (generalized coordinates). Joseph Fourier radicalized this approach of “virtual displacements” in his analysis for transforming functions into the space of frequencies. The eventual mastering of electricity crucially depends upon this “virtualization of the infinitesimal”.
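Fourier’s “transformation of functions into the space of frequencies” can be sketched in its discrete form: the imaginary unit i is used to index a continuous oscillation *discontinuously*, as a countable multitude of frequency components. The direct summation formula and the example signal below are my illustrative choices, not the author’s.

```python
# Discrete Fourier transform: X_k = sum_n x_n * e^(-2*pi*i*k*n/N).
# The continuous oscillation is re-expressed as countably many symbolic
# indexes k in the complex domain -- Cayley's "locus in quo of imaginary
# points and figures".
import cmath
import math

def dft(signal):
    """Map a sampled signal into the space of frequencies."""
    N = len(signal)
    return [sum(signal[n] * cmath.exp(-2j * cmath.pi * k * n / N)
                for n in range(N))
            for k in range(N)]

# a pure oscillation completing one cycle over 8 samples...
signal = [math.cos(2 * math.pi * n / 8) for n in range(8)]
# ...concentrates its energy at the frequency indexes 1 and N-1
spectrum = dft(signal)
print([round(abs(X), 6) for X in spectrum])
```

What was a continuous rotational process in time appears, after the transformation, as a discrete position within the frequency space: the “virtualization of the infinitesimal” in miniature.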

**5 From “virtuality” to “categoricality”**
(caution: highly “experimental”! I am thankful for any comment on this line of thought!)

**the identity of a term expressing a “quantity” is comprehended by treating the series as a polynomial expressing a product rather than the sum of the term’s variable parts, and it treats the “product” as purely symbolic. The assumption of matter’s homogeneity and uniformity is not only treated as experimentally determinable, but also as conceptually definable. Analysis and synthesis are in the service of the technologically possible and the artificial (rather than the factual).**

*(Cf. my post Articulating Quantities)*

Analysis and synthesis are applied in an inverted manner: polynomials (products) are factorized such that conceptually well-defined values (x_1, x_2, …, x_n) proper to a term as variable (x) can be *engendered*. With polynomial equations in this paradigm we are not looking for “solutions” of equations, but for resolving a formula such that a desired solution space can be mastered, by “spelling out” its (polynomial) terms. The numerical domains do not dictate the scope of a solution space here; rather, numerical domains can be defined as fields, modules, rings, such that the solution space has, as nearly as possible, the desired properties.

algebraic (conceptual) numbers – within projectively symbolic orders of ideality – P(x) = 9x^3 + x^2 + 6x

* involving any numerical domain, especially also ideal ones. Analysis and synthesis are here applied directly to polynomials, and are used for factorizing their terms algebraically. Such analysis and synthesis can depart from formulating the identity of a term expressing a quantity as a *tautology X = x*. It can treat the one side of the tautology as an invariance (X), while the other side is treated as one of many possible instances x of this invariance X. Tautologies, in this paradigm, need to be “challenged” (*to challenge*, from L. *calumniare* “to accuse falsely,” *calumnia* “trickery”): in order to “articulate” them (finding a good way of factorizing them) we have to projectively “characterize” them, put them onto an experimental stage of possible happenings / scenarios (i.e. within a simulation). This stage resembles the Greek *agora*, the “public assembly,” rather than the laboratory situation and its secluded neutrality. To challenge shares common reference also with to categorize, which comes from Gk. *kategorein* “to speak against; to accuse, assert, predicate, to declaim (in the assembly),” before it lost its challenging (accusatory) character and simply turned into the “operational enabler” for naming things properly with Aristotle’s system of 10 categories. To address polynomials as invariances means learning to categorize the genericity of the general (kinds, species, etc.). The treatment of the general in terms of types or classes is not an option anymore, once the algebraic number fields (Zahlen*körper*, literally: numerical bodies, not “territories”) are affirmed.
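The inverted use of analysis and synthesis can be sketched with the post’s own example polynomial, P(x) = 9x³ + x² + 6x: instead of “solving for” a value, the polynomial is factorized as x · (9x² + x + 6), thereby *engendering* the instances x₁, …, xₙ of the invariance P(x) = 0. My recourse to the quadratic formula over the complex domain is an illustrative choice.

```python
# Factorize P(x) = 9x^3 + x^2 + 6x = x * (9x^2 + x + 6) and engender
# its roots. The discriminant is negative, so two of the "instances"
# only exist within the complex (ideal) numerical domain.
import cmath

def engendered_roots():
    """Return the roots of P(x) = 9x^3 + x^2 + 6x via factorization."""
    a, b, c = 9, 1, 6                     # coefficients of the quadratic factor
    disc = cmath.sqrt(b * b - 4 * a * c)  # sqrt of a negative number: complex
    return [0,                            # read off from the common factor x
            (-b + disc) / (2 * a),
            (-b - disc) / (2 * a)]

roots = engendered_roots()
for r in roots:
    # each root is an "instance x" of the invariance P(x) = 0
    print(r, '->', 9 * r**3 + r**2 + 6 * r)
```

The solution space is not dictated by a pre-given numerical domain: it is the choice of domain (here the complex numbers) that lets the factorization “spell out” all three instances of the invariance.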

(Cf also my post *Articulating a thing entirely in its own terms, or: what can we understand by the notion of “engendering”* ?)
