What this lesson covers
The syntactic side of the semiotic universe: how a typed lambda calculus generates the compositional grammar of the sign system, what it means to interpret syntactic operators in the semantic domain, and what constraints the interpretation must satisfy.
Prerequisites
The Semantic Domain. Familiarity with the typed lambda calculus is helpful.
Types, terms, and raw operators
The semiotic universe’s syntax is generated by a typed lambda calculus. The type language is built from:
- A base type $o$ — the type of semantic objects (elements of the semantic domain $\Omega$).
- Function types $\sigma \to \tau$ — the type of operations taking a $\sigma$ and producing a $\tau$.
- Product types $\sigma \times \tau$ — the type of pairs.
Terms are built from variables, lambda abstraction ($\lambda x.\,t$), application ($t\,u$), pairing, and projections, with standard typing rules. The important objects are the operators: closed terms of type $o \to \cdots \to o \to o$ — functions that take $n$ semantic objects and produce a semantic object.
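The type and term language above can be sketched as a small abstract syntax with a checker for the standard typing rules. This is an illustrative sketch only — the class names (`O`, `Fun`, `Var`, `Lam`, `App`), the signature constant `conj`, and the omission of products and projections are my own simplifications, not part of the lesson's formal system:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class O:            # the base type of semantic objects
    pass

@dataclass(frozen=True)
class Fun:          # a function type: dom -> cod
    dom: object
    cod: object

@dataclass(frozen=True)
class Var:          # a variable occurrence
    name: str

@dataclass(frozen=True)
class Lam:          # lambda abstraction with an annotated bound variable
    var: str
    var_type: object
    body: object

@dataclass(frozen=True)
class App:          # application of a function term to an argument term
    fn: object
    arg: object

def typecheck(term, env):
    """Standard typing rules for variables, abstraction, and application."""
    if isinstance(term, Var):
        return env[term.name]
    if isinstance(term, Lam):
        body_ty = typecheck(term.body, {**env, term.var: term.var_type})
        return Fun(term.var_type, body_ty)
    if isinstance(term, App):
        fn_ty = typecheck(term.fn, env)
        arg_ty = typecheck(term.arg, env)
        if not (isinstance(fn_ty, Fun) and fn_ty.dom == arg_ty):
            raise TypeError("ill-typed application")
        return fn_ty.cod
    raise TypeError(f"unknown term: {term!r}")

# A binary operator: a closed term of type o -> o -> o, built from a
# hypothetical primitive constant "conj" supplied by the signature.
SIG = {"conj": Fun(O(), Fun(O(), O()))}
op = Lam("x", O(), Lam("y", O(), App(App(Var("conj"), Var("x")), Var("y"))))
assert typecheck(op, SIG) == Fun(O(), Fun(O(), O()))
```

The checker rejects any combination whose kinds don't match — applying `op` to a term that isn't of type `o` raises a `TypeError`, which is the type system's guarantee described below.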
Operators are quotiented by definitional equality ($\beta\eta$-equivalence and any additional equational axioms). Two syntactically different terms that compute the same function are identified.
The set of all definable operators is:

$$\mathcal{O} \;=\; \bigcup_{n \ge 0} \mathcal{O}_n,$$

where $\mathcal{O}_n$ is the smallest set of $n$-ary operators closed under primitive constants, lambda-definability, and composition.
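The closure clause can be made concrete on a toy two-element domain. This is a hedged sketch under my own assumptions — the domain `{0, 1}`, the two primitive constants, and the restriction to binary operators and one composition shape are illustrative choices, not the lesson's definition. Operators are identified by their value tables, which is the quotient by extensional equality described above:

```python
from itertools import product

DOM = (0, 1)
PRIM = [lambda a, b: a & b,     # a primitive conjunction-like constant
        lambda a, b: a | b]     # a primitive disjunction-like constant

def table(f):
    """An operator's extensional identity: its full value table on DOM."""
    return tuple(f(a, b) for a, b in product(DOM, repeat=2))

# Close the primitives under the composition shape f(g(x, y), h(x, y)),
# keeping one representative per value table (the definitional quotient).
definable = {table(f): f for f in PRIM}
changed = True
while changed:
    changed = False
    current = list(definable.values())
    for f, g, h in product(current, repeat=3):
        comp = lambda a, b, f=f, g=g, h=h: f(g(a, b), h(a, b))
        t = table(comp)
        if t not in definable:
            definable[t] = comp
            changed = True

# From these two primitives, this composition shape generates nothing new:
# every composite collapses (up to extensional equality) onto a primitive.
assert len(definable) == 2
```

The fixed point of this loop is the binary slice of the smallest composition-closed set; the full $\mathcal{O}_n$ construction additionally closes under lambda-definability (projections, constants, and so on).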
Semiotically, the typed lambda calculus plays the role of grammar — the rules by which signs combine. Each operator is a combinatory rule: a way of taking signs and producing new signs. The type system ensures that combinations respect the kinds of signs involved — you can’t apply a rule that expects a proposition to something that isn’t one.
Interpretation
An interpretation $\llbracket\,\cdot\,\rrbracket$ maps each syntactic operator to a function on the semantic domain:

$$\llbracket\,\cdot\,\rrbracket : \mathcal{O}_n \longrightarrow (\Omega^n \to \Omega).$$
This is where syntax meets semantics. Each syntactic rule (an operator) gets a meaning (a function on $\Omega$). But not just any mapping will do — the interpretation must satisfy seven conditions that ensure syntax respects the full semantic structure:
1. Monotone and join-continuous: each $\llbracket f \rrbracket$ preserves the order and directed joins in each argument. Syntax respects the logical structure.
2. Heyting compatibility: the operators for $\wedge$, $\vee$, $\to$, $\top$, $\bot$ on the base type are interpreted as the Heyting operations on $\Omega$. The logical connectives mean what they should mean.
3. Modal homomorphism: $\Box\,\llbracket f \rrbracket(a_1, \dots, a_n) = \llbracket f \rrbracket(\Box a_1, \dots, \Box a_n)$. Applying an operator and then stabilizing gives the same result as stabilizing the inputs and then applying the operator. Syntax commutes with modality.
4. Trace compatibility: $\operatorname{tr}\bigl(\llbracket f \rrbracket(a_1, \dots, a_n)\bigr) = \llbracket f \rrbracket(\operatorname{tr}(a_1), \dots, \operatorname{tr}(a_n))$. The same for the trace — tracing a combination is the combination of the traces.
5. Definitional equality preservation: if the lambda-theory proves $s = t$, then $\llbracket s \rrbracket = \llbracket t \rrbracket$. Syntax that computes the same thing means the same thing.
6. Fragment preservation: $\llbracket f \rrbracket(F^n) \subseteq F$ for every fragment $F$. Operators don't break the locality structure — they keep fragments closed. (Fragments are defined in the next lesson.)
7. Hereditary extensionality: the family of all interpreted operators is hereditarily extensional — equalities valid on a fragment remain valid under all semantic construction steps.
Conditions 3 and 4 are the key coherence requirements. They ensure that the syntactic grammar is compatible with both the modal structure (the distinction between stable and provisional meanings) and the trace structure (provenance tracking). An operation that creates a new sign from existing signs must behave consistently with how those signs stabilize and how their histories compose.
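Conditions 1 and 3 can be sanity-checked on a toy model. The following sketch is entirely my own construction, not the lesson's: it takes the three-element chain $0 < 1 < 2$ as a stand-in Heyting algebra, double negation as a stand-in stabilization modality, and checks that meet and join are monotone and commute with it:

```python
from itertools import product

CHAIN = (0, 1, 2)               # the chain 0 < 1 < 2, with 2 as top

def neg(a):
    """Heyting negation on a chain: the implication a -> 0."""
    return 2 if a == 0 else 0

def box(a):
    """Double negation as a toy stabilization: 0 -> 0, everything else -> top."""
    return neg(neg(a))

meet = min                      # Heyting meet on a chain
join = max                      # Heyting join on a chain

for op in (meet, join):
    for a, b in product(CHAIN, repeat=2):
        # Condition 1: monotonicity in the first argument.
        for a2 in CHAIN:
            if a <= a2:
                assert op(a, b) <= op(a2, b)
        # Condition 3: box(op(a, b)) == op(box(a), box(b)).
        assert box(op(a, b)) == op(box(a), box(b))
```

On this chain both checks pass; in a general Heyting algebra double negation commutes with meet but not with join, which is exactly why the lesson imposes the modal homomorphism as a constraint on the interpretation rather than getting it for free.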
What the interpretation provides
The interpretation turns the syntactic algebra into a layer of the semantic domain. Every syntactic operation has a precise semantic meaning, and that meaning respects the full algebraic, modal, and comonadic structure.
This is stronger than a typical semantics-of-programming-languages setup. There, interpretation preserves types and equality. Here, interpretation also preserves modality (which computations are stable), trace (which computations preserve provenance), and fragment structure (which computations are local). The semiotic universe demands that syntax and semantics agree not just on what things compute, but on how they relate to stability, history, and locality.
The next lesson — Fragments and Fusion — shows how the semantic and syntactic sides are brought into coherence through fragmentwise reasoning and the fusion closure operator.