Re: Linear logic semantics (Barwise) (353 lines)

To: linear@cs.stanford.edu
Date: 25 Feb 92 00:03:12 PST (Tue)

Jon Barwise asked for the intuitive meaning of linear logic, or failing
that, at least what LL was for.  Judging by his summary of the
responses I'd say he has not received a reasonable answer to his
question.  Imagine an instructor who answered every question with a
bibliographic reference to a whole paper.

I would answer Jon's question as follows.

Traditional logic reasons with two distinct sorts, individuals and
truth values, with the logical machinery of the latter external to the
former.  Linear logic combines these into one sort by incorporating the
logical machinery into the nonlogical structure.  The purest
implementation of this takes a model of linear logic to be a monoidal
adjunction F:C->D from a cartesian closed category C to a representably
self-dual closed category D with finite products.  Here D constitutes
the composite logical-nonlogical structure, the duality expresses
conservation of information, and C is the view of D from a traditional
logic perspective.

The implementation is more than adequately expanded on in Robert
Seely's paper on pp. 371-382 of "Categories in CS and Logic", AMS
Contemporary Math 92, 1989; see also more recent ftp-able papers by
Barr (/pub/barr) and by Seely, Blute, and Cockett (/pub/rags).

In the rest of this message I will expand on the intuition in the first
half of this answer in a way that presupposes little or no category
theory.  The physics analogies are intended primarily to help with
intuition; perfectionists will find the details unsatisfyingly vague.

Andre Scedrov provided some feedback on a preliminary version of this
message and asked me to add to it the following remarks.  "Linear logic did
come about from semantics studies in the first place, namely from
coherent spaces.  I am amazed that this fact got lost in the recent
debate.  Both the coherent spaces and the phase spaces are discussed in
Girard's TCS paper on linear logic. Another semantic aspect of linear
logic that should be emphasized is in Girard's "Geometry of
Interaction" in Logic Colloquium '88 (North-Holland yellow series), in
Danos' exposition of geometry of interaction (in French) in Gazette
Math. (Societe Math. de France) 41, 1989, 55-64, and in the fascinating
work of Gonthier, Abadi and Levy "The geometry of optimal lambda
reduction" in POPL '92."

Michael Barr generously provided much useful model theoretic feedback.
Robert Seely also offered useful commentary.


1.  Observer Participation

Traditional logic, like classical mechanics, is a spectator sport:
there is an apparatus and a separate observer.  Information flows from
the apparatus into and around the observer, whose measurements are
assumed not to disturb the apparatus.  The observer is therefore an
information processing system, the essence of which is a graph with
nodes A,B,...  along whose edges f:A->B (measurement f with source A
and result B) information flows.  The apparatus itself does not see
these edges (but constitutes the sources of some of them) and is not
disturbed by the observer.  The graph of an idealized observer is a
Heyting or even Boolean algebra in the case of nonconstructive logic
and a cartesian closed category in the case of constructive logic.
Considerations of computational complexity and relevance may call for
weaker observers, but not so weak that they disturb the apparatus.

The essence of traditional logic then is an intelligent graph reaching
its edges into an unsuspecting structure and contemplating its contents.

Linear logic, like quantum mechanics, identifies the observer and the
apparatus.  For this to be possible the apparatus itself must be an
information processing system.  That is, we must either find a suitable
graph already present in the apparatus that we can press into service,
or install one.  The apparatus now becomes its own observer, and any
part of the apparatus may be viewed as observing its neighborhood.

The first point of linear logic then is observer participation.

2.  Participant Observation as Linear Negation

We are now no longer permitted to view information as an abstraction
independent of its carrier but must identify information with
information packets as objects of the apparatus.  Girard postulates
that information is conserved: to every measurement corresponds an
"equal and opposite" measurement.  This postulate confers on the flow
of information the same character as the flow of electricity, where a
flow of electrons in one direction can equally well be considered a
flow of holes in the other.  Specifically, a packet of information in
one direction can be viewed dually as an antipacket going the other
way.  That is, to every measurement f:A->B corresponds its mirror image
or *linear negation* f':B'->A'.

(The corresponding duality in QM is called complementarity:
antiposition is momentum, antitime is energy, and the tensor product of
either of these complementary pairs is action, forming a discrete space
scaled by Planck's constant.  A measurement communicating momentum
information in one direction along a graph edge has an equal and
opposite measurement communicating position information in the other
direction.  A logician might identify this duality with word-object
duality, the Galois connection between theories and classes of
structures, with position naturally identified with word and momentum
with object.)

This symmetry of observation means that not all observers can be
perfect.  In a model with moving parts, a fast observer may see a slow
apparatus clearly, but when that apparatus exercises its right to be
called an observer it finds it cannot see the other end clearly.  When
they are the same speed it may be that neither can fully resolve the
details of the other, as with the Heisenberg uncertainty principle of
quantum mechanics, and likewise the difficulty of exactly determining
both the pitch and the start time of a musical note.

It also means that linear logic is not specifically proof theory *or*
model theory.  More precisely, it is both: a proof theoretic
measurement in one direction is a model theoretic measurement in the
other.  While this may sound mystical, to those applying linear logic
in their everyday work it seems perfectly natural.

The second point of linear logic then is symmetry of observation,
expressed by linear negation.

3. Dynamic Conjunction

Any two measurements f:A->B and g:C->D may be logged side by side in the
observer's logbook:

		f: A -> B
		g: C -> D

and treated as a single measurement consisting of two strands f and g.
The juxtaposition operator is called tensor product and is notated
explicitly as @ (\otimes), making the picture

		f   A    B
		@ : @ -> @
		g   C    D

which we can write on one line as f@g : A@C -> B@D.  We shall think of
tensor product as *dynamic* conjunction.
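As an illustrative sketch (my own, in Python; not from the original
message), one can model measurements as ordinary functions and the
tensor product as strandwise action on pairs:

```python
# Sketch: objects as Python types, measurements as functions.
# The tensor f@g acts on each strand of a pair independently.

def tensor(f, g):
    """f@g : A@C -> B@D, applying f to the first strand and g to the second."""
    return lambda pair: (f(pair[0]), g(pair[1]))

f = lambda a: a + 1        # f : A -> B
g = lambda c: c * 2        # g : C -> D

fg = tensor(f, g)          # f@g : A@C -> B@D
print(fg((3, 5)))          # -> (4, 10)
```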

4.  Braiding and Commutativity

The two strands of f@g are parallel.  Noncommutative linear logic
permits strands to be combined without touching only in this way.  In
contrast, braided or weakly commutative linear logic, the most
promising form of LL for QM, offers left and right handed "braided"
versions of f@g:

	          A --   --> D            A --   --> D        
		     f\ /                    f\ /
		  @    /     @  =  f#g    @    \     @ =  f$g
		     g/ \                    g/ \
	          C --   --> B            C --   --> B

both running from A@C to D@B and satisfying (g'$f')(f#g) = (g'g)@(f'f):

          A --   --> D --   --> E         A ----> B ----> E
	     f\ /      g'\ /                  f      f'
	  @    /     @    \     @    =    @       #       @
	     g/ \      f'/ \                  g      g'
          C --   --> B --   --> F         C ----> D ----> F

(where B#D denotes D@B).  Similarly (g'#f')(f$g) = (g'g)@(f'f).
However (g'#f')(f#g) : A@C -> E@F makes two twists and hence is not
equal to (g'g)@(f'f) : A@C -> E@F.

The 3-strand braid used to make pigtails in hair is the iteration of

    (a#b)@c      b@(a$c)     (b#c)@a     c@(b$a)     (c#a)@b     a@(c$b)

Three iterations yields
	      _   _   _   _   _   _   _   _   _
	  a\ / \ / \ / \ / \ / \ / \ / \ / \ / a
	    /   /   /   /   /   /   /   /   /  
	  b/ \ / \ / \ / \ / \ / \ / \ / \ / \ 
	      \   \   \   \   \   \   \   \   \c
	  c\_/ \_/ \_/ \_/ \_/ \_/ \_/ \_/ \_/b

Commutative linear logic offers f#g and f$g but identifies them.  When
f and g are identities f#g (= f$g) is called a *symmetry* (sometimes
permutativity).  Use commutative linear logic when the order in which f
and g are performed makes no difference even if A and C overlap.
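In the commutative case the braiding degenerates to the symmetry, which
is natural in both strands.  A hedged Python check of that naturality
square (swapping before or after the measurements agrees), using the
same functions-on-pairs sketch as above:

```python
# Sketch: in commutative LL, f#g = f$g is the symmetry (swap).
# Naturality: swap . (f@g) = (g@f) . swap.

def tensor(f, g):
    return lambda p: (f(p[0]), g(p[1]))

def swap(p):               # the symmetry c : A@C -> C@A
    return (p[1], p[0])

f = lambda a: a + 1
g = lambda c: -c

for p in [(0, 0), (2, 7), (-1, 5)]:
    # swapping before or after applying the measurements gives the same result
    assert swap(tensor(f, g)(p)) == tensor(g, f)(swap(p))
```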

5. Static Conjunction

When C=A we can still jointly log the two measurements f:A->B and
g:A->D as f@g: A@A -> B@D, but now we have the alternative of logging
them as f&g: A -> B&D (f&g abbreviates (f&g)d where d:A->A&A is the
diagonal map and f&g really goes from A&A to B&D).  This is ordinary or
categorical product; we shall think of it as *static* conjunction.
Static conjunction makes the weaker claim that the observer saw both B
and D *at the same time*, modulo interference between any side effects
of f and g.  If both must consume a common piece of the apparatus A
then only one can get it (a more detailed interpretation of this
phenomenon is I think necessarily model dependent).  Static conjunction
is the choice operator for event spaces (see end), and I would expect
it to participate in the expression of Pauli exclusion (but the details
of this QM connection are still very unclear to me).  In summary,
static conjunction has the intuitive meaning of logical conjunction
when f and g are interference-free and otherwise expresses the notion
of mutual exclusion.
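The definition of static conjunction via the diagonal can be sketched
in Python (my own illustration): f&g is f x g composed with the
diagonal d : A -> A&A.

```python
# Sketch: static conjunction f&g : A -> B&D as (f x g) composed with
# the diagonal map d : A -> A&A.

def tensor(f, g):
    return lambda p: (f(p[0]), g(p[1]))

def d(a):                  # the diagonal d : A -> A&A
    return (a, a)

def amp(f, g):             # f&g : A -> B&D, i.e. (f x g) . d
    return lambda a: tensor(f, g)(d(a))

f = lambda a: a + 1        # f : A -> B
g = lambda a: a * 2        # g : A -> D
print(amp(f, g)(3))        # -> (4, 6)
```

Note that both measurements here receive the *same* A, which is where
interference between side effects would enter in a less idealized model.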

6.  Disjunctions

Dynamic and static disjunction each arise as the De Morgan dual (under
linear negation) of the corresponding conjunction.  The four constants
arise as the corresponding zeroary operations serving as units.  The
duality makes disjunction look like conjunction in the mirror.

7. Freezing

The operation !A freezes the motion of A to eliminate motion blur and
give a sharply focused picture of A.  Frozen objects are classical:
they are oblivious to observation (no side effects) whence the
recording of f:!A->B and g:!A->C as f@g:!A@!A->B@C can be logged as
(f@g)m:!A->B@C.  (Here m:!A->!A@!A is the "comultiplication" formed as
the composition of the isomorphism !(A&A) ~ !A@!A with the freezing
!d:!A->!(A&A) of d:A->A&A; m expresses the essence of freezing.)

Observations of frozen objects are themselves frozen, allowing f:!A->B
to be logged as !f:!A->!B.  Obviously one may not log f:A->B as
f':A->!B for some f', which would dishonestly assert that one had seen
B in frozen detail.  However it is permitted to log it as !f:!A->!B
since one can apply any functor including ! to a whole measurement.  It
is also permitted to log f:A->!B as tf:A->B where the axiom t:!B->B
thaws !B (shades of Sylvester); thawing is justified on the ground that
the thawed object conveys no more (and usually less) than its frozen
counterpart.  Freezing is idempotent: what is already frozen cannot be
slowed down even more, captured by the axiom d:!A->!!A.

Freezing has many implementations, all achieving the same logical end:
put A in a freezer or memory location, clone A indefinitely, speed up
the observers, use a stroboscope (potentially useful when the dualizer
is a circle as in Pontryagin duality rather than an edge as in Stone
duality), remove A's engine or electric blanket, forget A's algebraic
structure, compensate for the Lorentz transform, observe A indirectly,
e.g. the track left by an electron in a Wilson cloud chamber, etc.

The third point of linear logic then is that every object A has its
classical counterpart !A.
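The freezing axioms can be caricatured in code (a programming analogy
of my own, with invented names, not Girard's definition): a linear
value may be consumed exactly once, while a frozen value !A is
classical and clones freely.

```python
# Analogy: linear resources are use-once; bang restores copyability.

class Linear:
    """A resource that can be consumed at most once."""
    def __init__(self, value):
        self._value = value
        self._spent = False

    def consume(self):
        if self._spent:
            raise RuntimeError("linear resource already consumed")
        self._spent = True
        return self._value

def bang(x):
    """Freeze a classical value: x -> !x."""
    return ("!", x)

def m(banged):
    """Comultiplication m : !A -> !A @ !A (cloning a frozen value)."""
    return (banged, banged)

def t(banged):
    """Thawing t : !A -> A."""
    return banged[1]

def dig(banged):
    """Idempotence of freezing, !A -> !!A."""
    return bang(banged)

a = Linear(42)
assert a.consume() == 42          # first use succeeds
try:
    a.consume()                   # second use is rejected
    assert False
except RuntimeError:
    pass

b = bang(7)
left, right = m(b)                # frozen values clone freely
assert t(left) == t(right) == 7
assert dig(b) == ("!", ("!", 7))
```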

8.  Implication

In a ring (PLUS-TIMES structure) it is very handy for multiplication to
have an inverse, division, definable as xy = z <-> x = z/y, making it a
field.  In an OR-AND logic it is likewise very handy for AND to have a
semi-inverse, residuation, definable as AB < C <-> A < C/B, making a
"semifield".  When conjunction is commutative there is only one
residual, written B-oC.  If conjunction is noncommutative there are
two, Co-B and A-oC, pronounced respectively "C if ever B" and "had A
then C".

Implication internalizes information flow so that it can be referred to
in the log book as A-oB, naming the collection of measurements from
A to B.  The measurements from !A to B are notated A=>B.  We have the
familiar "Curry" isomorphisms between (A@B)-oC and B-o(A-oC), and also
between (A&B)=>C and B=>(A=>C); the latter entails !(A&B) being
isomorphic to !A@!B.  In a "concrete" universe A-oB "embeds" in
(categorically, has a monic to) A=>B, corresponding to freezing not
losing information.
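The "Curry" isomorphism between (A@B)-oC and B-o(A-oC) is just
currying, sketched here in Python (my own illustration):

```python
# Sketch: a measurement consuming a pair corresponds to a measurement
# that returns a measurement.

def curry(f):              # (A@B)-oC  ~>  B-o(A-oC)
    return lambda b: lambda a: f((a, b))

def uncurry(g):            # the inverse direction
    return lambda pair: g(pair[1])(pair[0])

add = lambda pair: pair[0] + pair[1]     # an (A@B)-oC measurement
assert curry(add)(3)(4) == 7
assert uncurry(curry(add))((4, 3)) == add((4, 3))
```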


What seems to bother people is not so much this intuition itself, which
is not unreasonable, as whether this sort of behavior is realized in
nature.  (A similar concern bothered people about quantum mechanics and
led to Hilbert spaces and orthomodular lattices as models of the
apparent axioms of quantum mechanics.)  If one just divides the algebra
of all terms in the language by all linear logic equations between
terms one obtains a model, but this is just the standard Lindenbaum
algebra (freely) modeling an equational theory, which is as useless for
evaluating a theory as a teacher that gives nothing but praise.  When
someone has carefully designed a model around the axioms of linear
logic one suspects that something like this is going on and cries

In fact models of linear logic do arise in nature, or Michael Barr
would not have been motivated to write about *-autonomous categories
in the first place.

Any category whose objects are the natural numbers and whose maps from
m to n are representable as all mxn matrices with elements from an
appropriate ring or even semiring (no need for addition to be a group),
with composition supplied by matrix multiplication, has the duality
property: just transpose the matrix.  The category Vct_K of
finite-dimensional vector spaces over the field K arises in this way,
as does the category of binary relations each of the form R:m->n from a
set of size m to one of size n.
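This matrix picture is easy to test concretely.  Here is a sketch of
my own over the Boolean semiring (i.e. binary relations), with
composition as matrix product and linear negation as transposition,
checking that negation reverses composition:

```python
# Sketch: maps m -> n as m x n matrices over the Boolean semiring
# (plus = OR = max, times = AND = min); duality = transpose.

def compose(f, g):         # f : m -> n, g : n -> p, composite : m -> p
    return [[max(min(f[i][k], g[k][j]) for k in range(len(g)))
             for j in range(len(g[0]))] for i in range(len(f))]

def dual(f):               # linear negation: transpose the matrix
    return [list(row) for row in zip(*f)]

f = [[1, 0, 1],            # a relation 2 -> 3
     [0, 1, 0]]
g = [[0, 1],               # a relation 3 -> 2
     [1, 0],
     [1, 1]]

# transposition reverses composition: (fg)' = g'f'
assert dual(compose(f, g)) == compose(dual(g), dual(f))
```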

Now when K is a finite field, vector spaces admit freezing: the space
!n (where the object n is the n-dimensional space K^n) is the space of
dimension |K|^n.  Binary relations on the other hand do not admit
freezing without some work.  What makes vector spaces freezable is that
they are closed under addition, which turns out to be just right.  If
they only had scalar multiplication this would be too little.  If they
had multiplication as well as addition to make them a ring, like the
ring of integers, this would err too far on the other side.
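The dimension claim admits a small numerical check (my own sketch,
reading !n as the space of all K-valued functions on K^n): a space of
dimension |K|^n over K has |K|^(|K|^n) elements, which is exactly the
number of functions from K^n to K.

```python
# Sketch: over K = GF(2) with n = 2, count the functions K^n -> K and
# compare with the cardinality of a space of dimension |K|^n.

from itertools import product

K, n = 2, 2
points = list(product(range(K), repeat=n))                # the |K|^n points of K^n
functions = list(product(range(K), repeat=len(points)))   # all maps K^n -> K

dim_bang = K ** n                                 # claimed dimension of !n
assert len(functions) == K ** dim_bang            # |!n| = |K|^(|K|^n)
print(len(points), len(functions))                # -> 4 16
```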

Binary relations on their own don't freeze, but if they are beefed up a
little to more closely resemble vector spaces then not only do they
freeze nicely but they link up with such staples of mathematics as the
Birkhoff and Stone dualities and moreover take over *all* the jobs of
more recent inventions such as Davey and Priestley's intersection
structures, coherence spaces, event structures, and Scott domains.

The trick is to use the category CSLat of complete semilattices.  The
connection with binary relations is that a map from a free cslat A (a
complete atomic Boolean algebra, CABA) to an algebraic cslat B (one
whose elements are representable as joins of primes, elements x not the
join of the set of elements strictly less than x) is exactly the notion
of a binary relation from the generators of A to the primes of B.
(Note that the generators of A are also the primes of A, being the
atoms of a CABA.)  An arbitrary CSLat map generalizes this to nonfree
sources and nonalgebraic targets.  The dual A' of a CSLat A is the
cslat A->2 of all CSLat maps from A to the 2-element CSLat, while the
dual f':B'->A' of a map f:A->B is the map taking maps g:B->2 to their
composites gf:A->2.
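The correspondence between relations and join-preserving maps can be
checked by brute force on a small example (my own sketch): a relation
R from the generators X of the free cslat P(X) induces the map sending
S to its R-image, and this map preserves all joins (unions).

```python
# Sketch: a relation R from X to Y induces f : P(X) -> P(Y),
# f(S) = { y | some x in S has (x,y) in R }, preserving all joins.

from itertools import combinations

X = {0, 1, 2}
R = {(0, 'a'), (1, 'a'), (1, 'b')}

def f(S):
    return frozenset(y for (x, y) in R if x in S)

def subsets(s):
    s = list(s)
    return [frozenset(c) for r in range(len(s) + 1)
            for c in combinations(s, r)]

# f preserves the empty join and binary joins, hence all joins here
assert f(frozenset()) == frozenset()
for S in subsets(X):
    for T in subsets(X):
        assert f(S | T) == f(S) | f(T)
```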

This is pretty good except that, as in vector spaces, A&B turns out
to be self-dual.  Hence logic in these domains is rather strange in that
AND and OR look the same.  It turns out that there is a trick for
fixing this that looks at first sight like it might break something,
but which turns out only to break the isomorphism between AND and OR
and create an interesting logic.  The trick is to modify the definition
of complete semilattice to omit the empty join (bottom) but include the
empty meet (top) yielding the notion of event space, dual to state
spaces which have all nonempty meets and empty join.  More details in
/pub/es.tex on boole.stanford.edu, see also algecon.tex and esrs.tex.
In particular algecon explains in detail the connections with coherence
spaces, event structures, and Scott domains.

As a model of QM however event spaces can't "forget", entailing a
number of parallel worlds that is one exponential too large to make
information-theoretic sense.  I would dearly love to see a forgetful
version of ES.  If such exists its dualizer is likely to properly embed
a cycle.  Another mismatch with QM is that ES has a commutative tensor
when QM seems to want braids, see e.g. Street and Joyal, "An
Introduction to Tannaka Duality and Quantum Groups", LNM 1488.

	Vaughan Pratt