Computational Type Theory

A lecture series given by Bob Harper at OPLSS 2018

Part One

Lecture 1

Type theory from a computational perspective


References

  • Martin-Löf, Constructive Mathematics and Computer Programming
  • Constable et al., NuPRL System & Semantics

The plan is to develop type theory starting with computation, and developing a theory of truth based on proofs. The connection with formalisms (e.g. theories of proofs, formal derivation, etc.) comes later.

The idea is to emphasize not just playing the Coq/Agda/Lean video game!

Start with a programming language


The language has a deterministic semantics via a transition system.

We have forms of expression (e.g. true, false, and the conditional if(M; N₁; N₂)), and two judgement forms called

  E val        E ↦ E′

These mean that E is fully evaluated, and that we have performed one simplification step of E, yielding E′, respectively.

We have a derived notion

  E ⇓ V

which we can understand by way of a mapping

  E ↦ E′ ↦ … ↦ V val   (iterate ↦ until a value is reached)

As an example expression

  if(M; N₁; N₂)

Operational semantics for binary decision diagrams

  true val
  false val
  if(true; N₁; N₂) ↦ N₁
  if(false; N₁; N₂) ↦ N₂
  if(M; N₁; N₂) ↦ if(M′; N₁; N₂)   when M ↦ M′
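To make the transition system concrete, here is a minimal Haskell sketch (ours, not from the lecture; all names are our own) of this deterministic semantics:

    -- A tiny language of boolean conditionals.
    data Exp = ETrue | EFalse | EIf Exp Exp Exp
      deriving (Eq, Show)

    -- E val: true and false are the canonical (fully evaluated) forms.
    isVal :: Exp -> Bool
    isVal ETrue  = True
    isVal EFalse = True
    isVal _      = False

    -- E ↦ E': one simplification step; Nothing when no step applies.
    -- At most one rule matches any expression, so the semantics is
    -- deterministic and step is a function.
    step :: Exp -> Maybe Exp
    step (EIf ETrue  n1 _) = Just n1
    step (EIf EFalse _ n2) = Just n2
    step (EIf m n1 n2)     = (\m' -> EIf m' n1 n2) <$> step m
    step _                 = Nothing

    -- E ⇓ V: the derived notion, iterating ↦ until a value is reached.
    eval :: Exp -> Exp
    eval e = maybe e eval (step e)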


Principle: types are specifications of program behavior

Judgements are expressions of knowledge, in the intuitionistic sense (Brouwer), based on the premise that mathematics is a human activity, thus finite, and that the only way to grasp facts about infinite objects (Gödel, Russell?) is via algorithms.

E.g. there is no way to convey infinitely many utterly disconnected facts about the natural numbers other than to have a uniformity expressed as an algorithm. We agree on such uniformity solely because of the fundamental faculties of the human mind w.r.t. computation.

In a membership judgement M ∈ A, both M and A are programs, and types are behavioral, not structural.

For example:

  if(true; 17; M) ∈ nat, with no requirement on M.

Why? Because the simplification step if(true; 17; M) ↦ 17 entails that the conditional inherits the type of 17.

Further:

A is a type precisely when:

  A ⇓ A₀ for some canonical type-value A₀

This applies to type expressions as well, e.g.

  if(M; bool; nat) is a type whenever M ∈ bool

This helps explain why a deterministic operational semantics is required: an expression must mean one thing, so the value it yields must be unique across simplification steps. Types/specifications are programs.

Type-indexed families of types a.k.a. dependent types


We can make a judgement that:

  B(a) type   (a ∈ A)

precisely when

  B(a) is a type for every a ∈ A (respecting equality of indices; see Functionality below)

And that:

  b(a) ∈ B(a)   (a ∈ A)

That is, a family of types is indexed by a type. Another way to phrase it, which emphasizes the indexing of a sequence by n ∈ nat, is:

  A₀, A₁, A₂, …

where:

  Aₙ type   (n ∈ nat)

In NuPRL notation:

  n : nat ≫ Aₙ type

This is sometimes represented in the literature as:

  { Aₙ }   (n ∈ nat)

Functionality


Families (of types, of elements) must respect equality of indices. So what is equality?

Trivially:

  a ∈ A

is "the same as"

  a = a ∈ A

or in other words

  A type

is "the same as"

  A = A type

which can be clarified with a slight change in notation, substituting a = a ∈ A for a ∈ A (membership is reflexive equality).

As a type structure gets richer, when two things are equal is a property of the type they inhabit, and it can be arbitrarily complicated. The question of what is true has a much higher -- in fact unbounded -- quantifier complexity, whereas the question of what is written down in any formalism is always relentlessly recursively enumerable. This means that formalisms can never capture equality-based truth! This standpoint is validated by Gödel's theorem.

Indeed, the entire endeavour of higher-dimensional type theory arises from investigating when two types are the same.

Judgements


When A is a type:

  A = A′ type

specifies an exact equality of types.

When a ∈ A:

  a = a′ ∈ A

specifies an exact equality of elements.

Thus, equal indices deterministically produce equal results (this is also why a deterministic operational semantics is a necessity). A term for this is equisatisfaction.

A simple example is the following:

  M = M′   (equality at no type in particular)

cannot be true...

However:

  M = M′ ∈ A

can be true! It always depends on the type and its inhabitants.

This is in contrast to the older tradition in formalized type theory of treating axioms as somehow type-independent. By the principle of propositions-as-types, we now know this is a logical fallacy.

To make the notation a little clearer, from here on we will try to write:

  A = A′ type and a = a′ ∈ A

as:

  A ≐ A′ and a ≐ a′ ∈ A
A computational semantics


  A ≐ A′

means that

  A ⇓ A₀ and A′ ⇓ A₀′

and A₀ and A₀′ are equal type-values, or what Martin-Löf called canonical types.

A type expression is thus a program which evaluates to a type-value, and A ≐ A′ says that the two programs evaluate to equal type-values.

A few core ideas

Given that

  A ≐ A′

we can say

  a ≐ a′ ∈ A exactly when a ≐ a′ ∈ A′   (equal types have equal members)

Thus

  a ∈ A

means

  a ≐ a ∈ A

This induces a certain implication about type indexing, called functionality:

  a ≐ a′ ∈ A implies B(a) ≐ B(a′), for a family a : A ≫ B(a) type

Check that

  this respects the symmetry and transitivity of ≐

A final example in the general case:

  a : A ≫ b(a) ∈ B(a)

means

  a ≐ a′ ∈ A implies b(a) ≐ b(a′) ∈ B(a)

assuming

  a : A ≫ B(a) type

A specific example by deriving the Boolean type


  bool val

i.e. bool names a type-value (hence the val).

What type does it name? The membership relation for canonical elements is

  – ≐ – ∈ bool

It is the strongest relation (though some literature calls this least; the extremal nature is what is important)

such that

  true ≐ true ∈ bool and false ≐ false ∈ bool

The extremal clause is that

  1. The stated conditions hold.
  2. Nothing else!

We can now make a proposition/fact/claim and prove it. We'll use this format from here onward.

Fact

If

  A type

and

  M ≐ M′ ∈ bool

and

  N₁ ≐ N₁′ ∈ A

and

  N₂ ≐ N₂′ ∈ A

then

  if(M; N₁; N₂) ≐ if(M′; N₁′; N₂′) ∈ A

Proof

  • – ≐ – ∈ bool is given by a universal property: it is the least (or strongest) relation containing the two clauses above.
  • Fix some type A, N₁ ≐ N₁′ ∈ A, N₂ ≐ N₂′ ∈ A.
  • If M ≐ M′ ∈ bool then M ⇓ V and M′ ⇓ V′ with V, V′ canonical.
  • Thus, M ≐ M′ ∈ bool means V and V′ agree, and either both are true or both are false.
  • It suffices to show that if(V; N₁; N₂) ≐ if(V′; N₁′; N₂′) ∈ A in each case.
  • if evaluates its principal argument (via a transition step), so the result carries back to the original conditionals: typing is closed under "head-expansion", or alternatively "reverse execution".
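As a sanity check on the behavioral reading, here is a small Haskell sketch (ours), reusing Exp, isVal, and eval from the earlier sketch: membership in bool is a property of how an expression runs, and it is closed under head-expansion because evaluation passes through every simplification step.

    -- M ∈ bool, behaviorally: M evaluates to a canonical boolean.
    inBool :: Exp -> Bool
    inBool m = isVal (eval m)

    -- M ≐ M' ∈ bool: both sides evaluate to the same canonical boolean.
    eqBool :: Exp -> Exp -> Bool
    eqBool m m' = inBool m && eval m == eval m'

    -- Head expansion: if M ↦ M' and inBool M', then inBool M, since
    -- eval M runs through M'. E.g. inBool (EIf ETrue ETrue e) is True
    -- for any e whatsoever, because the third argument is never run.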

Part Two

Lecture 2

Where we are so far


A type system consists of a relation

  A ≐ A′   (exact equality of types)

with, for each type,

  a ≐ a′ ∈ A   (exact equality of elements)

each of which is called a PER (partial equivalence relation).

The expressions above are both symmetric and transitive, that is

If A ≐ A′ then A′ ≐ A, and if A ≐ A′ and A′ ≐ A″ then A ≐ A″ (and similarly for elements).

Hypotheticals express functionality:

  a : A ≫ B(a) type

This means B is a family of types that depends functionally on a ∈ A:

  a ≐ a′ ∈ A

implies

  B(a) ≐ B(a′)

  a : A ≫ b(a) ∈ B(a)

means that b is a family of elements; a family of elements is a mapping.

The expression

  a ≐ a′ ∈ A

implies

  b(a) ≐ b(a′) ∈ B(a)

Similarly for longer contexts, e.g. a : A, b : B(a) ≫ C(a, b) type.

All of the above constitutes a type system defined in terms of evaluation (hence computational) using the language of constructive mathematics.

Now we can start to write down fragments of our own type theories.

There exists a type system containing a type bool


Presuming

  bool ≐ bool   (i.e. bool names a type-value)

The expression

  M ≐ M′ ∈ bool

is true either when

  M ⇓ true and M′ ⇓ true

or

  M ⇓ false and M′ ⇓ false

Fact

If

  A type

and

  M ≐ M′ ∈ bool

and

  N₁ ≐ N₁′ ∈ A

and

  N₂ ≐ N₂′ ∈ A

then

  if(M; N₁; N₂) ≐ if(M′; N₁′; N₂′) ∈ A

Proof

  • Either M ⇓ true and M′ ⇓ true, in which case

    • if(M; N₁; N₂) ≐ N₁ ≐ N₁′ ≐ if(M′; N₁′; N₂′) ∈ A by head expansion (reverse execution)
  • or M ⇓ false and M′ ⇓ false, in which case

    • if(M; N₁; N₂) ≐ N₂ ≐ N₂′ ≐ if(M′; N₁′; N₂′) ∈ A by head expansion (reverse execution)

An example

Given

  x : bool ≫ M ∈ A

This can be generalized by Shannon expansion.

If x ∈ bool then

  M ≐ if(x; [true/x]M; [false/x]M) ∈ A

which can be seen as a "pivot on x". Binary decision diagrams are created by choosing pivots in a way that minimizes the size of conditionals.
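As a small illustration (ours, not the lecture's), Shannon expansion for a one-argument boolean function can be written directly in Haskell:

    -- Shannon expansion: pivot a boolean function on its argument.
    -- For every f and x, f x equals shannon f x.
    shannon :: (Bool -> Bool) -> Bool -> Bool
    shannon f x = if x then f True else f False

Repeatedly pivoting a many-argument function one variable at a time, and sharing equal sub-conditionals, is how reduced ordered BDDs keep conditionals small.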

  • The given facts look like definitions in a formal type system, and this is intentional.
  • Semantics define what is true, and the formalisms that follow are a pale approximation, useful for implementation.
  • These "little lemmas" are the method of logical relations; type-indexed information determines what equality means.

There exists a type system containing a type Nat


  M ≐ M′ ∈ nat

is the extremal (strongest/least) relation s.t.

  • Either M ⇓ zero and M′ ⇓ zero
  • Or M ⇓ succ(N) and M′ ⇓ succ(N′) where N ≐ N′ ∈ nat
  • The extremal clause provides a morally-equivalent notion of an induction principle.

Given some assumptions

  (untyped λ-abstraction and application, with ap(λx. M; N) ↦ [N/x]M)

We can define the Y-combinator

  Y = λf. (λx. f(x x)) (λx. f(x x))

Y(f) is thus the greatest solution to the specification

  x ≐ f(x)
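In Haskell (a sketch of ours; laziness makes this definable directly), a fixed-point operator witnessing such a solution is:

    -- fix f is a solution of the fixed-point specification x = f x.
    -- (The lecture reads Y(f) as the greatest solution of the
    -- specification; domain-theoretic presentations describe the same
    -- operator as the least fixed point.)
    fix :: (a -> a) -> a
    fix f = f (fix f)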

We may now define a recursor rec, with transitions

  rec(zero; N₀; x.y.N₁) ↦ N₀
  rec(succ(M); N₀; x.y.N₁) ↦ [M, rec(M; N₀; x.y.N₁) / x, y] N₁

Then rec(M; N₀; x.y.N₁) ∈ A

if

  M ∈ nat

and

  N₀ ∈ A

and

  x : nat, y : A ≫ N₁ ∈ A
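The recursor's transitions can be transcribed as a Haskell sketch (ours; the constructor and function names are our own):

    data Nat = Zero | Succ Nat

    -- rec(M; N₀; x.y.N₁): rec on Zero steps to the base case; rec on
    -- Succ n steps to the inductive case applied to n and to the
    -- recursive result on n.
    natrec :: Nat -> a -> (Nat -> a -> a) -> a
    natrec Zero     n0 _  = n0
    natrec (Succ n) n0 n1 = n1 n (natrec n n0 n1)

    -- Example: addition by recursion on the first argument.
    plus :: Nat -> Nat -> Nat
    plus m n = natrec m n (\_ r -> Succ r)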

Fact

If M ∈ nat then rec(M; N₀; x.y.N₁) ∈ A

Proof

Case for M ⇓ zero:

  rec(M; N₀; x.y.N₁) evaluates through N₀ ∈ A, so we conclude by head expansion.

Case for M ⇓ succ(N):

  • Given an inductive hypothesis for N, rec(N; N₀; x.y.N₁) ∈ A ... (the proof is unfinished)

These (bool, nat) are representative examples of inductive types (extremal in the least/strongest sense: the least relations that satisfy some conditions). They are not representative examples of coinductive types (extremal in the greatest/weakest sense: the greatest relations that imply / are consistent with some conditions).

There exists a type system containing a type Prod


  A₁ × A₂ val

and

  ⟨M₁, M₂⟩ val

and

  ⟨M₁, M₂⟩ ≐ ⟨M₁′, M₂′⟩ ∈ A₁ × A₂

where

  M₁ ≐ M₁′ ∈ A₁ and M₂ ≐ M₂′ ∈ A₂

that is, the values on each side are evaluated and compared under the conditions in the corresponding component type.

Fact

If M ≐ M′ ∈ A₁ × A₂ then M · 1 ≐ M′ · 1 ∈ A₁ and M · 2 ≐ M′ · 2 ∈ A₂ (one could sub other notation here, e.g. fst, etc.)

where

  ⟨M₁, M₂⟩ · 1 ↦ M₁

and

  ⟨M₁, M₂⟩ · 2 ↦ M₂

and

  M · i ↦ M″ · i when M ↦ M″

A note about logical relations as a tactic: membership in product types is defined in terms of equality in each of the component types. Constituent types are, in a sense, already given, and then we speak about the composite types.

Going further...

If M₁ ∈ A₁, then ⟨M₁, M₂⟩ · 1 ∈ A₁, which has no requirement on M₂.

Recall that by head-expansion (a.k.a. reverse-execution), since ⟨M₁, M₂⟩ · 1 ↦ M₁, the projection inherits the type of M₁.

This may seem like a "technical anomaly", but it is an important insight into how computational type theory relies on specifications, as opposed to a grammar for writing down well-formed things according to syntactic rules that impose additional requirements beyond those described by this fact. Formalisms are about obeying protocols.
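A Haskell sketch (ours) of these rules: pairs are canonical outright, with no requirement on their components, and a projection of a pair steps directly to the component, which is head-expansion made visible.

    -- Extending the earlier sketch with lazy pairs and projections.
    data ExpP = PTrue | PFalse | PPair ExpP ExpP | PFst ExpP | PSnd ExpP
      deriving (Eq, Show)

    -- ⟨M₁, M₂⟩ val, with no requirement on M₁ or M₂.
    isValP :: ExpP -> Bool
    isValP PTrue       = True
    isValP PFalse      = True
    isValP (PPair _ _) = True
    isValP _           = False

    -- Projections evaluate their principal argument, then project.
    stepP :: ExpP -> Maybe ExpP
    stepP (PFst (PPair m1 _)) = Just m1
    stepP (PSnd (PPair _ m2)) = Just m2
    stepP (PFst m)            = PFst <$> stepP m
    stepP (PSnd m)            = PSnd <$> stepP m
    stepP _                   = Nothing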

  • Completing the proof of fact w.r.t. Prod from the assumptions (in similar vein to the earlier examples) is left as an exercise.

There exists a type system containing a type Fn


  A₁ → A₂ val

and

  λx. M val

and

  A₁ → A₂ ≐ A₁′ → A₂′ when A₁ ≐ A₁′ and A₂ ≐ A₂′

and

  λx. M ≐ λx. M′ ∈ A₁ → A₂

where

  x : A₁ ≫ M ≐ M′ ∈ A₂

Given some assumptions about application, the transitions are

  ap(M; N) ↦ ap(M″; N) when M ↦ M″, and ap(λx. M; N) ↦ [N/x]M

The latter is a kind of β-reduction, or at least a well-specified reduction strategy.
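A substitution-based Haskell sketch (ours; variables are strings, and we assume bound names are distinct so capture can be ignored):

    -- Terms of the function fragment.
    data Tm = TVar String | TLam String Tm | TApp Tm Tm
      deriving (Eq, Show)

    -- λx. M val: abstractions are canonical.
    isValT :: Tm -> Bool
    isValT (TLam _ _) = True
    isValT _          = False

    -- [N/x]M: capture-naive substitution.
    subst :: String -> Tm -> Tm -> Tm
    subst x n (TVar y)     = if x == y then n else TVar y
    subst x n (TLam y m)   = if x == y then TLam y m
                             else TLam y (subst x n m)
    subst x n (TApp m1 m2) = TApp (subst x n m1) (subst x n m2)

    -- ap(M; N): evaluate the function position, then β-reduce without
    -- evaluating the argument (a lazy, well-specified strategy).
    stepT :: Tm -> Maybe Tm
    stepT (TApp (TLam x m) n) = Just (subst x n m)
    stepT (TApp m n)          = (\m' -> TApp m' n) <$> stepT m
    stepT _                   = Nothing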

Fact

If M ≐ M′ ∈ A₁ → A₂ and N ≐ N′ ∈ A₁ then ap(M; N) ≐ ap(M′; N′) ∈ A₂.

We can translate this into a usual type-theoretic notation like so

  M ∈ A₁ → A₂    N ∈ A₁
  ---------------------
      ap(M; N) ∈ A₂

and the inductive definition is the "protocol" or "syntax" of the aforementioned fact.

Observe: what is the quantifier complexity of M ∈ A₁ → A₂?

In an informal sense, one can say

  for all N and N′

such that

  N ≐ N′ ∈ A₁, we have ap(M; N) ≐ ap(M; N′) ∈ A₂

This is the reason that starting with the formal syntax is inadequate, because an induction rule like the application rule above

is a derivation tree that results in a quantifier complexity of "there exists something" (namely, there exists a derivation). But "for all" cannot be captured by "there exists" alone!

Fact

If

  M ∈ A₁ → A₂ and M′ ∈ A₁ → A₂

and

  x : A₁ ≫ ap(M; x) ≐ ap(M′; x) ∈ A₂

then

  M ≐ M′ ∈ A₁ → A₂

One may call this "function extensionality"

  • Completing the proof of facts w.r.t. Fn from the assumptions (in similar vein to the earlier examples) is left as an exercise.

This is profound: one cannot axiomatize equality in A₁ → A₂. This can be seen as a deeper understanding of Gödel's theorem.

There exists a type system containing a type Dependent Prod


  Σ a:A₁. A₂(a) val

where

  a : A₁ ≫ A₂(a) type

and, different from Prod,

  ⟨M₁, M₂⟩ ≐ ⟨M₁′, M₂′⟩ ∈ Σ a:A₁. A₂(a) when M₁ ≐ M₁′ ∈ A₁ and M₂ ≐ M₂′ ∈ A₂(M₁)

which encodes the dependency between the first component and the type of the second.

There exists a type system containing a type Dependent Fn


From the definition, it's easy enough to look back at the definitions / initial transitions for Fn and substitute the following, as we did for Prod:

  Π a:A₁. A₂(a) in place of A₁ → A₂

The meaning of the change is as follows:

If

  N ≐ N′ ∈ A₁

then

  ap(M; N) ≐ ap(M; N′) ∈ A₂(N)

Fact

  1. If M ≐ M′ ∈ Π a:A₁. A₂(a) then M ⇓ λx. M₂ and M′ ⇓ λx. M₂′

  2. If M ≐ M′ ∈ Π a:A₁. A₂(a) and N ≐ N′ ∈ A₁ then ap(M; N) ≐ ap(M′; N′) ∈ A₂(N)

  • Completing the proof of fact w.r.t. Dependent Fn from the assumptions (in similar vein to the earlier examples) is left as an exercise.

In summary, so far

This is a development of a simple, inherently computational dependent type system (bool, nat, products, functions, and their dependent generalizations)

which can be used to prove an earlier claim, namely that type systems containing these types exist.

Part Three

Lecture 3

Formalisms

Formal type theory is inductively defined by derivation rules.

We want to express some statements regarding what is called "definitional equality".

One can write down axioms of the typical form:

  ⟨M₁, M₂⟩ · i ≡ Mᵢ   (i = 1, 2)

where · i is one of the projections of the pair.

Formal logic statements and their corresponding types

  ⊤ (truth), called unit

  ⊥ (falsity), called void

  A ∧ B (conjunction), called product

  A ∨ B (disjunction), called sum

  A ⊃ B (implication), called function

  ∀ a:A. B(a), called function[a] (the dependent function type)

  ∃ a:A. B(a), called product[a] (the dependent product type)
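The non-dependent rows of this table transcribe directly into Haskell (a sketch; the two dependent rows have no direct Haskell analogue):

    import Data.Void (Void, absurd)

    -- truth   (⊤) ~ unit:     ()
    -- falsity (⊥) ~ void:     Void, the type with no constructors
    -- A ∧ B       ~ product:  (a, b)
    -- A ∨ B       ~ sum:      Either a b
    -- A ⊃ B       ~ function: a -> b

    -- Propositions-as-types: proofs are programs.
    andCommutes :: (a, b) -> (b, a)
    andCommutes (x, y) = (y, x)

    -- Anything follows from falsity.
    exFalso :: Void -> a
    exFalso = absurd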

Part Four

Lecture 4

Part Five

Lecture 5