👋 Hello, I'm Damien

This is a loosely collected set of notes on computer science and related/applicable fields of study.

They are typically quite raw and only for my own personal use, except for posts in the editorials section, which are intended as blog posts for public consumption.

Software engineering is about division of labor

  • Between users and machines...
  • Between clients and servers...
  • Between different programmers...
  • Between different modules...

Tension lies between notions of abstraction and composition

  • Abstraction (which embodies division)
  • Composition (which embodies harmony)

PL theory is "advancing linguistic solutions to the contradiction between abstraction and composition" - Reynolds, 1983.

This is a beautiful quote from a presentation by Jon Sterling.

I am fascinated by PL theory, type theory, and the idea of the computational trilogy, and this informs how I think about my work as a professional software developer.

My goal is to pull useful abstractions from the fields of logic, philosophy, ethics, and artificial intelligence (at least in my particular niche as a research engineer) and make them available both to my colleagues at work as well as to interested others in the open source and general public.

Hopefully, the decision to publish both my raw and refined thoughts via this mdbook will help motivate me to keep articulating, editing, and learning indefinitely.

This section pertains to hobby or independent, open-source work.

Computational Type Theory

A lecture series given by Bob Harper at OPLSS 2018

Part One

Lecture 1

Type theory from a computational perspective


References

  • Martin-Löf, Constructive Mathematics and Computer Programming
  • Constable et al., NuPRL System & Semantics

The plan is to develop type theory starting with computation, and developing a theory of truth based on proofs. The connection with formalisms (e.g. theories of proofs, formal derivation, etc.) comes later.

The idea is to emphasize not just playing the Coq/Agda/Lean video game!

Start with a programming language


The language has a deterministic semantics via a transition system.

We have forms of expression called

$$E$$

and two judgement forms called

$$E \ \mathsf{val} \qquad\text{and}\qquad E \mapsto E'$$

These mean that $E$ is fully evaluated, and that we have performed one simplification of $E$, respectively.

We have a derived notion

$$E \Downarrow V$$

which we can understand by way of a mapping

$$E \mapsto E_1 \mapsto \cdots \mapsto V, \qquad V \ \mathsf{val}$$
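Determinism of the transition system is what makes evaluation single-valued; in symbols (my phrasing, not a quote from the lecture):

$$\text{if } E \mapsto E_1 \text{ and } E \mapsto E_2 \text{ then } E_1 = E_2$$

and hence an expression evaluates to at most one value.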

As an example expression, consider the conditional $\mathsf{if}(E; E_1; E_2)$.

Operational semantics for binary decision diagrams

$$\mathsf{true} \ \mathsf{val} \qquad\qquad \mathsf{false} \ \mathsf{val}$$

$$\mathsf{if}(\mathsf{true}; E_1; E_2) \mapsto E_1 \qquad\qquad \mathsf{if}(\mathsf{false}; E_1; E_2) \mapsto E_2$$

$$\frac{E \mapsto E'}{\mathsf{if}(E; E_1; E_2) \mapsto \mathsf{if}(E'; E_1; E_2)}$$
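To make the transition-system reading concrete, here is a small Rust sketch (my own, not from the lecture) of the judgements E val and E ↦ E', and of the derived evaluator E ⇓ V, for this conditional fragment:

// A tiny expression language: booleans and the conditional.
#[derive(Clone, Debug, PartialEq)]
enum Expr {
    True,
    False,
    If(Box<Expr>, Box<Expr>, Box<Expr>),
}

// E val: is the expression fully evaluated?
fn is_val(e: &Expr) -> bool {
    matches!(e, Expr::True | Expr::False)
}

// E ↦ E': perform one simplification step, if any applies.
fn step(e: &Expr) -> Option<Expr> {
    match e {
        Expr::If(b, e1, e2) => match b.as_ref() {
            Expr::True => Some((**e1).clone()),
            Expr::False => Some((**e2).clone()),
            // Otherwise, step the principal argument first.
            _ => step(b).map(|b2| Expr::If(Box::new(b2), e1.clone(), e2.clone())),
        },
        _ => None,
    }
}

// E ⇓ V: iterate ↦ until no step applies; determinism makes the result unique.
fn eval(mut e: Expr) -> Expr {
    while let Some(next) = step(&e) {
        e = next;
    }
    e
}

fn main() {
    // if false then true else (if true then false else true)  ⇓  false
    let e = Expr::If(
        Box::new(Expr::False),
        Box::new(Expr::True),
        Box::new(Expr::If(
            Box::new(Expr::True),
            Box::new(Expr::False),
            Box::new(Expr::True),
        )),
    );
    assert!(is_val(&eval(e.clone())));
    assert_eq!(Expr::False, eval(e));
}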


Principle: types are specifications of program behavior

Judgements are expressions of knowledge, in the intuitionistic sense (Brouwer), based on the premise that mathematics is a human activity, and thus finite, and that the only way to grasp infinitely many facts about infinite objects (Gödel, Russell?) is via algorithms.

E.g. there is no way to convey infinitely many utterly disconnected facts about an infinite domain other than to have a uniformity expressed as an algorithm. We agree on such uniformity solely because of the fundamental faculties of the human mind w.r.t. computation.

Both the terms and the types here are programs, and judgements like $M \in A$ are behavioral, not structural.

For example:

$$\mathsf{if}(\mathsf{true}; E_1; E_2) \in A \quad\text{whenever } E_1 \in A, \text{ with no requirement on } E_2$$

Why? Because the simplification step entails $\mathsf{if}(\mathsf{true}; E_1; E_2) \mapsto E_1$.

Further:

$A$ is a type precisely when:

$$A \Downarrow A_0, \quad A_0 \text{ a canonical type (a type-value)}$$

This applies to type expressions as well, e.g. $\mathsf{if}(E; \mathsf{bool}; \mathsf{nat})$, which is a type when it evaluates to one.

This helps explain why a deterministic operational semantics is required: the meaning must stay the same across each simplification step, and determinism makes it unique. Types/specifications are programs.

Type-indexed families of types a.k.a. dependent types


We can make a judgement that:

$B$ is a type, for $x \in A$

precisely when

whenever $a \in A$, $B[a/x]$ is a type.

And that:

whenever $a \doteq a' \in A$, $B[a/x] \doteq B[a'/x]$.

That is, a family of types is indexed by a type. Another way to phrase it, which emphasizes the indexing of a sequence by $x$, is:

$$\{ B_x \}_{x \in A}$$

where:

each $B_x$ is a type, and equal indices give equal types.

In NuPRL notation:

$$x : A \gg B \ \mathsf{type}$$

This is sometimes represented in the literature as:

$$x : A \vdash B \ \mathsf{type}$$

Functionality


Families (of types, of elements) must respect equality of indices. So what is equality?

Trivially:

is "the same as"

or in other words

is "the same as"

which can be clarified with a slight change in notation, substituting for

As a type structure gets richer, when two things are equal is a property of the type they inhabit, and it can be arbitrarily complicated. What is true has a much higher -- in fact unbounded -- quantifier complexity, whereas what can be written down in any formalism is always relentlessly recursively enumerable. This means that formalisms can never fully capture equality-based truth! This standpoint is validated by Gödel's theorem.

Indeed, the entire endeavour of higher-dimensional type theory arises from investigating when two types are the same.

Judgements


When $A$ is a type:

$$A \doteq B \ \mathsf{type}$$

specifies an exact equality of types.

When $M \in A$:

$$M \doteq N \in A$$

specifies an exact equality of elements.

Thus, equal indices deterministically produce equal results (this is also why a deterministic operational semantics is a necessity). A term for this is equisatisfaction.

A simple example is the following:

cannot be true...

However:

is true! It always depends on the inhabitants of the type.

This is in contrast to the older tradition in formalized type theory of axioms as being somehow type-independent. By the principle of propositions-as-types, we now know this is a logical fallacy.

To make the notation a little clearer, from here on we will try to write:

as:

A computational semantics


$$A \doteq B \ \mathsf{type}$$

means that

$A \Downarrow A_0$ and $B \Downarrow B_0$, and $A_0$ and $B_0$ are equal type-values, or what Martin-Löf called canonical types.

A type is thus a program, and $A \doteq B$ holds when the two programs evaluate to equal type-values.

A few core ideas

Given that

we can say

Thus

means

This induces a certain implication about type indexing, called functionality:

Check that

A final example in the general case:

means

assuming

A specific example by deriving the Boolean type


$$\mathsf{bool} \doteq \mathsf{bool} \ \mathsf{type}$$

i.e. $\mathsf{bool}$ names a type-value (hence the $\doteq$).

What type does it name? The membership relation for canonical elements is

$$\doteq \ \in \mathsf{bool}$$

It is the strongest relation (though some literature calls this least; the extremal nature is what is important)

such that

$$\mathsf{true} \doteq \mathsf{true} \in \mathsf{bool} \qquad\text{and}\qquad \mathsf{false} \doteq \mathsf{false} \in \mathsf{bool}$$

The extremal clause is that

  1. The stated conditions hold.
  2. Nothing else!

We can now make a proposition/fact/claim and prove it. We'll use this format from here onward.

Fact

If

and

and

and

then

Proof

  • is given by a universal property, the least (or most) containing
  • Fix some type , , .
  • If then
  • Thus, means and either
  • It suffices to show that
  • The conditional evaluates its principal argument first (via a transition step). Typing is closed under "head-expansion", or alternatively "reverse execution".

Part Two

Lecture 2

Where we are so far


A type system consists of a relation $A_0 \doteq B_0$ between type-values, together with, for each type-value $A_0$, a relation $M_0 \doteq N_0 \in A_0$ between values.

Each such relation is both symmetric and transitive (a PER), that is,

if $A_0 \doteq B_0$ then $B_0 \doteq A_0$, and if $A_0 \doteq B_0$ and $B_0 \doteq C_0$ then $A_0 \doteq C_0$.

Hypotheticals express functionality:

$$x : A \gg B \ \mathsf{type}$$

This means $B$ is a family of types that depends functionally on $x$.

$$a \doteq a' \in A$$

implies

$$B[a/x] \doteq B[a'/x] \ \mathsf{type}$$

$$x : A \gg N \in B$$

means that $N$ is a family of elements; a family of elements is a mapping.

The expression

$$a \doteq a' \in A$$

implies

$$N[a/x] \doteq N[a'/x] \in B[a/x]$$

Similarly for hypothetical equalities of families, e.g. $x : A \gg B \doteq B' \ \mathsf{type}$.

All of the above constitutes a type system defined in terms of evaluation (hence computational) using the language of constructive mathematics.

Now we can start to write down fragments of our own type theories.

There exists a type system containing a type bool


Presuming

$$\mathsf{bool} \doteq \mathsf{bool} \ \mathsf{type}$$

The expression

$$M \doteq N \in \mathsf{bool}$$

is true either when

$$M \Downarrow \mathsf{true} \ \text{ and } \ N \Downarrow \mathsf{true}$$

or when

$$M \Downarrow \mathsf{false} \ \text{ and } \ N \Downarrow \mathsf{false}$$

Fact

If

and

and

and

then

Proof

  • Either

    • by head expansion (reverse execution)
  • or

    • by head expansion (reverse execution)

An example

Given a concrete boolean expression, one can case-split on a variable it mentions.

This can be generalized by Shannon expansion.

If $x \in \mathsf{bool}$ and $e$ mentions $x$, then

$$e \doteq \mathsf{if}(x;\ e[\mathsf{true}/x];\ e[\mathsf{false}/x])$$

which can be seen as a "pivot on $x$". Binary decision diagrams are created by choosing pivots in a way that minimizes the size of the conditionals.
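Here is a small Rust sketch (my own, not from the lecture) of Shannon expansion as a purely syntactic pivot on a chosen variable:

// Boolean expressions: variables, constants, and conditionals.
#[derive(Clone, Debug)]
enum BExp {
    Var(&'static str),
    Const(bool),
    If(Box<BExp>, Box<BExp>, Box<BExp>),
}

// e[v/x]: substitute the constant v for the variable x.
fn subst(e: &BExp, x: &str, v: bool) -> BExp {
    match e {
        BExp::Var(y) if *y == x => BExp::Const(v),
        BExp::Var(_) | BExp::Const(_) => e.clone(),
        BExp::If(c, t, f) => BExp::If(
            Box::new(subst(c, x, v)),
            Box::new(subst(t, x, v)),
            Box::new(subst(f, x, v)),
        ),
    }
}

// Shannon expansion ("pivot on x"): e becomes if x then e[true/x] else e[false/x].
// A BDD is built by choosing pivot variables so the resulting conditionals stay small.
fn pivot(e: &BExp, x: &'static str) -> BExp {
    BExp::If(
        Box::new(BExp::Var(x)),
        Box::new(subst(e, x, true)),
        Box::new(subst(e, x, false)),
    )
}

fn main() {
    // Pivot "if x then y else false" on y.
    let e = BExp::If(
        Box::new(BExp::Var("x")),
        Box::new(BExp::Var("y")),
        Box::new(BExp::Const(false)),
    );
    println!("{:?}", pivot(&e, "y"));
}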

  • The given facts look like definitions in a formal type system, and this is intentional.
  • Semantics define what is true, and formalisms that follow are a pale approximation which are useful for implementation.
  • These "little lemmas" are the method of logical relations; type-indexed information determines what equality means.

There exists a type system containing a type Nat


$M \doteq N \in \mathsf{nat}$ is the extremal (strongest/least) relation s.t.

  • Either $M \Downarrow \mathsf{zero}$ and $N \Downarrow \mathsf{zero}$
  • Or $M \Downarrow \mathsf{succ}(M')$ and $N \Downarrow \mathsf{succ}(N')$ with $M' \doteq N' \in \mathsf{nat}$
  • The extremal clause provides a morally-equivalent notion of an induction principle.

Given some assumptions

We can define the Y-combinator

is thus the greatest solution to the specification

We may now define a recursor .

if

and

and

Fact

If then

Proof

Case for :

Case for

  • Given an inductive hypothesis ... (the proof is unfinished)

These (bool, nat) are representative examples of inductive types (least extremal clauses that satisfy some conditions). They are not representative examples of coinductive types (greatest extremal clauses that imply/are consistent with some conditions).
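As a small Rust sketch (my own, and only a structural analogy, since here nat is a data type rather than a behavioral specification), the recursor over an inductively defined nat can be written as:

// Natural numbers as an inductive type.
#[derive(Clone, Debug, PartialEq)]
enum Nat {
    Zero,
    Succ(Box<Nat>),
}

// The recursor, following the two evaluation rules
//   rec(zero; base; step)    -> base
//   rec(succ(m); base; step) -> step(m, rec(m; base; step))
fn rec<A>(n: &Nat, base: A, step: &impl Fn(&Nat, A) -> A) -> A {
    match n {
        Nat::Zero => base,
        Nat::Succ(m) => {
            let ih = rec(m, base, step);
            step(m, ih)
        }
    }
}

// Example: addition by recursion on the first argument.
fn add(m: &Nat, n: &Nat) -> Nat {
    rec(m, n.clone(), &|_pred: &Nat, acc: Nat| Nat::Succ(Box::new(acc)))
}

fn main() {
    let one = Nat::Succ(Box::new(Nat::Zero));
    let two = Nat::Succ(Box::new(one.clone()));
    let three = Nat::Succ(Box::new(two.clone()));
    assert_eq!(three, add(&one, &two)); // 1 + 2 = 3
}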

There exists a type system containing a type Prod


and

and

where

that is, that the values for each are evaluated under the conditions in

Fact

If then and (one could sub other notation here, e.g. fst, etc.)

where

and

and

A note about logical relations as a tactic: membership in product types is defined in terms of equality in each of the component types. Constituent types are, in a sense, already given, and then we speak about the composite types.
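As a very loose Rust analogy (my own, and much weaker than the semantic definition): equality at a product type is derived componentwise from equality at each of the component types, which are assumed to be given first.

// Equality on the composite type is derived componentwise from equality
// at each of the (already given) component types.
#[derive(Debug, PartialEq)]
struct Pair<A, B>(A, B);

fn main() {
    // Equal because each component is equal.
    assert_eq!(Pair(1 + 1, "a"), Pair(2, "a"));
}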

Going further...

If , then , which has no requirement on .

Recall that by head-expansion (a.k.a reverse-execution) .

This may seem like a "technical anomaly", but it is an important insight into how computational type theory relies on specifications, as opposed to a grammar for writing down well-formed things according to syntactic rules that impose additional requirements beyond those described by this fact. Formalisms are about obeying protocols.

  • Completing the proof of fact w.r.t. Prod from the assumptions (in a similar vein to the earlier examples) is left as an exercise.

There exists a type system containing a type Fn


and

and

and

where

Given some assumptions

The latter is a kind of $\beta$-reduction, or at least a well-specified reduction strategy.

Fact

If and then .

We can translate this into a usual type-theoretic notation like so

and the inductive definition is the "protocol" or "syntax" of the aforementioned fact.

Observe: what is the quantifier complexity of ?

In an informal sense, one can say

such that

This is the reason that starting with the formal syntax is inadequate, because an induction rule like

is a derivation tree that results in a quantifier complexity of "there exists something". But cannot be captured by alone!

Fact

If

and

then

One may call this "function extensionality"

  • Completing the proof of facts w.r.t. Fn from the assumptions (in a similar vein to the earlier examples) is left as an exercise.
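Written out in the notation used above (my own rendering of the claim, not a quote from the lecture), function extensionality reads:

$$\text{if } x : A \gg f(x) \doteq g(x) \in B \ \text{ then } \ f \doteq g \in A \to B$$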

This is profound: one cannot axiomatize equality at function types; this is a consequence of a deeper understanding of Gödel's theorem.

There exists a type system containing a type Dependent Prod


where

and, different from Prod,

which encodes the dependency between $x$ and $B$.
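Spelled out (my own rendering, assuming pairs are the canonical elements), the membership condition for the dependent product might be written:

$$\langle M, N \rangle \doteq \langle M', N' \rangle \in \Sigma\, x{:}A.\ B \quad\text{when}\quad M \doteq M' \in A \ \text{ and } \ N \doteq N' \in B[M/x]$$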

There exists a type system containing a type Dependent Fn


From the definition, it's easy enough to look back at the definitions / initial transitions for Fn and substitute the following, as we did for Prod

The meaning of the change is as follows:

If

then

Fact

  1. If then and

  2. If and then

  • Completing the proof of fact w.r.t. Dependent Fn from the assumptions (in a similar vein to the earlier examples) is left as an exercise.

In summary, so far

This is a development of a simple, inherently computational dependent type system

which can be used to prove an earlier claim

Part Three

Lecture 3

Formalisms

Formal type theory is inductively defined by derivation rules.

We want to express some statements regarding what is called "definitional equality".

One can write down axioms of the typical form:

where is one of the projections of .

Formal logic statements and their corresponding types

$\top$ (truth), called unit

$\bot$ (falsity), called void

$A \wedge B$ (conjunction), called product

$A \vee B$ (disjunction), called sum

$A \supset B$ (implication), called function

$\forall x{:}A.\ B$ (universal quantification), called function[a]

$\exists x{:}A.\ B$ (existential quantification), called product[a]
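As a rough Rust analogy (my own mapping; Rust cannot express the two dependent cases, so those are omitted):

#![allow(unused)]

// Truth corresponds to unit.
type Unit = ();

// Falsity corresponds to void, a type with no values.
enum Void {}

// Conjunction corresponds to product.
type And<A, B> = (A, B);

// Disjunction corresponds to sum.
enum Or<A, B> {
    Left(A),
    Right(B),
}

// Implication corresponds to function.
type Implies<A, B> = fn(A) -> B;

// Ex falso quodlibet: from a value of Void, anything follows.
fn absurd<A>(v: Void) -> A {
    match v {}
}

fn main() {}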

Part Four

Lecture 4

Part Five

Lecture 5

Algebraic Effects and Handlers

Andrej Bauer

Basics

Take the operation symbol for addition, plus:

3 + (4 + 5)

plus(b:2 if b = 0 then 3
         else plus(c:2 if c = 0 then 4
                       else 5))

where is the arity, or number of distinct effects


where is the parameter type, a monad


where is an unbounded type; a delimited continuation
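Here is a small Rust sketch (my own rendering, not Bauer's calculus) of this tree view: plus is an operation symbol whose continuation is indexed by its arity (here 2), and a handler interprets the operation, in this case as actual addition.

enum Comp {
    // A finished computation returning a number.
    Ret(i64),
    // The operation `plus`: its argument is a continuation from the arity
    // (an index 0 or 1) to the rest of the computation tree.
    Plus(Box<dyn Fn(usize) -> Comp>),
}

// The expression 3 + (4 + 5) as an operation tree.
fn example() -> Comp {
    Comp::Plus(Box::new(|b| {
        if b == 0 {
            Comp::Ret(3)
        } else {
            Comp::Plus(Box::new(|c| if c == 0 { Comp::Ret(4) } else { Comp::Ret(5) }))
        }
    }))
}

// A handler that interprets `plus` as ordinary addition.
fn handle(t: &Comp) -> i64 {
    match t {
        Comp::Ret(n) => *n,
        Comp::Plus(k) => handle(&k(0)) + handle(&k(1)),
    }
}

fn main() {
    assert_eq!(12, handle(&example())); // 3 + (4 + 5)
}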

Sources

Non-markup code which is adjacent to this repo can be found here: https://github.com/damienstanton/notes

Code will generally, as in the notes, be written in Rust, Swift, Haskell, or Agda.

This section pertains to coursework notes in the Master's in Computer Science (MSCS) program at CU Boulder.

CSCA 5414: Dynamic Programming, Greedy Algorithms

Rod cutting is shown as a canonical example of using DP in a recursive and then memoized setting (a third example recovering the data from the table is elided for brevity).

#![allow(unused)]
fn main() {
// Naive attempt to implement the recurrence (see dp_rodcutting.ipynb)
fn max_revenue_recursive(len: i32, sizes: &[i32], prices: &[f64]) -> f64 {
    if len == 0 {
        return 0.0;
    }
    if len < 0 {
        // Infeasible cut: "punish" it so it can never win the max.
        return f64::NEG_INFINITY;
    }
    let k = sizes.len();
    assert_eq!(prices.len(), k);
    // Option 0: stop cutting (zero further revenue).
    let mut option_values = vec![0.0];
    for i in 0..k {
        option_values.push(prices[i] + max_revenue_recursive(len - sizes[i], sizes, prices));
    }
    option_values.into_iter().reduce(f64::max).unwrap()
}

#[test]
fn examples() {
    let sizes = vec![1, 3, 5, 10, 30, 50, 75];
    let prices = vec![0.1, 0.2, 0.4, 0.9, 3.1, 5.1, 8.2];
    assert_eq!(3.0, max_revenue_recursive(30, &sizes, &prices).round());
    // The naive recursion is exponential in `len`, so the larger cases below are
    // only practical for the memoized version.
    // assert_eq!(5.0, max_revenue_recursive(50, &sizes, &prices).round());
    // assert_eq!(33.0, max_revenue_recursive(300, &sizes, &prices).round());
}
}

Assumptions

  • len is a non-negative integer (an i32 in the code above)
  • sizes and prices will always have the same length

Finding an optimal substructure

Given some length $\ell$, we say

  1. A cut of size $s_i$ pays $p_i$ revenue.
  2. Optimally cut the remaining $\ell - s_i$.

Example recurrence relations

$$r(\ell) = \max\big(0,\ \max_i\, (p_i + r(\ell - s_i))\big)$$

Example base cases

  • $r(\ell) = 0$ whenever $\ell = 0$. If there is no rod to cut then there is nothing to do.
  • $r(\ell) = -\infty$ whenever $\ell < 0$. If there is negative rod to cut, we will "punish" the cutter infinitely.
#![allow(unused)]
fn main() {
// Memoized (bottom-up) version: table[i] holds the best revenue for a rod of length i
fn max_revenue_memoize(len: i32, sizes: &[i32], prices: &[f64]) -> f64 {
    let n = len as usize;
    let k = sizes.len();
    assert_eq!(prices.len(), k);

    let mut table = vec![0.; n + 1];

    for i in 1..=n {
        // Option 0: make no cut at all (zero revenue); otherwise take cut j
        // and add the already-computed best revenue for the remainder.
        let mut best = 0.;
        for j in 0..k {
            let rest = i as i32 - sizes[j];
            if rest >= 0 {
                best = best.max(prices[j] + table[rest as usize]);
            }
        }
        table[i] = best;
    }
    table[n]
}

#[test]
fn examples() {
    let sizes = vec![1, 3, 5, 10, 30, 50, 75];
    let prices = vec![0.1, 0.2, 0.4, 0.9, 3.1, 5.1, 8.2];
    assert_eq!(3.0, max_revenue_memoize(30, &sizes, &prices).round());
    assert_eq!(5.0, max_revenue_memoize(50, &sizes, &prices).round());
    assert_eq!(33.0, max_revenue_memoize(300, &sizes, &prices).round());
}
}

When no optimal substructure exists

Noting some conditions of the well-known knapsack problem, where $w$ is weight.

Value:

An alternate example of non-optimal substructure is path-finding, where movement decisions may be disjoint. The sequencing of finding optimal substructures thus becomes the driving factor.

EMEA 5065: The neuroscience of leading high-performance teams

Desires

  • Finances and work
  • Relationships
  • Health
  • Recreation
  • Moral, ethical, and spiritual

Greed

  • Stimulates motivation centers in the brain (the more one has, the more rewards one seeks)
  • E.g. sharing accumulated wealth with others often leads to greater emotional satisfaction
  • Brain evolved to anticipate extreme scarcity, we need to actively retrain to anticipate abundance

Two forms of desire

  1. Unconscious drive to survive (bottom-up)
  2. Conscious wanting, driven by optimistic fantasies about the future (top-down), i.e. “wishful seeing” <- this is what we want to tap into in teams. A “breadcrumb” model for the long-term pursuit of goals.

Dopamine control circuit

  • Manage uncontrolled urges of dopamine desire
  • Guides “raw energy” towards profitable ends
  • Abstract concepts and forward-looking strategies
  • Imagination
  • Control of world

Desire circuit vs control circuit

  • Phantoms: things we wish to have

  • Phantoms: building blocks of imagination, creative thought

  • Motivation and persistence

    • Cost-benefit tradeoffs
    • Celebrating small wins
  • "I don't care anymore; I need to stop for good and call it a day" vs "I'm going to take a short break and regroup by focusing on why achieving this goal matters to me and keep going”

References

Dopaminergic circuitry responds not to reward or pleasure, but to reward prediction error

  • Keeping expectations low can maximize the delta between actual and expected rewards
  • Think about this w.r.t. goal setting, and how to discretize tasks
  • Anticipation, reward prediction error, novelty, and curiosity drive motivation
  • Pain and fear diminish motivation, but drive growth and learning (push through to harder goals)

Barriers to motivation

  • Physical pain
  • Unpleasant emotional experiences
  • Ruminating on unpleasant thoughts

Time boxing is a technique to properly process negative experiences (small mourning) - even 90 seconds.

Distant goals and desires (future oriented)

"Time to hunt vs time to eat"

  • Transition from forward-oriented dopamine (wanting) to present-oriented (“here and now”, H&N) neurochemistry.

  • Pursue hard, long-term goals

  • Dopaminergic personality

    • Bored easily
    • Enjoys anticipation and planning more than doing
    • Easily distracted
    • Loves thrill of chase > prize
  • Dopamine fasting (trendy take on asceticism) has some validity but shouldn't be taken to extremes. Fasting, of any kind, can be addictive.

  • Oxytocin

    • Associated initially with sex & breastfeeding
    • Synthesized by the brain when needed, short half-life (~ 3 minutes)
    • Oxytocin, in analogy to dopamine, is not a "trust hormone", but rather signals the anticipation of trustworthiness.

What is engagement?

Engagement activates the prefrontal cortex, provides a sense of empowerment and motivation, and activates areas of the brain involved in reflection and introspection, driving fulfillment and satisfaction.

  • Engagement must utilize the social interfaces wired into the brain
    • Reward systems
    • Trust systems
      • Sense of autonomy and control over one's work, decision-making
      • Sense of meaning and purpose; serving a larger mission

Social brain and compassion

  • Insula (sensorimotor processing, sense of self) and anterior cingulate cortex (threat-processing, pain or suffering)
    • Helps connect the goal-seeking desires of the prefrontal cortex with the interoceptive centers (amygdala, limbic areas)
    • Drives our ability to feel compassion and guilt

SCARF model

  • Status
  • Certainty
  • Autonomy
  • Relatedness (ingroup/outgroup)
  • Fairness

General ideas

  • Be mindful of instance labels
  • Find and work with "chemistry creators"

This section stands in the place of a normal blog.

🤷