*understand* quite well.

One of the most celebrated aspects of functional programming is the correspondence showing that logical propositions and programming language types are essentially the same thing, known as the "Curry-Howard" correspondence. Less known, but perhaps even more interesting, is that each common type also corresponds to an algebraic operation. Let's look at all three in a table (the types are written in OCaml syntax):

| Type | Constructor | Logic | Algebra |
|------|-------------|-------|---------|
| `void` | none | $\bot$ | $0$ |
| `unit` | `()` | $\top$ | $1$ |
| `'a * 'b` | `(a, b)` | $A\wedge B$ | $A\times B$ |
| `('a, 'b) sum` | `Left a`, `Right b` | $A\vee B$ | $A+B$ |
| `'a -> 'b` | `fun x -> b` | $A\supset B$ | $B^A$ |

where `type ('a, 'b) sum = Left of 'a | Right of 'b`.

There are many tutorial introductions explaining this fascinating correspondence.

What is very cool is that the correspondence runs deep, so that, for example, if an algebraic equation holds then the corresponding types are isomorphic. For example, since $A\times (B+C)=A\times B+A\times C$ it follows that the types 'a * ('b, 'c) sum and ('a * 'b, 'a * 'c) sum are isomorphic. Indeed, the isomorphisms are:

```ocaml
let f = function
  | (a, Left b)  -> Left (a, b)
  | (a, Right c) -> Right (a, c);;
val f : 'a * ('b, 'c) sum -> ('a * 'b, 'a * 'c) sum = <fun>

let g = function
  | Left (a, b)  -> (a, Left b)
  | Right (a, c) -> (a, Right c);;
val g : ('a * 'b, 'a * 'c) sum -> 'a * ('b, 'c) sum = <fun>
```

The correspondence is really about types rather than about OCaml or any particular language, and limitations of the language may occasionally break it. For example, $A^0=1$, and indeed there is always a unique function void -> 'a, just as () : unit is unique. But in OCaml, because of the lack of support for empty patterns, we cannot implement it. A language which has empty patterns, and in which we can implement it, is Agda.

The correspondence is broken in both directions, since in OCaml we can define

```ocaml
let contr : unit -> void =
  let rec div x = div x in div ()
```

which has the type corresponding to $\top\supset\bot$, a logical inconsistency. This again cannot happen in Agda. Note that it can also happen in Haskell, so I wouldn't call Haskell a "pure" language, as it is also not faithful to these correspondences.

So these correspondences may or may not hold in an actual language. In what follows I will be even less rigorous, but rigour here would move us too far away from the clarity required to tell a good story. Apologies in advance; this is merely a blog post, and I hope a bit of licence will not brand the whole thing as "fake news". For rigour I refer the reader to the proper literature, for which I give a few entry points at the end.

Type isomorphisms are cool enough, but the correspondence runs even deeper than that. Consider for example the type of **lists** over some type *A*. Let us enumerate the distinct lists of size *n*: noting that a list of size *n* is essentially an *n*-tuple of *A*s, the number of such lists is $A^n$. So a list, in general, is either an empty list, or a unit-length list, or a length-two list, or ... . So we have that:

$$

L(A)=1+A+A^2+\cdots+A^n+\cdots = \sum_n A^n.

$$

If we think of $L(A)$ as a function, then the above is its Taylor-series expansion, from which it follows that

$$

L(A)=1+A+A^2+\cdots+A^n+\cdots = \sum_n A^n = \frac 1{1-A}.

$$
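As a quick sanity check of those coefficients, we can enumerate by brute force. This is a sketch over `bool` (a type with 2 inhabitants, so there should be $2^n$ lists of length $n$); the name `lists_of_length` is my own, and it assumes OCaml ≥ 4.10 for `List.concat_map`:

```ocaml
(* All bool lists of length n: each of the n positions independently
   holds true or false, so there are 2^n of them. *)
let rec lists_of_length n =
  if n = 0 then [ [] ]
  else
    List.concat_map
      (fun tl -> [ true :: tl; false :: tl ])
      (lists_of_length (n - 1))

let () = assert (List.length (lists_of_length 3) = 8) (* 2^3 *)
```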

On the other hand, a list is either empty or an element paired with a (shorter) list, which gives the recursive equation

$$ L(A) = 1+A\times L(A)$$

which, if we solve for $L(A)$, gives the same result! However, speaking of informality, the equation-solving route does not always apply. Consider the case of natural numbers, defined as type nat = Zero | Suc of nat. The corresponding equation is $N=1+N$, which has no solution. But let us look the other way when faced with such complications.
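The inductive definition behind $L(A)=1+A\times L(A)$ can be written out directly; `mylist` and `size` are hypothetical names of my choosing (to avoid clashing with the built-in list):

```ocaml
(* Nil contributes the 1 summand; Cons contributes the A * L(A) summand
   in L(A) = 1 + A * L(A). *)
type 'a mylist = Nil | Cons of 'a * 'a mylist

(* The exponent n of the A^n term that a given list contributes. *)
let rec size = function
  | Nil -> 0
  | Cons (_, tl) -> 1 + size tl
```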

For the types in the table at the beginning the intuitions can be quite direct, but for the algebraic representation of lists, which involves division and subtraction, direct intuitions are not obvious at all. This is rather fascinating!

Going deeper down this rabbit hole, the algebraic manipulation opens new possibilities. If types are functions $T(A)$, is *the derivative* $T'(A)=\frac{dT(A)}{dA}$ meaningful? Yes! The derivative corresponds to the *zipper* of a type. An element of a zipper is like an element of the original type but with a "highlighted" point, or a "hole".

For example, given a list, its zipper is a list with one highlighted node.

But how does this relate to algebra? As we said, the type of the zipper is computed by the derivative; differentiating both sides of $L(A)=1+A\times L(A)$ gives

$$L'(A)=0 + 1\times L(A) + A\times L'(A)$$

If we solve the equation we get

$$L'(A) = \frac {L(A)}{1-A}= L(A)\times \frac 1 {1-A}$$

But $L(A)= \frac 1 {1-A}$ so

$$L'(A) = L(A)\times L(A)$$

The zipper of a list is a pair of lists! You would need a heart hardened by years of system-level programming, or hollowed by years of corporate middleware drivel, not to be enraptured by how beautiful all this is.
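A minimal sketch of the pair-of-lists zipper in OCaml (the names `zipper`, `of_list`, `move_right`, `to_list` are my own):

```ocaml
(* L'(A) = L(A) * L(A): a list zipper is the pair of lists around the hole.
   [before] holds the elements to the left of the hole, nearest first. *)
type 'a zipper = { before : 'a list; after : 'a list }

let of_list xs = { before = []; after = xs }

(* Shift the hole one element to the right, if there is one. *)
let move_right z =
  match z.after with
  | [] -> None
  | x :: rest -> Some { before = x :: z.before; after = rest }

(* Reconstruct the plain list, forgetting where the hole was. *)
let to_list z = List.rev_append z.before z.after
```

Keeping `before` reversed is what makes moving the hole an O(1) operation, which is the whole point of zippers as a programming device.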

And note that if you compute the Taylor expansion of $L'(A)$ you get the same result as computing the derivative of the Taylor expansion of $L(A)$, i.e.

$$L'(A)=\sum_n (1+n)\times A^n.$$

It all comes together superbly.

After this too-fast introduction let's get to the main course, which is answering this question:

Does the magic of algebra only apply to the inductive types we deal with in functional programming?

The answer might shock you, because it is

**NO**!
Let us take the type of *circular lists* over some type $A$. This is a typical non-inductive data-type, so we cannot retrieve its algebraic representation by solving an equation. But we can via Taylor series, by enumerating the number of distinct such structures of each size $n$. Some not-so-deep thinking leads us to the conclusion that since there are $A^n$ distinct lists of length $n$, there will be $\frac{A^n}n$ distinct circular lists: each time we move the head to the end of the list we get an equivalent circular list, so we must quotient by this equivalence. Thus

$$C(A)=\sum_n \frac{A^n}n = -\log(1-A),$$

by bringing the Taylor expansion to a closed form.

But, hey, how did I do that, going from a Taylor expansion back to a closed-form function? Via zippers! Because I know two things. First, I know that the type of the zipper is the derivative of the original type. Second, I know that the zipper is like the original structure with a hole, and a circular list with a hole looks just like an ordinary list.

So $C'(A)=\frac 1{1-A}$, which means that $C(A)=\int \frac 1{1-A}\,\mathrm{d}A=-\log(1-A)$. And I can easily verify that my intuition is right, because computing the Taylor expansion gives me $C(A)=\sum_n \frac{A^n}n$, which is consistent with the above.
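To make that intuition concrete, here is a sketch assuming a circular list is represented as a plain list of its elements in clockwise order (`hole_at` is a made-up name): punching a hole at position `i` and reading clockwise from it yields an ordinary list, just as $C'(A)=L(A)$ predicts.

```ocaml
(* Reading clockwise from the hole turns a circular list into a plain
   list: the n - 1 surviving elements, in a canonical order. *)
let hole_at i xs =
  let n = List.length xs in
  List.init (n - 1) (fun k -> List.nth xs ((i + 1 + k) mod n))
```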

Is this a fluke? No. Let's take another important non-inductive type, that of multisets. Again, looking at the enumeration of distinct multisets in order to compute the Taylor series, we get that

$$MSet(A)=\sum_n \frac{A^n}{n!}=e^A,$$

since there are $n!$ permutations to quotient by.

From this, since $\frac{\mathrm de^x}{\mathrm d x}=e^x$ it further follows that the zipper of a multiset is a multiset!


Also, we can get isomorphisms such as

$$MSet(A+B)=e^{A+B}=e^A\times e^B=MSet(A)\times MSet(B)$$

which says that a multiset of $A$s or $B$s is the same as a pair of a multiset of $A$s and a multiset of $B$s. The isomorphism going one way is partitioning by the obvious equivalence; going the other way, it is multiset union.
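Representing multisets as lists whose order we ignore, the two directions of this isomorphism can be sketched as follows (`split` and `merge` are names of my choosing; the `sum` type from the table is redeclared to keep the snippet self-contained, and `List.partition_map` assumes OCaml ≥ 4.08):

```ocaml
type ('a, 'b) sum = Left of 'a | Right of 'b

(* MSet(A + B) -> MSet(A) * MSet(B): partition by the tag. *)
let split ms =
  List.partition_map
    (function Left a -> Either.Left a | Right b -> Either.Right b)
    ms

(* MSet(A) * MSet(B) -> MSet(A + B): multiset union, re-tagging. *)
let merge (ms_a, ms_b) =
  List.map (fun a -> Left a) ms_a @ List.map (fun b -> Right b) ms_b
```

As lists, `merge (split ms)` generally reorders `ms`; the round trip is an identity only up to the multiset (order-ignoring) equivalence, which is exactly the point.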

#### Reading list

- A bunch of papers by Conor McBride link
- Combinatorial Species and Tree-like Structures by Bergeron et al. link
- ∂ for Data: Differentiating Data Structures by M. Abbott et al. link
- The Zipper by G. Huet link
- Scrap Your Zippers: A Generic Zipper for Heterogeneous Types by M.D. Adams link
- Seven Trees in One by A. Blass link
- Derivatives of Regular Expressions by J.A. Brzozowski link