Coinduction is the dual of an indispensable mathematical tool: induction. While mathematical induction has been known for thousands of years, coinduction has only been studied for a few decades. It is still used primarily in computer science, where it originated in the field of concurrency theory. Coinduction allows us to define circular or infinite objects (such as streams: lists that can be infinitely long), and to prove things about them.
It should not be confused with this coinduction, which may put you to sleep instead.
Inductive (or recursive) definitions are ubiquitous in mathematics, to the point where they are often left implicit. They follow a common pattern for building up a set of objects incrementally: one or more base cases are established first, and then rules for building new objects from previously constructed ones are given.
The set of finite strings \(S\) on an alphabet \(\Sigma\) is the set inductively defined by the following rules, in inference rule notation:
\[\frac{}{\epsilon \in S} \qquad \frac{s \in S \quad \sigma \in \Sigma}{\sigma s \in S}\]So \(\epsilon\) (the empty string) is a string, and for any symbol \(\sigma\) in the alphabet, we can prepend that onto another string to yield a string. Only the objects generated from the rules are in \(S\).
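To make this concrete, here is a sketch of the same definition as a Coq data type, with one constructor per rule (the names `str`, `eps`, and `scons` are invented for this post):

```coq
(* One constructor per inference rule: a finite string over the alphabet
   [Sigma] is either empty or a symbol prepended to another string. *)
Inductive str (Sigma : Type) : Type :=
| eps   : str Sigma                          (* the empty string, epsilon *)
| scons : Sigma -> str Sigma -> str Sigma.   (* prepend a symbol *)
```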
Inductive definitions can be thought of as an iterative process: we start with the empty set and keep adding objects according to the definition, until in the limit, we reach a fixed point, when applying the rules no longer adds anything new to the set. We add \(\epsilon\), then the length 1 strings, then the length 2 strings, and so on, until we have the infinite set of strings over \(\Sigma\) of any length in \(\mathbb{N}\).
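For the string rules, the iteration looks like
\[\varnothing \subseteq \{\epsilon\} \subseteq \{\text{strings of length} \le 1\} \subseteq \{\text{strings of length} \le 2\} \subseteq \dots\]
and its limit is the set of all finite strings.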
An inductive definition is thus the smallest set closed forward under its defining rules. That is, \(S\) is the smallest set such that \(\epsilon \in S\) and that if \(s \in S\), then \(\sigma s \in S\) for any \(\sigma \in \Sigma\). We apply the rules from premises to conclusion.
Since coinduction is the dual to induction, let’s try “flipping” the inductive definition. A coinductive definition is the largest set closed backward under its defining rules.
What does this mean? For an inductive definition, we can think of the set as starting from \(\varnothing\) and iteratively adding elements according to the rules. For a coinductive definition, we can think of the set as starting from the set of all possible objects (even infinite ones), and iteratively removing objects that contradict the rules.
If we use the same rules that inductively defined \(S\) above, the coinductively defined set \(S'\) is the largest set such that every string in \(S'\) is either \(\epsilon\) or of the form \(\sigma s\) with \(\sigma \in \Sigma\) and \(s \in S'\). Here, the backward closure goes from the conclusion to the premises, the opposite of the forward closure. The set of finite strings, \(S\), is included in \(S'\). But we also have some new strings in \(S'\): the infinitely long strings. Consider the string \(s = aaaaaa \dots\), where \(a \in \Sigma\). We cannot construct it using the base case, but it doesn’t violate the backward closure either: taking off the first \(a\) leaves the same infinite string \(s\), which is again in \(S'\), as required.
The proof tree for \(s\) is infinite, and looks like the following:
\[\large \frac{a \in \Sigma \quad \frac{ a \in \Sigma \quad \frac{ \cdots }{ aaa \dots \in S' } }{ aaa \dots \in S' }}{ aaa \dots \in S' }\]While objects of inductive definitions require finite derivations, objects of coinductive definitions can have infinite derivations.
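Coq (which comes up again at the end of this post) can express the coinductive reading directly. Here is a sketch, continuing the invented names from the earlier snippet: a `CoInductive` type of possibly infinite strings, and the string \(aaaa\dots\) defined by corecursion.

```coq
(* The same two rules, read coinductively: values of [costr] may be built
   from infinitely many constructors. *)
CoInductive costr (Sigma : Type) : Type :=
| coeps  : costr Sigma
| cocons : Sigma -> costr Sigma -> costr Sigma.

Arguments coeps {Sigma}.
Arguments cocons {Sigma} _ _.

(* The infinite string aaaa...  The definition is accepted because the
   corecursive call is guarded by the constructor [cocons]. *)
CoFixpoint repeat {Sigma} (a : Sigma) : costr Sigma := cocons a (repeat a).
```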
For the following, I will skip over some (many) details.
A set of rules can be packaged up as a function \(F\) on sets: \(F(X)\) is the set of conclusions obtained after applying the rules using \(X\) as the set of premises. (For the theorem below, \(F\) also needs to be monotone, which is the case for rule sets like the ones here.)
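For the string rules above, for instance,
\[F(X) = \{ \epsilon \} \cup \{ \sigma s \mid \sigma \in \Sigma, s \in X \},\]
so \(F(\varnothing) = \{\epsilon\}\), \(F(F(\varnothing))\) adds the length-1 strings, and so on.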
We have seen that an inductive definition is the least fixed point of its rules, that is, of \(F\), and that a coinductive definition is the greatest fixed point. Now here is a specialization of the Knaster–Tarski fixpoint theorem:
Theorem:
The least fixed point of \(F\) is \(\mu F = \bigcap \{ X \mid F(X) \subseteq X \}\).
The greatest fixed point of \(F\) is \(\nu F = \bigcup \{ X \mid X \subseteq F(X) \}\).
\(F(X) \subseteq X\) captures the meaning of the informal “closed forwards” definition from earlier: take any premises from \(X\) and apply the rules in the “forwards” direction, and every conclusion you obtain, that is, every element of \(F(X)\), is already in \(X\).
Dually, \(X \subseteq F(X)\) captures the meaning of “closed backwards”: every element of \(X\) is the conclusion of some rule applied in the “backwards” direction whose premises are again in \(X\).
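For the string rules, for example, the set \(S\) of finite strings is closed forwards: applying either rule to finite strings only ever produces finite strings, so \(F(S) \subseteq S\). And the set of all finite and infinite strings is closed backwards: every such string is either \(\epsilon\) or has the form \(\sigma s\) for some symbol \(\sigma\) and string \(s\), so, calling that set \(X\), we have \(X \subseteq F(X)\).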
Simple corollaries of the fixpoint theorem give us proof principles for inductive and coinductive definitions:
Lemma (Induction Principle):
If \(F(X) \subseteq X\), then \(\mu F \subseteq X\).
Lemma (Coinduction Principle):
If \(X \subseteq F(X)\), then \(X \subseteq \nu F\).
Using the induction principle, we can show that every element of an inductively defined set satisfies some condition, by showing that the condition is preserved by each rule of the definition.
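This is exactly what you get for free from an `Inductive` declaration in Coq: the generated induction principle has one hypothesis per rule. For the `str` sketch from earlier it looks roughly like this (the printed form varies a little between versions):

```coq
Check str_ind.
(* str_ind
     : forall (Sigma : Type) (P : str Sigma -> Prop),
       P (eps Sigma) ->
       (forall (a : Sigma) (s : str Sigma), P s -> P (scons Sigma a s)) ->
       forall s : str Sigma, P s *)

(* No analogous principle is generated for CoInductive types; coinductive
   proofs are instead written with [cofix], as sketched later on. *)
```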
We can derive the more familiar principle of mathematical induction using this. Let \(F(X) = \{ 0 \} \cup \{ 1 + x \mid x \in X \}\). This is the set of rules for the natural numbers. It may be more familiar if I write it as the following:
\[\frac{}{0 \in \mathbb{N}} \qquad \frac{n \in \mathbb{N}}{1 + n \in \mathbb{N}}\]Then to prove some fact about the natural numbers, we just need to show that it is preserved when applying these rules in the forwards direction. For example, we will show that \(1 + 2 + \dots + n = \frac{n(n+1)}{2}\) is true for all natural numbers. Let’s take \(X = \{ n \in \mathbb{N} \mid 1 + 2 + \dots + n = \frac{n(n+1)}{2} \}\). Then we will prove that \(\mu F = \mathbb{N} \subseteq X\). This is exactly the conclusion of the Induction Principle, so we need to show that \(F(X) \subseteq X\).
An element of \(F(X)\) is either \(0\) (the base case), which we can easily verify is in \(X\), or \(1 + n\) (the inductive case) where \(n \in X\) (the inductive hypothesis). This should look familiar. A little algebra, spelled out below, shows that the second case holds as well, and we are done! \(\Box\)
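For completeness, the algebra in the inductive case is just
\[1 + 2 + \dots + n + (1 + n) = \frac{n(n+1)}{2} + (1 + n) = \frac{(1+n)((1+n)+1)}{2},\]
which is exactly the condition for \(1 + n \in X\).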
Dually, using the coinduction principle, we can show that an element is in the coinductively defined set.
Using just \(S'\), our only coinductively defined set so far, would not be very interesting, since it would involve only the membership proofs we saw earlier. Let’s make another coinductive definition, this time a relation on elements of \(S'\): let \(F(X) = \{ (\epsilon, \epsilon) \} \cup \{ (\sigma_1 s_1, \sigma_2 s_2) \mid \sigma_1 \le \sigma_2 \land (s_1, s_2) \in X \}\), where \(\le\) is some ordering on the alphabet (the usual one on the English alphabet, for instance). Can you tell what relation this defines? Let’s write down the inference rules:
\[\frac{}{\epsilon \leqslant \epsilon} \qquad \frac{\sigma_1 \le \sigma_2 \qquad s_1 \leqslant s_2}{\sigma_1 s_1 \leqslant \sigma_2 s_2}\]The notation should help: \(\nu F\) is an ordering relation on our (possibly) infinite strings, displayed here as \(\leqslant\); it compares two strings of the same length symbol by symbol (a pointwise ordering).
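Continuing the Coq sketch, the same relation can be written as a `CoInductive` definition over the possibly infinite strings `costr` from earlier (`leq` stands for the ordering \(\le\) on the alphabet; the names are again invented):

```coq
(* One constructor per rule, just like before, but read coinductively. *)
CoInductive lex {Sigma : Type} (leq : Sigma -> Sigma -> Prop)
  : costr Sigma -> costr Sigma -> Prop :=
| lex_eps  : lex leq coeps coeps
| lex_cons : forall (a b : Sigma) (s t : costr Sigma),
    leq a b -> lex leq s t -> lex leq (cocons a s) (cocons b t).
```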
Now we can prove that some strings are related by this relation. As an example, we will show \(aaaa \dots \leqslant baaaa \dots\). Note that these are infinitely long strings.
Using the coinduction principle, we just need to show that \((aaaa \dots, baaaa \dots)\) is in some set of pairs of strings that is closed backwards under \(F\). Let’s try the singleton set \(X = \{(aaaa \dots, baaaa \dots)\}\) first. Then \(F(X) = \{ (\epsilon, \epsilon) \} \cup \{ (\sigma_1 aaaa \dots, \sigma_2 baaaa \dots) \mid \sigma_1 \le \sigma_2 \}\). But then \(X \not \subseteq F(X)\): apart from \((\epsilon, \epsilon)\), every pair in \(F(X)\) has a \(b\) as the second symbol of its second string, whereas the second symbol of \(baaaa \dots\) is an \(a\).
\(X\) is our “coinductive hypothesis”. Just as we sometimes have to strengthen the inductive hypothesis during an induction, here we have to strengthen the coinductive hypothesis by making the set bigger.
Recall the “backwards closed” intuition. We want to show that by applying some rule “backwards”, we obtain something still in \(X\). If we start with \((aaaa \dots, baaaa \dots)\), we can only apply the second rule, stripping off the first symbol of each string. \(a \le b\), so that premise is fine, and we just need to show that \((aaaa \dots, aaaa \dots) \in X\) now. It looks like we need to grow \(X\) by adding this new pair to it, strengthening the coinductive hypothesis.
Now \(X = \{ (aaaa \dots, baaaa \dots), (aaaa \dots, aaaa \dots) \}\), and
\[F(X) = \{ (\epsilon, \epsilon) \} \cup \{ (\sigma_1 aaaa \dots, \sigma_2 baaaa \dots) \mid \sigma_1 \le \sigma_2 \} \cup \{ (\sigma_1 aaaa \dots, \sigma_2 aaaa \dots) \mid \sigma_1 \le \sigma_2 \}.\]
Let’s check that \(X \subseteq F(X)\).
\((aaaa \dots, baaaa \dots) = (a\cdot aaaa \dots, b\cdot aaaa \dots)\), and \(a \le b\).
\((aaaa \dots, aaaa \dots) = (a\cdot aaaa \dots, a\cdot aaaa \dots)\), and \(a \le a\).
So \(X \subseteq F(X)\), and by the coinduction principle \(X \subseteq \nu F\). Since \((aaaa \dots, baaaa \dots) \in X\), we’re done! \(\Box\)
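Here is a rough sketch of how the same proof might go in Coq, continuing the invented definitions from the earlier snippets. Coq only unfolds a cofixpoint when it is matched on, so we first expose one constructor of `repeat a` with an auxiliary `force` function (a standard trick). The lemma `lex_repeat_refl` plays the role of the extra pair \((aaaa\dots, aaaa\dots)\) that we added to the coinductive hypothesis.

```coq
(* Expose one constructor of a possibly infinite string. *)
Definition force {Sigma} (s : costr Sigma) : costr Sigma :=
  match s with
  | coeps       => coeps
  | cocons a s' => cocons a s'
  end.

Lemma force_eq : forall {Sigma} (s : costr Sigma), s = force s.
Proof. intros Sigma s; destruct s; reflexivity. Qed.

Lemma repeat_unfold : forall {Sigma} (a : Sigma),
  repeat a = cocons a (repeat a).
Proof. intros. exact (force_eq (repeat a)). Qed.

(* The strengthened part of the coinductive hypothesis: (aaaa..., aaaa...).
   Trying [exact CH] before exposing a constructor is accepted by the cofix
   tactic, but rejected by the guardedness check when the proof is closed. *)
Lemma lex_repeat_refl : forall {Sigma} (leq : Sigma -> Sigma -> Prop) (a : Sigma),
  leq a a -> lex leq (repeat a) (repeat a).
Proof.
  intros Sigma leq a Ha.
  cofix CH.
  rewrite (repeat_unfold a).   (* expose one [cocons] on both sides *)
  constructor.
  - exact Ha.
  - exact CH.                  (* guarded: the call sits under [lex_cons] *)
Qed.

(* aaaa... <= baaaa..., written as (a . aaaa..., b . aaaa...) as in the proof
   above: one backwards rule application, then the strengthened pair. *)
Lemma repeat_le_example : forall {Sigma} (leq : Sigma -> Sigma -> Prop) (a b : Sigma),
  leq a a -> leq a b ->
  lex leq (cocons a (repeat a)) (cocons b (repeat a)).
Proof.
  intros Sigma leq a b Ha Hab.
  constructor.
  - exact Hab.
  - apply lex_repeat_refl; exact Ha.
Qed.
```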
Recently I’ve been working on Interaction Trees, a library that provides a coinductive data structure for reasoning about interactive programs in Coq. Coinduction is less convenient than induction in Coq. For example, in the coinductive proof above, the “coinductive hypothesis” included exactly the conclusion we were trying to prove. When doing the proof informally, we know we must apply one of the rules backwards, and only then can we use the coinductive hypothesis.
Doing this in a proof assistant like Coq is more delicate. Vanilla Coq will let you apply the coinductive hypothesis immediately, and only complain that you got it wrong when you try to finish the proof. The paco library solves this problem, but more involved reasoning still gets complicated quickly, which is why I started learning more about the theory behind coinduction.
I find it really intriguing how (relatively) new coinduction is and how useful it has become. There’s been a lot of work recently on areas related to coinduction, and I’m excited to do more work in this area.
I first encountered coinduction in Types and Programming Languages by Benjamin C. Pierce, where it is introduced to discuss the metatheory of recursive types. While I wouldn’t recommend reading the book if you’re just interested in coinduction, it serves as an excellent introduction to programming languages and type systems.
Introduction to Bisimulation and Coinduction by Davide Sangiorgi is a very accessible textbook that goes into detail about all of this and more. It cleared up a lot of questions I had about coinduction, and helped me understand it more rigorously.
Comments
wooo00oo+: So coinduction is induction on coins?
Paul: Not quite, it's more like duction on coins or ions on coinducts.
.: Thank you! :) Great tutorial.