Turing vs Lambda - Boolean Logic Circuits
A short essay about Turing completeness and about implementing it in three different ways.
Let's talk Turing completeness and Boolean logic.
You can make a computer with NAND gates. They're universal. That means that they can compute any computable function.
not a = nand a a
and a b = not (nand a b)
nand a b = magic
or a b = nand (not a) (not b)
nor a b = not (or a b)
xor a b = nand (nand a (nand a b)) (nand (nand a b) b)
xnor a b = not (xor a b)
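If you want to check these derivations, here's a small sanity check in Haskell (a sketch of mine, not part of the circuit; the primed names just avoid clashing with the Prelude). We give NAND a concrete meaning over Bool and read the other gates straight off the definitions above.

nand' :: Bool -> Bool -> Bool
nand' a b = not (a && b)        -- the "magic" gate, made concrete for testing

not'  a   = nand' a a
and'  a b = not' (nand' a b)
or'   a b = nand' (not' a) (not' b)
nor'  a b = not' (or' a b)
xor'  a b = nand' (nand' a (nand' a b)) (nand' (nand' a b) b)
xnor' a b = not' (xor' a b)

-- ghci> [ xor' a b | a <- [False, True], b <- [False, True] ]
-- [False,True,True,False]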
We can fully expand it to use only NAND, so long as NAND itself is magic. Just substitute the definition in for its name wherever it occurs on the right side of the =.
not a = nand a a
and a b = nand (nand a b) (nand a b)
nand a b = magic
or a b = nand (nand a a) (nand b b)
nor a b = nand (nand (nand a a) (nand b b)) (nand (nand a a) (nand b b))
xor a b = nand (nand a (nand a b)) (nand (nand a b) b)
xnor a b = nand (nand (nand a a) (nand b b)) (nand a b)
Note that if we naively expanded xnor a b as not (xor a b), it would result in
nand (nand (nand a (nand a b)) (nand (nand a b) b)) (nand (nand a (nand a b)) (nand (nand a b) b))
which is quite a mouthful. Instead, we've used a better version, nand (nand (nand a a) (nand b b)) (nand a b), symmetrically inferred from the NOR-based XOR. More on that in a sec!
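If you're skeptical that the compact version really is XNOR, a throwaway Haskell check (my own names again, with NAND made concrete just for testing) confirms it agrees with the naive expansion on all four inputs:

nand' :: Bool -> Bool -> Bool
nand' a b = not (a && b)

xnorCompact a b = nand' (nand' (nand' a a) (nand' b b)) (nand' a b)
xnorNaive   a b = nand' (nand' (nand' a (nand' a b)) (nand' (nand' a b) b))
                        (nand' (nand' a (nand' a b)) (nand' (nand' a b) b))

-- ghci> and [ xnorCompact a b == xnorNaive a b | a <- [False, True], b <- [False, True] ]
-- True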
You can also make a computer with NOR gates. They're also universal. That means they also can compute any computable function.
not a = nor a a
and a b = nor (not a) (not b)
nand a b = not (and a b)
or a b = not (nor a b)
nor a b = magic
xor a b = nor (nor (not a) (not b)) (nor a b)
xnor a b = nor (nor a (nor a b)) (nor (nor a b) b)
We can fully expand it too.
not a = nor a a
and a b = nor (nor a a) (nor b b)
nand a b = nor (nor (nor a a) (nor b b)) (nor (nor a a) (nor b b))
or a b = nor (nor a b) (nor a b)
nor a b = magic
xor a b = nor (nor (nor a a) (nor b b)) (nor a b)
xnor a b = nor (nor a (nor a b)) (nor (nor a b) b)
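And again a quick sanity check in Haskell for the fully expanded NOR forms (a sketch of mine, with NOR made concrete over Bool for testing):

nor' :: Bool -> Bool -> Bool
nor' a b = not (a || b)         -- the "magic" gate, made concrete for testing

xorViaNor  a b = nor' (nor' (nor' a a) (nor' b b)) (nor' a b)
xnorViaNor a b = nor' (nor' a (nor' a b)) (nor' (nor' a b) b)

-- ghci> [ (xorViaNor a b, xnorViaNor a b) | a <- [False, True], b <- [False, True] ]
-- [(False,True),(True,False),(True,False),(False,True)]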
These are examples of the Turing-machine picture of computing - the idea of a computer as a physical object.
Note the structural relation between the NAND- and NOR-based gates. If we swap NAND for NOR (or vice versa), some interesting things happen. The definition of NOT stays the same. OR becomes AND, and AND becomes OR. NAND and NOR swap definitions: one goes from being the single magic gate to being the most complicated expression, while the other does the reverse. XOR and XNOR also exchange places.
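To see the swap concretely, take the NAND-only OR from the table above and mechanically replace every nand with nor: the very same shape now computes AND. A tiny Haskell check (my names, gates made concrete for testing):

nand', nor' :: Bool -> Bool -> Bool
nand' a b = not (a && b)
nor'  a b = not (a || b)

orViaNand a b = nand' (nand' a a) (nand' b b)   -- OR, built from NAND
andViaNor a b = nor'  (nor'  a a) (nor'  b b)   -- same shape with NOR swapped in: AND

-- ghci> [ (orViaNand a b, andViaNor a b) | a <- [False, True], b <- [False, True] ]
-- [(False,False),(True,False),(True,False),(True,True)]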
But let's think for a moment. We can define NAND using NOR, and we can define NOR using NAND, so you don't need one of them - or maybe it's the other one you don't need. Maybe you don't need either of them. You see, NAND and NOR aren't magic. They're made up of bits, and bits are in turn made up of true and false. So what are true and false made of?
Well, whatever you define them to be.
false f t = f
true f t = t
Here, you see the magic: it doesn't really matter what they're made of. false (as a function) takes two arguments - one falsy and one truthy - and simply always returns the falsy argument. And true always returns the truthy argument.
If I say
false apple banana
It will result in
apple
And if I say
true apple banana
It will result in
banana
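In Haskell this little experiment looks like the following (false_ and true_ are my names; the underscores avoid the built-in False and True):

false_, true_ :: a -> a -> a
false_ f t = f
true_  f t = t

-- ghci> false_ "apple" "banana"
-- "apple"
-- ghci> true_ "apple" "banana"
-- "banana"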
Using this, we can start defining all of the other functions that we defined with NAND and NOR gates. We'll start with not because it's easy: we just give true as the falsy argument and false as the truthy argument, which swaps their places, because true will return false and false will return true.
not b = b true false
Or with lambda notation:
not = \ b -> b true false
We can follow along by evaluating not true. The magic is substitution / function application. We start by replacing not with its definition \ b -> b true false to get (\ b -> b true false) true. Then we substitute true in for b to get true true false. We then replace the first true with its definition to get (\ f t -> t) true false, which turns into (\ t -> t) false and then simply false.
It's a bit clearer if we lay it out like this:
not true
(\ b -> b true false) true
true true false
(\ f t -> t) true false
(\ t -> t) false
false
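And the other direction works out the same way, by the same substitution steps:
not false
(\ b -> b true false) false
false true false
(\ f t -> f) true false
(\ t -> true) false
true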
So, if you can define defining, you need neither NAND nor NOR, and that's why lambda calculus is the magic - the idea of a computer as an action, an application of knowledge.
Now we can define the rest of the functions again, except using only lambda calculus instead of NAND and NOR. Go ahead and try it out - do the substitutions yourself and you'll see that it works!
false f t = f -- Scott constructor
true f t = t -- Scott constructor
bool f t b = b f t -- Scott case
not b = b true false
and a b = a a b
nand a b = a a b true false -- AKA not (and a b)
or a b = a b a
nor a b = a b a true false -- AKA not (or a b)
xor a b = a b (b true false) -- AKA a b (not b)
xnor a b = a (b true false) b -- AKA a (not b) b
if b t f = b f t
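If you want to run these in a real language, here's one way to transcribe most of them into Haskell (my transcription, not the only one possible; the FnBool synonym and the toBool helper are mine, added only so ghci has something to print, and the RankNTypes extension is just what GHC needs to type the self-applications):

{-# LANGUAGE RankNTypes #-}

type FnBool = forall r. r -> r -> r   -- a boolean is a two-argument selector

false', true' :: FnBool
false' f t = f
true'  f t = t

not' :: FnBool -> FnBool
not' b = b true' false'

and', or', nand', nor', xor', xnor' :: FnBool -> FnBool -> FnBool
and'  a b = a a b
or'   a b = a b a
nand' a b = a a b true' false'
nor'  a b = a b a true' false'
xor'  a b = a b (b true' false')
xnor' a b = a (b true' false') b

if_ :: FnBool -> a -> a -> a          -- `if` is a keyword in Haskell, hence the underscore
if_ b t f = b f t

toBool :: FnBool -> Bool              -- only so ghci can show a result
toBool b = b False True

-- ghci> toBool (xor' true' false')
-- True
-- ghci> toBool (xnor' true' true')
-- True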
We can fully expand those definitions to lambda notation:
false = \ f t -> f
true = \ f t -> t
bool = \ f t b -> b f t
not = \ b -> b (\ f t -> t) (\ f t -> f)
and = \ a b -> a a b
nand = \ a b -> a a b (\ f t -> t) (\ f t -> f)
or = \ a b -> a b a
nor = \ a b -> a b a (\ f t -> t) (\ f t -> f)
xor = \ a b -> a b (b (\ f t -> t) (\ f t -> f))
xnor = \ a b -> a (b (\ f t -> t) (\ f t -> f)) b
if = \ b t f -> b f t
When we made things out of NAND gates we made the NAND gate out of magic, and the same thing for NOR gates. Where is that magic now?
nand = \ a b -> a a b (\ f t -> t) (\ f t -> f)
nor = \ a b -> a b a (\ f t -> t) (\ f t -> f)
It was always in the defining (abstraction) and substitution (application). We take one function, and give it another function as an argument. In fact, all data at rest is made up of functions of action in this way, much as all matter at rest is made up of energy of motion. The magic here rests in knowledge. The computer doesn't end at the keyboard, and the knowledge doesn't disappear when we compile down from language to machine - it still rests with us.
PS: This correlation between NAND/NOR gates and lambda calculus means that we can actually compile Boolean logic in lambda calculus down to a series of gates that could be fabricated into a chip, and it would behave as expected. This gets at the heart of Turing completeness, and of compiling too. Of course, compiling for a dynamic instruction set instead of a static circuit is a bit more work. We'll talk about this more at length in the future.
- Leo D., October, 2022