
It is worth noting that while Graham's number might have held a record for the largest number used in a serious proof, his up arrow notation does NOT hold the record for the fastest growing computable function used in a serious proof.

The record for that is probably held by Goodstein's function. See http://en.wikipedia.org/wiki/Goodstein%27s_function#Sequence... for more. At 12 it has already surpassed Graham's number.

The result that this function was used to prove is that there exist computable functions that cannot be proven to be computable using the Peano axioms. And the reason that it cannot be proven computable is that it grows faster than any function that can be proven computable using the Peano axioms. (Ackermann's function is an example of a fast-growing function that can be proven computable using the Peano axioms, and which therefore must grow more slowly than Goodstein's function.)
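To make the growth concrete, here is a small Python sketch of the Goodstein sequence itself (hereditary base-b rewrite, bump the base, subtract 1); the function and variable names are my own, not from any standard library:

```python
def rebase(n, b):
    """Write n in hereditary base b, then replace every b with b + 1.
    Exponents are rewritten recursively, which is what 'hereditary' means."""
    result, power = 0, 0
    while n:
        result += (n % b) * (b + 1) ** rebase(power, b)
        n //= b
        power += 1
    return result

def goodstein_sequence(m):
    """Yield the Goodstein sequence of m, stopping when it reaches 0.
    Only try tiny m: the sequence for m = 4 already has on the order of
    3 * 2**402653211 terms."""
    b = 2
    while True:
        yield m
        if m == 0:
            return
        m = rebase(m, b) - 1
        b += 1
```

For example, `list(goodstein_sequence(3))` is `[3, 3, 3, 2, 1, 0]`, six terms, which is why the Goodstein function's value at 3 is 6. At 4 the sequence is already astronomically long.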



When you say "a fast-growing function that can be proven computable using the Peano axioms," do you mean that the Peano axioms are used to compute the function, or that the Peano axioms are used to prove that the function is computable?


I mean that they are used to prove it.

Basically anything that you can prove using direct calculation and induction a finite number of times on the integers can be proven starting from the Peano axioms. It is not obvious that there are computable functions for which those methods of proof do not suffice. Goodstein's function grows too fast for them to suffice.

The proof that it is actually computable requires transfinite induction on the countable ordinals.


Grain of Salt: I'm not an expert by any means.

This is really cool and interesting, but I have to also admit to being deeply skeptical. I'm going to try to read a lot more on the proof, but it seems incredible (in a negative sense) that transfinite induction could prove that a finite number of operations suffices for anything.

Similarly, "computable" and "transfinite" would seem on their face to never belong in the same proof.

Obviously I don't have the math background to make any claims, but right now I can't convince myself that this could possibly prove what people claim it does. (Edit: Or maybe a better way to state my objection is, it seems to me like the axioms one would need to accept to prove this might not really make sense when trying to describe computable systems.)


The wiki link I gave before outlines the transfinite induction proof. I can give the idea of the same proof without any reference to infinite ordinals.

First let's take a slightly more verbose notation than the wiki link uses. Let's say that B(b, b', n) is the operation: put n into hereditary base b notation, then change the base to b'. Let's define G(b, n) as 1 if n = 0, and otherwise as 1 + G(b+1, B(b, b+1, n) - 1). Then the original G(n) is just G(2, n).
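A direct Python transcription of these two definitions might look like the sketch below. It is only feasible for tiny inputs, since the recursion depth of G equals the answer:

```python
def B(b, bp, n):
    """Put n into hereditary base b notation, then change the base to bp.
    The exponents are themselves rewritten recursively."""
    result, power = 0, 0
    while n:
        result += (n % b) * bp ** B(b, bp, power)
        n //= b
        power += 1
    return result

def G(b, n):
    """G(b, n) = 1 if n == 0, else 1 + G(b + 1, B(b, b + 1, n) - 1)."""
    if n == 0:
        return 1
    return 1 + G(b + 1, B(b, b + 1, n) - 1)
```

For example, `G(2, 3)` evaluates to 6, and `G(5, 3)` evaluates to 4, matching the n + 1 behavior for n < b described below.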

If n < b, then by induction on n we can easily prove that G(b, n) is well-defined (here being well-defined is the same as being a finite number), in fact it is just n + 1.

If n = n_1 b + n_0, with n_1 and n_0 < b, then by induction on n_1 we can easily prove that G(b, n) is well-defined.

Now by induction we can easily prove that if n_0, n_1, n_2 are all < b then G(b, n_0 + n_1 b + n_2 b^2) is well-defined.

We can go farther and by induction we can prove that if m, n_0, n_1, ..., n_m < b then G(b, n_0 + n_1 b + n_2 b^2 + ... + n_m b^m) is also well-defined.

Now we're in a position to prove that G(b, b^b) is well-defined. After that we repeat the previous proof and find out that if k < b^b then G(b, k + b^b) is well-defined.

Now we can do an induction proof that G(b, 2 b^b) is well-defined.

And so on. In fact what we discover if we're careful is this. For each hereditary base pattern, there exists an inductive proof that G(b, n) with that hereditary base pattern is well-defined. The challenge is that the necessary proof changes as the hereditary base patterns get bigger.

What transfinite induction provides is a way to unify the underlying idea behind all of those different inductive proofs into a single one. In particular every hereditary base pattern is mapped onto a (possibly infinite) ordinal. The operation B(b, b', n) does not change which ordinal you're mapped onto. But the operation -1 always decreases the ordinal. Thus we get a strictly decreasing sequence of ordinals. And transfinite induction says that every strictly decreasing sequence of ordinals must terminate in a finite number of steps. The pattern of the necessary inductive proofs is dependent upon the ordinal.
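That ordinal mapping can be sketched concretely. Below is a toy encoding of ordinals below epsilon_0 (my own assumption-laden representation, not a general ordinal library): an ordinal in Cantor normal form omega^e1 * c1 + omega^e2 * c2 + ... with e1 > e2 > ... is encoded as the tuple ((e1, c1), (e2, c2), ...), each exponent encoded the same way, so that Python's lexicographic tuple comparison agrees with the ordinal order:

```python
def rebase(n, b):
    """Write n in hereditary base b, then replace every b with b + 1."""
    result, power = 0, 0
    while n:
        result += (n % b) * (b + 1) ** rebase(power, b)
        n //= b
        power += 1
    return result

def to_ordinal(n, b):
    """Map n, written in hereditary base b, to the ordinal below epsilon_0
    obtained by replacing every b with omega.  Encoded as a tuple of
    (exponent_ordinal, coefficient) pairs, highest exponent first."""
    terms, power = [], 0
    while n:
        if n % b:
            terms.append((to_ordinal(power, b), n % b))
        n //= b
        power += 1
    return tuple(reversed(terms))
```

This exhibits both claims above on a small case: `to_ordinal(3, 2) == to_ordinal(rebase(3, 2), 3)` (the base change leaves the ordinal fixed), while `to_ordinal(rebase(3, 2) - 1, 3) < to_ordinal(3, 2)` (subtracting 1 strictly decreases it).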

What the Peano axioms lack is any way to unify all of those individual inductive proofs into a single one. Thus for any given integer n we can come up with a proof that G(b, n) is well-defined. But we can't prove that fact for all positive integers n with a single proof.

Therefore this is really an exercise about the limitations of the set of possible proof techniques that the Peano axioms provide, and not about the limitations of computability.

(Of course the limitations of computability on real machines are much, much stricter than the limitations of the Peano axioms. We can't say much that is useful about G(12) other than, "Unbelievably big.")


OK, that is really cool, thanks!! I may get back to you with questions once I wrap my head around it a bit more.



Coincidentally, the most recent Project Euler problem concerns a variant of this, called weak Goodstein sequences. The lengths of the weak Goodstein sequences don't grow quite as fast.

http://projecteuler.net/problem=396
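For contrast with the hereditary version, a sketch of the weak variant: the base change rewrites n in ordinary (non-hereditary) base-b notation, so the exponents are left alone. Note the term count below uses my own convention of counting every term including the final 0, which may differ from the problem's convention:

```python
def weak_rebase(n, b):
    """Write n in ordinary (non-hereditary) base b, replace b with b + 1."""
    result, power = 0, 0
    while n:
        result += (n % b) * (b + 1) ** power
        n //= b
        power += 1
    return result

def weak_goodstein_length(m):
    """Number of terms in the weak Goodstein sequence of m,
    counting the final 0 as a term."""
    b, count = 2, 1
    while m:
        m = weak_rebase(m, b) - 1
        b += 1
        count += 1
    return count
```

With this convention `weak_goodstein_length(4)` is 22, whereas the hereditary sequence starting from 4 has on the order of 3 * 2**402653211 terms, which is the sense in which the weak sequences don't grow quite as fast.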



