What is Ackermann's function, why does it grow so fast, and why does
anybody care anyway?
(I'm hoping you can explain this better than Wikipedia...)
--
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*
Orchid XP v8 wrote:
> What is Ackermann's function, why does it grow so fast,
That I imagine Wikipedia can answer...
> and why does anybody care anyway?
I wouldn't guarantee I'm right, but I believe it's because it was the
first function demonstrated to (basically) need indefinite loops to
calculate.
In other words, no number of bounded for-loops will evaluate the
function. You *have* to have while loops in your language to evaluate
the value of the Ackermann function.
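For reference, the usual two-argument form of the function, written out
as a minimal sketch in Haskell (since that's the most compact way to
state the recurrence):

    -- the usual two-argument (Ackermann-Peter) definition
    ack :: Integer -> Integer -> Integer
    ack 0 n = n + 1                          -- base case
    ack m 0 = ack (m - 1) 1
    ack m n = ack (m - 1) (ack m (n - 1))    -- the doubly-nested call

That last clause is where the trouble comes from: the depth of the
recursion depends on a value you only discover while you're already
recursing, so you can't put a bound on it up front.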
That's my understanding of why anyone cares, but it's an old memory, so
I might be messing things up.
--
Darren New / San Diego, CA, USA (PST)
Helpful housekeeping hints:
Check your feather pillows for holes
before putting them in the washing machine.
Orchid XP v8 wrote:
> What is Ackermann's function, why does it grow so fast, and why does
> anybody care anyway?
>
> (I'm hoping you can explain this better than Wikipedia...)
>
I'm not Darren, and he'd probably be more familiar with this than I, but
here's what I recall.
The class of computable functions (also known as recursive functions)
is something that you're familiar with. Considered in their most basic
context, these are all functions which take a natural number as input,
produce a natural number as output, and which you could write a computer
program to compute. If there's a program which can compute this
function for each possible natural number (in other words, it always
halts), then you have a total computable function.
Way back in the early days of people starting to understand this sort
of stuff -- before diagonalization proofs were used for this sort of
thing -- there was a conjecture that this class of total computable
functions was equivalent to the class of what are called primitive
recursive functions. You can think of a primitive recursive function,
basically (iirc), as any function you could code up if you were only
allowed for-loops whose bounds can't change while the loop is running
(no general recursion, and no while loops either, obviously).
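As a rough illustration (my own sketch, not a formal definition), such
a fixed-bound loop is just a fold over a range whose length is fixed
before the loop starts, and the everyday arithmetic functions all fit
that mold:

    -- "for i = 1 to n" with the bound fixed before the loop starts
    factorial :: Integer -> Integer
    factorial n = foldl (*) 1 [1 .. n]

    power :: Integer -> Integer -> Integer
    power b e = foldl (\acc _ -> acc * b) 1 [1 .. e]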
While it seems "obvious", since you're familiar with diagonalization-type
arguments, that this class is strictly smaller than the class of total
computable functions, you can get a bit of an idea of why people made
this conjecture if you attempt to actually think of a function which is
total computable but not primitive recursive. Almost all the functions
you normally deal with (addition, subtraction, division, factorial,
exponentiation, etc.) are primitive recursive.
Here's where the Ackermann function comes in. It's obviously a total
computable function, yet you can show that it cannot be primitive
recursive. The essence of this argument is that the growth rate of the
Ackermann function is larger than what could be achieved by any
primitive recursive function. I believe that the Ackermann function was
the first concrete example of a well defined total computable function
which was not primitive recursive -- hence why people care about it.
If you're curious as to why it grows so quickly, I suggest you code it
up and examine its behavior for a bit. It's an enjoyable exercise in
experimentally poking around in the guts of a rather odd mathematical
function.
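(For a sense of scale: the m = 1 row is just n + 2, the m = 2 row is
2n + 3, the m = 3 row is already 2^(n+3) - 3, and the m = 4 row is a
tower of n + 3 twos minus 3, so A(4, 2) alone has nearly twenty
thousand digits.)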
Fun Fact! The union-find data structure has, for many of its
operations, an amortized running time proportional to the *inverse* of
the Ackermann function! That's the slowest-growing non-constant big-O
bound of any algorithm I'm aware of (though admittedly I've never
explicitly gone looking for others).
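For the curious, here's a rough sketch (my own illustration) of the
usual array-based version, with the two tricks -- path compression and
union by rank -- that give the inverse-Ackermann bound:

    import Control.Monad.ST
    import Data.Array.ST

    -- find the representative of x, flattening the path as we go
    find :: STUArray s Int Int -> Int -> ST s Int
    find parent x = do
      p <- readArray parent x
      if p == x
        then return x
        else do
          root <- find parent p
          writeArray parent x root        -- path compression
          return root

    -- merge the sets containing x and y, attaching the shallower
    -- tree under the deeper one (union by rank)
    union :: STUArray s Int Int -> STUArray s Int Int
          -> Int -> Int -> ST s ()
    union parent rank x y = do
      rx <- find parent x
      ry <- find parent y
      if rx == ry then return () else do
        kx <- readArray rank rx
        ky <- readArray rank ry
        case compare kx ky of
          LT -> writeArray parent rx ry
          GT -> writeArray parent ry rx
          EQ -> do writeArray parent ry rx
                   writeArray rank rx (kx + 1)

    -- tiny usage example: elements 0..9, each starting alone
    demo :: Int
    demo = runST $ do
      parent <- newListArray (0, 9) [0 .. 9]
      rank   <- newArray (0, 9) 0
      union parent rank 1 2
      union parent rank 2 3
      find parent 3                       -- same root as 1 and 2

The inverse-Ackermann factor is the amortized cost of those find calls
after all the compression; for any input you could realistically build
it never works out to more than about 4 or 5.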
Kevin Wampler wrote:
> I'm not Darren, and he'd probably be more familiar with this than I, but
> here's what I recall.
>
> The class of computable functions (also known as recursive functions)
> is something that you're familiar with.
Er... no.
> Considered in their most basic
> context, these are all functions which take a natural number as input,
> produce a natural number as output, and which you could write a computer
> program to compute. If there's a program which can compute this
> function for each possible natural number (in other words, it always
> halts), then you have a total computable function.
>
> Way back in the early days of people starting to understand this sort
> of stuff -- before diagonalization proofs were used for this sort of
> thing -- there was a conjecture that this class of total computable
> functions was equivalent to the class of what are called primitive
> recursive functions. You can think of a primitive recursive function,
> basically (iirc), as any function you could code up if you were only
> allowed for-loops whose bounds can't change while the loop is running
> (no general recursion, and no while loops either, obviously).
>
> While it seems "obvious", since you're familiar with diagonalization-type
> arguments, that this class is strictly smaller than the class of total
> computable functions, you can get a bit of an idea of why people made
> this conjecture if you attempt to actually think of a function which is
> total computable but not primitive recursive. Almost all the functions
> you normally deal with (addition, subtraction, division, factorial,
> exponentiation, etc.) are primitive recursive.
>
> Here's where the Ackermann function comes in. It's obviously a total
> computable function, yet you can show that it cannot be primitive
> recursive. The essence of this argument is that the growth rate of the
> Ackermann function is larger than what could be achieved by any
> primitive recursive function. I believe that the Ackermann function was
> the first concrete example of a well defined total computable function
> which was not primitive recursive -- hence why people care about it.
I see... Thank you for the illumination. Now tell me what a
diagonalization argument is, and how you pronounce it without sounding
stupid. ;-)
> If you're curious as to why it grows so quickly, I suggest you code it
> up and examine its behavior for a bit. It's an enjoyable exercise in
> experimentally poking around in the guts of a rather odd mathematical
> function.
Note how the Wikipedia article explicitly mentions a Haskell
implementation because it's so easy. ;-) Of course, if you want to
*observe* its behaviour rather than just get the answer, you'd code it a
bit differently...
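Something like this, perhaps -- a quick sketch that keeps the same
recursion but makes every call visible as it happens:

    import Debug.Trace (trace)

    -- same recursion, but each call announces itself first,
    -- so you can watch the cascade of calls scroll past
    ackTraced :: Integer -> Integer -> Integer
    ackTraced m n = trace ("ack " ++ show m ++ " " ++ show n) result
      where
        result
          | m == 0    = n + 1
          | n == 0    = ackTraced (m - 1) 1
          | otherwise = ackTraced (m - 1) (ackTraced m (n - 1))

Even ackTraced 2 3 prints a few dozen lines, and ackTraced 3 3 is
already a couple of thousand.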
> Fun Fact! The union-find data structure has, for many of its
> operations, an amortized running time proportional to the *inverse* of
> the Ackermann function! That's the slowest-growing non-constant big-O
> bound of any algorithm I'm aware of (though admittedly I've never
> explicitly gone looking for others).
Fun fact: This is how I found out about "Ackermann function". Just now.
--
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*
On Fri, 11 Jul 2008 20:20:14 +0100, Orchid XP v8 wrote:
>> I'm not Darren, and he'd probably be more familiar with this than I,
>> but here's what I recall.
>>
>> The class of computable functions (also known as recursive functions)
>> is something that you're familiar with.
>
> Er... no.
Do you understand the concept of recursion?
Jim
Orchid XP v8 wrote:
> I see... Thank you for the illumination. Now tell me what a
> diagonalization argument is,
Stuff like proofs that the number of real numbers 0 <= N < 1 is larger
than the number of positive integers.
Map each real number to a positive integer. Now construct a new real
whose first digit differs from the first digit of the first real, whose
second digit differs from the second digit of the second real, etc.
You've just constructed a real which is, by construction, not on your
supposedly complete list mapping the reals to the integers.
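If it helps to see the construction mechanically, here's a toy sketch:
feed it a list of digit sequences and it produces a sequence that
disagrees with the n-th one in the n-th place, so it can't equal any of
them:

    -- toy sketch: each row stands for one real's digit sequence
    diagonal :: [[Int]] -> [Int]
    diagonal rows = [ (rows !! n !! n + 1) `mod` 10 | n <- [0 ..] ]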
--
Darren New / San Diego, CA, USA (PST)
Helpful housekeeping hints:
Check your feather pillows for holes
before putting them in the washing machine.
>>> The class of computable functions (also known as recursive functions)
>>> is something that you're familiar with.
>> Er... no.
>
> Do you understand the concept of recursion?
Sure.
I don't know much about the formal theory of computation though. I
always assumed that "computable function" just meant "function that can
be computed by a computer". Apparently the definition is much less
general than that...
--
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*
Darren New wrote:
> Map each real number to a positive integer.
Um... like, how? There's more of them...
--
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*
Orchid XP v8 <voi### [at] devnull> wrote:
> What is Ackermann's function, why does it grow so fast, and why does
> anybody care anyway?
From a programmer's point of view, simplified: the Ackermann function
is an example of a function which cannot be computed iteratively, but
must be computed recursively.
This proves by example that not all algorithms can be implemented
iteratively.
(And "iteratively" in this context means "only requires O(1) memory to
run the algorithm".)
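To make that concrete, here's a sketch (one possible way, not the only
one) of an "iterative" version driven by an explicit stack: the
recursion is gone from the syntax, but the stack itself still grows
without any fixed bound, which is exactly the point about O(1) memory.

    -- invariant: go [s1, s2, ..., sk] n = A(sk, ... A(s2, A(s1, n)) ...)
    ackIter :: Integer -> Integer -> Integer
    ackIter m0 n0 = go [m0] n0
      where
        go []       n = n                             -- nothing left to apply
        go (0 : ms) n = go ms (n + 1)                 -- A(0, n) = n + 1
        go (m : ms) 0 = go (m - 1 : ms) 1             -- A(m, 0) = A(m - 1, 1)
        go (m : ms) n = go (m : m - 1 : ms) (n - 1)   -- push the nested call

Watch the length of that list while it runs: it's doing exactly the job
the call stack would otherwise do.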
--
- Warp
Warp wrote:
> From a programmer's point of view, simplified: the Ackermann function
> is an example of a function which cannot be computed iteratively, but
> must be computed recursively.
>
> This proves by example that not all algorithms can be implemented
> iteratively.
>
> (And "iteratively" in this context means "only requires O(1) memory to
> run the algorithm".)
Mmm, I see. Yes, the third identity requires you to either recurse twice
(requiring more stack), or to build a table of results to avoid the
extra recursion (also requiring more memory).
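Something along these lines, maybe -- a sketch (my own, using Data.Map
and Control.Monad.State from mtl) that caches every (m, n) pair it has
already solved, so repeated sub-calls become table lookups:

    import qualified Data.Map as M
    import Control.Monad.State

    -- same recurrence, but results are cached in a Map keyed by (m, n),
    -- so repeated sub-problems are looked up instead of recomputed
    ackMemo :: Integer -> Integer -> Integer
    ackMemo m0 n0 = evalState (go m0 n0) M.empty
      where
        go 0 n = return (n + 1)
        go m 0 = cached m 0 (go (m - 1) 1)
        go m n = cached m n (go m (n - 1) >>= go (m - 1))
        cached m n compute = do
          table <- get
          case M.lookup (m, n) table of
            Just v  -> return v
            Nothing -> do
              v <- compute
              modify (M.insert (m, n) v)
              return v

The table grows instead of the stack, of course -- no free lunch, which
I suppose is exactly the point.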
--
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*