Jim Henderson wrote:
> This is the problem, though: The assumption is that computing will
> always use a Turing model, like I said.
No, computing today doesn't use a Turing model, and the Halting
problem applies to many more computing models than the Turing model.
The Halting problem isn't solvable. If you come up with a new computing
model that "solves" it, what you're solving isn't the halting problem
any more.
It's like arguing "Maybe 2+2 will equal 6 some day, if 2 turns into 3."
But if 2 turns into 3, you're no longer adding 2+2.
The halting problem is a precisely defined mathematical construct.
Newer computing models might conceivably obsolete the *implications* of
the halting problem, but they won't actually negate its proof. (In the
same sense, computers being far faster may obsolete the problems
caused by some algorithms taking O(N^3) instructions, but that doesn't
make the algorithm take fewer instructions.)
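To make the "precisely defined" point concrete, here's a minimal Python
sketch of the diagonalization at the heart of the proof. All names here
are illustrative, not from any real library: given *any* candidate
halting-decider, you can build a program that does the opposite of
whatever the decider predicts about it, so no decider can be right about
every program. (Non-termination is stood in for by an exception, since
we obviously can't run a real infinite loop.)

```python
def diagonal(candidate_halts):
    """Given any candidate decider candidate_halts(program) -> bool,
    build a program that defeats it by doing the opposite of the
    prediction."""
    def paradox():
        if candidate_halts(paradox):
            # Decider says we halt, so "loop forever".
            # (An exception stands in for non-termination here.)
            raise RuntimeError("pretend this loops forever")
        # Decider says we loop, so halt immediately.
        return "halted"
    return paradox


# Two toy deciders; neither can be right about its own counterexample.
def always_yes(program):
    return True   # claims every program halts

def always_no(program):
    return False  # claims every program loops
```

Running `diagonal(always_yes)()` "loops" (raises) even though the
decider predicted it would halt, and `diagonal(always_no)()` halts even
though the decider predicted it would loop. The same construction works
for any decider you plug in, which is exactly why the problem is
unsolvable rather than merely unsolved.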
--
Darren New / San Diego, CA, USA (PST)
Helpful housekeeping hints:
Check your feather pillows for holes
before putting them in the washing machine.