On 4/8/2012 9:16, Kevin Wampler wrote:
> On 4/7/2012 3:06 PM, Darren New wrote:
>> On 4/7/2012 14:17, Kevin Wampler wrote:
>>> require very similar encoding of a problem
>>
>> I disagree that the encodings are similar.
>>
>>> in order for a computation to be performed, what are
>>> you viewing as the critical distinction between them? After all both
>>> essentially represent a problem as a string of symbols from an alphabet.
>>
>> Take, for example, quantum computers, for which this is untrue.
>>
>
> I was thinking more of existing computers, but you make a fair point.
Well, the encoding in a real computer consists of varying voltage levels,
movements of electrons, and/or orientations of magnetic moments. In a Turing
machine, the encoding is discrete abstract symbols. Real computers are analog.
> I still find the definition of "calculation" you're implicitly using pretty
> strange though, but to each their own.
Well, Turing machines were intended to offer a definition of computability.
So clearly Turing himself thought anything isomorphic to a computation also
counted as a computation. I just found it amusing that there are things we
have our computers do every day that a Turing machine can't do, including
programming universal Turing machines.
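To make that last point concrete, here's a toy sketch of a computer
programming and running a (non-universal, made-up) Turing machine in Python.
The machine, its transition table, and the run_tm helper are all invented
for illustration, not anyone's canonical construction:

from collections import defaultdict

def run_tm(transitions, tape, state="start", blank="_", max_steps=10_000):
    # Run a one-tape Turing machine. transitions maps
    # (state, symbol) -> (new_state, write_symbol, move), move = -1 or +1.
    # Returns the final tape as a string, or None if no rule fires within
    # max_steps (our stand-in for detecting a halt).
    cells = defaultdict(lambda: blank, enumerate(tape))
    head = 0
    for _ in range(max_steps):
        key = (state, cells[head])
        if key not in transitions:  # no applicable rule: the machine halts
            lo, hi = min(cells), max(cells)
            return "".join(cells[i] for i in range(lo, hi + 1)).strip(blank)
        state, cells[head], move = transitions[key]
        head += move
    return None  # step budget exhausted without halting

# A made-up machine that increments a binary number: walk right to the end
# of the input, then carry back toward the left.
INCREMENT = {
    ("start", "0"): ("start", "0", +1),
    ("start", "1"): ("start", "1", +1),
    ("start", "_"): ("carry", "_", -1),
    ("carry", "1"): ("carry", "0", -1),
    ("carry", "0"): ("done",  "1", -1),
    ("carry", "_"): ("done",  "1", -1),
}

print(run_tm(INCREMENT, "1011"))  # prints 1100

Of course, the amusing part is everything around that function: reading the
input, printing the result, deciding when to give up. None of that fits
inside the formalism itself.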
--
Darren New, San Diego CA, USA (PST)
"Oh no! We're out of code juice!"
"Don't panic. There's beans and filters
in the cabinet."