povray.off-topic : I giggled a bunch at this. (Messages 21 to 30 of 41)
From: Invisible
Subject: Re: I giggled a bunch at this.
Date: 30 Sep 2011 07:01:17
Message: <4e85a17d$1@news.povray.org>
>>> I would argue that algorithms go further back than that.
>>
>> At this point, it becomes necessary to define what you mean by
>> "algorithm".
>>
>> Is long division an "algorithm"? Because the ancient Babylonians
>> apparently had that waaay back in 3100 BC. That's some FIVE MILLENNIA
>> ago.
>
> I would define an algorithm the same way the Wiki does.
> An algorithm is an effective method expressed as a finite list of
> well-defined instructions for calculating a function.
> So I would say that the steps for doing long division are an algorithm.

And I would agree with you.

So, yes, algorithms go back way, waaaay further than Babbage and Lovelace.

>> What I actually /said/ was that computers (by which I mean fully
>> autonomous computational devices)
>
> What do you mean by "fully autonomous computational devices"?

Well, that's the killer, isn't it?

An abacus can add. But only if an intelligent human is operating it. All 
the smarts are in the operator; the abacus is just a memory tool, really.

A pocket calculator can add. Even if it's operated by raindrops hitting 
the keys. The smarts are in the device.

Now, how the heck you formulate that into some kind of coherent 
definition......

>> had O(log N) lookup way later than
>
> I am not familiar with the Big O notation so I misread your sentence. As
> usual I tried to make some sort of sense out of what could have been
> typos and/or bad grammar and spelling.
> So in English, if possible, what do you mean?

The number of steps required to find what you want is proportional to 
the logarithm of the size of the thing you're searching.

If you search for a definition by looking at every page in the book 
until you find what you're after, the amount of work is obviously 
directly proportional to the size of the book. On the other hand, if you 
look it up in the index and then go straight to that page, the amount of 
time is proportional to the logarithm of the number of pages.

In the former case, if you double the number of pages, you double the 
time required to find anything. In the latter case, doubling the number 
of pages has little effect on the lookup time. (Assuming the book was 
already large to start with, of course.)
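
A quick sketch of the difference, in Python (the "book" here is just a 
sorted list; the function names are made up purely for illustration):

def linear_search_steps(pages, target):
    # Scan every page until we hit the target: O(N) steps.
    for steps, page in enumerate(pages, start=1):
        if page == target:
            return steps
    return len(pages)

def binary_search_steps(pages, target):
    # Repeatedly halve the range, like an index lookup: O(log N) steps.
    lo, hi, steps = 0, len(pages) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if pages[mid] == target:
            return steps
        elif pages[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

for n in (1_000, 2_000, 4_000):
    pages = list(range(n))
    print(n, linear_search_steps(pages, n - 1), binary_search_steps(pages, n - 1))

Doubling the number of pages doubles the linear count (1000, 2000, 4000 
steps) but only adds one to the binary count (10, 11, 12).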

>> books (by which I mean large textual documents stored as visible marks
>> on some sort of medium) had it.
>
> That is just juvenile and pompous. Only funny to a teenager.

I wasn't trying to be funny. I was trying to phrase a definition which 
would encompass things that don't look like modern hardback "books". 
Things like rolls of parchment or stone tablets, etc.

>> Given how ancient writing is and how
>> comparatively new functioning computers are, I think that's a safe
>> assertion.
>

> using ones and zeros.
> I have worked on computers that were solely pneumatic. They could add,
> subtract multiply and divide. Standard components could extract square
> roots integrate and average.

I'm well aware that there have been many computers that used decimal 
instead of binary. (A few even used other number systems.) I'm aware 
that people have made computers using a veritable zoo of mechanical, 
hydraulic and other technologies.

Now, you need to be a little bit careful here: There is a distinction to 
be made between devices that can "calculate" things, and devices which 
are considered to be "computers". As far as I'm aware, none of the old 
"analogue computers" were actually Turing-complete, and so strictly they 
are /calculators/ rather than /computers/.

The history of /calculators/ obviously goes back far, far beyond the 
history of /computers/. (Again, depending on how general your 
definitions are.)



From: Stephen
Subject: Re: I giggled a bunch at this.
Date: 30 Sep 2011 08:11:50
Message: <4e85b206$1@news.povray.org>
On 30/09/2011 12:01 PM, Invisible wrote:

>
>>> What I actually /said/ was that computers (by which I mean fully
>>> autonomous computational devices)
>>
>> What do you mean by "fully autonomous computational devices"?
>
> Well, that's the killer, isn't it?
>

AI, I suppose.

> An abacus can add. But only if an intelligent human is operating it. All
> the smarts are in the operator; the abacus is just a memory tool, really.
>
> A pocket calculator can add. Even if it's operated by raindrops hitting
> the keys. The smarts are in the device.
>
> Now, how the heck you formulate that into some kind of coherent
> definition......
>

When you do, submit it as a PhD thesis. ;-)

Actually I don't think that I agree with your comparison between the 
abacus and pocket calculator. I think that the latter is just a better 
type of the former.


>> I am not familiar with ...
>
> The number of steps required to find what you want is proportional to
> the logarithm of the size of the thing you're searching.
>
> If you search for a definition by looking at every page in the book
> until you find what you're after, the amount of work is obviously
> directly proportional to the size of the book. On the other hand, if you
> look it up in the index and then go straight to that page, the amount of
> time is proportional to the logarithm of the number of pages.
>
> In the former case, if you double the number of pages, you double the
> time required to find anything. In the latter case, doubling the number
> of pages has little effect on the lookup time. (Assuming the book was
> already large to start with, of course.)

Well put, that completely reverses what I thought you meant.


>
> I wasn't trying to be funny. I was trying to phrase a definition which
> would encompass things that don't look like modern hardback "books".
> Things like rolls of parchment or stone tablets, etc.
>

Fair point, all things considered. :-)


>
> I'm well aware that there have been many computers that used decimal
> instead of binary. (A few even used other number systems.)

I've worked on at least one system that used BCD (Binary Coded Decimal).


>
> Now, you need to be a little bit careful here: There is a distinction to
> be made between devices that can "calculate" things, and devices which
> are considered to be "computers". As far as I'm aware, none of the old
> "analogue computers" were actually Turing-complete, and so strictly they
> are /calculators/ rather than /computers/.
>



alter its input or operating instructions or what?

for each problem they needed to solve. But it could be done.


> The history of /calculators/ obviously goes back far, far beyond the
> history of /computers/. (Again, depending on how general your
> definitions are.)

Indeed.

-- 
Regards
     Stephen



From: Invisible
Subject: Re: I giggled a bunch at this.
Date: 30 Sep 2011 09:25:14
Message: <4e85c33a@news.povray.org>
>> Now, how the heck you formulate that into some kind of coherent
>> definition......
>
> When you do, submit it as a PhD thesis. ;-)

Thanks. I'll pass.

> Actually I don't think that I agree with your comparison between the
> abacus and pocket calculator. I think that the latter is just a better
> type of the former.

In order to work an abacus, /you/ have to know that ten beads on one 
row are worth one bead on the next row. With a pocket calculator, this 
kind of thing is automatic. You don't have to understand it.

I will concede that automation is a kind of "shades of grey" thing. 
There are greater and lesser degrees of automation, rather than just 
"manual" and "automatic".

>> I'm well aware that there have been many computers that used decimal
>> instead of binary. (A few even used other number systems.)
>
> I've worked on at least one system that used BCD (Binary Coded Decimal)

Yes, but generally you use BCD if you have a /binary/ system and you 
specifically want to do exact /decimal/ arithmetic. (It's a well-known 
fact that 1/10 is not representable as an exact binary fraction. In some 
settings, that might be a problem.)
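
You can watch this happen in Python, where the decimal module plays the 
same role that BCD plays in hardware (a minimal sketch, nothing more):

from decimal import Decimal

# Binary floating point cannot represent 1/10 exactly, so the
# error accumulates with each addition:
print(sum(0.1 for _ in range(10)))         # 0.9999999999999999
print(sum(0.1 for _ in range(10)) == 1.0)  # False

# Exact decimal arithmetic, the same idea BCD gives you in hardware:
print(sum(Decimal("0.1") for _ in range(10)))       # 1.0
print(sum(Decimal("0.1") for _ in range(10)) == 1)  # True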



> alter its input or operating instructions or what?

> for each problem they needed to solve. But it could be done.

The formal definition of Turing-completeness is abstract and technical. 
Essentially Alan Turing proved two main things:

1. It is possible to build *one* machine that can perform *any* possible 
calculation.

2. Some calculations are impossible.

[Simplifying wildly, of course.]
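
Point 2 is the famous halting problem. Here is a sketch of the argument 
in Python; the halts() oracle is hypothetical, and the whole point is 
that it can never actually be written:

def halts(program, data):
    # Hypothetical oracle: True iff program(data) eventually halts.
    # Turing's second result says no such function can exist.
    raise NotImplementedError("no such oracle exists")

def contrary(program):
    # Do the opposite of whatever the oracle predicts about a
    # program run on its own source.
    if halts(program, program):
        while True:   # predicted to halt? loop forever instead
            pass
    return            # predicted to loop? halt immediately

# Now ask what contrary(contrary) does. If halts() says it halts,
# it loops; if halts() says it loops, it halts. Either answer is
# wrong, so no correct halts() can be written.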

You would think that a more complex, more sophisticated machine would be 
able to calculate more things than a simpler machine. In fact, it turns 
out that if they are both Turing-complete, they can both calculate 
exactly the same stuff. (The more sophisticated machine is probably much 
/faster/, of course...)

Consider an analogue computer, for a moment. It might have circuits that 
can add numbers together. But if I want to add up one million operands, 
I need to wire together [at least] one million adder circuits.

Now consider a simple digital computer. It might have only one binary 
adder circuit, and yet it can add up an unlimited number of operands, 
with unlimited precision.

Analogue computer: More steps = more hardware.
Digital computer: More steps = more time.

That's a fundamental difference. If I have an analogue computer with ten 
adder circuits, I cannot add up eleven numbers. But if I have a 
Turing-complete machine, I can add up an /unlimited/ number of operands. 
You never need more hardware. It just takes more time.

This difference comes about mainly because a digital computer has a 
/memory/. It can /store/ operands and feed them through its arithmetic 
circuits one at a time. An analogue computer can't do that, so it has to 
have one dedicated circuit for each step of the process. A digital 
computer can /reuse/ the same circuits again and again for different 
steps, because the steps won't happen all at once.
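
In Python-flavoured terms (a toy model, obviously; add() stands in for 
the machine's single adder circuit):

def add(a, b):
    # The machine's one and only adder circuit: two inputs, one output.
    return a + b

def sum_all(operands):
    # Memory lets one adder handle any number of operands: store
    # them, then feed them through the same circuit one at a time.
    total = 0
    for x in operands:          # more operands = more time...
        total = add(total, x)   # ...but never more adder circuits
    return total

print(sum_all(range(1, 1_000_001)))  # a million operands, one adder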

Also, you can often "program" an analogue computer. But you basically do 
that by rewiring it. The machine can't rewire itself. But a 
Turing-complete machine can have self-modifying code.

Actually /proving/ that something is Turing-complete is usually not 
simple at all. But proving that something /isn't/ Turing-complete just 
requires you to find one thing that it can't do [and a Turing-complete 
machine can].

You could say it's a calculator if it only calculates one thing (e.g., a 
mechanical tide predictor), it's a calculator construction kit if it can 
calculate many things (e.g., a typical "programmable" analogue 
"computer"), and it's a computer if it's Turing-complete (i.e., it has a 
stored program, decision-making capabilities, etc.)



From: Stephen
Subject: Re: I giggled a bunch at this.
Date: 30 Sep 2011 09:52:19
Message: <4e85c993@news.povray.org>
On 30/09/2011 2:25 PM, Invisible wrote:
>>> Now, how the heck you formulate that into some kind of coherent
>>> definition......
>>
>> When you do, submit it as a PhD thesis. ;-)
>
> Thanks. I'll pass.
>

LOL



>> alter its input or operating instructions or what?

>> for each problem they needed to solve. But it could be done.
>
> The formal definition of Turing-completeness is abstract and technical.
> Essentially Alan Turing proved two main things:
>
> 1. It is possible to build *one* machine that can perform *any* possible
> calculation.
>
> 2. Some calculations are impossible.
>
> [Simplifying wildly, of course.]
>
> You would think that a more complex, more sophisticated machine would be
> able to calculate more things than a simpler machine. In fact, it turns
> out that if they are both Turing-complete, they can both calculate
> exactly the same stuff. (The more sophisticated machine is probably much
> /faster/, of course...)
>

Thanks

> Consider an analogue computer, for a moment. It might have circuits that
> can add numbers together. But if I want to add up one million operands,
> I need to wire together [at least] one million adder circuits.
>
> Now consider a simple digital computer. It might have only one binary
> adder circuit, and yet it can add up an unlimited number of operands,
> with unlimited precision.
>
> Analogue computer: More steps = more hardware.
> Digital computer: More steps = more time.
>
> That's a fundamental difference. If I have an analogue computer with ten
> adder circuits, I cannot add up eleven numbers. But if I have a
> Turing-complete machine, I can add up an /unlimited/ number of operands.
> You never need more hardware. It just takes more time.
>
> This difference comes about mainly because a digital computer has a
> /memory/. It can /store/ operands and feed them through its arithmetic
> circuits one at a time. An analogue computer can't do that, so it has to
> have one dedicated circuit for each step of the process. A digital
> computer can /reuse/ the same circuits again and again for different
> steps, because the steps won't happen all at once.
>

Not so: an analogue machine can have memory. A capacitor will hold a 
voltage until the next arithmetic step adds to it. So that argument 
does not hold water.

> Also, you can often "program" an analogue computer. But you basically do
> that by rewiring it. The machine can't rewire itself. But a
> Turing-complete machine can have self-modifying code.
>

That makes more sense to me.

> Actually /proving/ that something is Turing-complete is usually not
> simple at all. But probing that something /isn't/ Turing-complete just
> requires you to find one thing that it can't do [and a Turing-complete
> machine can].
>
> You could say it's a calculator if it only calculates one thing (e.g., a
> mechanical tide predictor), it's a calculator construction kit if it can
> calculate many things (e.g., a typical "programmable" analogue
> "computer"), and it's a computer if it's Turing-complete (i.e., it has a
> stored program, decision-making capabilities, etc.)
>
Hmm! I don't know. You can do some very intricate and complicated things 
with analogue electronics. Consider the brain. O_o

-- 
Regards
     Stephen



From: Invisible
Subject: Re: I giggled a bunch at this.
Date: 30 Sep 2011 10:11:12
Message: <4e85ce00$1@news.povray.org>
>> The formal definition of Turing-completeness is abstract and technical.
>> Essentially Alan Turing proved two main things:
>>
>> 1. It is possible to build *one* machine that can perform *any* possible
>> calculation.
>>
>> 2. Some calculations are impossible.
>>
>> [Simplifying wildly, of course.]
>>
>> You would think that a more complex, more sophisticated machine would be
>> able to calculate more things than a simpler machine. In fact, it turns
>> out that if they are both Turing-complete, they can both calculate
>> exactly the same stuff. (The more sophisticated machine is probably much
>> /faster/, of course...)
>>
>
> Thanks
>
>> Consider an analogue computer, for a moment. It might have circuits that
>> can add numbers together. But if I want to add up one million operands,
>> I need to wire together [at least] one million adder circuits.
>>
>> Now consider a simple digital computer. It might have only one binary
>> adder circuit, and yet it can add up an unlimited number of operands,
>> with unlimited precision.
>>
>> Analogue computer: More steps = more hardware.
>> Digital computer: More steps = more time.
>>
>> That's a fundamental difference. If I have an analogue computer with ten
>> adder circuits, I cannot add up eleven numbers. But if I have a
>> Turing-complete machine, I can add up an /unlimited/ number of operands.
>> You never need more hardware. It just takes more time.
>>
>> This difference comes about mainly because a digital computer has a
>> /memory/. It can /store/ operands and feed them through its arithmetic
>> circuits one at a time. An analogue computer can't do that, so it has to
>> have one dedicated circuit for each step of the process. A digital
>> computer can /reuse/ the same circuits again and again for different
>> steps, because the steps won't happen all at once.
>>
>
> Not so, an analogue machine can have memory. A capacitor will store a
> voltage until another arithmetic step is added to it. So that argument
> does not hold water.

I was under the impression that a capacitor only holds its charge for a 
very short period of time.

Regardless, have you ever actually seen an analogue computer with 
independently addressable memory cells? I haven't heard of such a thing.

>> Also, you can often "program" an analogue computer. But you basically do
>> that by rewiring it. The machine can't rewire itself. But a
>> Turing-complete machine can have self-modifying code.
>
> That makes more sense to me.

On the other hand, if the connections were controlled by an array of 
relays or something, and the computer could control the relays... then 
you would be progressing towards a more truly programmable system.

> Hmm! I don't know. You can do some very intricate and complicated things
> with analogue electronics. Consider the brain. O_o

Well, strictly speaking, every digital computer is also an analogue 
computer. They're just designed and used differently.



From: Stephen
Subject: Re: I giggled a bunch at this.
Date: 30 Sep 2011 11:47:22
Message: <4e85e48a$1@news.povray.org>
On 30/09/2011 3:11 PM, Invisible wrote:
> I was under the impression that a capacitor only holds its charge for a
> very short period of time.
>

Depends what you mean by a very short period of time. Most circuits that 
use large capacitors will have a discharge resistor in parallel to stop 
the capacitor holding the charge when the power is off. In domestic 
appliances such as radios and TVs, it was recommended that you wait 20 
minutes after switching off before working on them.
Also see:
http://en.wikipedia.org/wiki/Regenerative_capacitor_memory

> Regardless, have you ever actually seen an analogue computer with
> independently addressable memory cells? I haven't heard of such a thing.
>
What to answer first? I don't know; the only actual working valve 
computer I've ever been in the presence of was at Glasgow University, 
over 40 years ago. I've never had hands-on experience working with them. 
But you would not use addressable memory as you would in digital 
computers. Remember that we are talking about voltage levels 
representing numbers. Also, back in the day, what was being asked of 
them was much simpler, so a lot of memory would not have been required.



>> Hmm! I don't know. You can do some very intricate and complicated things
>> with analogue electronics. Consider the brain. O_o
>
> Well, strictly speaking, every digital computer is also an analogue
> computer. They're just designed and used differently.

And use different principles.
http://en.wikipedia.org/wiki/Analog_computer#Electronic_analog_computers
or
http://www.bookrags.com/research/analog-vs-digital-computing-wcs/

-- 
Regards
     Stephen



From: Darren New
Subject: Re: I giggled a bunch at this.
Date: 30 Sep 2011 12:44:02
Message: <4e85f1d2$1@news.povray.org>
On 9/30/2011 2:48, Stephen wrote:
> An algorithm is an effective method expressed as a finite list of
> well-defined instructions for calculating a function.

So, a cake recipe isn't an algorithm? I'll have to disagree.

> So in English, if possible, what do you mean?

O(lg N) approximately means the number of steps it takes to do something 
is roughly proportional to the number of digits in the input. So the work 
of looking up a word in the index of a book, then turning to the page 
number, is roughly proportional to the number of letters in the longest 
word of the index, the number of digits in the page numbers the index 
covers, and the number of digits in the book's page count.

-- 
Darren New, San Diego CA, USA (PST)
   How come I never get only one kudo?



From: Darren New
Subject: Re: I giggled a bunch at this.
Date: 30 Sep 2011 12:53:08
Message: <4e85f3f4@news.povray.org>
On 9/30/2011 6:25, Invisible wrote:
> Now consider a simple digital computer. It might have only one binary adder
> circuit, and yet it can add up an unlimited number of operands, with
> unlimited precision.

Except that a Turing machine has an unbounded amount of memory, so you're 
not really comparing apples to apples here.

> Analogue computer: More steps = more hardware.
> Digital computer: More steps = more time.

The digital computer requires more hardware too, in terms of "tape". All 
those numbers you're adding have to be in the input somewhere.

> That's a fundamental difference. If I have an analogue computer with ten
> adder circuits, I cannot add up eleven numbers. But if I have a
> Turing-complete machine, I can add up an /unlimited/ number of operands. You
> never need more hardware. It just takes more time.

You never need more hardware because, by definition, Turing machines have 
unbounded amounts of hardware attached. If you had an unlimited number of 
analog ALUs, you'd be golden with an analog computer.

> Also, you can often "program" an analogue computer. But you basically do
> that by rewiring it. The machine can't rewire itself. But a Turing-complete
> machine can have self-modifying code.

You do it by rewiring a Turing machine, also. It just so happens that 
because you have an unlimited amount of memory, you can wire up one computer 
that will simulate whatever other computer you put in the memory units 
(i.e., the "tape"). You could do the same with an analog computer as well.

> Actually /proving/ that something is Turing-complete is usually not simple

It's usually not too hard. You just write a Turing-machine simulator in 
whatever system you're trying to prove is Turing-complete. If you can 
simulate a Turing machine with an analog computer, then that analog 
computer is Turing-complete.
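
For example, here's roughly what such a simulator looks like in Python 
(a minimal sketch; the rule format is my own invention):

def run_tm(rules, tape, state="start", pos=0, max_steps=10_000):
    # rules maps (state, symbol) -> (write, move, next_state).
    cells = dict(enumerate(tape))            # sparse tape, blank = "_"
    for _ in range(max_steps):
        if state == "halt":
            break
        write, move, state = rules[(state, cells.get(pos, "_"))]
        cells[pos] = write
        pos += {"L": -1, "R": 1}[move]
    return "".join(cells[i] for i in sorted(cells))

# Trivial example machine: flip every bit, halt at the first blank.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_tm(flip, "10110"))  # prints 01001_

If your system can implement the equivalent of run_tm(), then your 
system is Turing-complete (given unbounded memory, of course).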

-- 
Darren New, San Diego CA, USA (PST)
   How come I never get only one kudo?



From: Orchid XP v8
Subject: Re: I giggled a bunch at this.
Date: 30 Sep 2011 13:05:00
Message: <4e85f6bc@news.povray.org>
On 30/09/2011 05:53 PM, Darren New wrote:
> On 9/30/2011 6:25, Invisible wrote:
>> Now consider a simple digital computer. It might have only one binary
>> adder
>> circuit, and yet it can add up an unlimited number of operands, with
>> unlimited precision.
>
> Except that a Turing machine has an unbounded amount of memory, so
> you're not really comparing apples to apples here.

Yeah, one of the details I glossed over.

If I have a normal desktop PC then /in principle/ I can insert an 
endless number of disks into it. Obviously, a Turing machine is a 
mathematical abstraction. No physical machine can have infinite 
anything, because the universe is finite.

> You never need more hardware because, by definition, Turing machines
> have unbounded amounts of hardware attached. If you had an unlimited
> number of analog ALUs, you'd be golden with an analog computer.

Small difference: ALUs require energy. A storage medium typically doesn't.

>> Also, you can often "program" an analogue computer. But you basically do
>> that by rewiring it. The machine can't rewire itself. But a
>> Turing-complete
>> machine can have self-modifying code.
>
> You do it by rewiring a Turing machine, also. It just so happens that
> because you have an unlimited amount of memory, you can wire up one
> computer that will simulate whatever other computer you put in the
> memory units (i.e., the "tape"). You could do the same with an analog
> computer as well.

If you're saying "you could make an analogue computer that was 
Turing-complete" then, yes, of course you could. If you're saying "the 
analogue computers that people actually built were Turing-complete" 
then, no, not really...

>> Actually /proving/ that something is Turing-complete is usually not
>> simple
>
> It's usually not too hard. You just write a turing-machine simulator in
> whatever system you're trying to prove is turing complete.

Exhibit A: elementary cellular automaton Rule 110.
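
Simulating it is the easy part; it's the Turing-completeness proof 
that's hard. A toy Python version (periodic boundary, which the real 
proof doesn't use):

RULE = 110  # the rule number's binary digits are the lookup table

def step(cells):
    # Each cell's next value is bit (4*left + 2*self + right) of RULE.
    n = len(cells)
    return [(RULE >> (4 * cells[i - 1] + 2 * cells[i]
                      + cells[(i + 1) % n])) & 1 for i in range(n)]

cells = [0] * 40 + [1]  # a single live cell
for _ in range(20):
    print("".join(".#"[c] for c in cells))
    cells = step(cells)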

-- 
http://blog.orphi.me.uk/
http://www.zazzle.com/MathematicalOrchid*



From: Stephen
Subject: Re: I giggled a bunch at this.
Date: 30 Sep 2011 13:09:13
Message: <4e85f7b9$1@news.povray.org>
On 30/09/2011 5:44 PM, Darren New wrote:
> On 9/30/2011 2:48, Stephen wrote:
>> An algorithm is an effective method expressed as a finite list of
>> well-defined instructions for calculating a function.
>
> So, a cake recipe isn't an algorithm? I'll have to disagree.

Be my guest.
See if I care :-P

-- 
Regards
     Stephen



