>>> I would argue that algorithms go further back than that.
>>
>> At this point, it becomes necessary to define what you mean by
>> "algorithm".
>>
>> Is long division an "algorithm"? Because the ancient Babylonians
>> apparently had that waaay back in 3100 BC. That's some FIVE MILLENNIA
>> ago.
>
> I would define an algorithm the same way the Wiki does.
> An algorithm is an effective method expressed as a finite list of
> well-defined instructions for calculating a function.
> So I would say that the steps for doing long division are an algorithm.
And I would agree with you.
So, yes, algorithms go back way, waaaay further than Babbage and Lovelace.
>> What I actually /said/ was that computers (by which I mean fully
>> autonomous computational devices)
>
> What do you mean by "fully autonomous computational devices"?
Well, that's the killer, isn't it?
An abacus can add. But only if an intelligent human is operating it. All
the smarts are in the operator; the abacus is just a memory tool, really.
A pocket calculator can add. Even if it's operated by raindrops hitting
the keys. The smarts are in the device.
Now, how the heck you formulate that into some kind of coherent
definition......
>> had O(log N) lookup way later than
>
> I am not familiar with the Big O notation so I misread your sentence. As
> usual I tried to make some sort of sense out of what could have been
> typos and or bad grammar and spelling.
> So in English, if possible, what do you mean?
The number of steps required to find what you want is proportional to
the logarithm of the size of the thing you're searching.
If you search for a definition by looking at every page in the book
until you find what you're after, the amount of work is obviously
directly proportional to the size of the book. On the other hand, if you
look it up in the index and then go straight to that page, the amount of
time is proportional to the logarithm of the number of pages.
In the former case, if you double the number of pages, you double the
time required to find anything. In the latter case, doubling the number
of pages has little effect on the lookup time. (Assuming the book was
already large to start with, of course.)
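The index-versus-scan difference above can be sketched in code. (The page
counts and step tallies below are my own illustration, not something from
this thread; `linear_steps` and `binary_steps` are made-up names.)

```python
def linear_steps(pages, target):
    """Scan every page until we hit the target; work grows linearly, O(N)."""
    steps = 0
    for page in pages:
        steps += 1
        if page == target:
            break
    return steps

def binary_steps(pages, target):
    """Halve the sorted search range each time, like using an index; O(log N)."""
    lo, hi, steps = 0, len(pages), 0
    while lo < hi:
        steps += 1
        mid = (lo + hi) // 2
        if pages[mid] < target:
            lo = mid + 1
        elif pages[mid] > target:
            hi = mid
        else:
            break
    return steps

book = list(range(1, 1025))      # a 1024-page "book"
bigger = list(range(1, 2049))    # double the pages

# Looking up the last page:
print(linear_steps(book, 1024), linear_steps(bigger, 2048))  # 1024 vs 2048: doubled
print(binary_steps(book, 1024), binary_steps(bigger, 2048))  # 10 vs 11: barely changed
```

Doubling the book doubles the scan but adds only one step to the indexed
lookup, which is exactly the point about logarithmic growth.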
>> books (by which I mean large textual documents stored as visible marks
>> on some sort of medium) had it.
>
> That is just juvenile and pompous. Only funny to a teenager.
I wasn't trying to be funny. I was trying to phrase a definition that
would encompass things that don't look like modern hardback "books":
rolls of parchment, stone tablets, and so on.
>> Given how ancient writing is and how
>> comparatively new functioning computers are, I think that's a safe
>> assertion.
>
> using ones and zeros.
> I have worked on computers that were solely pneumatic. They could add,
> subtract, multiply and divide. Standard components could extract square
> roots, integrate and average.
I'm well aware that there have been many computers that used decimal
instead of binary. (A few even used other number systems.) I'm aware
that people have made computers using a veritable zoo of mechanical,
hydraulic and other technologies.
Now, you need to be a little bit careful here: There is a distinction to
be made between devices that can "calculate" things, and devices which
are considered to be "computers". As far as I'm aware, none of the old
"analogue computers" were actually Turing-complete, and so strictly they
are /calculators/ rather than /computers/.
The history of /calculators/ obviously goes back far, far beyond the
history of /computers/. (Again, depending on how general your
definitions are.)