On 29/09/2011 02:55 PM, Warp wrote:
> Francois Labreque<fla### [at] videotronca> wrote:
>> Origin of INDEX
>> Latin indic-, index, from indicare to indicate
>> First Known Use: 1561
>
> The first known use of the word doesn't necessarily mean it's the first
> known use of an index in a book.
True - but I think we can safely say that books had indices /long/
before working computers even existed, which was my original assertion.
Post a reply to this message
On 29/09/2011 12:04 PM, Invisible wrote:
>
> The first "computer" was arguably the design by Babbage, and the first
> "algorithm" was allegedly written by Ada Lovelace, who died almost
> exactly 1 century before Turing.
>
I would argue that algorithms go further back than that. Consider
control mechanisms. If you want an engine to run at a constant speed
under a variable load, you would use an algorithm like this:
Test the speed of the motor.
If the speed is lower than the target speed, then increase the power.
If the speed is greater than the target speed, then decrease the power.
If the speed is within limits, then neither increase nor decrease the power.
A mechanical governor is one solution to the algorithm. Roman engineers
maintained water levels for their aqueduct system by means of floating
valves that opened and closed at appropriate levels. Someone had to
create an algorithm, no matter how simple.
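The speed-governing steps above amount to a simple three-way control loop. Here is a minimal sketch in Python; the target speed, tolerance band, and power step are invented values, purely for illustration:

```python
# Sketch of the speed-governing algorithm described above.
# TARGET_RPM and TOLERANCE are hypothetical values for illustration.

TARGET_RPM = 1000
TOLERANCE = 25  # the "within limits" band around the target

def adjust_power(measured_rpm, power):
    """One pass of the control loop: test the speed, then
    raise, lower, or hold the power setting."""
    if measured_rpm < TARGET_RPM - TOLERANCE:
        return power + 1   # running slow: increase the power
    if measured_rpm > TARGET_RPM + TOLERANCE:
        return power - 1   # running fast: decrease the power
    return power           # within limits: leave the power alone

# Example: an engine dragged down by a sudden load.
print(adjust_power(900, 50))   # below target -> power rises to 51
print(adjust_power(1010, 50))  # within limits -> power stays at 50
```

Run repeatedly against a real engine, this is exactly the governor behaviour: the mechanism keeps nudging the power until the speed sits inside the tolerance band.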
> Now if we could establish when the first book with an index was created...
For library catalogues see
http://en.wikipedia.org/wiki/Library_catalog#History
Long, long ago.
--
Regards
Stephen
> Francois Labreque<fla### [at] videotronca> wrote:
>> Origin of INDEX
>> Latin indic-, index, from indicare to indicate
>> First Known Use: 1561
>
> The first known use of the word doesn't necessarily mean it's the first
> known use of an index in a book.
>
The first use of the word sets a key date.
The first book with an index, even if it was named differently, was made
strictly before that year.
> Francois Labreque<fla### [at] videotronca> wrote:
>> Origin of INDEX
>> Latin indic-, index, from indicare to indicate
>> First Known Use: 1561
>
> The first known use of the word doesn't necessarily mean it's the first
> known use of an index in a book.
>
Agreed.
But if the first documented use of the word "index" was in 1561, we can
safely assume that the first use of the index itself predates it. I was
only establishing an upper bound.
--
/*Francois Labreque*/#local a=x+y;#local b=x+a;#local c=a+b;#macro P(F//
/* flabreque */L)polygon{5,F,F+z,L+z,L,F pigment{rgb 9}}#end union
/* @ */{P(0,a)P(a,b)P(b,c)P(2*a,2*b)P(2*b,b+c)P(b+c,<2,3>)
/* gmail.com */}camera{orthographic location<6,1.25,-6>look_at a }
On 9/29/2011 11:11 AM, Stephen wrote:
> solution to the algorithm. Roman engineers maintained water levels for
> their aqueduct system by means of floating valves that opened and closed
> at appropriate levels. Someone had to create an algorithm no matter how
> simple.
>
Depending on your definition of "computer", there is evidence of a
one-of-a-kind "Roman" device that was capable of predicting eclipses and
timing the correct date to start the Olympic games, among other
features. In other words, an electronic calendar. But, at that time,
such things were nearly impossible to replicate, so when the ship it
was on sank... Arguably, its gearing system had to have some sort of
"algorithm".
https://secure.wikimedia.org/wikipedia/en/wiki/Antikythera_mechanism
>> The first "computer" was arguably the design by Babbage, and the first
>> "algorithm" was allegedly written by Ada Lovelace, who died almost
>> exactly 1 century before Turing.
>
> I would argue that algorithms go further back than that.
At this point, it becomes necessary to define what you mean by "algorithm".
Is long division an "algorithm"? Because the ancient Babylonians
apparently had that waaay back in 3100 BC. That's some FIVE MILLENNIA ago.
What I actually /said/ was that computers (by which I mean fully
autonomous computational devices) had O(log N) lookup way later than
books (by which I mean large textual documents stored as visible marks
on some sort of medium) had it. Given how ancient writing is and how
comparatively new functioning computers are, I think that's a safe
assertion.
On 30/09/2011 03:14 AM, Patrick Elliott wrote:
> Depending on your definition of "computer"
There is that too.
Turing-completeness is a reasonable definition, until you consider that
a sheet of paper and a pen is Turing-complete given a suitable human to
operate it. So perhaps the significant thing is the sophistication of
computations that the device can perform without human aid.
> there is evidence of a one
> of a kind "Roman" device that was capable of predicting eclipses, and
> timing the correct date to start the Olympic games, among other
> features.
Last I heard, nobody had decided exactly what that device was for. It
seems opinions have changed...
> In other words, an electronic calendar.
I think you mean /automated/ calendar. It's only /electronic/ if it
operates by moving electrons around. :-P
> But, at that time,
> such things were nearly impossible to replicate, so when the ship it
> was on sank...
We're talking about something from a /long/ time ago. The fact that no
others have been found yet doesn't mean none existed.
> Arguably, its gearing system had to have some sort of "algorithm".
By that description, the way that trees use the laws of physics to move
exactly the right amount of water from their roots to their leaves could
be considered an "algorithm". Which would mean that algorithms predate
mankind by several billion years...
On 30/09/2011 9:08 AM, Invisible wrote:
>>> The first "computer" was arguably the design by Babbage, and the first
>>> "algorithm" was allegedly written by Ada Lovelace, who died almost
>>> exactly 1 century before Turing.
>>
>> I would argue that algorithms go further back than that.
>
> At this point, it becomes necessary to define what you mean by "algorithm".
>
> Is long division an "algorithm"? Because the ancient Babylonians
> apparently had that waaay back in 3100 BC. That's some FIVE MILLENNIA ago.
>
I would define an algorithm the same way the Wiki does.
An algorithm is an effective method expressed as a finite list of
well-defined instructions for calculating a function.
So I would say that the steps for doing long division are an algorithm.
> What I actually /said/ was that computers (by which I mean fully
> autonomous computational devices)
What do you mean by "fully autonomous computational devices"?
> had O(log N) lookup way later than
I am not familiar with Big O notation, so I misread your sentence. As
usual, I tried to make some sort of sense out of what could have been
typos and/or bad grammar and spelling.
So, in English if possible, what do you mean?
> books (by which I mean large textual documents stored as visible marks
> on some sort of medium) had it.
That is just juvenile and pompous. Only funny to a teenager.
> Given how ancient writing is and how
> comparatively new functioning computers are, I think that's a safe
> assertion.
using ones and zeros.
I have worked on computers that were solely pneumatic. They could add,
subtract, multiply and divide. Standard components could extract square
roots, integrate and average.
--
Regards
Stephen
On 30/09/2011 9:18 AM, Invisible wrote:
> Which would mean that algorithms predate mankind by several billion
> years...
Let's not go down that road again :-P
--
Regards
Stephen
>>> I would argue that algorithms go further back than that.
>>
>> At this point, it becomes necessary to define what you mean by
>> "algorithm".
>>
>> Is long division an "algorithm"? Because the ancient Babylonians
>> apparently had that waaay back in 3100 BC. That's some FIVE MILLENNIA
>> ago.
>
> I would define an algorithm the same way the Wiki does.
> An algorithm is an effective method expressed as a finite list of
> well-defined instructions for calculating a function.
> So I would say that the steps for doing long division are an algorithm.
And I would agree with you.
So, yes, algorithms go back way, waaaay further than Babbage and Lovelace.
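For what it's worth, long division fits that definition neatly: a finite list of well-defined steps for calculating a function. A minimal sketch of the schoolbook procedure (bring down a digit, record a quotient digit, carry the remainder forward):

```python
def long_division(dividend, divisor):
    """Schoolbook long division on the decimal digits of the
    dividend: bring down one digit at a time, record the quotient
    digit, and carry the remainder forward to the next digit."""
    remainder = 0
    quotient_digits = []
    for ch in str(dividend):
        remainder = remainder * 10 + int(ch)       # bring down a digit
        quotient_digits.append(str(remainder // divisor))
        remainder = remainder % divisor            # carry the rest
    quotient = int("".join(quotient_digits))
    return quotient, remainder

print(long_division(3100, 7))  # -> (442, 6), i.e. 3100 = 7 * 442 + 6
```

Every step is mechanical and well-defined, which is precisely why the procedure counts as an algorithm, whoever first wrote it down.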
>> What I actually /said/ was that computers (by which I mean fully
>> autonomous computational devices)
>
> What do you mean by "fully autonomous computational devices"?
Well, that's the killer, isn't it?
An abacus can add. But only if an intelligent human is operating it. All
the smarts are in the operator; the abacus is just a memory tool, really.
A pocket calculator can add. Even if it's operated by raindrops hitting
the keys. The smarts are in the device.
Now, how the heck you formulate that into some kind of coherent
definition......
>> had O(log N) lookup way later than
>
> I am not familiar with the Big O notation so I misread your sentence. As
> usual I tried to make some sort of sense out of what could have been
> typos and or bad grammar and spelling.
> So in English, if possible, what do you mean?
The number of steps required to find what you want is proportional to
the logarithm of the size of the thing you're searching.
If you search for a definition by looking at every page in the book
until you find what you're after, the amount of work is obviously
directly proportional to the size of the book. On the other hand, if you
look it up in the index and then go straight to that page, the amount of
time is proportional to the logarithm of the number of pages.
In the former case, if you double the number of pages, you double the
time required to find anything. In the latter case, doubling the number
of pages has little effect on the lookup time. (Assuming the book was
already large to start with, of course.)
>> books (by which I mean large textual documents stored as visible marks
>> on some sort of medium) had it.
>
> That is just juvenile and pompous. Only funny to a teenager.
I wasn't trying to be funny. I was trying to phrase a definition which
would encompass things that don't look like modern hardback "books".
Things like rolls of parchment or stone tablets, etc.
>> Given how ancient writing is and how
>> comparatively new functioning computers are, I think that's a safe
>> assertion.
>
> using ones and zeros.
> I have worked on computers that were solely pneumatic. They could add,
> subtract, multiply and divide. Standard components could extract square
> roots, integrate and average.
I'm well aware that there have been many computers that used decimal
instead of binary. (A few even used other number systems.) I'm aware
that people have made computers using a veritable zoo of mechanical,
hydraulic and other technologies.
Now, you need to be a little bit careful here: There is a distinction to
be made between devices that can "calculate" things, and devices which
are considered to be "computers". As far as I'm aware, none of the old
"analogue computers" were actually Turing-complete, and so strictly they
are /calculators/ rather than /computers/.
The history of /calculators/ obviously goes back far, far beyond the
history of /computers/. (Again, depending on how general your
definitions are.)