>The sieve in Atari BASIC, using timings from an article written in 1984 by
>Brian Moriarty, clocks in at 324 seconds (or just under 5 and a half
>minutes). The Python version, running on hardware that's a generation
>back--no i7 processor or anything like that--completes in 3 seconds.
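Neither listing is reproduced in the thread, so for context: a minimal Python sieve of Eratosthenes in the spirit of such benchmarks might look like the sketch below (the limit of 10,000 is my arbitrary choice, not a figure from the article):

```python
def sieve(limit):
    # Sieve of Eratosthenes: flags[i] stays True while i might be prime.
    flags = [True] * (limit + 1)
    flags[0] = flags[1] = False
    for i in range(2, int(limit ** 0.5) + 1):
        if flags[i]:
            # Cross out every multiple of i, starting at i*i.
            for j in range(i * i, limit + 1, i):
                flags[j] = False
    return [i for i, is_prime in enumerate(flags) if is_prime]

print(len(sieve(10_000)))  # 1229 primes up to 10,000
```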
He draws the wrong conclusions. What he says amounts to the following:
The BASIC code ran on a roughly 1 MHz machine in 1984 (a generous
assumption; the Atari's CPU was probably effectively slower) and was about
108x slower (324 s vs. 3 s) than Python on a 2400 MHz single core (he
mentions old hardware).
If both languages were equally efficient per clock cycle, Python should
execute at least 2400 times faster than the old code. Actually, it should
be even faster still, since CPUs also got faster per MHz.
The (somewhat) right conclusions would be:
- computers have gotten faster since 1984 (surprise)
- Python is interpreted roughly 22x slower than ancient BASIC (at least
per MHz: 2400 / 108)
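A quick sanity check of that arithmetic, using only the figures quoted in the thread (324 s vs. 3 s, and the assumed 1 MHz vs. 2400 MHz clocks); note the per-MHz factor comes out near 24 only if you first round the 108x speedup down to 100:

```python
# Figures from the post; the 1 MHz clock is the post's own rough assumption.
basic_seconds, python_seconds = 324.0, 3.0
basic_mhz, python_mhz = 1.0, 2400.0

speedup = basic_seconds / python_seconds  # wall-clock speedup: 108x
clock_ratio = python_mhz / basic_mhz      # 2400x more clock cycles per second
per_mhz_slowdown = clock_ratio / speedup  # ~22x slower per MHz

print(f"wall-clock speedup: {speedup:.0f}x")
print(f"per-MHz slowdown:   {per_mhz_slowdown:.1f}x")
```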
However, all this is comparing apples to pears, as we say here in Germany.
What's the English term?