Am 22.05.2010 07:01, schrieb Warp:
> I don't understand all this "30-year old computers were able to present
> slick user interfaces fast and without delays, yet even today we are
> expecting things to take time to load and interfaces to lag" mentality,
> which doesn't seem to be exclusive to Andrew.
...
> Drawing the graphics and playing sounds was not the major problem back
> then because there was very little to draw or play. Loading data wasn't
> either, because there was very little to load. The majority of the work
> went into optimizing the game logic so that the game wouldn't lag because
> the CPU was so slow.
... and not to forget that back then, what little software code arcade
games had was stored in ROM, just about the fastest persistent memory
you can get; even today, if you want to beat that speed with modern
off-the-shelf equipment, you'll need to grab an SSD, at least when
random accesses are involved.
Also remember that those machines took some seconds to boot as well;
they just had the advantage that they were usually turned on already
when the user approached them.
Not to mention the lags between levels, which weren't only there to
inform the user about their progress, but also to re-initialize a few
variables.
Now leave the arcade game behind, and in order to get some more coins to
feed to the machine, go to work at some place where you'd use some
/real/ computer.
You'd probably be glad that you no longer had to literally /carry/ your
input and output to and from the server room, since a few years earlier
you'd gotten your own CRT terminal, and so you'd be pretty forgiving of
the fact that the execution time of some programs was still measured in
minutes or even hours.
Note that back in those days, mass storage I/O would be the biggest
bottleneck for most applications.
Now guess which computer subsystems have seen the smallest performance
increase since then...
Let's face it: Comparing 1980's arcade games with 2010's business
applications is like comparing apples with bananas.
Look at 1980's arcade games, and you'll notice that you see the very
same responsiveness in 2010's games (even the lag between levels is
virtually the same ;-)), except that the games have grown immensely in
complexity.
Look at 1980's business applications, and you'll notice that they, too,
have grown immensely in complexity - but unlike the games, their
responsiveness has actually gotten worse.
Want to compare 1980's word processing with today's?
1980:
You'd type in your documents in a text-only terminal.
Fonts would be limited to those built into whatever printer you'd be
using; depending on the printing technology, often even the font size
would be fixed. Text decoration would usually be limited to bold,
italics and/or underlined.
To see what the document would really look like, you'd have to print it,
and walk to the printer room to get your draft printout. WYSIWYG? Heck,
the best you could even /dream/ of was a preview functionality that
showed you where that particular printer brand would insert line breaks.
If you found that a particularly long word would create particularly
ugly line breaks, you were lucky if your word processor featured soft
hyphens, so you could manually give hints where words could be broken.
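For those who've forgotten (or never knew): a soft hyphen is an invisible break-point hint inside a word - Unicode nowadays assigns it U+00AD. Here's a minimal sketch in Python of how a line breaker might honor such hints; this is a hypothetical greedy algorithm for illustration, not how any particular word processor of the era actually did it:

```python
# Sketch of soft-hyphen-aware line breaking (assumed details, for illustration).
# A soft hyphen (U+00AD) stays invisible unless the word is actually broken
# there, in which case it is rendered as a visible "-".

SOFT_HYPHEN = "\u00ad"

def wrap(text, width):
    """Greedy wrapper that breaks words only at soft-hyphen positions."""
    lines, line = [], ""
    for word in text.split():
        plain = word.replace(SOFT_HYPHEN, "")   # hints invisible by default
        sep = " " if line else ""
        if len(line) + len(sep) + len(plain) <= width:
            line += sep + plain                 # whole word fits on this line
            continue
        parts = word.split(SOFT_HYPHEN)
        placed = False
        # try the longest head of the word that fits, ending in a visible "-"
        for i in range(len(parts) - 1, 0, -1):
            head = "".join(parts[:i]) + "-"
            if len(line) + len(sep) + len(head) <= width:
                lines.append(line + sep + head)
                line = "".join(parts[i:])
                placed = True
                break
        if not placed:                          # no hint fits: plain word wrap
            if line:
                lines.append(line)
            line = plain
    if line:
        lines.append(line)
    return lines
```

With a soft hyphen between "hyphen" and "ation", a narrow line breaks the word as "hyphen-" / "ation", while a wide one prints plain "hyphenation" with no trace of the hint.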
(Not to forget that if you ran into problems, you'd search for the paper
manual (which must be /somewhere/ around here, 'cause I know I had it in
my hands a week ago...), and pray that the buzzwords you can think of
happen to be listed in the index.)
2010:
You type in your documents in your very own personal supercomputer
(which, by the way, is pretty bored 99% of the time).
You can choose between hundreds of more-or-less exotic fully scalable
fonts (some even including a full set of glyphs for each and every
language you could possibly imagine), in plenty of more-or-less useful
variations: Bold, italics, underlined, double underlined,
strike-through, small caps, font color, background, borders...
To see what the document would look like, all you need to do is look at
the display as you type. Nowadays you even get WYSIWYG feedback while
pondering what font or text decoration to choose.
You don't even know what a soft hyphen is, because you never need one
anyway, as your word processor features a full dictionary to decide
where to break words /as you type/ - and checks not only your spelling,
but even your grammar while you type (and bores your computer to death).
(Not to forget that if you run into problems and can't find anything
suitable in the inbuilt help index, you might try again with a full-text
search. Or even do a full-text search on a worldwide repository of
documentations, tutorials, user group discussions, public diaries and
what-have-you-not. On the fly. With sophisticated sorting of the results
by probable relevance.)
(And all the while your media player keeps pumping studio quality music
(CD quality? What the * is a CD?) out of the inbuilt HiFi Stereo amp
with integrated 10-band equalizer... and still your machine is mostly
busy with being bored...)
Poor responsiveness? Well, I don't think so.
Instead, I think there is some "maximum acceptable lag", which software
"fills out" by adding features and "richer" feedback as processing and
I/O speeds increase. And I think that acceptable lag is even slowly
decreasing in business applications. (For games, it seems to have always
been pretty low.)