Overclocking seems to be an obsession. As soon as Intel (or whoever)
releases a new CPU, someone overclocks it.

It's true that they make a series of CPUs, test which clock frequency
is suitable for each one, and sell them accordingly. This way they can
sell virtually identical CPUs at a much higher price than others with
a lower frequency.

Because of this, people seem to think that they can just buy a cheaper
one and overclock it to the same frequency as the expensive ones. They
never seem to ask why it is being sold at a lower clock frequency and
a much lower price.

The reason is obvious: because it doesn't work reliably at the higher
frequency. They sell it at a frequency which is suitable for it. If it
worked fine at the higher frequency, of course they could sell it at
that frequency and charge more money. But if the CPU becomes unstable
at those frequencies, they just can't sell it that way (it would be
very expensive to replace non-working CPUs; this is what happened with
the Pentium's division bug).

So when you overclock your CPU, you are actually setting it to a level
which has been detected as unstable.

This instability is not necessarily immediately visible. Everything
may seem to work just fine. You use the computer for weeks or months
and notice no problems.

But there may be problems anyway: Perhaps a program crashes sometimes.
You don't pay much attention to it (programs have bugs, don't they?).
Perhaps you try to install the new drivers for the card you just
bought, and it doesn't succeed (it's unable to detect the card or
whatever). Sometimes you see a strange pixel on screen (programs still
have bugs, don't they?). Sometimes there's an error in the text or the
spreadsheet data you have saved to disk... And you never think it's
because of the overclocked CPU (it has been working for months, hasn't
it?).

(Btw, the card driver installation failing because of an overclocked
CPU is a true story.)

And all this for what? I once read a speed comparison in a computer
magazine of various types of programs running on regular and
overclocked CPUs. It really looked like the programs were running a
lot faster, but they only gave the absolute values, not the relative
values (ie. how many times faster the program runs on the overclocked
CPU than on the regular one).

Well, I calculated it (it's easy: overclocked speed / regular speed).
The result was laughable: The programs ran about 1.3 times faster on
the overclocked computer than on the regular one.

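If you want to see how small that ratio really is, here's a minimal C
sketch of the same calculation (the benchmark times below are invented
for illustration, not taken from that magazine):

#include <stdio.h>

int main(void)
{
    /* Invented benchmark results: how many seconds the same task
       takes on a regular and on an overclocked CPU. */
    double regular_seconds     = 130.0;
    double overclocked_seconds = 100.0;

    /* Speed is the inverse of the time taken, so the speedup factor
       is the regular time divided by the overclocked time. */
    double speedup = regular_seconds / overclocked_seconds;

    printf("The overclocked CPU is about %.1f times faster.\n", speedup);
    return 0;
}
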
So you get this minimal increase in speed at the cost of an unstable
CPU which may fail at any moment (sometimes without you noticing it or
paying attention to it, causing hidden and dangerous errors).

Often people think that the only problem with overclocking is the
increased temperature of the CPU. If this were true, you could take an
8086, deep-freeze it and overclock it to 500 MHz. Of course that
doesn't work, no matter what the temperature is.

The problem is electrical in nature. The CPU consists of millions of
transistors. Many transistors are put together to form logic gates,
etc. The transistors switch by changing their conductivity, and they
can't change their state infinitely fast; each switch takes some time.

There must be a way to synchronize all those millions of transistors,
and that's what the clock pulses do. This means that all the
transistors work at the speed of the slowest ones. If the clock speed
is too high, the slowest transistors don't have time to change their
state before the next clock pulse, so they begin to fail. This may
have no effect, or it may flip one bit of data (which is sometimes
enough to crash the whole computer), or worse.

So you can't overclock the CPU arbitrarily. There's a physical speed
limit for the transistors.

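As a rough illustration of that limit, here's a small C sketch; the
gate delay figure is invented, since the real value depends entirely
on the chip:

#include <stdio.h>

int main(void)
{
    /* Invented figure: the slowest chain of gates (the critical path)
       needs 5 nanoseconds to settle. */
    double critical_path_ns = 5.0;

    /* Every result must be stable before the next clock pulse arrives,
       so the clock period can't be shorter than the critical path
       delay: f_max = 1 / delay. 1/ns gives GHz, so multiply by 1000
       to get MHz. */
    double max_clock_mhz = 1000.0 / critical_path_ns;

    printf("Maximum reliable clock: about %.0f MHz\n", max_clock_mhz);
    printf("Anything above that and some gates miss their deadline.\n");
    return 0;
}
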
When you overclock, you get nearer and nearer to this limit. You can
tell when you have completely crossed the line, because the computer
simply doesn't work. The bad thing is that you can't clearly see where
the line is. If you are very near it, some transistors may fail very
randomly (perhaps once per minute or once per week or whatever). You
may use your computer for months without noticing, just wondering
about those random-looking small errors.

And all this for about 1.3 times the regular speed...
No thanks. I have never overclocked my computer and I never will.

PS: Overclocking a 3D card may be worth considering, since the only
error you will get is garbage on screen, which isn't very dangerous.
Of course there's the risk that you will burn out your 3D card...
--
main(i,_){for(_?--i,main(i+2,"FhhQHFIJD|FQTITFN]zRFHhhTBFHhhTBFysdB"[i]
):5;i&&_>1;printf("%s",_-70?_&1?"[]":" ":(_=0,"\n")),_/=2);} /*- Warp -*/