Kari Kivisalo wrote:
>
> I don't know how this can be explained more clearly. Maybe like this:
> Povray thinks that when it sets a pixel at 20% intensity you will also
> see it at 20% intensity. Povray is wrong! You will see the pixel at
> 20%^gamma = 4% intensity. Clearly not good. Use gamma correction and you will
> see what povray wanted you to see.
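The arithmetic in the quote can be sketched in a few lines. (Note the quoted "4%" figure corresponds to a gamma of 2; with the more common display gamma of 2.2 it comes out closer to 3%. The gamma value and function names below are illustrative, not anything from POV-Ray itself.)

```python
# Sketch of the gamma mismatch described above, assuming a display gamma of 2.2.
GAMMA = 2.2

def displayed(stored: float, gamma: float = GAMMA) -> float:
    """Light intensity the monitor actually emits for a stored pixel value."""
    return stored ** gamma

def gamma_correct(intended: float, gamma: float = GAMMA) -> float:
    """Value to store so the monitor emits the intended intensity."""
    return intended ** (1.0 / gamma)

# The renderer writes 0.2 expecting 20% brightness...
naive = displayed(0.2)                     # ~0.029, i.e. roughly 3% on a 2.2 display
# ...but gamma-correcting first recovers the intended intensity.
corrected = displayed(gamma_correct(0.2))  # ~0.2, i.e. the 20% that was meant
```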
Gamma's a bit of a bitch to explain, particularly since most people just
assume that all displays work the same. Even if they see the screen
being lightened and darkened and someone tells them it's because of
gamma, they still don't have a clue what's really happening (although
after that demonstration they might think they do, and assume it's the
same thing as brightness).
I'm still not even sure *I* understand it completely, and I've been
trying to. Here's a site that discusses gamma and its effects at some
length, though:
http://www.aim-dtp.net/aim/index.htm
After reading through that (and experimenting with what it says for
myself), I'm convinced that gamma (1) is evil, (2) sucks, and (3)
eventually needs to be standardized at 1.0, not 2.2 as in sRGB (and
THAT'S a whole rant in itself).
-Xplo