Warp wrote:
>   Could you be more precise about this? I still don't understand the reason.
> Maybe some concrete examples?
Take any scene with (A) an object with a small-scale black-and-white 
checker pattern (if you're using a CRT display, use a large scale 
horizontally), and (B) some rgb 0.5 object. Render without AA.
Now if you render the whole smash and squint your eyes, you /should/ see 
just plain 50% grey for both objects.
Indeed the black-and-white object (A) /will/ look 50% grey (unless your 
display's black and/or white point are bogus), because POV-Ray will have 
output the perfectly unambiguous values 0 and 255 representing 0% and 
100% intensity respectively, in a 1:1 mix, and averaged no earlier than 
when passing between your eyelids.
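The point about averaging "between your eyelids" is that the eye blends *light*, i.e. linear intensity, not pixel values. A tiny sketch (assuming a display gamma of 2.2) showing why a 1:1 black/white mix is 50% grey on any display:

```python
# The eye averages linear light intensity, not encoded pixel values.
# Pixel value v maps to intensity (v/255)**gamma on the display.
gamma = 2.2

black = (0 / 255) ** gamma    # 0.0 -- 0% intensity
white = (255 / 255) ** gamma  # 1.0 -- 100% intensity, for ANY gamma

# A fine 1:1 checker of black and white pixels, seen from afar:
print((black + white) / 2)    # 0.5 -> 50% grey, independent of gamma
```

The endpoints 0 and 255 are fixed points of any power-law gamma curve, which is why object (A) is immune to gamma trouble.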
In 3.6, however, the rgb 0.5 object (B) will look way darker: POV-Ray 
will have output "127", but your graphics card + display will interpret 
this as a meager 22% grey (0.5^2.2 - provided your display has indeed a 
gamma of 2.2).
3.7, on the other hand, will output "186" (255 * 0.5^(1/2.2)) for rgb 
0.5, which is just the right value for a 2.2-gamma display system to 
show 50% grey ((186/255)^2.2); even if your display gamma is somewhat 
off, like 2.0 or so, it will still be closer to the right thing than the 
3.6 output.
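The arithmetic above can be verified in a few lines (a quick sketch, assuming a display gamma of exactly 2.2):

```python
display_gamma = 2.2

# 3.6 behaviour: rgb 0.5 is written out linearly as 127.
v36 = 127
print((v36 / 255) ** display_gamma)   # ~0.22 -> a meagre 22% grey on screen

# 3.7 behaviour: rgb 0.5 is gamma-pre-corrected before output.
v37 = round(255 * 0.5 ** (1 / display_gamma))   # 186
print((v37 / 255) ** display_gamma)             # ~0.50 -> the intended 50% grey
```

The pre-correction and the display curve cancel: (0.5^(1/2.2))^2.2 = 0.5, which is exactly why 3.7's "186" survives the trip through a 2.2-gamma display intact.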
(Or, alternatively, POV-Ray will output "127", but store information in 
the output file that a display gamma of 1.0 has been assumed, leaving it 
to the image viewer to perform gamma-correction; this will happen for 
instance if you choose PNG output and set File_Gamma=1.0.)