On 30.11.2010 23:58, Christian Froeschlin wrote:
>> Now what do I get? Not a straight, smooth, gradual gradient from
>> light to dark but a parabolic curve of values from light to slightly
>> darker but still light to abruptly skewed to black.
>
> I have to admit I don't understand this either. Until now I
> thought that since 3.7 now uses a linear color space, 0.5 would
> represent a true midgray and all the nifty gamma handling ensures
> that everyone perceives that midgray on their display when viewing
> the image file. So I'd also have expected the end result of your
> test scene to visually yield evenly spaced brightness steps.
>
> Looking at the png itself it seems the upper two rows use 127
> for midgray while the lower uses 186. From all I heard so far
> 186 should appear as midgray on a calibrated gamma 2.2 display
> so that would appear to be correct. But I too perceive the
> upper two rows as more visually correct. It's as if my lcd
> used linear color space. I wonder if the Windows 7 built-in
> display calibration is somehow playing tricks on me.
No, it's your eyes playing tricks on you.
Our eyes are highly self-calibrating optical measurement tools - they
work as well at a brightness of 10% as at 1,000% (and then some), and
they get object colors right regardless of lighting conditions.
There's the key.
Grab a piece of 100% white cardboard and another piece of 90% gray
cardboard.
Now dim the light to half brightness. Obviously, your white cardboard
will now reflect only 50% of full-brightness white into your eye, and
your gray cardboard will reflect only 45% of full-brightness white.
So the absolute difference between the two pieces of cardboard has
diminished from 10% full brightness to 5% full brightness. But our eyes
are designed and trained to look not for the absolute light intensity,
but for /relative/ difference, so that we can identify the "pigment" of
objects irrespective of lighting conditions. The /apparent/ brightness
difference between 45% and 50% is therefore roughly the same as between
90% and 100%.
As a result, a truly linear gradient doesn't /appear/ linear to us: the
"distance" between 10% and 20%, for instance, is perceived as roughly
equal to that between 50% and 100%. Go figure.
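The cardboard arithmetic above can be sketched in a few lines of Python
(a toy illustration, not POV-Ray code): equal brightness /ratios/, not
equal absolute differences, are what correspond to roughly equal
perceived steps.

```python
import math

# Reflectances of the two cardboards (fractions of full brightness).
white, gray = 1.00, 0.90

# Dim the light to half brightness: the absolute difference halves...
full_diff = white - gray                   # 0.10
dim_diff = 0.5 * white - 0.5 * gray        # 0.05

# ...but the ratio between the two reflected intensities is unchanged,
# which is (roughly) what the eye keys on.
full_ratio = white / gray
dim_ratio = (0.5 * white) / (0.5 * gray)

# Likewise, the perceived "distance" from 10% to 20% is comparable to
# that from 50% to 100%: both are one doubling of intensity.
step_low = math.log2(0.20 / 0.10)
step_high = math.log2(1.00 / 0.50)
```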
(Theoretically this would imply that a /logarithmic/ gradient would
appear "linear" to us; however, that's not exactly the case either, as
human vision is a bit more complicated than that. My point here is that
perception of brightness is surprisingly non-linear.)
> You can get the old behavior with assumed_gamma 2.2 or
> by replacing rgb x with rgb pow(x,0.45).
... or by "rgb x gamma 2.2".
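The numbers in the test image are easy to check. A quick Python sketch
(assuming a pure power-law gamma of 2.2, which is close to but not
identical to the piecewise sRGB curve) shows why the lower row uses 186
for midgray:

```python
# Linear 0.5 encoded with gamma 1/2.2 lands near pixel value 186...
encoded = round(0.5 ** (1 / 2.2) * 255)   # ~186

# ...and decoding pixel 186 on a gamma-2.2 display recovers ~0.5 linear.
linear_186 = (186 / 255) ** 2.2

# Pixel 127, by contrast, decodes to well under half the light.
linear_127 = (127 / 255) ** 2.2           # ~0.22
```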
> Independent of this issue: Would it be useful to have a
> setting of "color_gamma" that tells POV-Ray how to interpret
> literal color values specified in SDL? The default value of 1.0
> would yield the current behavior while e.g. 2.2 would internally
> convert a value as rgb 186/255 to rgb 0.5 without the need for
> plastering your code with gamma macros.
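The proposed setting would amount to decoding every color literal before
it enters the render pipeline. A minimal Python sketch of the idea (the
name color_gamma and the function below are hypothetical, purely for
illustration):

```python
def decode_literal(rgb, color_gamma=1.0):
    """Interpret a color literal under a hypothetical color_gamma
    setting: 1.0 leaves values untouched (current behavior), while 2.2
    maps a display-referred value such as 186/255 to ~0.5 linear."""
    return tuple(c ** color_gamma for c in rgb)

decode_literal((186 / 255,) * 3, color_gamma=2.2)  # ~(0.5, 0.5, 0.5)
```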
While the idea is compelling, there's a nasty catch to it: with
POV-Ray's SDL being as powerful as it is, while lacking a clear
distinction between colors and vectors, it is pretty difficult - if not
impossible - to draw a clear boundary between "gamma land" and "linear
country" that is safe, sane, and self-consistent.