  Re: Why assumed_gamma 1.0 should be used (and the drawbacks)  
From: Warp
Date: 12 Sep 2011 14:32:00
Message: <4e6e5020@news.povray.org>
Ive <ive### [at] lilysoftorg> wrote:
> While what you write about irradiance is true I completely disagree with 
> all conclusions you draw from this.

  All conclusions? Like what?

  As far as I can see, these are the conclusions I drew:

  When half of the incoming light is reflected from a surface, it looks
to the human eye like approximately 73% of full brightness. Do you
"completely disagree" with this? Can you explain?

  assumed_gamma 1.0 simulates that perception better than assumed_gamma 2.2
(because in the latter case the brightness of the surface looks like 50% of
full brightness rather than 73%, which would mean that significantly less
light is being reflected). Do you disagree with this? Why?

  The response of a display with a gamma of 2.2 happens to approximately
match the brightness perception of the human eye, which means that pixel
values scale almost linearly with perceived brightness (so that e.g. a
pixel value of (128,128,128) will look like about 50% gray). Do you
disagree with this? Please explain.
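
  Putting rough numbers on it (assuming an idealized gamma-2.2 display and
the same power-law approximation of the eye as above):

    #declare Pixel     = 128/255;              // =~ 0.502
    #declare Emitted   = pow(Pixel, 2.2);      // display emits =~ 0.22 of max
    #declare Perceived = pow(Emitted, 1/2.2);  // looks like =~ 0.50, i.e. ~50% gray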

  If you use assumed_gamma 1.0 in POV-Ray, linear gradients will not look
linear (instead they will look roughly logarithmic). That's because they
will be linear in terms of irradiance, not in terms of perceived brightness.
Do you disagree with this?
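
  A minimal (untested) scene fragment that shows what I mean; the finish
is only there so the pigment is displayed directly:

    global_settings { assumed_gamma 1.0 }
    // linear in irradiance, but to the eye the dark end looks
    // stretched out and the bright end compressed
    plane {
      -z, 0
      pigment {
        gradient x
        color_map { [0 rgb 0] [1 rgb 1] }
      }
      finish { ambient 1 diffuse 0 }
    }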

  Because of the above, designing many textures becomes more complicated,
at least currently. (If you want, for example, a gradient that looks linear,
you have to somehow compensate for the logarithmic nature of the perceived
brightness of the linear irradiance gradient. This can be quite difficult
to do with complex color maps.) Do you disagree with this, and why?
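
  One way to compensate (a hypothetical sketch; note that POV-Ray still
interpolates linearly between the entries, so this only evens things out
at the listed control points, and with maps of many colors it gets
tedious fast):

    // pre-distort each entry so the map *looks* evenly spaced
    color_map {
      [0.00 rgb 0]
      [0.25 rgb pow(0.25, 2.2)]   // =~ 0.047
      [0.50 rgb pow(0.50, 2.2)]   // =~ 0.218
      [0.75 rgb pow(0.75, 2.2)]   // =~ 0.531
      [1.00 rgb 1]
    }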

  If you are using assumed_gamma 1.0 and you want, for example, a color
that looks 50% gray, you will have to "gamma-uncorrect" rgb 0.5 in order
to achieve that (giving you "rgb .218"). In other words, you need to convert
perceived brightness values into irradiance values. Please explain your
disagreement.
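
  That conversion is easy enough to wrap up once (PerceivedGray is just a
hypothetical helper name, not anything built in):

    // convert a perceived gray level into the irradiance value
    // that assumed_gamma 1.0 expects
    #macro PerceivedGray(G)
      rgb pow(G, 2.2)
    #end

    pigment { color PerceivedGray(0.5) }   // looks like about 50% gray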

  Using assumed_gamma 2.2 makes it easier to map color values to perceived
colors because the scale is roughly perceptually linear. On the other hand,
the rendering is not technically accurate, because the surface lighting
will be scaled in the wrong way (for example, a surface that should reflect
50% of the incoming light will actually reflect only about 22%). You could
disagree with this, but you'll have to explain your technical reasoning.

  The technically "wrong" illumination calculations do not produce images
that are obviously wrong. There are literally millions of images out there
made by different renderers (which use this same "wrong" gamma handling),
and over 10 years worth of povray renderings made by thousands of people
out there, that attest to this. Hence using assumed_gamma 2.2 is not such
a big deal in practice. Feel free to disagree.

> The main misconception seems to be that you assume there is something 
> like a color that is the inherent property of an object.

  I don't understand what that has to do with what I wrote. I also don't
understand what it is that you are trying to say.

> And all I have to say about "small inaccuracies" and "nobody will notice 
> in practice" is that my experience simply shows the opposite.

  Feel free to point out a few examples out of the millions of images out
there which have been, technically speaking, rendered with the wrong
gamma settings, and which obviously look wrong. You can start with the
POV-Ray hall of fame.

-- 
                                                          - Warp

