Ever wondered how on earth you could verify that your POV gamma settings are ok?
This scene is the answer.
Render this scene at 800x600 with strong anti-aliasing (+am2 +r3 +a0.1 does
nicely) and preview enabled.
When viewed with squinted eyes (or without glasses, if you need them for
computer work), the shot should seem to show nothing but a plain gray
rectangle. If that is the case, your gamma settings are ok. Use these
(including the "assumed_gamma" setting) for *ALL* scenes you do. If that makes
your other scenes look too bright or too dark, it's *NOT* the gamma's fault,
but the lighting conditions'. Note that wrong gamma settings will mess up color
saturation in your scenes in a nonlinear, unrealistic way.
Make sure you do *NOT* tamper with the "assumed_gamma 1.0" statement if you can
help it. If you absolutely must change it to get proper results with both
preview and file output, make sure you use the same value in your other scenes
as well.
With POV 3.6, the setting you will want to toy around with is "Display_Gamma".
With POV 3.7, you also have the possibility of toying around with "File_Gamma"
separately to change your output files' gamma. This may be especially handy if
your display gamma is something other than 2.2, but you want to create files
ready for the internet, where a display gamma of 2.2 is a quasi-standard.
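Why the squint test works: blurring a black & white checker always averages to
50% luminance (black and white are fixed points of any gamma curve), while a
flat "rgb 0.5" patch is shifted by whatever gamma error remains in the viewing
chain. A minimal Python sketch of that arithmetic (illustration only, not POV
syntax; "net_gamma" is my name for the uncorrected residual gamma):

```python
# Hypothetical sketch: black and white blur to 50% luminance regardless
# of gamma, but a flat 0.5 grey is shifted by any *uncorrected* gamma.

def displayed_luminance(value, net_gamma):
    """Luminance a pixel value produces when the viewing chain has an
    uncorrected net gamma (1.0 = perfectly corrected)."""
    return value ** net_gamma

def checker_blur(net_gamma):
    # Squinting averages the black and white squares' luminances.
    return (displayed_luminance(0.0, net_gamma) +
            displayed_luminance(1.0, net_gamma)) / 2

for g in (1.0, 2.2):
    print(g, checker_blur(g), displayed_luminance(0.5, g))

# With g = 1.0 both come out 0.5; with an uncorrected g = 2.2 the flat
# grey drops to 0.5**2.2 (about 0.218) and looks darker than the checker.
```

So the sphere melting into the checkered background tells you the residual
gamma of your whole chain is 1.0, i.e. your settings are consistent.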
--------------------------------------------------
// +w800 +h600 +am2 +r3 +a0.1
global_settings {
  assumed_gamma 1.0
}
// ----------------------------------------
camera {
  location  <0.0, 0.0, -4.0>
  direction 1.5*z
  right     x*image_width/image_height
  look_at   <0.0, 0.0, 0.0>
}
// ----------------------------------------
plane {
  -z, -1
  texture {
    pigment {
      checker
      color rgb 1
      color rgb 0
      scale .02
    }
    finish {
      ambient 1
      diffuse 0
      specular 0
    }
  }
  rotate x*45
}
sphere {
  0.0, 1
  texture {
    pigment { color rgb 0.5 }
    finish {
      ambient 1
      diffuse 0
      specular 0
    }
  }
}
--------------------------------------------------
Yo,
"clipka" <nomail@nomail> wrote in message
news:web.4986021ec6af8354f8450bd80@news.povray.org...
> Ever wondered how on earth you could verify that your POV gamma settings
> are ok?
> This scene is the answer.
Perhaps I'll end up looking foolish .... Oh well!! here goes.
..... this is rather like the example that's in the docs .... right?
> When viewed with squinted eyes (or without glasses, if you need them for
> computer work), the shot should seem to show nothing but a plain gray
> rectangle. If that is the case, your gamma settings are ok. Use these
> (including the "assumed_gamma" setting) for *ALL* scenes you do. If that
> makes
> your other scenes look too bright or too dark, it's *NOT* the gamma's
> fault,
> but the lighting conditions'. Note that wrong gamma settings will mess up
> color
> saturation in your scenes in a nonlinear, unrealistic way.
I wasn't able to get the results that you seem to be alluding to .... the
idea is to have the sphere appear to meld into the background, correct?
> Make sure you do *NOT* tamper with the "assumed_gamma 1.0" statement if
> you can
> help it. If you absolutely must to get proper results with both preview
> and
> file output, make sure you use the same value in your other scenes as
> well.
>
> With POV 3.6, the setting you will want to toy around with is
> "Display_Gamma".
> With POV 3.7, you also have the possibility of toying around with
> "File_Gamma"
> separately to change your output files' gamma. This may be especially
> handy if
> your display gamma is something other than 2.2, but you want to create
> files
> ready for the internet, where a display gamma of 2.2 is a quasi-standard.
I'm running v3.6, and using the example in the docs I've arrived at
Display_Gamma = 2.5 for my display and have been using that value ever since. I
understand how this works just by simple empirical testing .... a lower value
for D_G makes the image darker .... higher, lighter. It used to be that some
folks had a CRT, some an LCD .... nowadays it's pretty much all LCD .... does
that make a diff? I'm using the color profile that came with the display; my
vid card (nvidia) has an applet that allows adjusting the gamma curve, and I've
left that alone (default says gamma=1.0).
You know, with all the variables it's no wonder that gamma correction is a
frustrating exercise, to say the least.
Jim
"Jim Holsenback" <jho### [at] hotmailcom> wrote:
> I'm running v3.6 and using the example in docs I've arrived at
> Display_Gamma = 2.5 for my display and using that value forever. I
> understand how this works just by simple empirical testing .... lower value
> for D_G makes image darker .... higher lighter.
Well, to be precise, it only does that for the midtones. (That's why comparing a
black & white checkerboard pattern with uniform 50% grey shows you whether your
gamma settings are ok.)
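That midtone comparison can also be turned around to *measure* a display's
gamma: if a solid grey of pixel value v visually matches the blurred checker
(50% luminance), then v**gamma == 0.5. A hypothetical Python sketch (function
name is mine, not from any POV tool):

```python
# Hypothetical sketch: solve v**gamma == 0.5 for gamma, given the pixel
# value v (0..1) of the solid grey that matches a blurred b/w checker.
import math

def gamma_from_match(v):
    return math.log(0.5) / math.log(v)

# Sanity check: a display with gamma 2.2 would match at v = 0.5**(1/2.2)
print(round(gamma_from_match(0.5 ** (1 / 2.2)), 2))
```

This is essentially what the gamma tuning charts do: they present a range of
candidate greys next to a checker and let your eye pick the match.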
> It used to be that some
> folks have crt some lcd .... now a days it's pretty much lcd .... does that
> make a diff?
Not in practice. LCDs are designed to behave quite similarly to a CRT when it
comes to gamma. After all, people who don't know about these issues expect
colors on the LCD to look similar to their old CRT.
> I'm using the color profile that came with the display, my vid
> card (nvidia) has an applet that allows to adjust gamma curve, I've left
> that alone (default says gamma=1.0).
Yeah, that's another point to tweak. I'd recommend lowering that value a bit,
to get your total display gamma (operating system + graphics card + LCD) to
2.2, so that what you see is what other people (on average) get. The majority
of images out there on the net are probably tuned for a gamma of 2.2.
> You know with all the variables it's no wonder that (gamma correction) is a
> frustrating excerise to say the least.
Definitely so. And we haven't even mentioned white point and black point yet.
Theoretically, to optimize your display gamma (if you don't have expensive
calibration hardware and software), you should proceed something like this:
- Set your operating system's display settings to use the display profile that
came with your hardware, to (hopefully) get the R/G/B "phosphor" colors right.
- Using your graphics card tools, tune your graphics card output so that it uses
the greatest "swing" for the voltages (or digital values) sent to the display
hardware, to make sure you get the best "resolution". (I presume that this is
usually the standard setting, but I might be mistaken.)
- Tune your display hardware (CRT or LCD) so that 0% black is as dark as your
display can possibly get while 5% grey still looks slightly brighter, and 100%
white is as bright as your display can possibly get while 95% grey still looks
slightly darker (black & white point calibration), and white is some "standard"
white (whatever that may be; speaking of color "temperature" here).
- If you couldn't get the black and/or white point right with the display
hardware settings, use your graphics card tools to do the rest.
(Alternatively, you might want to keep your display hardware set to standard
values, so you can easily reset them in case you or someone else accidentally
messes them up, and use the graphics card tools alone to set the black and
white point.)
- Using your graphics card tools, tune your display system (operating system +
graphics card + display hardware) for a total gamma of 2.2 (using e.g. the
"tuning chart" in the POV manual), to best match what other people use on
average.
- Set your POV .ini settings (Display_Gamma and - in case of 3.7 - File_Gamma)
to 2.2 to match your newly optimized display setting of 2.2, and the expected
"other people's displays" setting of 2.2 for the file output.
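The point of the last two steps is that file encoding and display decoding
should cancel out. A minimal Python sketch of that round trip (my own
illustration of the idea, not POV's actual internals; function names are
hypothetical):

```python
# Hypothetical sketch of the round trip the steps above aim for: the
# file is encoded for a target gamma, the display chain decodes with
# its own gamma, and the two cancel when both are 2.2.

def encode(linear, file_gamma=2.2):
    # what a File_Gamma-style setting does to the written pixel values
    return linear ** (1 / file_gamma)

def display(value, display_gamma=2.2):
    # what the display chain does to the stored pixel values
    return value ** display_gamma

seen = display(encode(0.5))
print(round(seen, 6))  # 0.5: the viewer sees the intended luminance
```

If File_Gamma and the viewer's display gamma differ, the exponents no longer
cancel and midtones shift, which is exactly what the checker test detects.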
Thank you, Christoph. Works perfectly.
Thomas
"clipka" <nomail@nomail> wrote in message
news:web.4986021ec6af8354f8450bd80@news.povray.org...
> Ever wondered how on earth you could verify that your POV gamma settings
> are ok?
> This scene is the answer.
You know, it's funny how you can read something and not really get it, then
find another source with the same information (slightly different wording)
and the light bulb goes on!!!!! ..... anyway, for the benefit of others who
might have had the same disconnect as I did:
http://www.siggraph.org/education/materials/HyperGraph/color/gamma_correction/gamma_intro.html
In case this wraps and breaks the link, I googled "gamma correction tutorial"
.... the SigGraph passage was the top hit.
I really wanted to argue 2.5 vs 2.2 as I had arrived at 2.5 (using the
example in the docs) and had been using that forever. I just didn't feel I
knew enough about it to present my case .... well it's a moot point now as
(haha) I get it!
Cheers
"Jim Holsenback" <jho### [at] hotmailcom> wrote:
> I really wanted to argue 2.5 vs 2.2 as I had arrived at 2.5 (using the
> example in the docs) and had been using that forever. I just didn't feel I
> knew enough about it to present my case .... well it's a moot point now as
> (haha) I get it!
AFAIK the 2.2 gamma correction is part of the sRGB color space definition (which
in turn has become the standard color space for the web), possibly as a
compromise between the typical Mac and PC gammas.
(I thought PC was typically 2.2, but it seems I'm wrong. In fact, I tuned my
own display to 2.2 via software.)
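To be precise, sRGB doesn't use a pure power-2.2 curve: the standard decoding
formula is piecewise, with a short linear toe and a 2.4-exponent power segment,
and the combination only *approximates* an overall gamma of about 2.2. A
sketch of that standard formula:

```python
# The sRGB decoding curve: linear toe below 0.04045, power segment above.

def srgb_to_linear(c):
    """Decode an sRGB component (0..1) to linear light."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

# Mid-grey 0.5 decodes to roughly 0.214, close to 0.5**2.2 (about 0.218),
# which is why "sRGB" and "gamma 2.2" are used almost interchangeably.
print(round(srgb_to_linear(0.5), 3))
```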