I still don't understand how I should use assumed_gamma. I've read the
documentation and it seems to conflict with itself.
"For PC systems, the most common display gamma is 2.2, while for scenes
created on Macintosh systems should use a scene gamma of 1.8. Another gamma
value that sometimes occurs in scenes is 1.0."
then later it says:
"For new scenes, you should use an assumed gamma value of 1.0 as this models
how light appears in the real world more realistically."
So is there a preferred setting? I've noticed that with assumed_gamma 1.0,
I end up having textures with extremely low rgb values (< 0.1), just to get
the texture dark enough. I've also noticed that the T_Wood textures
generally look really bad at assumed_gamma 1.0 under most lighting
conditions.
I've been using higher values recently (around 2.0). Does that mean that
light is not modelled realistically?
--
Slash
Slashdolt wrote:
> I still don't understand how I should use assumed_gamma. I've read the
> documentation and it seems to conflict with itself.
>
> "For PC systems, the most common display gamma is 2.2, while for scenes
> created on Macintosh systems should use a scene gamma of 1.8. Another gamma
> value that sometimes occurs in scenes is 1.0."
>
> then later it says:
>
> "For new scenes, you should use an assumed gamma value of 1.0 as this models
> how light appears in the real world more realistically."
>
> So is there a preferred setting? I've noticed that with assumed_gamma 1.0,
> I end up having textures with extremely low rgb values (< 0.1), just to get
> the texture dark enough. I've also noticed that the T_Wood textures
> generally look really bad at assumed_gamma 1.0 under most lighting
> conditions.
>
> I've been using higher values recently (around 2.0). Does that mean that
> light is not modelled realistically?
>
Set assumed_gamma to 1.0 in your code. Then set the display_gamma in
your master ini to a corresponding value, usually between 1.8 and 2.2 in
order to get the look you want. Then in theory, everyone else sees the
same thing you do. I use Display_Gamma=1.8 in my ini file.
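For example, a minimal sketch of that setup (1.8 is just the value I happen to use; pick whatever matches your monitor):

In the scene file:

global_settings {
  assumed_gamma 1.0  // render in a linear color space
}

And one line in the master povray.ini:

; set this to your monitor's gamma
Display_Gamma=1.8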
Slashdolt wrote:
>I still don't understand how I should use assumed_gamma. I've read the
>documentation and it seems to conflict with itself.
>
>"For PC systems, the most common display gamma is 2.2, while for scenes
>created on Macintosh systems should use a scene gamma of 1.8. Another gamma
>value that sometimes occurs in scenes is 1.0."
>
>then later it says:
>
>"For new scenes, you should use an assumed gamma value of 1.0 as this models
>how light appears in the real world more realistically."
>
>So is there a preferred setting? I've noticed that with assumed_gamma 1.0,
>I end up having textures with extremely low rgb values (< 0.1), just to get
>the texture dark enough. I've also noticed that the T_Wood textures
>generally look really bad at assumed_gamma 1.0 under most lighting
>conditions.
>
>I've been using higher values recently (around 2.0). Does that mean that
>light is not modelled realistically?
Join the club!
My own (mis)understanding is that an assumed_gamma of 1 should be correct
PROVIDED your display_gamma setting is set correctly for your monitor.
If I use the test scene to determine my monitor's gamma setting it usually
comes up with around 1.8-1.9, but when I set display_gamma to that all my
textures are too light. I end up with an assumed_gamma somewhere between
1.25 & 1.5 when everything looks right.
If I use neither assumed_gamma nor display_gamma, the result is roughly the
same as no display_gamma and an assumed_gamma of 2.2. (I really have no
idea what that means, but assume it means my monitor is probably about
average...)
Despite reading numerous postings on this subject I still haven't figured
out how it's supposed to work. Meanwhile, I've been using whatever works
for the specific scene...:-/
RG - and despite Douglas Adams' take on the subject, assumed_gamma 42 didn't
work
P.S. To further complicate things, some finish modifiers seem to be affected
more than the actual color. Brilliance in particular is dramatically
affected by changing assumed_gamma.
Yup, my mistake. I completely didn't realize that it said "display gamma"
in the first paragraph. Maybe it could be worded better, but my mistake
nevertheless...
I nearly posted this in "new users"...
--
Slash
"Jim Charter" <jrc### [at] aolcom> wrote in message
news:3ed7c0ff$1@news.povray.org...
> Slashdolt wrote:
> > I still don't understand how I should use assumed_gamma. I've read the
> > documentation and it seems to conflict with itself.
> >
> > "For PC systems, the most common display gamma is 2.2, while for scenes
> > created on Macintosh systems should use a scene gamma of 1.8. Another
gamma
> > value that sometimes occurs in scenes is 1.0."
> >
> > then later it says:
> >
> > "For new scenes, you should use an assumed gamma value of 1.0 as this
models
> > how light appears in the real world more realistically."
> >
> > So is there a preferred setting? I've noticed that with assumed_gamma
1.0,
> > I end up having textures with extremely low rgb values (< 0.1), just to
get
> > the texture dark enough. I've also noticed that the T_Wood textures
> > generally look really bad at assumed_gamma 1.0 under most lighting
> > conditions.
> >
> > I've been using higher values recently (around 2.0). Does that mean
that
> > light is not modelled realistically?
> >
> Set assumed_gamma to 1.0 in your code. Then set the display_gamma in
> your master ini to a corresponding value, usually between 1.8 and 2.2 in
> order to get the look you want. Then in theory, everyone else sees the
> same thing you do. I use Display_Gamma=1.8 in my ini file
>
>"For new scenes, you should use an assumed gamma value of 1.0 as this models
>how light appears in the real world more realistically."
>So is there a preferred setting? I've noticed that with assumed_gamma 1.0,
>I end up having textures with extremely low rgb values (< 0.1), just to get
>the texture dark enough. I've also noticed that the T_Wood textures
>generally look really bad at assumed_gamma 1.0 under most lighting
>conditions.
If you set both the display gamma and the assumed gamma to 1.0, POV-Ray does
not apply any gamma correction to the image: the final pixel rgb values are
exactly the values you use. To gamma correct the image, set the display gamma
to your monitor's gamma. This is needed to correct the image brightness,
because monitors show lower intensities too dark.
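Roughly, the way I understand the correction (just a sketch to illustrate the
relation, not POV-Ray's actual code): the value written to the image is the
computed value raised to the power assumed_gamma / display_gamma, so with both
at 1.0 nothing is changed.

// illustration only, values here are arbitrary
#declare Assumed_Gamma = 1.0;
#declare Display_Gamma = 1.8;
#declare Computed  = 0.5;   // a linear value POV-Ray calculated for a pixel
#declare Displayed = pow(Computed, Assumed_Gamma / Display_Gamma);
#debug concat("displayed value: ", str(Displayed, 0, 4), "\n")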
Matti
Jim Charter wrote in message <3ed7c0ff$1@news.povray.org>...
>Slashdolt wrote:
>> "For PC systems, the most common display gamma is 2.2, while for scenes
>> created on Macintosh systems should use a scene gamma of 1.8. Another
>> gamma value that sometimes occurs in scenes is 1.0."
[snip]
>Set assumed_gamma to 1.0 in your code. Then set the display_gamma in
>your master ini to a corresponding value, usually between 1.8 and 2.2 in
>order to get the look you want. Then in theory, everyone else sees the
>same thing you do. I use Display_Gamma=1.8 in my ini file.
Do newer CRT monitors still have gamma values in that range (1.8-2.2)?
I was a little surprised when I got a new monitor and measured the gamma
to approximately 1.3, but this seems to be the correct value for it.
I have immediate access to three monitors, and only one of them has a gamma
around 2 (a Viewsonic 15"). The others are both ADI Microscan models
and have gammas in the range 1.2-1.3. The ADIs are both several years
newer than the Viewsonic.
I guess a factor which really plays into
display_gamma is the actual brightness
you set your monitor to. Its glowing effect
alters how black (which stays black no matter
what) and white (which may be more or less
bright) interact to get to a gray level.
Also, the better a monitor, the less
neighbouring pixels will pour their colors into
the surrounding area (I'm talking about the
physical effect of the electrons on the phosphor
layer, though I'm not too sure if it's still phosphor...).
Since technology keeps evolving, I guess
newer monitors may be better at staying
at a desired gamma of 1.
But perhaps I have a distorted perspective
on gamma and monitors, and what I'm saying
is complete crap... :-)
--
Tim Nikias v2.0
Homepage: http://www.digitaltwilight.de/no_lights
Email: Tim### [at] gmxde
>
> >> "For PC systems, the most common display gamma is 2.2, while for scenes
> >> created on Macintosh systems should use a scene gamma of 1.8. Another
> >> gamma value that sometimes occurs in scenes is 1.0."
>
> [snip]
>
> >Set assumed_gamma to 1.0 in your code. Then set the display_gamma in
> >your master ini to a corresponding value, usually between 1.8 and 2.2 in
> >order to get the look you want. Then in theory, everyone else sees the
> >same thing you do. I use Display_Gamma=1.8 in my ini file.
>
> Do newer CRT monitors still have gamma values in that range (1.8-2.2)?
> I was a little surprised when I got a new monitor and measured the gamma
> to approximately 1.3, but this seems to be the correct value for it.
> I have immediate access to three monitors, and only one of them has a
> gamma around 2 (a Viewsonic 15"). The others are both ADI Microscan models
> and have gammas in the range 1.2-1.3. The ADIs are both several years
> newer than the Viewsonic.
"Tim Nikias v2.0" <tim### [at] gmxde> schrieb im Newsbeitrag
news:3edba08f@news.povray.org...
> [..]
> But perhaps I'm having a distorted perspective
> on gamma and monitors, and what I'm saying
> is complete crap... :-)
>
Sorry Tim, but yes - that's complete crap!
But it really seems that the point of gamma correction is still
a mystery to many Povers. So, one more try to explain:
The gamma of a CRT monitor refers only to the physical fact that a
cathode ray tube does not respond to the input voltage in a linear
way. This is not a matter of quality, it is just a simple fact.
It has nothing to do with the way pixels blend together (that
is significant for sharpness), and also nothing to do with the
brightness and contrast settings (both are linear and can be used
to adapt better to the surrounding lighting conditions). Typical
values for CRTs are in the range of 1.8 to 2.7, so 2.2 is usually
a quite good average value.
Do NOT trust any visual test too much (like the well-known gray
pattern test). The only way to get the exact gamma value is to
measure it with a spectrophotometer or (much easier and
cheaper) to have a look at the homepage of the manufacturer of
your monitor.
But in fact, the exact value of your own monitor is less important,
because the default gamma for Windows/Linux is 2.2, and as both
OSes do no gamma correction, every application that tries to handle
images has to handle gamma for itself and usually also
assumes a default gamma of 2.2. In the Mac world this value
is 1.8, and if you are lucky and work on an SGI machine you do not
have to care at all, because the operating system handles gamma
correction (and every application that runs there knows that).
So all is quite easy: if you are using Windows or Linux, put display
gamma 2.2 into the ini file; Mac users should use 1.8.
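So, roughly, the relevant ini line would be one of these (just an
illustration of the same advice, adjust to your own setup):

; Windows or Linux
Display_Gamma=2.2

; Mac
Display_Gamma=1.8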
But when you want to exchange your image (show it on a web
page or in a newsgroup), I would say the compromise of 2.0 is
the best idea if you will finally use the JPEG format. But if you are
using PNG files, you should use the setting that corresponds to
your OS as mentioned above, because PNG is able to store the
gamma information and will display on every OS in the same way
(at least if the viewer application handles gamma correctly - and not
every image viewer does).
In the professional world, file formats are usually used that are able
to include not only the gamma information but the complete ICC profile,
but this is a completely different story...
so long
-Ive
Thanks for the information, and thanks to everyone else for providing input.
However, if you look at <povray>\scenes\incdemo\woods\woods2.pov, they set
the "assumed_gamma" to 2.2. If you use assumed_gamma 1.0 for any of the
T_Wood textures, they are way too bright, and many look purplish. For that
reason, I generally don't use any of them, and create my own wood textures
instead.
So... I guess that brings me back to my original question.
--
Slash
Just to make it clear, I was talking all the time about the display_gamma
setting that should be in the POV-Ray ini file. The assumed_gamma in the
global_settings statement of your scene file should be 1.0, as noted in the
docs, to make sure POV-Ray does the lighting (and also radiosity) calculation
within a linear color space (that's the way it is in the "real world").
> However, if you look at <povray>\scenes\incdemo\woods\woods2.pov, they set
> the "assumed_gamma" to 2.2. If you use assumed_gamma 1.0 for any of the
> T_Wood textures, they are way too bright, and many look purplish. For that
> reason, I generally don't use any of them, and create my own wood textures
> instead.
Yes, because they are quite *old*. I guess some of them go back to the time of
DKBTrace, the predecessor of POV-Ray.
> So... I guess that brings me back to my original question.
>
I have done it this way:
global_settings {
assumed_gamma 1.0
}
#declare OldGamma = 2.2; // to compensate for old non-linear color definitions

// some helper macros for conversion
#macro RGB(C)
  rgb <pow(C.red, OldGamma), pow(C.green, OldGamma), pow(C.blue, OldGamma)>
#end
#macro RGBF(C)
  rgbf <pow(C.red, OldGamma), pow(C.green, OldGamma), pow(C.blue, OldGamma),
        C.filter>
#end
#macro RGBT(C)
  rgbt <pow(C.red, OldGamma), pow(C.green, OldGamma), pow(C.blue, OldGamma),
        C.transmit>
#end
and then use search'n'replace to change e.g. the color statements from woodmaps.inc
like this:
#declare M_Wood1A =
color_map {
  [0.0, 0.1 color RGB(<0.88, 0.60, 0.40>)
            color RGB(<0.88, 0.60, 0.40>)]
  [0.1, 0.9 color RGB(<0.88, 0.60, 0.40>)
            color RGB(<0.60, 0.30, 0.20>)]
  [0.9, 1.0 color RGB(<0.60, 0.30, 0.20>)
            color RGB(<0.60, 0.30, 0.20>)]
}
This is done within a few seconds, and they still work for old scenes in the old
way by setting

global_settings {
  assumed_gamma 2.0 // or whatever
}
#declare OldGamma = 1;
You can even use these macros if you are used to thinking about colors in the
non-linear way but want to put assumed_gamma 1.0 into global_settings for some
reason.
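For example, a quick made-up use of the RGB macro in a pigment (the declared
color name and the sphere are only for illustration):

// hypothetical example: reuse an "old" 2.2-gamma color definition
// in a scene that has assumed_gamma 1.0
#declare OldStyleBrown = RGB(<0.60, 0.30, 0.20>);

sphere {
  <0, 1, 0>, 1
  pigment { color OldStyleBrown }
  finish { ambient 0.1 diffuse 0.8 }
}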
hope this helps
-Ive