Yup, my mistake. I completely missed that it said "display gamma"
in the first paragraph. Maybe it could be worded better, but my mistake
nevertheless...
I nearly posted this in "new users"...
--
Slash
"Jim Charter" <jrc### [at] aolcom> wrote in message
news:3ed7c0ff$1@news.povray.org...
> Slashdolt wrote:
> > I still don't understand how I should use assumed_gamma. I've read the
> > documentation and it seems to conflict with itself.
> >
> > "For PC systems, the most common display gamma is 2.2, while for scenes
> > created on Macintosh systems should use a scene gamma of 1.8. Another
gamma
> > value that sometimes occurs in scenes is 1.0."
> >
> > then later it says:
> >
> > "For new scenes, you should use an assumed gamma value of 1.0 as this
models
> > how light appears in the real world more realistically."
> >
> > So is there a preferred setting? I've noticed that with assumed_gamma 1.0,
> > I end up having textures with extremely low rgb values (< 0.1), just to
> > get the texture dark enough. I've also noticed that the T_Wood textures
> > generally look really bad at assumed_gamma 1.0 under most lighting
> > conditions.
> >
> > I've been using higher values recently (around 2.0). Does that mean that
> > light is not modelled realistically?
> >
> Set assumed_gamma to 1.0 in your code. Then set Display_Gamma in
> your master ini to a corresponding value, usually between 1.8 and 2.2, in
> order to get the look you want. Then in theory, everyone else sees the
> same thing you do. I use Display_Gamma=1.8 in my ini file.
>
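For the archive, here is a minimal sketch of the setup Jim describes. It assumes
POV-Ray 3.5 syntax; the file names and the 1.8 value are only examples, so match
Display_Gamma to your own monitor:

  // scene file (e.g. mytest.pov)
  global_settings {
    assumed_gamma 1.0  // keep the scene's light calculations linear
  }

  ; master ini (e.g. povray.ini); a ';' starts a comment in ini files
  Display_Gamma=1.8    ; correct for the monitor at display time

With that split, the scene itself stays gamma-neutral and the correction for any
particular monitor happens only at display time, which is why everyone else
should see roughly the same image you do.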