In article <web.437683731f2346275e2aed360@news.povray.org>,
rgo### [at] lanset com says...
>
> "scott" <spa### [at] spam com> wrote:
>
> > They don't. Most PC monitors (the ones I've measured) have a gamma of
> > almost exactly 2.2. This means, if you tell it to display RGB 128,128,128
> and measure the brightness, it will be 0.5^(2.2) ≈ 0.22 of the brightness of
> > RGB 255,255,255.
>
> Except... (there's always those damn exceptions!) it can't be strictly a
> matter of your monitor, because I have a dual boot PC with Linux & Windows,
> yet I cannot get the two to agree on display_gamma, even though it's the
> same hardware. With both set the same, what looks good in Linux will be too
> dark in Windows.
>
> And 2.2 is out of the question! If I set my display_gamma to 2.2, it doesn't
> look right on anything else, and every scene I've ever rendered comes out
> dark and oversaturated.
>
As I mentioned in another post, the driver, the card, and the display all
make a difference. The driver may be intentionally telling the card to
produce a brighter image, the card may inherently produce a stronger signal,
and the display may not 'actually' be 2.2. Pros have to hand-tune everything
so it 'is' 2.2. Everyone else gets stuck guessing.
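For anyone who wants to check scott's arithmetic above, here's a quick
Python sketch. It assumes an idealized pure power-law response with a
single exponent, which, per the whole point of this thread, real
driver/card/display chains often don't match:

```python
# Sketch: fraction of full-white brightness an ideal power-law display
# produces for a given 8-bit level. Real hardware often deviates from
# a clean 2.2 curve, so treat this as the textbook model only.
def relative_brightness(level, gamma=2.2):
    """Relative luminance of an 8-bit level, assuming output = (level/255)^gamma."""
    return (level / 255.0) ** gamma

# RGB 128,128,128 lands at roughly 22% of full white, not 50%:
print(round(relative_brightness(128), 3))
print(relative_brightness(255))  # full white is 1.0 by construction
```

Swapping in a different exponent for `gamma` shows why scenes tuned on one
box look dark on another: the same pixel values map to different brightness.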
--
void main () {
call functional_code()
else
call crash_windows();
}