"stm31415" <sam### [at] cscom> wrote:
> ...1.0 is not, in
> fact, a gamma I can get, so I have no way of comparing - but as I
> understand it, if I could reach such a gamma, the top image would look like
> 2.0 did at 2.0.
Hmm. I didn't mean for anyone to have to switch his or her monitor/OS gamma
to equal 1.0 (or 2.0) just to look at my image. (1.0 makes things look pretty
horrible, IMHO.) The test image I posted is just a .jpg file, combined (in
Photoshop) from two .bmp POV renders of my scene, with no embedded gamma
info. It should show up in any typical image viewer *more or less* as I
intended. My original question...and I apologize if it wasn't clear...should
have been "Which rendered image looks more realistic on your own system?"
Any differences in the OVERALL image quality from system to system,
gamma-wise, would be "smaller" than the differences between my two renders
(assuming folks are looking at my image on a system with a gamma somewhere
between 1.8 and 2.2.)
>
> Well, of course, the trick is what *our* monitors are working with. When I
> play with my gamma to get it to 2.0 (which is a wierd value to pick, btw)...
> It might make more sense to use gammas such as 1.8 and 2.2, so you can get
> opinions from people with the proper settings.
Weird? Nope. As I mentioned in my gamma discussion, my own PC's monitor/OS
system gamma is set at 2.0, as a "compromise" between the Mac's 1.8 and
the PC's "normal" 2.2 (though of course, *most* of the world's computer
systems ARE set at 2.2.) This makes perfect sense if I want to create
image files (though NOT .png files) that will look *just about* the same on
others' systems. Before deciding on 2.0, I did MANY tests, comparing lots
of different images on a system gamma of 1.8 vs. 2.2. (I have both a Mac
and a PC, each with its own monitor, so that was relatively easy.) My
choice of 2.0 seems to work quite well.
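For anyone curious, here's a rough sketch of the arithmetic behind that
compromise. It assumes the usual simple power-law model of display gamma
(a simplification; real monitors also differ in black level and color):

```python
# Simple power-law model of display gamma: normalized displayed
# luminance = (pixel / 255) ** gamma for an 8-bit pixel value.
def displayed_luminance(pixel, gamma):
    """Normalized luminance a display with the given system gamma
    produces for an 8-bit pixel value."""
    return (pixel / 255.0) ** gamma

# How a mid-gray pixel (128) comes out on 1.8, 2.0, and 2.2 systems:
for gamma in (1.8, 2.0, 2.2):
    print(f"gamma {gamma}: {displayed_luminance(128, gamma):.3f}")
```

The mid-tone shift from 2.0 to either neighbor is roughly half the shift
between 1.8 and 2.2, which is why an image tuned on a 2.0 system looks
only mildly off on either a Mac or a typical PC.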
Ken