On 28.06.2010 12:48, Gyscos wrote:
> But isn't it possible to query the right value for Display_Gamma from the OS ? I
> mean, how do other viewer software do ?
That would be pretty cool. However, on a typical end user's system, the
OS doesn't know either, or may even hold wrong information about it.
On more professional systems, where the user (or admin) cares about
display nonlinearity, you'll typically have calibrated displays with a
corresponding ICC profile; but that doesn't help either, because an ICC
profile is much more complex than a simple gamma curve, and POV-Ray
isn't color-profile-aware. Not yet.
> Also, about File_Gamma, it is used for the gamma-encoding that is applied when
> creating the file. When reading the file, it is decoded by the software, and
> then sent to the OS.
> I understand that, between the file being sent to the OS and the pixel being
> illuminated on the screen, there can be the gamma-transformation, caused by the
> graphic card, system, screen, ... because of some non-linearity.
> Now, before that, how does the software decode the file ?
> Does it do a reverse-encoding with the gamma value stored in the file ? Or does
> it just send it linearly to the system ?
For JPEG, BMP or the like, the software will essentially send the data
right to the display subsystem unchanged, expecting the display
subsystem's inherent non-linearity to take care of the gamma-decoding.
For PNG, the data will theoretically be gamma-decoded by the software,
then gamma pre-corrected by the same software to fit the display
subsystem's inherent gamma (having the same effect as gamma-encoding the
data again, though possibly with a different gamma value), and finally
the data passed to the display subsystem.
In practice, the initial gamma-decoding and subsequent gamma
pre-correction might be performed in one single step, taking advantage
of the fact that (x^A)^B = x^(A*B).
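As a minimal sketch of that pipeline (assuming plain power-law gammas; real PNG decoding reads the exponent from the gAMA chunk, and the function names here are hypothetical):

```python
def decode_then_precorrect(encoded, file_gamma=2.2, display_gamma=2.2):
    """Two-step version: gamma-decode the file data to linear light,
    then gamma pre-correct for the display subsystem's inherent gamma."""
    linear = encoded ** file_gamma           # undo the file's gamma encoding
    return linear ** (1.0 / display_gamma)   # pre-correct for the display

def combined(encoded, file_gamma=2.2, display_gamma=2.2):
    """Single-step equivalent, using (x^A)^B == x^(A*B)."""
    return encoded ** (file_gamma / display_gamma)
```

When file gamma and display gamma happen to match, the combined exponent is 1.0 and the data passes through unchanged, which is exactly the JPEG/BMP behavior described above.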
> I understood the Display_Gamma was mainly to compensate the system+GC+screen non
> linearity.
>
> The question is : What should the File_Gamma value compensate ? The software
> decoding ? The system+GC+screen non linearity ? Or the eye's perception ?
In a sense, File_Gamma is intended to compensate for the nonlinearity
of the eye's response, i.e. the fact that the eye can tell apart two
dark colors more easily than two bright ones.
In the PNG file format, that is its only role.
In JPEG, BMP or the like, File_Gamma does double duty as gamma
pre-correction for the intended output display's inherent nonlinearity,
because that is customary for those file formats.