On 02.06.2011 19:54, studionase wrote:
> Yeah, it works. I put in:
>
> global_settings { assumed_gamma 2.0 }
Chances are you should be using assumed_gamma 2.2 instead for exact
reproduction of 3.6 output.
> But I didn't understand what the File_Gamma exactly does.
Image files have a long tradition of being /gamma encoded/, that is, a
pixel value of e.g. 102 does /not/ correspond to a physical brightness
of 102/255 = 40% maximum brightness, but rather to a physical brightness
of 0.40^gamma, where gamma is typically around 2.2, i.e. about 13%
maximum brightness.
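As a rough illustration of that decoding step (a Python sketch, not POV-Ray code; 2.2 is just the typical encoding gamma mentioned above):

```python
# Decode a gamma-encoded 8-bit pixel value to physical (linear) brightness.
def decode(pixel, gamma=2.2):
    return (pixel / 255.0) ** gamma

linear = decode(102)      # (102/255) ** 2.2
print(round(linear, 2))   # ~0.13, i.e. about 13% of maximum brightness
```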
One reason for this is that the human perception system's ability to
discern brightness differences depends on the absolute brightness, so
under typical viewing conditions the difference between a brightness
level of e.g. 13% and 14% is perceived as roughly as strong as a
difference between e.g. 97% and 100%, although the latter has thrice the
absolute difference in brightness.
Historically, this /gamma encoding/ double-featured as a /gamma
pre-correction/, as the old VGA standard converted a screen content
pixel value of e.g. 102 to a signal voltage of 40%, but the CRT
display's inherent non-linearity would lead to this being
displayed as only about 13% maximum brightness.
Having image files encoded with the same gamma that would have to be
applied to correct for the CRT display's non-linearity made it possible
to simply take the image file contents and paste them into the VGA
screen buffer.
Many file formats still share this double-featuring of gamma encoding as
gamma pre-correction. However, as no two computers have exactly the same
non-linear display behaviour, and with advances in color management,
this practice no longer matches the state of the art, and sophisticated
image viewers increasingly perform additional gamma
correction before writing the data to the screen buffer, in order to
compensate for the non-linear behaviour of a /particular/ combination of
graphics card & display.
Consequently, gamma encoding (which is a file-format-specific process)
and gamma pre-correction (which is a display-specific process) have
begun to be treated as separate steps. Some file formats, such as PNG,
fully support this separation by including explicit information about
which encoding gamma was applied, so that a reading application can properly
gamma-decode the image and then take care of the necessary
display-specific gamma correction itself.
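The two steps such a gamma-aware viewer performs could be sketched like this (plain Python, purely illustrative; the gamma values here are assumptions, and a real viewer would read the encoding gamma from the file, e.g. from PNG's gAMA chunk):

```python
# Sketch of a gamma-aware viewer's pipeline:
# 1. decode using the encoding gamma stored in the file,
# 2. re-encode for the particular display's gamma.
def viewer_correct(pixel, file_gamma=2.2, display_gamma=1.8):
    linear = (pixel / 255.0) ** file_gamma       # file-specific decoding
    corrected = linear ** (1.0 / display_gamma)  # display-specific correction
    return round(corrected * 255)

# When file gamma and display gamma coincide, a value passes through unchanged:
print(viewer_correct(102, 2.2, 2.2))  # 102
```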
POV-Ray's new File_Gamma setting was introduced to accommodate this
practice, by providing different settings for the current system's
display gamma (gamma correction applied to the image data for preview,
specified via Display_Gamma) and the gamma encoding to apply to the
output image (specified via File_Gamma).
POV-Ray 3.6.x used only a single setting (Display_Gamma), presuming that
the intended gamma encoding would happen to be identical to the gamma
correction required for display preview, which is not always the case.
Note that for most files generated by POV-Ray, gamma encoding still
double-features as gamma pre-correction; however, with the addition of
the File_Gamma setting, the gamma pre-correction to apply for file
output may be set to a different value than that used for preview
display, in case the image is ultimately to be viewed on a system with
different display gamma.
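For example, an INI fragment along these lines (the values are only hypothetical examples, not recommendations):

```
; Gamma correction for the preview on this machine's display:
Display_Gamma=2.2
; Gamma encoding for the output file, here targeting a system
; whose display gamma differs from the local one:
File_Gamma=1.8
```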
Exceptions to this are the file formats PNG (which, as already
mentioned, performs gamma encoding but leaves it up to the viewing
software to decode the image data and then apply proper gamma correction
for the viewing display) as well as HDR and OpenEXR (which do not employ
gamma encoding at all, and instead defer gamma correction to the viewing
software).
> Curiosity, I found here an information, which says, you should not use
> assumed_gamma.. ?!??
>
> http://wiki.povray.org/content/HowTo:Fix_old_scenes_to_work_with_the_new_gamma_system
As stated at the beginning of that page, the information therein is
"somewhat outdated" (which might be an understatement).