Thanks, I think I get it better now...
But isn't it possible to query the right value for Display_Gamma from the OS? I
mean, how do other viewer programs handle it?
Also, about File_Gamma: it is used for the gamma-encoding that is applied when
the file is created. When the file is read, the software decodes it and then
sends it to the OS.
I understand that, between the file being sent to the OS and the pixel being
lit on the screen, there can be a gamma transformation caused by the graphics
card, the system, the screen, etc., because of some non-linearity.
Now, before that, how does the software decode the file?
Does it do a reverse encoding with the gamma value stored in the file, or does
it just send it to the system as-is?
I understood that Display_Gamma is mainly there to compensate for the
system + graphics card + screen non-linearity.
The question is: what should the File_Gamma value compensate for? The
software's decoding? The system + graphics card + screen non-linearity? Or the
eye's perception?
Thanks for your patience :)