> 1) I noticed that on scanning electron microscopes (SEMs), there is often "Auto
> Contrast / Brightness" settings. When you use it, I find that it often gives
> you a histogram that "wastes" at least a third to half of the 256 grey scale
> values, with at least the top and bottom 1/6 unused.
If 99.9% of the pixels fall between grey levels 64 and 192, should that
99.9% be expanded to fill GL0 - GL255 (losing the brightness and
darkness detail of the other 0.1%)? Auto exposure algorithms (also in
ordinary digital cameras) must make some sacrifices to give satisfactory
results in the majority of cases to the majority of people, and they get
it wrong sometimes (eg making some important shadow detail totally
black, or some bright region totally white).
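To illustrate the trade-off, here is a minimal sketch of a percentile-based contrast stretch, one common way such "auto contrast" features work (actual SEM firmware will differ; the percentile cutoffs and function name are my own choices):

```python
def auto_contrast(pixels, low_pct=0.05, high_pct=99.95):
    """Linearly stretch grey levels so the given percentiles span 0..255.

    Pixels below the low percentile clip to 0 and pixels above the high
    percentile clip to 255 -- i.e. the outlying 0.1% of pixels lose
    their distinct brightness, exactly the sacrifice described above.
    """
    values = sorted(pixels)
    n = len(values)
    lo = values[int(n * low_pct / 100)]
    hi = values[min(n - 1, int(n * high_pct / 100))]
    if hi == lo:                      # flat image: nothing to stretch
        return list(pixels)
    scale = 255.0 / (hi - lo)
    return [max(0, min(255, round((p - lo) * scale))) for p in pixels]

# Example: a histogram confined to GL64..GL192 gets expanded to GL0..GL255.
stretched = auto_contrast([64] * 10 + [192] * 10)
print(min(stretched), max(stretched))  # 0 255
```

Any pixels outside the chosen percentiles are crushed to pure black or pure white, which is why a stretch that looks good for most images can destroy important shadow or highlight detail in others.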
> 2) I looked at some NASA images from the Mercury mission. ( http://is.gd/cgqJdp
> ) I noticed that for most of the images, they too left unused half of the 256
> values.
Which half was unused? This happens naturally: eg if you take a photo of
clouds from a plane window (above the clouds), it's likely that a large
chunk of the lower grey levels won't be used at all. Likewise, in images
with very small, very bright areas (eg the sun, or other stars at night),
it may be only a tiny percentage of pixels that use most of the range.
> 3) Then there is the never-ending controversy over calibrated monitors.
> Shouldn't people save time, and just tweak the lighting in the scene so that a
> proper distribution of grey (or RGB) values are used?
What do you mean by "proper distribution", though? That will depend on
which monitor you are using to generate the image (assuming you are
working towards a physically realistic image).
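To make the monitor dependence concrete: the stored value for the same physical (linear) intensity differs depending on which transfer function the display is assumed to use. A sketch comparing the standard sRGB encoding with a plain gamma-2.2 curve (the mid-grey intensity 0.5 is just an arbitrary example):

```python
def srgb_encode(c):
    """Standard sRGB encoding: linear intensity 0..1 -> encoded 0..1."""
    if c <= 0.0031308:
        return 12.92 * c
    return 1.055 * c ** (1 / 2.4) - 0.055

def gamma22_encode(c):
    """Simple gamma-2.2 encoding, as many older workflows assumed."""
    return c ** (1 / 2.2)

linear = 0.5  # an arbitrary mid-grey physical intensity
print(round(255 * srgb_encode(linear)))     # 188
print(round(255 * gamma22_encode(linear)))  # 186
```

So two "correctly" prepared images of the same scene can legitimately have different grey-level distributions, which is why tweaking scene lighting is not a substitute for knowing what the target display expects.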