.... use all the 256 grey scale values?
1) I noticed that scanning electron microscopes (SEMs) often have "Auto
Contrast / Brightness" settings. When you use them, I find they often give
you a histogram that "wastes" at least a third to half of the 256 grey scale
values, with at least the top and bottom sixth unused.
2) I looked at some NASA images from the Mercury mission. ( http://is.gd/cgqJdp
) I noticed that for most of the images, they too left half of the 256
values unused.
3) Then there is the never-ending controversy over calibrated monitors.
Shouldn't people save time and just tweak the lighting in the scene so that a
proper distribution of grey (or RGB) values is used?
Am I missing something or is everyone else in the world missing something?
On 21.03.2011 17:39, gregjohn wrote:
> .... use all the 256 grey scale values?
>
> 1) I noticed that scanning electron microscopes (SEMs) often have "Auto
> Contrast / Brightness" settings. When you use them, I find they often give
> you a histogram that "wastes" at least a third to half of the 256 grey scale
> values, with at least the top and bottom sixth unused.
Don't know about this one.
> 2) I looked at some NASA images from the Mercury mission. ( http://is.gd/cgqJdp
> ) I noticed that for most of the images, they too left half of the 256
> values unused.
Maybe those images are calibrated to some physical units?
> 3) Then there is the never-ending controversy over calibrated monitors.
> Shouldn't people save time and just tweak the lighting in the scene so that a
> proper distribution of grey (or RGB) values is used?
>
> Am I missing something or is everyone else in the world missing something?
You're missing that one and the same image will look different on any
two uncalibrated monitors. You can tweak your scene all you want, but if
the monitor it will later be displayed on (or actually the display
subsystem as a whole) doesn't happen to be set up for the same white
point, black level and gamma you used, the scene will inevitably look
different from what you intended, and all your tweaking will be in vain.
The ideal solution for this problem would be to calibrate all displays
in the whole world to one and the same set of well-defined settings, so
that "what you see is what others get".
An alternative is to calibrate all displays in the whole world to /any/
well-defined settings, attach to each image information about which settings
are best for displaying it, and have image-viewing software automatically
compensate for the differences in display settings.
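As a rough sketch of that compensation step (assuming a simple power-law gamma model only, and ignoring white point and black level):

```python
def compensate_gamma(value, source_gamma, display_gamma):
    """Re-encode a normalized pixel value (0.0 - 1.0) that was
    prepared for one display gamma so it produces the same light
    output on a display with a different gamma."""
    linear = value ** source_gamma          # decode to linear light
    return linear ** (1.0 / display_gamma)  # re-encode for the target display

# A mid-grey encoded for gamma 2.2, re-encoded for a gamma 1.8 display:
v = compensate_gamma(0.5, 2.2, 1.8)
```

Real colour management (ICC profiles) also handles the white point and black level, but the principle is the same: decode with the image's settings, re-encode with the display's.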
> 1) I noticed that scanning electron microscopes (SEMs) often have "Auto
> Contrast / Brightness" settings. When you use them, I find they often give
> you a histogram that "wastes" at least a third to half of the 256 grey scale
> values, with at least the top and bottom sixth unused.
If you have 99.9% of pixels between grey level 64 and 192, should those
99.9% be expanded to fill GL0 - GL255 (losing the brightness and
darkness of the other 0.1%)? Auto exposure algorithms (also in normal
digital cameras) must make some sacrifices to give satisfactory results
in the majority of cases to the majority of people; they sometimes get it
wrong (e.g. making some important shadow detail totally black, or some
bright region totally white).
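A minimal sketch of that kind of percentile-based auto contrast (the clip fraction is an arbitrary illustrative knob, not what any vendor actually uses):

```python
def auto_contrast(pixels, clip_fraction=0.001):
    """Stretch grey levels so the central portion of the histogram
    spans 0-255; the clip_fraction of pixels at each end saturates
    to pure black or pure white (the "sacrifice")."""
    ordered = sorted(pixels)
    n = len(ordered)
    lo = ordered[int(n * clip_fraction)]
    hi = ordered[min(n - 1, int(n * (1 - clip_fraction)))]
    if hi == lo:                  # flat image: nothing to stretch
        return list(pixels)
    scale = 255.0 / (hi - lo)
    return [max(0, min(255, round((p - lo) * scale))) for p in pixels]
```

With clip_fraction = 0, this is the "expand 64-192 to 0-255" case above; with a nonzero clip, the extreme 0.1% of pixels lose their detail.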
> 2) I looked at some NASA images from the Mercury mission. ( http://is.gd/cgqJdp
> ) I noticed that for most of the images, they too left half of the 256
> values unused.
Which half was unused? This happens normally; e.g. if you take a photo of
clouds from a plane window (above the clouds), it's likely a large chunk
of the lower grey levels won't be used at all. Also, with images that
have very small, very bright areas (e.g. the sun, or other stars at night),
it may be only a tiny percentage of pixels that are using most of the range.
> 3) Then there is the never-ending controversy over calibrated monitors.
> Shouldn't people save time and just tweak the lighting in the scene so that a
> proper distribution of grey (or RGB) values is used?
What do you mean by "proper distribution" though? That will depend on
what monitor you are using to generate the image (assuming you are
working towards a physically realistic image).
In SEM work, it's not that 99.9% of PIXELS exist between GL64 and 192. Instead,
there's a distribution of electron counts (current?), and some bloke arbitrarily
decided to make 64 the min and 192 the max. Why didn't they decide to make it
GL32 and 224? I can say with some authority that failure to use the full
grey scale range in SEM work in many cases causes one to lose information that
is present in the imaging conditions. I find it annoying that people write
auto contrast/brightness routines that throw away information.
My theory is that the set of aesthetically pleasing images which look good on
properly calibrated monitors will all have a similar histogram. Saying
"calibrate your monitor" while leaving out "calibrate your aesthetics" allows
human subjectivity to mess it all up again. Some blokes won't know (I'm not
sure I do, and I'll say I'm of median intelligence here) how much to lighten
up dark scenes, what contrast is appropriate, etc. Just use the full range. Of
course there will be artistic choices here, and my guess is they'll boil down
to choices between broad or bimodal distributions.
I would guess that when you write an algorithm for a digital camera to choose
an exposure time, you'd just leave the shutter open long enough to hit some
target RGB value. Same in raytracing: keep upping the light color's RGB values
until you've used the full range.
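Something like this loop, say (the 0.18 target is just the commonly cited photographic mid-grey figure; the actual value and the scene numbers here are made up for illustration):

```python
def pick_exposure(scene_luminances, target_mean=0.18, tolerance=0.005):
    """Iteratively scale exposure until the mean exposed value hits
    target_mean.  Scaling stands in for "leaving the shutter open
    longer"; values clip at 1.0, modelling sensor saturation."""
    exposure = 1.0
    for _ in range(50):
        exposed = [min(1.0, lum * exposure) for lum in scene_luminances]
        mean = sum(exposed) / len(exposed)
        if abs(mean - target_mean) < tolerance:
            break
        exposure *= target_mean / max(mean, 1e-9)  # push mean toward target
    return exposure
```

The same idea transfers to raytracing by scaling the light's RGB instead of the shutter time.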
> My theory is that the set of aesthetically pleasing images which look good on
> properly calibrated monitors will all have a similar histogram.
It surely depends on the content of the image (skiing scene, sunny
beach, overcast fields, moon at night)? Did you pick a few random
images (eg from google or flickr) that look good and compare the
histograms? My expectation is that they would be quite different.
> I would guess that when you write an algorithm for a digital camera to choose
> exposure time, you'd just leave the shutter open long enough to hit some target
> RGB value.
Indeed, usually the camera tries to get the average pixel value to
something like 17%. On more expensive cameras you can select the region
over which this average is taken (entire frame, center-weighted, spot,
etc). This works well enough most of the time, but a serious
photographer will use manual exposure and look at the histogram. A good
example is snow: auto exposure usually gives pictures that look way too
dark (I assume because the camera's "17% rule" assumes the sky to be
way brighter than the ground, which in snow it often isn't).
Usually you can't just take the range of pixel brightnesses and map it
to 0-255; a small number of very dark and very bright pixels will cause
most of the image to lie between 100 and 140 or something, and look very
washed out overall.
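E.g. a naive min-max stretch does exactly nothing in that situation, because the two outliers already pin the endpoints (numbers made up for illustration):

```python
def minmax_stretch(pixels):
    """Naively map the darkest pixel to 0 and the brightest to 255."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:
        return list(pixels)
    return [round((p - lo) * 255.0 / (hi - lo)) for p in pixels]

# Bulk of the image between 100 and 140, plus one black and one
# white outlier (say, a speck of dust and a glint of sun):
image = [0] + list(range(100, 141)) + [255]
stretched = minmax_stretch(image)
# The outliers pin lo=0 and hi=255, so the "stretch" is an identity
# and the bulk of the image still sits in the washed-out 100-140 band.
```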
On 22.03.2011 12:42, gregjohn wrote:
> In SEM work, it's not that 99.9% of PIXELS exist between GL64 and 192. Instead,
> there's a distribution of electron counts (current?), and some bloke arbitrarily
> decided to make 64 the min and 192 the max. Why didn't they decide to make it
> like GL32 and 224? I can say with some authority that failure to use the full
> grey scale range in SEM work in many cases causes one to lose information that
> is present in the imaging conditions. I'm finding it annoying that people write
> auto contrast brightness routines that throw away information.
Leaving the brightest and darkest pixel values unused does have the
advantage that even on poorly calibrated displays the brightest and
darkest regions still show contrast. Keeping clear of the minimum and
maximum representable values also has benefits when it comes to certain
image processing, as some intermediate steps might otherwise
over- or underflow. While these were probably bigger issues some decades
ago, when calibrated displays were rare and image processing was
almost invariably done on bytes to conserve memory, tradition and
backward compatibility may have preserved the practice to this day.
Using a range from 64 to 191, rather than 32 to 223, has the advantage
that you get exactly 128 (2^7) brightness levels. Computers love powers
of 2.
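For what it's worth, the arithmetic (an illustrative sketch, not any instrument's actual mapping):

```python
def quantize_with_headroom(fraction, lo=64, hi=191):
    """Map a normalized signal (0.0 - 1.0) into the range lo..hi
    inclusive, leaving foot- and headroom at both ends.  With
    lo=64 and hi=191 there are hi - lo + 1 = 128 = 2**7 levels."""
    return lo + round(fraction * (hi - lo))

# All representable output levels:
levels = {quantize_with_headroom(i / 127) for i in range(128)}
```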
clipka <ano### [at] anonymousorg> wrote:
> On 22.03.2011 12:42, gregjohn wrote:
> > In SEM work, it's not that 99.9% of PIXELS exist between GL64 and 192. Instead,
> > there's a distribution of electron counts (current?), and some bloke arbitrarily
> > decided to make 64 the min and 192 the max. Why didn't they decide to make it
> > like GL32 and 224? I can say with some authority that failure to use the full
> > grey scale range in SEM work in many cases causes one to lose information that
> > is present in the imaging conditions. I'm finding it annoying that people write
> > auto contrast brightness routines that throw away information.
>
> Leaving the brightest and darkest pixel values unused does have the
> advantage that even on poorly calibrated displays the brightest and
> darkest regions still show contrast. Keeping clear of the minimum and
> maximum representable values also has benefits when it comes to certain
> image processing, as some intermediate steps might otherwise
> over- or underflow. While these were probably bigger issues some decades
> ago, when calibrated displays were rare and image processing was
> almost invariably done on bytes to conserve memory, tradition and
> backward compatibility may have preserved the practice to this day.
>
> Using a range from 64 to 191, rather than 32 to 223, has the advantage
> that you get exactly 128 (2^7) brightness levels. Computers love powers
> of 2.
Your statement about the best skiing photos would make for a fascinating
challenge. I would bet that the best ones still have very broad or bimodal
distributions, but on sunny days the dark hump will be very small. It's not
<Nr,Ng,Nb> where *all* N's >> 0.5: there's always a ski pole, black hat, or
pine tree in the corner of the photo.
[Later] From Flickr's "Most Interesting" "skiing" photos:
http://www.flickr.com/photos/52756285@N00/1390662667/
http://www.flickr.com/photos/firenzesca/2114824948/
Looking at these pics, they definitely have a (small) hump of very dark subject
matter in the histogram. And these were among the lightest ones I could find.
As for the "night" tag at Flickr, many had city lights (rgb >> 0.5), and only a
few were all dark.
But as for your other point: finally, someone has told me how "I'm missing
something!" ;) . Monitor calibration could be the legacy, the historical reason
it may have made sense to set up such a tradition at one time. In my shop,
however, I am the one guy (the guy with the one type of job) who is most likely
to do post-processing of images, and I don't want to lose information.
Squeezing the image range down to some arbitrary tight range is just always a
bad idea. In the small subset of cases where I'd end up pushing/squeezing it
myself, I could very easily compress the range later. But you cannot uncompress
to get back the original.
Continuing the rant:
In practice, all the technicians know that the Auto Contrast/Brightness
settings aren't giving good quality images. My personal work-around is to use
ACB, then arbitrarily tweak the contrast up and the brightness down on the
SEM *before* capturing an image. That way I get a nice, smooth distribution
that's likely to use the full range. In contrast, most technicians hit ACB,
then take an image, then use C&B settings in a post-processing image program to
tweak the histogram to personal (visual) taste. Not soooo bad, but not ideal.
One problem is that some technicians, perhaps knowing that before-capture ACB
is crap, probably aren't using it at all; then they make big C&B changes to the
image afterwards, save it, and THIS is the version you get. They've expertly
prepared the sample, found the area of interest, and used good focus and stig,
but the image has visible artifacts from the big post-process adjustment. This
one last step can keep an image from being "publication quality." Perhaps I'm
being overly pedantic, but it just seems that wrong-headed tradition on the
part of the scope manufacturers ruins things. And I believe the same philosophy
answers some of the contrast & "calibration" issues that always spring up in CG.
"gregjohn" <pte### [at] yahoocom> wrote:
> Continuing the rant:
[rant snipped] ;)
Out of interest, are these images being produced in secondary or backscatter
mode? My personal experience is slanted toward BSE, since it was compositional
variation I was studying. I too usually adjusted the c/b manually to get the
best histogram possible, finding ACB quite hit-and-miss. I'd usually only use
ACB as a shortcut for a good starting point, and often discard the top end of
the spectrum if I wasn't interested in those sub-micron heavy-metal particles...
I don't think I ever post-processed an image, except for stitching regions
together. Couldn't speak for other users...
Bill