"Warp" <war### [at] tagpovrayorg> wrote in message
news:473f3242@news.povray.org...
> Rune <aut### [at] runevisioncom> wrote:
>> For example, the eyes have an edge detecting layer I think, or something
>> along those lines. This layer could find all the edges of the bars in
>> your image and send information about these edges on to the brain, so
>> that the brain can tell how many bars there are. The eyes also send the
>> actual "raw" brightness info on to the brain, but this may be in a "low
>> resolution" where no more than 16 different shades can be told apart.
>> However, the brain can still count all the bars, because of the edges
>> detected in the eyes.
>
> I'm quite certain that if it was an animation where each frame is
> completely filled by a shade of gray and was played eg. at 1 FPS, you
> could clearly see the change.
That might be one test, but it still makes it easy for the brain by showing
transitions.
What if you had a test that didn't allow the eyes/brain to calibrate from
frame to frame?
What if your vision was blanked for a time (eg. half-second or second)
between each successive frame? How many grey shades could you distinguish
reliably?
> What if your vision was blanked for a time (eg. half-second or second)
> between each successive frame? How many grey shades could you
> distinguish reliably?
That wouldn't be a fair test. It's hard for humans to find differences
between two images (even big differences) if the image blanks for a
second before showing the other. I saw an example that swapped two
images with a black screen in between, then did it again without the
black screen, and *then* I could notice the differences.
Anyway, I don't care about an accurate test of eye sensitivity. I just
know that I can notice "color bands" on grayscale images, and more bits
per channel would avoid that.
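To put numbers on why the bands are visible: with 8 bits per channel there are only 256 grey levels, so a smooth ramp spanning a whole screen has to hold each level over a run of several pixels, and those runs are the bands. A quick sketch (plain Python, screen width of 1920 is just an example):

```python
# How wide is one grey band when a smooth horizontal ramp is
# quantized to a fixed number of levels? Each representable level
# occupies a run of adjacent pixels -- that run is the visible band.

def band_width(ramp_pixels, levels):
    """Average width in pixels of one quantized grey band."""
    return ramp_pixels / levels

# A ramp spanning a 1920-pixel-wide screen:
w8 = band_width(1920, 256)     # 8 bits per channel
w16 = band_width(1920, 65536)  # 16 bits per channel

print(w8)   # 7.5 -> bands 7-8 pixels wide, easy to spot
print(w16)  # far below one pixel, so no visible banding
```

With 16 bits per channel there are many levels per pixel, which is why more bits per channel makes the banding disappear.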
> I just know that I can notice "color bands" on grayscale images, and
> more bits per channel would avoid that.
Oh, and I remember a suggestion of converting a 48-bit image to 24-bit
using dithering. I wonder how good that would look...
Rune wrote:
> A lot of
> preprocessing is done in the eyes themselves before the visual data is sent
> to the brain.
You'd be amazed. :-)
> For example, the eyes have an edge detecting layer I think, or something
> along those lines.
First a change detector layer, then from that a bunch of edge detectors,
then object detectors, then motion detection, then ...
Before it leaves your eyes, they already know there's something large
coming towards you from the left. That info is hooked directly from the
eyes to the neck muscles.
> This layer could find all the edges of the bars in your
> image and send information about these edges on to the brain, so that the
> brain can tell how many bars there are.
True.
> The eyes also send the actual "raw"
> brightness info on to the brain, but this may be in a "low resolution" where
> no more than 16 different shades can be told apart.
Well, it's an analog pulse train. Actually, I'm not sure how far up the
pile before *you* decide it's no longer eyes and has become brain. I'm
also not sure if the cells that actually sense light are integrated into
anything at a higher level at all, or whether all their info is
processed before getting to the "brain" part.
> However, the brain can
> still count all the bars, because of the edges detected in the eyes.
> I'm not saying it's like that; just that your image doesn't prove anything
> about the amount of gray scales the brain can tell apart. So it really comes
> down to what you mean by "human visual system" - the eyes or the brain.
I think if you're picking out a monitor, you're worried about your
brain's perception of it. :-)
--
Darren New / San Diego, CA, USA (PST)
Remember the good old days, when we
used to complain about cryptography
being export-restricted?
Nicolas Alvarez wrote on 2007/11/17 19:12:
> Nicolas Alvarez wrote:
>> I just know that I can notice "color bands" on grayscale images, and
>> more bits per channel would avoid that.
>
> Oh, and I remember a suggestion of converting a 48-bit image to 24-bit
> using dithering. I wonder how good that would look...
It can look very good, if the resolution is fine enough. The best would be to
use something like "error diffusion" dithering, which spreads each pixel's
quantization error onto its neighbours instead of just adding random noise.
That's the way I got the best printing for smooth shades.
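A minimal sketch of what such a conversion could look like: Floyd-Steinberg error diffusion taking one 16-bit grey channel down to 8 bits (plain Python on nested lists, just to show the idea; a real 48-bit-to-24-bit converter would do this per channel on real image data):

```python
# Floyd-Steinberg error diffusion: quantize 16-bit grey samples
# (0..65535) to 8 bits (0..255), pushing each pixel's rounding error
# onto its right and lower neighbours so that smooth gradients dither
# instead of banding.

def error_diffuse_16_to_8(img):
    """img: list of rows of 16-bit values. Returns rows of 8-bit values."""
    h, w = len(img), len(img[0])
    work = [[float(v) for v in row] for row in img]
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            old = work[y][x]
            new = max(0, min(255, round(old / 257.0)))  # 65535 / 255 = 257
            out[y][x] = new
            err = old - new * 257.0
            # Distribute the error with the classic 7/16, 3/16, 5/16, 1/16 weights.
            if x + 1 < w:
                work[y][x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    work[y + 1][x - 1] += err * 3 / 16
                work[y + 1][x] += err * 5 / 16
                if x + 1 < w:
                    work[y + 1][x + 1] += err * 1 / 16
    return out

# A flat grey that falls between two 8-bit levels: instead of snapping
# to one band, the output alternates between the neighbouring levels.
flat = [[33000] * 8 for _ in range(8)]
dithered = error_diffuse_16_to_8(flat)
```

Viewed from a normal distance the alternating pixels average out to the in-between shade, which is exactly why a dithered 24-bit version of a 48-bit image can look band-free.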
--
Alain
-------------------------------------------------
One tequila, two tequila, three tequila, floor.
Darren New wrote on 2007/11/17 19:45:
> Rune wrote:
>> A lot of preprocessing is done in the eyes themselves before the
>> visual data is sent to the brain.
>
> You'd be amazed. :-)
>
>> For example, the eyes have an edge detecting layer I think, or
>> something along those lines.
>
> First a change detector layer, then from that a bunch of edge detectors,
> then object detectors, then motion detection, then ...
>
> Before it leaves your eyes, they already know there's something large
> coming towards you from the left. That info is hooked directly from eyes
> to neck muscles.
>
>> This layer could find all the edges of the bars in your image and send
>> information about these edges on to the brain, so that the brain can
>> tell how many bars there are.
>
> True.
>
>> The eyes also send the actual "raw" brightness info on to the brain,
>> but this may be in a "low resolution" where no more than 16 different
>> shades can be told apart.
>
> Well, it's an analog pulse train. Actually, I'm not sure how far up the
> pile before *you* decide it's no longer eyes and has become brain. I'm
> also not sure if the cells that actually sense light are integrated into
> anything at a higher level at all, or whether all their info is
> processed before getting to the "brain" part.
>
>> However, the brain can still count all the bars, because of the edges
>> detected in the eyes.
>
>> I'm not saying it's like that; just that your image doesn't prove
>> anything about the amount of gray scales the brain can tell apart. So
>> it really comes down to what you mean by "human visual system" - the
>> eyes or the brain.
>
> I think if you're picking out a monitor, you're worried about your
> brain's perception of it. :-)
>
Processing steps for human vision:
- The retina catches light
- Raw image processing by the retina itself
- More image processing between the retina and the optic nerve
- Still more processing by the optic nerve (maybe the only nerve that has
actual processing capability)
- Final gross processing by the brain
- Pattern recognition and image reconstruction
- Visual memory cross-references complete the image
I may have missed or mis-ordered some steps...
--
Alain
-------------------------------------------------
There will always be beer cans rolling on the floor of your car when the boss
asks for a ride home from the office.
Nicolas Alvarez wrote:
> First two and last two look quite similar from some angles (damned LCD
> screen). Also, the black ones look all similar (damned gamma; the world
> would be a better place if screens had had built-in gamma correction
> since the beginning).
Wow... Having a color-corrected screen does wonders. I can see all 32
levels. I can also see I need to run the calibration tool on my CRT
again (the lower 2 levels don't show, but they do on the LCD).
I've seen these grayscales used on photography sites occasionally to
make sure brightness and contrast are optimal.
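A chart like that is easy to generate yourself. A sketch that writes a 32-step grayscale strip as a binary PGM file (plain Python, no image libraries; the filename is made up):

```python
# Write a 32-step grayscale calibration chart as a binary PGM file.
# On a well-calibrated monitor every step should be distinguishable,
# including the darkest and brightest few.

def grayscale_chart(path, steps=32, step_w=16, height=64):
    width = steps * step_w
    with open(path, "wb") as f:
        # PGM "P5" header: magic, width height, max grey value.
        f.write(b"P5\n%d %d\n255\n" % (width, height))
        for _ in range(height):
            row = bytearray()
            for i in range(steps):
                # Levels run from 0 (black) to 255 (white) inclusive.
                level = round(i * 255 / (steps - 1))
                row += bytes([level]) * step_w
            f.write(bytes(row))

grayscale_chart("chart.pgm")
```

If the bottom two or three steps merge into one black block, brightness is set too low (or the display needs calibrating), which matches what the photography sites use these charts for.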
Gail Shaw wrote:
> My 19" LCD packed up the other night, and I thought I'd replace it with a
> widescreen (20" or 21") LCD
My home monitor is a Samsung 22" widescreen (can't remember the exact
model number off-hand). The only drawback is that if you get far enough
off-axis the colors begin to distort, but it has a good contrast ratio
and very fast response time.
One of my primary complaints about the monitor I have at work is that if
you scroll white text over a black background, it literally disappears.
Doesn't happen on my Samsung.
Ah, here it is, the 226BW:
http://tinyurl.com/24544j
And lo on Sun, 18 Nov 2007 00:10:30 -0000, Nicolas Alvarez
<nic### [at] gmailisthebestcom> did spake, saying:
>> What if your vision was blanked for a time (eg. half-second or second)
>> between each successive frame? How many grey shades could you
>> distinguish reliably?
>
> That wouldn't be a fair test. It's hard for humans to find differences
> between two images (even big differences) if the image blanks for a
> second before showing the other. I saw an example that swapped two
> images with a black screen in between, then did it again without the
> black screen and *then* I could notice the differences.
Look up saccadic masking: our eyes blank out visual info while they're
moving, so you could show two screens with no intermission if they're
just in different places relative to a neutral background.
--
Phil Cook
--
I once tried to be apathetic, but I just couldn't be bothered
http://flipc.blogspot.com
> Look up saccadic masking, our eyes blank out info when they're moving so
> you could have two screens with no intermission if they're just in
> different places relative to a neutral background.
You mean like the attached images? Which pair of squares is the same
colour, and which pair is different? (I used colours from Warp's vertical
bar image.)
Attachments: 'grey test.png' (3 KB), 'grey test 2.png' (3 KB)