If you look at a printer, it's usually rated at either 300 DPI, 600 DPI
or 1200 DPI. So how many DPI do you get for a typical monitor?
To be completely precise, let's look at *pixels* per inch, since this is
unambiguous.
Now, if I understand this correctly, a 14-inch monitor actually measures 14
inches across the diagonal. In other words, the 14 inches is the length
of the hypotenuse of a right-angled triangle. Assuming the monitor has a 4:3
aspect ratio, that's a 3-4-5 triangle, so the width is 4/5 of the diagonal
and the height is 3/5 of it. Based on this, we have:
14" 4:3 = 11.2" x 8.4"
21" 4:3 = 16.8" x 12.6"
Suppose our hypothetical monitor can display 1280x768 pixels. Then in
the horizontal direction, we have:
1280 pixels / 11.2 inches = 114.3 pixels/inch
1280 pixels / 16.8 inches = 76.19 pixels/inch
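The pixels-per-inch figure is just the horizontal resolution divided by the
physical width. Building on the sketch above (again, just an illustration):

    def pixels_per_inch(h_pixels, diagonal_inches, aspect_w, aspect_h):
        # Only the horizontal direction matters for this comparison.
        width, _ = screen_dimensions(diagonal_inches, aspect_w, aspect_h)
        return h_pixels / width

    print(pixels_per_inch(1280, 14, 4, 3))   # roughly 114.3
    print(pixels_per_inch(1280, 21, 4, 3))   # roughly 76.2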
That's quite some variation. Hmm, let's try my actual monitor at home.
Its resolution is the obscure figure 1680x1050, and it has the weird
aspect ratio 16:10.
Apparently a triangle with sides of 16 and 10 has a hypotenuse whose
length equals twice the square root of 89, or approximately 18.87.
For a 21-inch diagonal, that gives me screen dimensions of about
17.81" by 11.13". And so, we have:
1680 pixels / 17.81 inches = 94.329 pixels/inch
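The same sketch reproduces the 16:10 numbers, assuming (as the figures above
imply) a 21-inch diagonal:

    print(screen_dimensions(21, 16, 10))      # roughly (17.81, 11.13)
    print(pixels_per_inch(1680, 21, 16, 10))  # roughly 94.3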
In conclusion, it appears that most if not all current computer monitors
have a density slightly above or below 100 pixels/inch. That's about
three times lower than even the crappiest 300 DPI printer.
Now, admittedly, for colour images computer screens have the advantage
of not needing the halftoning that printers generally require. But for
black text on a white background, printers win by a mile. (Even taking
into account that computer screens can do anti-aliasing, which is
generally a waste of time in printed text.)