At a printing seminar I attended, the presenters mentioned the usual issue of tonal compression. The numbers they used, from brightest highlight to darkest shadow, were as follows:
- Human perception: 1,000 to 1
- Film and paper: 100 to 1 (I’d like to know what film alone, without the paper, comes to.)
- Halftone: 20 to 1
They were unable to answer, however, what the ratio is for digital cameras, digital camera backs, and scanners. Does anyone have any technical documentation on this subject? Obviously, the specific numbers will depend on the quality and bit depth of the device, but a best-case scenario would be sufficient.
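For what it’s worth, there’s a rough theoretical ceiling that’s easy to compute: an n-bit linear device can distinguish at most 2^n levels, so its best-case ratio is about 2^n to 1. This ignores the noise floor, which is what actually limits real hardware, so treat it as an upper bound rather than a spec. A quick Python sketch of the arithmetic (the bit depths are just illustrative):

    import math

    # Best-case contrast ratio for an n-bit linear device is about
    # 2**n to 1; density range is the base-10 log of that ratio.
    # Real devices fall short of this because of the noise floor.
    for bits in (8, 10, 12, 14, 16):
        ratio = 2 ** bits
        print(f"{bits}-bit: about {ratio:,}:1 "
              f"(density range ~{math.log10(ratio):.1f})")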
One paper I located on the subject gave the range of a scene as 10,000 to 1 and a CRT as 50 to 1. Other papers just say monitors are less than 100 to 1. Then LCD manufacturers are claiming an “Image Contrast Ratio” of 600 to 1. Sounds like apples and oranges, and nobody’s using the same bathroom scale!
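If I have the math right, though, the two scales are at least convertible: density range is just the base-10 logarithm of the contrast ratio, D = log10(Imax/Imin). So a 100:1 ratio is a density range of 2.0, the 10,000:1 scene is 4.0, and the 50:1 CRT is about 1.7.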
I’ve seen published numbers in the literature for digital devices, but they are usually expressed in terms of density range on a scale of 0 to 4. With many scanner companies claiming numbers higher than 4.0, one becomes skeptical of both the claims and the measuring tool. Here’s a composite of some data I’ve found from different sources so far (converted to ratios in the sketch after the list):
- Desktop scanners: 3.9
- Color print: 1.9
- Offset press (4-color, coated stock): 1.8
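To put the seminar’s ratios and these published density figures on one scale, here’s a quick sketch using that same D = log10(ratio) relationship (the names and numbers are just the figures quoted above, not measurements of mine):

    import math

    # Density range D is the base-10 log of the contrast ratio,
    # so ratio = 10**D and D = log10(ratio).
    densities = {
        "Desktop scanners": 3.9,
        "Color print": 1.9,
        "Offset press (4-color, coated stock)": 1.8,
    }
    for device, d in densities.items():
        print(f"{device}: D {d} -> about {10 ** d:,.0f}:1")

    # The seminar's figures, converted the other way:
    ratios = {"Human perception": 1_000, "Film and paper": 100, "Halftone": 20}
    for name, r in ratios.items():
        print(f"{name}: {r}:1 -> D {math.log10(r):.1f}")

By that math, a scanner claiming a density range of 3.9 is claiming roughly 8,000:1, and anything over 4.0 would exceed the 10,000:1 quoted for a full scene, which is part of why I’m skeptical of those numbers.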
I’d like to “fill in the blanks.” The first method (contrast ratios) is easier to teach from, so that would be my preference. However, I can use the second (density range) if I have to; I just want to avoid switching horses in the middle of the stream!
Any assistance on this would be greatly appreciated!