chuckroast
Subscriber
All of this, of course, is only of interest to us photo nerds. The vast majority of photographs are of people (or of their lunch), wherein neither dynamic range nor resolution matter very much. If I were still shooting portraits (shudder) or weddings (double shudder), there's no question I'd be using 100% digital capture.
Although I dedicate myself with heart and soul to prints on paper, we cannot ignore the fact that the vast majority of the photos we "consume" are digital on a screen. We can also conclude that a modern high-resolution screen easily leaves a photo printed in a newspaper behind.
Yes, that's true, but that's not the benchmark. The benchmark is a traditionally printed silver print which has been the gold standard for "photographs" for over a century. If you cannot do at least that well, you're going backwards.
I am also not at all convinced that an image on an HR screen is as inferior as some here would have us believe. And I must be very mistaken if a light-emitting screen does not have a much greater light intensity range than reflective paper.
It does, but it does not have the dynamic range of film itself. Film is capable of holding north of 14 stops of dynamic range, albeit in a nonlinear manner. Paper is something like 5-6 stops IIRC. Even higher end monitors only have about a 10-11 stop dynamic range. (There may be crazy expensive monitors for things like medical imaging that exceed this, I don't know.)
This is why I personally have never liked scanning film. A good scanner like a V800 can capture 14 stops of SBR, but you can't get it all on the display. But maybe that's just my own pesky biases or lack of expertise.
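For anyone who wants the stop figures above as concrete numbers: each stop is a doubling of light, so N stops works out to roughly a 2^N:1 contrast ratio. A quick back-of-the-envelope sketch (ignoring the nonlinearity of film mentioned above, which matters a lot in practice):

```python
# Each stop is a doubling of light, so a dynamic range of N stops
# corresponds to an approximate contrast ratio of 2**N : 1.
def stops_to_contrast(stops):
    """Convert a dynamic range in stops to an approximate contrast ratio."""
    return 2 ** stops

# The rough figures discussed above:
for medium, stops in [("negative film", 14), ("monitor", 10), ("photo paper", 6)]:
    print(f"{medium}: ~{stops} stops = {stops_to_contrast(stops):,}:1")
```

So film's ~14 stops implies something like a 16,384:1 range in the negative, against roughly 1,024:1 for a 10-stop monitor and only about 64:1 for a 6-stop paper, which is why so much gets thrown away somewhere between capture and display.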
In addition, the pixel density of current LED screens is close to that of an inkjet print. Even though I love paper, I think it is a myth that a screen is inferior to print.
Again, an inkjet print isn't the standard, a silver print is the reference. I've never seen an inkjet print that - to my eye - approximates how a good silver print should look. The newer printers come closer but they still don't have that classic look. Everyone's mileage likely varies on this. (The best I have ever seen was when I would send my digital files to Costco for RA-4 prints. They were as good as any pro lab I used in many decades and better than most. The sale of that business to Shutterfly was a real loss.)
More importantly, most monitors don't have the pixel count to show the full resolution of film or digital. A 4K monitor is 3840x2160, which is about 8.3 mpix - well below what even a low-end digisnapper captures. Even small format film like 35mm is north of 20 mpix.
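The arithmetic behind that comparison is simple enough to check. The 4000 dpi scan resolution below is my assumption for illustration; actual scans vary:

```python
# Pixel counts for common display resolutions.
displays = {"1080p": (1920, 1080), "4K UHD": (3840, 2160), "8K UHD": (7680, 4320)}
mpix = {name: w * h / 1e6 for name, (w, h) in displays.items()}
for name, mp in mpix.items():
    print(f"{name}: {mp:.1f} mpix")

# A 35mm frame (36x24mm) scanned at an assumed 4000 dpi comes out to
# roughly 5669 x 3780 pixels (36mm / 25.4 * 4000, 24mm / 25.4 * 4000):
film_mpix = 5669 * 3780 / 1e6
print(f"35mm @ 4000 dpi: {film_mpix:.1f} mpix")
```

Even an 8K monitor (about 33 mpix) only just clears a high-resolution 35mm scan, and nobody is browsing this forum on one of those.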
But the specific problem here is that a print reflects light while a screen emits light. One isn't particularly better than the other; they are just different. Even with hours of tweaking and calibrating on a 10-bit LUT monitor, I've never been able to make the display look like a print - and it likely never will. And that's great. I long ago accepted that digital and analog output do not compete; they produce different things for different purposes.
As a practical matter, none of this matters anyway. Even assuming we all had monitors that could produce images at the resolution and dynamic range of film or a high-resolution digital camera, there is still a problem. Among the many and various people on this forum, there is no guarantee that all our scanners and monitors are calibrated to a common standard or that our viewing environments are identical.
The closest I've come so far to scratching this itch is to print traditionally and scan the print. This captures the interpretive step all analog prints undergo of mapping up to 14 stops of light in the negative to the 5-6 stops of paper. It's not a perfect solution because of the aforementioned reflect vs. emit disconnect, but it's as good a way as I have found to share work with others, again noting the problem of mismatched monitors and viewing environments.
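That mapping step can be sketched numerically. The function below is purely illustrative, not how any printer actually works: it compresses values linearly in log (stop) space, whereas a real darkroom print shapes a nonlinear curve with a toe and shoulder by choice of paper grade, exposure, and dodging and burning:

```python
import math

# Illustrative sketch: compress a value spanning ~14 stops of scene range
# into the ~6 stops a print can hold, working in log2 (stop) space.
# A straight line in log space stands in for the curves a printer shapes by hand.
def compress_stops(linear_value, in_stops=14, out_stops=6):
    """Map a linear value in [1, 2**in_stops] onto [1, 2**out_stops]."""
    in_log = math.log2(linear_value)            # position in stop space
    out_log = in_log * (out_stops / in_stops)   # scale 14 stops down to 6
    return 2 ** out_log

# A highlight 14 stops above the deepest shadow lands 6 stops up on paper:
print(math.log2(compress_stops(2 ** 14)))  # -> 6.0
```

Scanning the finished print captures the result of that compression rather than asking the monitor to reproduce the negative's full range, which is exactly the point.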
It's probably worth noting that prints themselves have an element of this problem - there is no guarantee the person viewing a physical print will do so from the distance and lighting environment the print maker intended.
