chuck94022
Member
As a film user (as well as, I admit, a digital user), I have read many, many articles on digital versus film quality. Every single article first converts the film to the digital domain by scanning, and then compares the results.
I've come to the conclusion that this can't be right. Film is not a digital recording form, and any scan, at any level, is only an approximation of what has been recorded. Because scanners are discrete systems (as opposed to continuous, analog ones), each sample of the film content is necessarily slotted into the closest digital value. Worse, very fine detail is averaged over the area of a much larger sampling sensor, causing a loss of information. That loss takes several forms; one of them is that the sampling can exaggerate noisy detail such as grain (grain aliasing).
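To make the sampling point concrete, here is a rough toy sketch in Python. It is my own illustration, not a model of any real scanner; the sampling pitch, bit depth, and signal are invented for the example. A finely varying "film" signal gets averaged over a much coarser sampling aperture and then slotted into 256 discrete levels, and most of the fine detail simply vanishes.

```python
import numpy as np

# Toy illustration of the two effects described above: area averaging over the
# sampling aperture, then quantization to discrete levels. All numbers are made up.
rng = np.random.default_rng(0)

# Pretend the film records detail at very fine spacing: a sine "subject" plus grain noise.
x = np.linspace(0, 1, 10_000)
film = 0.5 + 0.4 * np.sin(2 * np.pi * 400 * x) + 0.05 * rng.standard_normal(x.size)

# The scanner samples with a much larger aperture: average blocks of 50 fine samples.
pitch = 50
scanned = film[: x.size // pitch * pitch].reshape(-1, pitch).mean(axis=1)

# Then each sample is slotted into the nearest of 256 discrete levels (8-bit).
levels = 256
quantized = np.round(np.clip(scanned, 0, 1) * (levels - 1)) / (levels - 1)

print("fine-detail contrast on 'film':         ", round(film.max() - film.min(), 3))
print("contrast after averaging and quantizing:", round(quantized.max() - quantized.min(), 3))
```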
Authors of these reviews will say this doesn't matter, because the human eye can't perceive such differences anyway. And yet they pixel peep the results, something the human eye can't do at normal viewing distances, and present that as the verdict.
I wonder if there is a better way. I'd like to propose an approach that would, on the film side, require the services of a skilled printer. This eliminates me, because at the moment I have no ability to print optically in my darkroom (no enlarger).
In essence, the film should not be scanned. It should be processed and printed optically.
A scene (the Air Force target could be used, but my, how boring) should be chosen with plenty of fine detail. An image should be captured with a film camera and an equivalent digital camera. Ideally, the film and digital cameras would be able to share a lens, so that the lens is taken out of the equation as a variable.
A 100% crop of the digital image, at native printer resolution (e.g., 300 ppi), should be sent to a high quality ink jet printer, printing at, say, 8x10 or perhaps 4x6. The point of the crop is that the printer loses no image detail from the original digital capture; nothing has to be downsampled to fit the paper.
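For reference, the arithmetic behind those print sizes, assuming a 300 ppi native printer resolution and one image pixel per printer pixel:

```python
# Pixel dimensions of the 100% crops mentioned above, assuming 300 ppi native
# printer resolution and a one-to-one mapping of image pixels to printer pixels.
PPI = 300

for w_in, h_in in [(8, 10), (4, 6)]:
    w_px, h_px = w_in * PPI, h_in * PPI
    print(f"{w_in}x{h_in} in print at {PPI} ppi -> crop of {w_px} x {h_px} px "
          f"({w_px * h_px / 1e6:.1f} MP)")
```

So an 8x10 crop needs a 2400 x 3000 pixel region of the capture, and a 4x6 only 1200 x 1800.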
Once the digital crop has been established, an optical enlargement of the film should be made using a high quality enlarger with a professional level lens, operated by a skilled print maker. The enlargement should capture the exact same crop of the image, at the exact same size.
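To give a feel for the matching enlargement, here's a back-of-the-envelope calculation. The 36 mm frame width and 6000 pixel capture width are assumptions purely for illustration; in practice they'd be whatever the actual negative and camera are.

```python
# Rough estimate of the enlarger magnification needed so the optical print covers
# the same crop at the same physical size. Frame and capture widths are assumed.
FRAME_WIDTH_MM = 36.0      # 35 mm full-frame negative (assumed)
CAPTURE_WIDTH_PX = 6000    # assumed digital capture width
CROP_WIDTH_PX = 2400       # the 8-inch-wide 100% crop from the example above
PRINT_WIDTH_IN = 8.0

# The crop covers this fraction of the frame, hence this many mm on the negative.
crop_on_negative_mm = FRAME_WIDTH_MM * CROP_WIDTH_PX / CAPTURE_WIDTH_PX
magnification = PRINT_WIDTH_IN * 25.4 / crop_on_negative_mm
print(f"crop spans {crop_on_negative_mm:.1f} mm of the negative; "
      f"enlarge about {magnification:.1f}x to match the 8x10 print")
```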
The resulting prints should be compared for fine detail capture, as well as any other metrics desired. These prints could be scanned for viewing on the web, but of course, the final evaluation should be done by comparing the actual prints, in controlled light, perhaps with a loupe if needed.
Has this ever been done? It seems to me that this is the only truly fair way to compare film to a digital process.