People also obsess over pixel peeping both digital files and film scans at 100%, which is a level of detail you will never see at a normal print viewing size. If you view at around 33% in Photoshop, you are seeing approximately the print size, assuming you are working at 300 DPI.
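As a rough sketch of where that 33% figure comes from (assuming a hypothetical ~100 PPI monitor, which the comment above does not state), the print-preview zoom is just the ratio of screen pixel density to print resolution:

```python
# Sketch only: the zoom at which one screen pixel covers about the same
# physical size as one printed dot. The 100 PPI monitor value is an assumption.

def print_preview_zoom(monitor_ppi: float, print_dpi: float) -> float:
    """Zoom percentage that approximates the printed size on screen."""
    return 100.0 * monitor_ppi / print_dpi

print(print_preview_zoom(monitor_ppi=100, print_dpi=300))  # ~33.3
```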
Yeah, but AI also works on scanned negative files, so AI cannot be part of the comparison. Check this out: https://petapixel.com/2019/03/04/review-topaz-sharpen-ai-is-amazing/
Progress is being made in AI sharpening tools. I have a theory that AI is going to take over in a way that makes comparing lenses and cameras moot. In about five more years you will throw any crappy image at the AI and it will reconstruct it from the ground up to look however you want it to look. It's hard to imagine how this works. I saw a similar system in 3D rendering: previously, rendering the reflectivity of light on a 3D vase might take, say, an hour, but a new system used AI instead; the AI knew what the scene should look like and constructed it within seconds based on that knowledge. Things are going to get pretty 'out there' soon.
Resolution is only one part of sharpness. Adox CMS 20 II has a resolution of 800 l/mm.
Scanned at 8000 DPI, the resulting resolution is approximately 77 MP (see the rough calculation after the examples below).
Developed as SCALA slide:
Example 1 (38 MB JPG)
Example 2 (38 MB JPG)
Developed as negative:
Example 3 (42 MB JPG)
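For context, the ~77 MP figure works out if the scanned area is slightly smaller than the nominal 36 x 24 mm frame; a full frame at 8000 DPI comes out closer to 85 MP. The sketch below is only that arithmetic, and the cropped area it uses (about 35 x 22 mm) is an assumption, not something stated in the comment:

```python
# Sketch: pixel count of a film scan from scan resolution and frame size.
# The slightly cropped area is an assumption; scanners often trim the frame edges.

MM_PER_INCH = 25.4

def scan_megapixels(width_mm: float, height_mm: float, dpi: float) -> float:
    pixels_wide = width_mm / MM_PER_INCH * dpi
    pixels_high = height_mm / MM_PER_INCH * dpi
    return pixels_wide * pixels_high / 1e6

print(scan_megapixels(36, 24, 8000))  # ~85.7 MP, full 35 mm frame
print(scan_megapixels(35, 22, 8000))  # ~76.4 MP, slight crop -> the ~77 MP above
```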
You have vastly overstated the case about the Nyquist theorem. You say it is deeply flawed, but actually it is a rigorous mathematical theorem. No one has ever discovered a flaw in the theorem.

A word of advice: anytime you see the phrase "Nyquist theory" on the internet, it's best to disregard the whole thing. 99% of the time it's brought up, it's brought up to prove something it was never intended to provide evidence for. The whole point of the Nyquist theorem is to provide a general guideline for minimal engineering standards for analog-to-digital conversion. It was never intended to be proof of anything. It is, by its nature, deeply flawed. For example, the theorem assumes a perfectly bandwidth-limited system, and these do not exist in nature. Therefore, under no circumstances can the Nyquist theorem be applied to any system and be relied upon to give accurate results.
About the only time the Nyquist Theorem should be discussed is when you're designing or implementing an ADC system, and you want to know the ballpark for the bare minimum sampling frequency that you could potentially get away with to keep costs, processing, and/or storage space to a minimum. Even then, it should still be tested in a real world scenario to ensure that the results achieved are in line with what was expected.
So it's a handy theory that has its uses. But more often than not, it's abused on the internet to "prove" some poorly thought-out concept concocted by a neophyte with an axe to grind.
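To make the sampling point concrete, here is a minimal sketch with arbitrarily chosen frequencies: a 7 Hz cosine sampled below its Nyquist rate of 14 samples/s produces exactly the same sample values as a 3 Hz cosine, so the two cannot be told apart afterwards, while a higher sampling rate keeps them distinct.

```python
# Minimal aliasing demo: below the Nyquist rate, two different frequencies
# yield identical samples. The 7 Hz / 3 Hz pair is chosen for illustration only.
import numpy as np

f_true, f_alias = 7.0, 3.0        # 3 Hz is the alias of 7 Hz at fs = 10

for fs in (20.0, 10.0):           # above and below the 14 samples/s Nyquist rate
    t = np.arange(int(fs)) / fs   # one second of sample times
    hi = np.cos(2 * np.pi * f_true * t)
    lo = np.cos(2 * np.pi * f_alias * t)
    print(f"fs = {fs:g}: 7 Hz and 3 Hz samples identical? {np.allclose(hi, lo)}")
```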
guangong: +100 Thank you! Choice is good. Truth is that digital is where the innovation dollars have gone, so many of the advances in photography require digital intervention at some point, even if it's in or after the scan of a negative. But the AF in current production tops what any film camera had. Digital serves certain projects better. Film can be more fun, but it can also take more time to do well. I'd warrant that, for what most snappers want in an image capture, the market has spoken well, and the average snap with digital may well be better than the average snap was with film. Different issue. For my part, I also agree with Cholenpot: "Great picture!" tops "Sharp image!" The latter can almost be a fail... if that's the only thing going for the image.
I was reading some previous comments on how film is still sharper than digital with some films, and how the Nyquist theorem factors into keeping digital from being better. This is for the same size of sensor/film. I was looking at some images of a landscape, and the film image showed more definition in the trees, while the digital image was softer in that area. Same lens. I don't know how the Nyquist theorem mixes into all this. Some films resolve around 100 or more lp/mm, especially B&W. Yet we see some digital sensors challenging medium format, and the digital image is cleaner, without grain getting in the way.
So what is the actual truth on all this, or is there any conclusion?
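One back-of-envelope way to relate those lp/mm numbers to sensor megapixels is sketched below, assuming the bare Nyquist minimum of two pixels per line pair over a 36 x 24 mm frame. It ignores MTF, grain, and lens limits, so treat it as a rough figure rather than a real-world equivalence:

```python
# Rough sketch: pixel count needed to sample a film resolving R lp/mm over a
# 36 x 24 mm frame, at the bare minimum of 2 pixels per line pair. Real systems
# need more margin; grain, MTF and lens limits are ignored here.

def film_equivalent_megapixels(lp_per_mm: float,
                               width_mm: float = 36.0,
                               height_mm: float = 24.0) -> float:
    pixels_per_mm = 2 * lp_per_mm   # Nyquist minimum: 2 samples per line pair
    return (width_mm * pixels_per_mm) * (height_mm * pixels_per_mm) / 1e6

print(film_equivalent_megapixels(100))  # ~34.6 MP for a film resolving 100 lp/mm
```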
As far as most personal photography is concerned, my opinion is that digital vs film sharpness just doesn't matter that much. You can take a poor photograph with the sharpest of systems and it is still a poor photograph. If whatever you are using works for you, then it's good.
Now there are certainly cases where ultimate sharpness is required. If we're going to the moon or Mars, then we want it all! But in the general case, with today's film or digital gear, the point of sufficient sharpness was passed decades ago.
Besides, if you are really concerned about getting the maximum, sharpness probably has much more to do with the quality of the lens in use than with the "sensor".
Somebody can take just as bad a picture on the moon as on Earth anyway.