I get around 100 lp/mm from my Nikon D800, but that is resolution, not sharpness. I was reading some previous comments on how film is still sharper than digital with some emulsions, and how the Nyquist theorem works into keeping digital from being better. This is for the same size of sensor/film. I was looking at some images of a landscape, and the film image showed more definition in the trees, while the digital image was softer in that area. Same lens. I don't know how Nyquist theory mixes into all this. Some films have around 100 or more lp/mm, especially B&W. Yet we see some digital sensors challenging medium format. And the digital image is cleaner, without grain getting in the way.
So what is the actual truth on all this, or is there any conclusion?
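For context on where that "around 100 lp/mm" figure likely comes from: a line pair needs at least two pixels, so a sensor's theoretical sampling limit is roughly 1 / (2 × pixel pitch). Here is a minimal back-of-the-envelope sketch, using approximate published D800 geometry (about 7360 pixels across a roughly 35.9 mm wide sensor) purely as an illustration:

```python
# Back-of-the-envelope sampling limit for a ~36 MP full-frame sensor.
# The figures are approximate Nikon D800 specs, used only as an example.
pixels_across = 7360          # horizontal pixel count
sensor_width_mm = 35.9        # sensor width in mm

pixel_pitch_mm = sensor_width_mm / pixels_across     # ~0.0049 mm (about 4.9 um)
nyquist_lp_per_mm = 1 / (2 * pixel_pitch_mm)         # one line pair spans two pixels

print(f"pixel pitch: {pixel_pitch_mm * 1000:.2f} um")
print(f"sampling limit: {nyquist_lp_per_mm:.0f} lp/mm")
```

That works out to roughly 100 lp/mm before any anti-aliasing filter, lens, or demosaicing losses, which is why the D800 figure and the quoted B&W film figures land in the same ballpark.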
As the bumper sticker said: "My GG is sharper than your honor student"
Youngster here. (relatively)
This is a silly debate in my opinion.
Modern full-frame digital sensors (35mm) run rings around 99% of 35mm film. 35mm is a lo-fi format that was stretched to its resolution limit. Digital has far surpassed what 35mm film can do.
However, I'm not shooting 35mm film for its sharpness or resolution any more than I'm using a fountain pen for its smoothness and accuracy. Film is about obtaining a look and enjoying a process, at least for me. Say what you will, and this is an analog site, but I'm not showing up to a paying humdrum gig with 10 rolls of 35mm. I'd be crazy to shoot a full Bar Mitzvah on film. Sure, I bring some for variety and taste, but when I've got bills to pay, the boring John Henry-killing DSLR gets hauled out and used.
Wise observation from a youngster!
This reminded me of a camera club of almost sixty years ago that broke up because two members were constantly arguing over the difference between "sharpness" and "resolution" and whether there was any. After that, the rest of us made certain that the two of them were never in the same organization again. Come to think of it, the argument was between "sharpness" and "definition". Could "resolution" be another term for "definition"? Folks were serious about their photography in those days. Almost as serious as this group... Regards!

With digital, sharpness is as much a function of the processing as anything else.
It is important to understand that sharpness, resolution, and contrast are all separate but interrelated, with sharpness being the most subjective of the three.
Some of the best film resolution results have low apparent sharpness.
And some of the results that appear to have the highest amount of sharpness - whether film or digital - have relatively low resolution.
And of course, if there is a digital step anywhere between capture and presentation, then apparent sharpness is likely reliant on how that step is handled.
A word of advice. Anytime you see the phrase "Nyquist Theory" on the internet, it's best to disregard the whole thing. 99% of the time it's brought up, it's brought up to prove something it was never intended to provide evidence for. The whole point of the Nyquist theorem is to provide a general guideline for minimal engineering standards for analog-to-digital conversion. It was never intended to be proof of anything. It is, by its nature, deeply idealized. For example, the Nyquist theorem assumes a perfectly bandwidth-limited system, and these do not exist in nature. Therefore, under no circumstance can the Nyquist theorem be applied to a real system and be relied upon to give accurate results.
About the only time the Nyquist Theorem should be discussed is when you're designing or implementing an ADC system, and you want to know the ballpark for the bare minimum sampling frequency that you could potentially get away with to keep costs, processing, and/or storage space to a minimum. Even then, it should still be tested in a real world scenario to ensure that the results achieved are in line with what was expected.
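To make the "bare minimum sampling frequency" point concrete, here's a small numerical sketch (nothing camera-specific, just a sine wave and NumPy): sampled at more than twice its frequency, the signal's frequency is recovered; sampled below that, it aliases to a lower frequency, so fine detail gets misreported rather than merely lost.

```python
import numpy as np

signal_freq = 100.0  # cycles per unit (think of it as fine, repeating detail)

def dominant_freq(sample_rate, duration=1.0):
    """Sample a sine at `sample_rate` and return the strongest FFT bin."""
    n = int(sample_rate * duration)
    t = np.arange(n) / sample_rate
    samples = np.sin(2 * np.pi * signal_freq * t)
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(n, d=1.0 / sample_rate)
    return freqs[np.argmax(spectrum)]

print(dominant_freq(250.0))  # > 2x the signal frequency: recovers ~100
print(dominant_freq(120.0))  # < 2x the signal frequency: aliases to ~20
```

In image terms, that folded-back detail is roughly the same mechanism behind moiré, which is one reason many sensors put an optical low-pass filter in front of the photosites.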
So it's a handy theorem that has its uses. But more often than not, it's abused on the internet to "prove" some poorly thought out concept concocted by a neophyte with an axe to grind.
Plus like 100,000. There are so many forum warriors doing back-of-the-napkin math to tell you what you can and can't do. I think they believe everything has to be held to, like, Plato's standard. It's best to simply ignore these folks and get back to producing work.
I am not sure what "Plato's standard" is, but if you are asking whether film or digital is sharper, you are asking the wrong question.