
Is Film Sharper than Digital or Vice Versa?


braxus
I was reading some previous comments on how film is still sharper than digital for some films, and how the Nyquist theorem factors into keeping digital from being better. This is for the same size of sensor/film. I was looking at some images of a landscape, and the film showed more definition in the trees, while the digital image was softer in that area. Same lens. I don't know how the Nyquist theorem fits into all this. Some films resolve around 100 or more lp/mm, especially B&W. Yet we see some digital sensors challenging medium format. And the digital image is cleaner, without grain getting in the way.

So what is the actual truth on all this, or is there any conclusion?
 
With digital, sharpness is as much a function of the processing as anything else.
It is important to understand that sharpness, resolution, and contrast are all separate but inter-related; sharpness is the most subjective of the three.
Some of the best film resolution results have low apparent sharpness.
And some of the results that appear to have the highest amount of sharpness - whether film or digital - have relatively low resolution.
And of course, if there is a digital step anywhere between capture and presentation, then apparent sharpness is likely reliant on how that step is handled.
 
Conclusion? Never. :smile:
Film is a broad term. Film for topographic survey, mounted in special cameras and flown in planes, is LF film. It still outperforms digital.
But a full-frame digital camera outperforms 135 film. Just compare macro images.
 
It's the whole system used to capture the image, and how the image is viewed. So it depends. Focus errors, camera movement, lens aberrations, and diffraction will work against getting the smallest details onto both a sensor and film. Also, film has grain, but digital has denoiser processing that can remove fine detail.

Then again, so what? Look at a digital image on a 4K screen: that's only about 8 Mp. An 8x10 print only needs about 8 Mp, too. There's very little point in having much more resolution than that.
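The arithmetic behind that claim is easy to check (a quick sketch assuming 4K UHD and the common 300 dpi print rule of thumb):

```python
# Pixel counts behind the "8 Mp is enough" argument.

# A 4K UHD screen:
uhd = 3840 * 2160
print(f"4K UHD: {uhd / 1e6:.1f} Mp")                # ~8.3 Mp

# An 8x10 inch print at 300 dpi (a common print-resolution rule of thumb):
print_px = (8 * 300) * (10 * 300)
print(f"8x10 @ 300 dpi: {print_px / 1e6:.1f} Mp")   # ~7.2 Mp
```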
 
I get around 100 lp/mm from my Nikon D800, but that is resolution, not sharpness.
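That figure is consistent with a quick Nyquist-style calculation (a sketch assuming the usual two-pixels-per-line-pair minimum and the D800's roughly 7360-pixel-wide, ~36 mm sensor):

```python
# Nyquist-limited resolution of a sensor, in lp/mm:
# each line pair needs at least 2 pixels to be resolved.
width_px = 7360              # Nikon D800 horizontal pixel count
width_mm = 36.0              # approximate sensor width
px_per_mm = width_px / width_mm
lp_per_mm = px_per_mm / 2    # two pixels per line pair
print(f"{lp_per_mm:.0f} lp/mm")   # ~102 lp/mm
```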
 
Youngster here. (relatively)

This is a silly debate in my opinion.

Modern full-frame digital sensors (35mm) run rings around 99% of 35mm film. 35mm is a lo-fi format that was stretched to its resolution limit. Digital has far surpassed what 35mm film can do.

However, I'm not shooting 35mm film for its sharpness or resolution any more than I'm using a fountain pen for its smoothness and accuracy. Film is about obtaining a look and enjoying a process, at least for me. Say what you will, and this is an analog site, but I'm not showing up to a paying humdrum gig with 10 rolls of 35mm. I'd be crazy to shoot a full Bar Mitzvah on film. Sure, I bring some for variety and taste, but when I've got bills to pay, the boring, John Henry-killing DSLR gets hauled out and used.
 
Believe me, I'm more a film guy than digital. I have a dedicated fridge freezer in my garage with frozen film inside it. But I shoot digital too. I usually bring both to shoots with me. I like having options, and I still love film too much to stop using it because of expense. Matter of fact, I'm thinking of getting an 8x10 film camera to be the ultimate camera for quality. No digital camera can touch that resolution and its tones. But I use my digital on its own when I really don't care a lot for the shot. It's disposable. I use a D800, so it's not a bad camera.
 
As the bumper sticker said: "My GG is sharper than your honor student"
 
Wise observation from a youngster!
 
hi braxus
i have no idea which is sharper. my digital camera (not my phone) is from, IDK, 2005? my phone is more recent.
i tend to go the other direction, and shoot my digital cameras through a dirty filter or plastic bag.
 
Check this out: https://petapixel.com/2019/03/04/review-topaz-sharpen-ai-is-amazing/

Progress is being made in AI sharpening tools. I have a theory that AI is going to take over in a way that makes comparing lenses and cameras moot. In about 5 more years, throw any crappy image at the AI and it will reconstruct it from the ground up to look however you want it to look. It's hard to imagine how this works. I saw a similar system in 3D rendering. Previously, a renderer had to spend, say, an hour computing the reflectivity of light on a 3D vase, but a new system used AI instead: the AI knew what the scene should look like and constructed it within seconds based on that knowledge. Things are going to get pretty 'out there' soon.
 
I'm not as young as I once was. I grew up with film being the only option. I would never want to go back to those days. I always say that I shoot film because of digital. Without the option of digital, I'm not sure I would have taken up photography at all. Sometimes you just want to take 300 photos and not worry about anything else.

And then I'm the guy who ends up shooting Soviet spy cameras loaded with microfilm and developing at home. "Why not?" is my philosophy with film. This is one thing that digital does not do as well: you can experiment to the umpteenth degree with film. You can with digital too, but it's not really the same.
 
MattKing's distinction between sharpness and resolution reminded me of a camera club of almost sixty years ago that broke up because two members were constantly arguing over the difference between "sharpness" and "resolution" and whether there was any. After that, the rest of us made certain that the two of them were never in the same organization again. Come to think of it, the argument was between "sharpness" and "definition". Could "resolution" be another term for "definition"? Folks were serious about their photography in those days. Almost as serious as this group......... Regards!
 
When you think about it, human vision is similar to digital in that the rods and cones in our eyes send electrical signals to be interpreted by the brain. As Sean points out, technology keeps marching along.

In the end it doesn't matter what tool a photographer uses; it's the choosing of isolated bits from an infinite combination of possibilities that sets them apart. I heard it expressed something like this once: somebody said, "Wow. That's some expensive camera gear. You must be a good photographer," to which the photographer replied, "So, if you buy a complete set of the finest kitchenware, would that make you a chef?"
 
For convenience I now shoot with DX-format DSLRs. They sacrifice image detail for much time saved over darkroom work. My current Nikon D5300 resolves maybe 55 lp/mm on a sensor smaller than full-frame 35mm, while my old Nikon and Leica gear can do better on the larger format. That is a lot of difference. However, I rarely print as large as 16x20, and the DSLR images look sharp enough to satisfy most people at that size. They have the advantage of as much digital sharpening as desired, though that doesn't take the place of the smooth detail in a good enlargement from film. Much of my photography today is pro bono work for others who may not appreciate the advantages of film. It is certainly not practical to use film for that, where the processing and distribution of digital is far easier. I've dabbled in photography for over 70 years and miss the quality of film, but am not blind to the advantages of digital photography for what I can still do in my old age.
 

That’s a loaded question if I ever saw one.

You’re always going to have people say the film will always have more resolution/sharpness/whatever.

I’d propose a couple of measurement criteria:

Resolution is the number of line pairs per mm rendered above a 50% contrast ratio, commonly referred to as spatial resolution.

Sharpness is the amount of contrast there is between the dark and light parts of the edges in the image.

You do have to have a minimum amount of resolution to render a certain level of sharpness. You can’t have contrast between a dark and light part if there’s no spatial resolution to render a dark and light part.

Digital sensors tend to have a contrast response near 100% all the way up to the physical spatial resolution limit of the sensor, whereas film rolls off contrast as spatial frequency increases. This means that film, by the measurements defined above, tends to leave a lot of "meat on the bone" in the resolution and sharpness departments, whereas with digital, what you see is all you'll get. There's no more resolution or hidden detail to pull out, because the contrast response is always well above the 50% mark.

Why 50% contrast? Even though we have instrumentation that can measure resolution below 50% contrast, our eyes are actually relatively insensitive to contrast below that level, so that's resolution we tend not to see. It can be brought out either with a developer that "punches up" the image or, assuming you scanned the film with enough spatial resolution, with sharpening. When you sharpen an image, all you're doing is adding contrast along edges. You can tune that to add contrast down in the low-contrast-response areas, thereby significantly boosting perceived resolution and sharpness.
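The point that sharpening just adds contrast along edges can be shown with a minimal sketch (plain Python, a 1-D unsharp mask; the step signal and amount are made-up illustration values):

```python
# Minimal 1-D unsharp mask: sharpened = original + amount * (original - blurred).
# Values near an edge overshoot/undershoot, i.e. edge contrast increases.
def box_blur(signal, radius=1):
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - radius), min(len(signal), i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def unsharp(signal, amount=1.0):
    blurred = box_blur(signal)
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

edge = [0, 0, 0, 0, 1, 1, 1, 1]       # a hard step edge
sharpened = unsharp(edge, amount=1.0)
print(sharpened)  # undershoot below 0 and overshoot above 1 around the edge
```

Flat regions far from the edge are untouched; only the transition gains contrast, which is why sharpening boosts perceived (not actual) resolution.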

As @MattKing said, they're separate, but also pretty inter-related.
 
I wouldn't worry about which is sharper. Sharpness and a good photograph have little to do with each other.

“I believe there is nothing more disturbing than a sharp image of a fuzzy concept!” -- Ansel Adams
 
35mm digital passed 120 film for sharpness back around 2007, at the 8-megapixel mark. Best to concentrate on subject matter and lighting. I know a semi-famous photographer who dropped $30K on a 45 MP Phase One medium-format digital back; he shoots street photos with it which are uniformly dull and not worth anyone's attention. Sharpness and detail aren't everything. 45 MP of boring is... boring.
 
A word of advice: any time you see the phrase "Nyquist theorem" on the internet, it's best to disregard the whole thing. 99% of the time it's brought up to prove something it was never intended to provide evidence for. The whole point of the Nyquist theorem is to provide a general guideline for minimum engineering standards for analog-to-digital conversion. It was never intended to be proof of anything. It is, by its nature, an idealization. For example, the Nyquist theorem assumes a perfectly band-limited system. These do not exist in nature. Therefore, applied blindly to a real system, it cannot be relied upon to give accurate results.

About the only time the Nyquist theorem should be discussed is when you're designing or implementing an ADC system and you want a ballpark for the bare minimum sampling frequency you could potentially get away with, to keep costs, processing, and/or storage space to a minimum. Even then, it should still be tested in a real-world scenario to ensure that the results achieved are in line with what was expected.

So it's a handy theorem that has its uses. But more often than not, it's abused on the internet to "prove" some poorly thought-out concept concocted by a neophyte with an axe to grind.
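For anyone curious what the theorem actually says, here's a minimal illustration of aliasing (the 10 Hz sample rate and 11 Hz tone are arbitrary demo values):

```python
import math

# The Nyquist criterion in one line: a signal sampled at rate fs can only
# represent frequencies below fs/2. Anything higher "aliases" down.
# Here an 11 Hz tone sampled at 10 Hz lands on exactly the same sample
# values as a 1 Hz tone -- after sampling, the two are indistinguishable.
fs = 10                                   # samples per second; Nyquist limit = 5 Hz
t = [n / fs for n in range(20)]           # two seconds of sample times
tone_11hz = [math.sin(2 * math.pi * 11 * x) for x in t]
tone_1hz = [math.sin(2 * math.pi * 1 * x) for x in t]

assert all(abs(a - b) < 1e-9 for a, b in zip(tone_11hz, tone_1hz))
print("11 Hz sampled at 10 Hz is indistinguishable from 1 Hz")
```

In imaging terms: detail finer than two pixels per line pair doesn't just disappear, it can fold back as false patterns (moiré), which is why many sensors carry anti-aliasing filters.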
 

Plus like 100,000. There are so many forum warriors doing back-of-the-napkin math to tell you what you can and can't do. I think they believe everything has to be held to, like, Plato's standard. It's best to simply ignore these folks and get back to producing work.
 
I am not sure what "Plato's standard" is, but if you are asking whether film or digital is sharper, you are asking the wrong question.
 