Unsurprisingly, lenses for national security and scientific applications are better than those available to photographers.
Leica might take umbrage at that…!
Unsurprisingly, some developers provide more acutance than others, so pick the one most appropriate for the look you are trying to achieve. None of this is rocket science.
More than that, it completely ignores the factor of noise, and it compares a wave observed over time with a wave observed all at once, as a whole packet. Whatever the differences between analog and digital may be, they are not explained in this chart.
Interesting post. I take it you mean numerically higher f-stops? That's the only thing that makes sense in my mind.

With digital, you can think of the sensor as an array of tiny lenses, one microlens per pixel, capturing the light delivered by the taking lens. These microlenses become relevant in the question of diffraction. This is why digital shows diffraction effects at wider apertures than film does, and why early point-and-shoot digital cameras set their default shooting aperture around f/5.6 or so. These days, a lot of this has been remediated by better sensor design and in-camera digital post-processing, but the underlying physics doesn't change.
In many respects, the difference comes down to film being asked to capture and record what is there, whereas digital, with its many sophisticated in-camera optimization algorithms, is, to one degree or another, constructing an image of what is there. Post-processing tools like Topaz AI then take this "construction" of the image to a whole new level, using a battery of machine-learning tools to try to "fix" or "improve" the image into what it should have been, irrespective of what was actually captured.
I say this without judgement, only to note that they are rather different approaches to capturing the same light.
Helge - I am 100% with you there concerning Delta 100's "little more bite" of edge effect in comparison to TMX 100's more "velvety" look (which I prize for portraiture, but not for landscapes). But TMX is more useful to me because the straight line or semi-straight line extends about a full stop deeper into the shadows. And this gives me two benefits: true 100 speed (whereas I have to shoot Delta at 50 to get the shadows further up the curve), and better performance in the very high scene-contrast ranges typical of my work, especially in the mountains. TMY also has a little better spectral sensitivity for my purposes. But I did take the trouble to figure out how to make Delta 100 work as a substitute if necessary (some of its filter factors are different). Hence my Perceptol 1:3 tweak, for the sake of a little more "tooth" to the TMX grain.
I always keep rolls of TMX 120 on hand for general 6x7 and 6x9 shooting, as well as sheets of 4x5 and 8x10 for the sake of masking in the lab, though those get used at times for shooting too. But for general sheet-film shooting, TMY400 is my preferred choice, and it gives good edge acutance with my routine PMK pyro development. I sometimes shoot it in rolls too, if the wind and weather are so bad as to demand handheld work. And in the case of 35mm, I nearly always shoot handheld and only print small, so TMY400 is my routine choice there as well.
Well, I paid less for a huge freezer than a box of some 8x10 films costs now. It was a very wise investment, and the only realistic way I can afford to keep using TMY, TMX, and Ektar film - out of my own frozen stockpile.
Interesting post. I take it you mean numerically higher f-stops? That's the only thing that makes sense in my mind.
First, analog is a lot clumsier to post process in the darkroom than something from a digisnapper going onto a Mac for quick fixes. I have worked in the darkroom for 50 years and I am well aware how much effort it takes to really get a silver print right.
This is what originally sold me on computer post processing—the ability to do manipulations undreamed of in the darkroom. That, and sitting in a comfortable chair in normal room lighting, makes for a different, and sometimes more pleasant experience.
I agree with earlier posts in the thread that the controls programs like Photoshop offer are frequently overdone to the point the result looks artificial. I try to use the minimal amount of sharpening possible and rarely boost saturation to try to get that Velvia look.
On the other hand, mixing up chemicals, loading reels, and developing film is also relaxing, in a different way. Ditto for printing, although I’m not at the point yet where I can do that.
But in the end, it’s all photography, and it’s all good, whether analog, digital, or a combination of both.
I mean that digital will start showing diffraction effects at a wider opening than film will, ignoring post processing corrections. Where a film SLR may not diffract significantly until f/16 or f/22 (depending on lens), the equivalent digital capture system might start to show diffraction as early as, say, f/8.
See this for the gory details:
https://www.cambridgeincolour.com/tutorials/diffraction-photography-2.htm
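To put rough numbers on the "digital diffracts earlier" claim, here is a back-of-the-envelope sketch of my own (not taken from the linked article). It uses the common Airy-disk approximation (disk diameter ≈ 2.44 · wavelength · f-number) and assumes diffraction becomes visible once the disk spans about two pixel widths on a sensor, or the conventional circle of confusion on 35mm film; the pixel pitch and CoC values below are illustrative assumptions, not measurements.

```python
# Rough diffraction-onset estimate for green light (~550 nm).
# Assumptions: Airy-disk diameter ~ 2.44 * wavelength * f-number,
# and diffraction is "visible" once the disk reaches the smallest
# feature the medium can resolve (2 pixel widths, or the film CoC).
WAVELENGTH_MM = 0.00055  # 550 nm expressed in millimetres

def airy_disk_mm(f_number):
    """Approximate Airy-disk diameter in mm at a given f-number."""
    return 2.44 * WAVELENGTH_MM * f_number

def diffraction_limited_f(feature_mm):
    """F-number at which the Airy disk grows to feature_mm."""
    return feature_mm / (2.44 * WAVELENGTH_MM)

pixel_pitch_mm = 0.0043  # assumed ~4.3 um pixels, typical of many sensors
film_coc_mm = 0.03       # conventional 35 mm film circle of confusion

print(f"digital onset ~ f/{diffraction_limited_f(2 * pixel_pitch_mm):.1f}")
print(f"film onset    ~ f/{diffraction_limited_f(film_coc_mm):.1f}")
```

With these assumed values the sketch predicts onset around f/6.4 for the sensor and around f/22 for film, the same ballpark as the figures quoted above.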
This effect may be weaker in practice, since typical Bayer filters are not all that selective. As a result, most demosaicing filters yield pixel-accurate luminance resolution; it's only the colour information that gets blurry.
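As a toy illustration of why luminance survives demosaicing better than colour (my own sketch, not any particular camera's pipeline): in an RGGB Bayer mosaic, half the photosites sample green, which dominates the luminance signal, while red and blue each get only a quarter of the sites.

```python
# Toy sketch: per-channel sample density in an RGGB Bayer mosaic.
# Green (the main luminance contributor) is sampled twice as densely
# as red or blue, so chroma must be reconstructed from sparser data.

def bayer_pattern(rows, cols):
    """Return the RGGB colour-filter letter at each photosite."""
    tile = [["R", "G"], ["G", "B"]]
    return [[tile[r % 2][c % 2] for c in range(cols)] for r in range(rows)]

def sample_fractions(rows=4, cols=4):
    """Fraction of photosites devoted to each colour channel."""
    flat = [cfa for row in bayer_pattern(rows, cols) for cfa in row]
    return {ch: flat.count(ch) / len(flat) for ch in "RGB"}

print(sample_fractions())  # green 0.5, red and blue 0.25 each
```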
It has more to do with the topography and shape of the sensor cells. They almost always sit in a well structure, almost always with a microlens on top, and they are always polygonal, never the optically ideal round shape.
Sensor cells can also affect each other electrically and even optically.
Bayer filters ride the uneasy edge between prioritizing luminance information and colour, never able to reach a perfect compromise.
Luminance information is affected even with the usually desaturated filters, especially on a coloured surface like red fabric, or even a brick wall at a distance.
The solution is of course to have three sensors, as was common for video. But then you run into problems with the exact optical matching of the three sensors.
Digital sensors seem to do a pretty good job. Film grain and dye clouds are not uniform and uniformly dispersed, and they do a pretty good job too.
They do the job quite differently. With appropriate artifacts to match.
That’s the whole point of this thread.
I guess the best suggestion so far was "because at low contrast, film doesn't show much resolution", an effect which by far exceeds all effects of diffraction, lens sharpness, and nominal film/sensor resolution. This, plus the massive onslaught of image-improvement algorithms on anything that happens to end up on a sensor ("maximum-likelihood-correct the data toward the nearest typical subject matter").
Digital sensors seem to do a pretty good job. Film grain and dye clouds are not uniform and uniformly dispersed, and they do a pretty good job too.
You could turn that around and say that “because of necessary edge detection and contrast enhancement, digital shows selective sharpness and smudge elsewhere.
And because of aliasing from various sources in the matrix and in digitization, digital shows harshness and artificial sharpness where there isn’t any”.
Film shows resolution in a different way, one I personally find much more pleasing, and one that leans into and parallels what psychoacoustics tells us our ears like: a frequency response with a slightly down-slanted curve toward the highs.
Generally there are more overlaps than not in the very general ways we respond across the two senses of sight and hearing.
Perfect post to sneak in right before mine... Different media help the artist to see their world in different ways. I shoot film one way. I shoot digital an entirely different way. More importantly, I don't try to map one to the other. They are just... different, and I treat them as nearly unrelated activities...