Thinking out loud here about the scanner vs. dSLR debate for reproducing images on film...
Back in the day (1960s through 1970s), per the published lens test reports in Modern Photography or Popular Photography, a truly exceptional lens could resolve 120 lp/mm, an 'excellent' lens could achieve about 80 lp/mm, and a 'good' (i.e. most) lens could do about 64 lp/mm, assuming a super-fine-grained film like Panatomic-X was used for testing. Quite a few of the test results are archived here:
http://www.edsawyer.com/lenstests/
120 lp/mm would mean at least 240 pixels per millimeter, since the Nyquist sampling theorem requires at least two samples per cycle (in practice slightly more, hence figures like 2n + 1 = 241). So, for the sake of discussion, let's use 80 lp/mm optical resolution captured with fine-grained film and round the Nyquist sampling rate to 160 pixels per millimeter.
- (160 × 24 mm) × (160 × 36 mm) = 3840 × 5760 ≈ 22.12 megapixels to recreate 80 lp/mm of combined optics + film resolution, usually the maximum on film.
Anything more from a scanner or from dSLR would be overkill, wouldn't it?!
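The arithmetic above is easy to sanity-check with a few lines of Python (the 160 px/mm figure is just the rounded two-samples-per-line-pair rate for 80 lp/mm from the discussion):

```python
# Rough megapixel estimate for digitising a 24 x 36 mm frame,
# assuming 80 lp/mm combined lens+film resolution and ~2 samples
# per line pair (Nyquist), rounded to 160 pixels per millimeter.
PX_PER_MM = 160            # 80 lp/mm * 2 samples per line pair
FRAME_H_MM = 24
FRAME_W_MM = 36

height_px = PX_PER_MM * FRAME_H_MM    # 3840
width_px = PX_PER_MM * FRAME_W_MM     # 5760
megapixels = height_px * width_px / 1e6

print(f"{height_px} x {width_px} = {megapixels:.2f} MP")
# prints "3840 x 5760 = 22.12 MP"
```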
Ok, this is fundamentally a deceptive and pretty complex topic.
Achieving 100 line pairs per millimeter is not unrealistic or that hard, if that is the priority.
Even a so-so lens will achieve high resolving power, at least 100 lp/mm stopped down, in the center.
A macro or apo lens, or just a very good normal lens can achieve 100 lp/mm at modest apertures.
If you couple that with slide film or a slowish negative film like Portra 160 or TMX, you'll be able to get close to the theoretical data-sheet number.
So 100 × 2 samples over 36 × 24 mm is 7200 × 4800 = 34,560,000 pixels, i.e. about 34.5 MP, in a theoretical matrix perfectly aligned to the lines.
But to actually match film with real-world photos, you can comfortably double or triple that number.
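The same quick arithmetic for the 100 lp/mm case, including the suggested real-world 2-3x margin:

```python
# Megapixel estimate at 100 lp/mm over a 36 x 24 mm frame,
# again assuming 2 samples per line pair.
px_per_mm = 100 * 2                              # 200 px/mm
base_px = (px_per_mm * 36) * (px_per_mm * 24)    # 7200 * 4800

print(f"{base_px:,} px = {base_px / 1e6:.1f} MP")
# prints "34,560,000 px = 34.6 MP"

# Real-world margin: double or triple to capture detail that is
# not perfectly aligned to the pixel grid.
for factor in (2, 3):
    print(f"{factor}x: {base_px * factor / 1e6:.0f} MP")
```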
But those are line pairs, not the sine waves/cycles that Nyquist/Shannon actually talks about. And certainly not pixels.
And Nyquist/Shannon is about sampling a signal.
The signal on film is, as alluded to earlier, mixed with a special kind of substrate noise called grain.
You need to sample that too, to get the full signal.
Noise on noise always creates new unpredictable noise.
The aliasing of the matrix, the sensor readout noise and the grain creates new artifacts, if sampled too sparsely.
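A tiny one-dimensional illustration of why sparse sampling creates new artifacts (a generic sketch, not tied to any particular film workflow): a 9 Hz sine sampled at only 10 Hz produces exactly the same sample values as a 1 Hz sine of opposite sign, so detail above the Nyquist limit reappears as a false low-frequency pattern.

```python
import math

# Aliasing in one dimension: a 9 Hz sine sampled at only 10 Hz
# (well below its Nyquist rate of 18 Hz) is indistinguishable
# from a 1 Hz sine of opposite phase.
fs = 10            # sampling rate, Hz
n_samples = 20

high = [math.sin(2 * math.pi * 9 * n / fs) for n in range(n_samples)]
low = [-math.sin(2 * math.pi * 1 * n / fs) for n in range(n_samples)]

# The two sample sequences agree to floating-point precision:
assert all(abs(a - b) < 1e-9 for a, b in zip(high, low))
```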
There is a reason why optical special effects, with their multiple generations, were done on larger formats and then printed down to 35mm. That was not about sampling, but simply about interference patterns.
This effect of noise on noise is actually used to generate new textures cheaply in CGI. Look up Perlin noise.
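To make the "layered noise" idea concrete, here is a minimal sketch of 1-D value noise, a simpler cousin of Ken Perlin's gradient noise: several octaves of smoothly interpolated random values are summed, each octave adding finer but weaker detail. All names and parameters here are illustrative, not from any library.

```python
import math
import random

def value_noise(x, seed=0):
    """Smoothly interpolate between random values at integer lattice points."""
    def lattice(i):
        # Deterministic pseudo-random value in [-1, 1] for lattice point i.
        random.seed(hash((i, seed)))
        return random.uniform(-1.0, 1.0)
    i = math.floor(x)
    t = x - i
    t = t * t * (3 - 2 * t)               # smoothstep easing
    return lattice(i) * (1 - t) + lattice(i + 1) * t

def fractal_noise(x, octaves=4):
    """Sum octaves: each doubles the frequency and halves the amplitude."""
    total, amplitude, frequency = 0.0, 1.0, 1.0
    for octave in range(octaves):
        total += amplitude * value_noise(x * frequency, seed=octave)
        amplitude *= 0.5
        frequency *= 2.0
    return total
```

Sampling `fractal_noise` along a line (or its 2-D analogue across a plane) yields the cloud-like textures used in CGI.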
When colour and more chaotic signals get involved, things become impossible for a simplistic reading of the theorem.
There is a reason why the term “oversampling” exists.
An OK alternative to macro-and-stitch is to take multiple shots while "bumping" the film carrier slightly between each, and then merge the exposures.
This will take out much of the Bayer-filter effect and oversample the frame pseudo-randomly.