gmikol
So your explanation to me reads that the sensor reads the whole film plane at once which doesn't make sense to me.
The PPI should not change at all no matter what size the film is...
To clarify, when reading 35mm or 120 with a given scanner, the PPI does not change. But the sensor itself has a fixed number of pixels. So if you make an engineering decision to cover a larger film area when designing the scanner, that affects what the maximum PPI can be if you keep the sensor the same, because you have the same number of pixels covering more inches of film (fewer pixels per inch). Put another way, for a 4000 PPI scanner to cover just a 35mm frame (~1 inch wide), you need a sensor with 4000 pixels (4000 pixels per inch of film). But to image a 120 frame (~2.25 inches wide) at 4000 PPI, you need a sensor with ~9000 pixels, and to image a 4x5 frame at 4000 PPI you need a sensor with ~16000 pixels.
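If it helps to see that as a quick calculation, here's a sketch in Python using my rounded frame widths from above (not exact frame dimensions):

```python
# Sketch of the arithmetic above: sensor pixels needed across the film
# = film width in inches * target PPI. Widths are rounded examples.

def sensor_pixels_needed(film_width_in, ppi):
    return film_width_in * ppi

for name, width_in in [("35mm (~1 in across)", 1.0),
                       ("120 (~2.25 in across)", 2.25),
                       ("4x5 (4 in across)", 4.0)]:
    print(f"{name}: ~{sensor_pixels_needed(width_in, 4000):.0f} pixels at 4000 PPI")
# -> 4000, 9000, and 16000 pixels respectively
```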
Film scanners don't image the whole film plane at once (like a camera/enlarger). Rather, the sensor in a scanner images a single "line" of the film parallel to the width of the roll, then the motor advances the head (e.g., for a 4000 PPI scanner, the head advances 1/4000") and it images another line, and so on.
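As a rough sketch of what that stepping amounts to (again with rounded frame lengths; the exact numbers vary by scanner and frame format):

```python
# In a stepped line scan, the head advances 1/PPI of an inch per step,
# so the number of line reads is roughly frame length (inches) * PPI.

def line_reads(frame_length_in, ppi):
    return round(frame_length_in * ppi)

print(line_reads(1.4, 4000))    # ~5600 reads for a 35mm frame (~1.4 in long)
print(line_reads(2.25, 4000))   # ~9000 reads for a 6x6 120 frame (~2.25 in long)
```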
So, based on my example above, if you have a 4000 PPI, 120-capable scanner and you're scanning a 35mm frame, each line that gets read is data for ~9000 pixels, but only ~4000 of those pixels are useful data. So it will likely be slower than a 4000 PPI scanner that just covers 35mm, which only has to read 4000 pixels per line.
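To put a rough number on that (same rounded figures; whether a given scanner is actually slower depends on its electronics and firmware):

```python
# Sketch of the wasted readout when a 120-width sensor scans a 35mm frame:
# each line read covers ~9000 pixels, but only ~4000 fall on the frame.

pixels_per_line_120_sensor = 9000   # full line readout on the 120-capable scanner
useful_pixels_35mm = 4000           # pixels actually covering the 35mm frame

fraction_useful = useful_pixels_35mm / pixels_per_line_120_sensor
print(f"Useful data per line: {fraction_useful:.0%}")   # roughly 44%
```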
Hope that helps, instead of just being redundant.
--Greg