I ran across this article, which did a bunch of testing to see how many discrete tone steps humans can detect over a given luminance range, and thought it would be interesting to post here. It's in the context of medical imaging display bit depths, but one could argue it also applies to how many bits we really need from our scanners when scanning black and white film. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3043920/

If you don't want to read the whole article, the main takeaway is that ~900 simultaneous discrete luminance tone steps, under the most ideal viewing conditions possible, is the upper limit of the human visual system, which is well inside 10 bits. This actually makes a lot of sense: the Cineon system started out at 10 bits and stayed at 10 bits for the longest time (and still pretty much is), and nobody ever complained about its tonal resolution or about image posterization.

So this raises the question: is it really necessary to scan B&W film at 16 bits? It certainly won't hurt, but I suspect that anything much beyond 10-11 bits is added information we can't actually perceive, even on a high bit depth monitor. I realize most scanners only do 8 bits or 16 bits, so maybe the real question is: after scanning at 16 bits, is it really necessary to store all 16 bits? Fewer bits would certainly make for a smaller file size.
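Just to put some numbers on the "well inside 10 bits" claim, here's a quick back-of-envelope sketch in Python. The frame dimensions are purely hypothetical (a made-up 4000x6000 scan, one sample per pixel, uncompressed), they're just there to show the rough storage difference between bit depths:

```python
import math

# ~900 discrete tone steps is the reported upper limit of human vision.
steps = 900

# Smallest bit depth that can encode that many levels:
bits_needed = math.ceil(math.log2(steps))
print(bits_needed)        # 10 (since 2^9 = 512 < 900 <= 1024 = 2^10)

# Levels available at common scanner/storage bit depths:
print(2 ** 8)             # 256   -- 8 bits, fewer than 900
print(2 ** 10)            # 1024  -- 10 bits, just covers it
print(2 ** 16)            # 65536 -- 16 bits, far beyond it

# Hypothetical uncompressed single-channel scan, 4000x6000 pixels:
pixels = 4000 * 6000
mb_16bit = pixels * 16 / 8 / 1e6
mb_10bit = pixels * 10 / 8 / 1e6  # assuming tightly packed samples
print(mb_16bit, mb_10bit)  # 48.0 vs 30.0 MB
```

So 8 bits genuinely falls short of the ~900-step limit, while 16 bits carries a lot of headroom you arguably can't see; a packed 10-bit file would be a bit over half the size of the 16-bit one.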