I have been using a scanner as a densitometer. Before that I used a spot meter, which worked fine, but the scanner is easier. A scanner is not linear, and you have no idea what densities its output corresponds to, so I measured what the scanner reports for a Stouffer step tablet and created a correction function that converts the scanner's output into proper density. So far, so good. I used that method successfully for a couple of years.

Then I started experimenting with tanning and staining developers. During those experiments, and during Zone System calibration for Pyrocat-HD, I had access to a real densitometer. Guess what? It showed that all my readings were off, and the higher the density, the larger the error. Up to about Zone V my scanner method had given accurate enough densities, but beyond that the error began to grow. I first noticed this with tanned and stained negatives, but soon realized the error was similar with all negatives, regardless of developer.

So what could be the reason? I tested the Stouffer step tablet with the densitometer: all readings were exact. My correction function gives exactly right density readings from the Stouffer tablet, but from a real negative it is different. Why? Does the scanner somehow see the density of silver film differently than it sees the test tablet?
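In case it helps, the correction function I describe is basically just an interpolation table between scanner readings and the tablet's known densities. Here is a minimal sketch in Python of how such a function can be built; the calibration numbers are made up for illustration, not my actual readings, and piecewise-linear interpolation is only one reasonable choice.

```python
import bisect

# Illustrative calibration pairs (NOT real measurements):
# what the scanner reports for each step vs. the Stouffer
# tablet's known calibrated density for that step.
scanner_d = [0.05, 0.40, 0.80, 1.10, 1.35, 1.55]  # scanner's raw reading
true_d    = [0.05, 0.45, 0.95, 1.45, 1.95, 2.45]  # tablet's known density

def correct_density(raw):
    """Map a raw scanner reading to true density by
    piecewise-linear interpolation between calibration points."""
    if raw <= scanner_d[0]:
        return true_d[0]
    if raw >= scanner_d[-1]:
        return true_d[-1]
    i = bisect.bisect_right(scanner_d, raw)
    x0, x1 = scanner_d[i - 1], scanner_d[i]
    y0, y1 = true_d[i - 1], true_d[i]
    return y0 + (y1 - y0) * (raw - x0) / (x1 - x0)
```

With this table, a raw scanner reading of 0.60 (halfway between the 0.40 and 0.80 calibration points) comes out as 0.70. The catch, as my question above suggests, is that a table calibrated against the step tablet does not necessarily transfer to pictorial negatives.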