So, as I'm planning for the next revision of the hardware, I finally decided to go ahead and do some basic temperature testing of the meter probe. It took a bit of work to rig this all up, because the easiest way to do the test involved running the meter probe from a computer (instead of my timer unit), and because I needed to illuminate the sensor via a fiber optic cable coming from a light source outside of the test rig.
In any case, I finally got all the pieces together this evening and collected some data.
I used a similar thermal ramp method to the one I used to test the densitometer's thermal performance, which involves a freezer and a heated 3D printer filament drying chamber. It's great for ramp testing, but I'd obviously want something fancier if I were trying to build an actual thermal profile. So the data is a bit noisy, given that the hardware doesn't heat evenly and never really reaches a steady state, but it's still good for answering some basic questions.
Those questions being:
- Does temperature affect the sensor's performance?
- Does temperature affect the sensor's performance enough to actually matter?
- Is it worth adding a temperature sensor, renting a proper thermal chamber, and building an actual thermal calibration profile?
For these tests, I had the sensor at a fairly high gain setting and the light source at its dimmest. All data is normalized around 25C. (The sensor gave readings that would have been in the ballpark of 0.15 lux if it had picked up the light under my enlarger in a normal meter probe configuration.)
Going all the way from 0-45C, this is what the graph looks like:
The graph shows differences in reading in stops (log2), and the total range is about 1/4 stop. The jaggedness of the line is most likely due to inconsistencies in the test method, not sensor behavior. I'd expect a much smoother line with higher quality steady-state testing.
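For anyone curious how readings get converted into this kind of graph, here's a minimal sketch of the normalization. This is not my actual analysis code, and the sample values are hypothetical, chosen only to roughly mimic the ~1/4 stop spread described above: each reading is divided by the 25C baseline reading, and the ratio is expressed in stops via log2.

```python
# Sketch of normalizing sensor readings to a 25C baseline and
# expressing the drift in stops (log2 of the ratio to baseline).
# The (temperature_C, raw_reading) pairs are hypothetical example
# values, not measured data from the actual test.
import math

samples = [(0, 0.92), (10, 0.96), (25, 1.00), (35, 1.05), (45, 1.09)]

# Use the 25C reading as the reference point
baseline = next(reading for temp, reading in samples if temp == 25)

# Drift in stops: 0.25 would mean a quarter-stop difference
drift_stops = {temp: math.log2(reading / baseline) for temp, reading in samples}

total_range = max(drift_stops.values()) - min(drift_stops.values())
print(f"total range: {total_range:.3f} stops")
```

With these made-up numbers the full 0-45C spread works out to roughly a quarter stop, which is the same ballpark as the real test.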
Someone here mentioned an expected usage temperature range of 17-30C, so I plotted that too:
In this case, we're only talking about a range under 1/10th of a stop.
While it might be a good idea, for the sake of completeness, to test a range of light levels and sensor settings, I think this particular test is likely to end up being close to the worst case scenario.
So I guess now the question becomes... Is it worth going through all the trouble of building in temperature compensation to improve accuracy by an amount likely below the typical adjustment increment, for people who build their paper profiles during a summer heat wave and then print in the dead of winter, using a darkroom that's located in an uninsulated shack in their backyard?
Unless I'm missing something, which maybe I am, in which case I'm sure you'll all be happy to fill me in.
