Can you provide us with an explanation, geared for the layman, about how often/when such calibration should be done, either in terms of time intervals, volume of photos processed, or whatever threshold?
And then also comment about throughput (let's say 1995 vs. now) and how that increases the need to calibrate or the challenge of maintaining calibration?
I could probably sit down and write 50 or 100 pages on this (but I won't). People here don't know me, but way back when, I spent a number of years as the Quality Control manager in a large chain studio lab, printing vast quantities of paper every day; I would imagine dozens of times more than Bob. But it was a completely different sort of work; we were essentially a high-volume picture factory, producing work that had to stay within certain quality standards.
The sort of checks we'd do included processing machine checklists by the operator every shift: visual checks, confirming that circulation and replenishment pumps were working, wash water flow rates were correct, chemical temperature readings and dryer temperatures were OK, squeegees were all OK, and that sort of thing. We typically ran about 3 control strips per shift on the paper processors, relying on the operator to notice if something drastic happened in between.

For film we were more finicky - our cine machines wouldn't start production in the morning until my department approved a control strip and a "scratch test": roughly 20 feet of film, half fully exposed and half clear, given a 5 or 10 minute examination for any traces of scratches. After startup, another control strip every couple of hours - even with hundreds of gallons of developer in the machine, a developer replenisher problem could run it out of process spec in that time. All told, we ran about 50 control strips per day on a one-shift operation.

If anything unusual happened on an individual control plot, someone from my department would check out the machine - compare the temperature readout against a reference thermometer, measure the machine speed, perhaps recalibrate the replenishment pumps, or even pull chemical samples for our chem lab. If all the processors showed the same sort of control plot shift, we'd have our chem mix operator switch to a different replenisher tank and isolate the questionable tank until our chem lab found what went wrong.
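For anyone who likes to see that triage rule spelled out, here is a minimal sketch in Python of how the decision went: a drift on one processor meant "go check that machine," while the same drift on every processor pointed at the chemistry. The aim densities, action limit, and processor names are made up for illustration; the real work was done against Kodak's published aims and plotted by hand on control charts.

```python
# Hypothetical aim densities for the control strip patches and an action
# limit around them, in density units. Illustrative numbers only.
AIM = {"D-min": 0.08, "mid-scale": 0.75, "D-max": 2.10}
ACTION_LIMIT = 0.05  # how far a patch may drift before someone investigates


def check_strip(readings: dict[str, float]) -> dict[str, float]:
    """Return the patches that drifted past the action limit, with the drift amount."""
    return {
        patch: readings[patch] - aim
        for patch, aim in AIM.items()
        if abs(readings[patch] - aim) > ACTION_LIMIT
    }


def triage(strips_by_processor: dict[str, dict[str, float]]) -> str:
    """Decide whether a drift looks machine-specific or chemistry-wide."""
    bad = {
        name: drift
        for name, strip in strips_by_processor.items()
        if (drift := check_strip(strip))
    }
    if not bad:
        return "all processors in spec"
    if len(bad) == len(strips_by_processor):
        return "all processors shifted together: isolate the replenisher tank"
    return f"check these machines: {', '.join(sorted(bad))}"


# Example: two paper processors, one of them running a bit too dense mid-scale.
print(triage({
    "paper-1": {"D-min": 0.08, "mid-scale": 0.76, "D-max": 2.11},
    "paper-2": {"D-min": 0.09, "mid-scale": 0.83, "D-max": 2.14},
}))
```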
You may think it's a little nutty to go to these extremes, but the sort of work we did demanded it. Film was "analyzed" on Kodak PVACs and then printed on long-roll (~575 ft) printers - any shift along the way meant that a lot of paper was going in the trash.
We even did semi-random screening of image stability a couple of times a year. The point of this was that if we ran into some undetected chemical or water problem affecting image stability, we would hope to learn about it within several months, as opposed to years later when customers might begin returning defective prints.
In later years, when optical printing went out of favor, machines such as the Noritsu "print to process" units were not so sensitive to chemical process variations. They had built-in calibration routines where the machine could "read" its own step wedge results and make corrections to the printing exposure. So instead of correcting a chemical problem right away, one could just run a machine recalibration and the printed work would come right back to aim. For details on how this works, each machine came with about an 8 inch tall stack of printed manuals.
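To give a flavor of the closed-loop idea those manuals describe - print a step wedge, read the densities back, nudge the printing exposure toward aim - here is a very simplified sketch in Python. The aim densities and the 0.3-density-per-stop paper response are my own illustrative numbers, not anything out of a Noritsu manual.

```python
AIM_DENSITIES = [0.10, 0.80, 1.50, 2.10]   # target density for each wedge step (hypothetical)
DENSITY_PER_STOP = 0.30                     # assumed paper response: roughly 0.3 D per stop


def exposure_correction(measured: list[float]) -> float:
    """Average density error across the wedge, converted to an exposure shift in stops."""
    errors = [m - a for m, a in zip(measured, AIM_DENSITIES)]
    mean_error = sum(errors) / len(errors)
    # Prints reading too dark (positive error) get less exposure; too light, more.
    return -mean_error / DENSITY_PER_STOP


# Example: the process has drifted and everything prints about 0.09 D too dark,
# so the machine pulls its printing exposure back by roughly a third of a stop.
measured = [0.19, 0.89, 1.59, 2.19]
print(f"apply {exposure_correction(measured):+.2f} stops to printing exposure")
```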
I fear I went beyond the layman's level, but the gist is that your "calibrations," or whatever you want to call them, are sort of a tradeoff between your desired level of control and what it is reasonable to pay for. Since photofinishing is not usually a life-and-death situation, the economics of your business usually decide.
Hope I didn't bore everyone too much.