This issue is really kicking my butt.
The light standard (the blue box out of frame on the left) produces 100 foot-lamberts, which I calculate to be 32 candles per square foot.
I simply cannot get any Weston Master II to read 32. Best is 25.
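For anyone checking my arithmetic, the conversion I'm using is luminance in candles per square foot = foot-lamberts / pi, and the shortfall from 25 to 32 works out to about a third of a stop. A quick Python check of both numbers (plain arithmetic, nothing meter-specific):

```python
import math

foot_lamberts = 100.0
target = foot_lamberts / math.pi          # 1 fL = 1/pi cd/ft^2, so about 31.8 -> "32"
measured = 25.0
gap_stops = math.log2(target / measured)  # shortfall in stops, about 0.35
print(f"target {target:.1f} c/ft^2, shortfall {gap_stops:.2f} stops")
```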
This shortfall is so consistent that I think it might be inverse-square falloff. You see, I can't hold the Weston cell directly against the opal glass of the standard: there's a lip on the standard, and the meter cell is also set back, so the cell sits about 3/8 inch from the glass and I just can't get it closer.
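Just to put a number on the inverse-square idea: if the drop really were simple point-source falloff over that 3/8 inch, the effective source distance d would have to satisfy (d / (d + 3/8))^2 = 25/32. A back-of-the-envelope sketch under that assumption (my own simplification, not a claim about how the opal glass actually behaves):

```python
import math

setback_in = 3.0 / 8.0           # the gap I can't close, in inches
ratio = 25.0 / 32.0              # measured reading over expected reading
r = math.sqrt(ratio)             # equals d / (d + setback) under inverse-square
d = setback_in * r / (1.0 - r)   # effective source distance implied by the gap
print(f"implied effective distance: {d:.1f} in")   # roughly 2.9 inches
```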
But I can match the light of the standard with a variable-brightness test box (checked by aiming a Spotmeter at each), and with the box I can try different geometries.
I created a small tube with a translucent face that I can insert into the Weston, flush against the bubble lens in front of the cell.
This raised the reading about a sixth of a stop above 25, but I am looking for 32 when the light matches the standard.
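Putting numbers on that: a sixth of a stop above 25 is only about 28, so even with the tube I'm still roughly a fifth of a stop short of 32. Quick check (plain arithmetic):

```python
import math

base = 25.0
with_tube = base * 2 ** (1.0 / 6.0)      # a sixth of a stop above 25, about 28.1
remaining = math.log2(32.0 / with_tube)  # still about 0.19 stop short of 32
print(f"{with_tube:.1f} with the tube, {remaining:.2f} stops short of 32")
```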
I bought a cheap color temperature meter but it doesn’t recognize tungsten light. It’s so stupid.
Meanwhile, I am trying to hit the maximum reading from the Weston when other meters tell me the light at the panel is 100, as I vary the color temperature with an 80B filter or with higher lamp voltage.
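For scale on how big a swing the 80B is: filter strength is usually reckoned in mireds (10^6 / kelvin), and the 80B is nominally the 3400 K-to-5500 K conversion, about -112 mireds. A tiny sketch of that bookkeeping (standard published values, not something I measured):

```python
def mireds(kelvin):
    """Mired value of a color temperature (10^6 / K)."""
    return 1e6 / kelvin

# Nominal 80B conversion: 3400 K tungsten up to 5500 K daylight.
shift = mireds(5500) - mireds(3400)   # about -112 mireds
print(f"80B shift: {shift:.0f} mireds")
```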
Backing up a step…
For the first pass at matching light levels to readings, I poked holes in pieces of foil to attenuate the light, then fine-tuned with the variable light.
A see-saw pattern emerged as I made foil screens for the different ranges, each needing the fine-tune brightness turned up or down: readings that should have been a third of a stop apart came out nearly identical.
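Rough bookkeeping for the foil screens, if you assume transmission scales with the total hole area (my simplification; scattering and hole geometry will bend this):

```python
import math

def foil_stops(hole_area_sq_in, lit_area_sq_in):
    """Attenuation in stops, assuming transmission is proportional to open area."""
    transmission = hole_area_sq_in / lit_area_sq_in
    return -math.log2(transmission)

# e.g. 0.05 sq in of holes over a 1 sq in lit patch -> about 4.3 stops down
print(f"{foil_stops(0.05, 1.0):.1f} stops")
```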
So these cells are extremely sensitive to tungsten voltage variations.
My latest rig ties the cell to an enlarger rack.
So I am going to get in range and then vary the light by raising the cell. At close range, a 3/8 inch rise drops the reading about a third of a stop, and I know it doesn't change the color temperature.
So my next set will be with the test lamp at full voltage. Ranged by foil, trimmed by raising and lowering.
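Since that close-range falloff works out to roughly 0.9 stops per inch, I can use it to size the trim. A rough sketch (a local linear approximation from that one 3/8 inch observation, not a calibration):

```python
stops_per_inch = (1.0 / 3.0) / (3.0 / 8.0)   # about 0.89 stops per inch near the working distance

def rise_for_trim(stops):
    """Inches to raise the cell to drop the reading by the given fraction of a stop."""
    return stops / stops_per_inch

print(f"raise {rise_for_trim(1.0 / 6.0):.2f} in for a sixth-stop trim")   # about 0.19 in
```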
We’ll see if that gives me a 32.