This doesn't make sense to me (and I'm not just talking about my mathematical inadequacies, which are considerable!). If changes in time affect exposure in a linear fashion, i.e., every doubling of the time increases exposure by one stop, and every halving decreases exposure by one stop, then how does decreasing time by 1/3 not decrease exposure by 1/3 stop?

To find the difference in stops ∆ from an initial time t1 to a second time t2, calculate

∆ = 1.4427 * ln(t2/t1)

The constant 1.4427 is 1/ln(2), so this is just the base-2 logarithm of the time ratio: ∆ = log2(t2/t1). One stop is a factor of 2 in time, two stops a factor of 4, and so on.
Example 1: Moving from t1 = 4 seconds to t2 = 16 seconds, we have
∆ = 1.4427 * ln(16 seconds / 4 seconds) = 2.00, a 2-stop increase.
Example 2: Moving from t1 = 16 seconds to t2 = 2 seconds, we have
∆ = 1.4427 * ln(2 seconds / 16 seconds) = -3.00, a DECREASE of 3 stops (indicated by the negative sign).
For your question from posts #1 and #23, t1 = 8 seconds and t2 = 5.5 seconds, so
∆ = 1.4427 * ln(5.5 seconds / 8 seconds) = -0.54, a decrease of 0.54 stops.
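Here is a minimal Python sketch of the same calculation, just to make the arithmetic above easy to check (the function name stop_difference is my own invention, not anything standard):

import math

def stop_difference(t1, t2):
    # Exposure change in stops going from time t1 to time t2.
    # Positive means more exposure, negative means less.
    # Identical to 1.4427 * ln(t2/t1), since 1/ln(2) ≈ 1.4427.
    return math.log2(t2 / t1)

print(stop_difference(4, 16))   # 2.0  (Example 1: a 2-stop increase)
print(stop_difference(16, 2))   # -3.0 (Example 2: a 3-stop decrease)
print(stop_difference(8, 5.5))  # -0.54 (the 8-second to 5.5-second case)

Note that cutting the time by 1/3 (8 seconds down to 5.33 seconds) works out to log2(2/3) ≈ -0.58 stops, not -1/3 stop, which is exactly the point.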
I was so confused I even pulled out my D700, set it to 1/3-stop exposure steps, and found that decreasing an 8-second exposure by 1/3 stop gave me 6 seconds, which is close enough to the 5.5 seconds I was talking about. Halving the exposure from 8 seconds made it 4 seconds on my D700, so it still seems linear to me.
I'm not being argumentative or stubborn. I genuinely don't understand.
I understand the reasoning behind the f/stop differences Matt was explaining above, because that involves a diaphragm, so I don't expect it to be linear.
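Regarding the D700 test: shutter times behave the same way as f/stops here. The camera's 1/3-stop clicks are multiplicative, not additive; each click scales the time by 2^(1/3) ≈ 1.26, and the body rounds the result to the nominal value it displays. A quick Python sketch of that ladder starting at 8 seconds (the displayed values in the comments are the usual nominal shutter figures):

t = 8.0  # starting exposure time in seconds
for clicks in range(4):
    exact = t * 2 ** (-clicks / 3)  # each 1/3-stop click divides the time by 2^(1/3)
    print(f"{clicks}/3 stop down: {exact:.2f} s")

# Output:
# 0/3 stop down: 8.00 s
# 1/3 stop down: 6.35 s  (the D700 displays the nominal value 6)
# 2/3 stop down: 5.04 s  (displayed as 5)
# 3/3 stop down: 4.00 s  (a full stop: exactly half the time)

So the 6 seconds on the D700 is the rounded display for 6.35 seconds, a true 1/3-stop decrease, and 8 seconds to 4 seconds is a full stop because the time was halved. Equal stop changes multiply the time rather than add to it, which is why the scale isn't linear.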