This doesn't make sense to me (and I'm not just talking about my mathematical inadequacies, which are considerable!). If changes in time affect exposure in a linear fashion, i.e., every doubling of the time increases exposure by one stop, and every halving decreases exposure by one stop, then how does decreasing time by 1/3 not decrease exposure by 1/3 stop?
I was so confused that I even pulled out my D700, set it to 1/3-stop exposure steps, and found that decreasing an 8-second exposure by 1/3 stop gave me 6 seconds, which is close enough to the 5.5 seconds I was talking about. Halving the exposure from 8 seconds gave 4 seconds on my D700, so it still seems linear to me.
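Just to lay out the arithmetic I'm juggling, here's a quick Python sketch of my own (the variable names are mine, and I'm assuming the commonly quoted rule that each 1/3-stop step divides the time by the cube root of 2, i.e. 2^(1/3)). All three ways of figuring "a third" land within about a second of each other, which is probably part of why the camera's numbers still look linear to me:

```python
# My own sanity check of the numbers I'm comparing -- not anything the camera reports.
t_full = 8.0       # starting shutter time, seconds
t_one_stop = 4.0   # one full stop less, i.e. half the time

# What I mean by "linear": 1/3 of a stop = 1/3 of the way from 8 s down to 4 s
linear_third = t_full - (t_full - t_one_stop) / 3       # 6.67 s

# The divide-by-2^(1/3) rule I've seen quoted for 1/3-stop shutter steps
cube_root_third = t_full / 2 ** (1.0 / 3.0)             # ~6.35 s (the camera displays "6")

# Simply knocking 1/3 off the time, which is roughly where the 5.5 s figure came from
minus_a_third = t_full * (2.0 / 3.0)                    # ~5.33 s

print(f"linear 1/3 stop     : {linear_third:.2f} s")
print(f"2^(1/3) 1/3 stop    : {cube_root_third:.2f} s")
print(f"time minus one third: {minus_a_third:.2f} s")
```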
I'm not being argumentative or stubborn. I genuinely don't understand.
I understand the reasoning behind the f/stop differences Matt was explaining above, because that deals with a diaphragm, so I don't expect it to be linear.