hrst: "Contrast decreases with increased exposure too."
I don't want to argue with you on this, but I want you to revise your thinking a bit. You are pissed about the net myth, which is indeed a faulty oversimplification, but you are making the same kind of faulty oversimplification yourself. Please.
Both are problematic assumptions that cannot be used as rules of thumb.
Let me summarize once again:
In cases where shadows are exceptionally low and hold important detail, overexposure increases contrast in them and underexposure decreases contrast in them.
In cases where highlights are exceptionally high and hold important detail, underexposure increases contrast in them and overexposure decreases contrast in them (this is what you are talking about). This can happen with clouds, especially at sunset and in similar conditions.
The first of these is probably more frequent, because the linear part of most films is long and manufacturers want to use most of the speed available, so they rate the films at the highest EI that still gives "good" results (which is the basis of the ISO speed standard).
To sum up: in most "typical" conditions, giving a little extra exposure as a precaution will give as much contrast and saturation as possible. Box speed is fine if you are carefully metering from "the slightly shadowy side of the midtones" or something like that. Or you can overexpose a stop "just in case" and be slack in the metering. This will not push you onto the shoulder.
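To make the two cases concrete, here is a minimal numeric sketch (Python). It models the D-logE curve as a logistic S-curve; the shape and every number in it (Dmin, Dmax, the mid-curve gamma of 0.6, the placements) are invented for illustration, not measured film data. The local slope dD/dlogE is the local contrast:

```python
# Illustrative sketch only: a film's D-logE curve modeled as a logistic
# S-curve. The shape and every parameter here are made-up assumptions,
# not measured data; the point is just how the local slope (= local
# contrast) moves when an exposure shift slides the scene along the curve.
import math

def density(log_e, d_min=0.2, d_max=3.2, gamma=0.6, pivot=0.0):
    """Density as a smooth S-curve of log exposure: toe at the left,
    linear-ish middle, shoulder at the right."""
    k = 4.0 * gamma / (d_max - d_min)  # steepness giving mid-curve slope ~ gamma
    return d_min + (d_max - d_min) / (1.0 + math.exp(-k * (log_e - pivot)))

def local_contrast(log_e, h=0.01):
    """Numerical slope dD/dlogE, i.e. the local contrast."""
    return (density(log_e + h) - density(log_e - h)) / (2.0 * h)

# A deep shadow placed near the toe: one stop is ~0.3 in log10 exposure.
for shift, label in [(0.0, "as metered"), (0.3, "+1 stop"), (-0.3, "-1 stop")]:
    print(f"shadow {label}: slope = {local_contrast(-2.0 + shift):.3f}")

# An extreme highlight near the shoulder behaves the opposite way.
for shift, label in [(0.0, "as metered"), (0.3, "+1 stop"), (-0.3, "-1 stop")]:
    print(f"highlight {label}: slope = {local_contrast(2.0 + shift):.3f}")
```

With these made-up numbers, a shadow placed near the toe gains slope (contrast) from +1 stop and loses it from -1 stop, while a highlight near the shoulder behaves the opposite way, which is exactly the two cases above.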
If you are seeing a change in contrast, it may be due to several reasons:
(1) you really are off the linear part of the film's characteristic curve;
(2) you have some postprocessing problem or feature. For example, scanner CCDs have a small toe and shoulder of their own; furthermore, you are never really scanning "raw": there is always a piece of firmware and software that can do something unexpected. I have measured the characteristic curve of an Epson V700, for example, and it is not linear: it clearly starts shouldering at D=2.4, and thus gives lower contrast when scanning dense negatives! You may be seeing this (a toy sketch follows after this list). It does not happen as easily in RA-4 printing.
And last, you cannot evaluate a negative's contrast by eye: your visual system does not have a linear response (perceived lightness is roughly a cube-root function of luminance).
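To illustrate point (2), here is a sketch of a scanner response that is linear up to a knee and then shoulders off. Only the D=2.4 knee point comes from the measurement I mentioned; the soft-knee formula and the roll-off constant are invented stand-ins, not the V700's actual curve:

```python
# Sketch of the effect described in point (2): a scanner whose response
# is linear up to about D = 2.4 and then shoulders off. The soft-knee
# formula and roll-off constant are invented stand-ins; only the D = 2.4
# knee point comes from the measurement mentioned in the text.
import math

def scanner_response(true_d, knee=2.4, rolloff=0.5):
    """Recorded density vs. true density: linear below the knee,
    asymptotically compressing above it."""
    if true_d <= knee:
        return true_d
    return knee + rolloff * (1.0 - math.exp(-(true_d - knee) / rolloff))

# A 0.3-density step wedge: steps below the knee keep their separation;
# steps above it get squeezed together, i.e. scanned at lower contrast.
for d in [1.8, 2.1, 2.4, 2.7, 3.0, 3.3]:
    print(f"true D = {d:.1f} -> recorded D = {scanner_response(d):.2f}")
```

The 0.3-density steps above the knee come out progressively compressed, i.e. at lower contrast, which is just what a dense (generously exposed) neg would suffer in the scan.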
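And on the eyeball point: perceived lightness is well approximated by the standard CIE 1976 L* formula, essentially a cube root of luminance, so equal density steps do not look like equal steps. (Treating the neg's transmittance directly as the viewing stimulus is of course a simplification.)

```python
# Why judging a neg's contrast by eye is unreliable: the standard CIE
# 1976 L* formula maps luminance to perceived lightness roughly as a
# cube root, so equal density steps do not look like equal steps.
def cie_lstar(y):
    """CIE 1976 lightness from relative luminance y in [0, 1]."""
    return 116.0 * y ** (1.0 / 3.0) - 16.0 if y > 0.008856 else 903.3 * y

# Equal 0.3-density steps each halve the transmitted light...
for d in [0.3, 0.6, 0.9, 1.2, 1.5]:
    y = 10.0 ** (-d)  # transmittance of a patch with density d
    print(f"D = {d:.1f}: transmittance = {y:.3f}, L* = {cie_lstar(y):5.1f}")
# ...yet the perceived-lightness steps shrink as the neg gets denser,
# so the eye systematically misreads contrast in dense areas.
```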
So, in those cases where contrast really changes, the perceived saturation changes accordingly. I think everything has been said regarding contrast, the cases where it really changes, and the fact that a big part of perceived saturation is directly related to contrast. But saturation can also change without a change in the overall contrast. For example, Portra 160 VC and Ektar 100 have very similar contrast, but Ektar has more saturated colors.
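Here is a toy sketch of that contrast-saturation link: steepening each channel's tone curve around middle gray spreads the channels apart, which raises the measured saturation. The starting color and the 1.5x contrast factor are arbitrary choices for illustration:

```python
# Toy demonstration that per-channel contrast directly changes measured
# saturation. The starting color and the 1.5x factor are arbitrary.
def boost_contrast(rgb, k=1.5, mid=0.5):
    """Steepen each channel's curve around middle gray by factor k."""
    return tuple(min(1.0, max(0.0, mid + k * (c - mid))) for c in rgb)

def saturation(rgb):
    """HSV-style saturation: (max - min) / max."""
    return (max(rgb) - min(rgb)) / max(rgb)

muted_red = (0.6, 0.4, 0.4)
boosted = boost_contrast(muted_red)
print(f"saturation before: {saturation(muted_red):.2f}")
print(f"saturation after 1.5x contrast: {saturation(boosted):.2f}")
```

That is only the per-channel-curve part of the story; as said, a film like Ektar gets its extra saturation by other means at a similar contrast.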
markbarendt:
"Each layer in color film only makes one color and it is always fully saturated because there is no other choice. The density is the only variable I see."
That's true, but the densities resulting from a given scene do not depend only on exposure. They also depend on the spectral sensitivities of the sensitizing dyes in the film. AFAIK, this is the variable that controls saturation, along with the absorption spectra of the image-forming dyes. I would expect both to be constant regardless of exposure, but I might be wrong. At least we can control this by selecting a different film. I would like to hear if there is more to this.
For example, it is known that printing with sharp-cut RGB filters instead of a CMY-filtered white light source increases saturation because there is less crosstalk between the color channels, and this is distinct from contrast. So, there are more variables in play.
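As a toy linear model of that crosstalk point: with a broadband CMY-filtered source, each layer effectively sees a little of its neighbors' light, which pulls the channels toward each other and desaturates, while sharp-cut RGB filters keep the channels separate. The 10% leak figure below is invented purely for illustration:

```python
# Toy linear model of channel crosstalk. With a broadband (CMY-filtered)
# source, each layer also sees a little of its neighbors' light; here
# that is modeled as each channel receiving a fraction `leak` of the
# other two. The 10% figure is invented purely for illustration.
def apply_crosstalk(rgb, leak):
    r, g, b = rgb
    return (r + leak * (g + b), g + leak * (r + b), b + leak * (r + g))

def saturation(rgb):
    """HSV-style saturation: (max - min) / max."""
    return (max(rgb) - min(rgb)) / max(rgb)

color = (0.8, 0.3, 0.2)
for leak, label in [(0.0, "sharp-cut RGB, no crosstalk"),
                    (0.10, "broadband CMY, 10% crosstalk")]:
    print(f"{label}: saturation = {saturation(apply_crosstalk(color, leak)):.2f}")
```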