The graphs show that the film captures more information (higher ISO or sensitivity, S) with increasing development time, at the expense of an increase in average contrast (gamma, G).
For the top graph (Sensitivity), use the vertical scale on the left; for the Gamma graph, use the scale on the right.
Base fog (D min) appears to be independent of development time, or very nearly so.
Well, I was technically not correct when I said "gradation (overall contrast)". I just wanted to keep things simple...
Gradation is the steepness of the linear part of the characteristic curve (D vs. log E).
Contrast, which is what you measure with your densitometer, is the difference between two densities, Dhigh - Dlow (density is already a logarithmic quantity).
So, of course, you get different numbers. But the greater the gradation you yield in your processing, the higher the contrast will be when you take measurements between the same two areas within your negative.
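To make the relationship concrete, here is a small sketch (with made-up numbers, using an idealized straight-line curve with no toe or shoulder) showing how the same two subject areas yield a bigger density difference as gamma goes up with longer development:

```python
def density(log_e, gamma, d_min=0.2, log_e_toe=-2.0):
    """Idealized straight-line portion of a characteristic curve:
    density rises linearly with log exposure, at slope = gamma.
    d_min and log_e_toe are arbitrary illustrative values."""
    return d_min + gamma * max(0.0, log_e - log_e_toe)

# The same two areas of the subject (a fixed log-exposure difference)...
log_e_shadow, log_e_highlight = -1.0, 0.5

for gamma in (0.5, 0.7, 0.9):   # longer development -> higher gamma
    d_low = density(log_e_shadow, gamma)
    d_high = density(log_e_highlight, gamma)
    # Measured contrast = density difference between the two areas
    print(f"gamma = {gamma}: contrast = {d_high - d_low:.2f}")
```

With a 1.5 log E spread between the two areas, contrast is simply gamma times 1.5, so it climbs from 0.75 to 1.35 as gamma goes from 0.5 to 0.9.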
The minimal useful gradient (as in the "0.3 x G" speed determination) is likely the same for all those different development times. They have not zeroed the curves to the film base, either. So if you are using the 0.1-over-fog ISO speed determination, you have to subtract the film base density before making any claim about speed.
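As a rough sketch of that 0.1-over-fog criterion (with hypothetical curve readings, not real data): subtract base-plus-fog, then find the log exposure where the net density reaches 0.1, interpolating between measured patches.

```python
# Hypothetical readings: (log exposure, measured density incl. base + fog)
curve = [(-3.0, 0.25), (-2.5, 0.28), (-2.0, 0.40), (-1.5, 0.68), (-1.0, 1.02)]

base_plus_fog = curve[0][1]      # assume the first patch is unexposed
target = base_plus_fog + 0.1     # the 0.1-over-fog speed criterion

# Linear interpolation to find log E where net density hits 0.1
log_e_speed = None
for (e0, d0), (e1, d1) in zip(curve, curve[1:]):
    if d0 < target <= d1:
        log_e_speed = e0 + (e1 - e0) * (target - d0) / (d1 - d0)
        break

print(f"speed point at log E = {log_e_speed:.2f}")
```

The point is that the speed point moves with base-plus-fog: if you read densities straight off the curve without subtracting the base, you will place the 0.1 point too low and overstate the speed.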
It is best, and easiest, to think of film speed and development as independent.