holmburgers,
I'm pleased to hear that you went into the book; I haven't had a chance to look at it, but maybe this will make sense:
The "sensitometric [or H&D] curve" on the Agfa literature shows pretty extreme contrast; the straight line portion goes from d=0.5 to d=3.5+ in only two stops of exposure (log E=1.2 ---> log E=1.8, and one stop is 0.3 log exposure units). However, this is for x-ray use and you would presumably be using a much less active developer, and see far lower contrast.
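For what it's worth, you can put a number on "pretty extreme contrast" just from those figures; this is a quick back-of-the-envelope check (the d and log E values are the ones I read off the Agfa sheet, so treat them as approximate):

```python
# Slope (gamma) of the straight-line portion from the quoted Agfa figures:
# d = 0.5 at log E = 1.2, rising to d = 3.5 at log E = 1.8.
d_lo, d_hi = 0.5, 3.5
logE_lo, logE_hi = 1.2, 1.8

gamma = (d_hi - d_lo) / (logE_hi - logE_lo)  # density units per log-E unit
stops = (logE_hi - logE_lo) / 0.3            # one stop = 0.3 log-E units

print(f"gamma ~ {gamma:.1f} over {stops:.0f} stops")  # gamma ~ 5.0 over 2 stops
```

A gamma around 5 is lith-film territory for pictorial work, which is why a softer developer matters so much here.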
The "gamma curve" is the slope of the density curve, plotted against the film density; notice that it goes to zero near d=0 and again near d=4.5, which is where the H&D curve goes flat, and peaks where the straight-line portion of the sensitometric curve is found. All of the funny business is up above d=3.5, which is a pretty dense negative.
I seem to recall that the Kodak publication mentioned a maximum density of something like 2.5 for color separations, so your problem would be to get the density range of your original (probably something like d=0.1 to d=1.5 or so) to translate to d=0.1 ---> d=2.5 in the separations. The problem with the x-ray film would appear to be that really long "toe"; you might have to pre-expose in order to get up onto the linear portion, and then you would have a lot of base density to print through. But the developer will have a lot to do with this. The x-ray film curve is actually very gentle compared to a lithographic film (in lith developer), which goes from nothing to dead black with a negligible exposure change, and folks successfully tame that stuff for analog photography.
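That density translation also pins down the gamma you'd be aiming for in the separation negatives. A rough sketch, assuming my guessed original range (d=0.1 to 1.5) and the d=2.5 ceiling I recall from the Kodak pub:

```python
# Target gamma for the separations: stretch the original's density range
# (assumed 0.1 -> 1.5) onto the separation's range (0.1 -> 2.5).
orig_range = 1.5 - 0.1  # density range of the original (my guess)
sep_range = 2.5 - 0.1   # target range per the Kodak publication

target_gamma = sep_range / orig_range
print(f"needed gamma ~ {target_gamma:.2f}")  # ~1.71
```

So you'd want a developer/time combination that tames the film from roughly 5 down to under 2, which is a big swing but not out of the question with dilute developers and short times.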
So I would suggest getting the smallest sample possible and doing some experiments; after all, "one clean experiment is worth a thousand dirty calculations," and the Kodak pub should give some idea of the targets that you are shooting for.
If you'll keep studying, we'll put away that big wooden spoon......