I have dialed in my Xtol-R times so that HD-LD comes out at 0.8, and a sample FP4+ negative [1] developed in the same tank is much denser than my usual preference (and much denser than an HP5+ shot of the same scene using the datasheet time+temp). So in both developers my control strips appear to be somehow "less sensitive" than regular FP4+.
[1] Medium-contrast scene metered with an incident meter.
Or it could just be that 0.80 is more contrast than you prefer or are used to. If you were happy at 0.72, I'd just go with that. Let's not forget that LD is ~0.1 above FB+F and HD is most likely 1.3 exposure units up, which would put 0.8 at ISO contrast (0.615). That isn't necessarily wrong, but it's generally more than what most people are accustomed to seeing. 0.72 would put it at ~0.55 contrast, which is a lot more in line with what many consider "normal".
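Just to make the arithmetic above explicit, here's a minimal sketch (the function name is mine, not any standard formula's): the average gradient is simply the HD-LD density difference divided by the log-exposure interval between the two patches, assumed here to be 1.3 as stated above.

```python
# Contrast arithmetic from the discussion above. Assumes HD and LD are
# read 1.3 log-exposure units apart; the function name is illustrative.
def average_gradient(hd_minus_ld: float, log_e_range: float = 1.3) -> float:
    """Density difference divided by the log-exposure interval."""
    return hd_minus_ld / log_e_range

print(round(average_gradient(0.80), 3))  # -> 0.615, ISO-spec contrast
print(round(average_gradient(0.72), 3))  # -> 0.554, closer to "normal"
```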
Also, when you run your control strips, make sure you use the same minimum amount of developer per roll, or better yet, run the strip in an otherwise empty tank. I ran replenished XTOL before switching over to DD, and what I thought was my working-solution bottle being really variable turned out to be me being inconsistent in how I checked the activity level via the control strip.

Now (with DD, because that's what I'm currently using), I always run at least 300ml of working solution per roll when running a control strip. Even better, at the very beginning of a batch of black and white, I'll run a control strip by itself and adjust either the working-solution bottle with more replenisher (if activity is low) or the development temperature (if activity is high). Then I adjust how much replenisher goes in after the run based on what the activity level was: if it was low, I already brought the activity up, so I replenish as normal; if it was high (which almost never happens, it's usually either right on or low), I trim the amount of replenisher slightly, so the next run will be about right, or at least trending in the right direction.
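The post-run decision above can be sketched as a tiny bit of logic. All names, volumes, and thresholds here are hypothetical placeholders I've made up for illustration; they are not values from my actual workflow.

```python
# Rough sketch of the replenishment decision described above.
# NORMAL_REPLENISH_ML and TRIM_ML are assumed placeholder values.
NORMAL_REPLENISH_ML = 70   # hypothetical per-roll replenishment rate
TRIM_ML = 10               # hypothetical reduction when activity runs high

def replenisher_for_next_run(activity: str) -> int:
    """Replenisher volume after a run, given the control-strip reading.

    'low'  -> the working solution was already boosted before the run,
              so replenish at the normal rate.
    'ok'   -> replenish at the normal rate.
    'high' -> trim slightly so the next run trends back toward normal.
    """
    if activity == "high":
        return NORMAL_REPLENISH_ML - TRIM_ML
    return NORMAL_REPLENISH_ML

print(replenisher_for_next_run("ok"))    # -> 70
print(replenisher_for_next_run("high"))  # -> 60
```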