Saw a video of a guy using one for color development and just got one from Amazon. Not expensive. When I tested it, it read one degree cooler than my digital thermometer. Don't know which is closer. Wondering whether the infrared measures the surface temperature and the digital the (warmer) middle depth.
The advantage is that there is nothing to wash and no chance of mistakenly transferring (say) developer into the blix.
For faucet temp, I have an inline analog dial thermometer that agrees pretty well with my lab mercury thermometer.
I only do monochrome film and paper development. I designed and built my own thermocouple-based compensating timer that both displays the temperature and - if desired - corrects the running time for film and paper accordingly.
In principle, that timer could be modified to provide better than 1 degree resolution, because the thermocouple reports in 1/10 degree increments as I recall. For monochrome I didn't need it, so - to keep the display smaller and the circuitry simpler - I round to whole degrees. The basic compensation idea is sketched below.
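For anyone curious, the compensation is conceptually simple: the timer "ticks" at a rate scaled by how fast development proceeds at the measured temperature relative to a 20 C reference. Here is a minimal sketch of that idea in C - the 9%-per-degree rule of thumb, the function names, and the thermocouple stub are illustrative assumptions, not what's actually in my box:

```c
#include <stdio.h>
#include <math.h>

/*
 * Illustrative sketch of a compensating timer loop.
 * Assumptions (not the actual firmware): the thermocouple reading
 * arrives in tenths of a degree C, and the development rate follows
 * a simple exponential around a 20 C reference - roughly the
 * "a fixed percentage per degree" rule of thumb.
 */

#define REF_TEMP_C   20.0   /* reference temperature for published times */
#define PCT_PER_DEG  0.09   /* ~9% rate change per degree C (illustrative) */

/* Rate multiplier: >1 above the reference temperature, <1 below it. */
static double dev_rate(double temp_c)
{
    return pow(1.0 + PCT_PER_DEG, temp_c - REF_TEMP_C);
}

/* Stand-in for the thermocouple interface: returns tenths of a degree C. */
static int read_thermocouple_tenths(void)
{
    return 195;  /* pretend the bath is sitting at 19.5 C */
}

int main(void)
{
    double target_s = 8.0 * 60.0;   /* published time at 20 C: 8:00 */
    double developed_s = 0.0;       /* compensated seconds accumulated */

    /* Each pass represents one real second of wall-clock time. */
    while (developed_s < target_s) {
        double temp_c = read_thermocouple_tenths() / 10.0;
        developed_s += dev_rate(temp_c);   /* advances faster when warm */
        /* (real firmware would wait 1 s and update the display here) */
    }

    printf("Development complete: %.0f compensated seconds\n", developed_s);
    return 0;
}
```

The point of the loop is that cooler developer makes the compensated seconds accumulate more slowly, so the wall-clock run stretches out, and warmer developer does the opposite.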