Here's an update. I think I'm making progress and I'm enjoying the experiments.
This result is better than what I was getting before and is approaching a quality I can use. I'm still experimenting with my approach, which relies on a combination of the white/grey cards, reading both Lab and RGB values, and keeping a cell phone pic of the scene handy to check spot colors for obvious problems my eyes might not catch. In this case I checked the four spots listed just to see how close I was to what the phone saw (which I know is a processed image, but still). Choosing what to sample is a challenge. All four are fairly close, but I'm sure you'll spot the differences. Without the combo of the white card and the cell pic I'd be lost for sure.
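In case it helps anyone, here's the spot-check idea written out as a quick Python sketch. It's just an illustration, not what I actually run in my editor: the file names, the sample coordinates, the 15-pixel patch size, and the plain CIE76 delta-E are placeholder choices of mine, and it assumes the two pics are framed the same way so the same (x, y) lands on the same object.

```python
# Rough sketch of the "disaster check": sample the same spot in my edit and in
# the phone pic, convert both to Lab, and report the difference.
import numpy as np
from PIL import Image
from skimage.color import rgb2lab

def sample_lab(path, xy, patch=15):
    """Average Lab value of a small patch centered on (x, y)."""
    img = np.asarray(Image.open(path).convert("RGB")) / 255.0
    x, y = xy
    h = patch // 2
    region = img[y - h:y + h + 1, x - h:x + h + 1]
    return rgb2lab(region).reshape(-1, 3).mean(axis=0)

def delta_e(lab1, lab2):
    """Plain CIE76 distance -- crude, but enough to flag obvious problems."""
    return float(np.linalg.norm(lab1 - lab2))

# Made-up spots -- in practice I pick these by eye.
spots = {"skin": (812, 440), "grass": (220, 990)}
for name, xy in spots.items():
    mine = sample_lab("my_edit.jpg", xy)
    phone = sample_lab("phone_pic.jpg", xy)
    print(f"{name}: dE = {delta_e(mine, phone):.1f} "
          f"(mine {mine.round(1)}, phone {phone.round(1)})")
```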
I'm learning that the phone over-saturates skin tones compared to the Lab guidelines that @Mr Bill discussed earlier in this thread, so I need to de-saturate the phone pics a bit before relying on them as a disaster-check tool. Lab color is really helpful if, like me, you don't see color well.
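Part of why Lab works for me is that the de-saturation step can be expressed as numbers instead of eyeballing: chroma is just sqrt(a² + b²), so pulling a and b toward zero tames saturation without touching lightness. The sketch below is only that idea written out, not how I actually do it; the 0.85 factor and file names are made up.

```python
# Knock the phone pic's saturation down before using it as a reference:
# convert to Lab, scale the a* and b* channels toward zero (chroma is
# sqrt(a^2 + b^2), so scaling a and b scales chroma), then convert back.
import numpy as np
from PIL import Image
from skimage.color import rgb2lab, lab2rgb

def desaturate(path, factor=0.85):
    rgb = np.asarray(Image.open(path).convert("RGB")) / 255.0
    lab = rgb2lab(rgb)
    lab[..., 1:] *= factor            # shrink a* and b*, leave L* alone
    out = np.clip(lab2rgb(lab), 0.0, 1.0)
    return Image.fromarray((out * 255).astype(np.uint8))

desaturate("phone_pic.jpg").save("phone_pic_desat.jpg")
```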
I did an experiment this week where I took several pics of my bag in differing lighting conditions and tried to make it read correctly in full sun, full shade, etc. That was a good exercise. Thanks to @Mr Bill, @bernard_L, and everyone else for the assistance. I appreciate it!
Overall, image saturation is a challenge that will need more experimentation. My images land somewhere on a spectrum between dull and candy-colored, and it's hard to hit the right balance. I'll keep at it, and I'd appreciate any feedback you might have.
Here's what the phone saw: