> Started with digital just over a year ago
So, logically, your experience with how color materializes in the digital environment is still superficial at this point. That leads you to believe things like the sensor type being responsible for the selective saturation of certain hues, or for color rendition in general, which is basically bogus. What you're looking at are decisions in the hardware and (mostly) software/firmware chain that have no relation whatsoever to the semiconductor technology used for the sensor sites as such.
There's no contest that you'll see differences between images from different cameras. But attributing those differences to CCD vs. CMOS is fundamentally impossible without controlling for all the other factors. That would require building experimental setups with bare sensors, adding your own filter arrays on top, and so on. None of the internet reviewers who jump to conclusions w.r.t. sensor technology are in a position to do this, or even to understand, in general terms, what kind of approach would be required to substantiate the conclusions they draw.
Between CMOS and CCD, there are some differences w.r.t. noise and image artefacts. For the most part, modern sensors compensate for these weaknesses (which exist in both technology domains) to yield comparable overall performance.
The color rendition of the RAW files you're looking at is a combination of a multitude of factors, ranging from hardware choices (of which light capture technology is just a small part) to, especially, algorithms in camera firmware that convert, adjust and map analog signals into a digital array of data, followed by a lot of interpretation on the macOS side in how the data in the RAW file are represented as a viewable image. Virtually NONE of what you're looking at has anything to do with sensor technology as such.
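To make that concrete, here's a toy sketch of the idea that color rendition lives in the processing profile, not the sensor. All the numbers below (white-balance gains, 3x3 color matrices) are made up for illustration; real cameras embed comparable per-model values in firmware and file metadata, regardless of whether the chip is CCD or CMOS.

```python
import numpy as np

# One demosaiced pixel: linear raw RGB values straight off a sensor.
raw_pixel = np.array([0.40, 0.35, 0.30])

# Two hypothetical processing profiles, as two camera makers (or two
# firmware versions) might choose them. Values are invented.
profile_a = {
    "wb": np.array([1.9, 1.0, 1.5]),            # white-balance gains
    "ccm": np.array([[ 1.60, -0.40, -0.20],     # color correction matrix
                     [-0.30,  1.50, -0.20],
                     [ 0.00, -0.50,  1.50]]),
}
profile_b = {
    "wb": np.array([2.1, 1.0, 1.4]),
    "ccm": np.array([[ 1.80, -0.60, -0.20],
                     [-0.40,  1.70, -0.30],
                     [-0.10, -0.40,  1.50]]),
}

def render(pixel, profile):
    """Apply white balance, then the color matrix, then clip to [0, 1]."""
    balanced = pixel * profile["wb"]
    corrected = profile["ccm"] @ balanced
    return np.clip(corrected, 0.0, 1.0)

# Identical sensor data in, visibly different colors out:
print(render(raw_pixel, profile_a))
print(render(raw_pixel, profile_b))
```

Swap the profile and the "look" changes while the sensor data stays byte-for-byte identical, which is exactly why reviewers comparing finished images can't isolate the contribution of the semiconductor technology.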
It's a total deception, but like I said, we're all human. We like a good story, regardless of whether it's true. Heck, much of the time, we prefer the story that sounds good over the story that's representative of real events.