120Hz refresh rate or higher. People staring at 60Hz screens don't know what they're missing.
This used to be a thing back in CRT times, when low refresh rates meant visible flicker. With LCDs, it really doesn't matter unless you're a gamer and have issues with tearing at lower refresh rates.
Same goes for coil whine, which could still occur with fluorescent (CCFL) backlights on LCDs, but really, what screens use those anymore apart from the very lowest segment of budget screens?
I don't think you can buy a bad monitor anymore in the mid-to-upper segment. They're all nice LED-backlit IPS or OLED screens with a wide gamut. The main criterion is resolution, which is a function of screen size, the distance you sit from the screen and the quality of your vision. For a screen of 30" or so, it's kind of obvious you'd go for 4K or higher.
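To put rough numbers on that size/distance/vision trade-off, here's a back-of-the-envelope sketch. The screen sizes and viewing distance are illustrative assumptions, and the "about 60 pixels per degree matches 20/20 vision" figure is a common rule of thumb, not a spec:

```python
import math

def pixels_per_degree(diagonal_in, h_pixels, v_pixels, distance_in):
    """How many pixels fall within one degree of visual angle
    at the given viewing distance (small-angle approximation)."""
    aspect = h_pixels / v_pixels
    # Horizontal screen width from the diagonal and aspect ratio.
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    ppi = h_pixels / width_in
    # One degree of visual angle spans 2 * d * tan(0.5 deg) inches.
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

# Rule of thumb: ~60 pixels per degree roughly matches 20/20 acuity.
print(round(pixels_per_degree(32, 3840, 2160, 24), 1))  # 32" 4K at 24" away
print(round(pixels_per_degree(32, 1920, 1080, 24), 1))  # 32" 1080p at 24" away
```

At a typical desk distance, the 4K panel sits near the acuity limit while the 1080p one falls far below it, which is why the "30-inch screen wants 4K or more" call is fairly obvious.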
I care about color accuracy for printed images.
Then I'd wager that profiling your printer is going to do more for you than profiling the monitor. Although evidently, it's not an either/or situation; you need both in order to get good consistency and predictability. Fortunately, monitor gamut is not much of a criterion here either: given the naturally limited gamut of prints on paper (even from fancy CMYKOGV printers), a modern monitor's gamut will virtually always exceed what a print is capable of by a large margin. The only conceivable exception is prints with extremely saturated or unusual spot colors (think fluorescent inks and the like), and those are exceedingly rare in photography, of course.
compatible with Studio M2 & has excellent color-management properties
Those are not really issues; as far as I can tell, the Mac Studio series has both Thunderbolt and HDMI, which means that virtually every display on the present market will connect to it.

Color management is a function of the operating system and graphics driver more so than of the monitor: basically, the OS and application software adjust the color data they send to the monitor to compensate for the monitor's idiosyncrasies. When profiling a display, what you're essentially doing is measuring those idiosyncrasies so that a translation table (a profile) can be constructed, which the OS/application then uses to display correct color. The only requirement for the monitor is a reasonably large gamut, but as said above, this isn't much of an issue anymore in today's display landscape; it was a limitation up to maybe 10 or 15 years ago.
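As a toy illustration of that "translation table" idea (grossly simplified compared to a real ICC profile, and with made-up panel numbers): suppose profiling reveals the panel's tone response is gamma 2.4 where gamma 2.2 was intended. The profiler builds a lookup table that pre-distorts the values so the panel ends up showing what the application meant:

```python
def panel_response(v):
    """Pretend measurement result: this panel is a bit too contrasty,
    showing gamma 2.4 instead of the intended 2.2 (values normalized 0..1)."""
    return v ** 2.4

def build_correction(target_gamma=2.2, panel_gamma=2.4):
    """The 'translation table' the profiler constructs: a 256-entry LUT
    mapping requested 8-bit values to what must be sent to the panel."""
    return [(i / 255) ** (target_gamma / panel_gamma) for i in range(256)]

lut = build_correction()
requested = 128 / 255                # the color the application wants shown
shown = panel_response(lut[128])     # what the panel actually puts out
intended = requested ** 2.2          # what a correct gamma-2.2 display shows
print(round(shown, 3), round(intended, 3))  # both ≈ 0.219
```

A real profile corrects hue and saturation per channel as well, not just the tone curve, but the principle is the same: measure, build the translation, let the OS apply it.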
The TL;DR of all this is that you're pretty much free to choose whatever monitor you fancy within your price range and screen diagonal preferences. The odds of bringing something home that won't meet your requirements are virtually zero.
The real quality differences lie in areas not touched on so far in this thread, such as evenness of illumination across the entire screen surface. Again, not really an issue with higher-end screens, and even some minor unevenness tends to be unnoticeable in real-world use, but if you're going to look at monitors in a shop, this is one of the things to watch out for. Color and intensity shifts at oblique viewing angles matter too, but again, modern IPS and OLED screens offer good performance in this regard. Note that performance is generally worse at oblique vertical angles than at horizontal ones, which can be relevant if you have a big monitor and sit close to it.
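To see why that vertical-angle point can matter up close, a quick sketch of the off-axis angle from a centered eye to the top edge of a 16:9 screen. The 24-inch viewing distance and the screen sizes are assumed examples:

```python
import math

def edge_angle_deg(diagonal_in, distance_in=24, aspect_w=16, aspect_h=9):
    """Off-axis viewing angle (degrees) from an eye centered on the screen
    to the top edge, for a flat 16:9 panel at the given distance."""
    height_in = diagonal_in * aspect_h / math.hypot(aspect_w, aspect_h)
    return math.degrees(math.atan((height_in / 2) / distance_in))

print(round(edge_angle_deg(27), 1))  # 27" panel at 24": modest off-axis angle
print(round(edge_angle_deg(40), 1))  # 40" panel at 24": noticeably steeper
```

Going from 27" to 40" at the same desk pushes the top and bottom edges from roughly 15 degrees to over 20 degrees off-axis, which is where weaker vertical viewing-angle performance starts to show.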
One of the main decisions you'll have to make is between a matte and a glossy finish (hey, it's photography after all), which depends primarily on personal preference and the specific viewing/lighting conditions in your work area.
Really, the best advice I can give you is to go to a showroom that has a couple of displays set up that meet your requirements and have a good look at them. A real-world viewing impression supersedes any personal preference that strangers will want to burden, er, help you with. Personally, I wouldn't even think of buying a monitor before having seen how it performs in real life.