I am not sure I see the same thing. With multisampling their test images show almost no noise whereas the zigzag effect is even more pronounced with multisample enabled.
No, it's not more pronounced with multisampling; it's less pronounced, at least when the grain dissolver is disabled.
Also, as stated before, I've scanned the same image with the same scanner at different stages of its life, and the scans from 2015 do not show the same patterns. We are going in circles here.
I'm not disputing your observations. I'm just saying that I don't believe ALL scanners have degraded.
For example, @scarbantia thought that his scanner started producing a LOT of noise overnight, but it turned out his previous scans were comparable to the new ones. @albeiro thought his scanner had super clean shadows; it turns out it's comparable to other 5400s. I can see a pattern here...
I wouldn't exclude the possibility that current problems are related to changes in operating systems and driver software.
I have tested this scanner on equipment predating the scanner all the way up to modern computers, with the OEM software, SilverFast and VueScan. VueScan can be a bit hit-and-miss with some scanners, but Minolta 5400 support in VueScan is done properly.
I've only been able to quickly skim the article, but the review claims the issue happens with a combination of multisampling, grain dissolver and ICE?
I used none of the three, and the band in my last example still shows noise. Is it the same issue, or just ordinary thermal-noise patterns?
We know the grain dissolver introduces a large amount of noise. That has been verified on every Minolta 5400 so far (maybe someday a 5400 without this problem will surface, but let's assume it's generally true for now). I believe Fernando (author of the test I linked to) did most of his tests with the OEM Minolta Scan software, where enabling ICE implicitly enables GD (you can't have ICE without GD in the OEM software; they are not linked in VueScan and SilverFast, though).
What I believe happens is that GD introduces a lot of noise. Most of it comes from the sensor itself (thermal noise), since GD requires a much longer exposure time, but as
@koraks says, there is obviously also a second noise source. If you then enable multisampling, it will clean up the colour noise in the shadows, and you will notice the jaggies more easily since there is less colour noise to mask the high-contrast transitions.
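To see why multisampling unmasks fixed patterns, here's a rough Python sketch with made-up numbers (not real scanner data): averaging N passes shrinks random noise by roughly √N, while a fixed pattern like the jaggies survives untouched.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model: each scan line = fixed pattern + random sensor noise.
fixed_pattern = np.tile([0.0, 3.0], 512)   # a small "jaggy" alternation
noise_sigma = 10.0                         # random noise initially masks it

def simulated_pass():
    return fixed_pattern + rng.normal(0.0, noise_sigma, fixed_pattern.size)

single = simulated_pass()
multi16 = np.mean([simulated_pass() for _ in range(16)], axis=0)  # 16x multisampling

# Random noise shrinks by ~sqrt(16) = 4x; the fixed pattern does not,
# so after multisampling it dominates what's left.
print(np.std(single - fixed_pattern))    # ~10
print(np.std(multi16 - fixed_pattern))   # ~2.5
```

With 16x multisampling the random floor drops to about a quarter, while the 3-unit alternation is unchanged and now sticks out.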
That's why my proposal was to scan a black frame and vary the exposure, GD and multisampling parameters, to find the most prominent offender and to compare our scanners. I forgot to upload my samples yesterday, but they are a bit worse than yours (so your scanner is at least better than mine). It could also be down to other factors (sensor temperature at calibration time, scan time, other interference...).
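For anyone who wants to compare black frames quantitatively, a helper along these lines could work (the function name and the synthetic frame are my own invention; the synthetic array stands in for a real scan you'd load with e.g. tifffile): it reports per-channel noise plus a crude vertical-banding figure taken from the spread of column means.

```python
import numpy as np

def black_frame_stats(frame):
    """Per-channel noise stats for a black-frame scan (H x W x 3 array)."""
    stats = {}
    for i, ch in enumerate("RGB"):
        data = frame[..., i].astype(np.float64)
        col_means = data.mean(axis=0)   # a column pattern shows up as vertical banding
        stats[ch] = {
            "mean": data.mean(),
            "std": data.std(),
            "banding": col_means.std(),  # spread of column means vs the random floor
        }
    return stats

# Synthetic stand-in for a real black frame (replace with e.g. tifffile.imread):
rng = np.random.default_rng(1)
frame = rng.normal(50, 8, (200, 300, 3))
frame[:, ::7, 0] += 6.0                 # inject column banding into the red channel
s = black_frame_stats(frame)
print(s["R"]["banding"], s["G"]["banding"])
```

On a clean channel the "banding" figure stays near the random floor divided by √H; a channel with a column pattern stands well above it, which makes scanner-to-scanner comparisons easy to automate.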
Yeah, that would be an interesting approach for sure. I wouldn't fixate on the CRI either; 90-95 is easy to get and will likely do fine. Keep in mind the sensor only sees the spectral peaks of R, G and B anyway. As long as those are present in sufficient amounts, you'll be OK.
Do off-the-shelf LEDs typically emit enough in the IR to keep the scanner's ICE functionality working?
BTW, the Minolta 5400 II produced worse IQ with its LED light source, presumably because they needed to increase sensor gain to raise scanning speed (the LED's brightness was no better than the CCFL's). I believe LEDs have come a long way since then, so the luminous efficacy (lm/W) should be better now, right?
The 5400 II also had huge problems with LED light-source unevenness. If today's LEDs are bright enough, an additional diffuser could be employed. Space permitting, of course...
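A diffuser isn't the only option; unevenness can also be calibrated out in software with classic flat-field correction. Here's a sketch in Python with synthetic numbers (not real 5400 II data): a lamp-only scan captures the illumination profile, and dividing it out flattens the image.

```python
import numpy as np

def flat_field_correct(raw, flat, dark):
    """Classic flat-field correction: divide out light-source unevenness.
    raw  = the actual scan, flat = lamp-only (no film) scan,
    dark = lamp-off scan; all same-shape float arrays."""
    gain = flat - dark
    gain = gain / gain.mean()            # normalise so overall exposure is kept
    return (raw - dark) / np.clip(gain, 1e-6, None)

# Synthetic example: a 20% brightness roll-off across the frame.
x = np.linspace(1.0, 0.8, 256)
illum = np.tile(x, (64, 1))              # uneven light source
scene = np.full((64, 256), 100.0)        # a flat grey target
raw = scene * illum
flat = 1000.0 * illum                    # lamp-only scan sees the same roll-off
dark = np.zeros_like(raw)
corrected = flat_field_correct(raw, flat, dark)
print(corrected.std())                   # near zero: roll-off removed
```

Scanner firmware already does per-pixel calibration along the CCD line; the point of the sketch is just that a stable unevenness pattern is correctable, whereas brightness that varies scan-to-scan is not.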