I just have not mentioned it above because the topic was medium-speed films, not high-speed films. So far I have tested almost all films on the market, and the few remaining ones will be tested in the near future.
So here we go, ISO 400/27° CN films:
Kodak Portra 400 (new, current version): 80 – 100 Lp/mm
Kodak Portra 400 NC-3 (discontinued): 100 – 110 Lp/mm
Kodak Farbwelt 400: 95 – 110 Lp/mm (discontinued; former version of Gold for the German-speaking markets)
Kodak Ultra Max 400: 100 – 110 Lp/mm
Fuji Pro 400H: 90 – 105 Lp/mm
Fuji Superia X-Tra 400: 115 (120) – 130 Lp/mm
As for the test result that the current Portra 400 has significantly less resolution than its forerunners: that is part of Kodak's 'enhanced for scanning' policy. These films (the same applies to Ektar) have finer grain, because apparent grain is amplified by scanner noise in most scanners, so finer grain generally delivers more pleasing scan results. But Kodak unfortunately also sacrificed maximum resolution for that. They think maximum resolution isn't important because scanners cannot exploit the full film resolution anyway (optical printing can), as scanners - especially the most popular and widespread ones (including camera scanning) - have very low resolution values.
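To put rough numbers on that last point, here is a minimal back-of-the-envelope sketch (my own arithmetic, not manufacturer data; real effective scanner MTF is usually well below the sampling limit shown here):

```python
# Sampling (Nyquist) limit in line pairs per mm for common scan resolutions.
# Illustrative arithmetic only; the actual effective resolution of flatbeds
# and camera-scan setups is typically lower than their nominal ppi suggests.
MM_PER_INCH = 25.4

def nyquist_lp_per_mm(ppi: float) -> float:
    samples_per_mm = ppi / MM_PER_INCH
    return samples_per_mm / 2.0  # one line pair needs at least two samples

for ppi in (2400, 4000, 6400):
    print(f"{ppi} ppi -> Nyquist limit ~{nyquist_lp_per_mm(ppi):.0f} lp/mm")
# 2400 ppi -> ~47 lp/mm, 4000 ppi -> ~79 lp/mm, 6400 ppi -> ~126 lp/mm
```

Even a nominal 4000 ppi film scanner cannot sample beyond roughly 79 lp/mm, well below the 100+ Lp/mm figures in the list above.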
Hmm, maybe.
Bits of this test have been bothering me for a few years - not least because I can see clear perceptual differences between the materials that do not translate to resolution test charts (which is why resolution testing is not really used in this way as a means of comparing film performance).
There is nothing wrong with your test methodology (in fact it is consistent enough that the errors can actually be found), but there are systematic errors at the read-out stage - and the results do not account for MTF/noise relationships, which matter far more to how we perceive a film as sharp or grainy, etc.
You can predict the outcomes of a resolution test once you know the test chart's contrast and have the material's MTF plot (I won't go into massive detail here, but it is covered in the SPSE Handbook). Your results are accurate enough to suggest that you read the C-41 materials out at around 5% MTF, not at extinction. The systematic error is with the E-6 materials: you appear to have read those out at extinction rather than at 5%, so they cannot be directly compared with the C-41 results - and that would explain why claims of E-6 resolution do not stack up with the visually obvious optical behaviour of those materials when they are used for anything other than direct viewing.

Having worked with essentially all the materials listed, I can state that by the time that level of resolution might or might not make a difference, the noise/granularity of the material will have far more impact on its ability to transmit any useful information from low-contrast, real-world subjects.
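To make the read-out point concrete, here is a minimal sketch of the idea (the MTF curve, its 50% point, and the thresholds are illustrative assumptions, not measured data): the recorded modulation is roughly the chart's modulation multiplied by the film's MTF at that frequency, and the quoted resolving power is the frequency at which that product drops to whatever read-out threshold you use - so a 5% read-out and a near-extinction read-out give very different figures for the same material.

```python
import numpy as np

# Illustrative stand-in for a real film MTF curve (assumed exponential
# fall-off with a 50% response at 25 lp/mm); real curves come from
# manufacturer data sheets or your own edge/sine measurements.
def film_mtf(f_lp_mm, f_half=25.0):
    return np.exp(-np.log(2) * f_lp_mm / f_half)

def resolving_power(chart_modulation, threshold, f_max=300.0):
    """Frequency at which chart modulation x film MTF falls to the threshold."""
    f = np.linspace(1.0, f_max, 3000)
    recorded = chart_modulation * film_mtf(f)
    below = np.nonzero(recorded < threshold)[0]
    return f[below[0]] if below.size else f_max

# High-contrast chart (modulation ~1.0), two different read-out criteria:
print(resolving_power(1.0, 0.05))  # ~108 lp/mm with a 5% MTF read-out
print(resolving_power(1.0, 0.01))  # ~166 lp/mm read out near extinction
```

Same hypothetical material, over 50% higher 'resolution' just from reading out near extinction instead of at 5% - which is why mixing the two criteria makes E-6 and C-41 figures incomparable.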
Long story short: most C-41 ISO 400 materials run out of visually detectable resolution (under high-contrast test conditions) by around 100 lp/mm +/- 10%, but our visual impression of a material is formed much more by what it is doing around 10-15 lp/mm (cyc/mm), by when visible granularity/noise kicks in, and by the characteristics of the rest of the optical system - be it the MTF of directly exposed print materials (and the lenses involved) or the MTF of whatever scanner was used (which cannot exceed 100% MTF response at low frequencies). Kodak, Fuji, Ilford etc. know this (as do Zeiss etc.), and it has underpinned the way they design materials: the more sharpness you can get at low frequencies, the sharper the material will look, but push too much sharpness to too high a frequency and detail-obliterating noise/granularity kicks in all the harder.
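As a rough illustration of why the rest of the chain caps what the film's high-frequency resolution can actually deliver: to a first approximation the system MTF is the product of the component MTFs, so low-frequency detail survives the scanner (or print) stage reasonably well while the film's last 100 lp/mm is almost entirely lost. The curves below are assumed illustrative shapes, not measured data for any real film or scanner.

```python
import numpy as np

f = np.linspace(1, 150, 150)  # spatial frequency in lp/mm (step of 1)

# Assumed illustrative responses, not measurements:
film_mtf    = np.exp(-np.log(2) * f / 30.0)  # film alone: 50% at 30 lp/mm
scanner_mtf = np.exp(-np.log(2) * f / 20.0)  # scanner alone: 50% at 20 lp/mm

system_mtf = film_mtf * scanner_mtf           # cascaded response of the chain

for q in (10, 15, 40, 100):
    print(f"{q:>3} lp/mm: film {film_mtf[q - 1]:.2f} -> system {system_mtf[q - 1]:.2f}")
# At 10-15 lp/mm a large fraction of the film's response still comes
# through; at 100 lp/mm the cascaded response is essentially zero, so
# whatever the film resolved there never reaches the final image.
```

That is the trade-off the manufacturers are playing with: boost the response where the rest of the system can still pass it, and keep granularity from swamping the frequencies that actually matter.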