Hi, to just get a usable speed rating I would tend toward testing in the situation where you actually take your photos.
But if you want to make critical comparisons between different materials, then a controlled (and thus repeatable) setup is probably better.
Things that might affect your results include the spectral makeup (loosely, the color) of the light. If the exposure meter doesn't have the same spectral sensitivity as the film then... well, you can probably see where this is going.
In the case of a color film, something that may not be obvious is that there are basically three different color-sensitive layers, and each layer needs to be more or less "correctly" exposed. These films are generally "balanced" for "daylight," which is considered to have a color temperature of about 5500K. At any other color temperature there will be a tendency for the three layers to become "unbalanced" exposure-wise. So unless you thoroughly understand these effects, and perhaps use color-balancing filters on the camera, you may end up with one of the color layers underexposed. An actual shooting test under those conditions should reveal such a potential problem.
Fwiw an electronic flash typically has a color temperature of about 5500 to 6000K, pretty close to "daylight," so the flash should be a good testing substitute, assuming the same meter, etc.
If you are shooting color film outdoors, but in the shade, most likely the light will have a higher color temperature, perhaps 8000 to 10000K as an example. This means that bluish light makes up a larger proportion of the total. So you can expect a healthy amount of exposure in the blue-sensitive layer of the film, but... you may be underexposed in the red-sensitive layer. If the light has a lower color temperature instead, the situation is reversed: the blue-sensitive layer would tend to be underexposed.
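Fyi the usual way to put a number on that kind of correction is in "mireds" (a million divided by the color temperature in kelvin). Just as a rough sketch, assuming the 8000K shade and 5500K daylight-balanced film figures from above, the arithmetic looks something like this (the little Python snippet and its specific values are only my own illustration, nothing particular to your film or meter):

    def mireds(kelvin):
        # Convert a color temperature in kelvin to mireds
        # (micro reciprocal degrees).
        return 1_000_000 / kelvin

    film_balance_k = 5500   # "daylight" balanced color film (assumed)
    shade_light_k = 8000    # bluish open shade, from the example above (assumed)

    # Positive shift -> warming filter needed (81/85 series);
    # negative shift -> cooling filter (80/82 series).
    shift = mireds(film_balance_k) - mireds(shade_light_k)

    print(f"Required shift: {shift:+.0f} mireds")   # prints about +57 mireds

You would then look for a warming filter whose mired value comes closest to that shift; the exact choice depends on the filter maker's tables.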
Anyway, there is a lot to be said for doing the test under your actual shooting conditions.