It may be worth estimating the standard deviation of the grain for a fine-grained film. Acros is the finest-grained film I know of, if we restrict the choice to ordinary pictorial-type films. It has a granularity of 7, which means that the standard deviation of the density is 0.007 at a density of 1.
According to Selwyn's law (which is really more of a rule of thumb, but is probably accurate enough for rough estimates), the standard deviation of density varies inversely with the square root of the area of the sampling aperture.
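As a quick sketch of that scaling, assuming Selwyn's relation (sigma_D times the square root of the aperture area is constant), here is a tiny Python helper; the function name is just mine for illustration:

```python
import math

def scale_granularity(sigma_ref: float, area_ref: float, area_new: float) -> float:
    """Rescale RMS granularity between measuring-aperture areas.

    Selwyn's law says sigma_D * sqrt(A) is constant, so the standard
    deviation of density scales as 1 / sqrt(aperture area).
    """
    return sigma_ref * math.sqrt(area_ref / area_new)
```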
Granularity is specified in terms of a circular aperture 48 microns in diameter. The area of that aperture is about 1810 square microns.
Let's assume that a 4000 dpi scanner samples with a square pixel 6.35 microns on a side (25,400 / 4,000 = 6.35). That corresponds to an area of about 40.3 square microns.
The area ratio is 1810 / 40.3, or about 44.9, and its square root is about 6.7, which implies that the grain noise for a 4000 dpi scan should be about 6.7 × 0.007 ≈ 0.047.
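Putting those numbers together in Python (a self-contained sketch of the same arithmetic; the constants are the ones quoted above):

```python
import math

SIGMA_REF = 0.007                    # Acros granularity 7 -> sigma_D at density 1
APERTURE_AREA = math.pi * 24.0 ** 2  # 48 micron diameter aperture -> ~1810 um^2
PIXEL_PITCH = 25400.0 / 4000.0       # 4000 dpi -> 6.35 um pixel pitch
PIXEL_AREA = PIXEL_PITCH ** 2        # ~40.3 um^2

ratio = APERTURE_AREA / PIXEL_AREA   # ~44.9
sigma_pixel = SIGMA_REF * math.sqrt(ratio)
print(f"area ratio: {ratio:.1f}, sqrt: {math.sqrt(ratio):.2f}, "
      f"per-pixel sigma_D: {sigma_pixel:.4f}")  # ~0.047
```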
If we assume a standard deviation of 0.047 for the density of a scanned image, do a simulation using a random number generator, and then transform density into the fraction of light transmitted, the standard deviation of the transmitted light falling on a scanner pixel comes out to about 0.011. That is on a 0-to-1 scale. On a 0-to-255 scale (i.e. an 8-bit scale) that becomes a standard deviation of about 2.8 ADC steps. That's roughly ten times more than enough grain to eliminate banding around a density of 1 on Acros film.
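A minimal version of that simulation (NumPy, normal density noise around D = 1, transmittance T = 10^-D, rescaled to 8-bit ADC steps; the 0.047 figure is the per-pixel sigma derived above):

```python
import numpy as np

rng = np.random.default_rng(0)

D_MEAN, SIGMA_D = 1.0, 0.047      # mean density and per-pixel grain sigma
N = 1_000_000                     # number of simulated pixels

density = rng.normal(D_MEAN, SIGMA_D, N)
transmittance = 10.0 ** -density  # fraction of light transmitted, 0-to-1 scale

sigma_t = transmittance.std()
print(f"sigma of transmitted light: {sigma_t:.4f}")        # ~0.011
print(f"in 8-bit ADC steps (x255):  {sigma_t * 255:.2f}")  # ~2.8
```

The small-signal approximation sigma_T ≈ ln(10) × T × sigma_D = 2.303 × 0.1 × 0.047 ≈ 0.011 gives essentially the same answer, which is a useful cross-check on the simulation.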
I'm not sure at this point how to do the calculation at other densities, but it looks like even Acros will probably have enough grain to ensure that we won't see banding in an 8-bit scan.