Ed, I think you're thinking of neutrinos when you say the Earth itself doesn't provide much attenuation. Fortunately, they don't interact with film any more than with the whole mass of the Earth (it would take something like thirty light-years of lead to reliably absorb a neutrino). Gamma rays, by contrast, are nicely stopped by simple mass shielding of any sort: a given number of grams per square centimeter works roughly the same whether it's lead, concrete, rock dust, water, or even air. It just takes a lot of air to do the job -- fortunately, we've got effectively several tens of miles of it above our heads.
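Just to put a number on the grams-per-square-centimeter idea, here's a quick back-of-envelope sketch in Python. The mu/rho value is my own illustrative figure (roughly right for ~1 MeV gammas, where Compton scattering dominates and the mass attenuation coefficient is nearly material-independent), not anything from a datasheet:

```python
import math

# Back-of-envelope gamma shielding: to first order, attenuation depends on
# areal density (g/cm^2), not on which material supplies the mass.
# mu/rho ~ 0.06 cm^2/g is an assumed illustrative value for ~1 MeV photons,
# where Compton scattering dominates; at low energies lead does much better.
MU_OVER_RHO = 0.06  # cm^2/g, illustrative assumption

def transmitted_fraction(areal_density_g_cm2):
    """Narrow-beam fraction of gammas surviving a given mass thickness."""
    return math.exp(-MU_OVER_RHO * areal_density_g_cm2)

# The same 100 g/cm^2 of shielding, as a linear thickness of each material:
densities = {"lead": 11.34, "concrete": 2.3, "water": 1.0, "air": 0.0012}  # g/cm^3
for name, rho in densities.items():
    print(f"{name:8s}: {100.0 / rho:8.0f} cm thick -> "
          f"{transmitted_fraction(100.0):.2%} transmitted")

# The whole atmosphere is roughly 1030 g/cm^2 of air: a very thick shield.
print(f"full atmosphere: {transmitted_fraction(1030.0):.2e} transmitted")
```

Same 100 g/cm^2, same transmission; only the linear thickness changes with the material, which is the whole point.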
Cosmic radiation is the culprit I've seen implicated in this "cold fogging". In its purest form it's very high energy particle radiation (mostly protons at relativistic velocity), but when it interacts with matter it produces showers of secondary radiation: a witch's brew of particles and photons running from visible and UV through X-ray and far into gamma. That's what fogs the film in the freezer. And the primary particles carry so much energy that any practical shield simply makes things worse, because the shield itself becomes the matter that breeds those secondary showers -- shielding astronauts on long interplanetary voyages promises to be a much larger problem than ensuring a multi-year supply of oxygen.
Oh, and that concrete basement ceiling? It's most likely slightly radioactive (above background) in and of itself, because the gravel used as aggregate probably includes a significant fraction of granite, which in turn carries trace amounts of uranium and thorium. The alpha radiation that stuff mostly produces isn't a huge problem for film, though: alpha particles (helium nuclei) are stopped by the film packaging itself, and even beta is nicely attenuated by the steel shell of a freezer unless there's a lot of it or it's at unusually high energy. Also note that in some parts of the world, basements tend to collect radon gas, which can get inside the freezer and fog the film with its decay radiation regardless of the metal shell.
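For what it's worth, you can sanity-check the beta claim with the empirical Katz-Penfold range formula; the shell thickness below is my guess at typical sheet steel, not a measured freezer wall:

```python
import math

def beta_range_g_cm2(e_mev):
    """Katz-Penfold empirical beta range in g/cm^2 (valid ~0.01-2.5 MeV)."""
    return 0.412 * e_mev ** (1.265 - 0.0954 * math.log(e_mev))

STEEL_DENSITY = 7.87       # g/cm^3
SHELL_CM = 0.08            # assumed ~0.8 mm sheet steel for a freezer wall

shell_g_cm2 = STEEL_DENSITY * SHELL_CM  # ~0.63 g/cm^2 of steel

for e in (0.3, 1.0, 2.3):  # MeV; 2.3 is near the hot end of common emitters
    r = beta_range_g_cm2(e)
    verdict = "stopped" if r <= shell_g_cm2 else "NOT stopped"
    print(f"{e:.1f} MeV beta: range {r:.2f} g/cm^2 -> {verdict} by the shell")
```

Which lines up with the caveat above: a freezer wall handles ordinary beta fine, but the hottest betas (radon daughter Bi-214 emits up past 3 MeV) punch through.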
As for "concrete" information about long term fogging of film in cold storage, as far as I know it's all hypothesizing to explain anecdotal evidence, but the hypotheses are based on pretty well proven properties of film and radiation -- the radiation exposes a halide grain the same way a visible light photon would, and just as with visible light, it takes more than one photon to produce a developable latent "image" speck (in this case, not carrying image information, just noise in the form of overall fog). If the film's reciprocity treshold (which is reduced when it's cold, to make matters a little worse) isn't met, that single photon exposure is eventually "forgotten". If the threshold is met, that exposed halide grain becomes part of the film's fog level. Since the faster films require fewer photons (larger grains collect photons more efficiently and combine exposure over a larger area), and those with less reciprocity failure "forget" the sub-threshold exposures less quickly, those are the films that cold fog the worst. Royal-X, Delta 3200, 2475 Recording, etc.? Likely not to last ten years, even in a freezer. Panatomic X? Probably still be okay when your grandkids use up the last of the 100 bulk rolls you bought up before it was killed, providing they can keep it frozen...
Relative to T-Max 100 and Acros, the data isn't in yet because those films haven't been around long enough, but with their very low level of reciprocity failure (Acros needs only 1.5 stops of compensation at 1000 seconds, T-Max about 3 stops at the same exposure time) they're likely to fog badly after many years, even in deep freeze.