Interesting question, and one I hadn't thought about. I'm fairly certain exposure to intense UV light will age the negative faster, but is a 20-minute exposure, repeated say 30 or 40 times, enough to make an impact?
Here's some math that I think could give a theoretical answer:
It looks like a 40 watt lamp over a 6x7cm negative is what has worked best for people, but I read somewhere that UV LEDs are at most 30% efficient, so that's 12 watts of UV light over 42 sq cm, or about 2857 watts per square meter. A typical summer UV index for the northeast US, where I live, is 5. The UV index is the (erythemally weighted) UV irradiance in watts/m^2 multiplied by 40, so a UV index of 5 works out to 0.125 watts/m^2. Watts are energy per unit time, though, and we are only exposing the negative for 20 minutes, so to compare doses we need to convert watts into joules by multiplying by the exposure time in seconds. 2857 watts x 1200 seconds is about 3,428,400 joules (per square meter) for the 20-minute exposure, while 0.125 watts x 3600 seconds (1 hour) is 450 joules. 3,428,400 / 450 is about 7619, so one 20-minute exposure under a 40 watt UV lamp is at most equal to 7619 hours of exposure to the sun.
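In case it helps anyone check the arithmetic, here's a quick Python sketch of the same calculation. The 40 W lamp, 30% efficiency, 6x7 cm negative, and UV index of 5 are the assumptions from above, not measured values:

```python
# Sanity check of the lamp-vs-sunlight UV dose comparison.
# Assumptions: 40 W lamp at 30% UV efficiency, 6x7 cm negative, UV index 5.

lamp_watts = 40.0
uv_efficiency = 0.30             # assumed upper bound for UV LED efficiency
negative_area_m2 = 0.06 * 0.07   # 6x7 cm = 0.0042 m^2
exposure_seconds = 20 * 60       # one 20-minute exposure

# Lamp UV irradiance at the negative, in W/m^2 (~2857)
lamp_irradiance = lamp_watts * uv_efficiency / negative_area_m2

# UV index is UV irradiance in W/m^2 times 40, so UVI 5 -> 0.125 W/m^2
sun_irradiance = 5 / 40.0

# Energy dose per square meter (joules = watts * seconds)
lamp_dose = lamp_irradiance * exposure_seconds   # ~3.43 million J/m^2
sun_dose_per_hour = sun_irradiance * 3600        # 450 J/m^2 per hour of sun

print(f"One 20-minute exposure = {lamp_dose / sun_dose_per_hour:,.0f} "
      f"hours of summer sun")
# -> roughly 7,619 hours
```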
That seems like a lot to me, but there could be a mistake in my math (can any of you verify it?). Does anyone know of research on the stability of film under ultraviolet light? I found one study from the 60s that seemed to indicate 25 hours of exposure to sunlight would halve the density of film, but if that's true our exposures would destroy film very quickly. It may also have been talking about early color film, which I know was not very archival at all. I also found a forum thread saying Kodak had research on their website, but the thread was from 2004 and I couldn't find any of the research it discussed.