shutterfinger, your electronics 101 is a little different from mine. In the time I spent composing this post a number of other people have chimed in; what follows is in response to post #11.
A light bulb does not draw constant power. By your argument, that 250W bulb would draw 250 amps with 1V across it, which is crazy. A more realistic value is around half an amp for a traditional (non-halogen) incandescent bulb with a tungsten filament. The discussion below relates to standard incandescent bulbs; halogen bulbs may behave differently (or may not, I simply don't know).
Tungsten has a temperature coefficient of resistance of around +4500 ppm/K. The operating temperature is around 3000K and room temperature is around 300K, so the filament resistance increases by roughly a factor of
1 + ((3000-300)*4.5e-3) = 13.15.
That 250W bulb has a filament resistance of 25.6 ohms at operating equilibrium when powered with 80V (either 80VDC or 80Vrms), so working backwards gives a room-temperature filament resistance of around 2 ohms. Put 1V across that and you get half an amp, like I said above. These numbers are obviously rough approximations, but they're not unrealistic.
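If anyone wants to sanity-check the arithmetic, here's a quick Python sketch using only the numbers above (the 80V/250W rating and the tempco figure):

# Quick check of the arithmetic above, using only numbers from this post.
ALPHA = 4.5e-3                      # tungsten tempco, ~+4500 ppm/K
scale = 1 + (3000 - 300) * ALPHA    # hot/cold resistance ratio, = 13.15
R_hot = 80**2 / 250                 # 25.6 ohms at operating temperature
R_cold = R_hot / scale              # ~1.95 ohms at room temperature
print(R_cold)                       # ~2 ohms at room temperature
print(1.0 / R_cold)                 # ~0.5 A with 1 V across the cold filament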
The actual behavior of light bulbs is surprisingly complex and nonlinear. (I saw a full derivation of this in a lecture when I was in EE grad school back in the early 1980s, but those notes are long gone. I wish I still had them; it was a thing of beauty that I've never seen reproduced anywhere else.) In simplified form, it goes like this: the filament is "cold" (at room temperature, 300K) when power is first applied, so it has low resistance and draws a lot of current, much more than it does when it reaches operating temperature. For a very brief time following turn-on the power is really high - that 250W bulb with 80V applied will initially draw ~40 amps, an instantaneous power of ~3200W. This huge initial surge makes the filament temperature rise very rapidly. The filament resistance rises with temperature, so the current and power both fall as the filament heats up. Eventually (and surprisingly quickly, maybe 1/20 of a second) the filament reaches its equilibrium operating temperature, with the bulb drawing its rated current of 3.125 amps and dissipating its rated power of 250W.
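For anyone who wants to play with this, below is a toy lumped-element model of the turn-on transient in Python. The emissivity and filament heat capacity are rough guesses on my part (not measured values for any real bulb), and the radiating area is simply chosen so that 3000K at 250W is the equilibrium point; everything else comes from the numbers above.

# Toy lumped-element model of incandescent turn-on. EPS and C_TH are
# rough assumed values for illustration, not data for any real bulb.
SIGMA = 5.67e-8                 # Stefan-Boltzmann constant, W/(m^2 K^4)
ALPHA = 4.5e-3                  # tungsten tempco, per K (from above)
T_AMB, T_OP = 300.0, 3000.0     # room / operating temperature, K
V, P_RATED = 80.0, 250.0        # rated voltage and power (from above)

R_hot = V**2 / P_RATED                          # 25.6 ohms hot
R_cold = R_hot / (1 + ALPHA * (T_OP - T_AMB))   # ~1.95 ohms cold

EPS = 0.35     # assumed emissivity of hot tungsten - a rough guess
C_TH = 5e-3    # assumed lumped filament heat capacity, J/K - a rough guess
# Pick the radiating area so that 3000 K at 250 W is the equilibrium point:
AREA = P_RATED / (EPS * SIGMA * (T_OP**4 - T_AMB**4))

def resistance(temp):
    # Linear R(T) model anchored at room temperature.
    return R_cold * (1 + ALPHA * (temp - T_AMB))

# Forward-Euler integration of C_TH * dT/dt = P_electrical - P_radiated
temp, t, dt = T_AMB, 0.0, 1e-5
while temp < 0.999 * T_OP:
    p_elec = V**2 / resistance(temp)                   # falls as temp rises
    p_rad = EPS * SIGMA * AREA * (temp**4 - T_AMB**4)  # rises as temp rises
    temp += dt * (p_elec - p_rad) / C_TH
    t += dt

print(f"initial surge: ~{V / R_cold:.0f} A, ~{V**2 / R_cold:.0f} W")
print(f"~{temp:.0f} K reached after roughly {t * 1000:.0f} ms")

With these made-up constants the model settles in a few tens of milliseconds, the same ballpark as the 1/20 second figure, but don't take the exact timing seriously - it's entirely at the mercy of the guessed heat capacity.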
Decreasing the voltage decreases the operating temperature. This has two effects - it prolongs bulb life, and it lowers the color temperature, making the light more yellowish. It's true that the filament resistance is lower at reduced voltage, which props the current up a bit compared to what a fixed resistance would draw, but the effect is nowhere near enough to keep the power dissipation constant. Power and temperature both decrease, giving the color and lifetime changes I mentioned.
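There are well-known rule-of-thumb scaling laws for incandescent lamps that quantify this. The exponents below are the commonly quoted approximations (they vary a bit from source to source), and the 1000-hour baseline life is just an illustrative assumption:

# Rule-of-thumb incandescent scaling laws. The exponents are commonly
# quoted approximations and vary somewhat between sources; the 1000 h
# baseline life is an assumed value for illustration.
def lamp_at(v_ratio, p_rated=250.0, cct_rated=3000.0, life_rated=1000.0):
    """Approximate lamp behavior at a given fraction of rated voltage."""
    return {
        "power_W": p_rated * v_ratio**1.6,    # power falls faster than V
        "cct_K": cct_rated * v_ratio**0.42,   # color temperature
        "relative_light": v_ratio**3.4,       # light falls faster still
        "life_h": life_rated * v_ratio**-13,  # life rises very steeply
    }

# The 10% voltage reduction from the printing example below:
for name, value in lamp_at(0.9).items():
    print(name, round(value, 2))

At 90% of rated voltage these rules give roughly 210W instead of 250W, a color temperature about 130K lower (more yellow), around 70% of the light output, and a several-fold increase in life - exactly the power/color/lifetime tradeoff described above.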
You can see this in printing if you have an enlarger with a standard bulb and no voltage regulator, and you power it from a variable transformer (variac). Make a black-and-white print on VC paper at the rated supply voltage (120V, or whatever your local normal is). Now turn the voltage down 10% and make a matching print - you'll need a higher-contrast filter to do so, because the yellower light reads as lower contrast on VC paper. In color printing the color balance will shift as you change the lamp voltage; color enlargers often use regulated power supplies to keep the voltage constant and avoid this problem.
I don't see how a bad socket can cause a bulb to blow. It could cause other problems, though - shorting out the supply, or adding contact resistance so the bulb can't draw full power.
None of this addresses the OP's issue, but the "explanation" in post #11 is so wrong that I couldn't let it go unchallenged.