Many installers ask whether heating issues come from “low voltage,” but the physics tells a different story. In resistive lighting loads (halogen, incandescent), heat correlates primarily with current, not voltage.

1. Heat is proportional to I²R

P = I²R (Joule Heating)

This means:

  • When current increases, heat rises with the square of the current: doubling the current quadruples the heat.

  • When voltage increases across a resistive load, current rises in proportion, so heat rises with the square of the voltage (see the sketch below).
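
A minimal sketch, assuming a fixed-resistance lamp model (a real tungsten filament's resistance rises as it heats), makes the quadratic relationship concrete:

```python
# Joule heating sketch: P = I^2 * R for a fixed-resistance load.
# The 4.1-ohm value is an assumed hot resistance for a 12 V / ~35 W halogen;
# a real filament's resistance changes with temperature.

R_LAMP = 4.1  # ohms (assumed)

for current in (1.0, 2.0, 3.0):  # amps
    power = current ** 2 * R_LAMP  # Joule heating, watts
    print(f"I = {current:.1f} A  ->  P = {power:.1f} W")

# Doubling the current (1 A -> 2 A) quadruples the heat (4.1 W -> 16.4 W):
# the rise is quadratic in current, not linear.
```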

2. Low Voltage Does Not Eliminate Heat

Lowering voltage reduces current, and with it filament temperature and brightness. It does not eliminate heat, however; it only shifts the lamp to a lower point on the same thermal curve.
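
As a rough illustration using nominal ratings, a 12 V halogen lamp and a 120 V halogen lamp of the same wattage dissipate the same heat; the low-voltage lamp simply draws about ten times the current:

```python
# Same rated power, different voltage: heat output is identical.
# Values are nominal ratings, not measurements.

for volts, watts in ((12.0, 35.0), (120.0, 35.0)):
    amps = watts / volts  # I = P / V at the rated operating point
    print(f"{volts:5.0f} V lamp: {amps:.2f} A, {watts:.0f} W of heat")

# 12 V lamp:  2.92 A, 35 W of heat
# 120 V lamp: 0.29 A, 35 W of heat
# "Low voltage" shifts the current up, not the heat down.
```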

3. For LEDs, Thermal Behavior Is Different

LEDs do not rely on resistive heating like tungsten filaments; instead:

  • Heat is generated at the LED junction and in the driver electronics

  • Driver overcurrent protection helps reduce catastrophic heating

  • Heat degrades lumen maintenance and causes color shift, not just a drop in brightness (see the estimate below)
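
A common first-order estimate of junction temperature is T_j = T_ambient + P_dissipated × Rθ(junction-to-ambient). The sketch below uses assumed values for dissipated power and thermal resistance; real figures come from the module and driver datasheets:

```python
# First-order LED junction-temperature estimate:
#   Tj = Ta + Pd * Rth_ja
# All numbers below are assumed for illustration only.

T_AMBIENT = 40.0     # deg C, enclosed landscape fixture (assumed)
P_DISSIPATED = 5.0   # W of heat at the LED + driver (assumed)
RTH_JA = 12.0        # deg C per W, junction-to-ambient (assumed)

t_junction = T_AMBIENT + P_DISSIPATED * RTH_JA
print(f"Estimated junction temperature: {t_junction:.0f} deg C")  # 100 deg C

# Higher junction temperature accelerates lumen depreciation and color
# shift even when the instantaneous brightness looks unchanged.
```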

Conclusion

High current is the primary thermal driver.
Voltage changes influence current, and current dictates heat.


Q2C — Can Increasing Transformer Output Voltage Reduce Heat?

Short answer: No — it does the opposite. Increased voltage raises current and heat.

Why Some Installers Misinterpret This

Sometimes installers raise voltage to overcome:

  • Long landscape cable runs

  • Voltage drop over distance

  • Undersized conductors

  • High-load configuration

This makes lamps appear brighter and more stable, but the thermal cost is high.
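
A quick voltage-drop estimate shows why a long run tempts installers to select a higher tap. The sketch assumes 12 AWG copper cable (roughly 1.6 ohms per 1000 ft per conductor) and an illustrative load:

```python
# Two-way voltage drop on a low-voltage landscape run:
#   Vdrop = I * R_loop,  R_loop = 2 * length * ohms_per_foot
# Cable resistance, run length, and load current are assumed values.

OHMS_PER_FT = 1.6 / 1000   # ~12 AWG copper, per conductor (approximate)
RUN_LENGTH_FT = 150        # one-way cable length (assumed)
LOAD_CURRENT_A = 6.0       # e.g. ~72 W of lamps at 12 V (assumed)

r_loop = 2 * RUN_LENGTH_FT * OHMS_PER_FT  # out and back
v_drop = LOAD_CURRENT_A * r_loop
print(f"Loop resistance: {r_loop:.2f} ohm, drop: {v_drop:.1f} V")

# ~0.48 ohm of loop resistance and ~2.9 V of drop: a 12 V tap delivers only
# about 9 V at the lamps, which is why a higher tap looks attractive despite
# the thermal cost.
```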

Overvoltage Effects on Halogen

Output    Effect     Risk
11.5 V    Normal     Rated operation
12.5 V    +Heat      Brighter output, shorter lamp life
13.0 V    ++Heat     Filament stress, glass darkening
13.5 V    +++Heat    Overheating, premature failure, capsule cracking
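
Under a constant-resistance approximation, dissipated power rises with the square of the applied voltage. A hot tungsten filament actually follows a somewhat shallower curve because its resistance increases with temperature, so treat the figures below as an upper-bound sketch:

```python
# Power rise with overvoltage, constant-resistance approximation:
#   P / P_rated = (V / V_rated) ** 2
# A hot filament follows a shallower exponent, so these are upper-bound figures.

V_RATED = 11.5  # volts, taken as the rated point in the table above

for v in (11.5, 12.5, 13.0, 13.5):
    rise = (v / V_RATED) ** 2
    print(f"{v:4.1f} V  ->  ~{(rise - 1) * 100:4.0f}% more heat")

# 11.5 V -> ~0% ; 12.5 V -> ~18% ; 13.0 V -> ~28% ; 13.5 V -> ~38%
```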

Overvoltage Effects on LED

LED MR16 modules incorporate IC drivers. Overvoltage causes:

  • Thermal runaway at driver IC

  • Dimming instability

  • EMI noise

  • Shortened LED lifespan

Conclusion

Increasing voltage never lowers heat.
It increases filament temperature, driver current, and heat dissipation.


Q2D — Why Do Magnetic Transformers Produce Voltage Drop Under Dimmer Control?

A frequent question from lighting contractors is why magnetic transformers experience significant voltage drop when paired with dimmers.

1. Magnetic Transformers Are Linear Devices

They rely on electromagnetic induction:

Vout = Vin × (Nsecondary / Nprimary)

Under dimmer-controlled waveforms, the input is no longer sinusoidal.

2. Dimmers Alter the Input Waveform

Phase-cut dimmers clip the AC signal:

  • Leading-edge TRIAC dimmers (common for magnetic)

  • Trailing-edge MOSFET dimmers (for electronic drivers)

The clipped waveform reduces:

  • Effective RMS voltage

  • Magnetizing flux

  • Output regulation capability
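
To put a number on the RMS reduction, the sketch below chops a 120 V sine wave at an assumed 90° firing angle (leading-edge style), computes the resulting RMS, and reflects it through an ideal 10:1 step-down ratio. The firing angle and turns ratio are illustrative assumptions, and core and copper losses would pull the output down further:

```python
import math

# RMS of a leading-edge phase-cut sine wave (numeric integration).
# Firing angle and turns ratio are assumed for illustration.

V_RMS_IN = 120.0                  # nominal mains RMS
V_PEAK = V_RMS_IN * math.sqrt(2)
FIRING_ANGLE = math.pi / 2        # 90 degrees: dimmer at roughly half
TURNS_RATIO = 1 / 10              # ideal 120 V -> 12 V magnetic transformer

N = 100_000
total = 0.0
for i in range(N):
    theta = math.pi * i / N       # one half-cycle suffices by symmetry
    v = V_PEAK * math.sin(theta) if theta >= FIRING_ANGLE else 0.0
    total += v * v
v_rms_dimmed = math.sqrt(total / N)

print(f"Input RMS after clipping: {v_rms_dimmed:.1f} V")                 # ~84.9 V
print(f"Ideal secondary RMS:      {v_rms_dimmed * TURNS_RATIO:.1f} V")   # ~8.5 V

# Even before core and copper losses, the clipped waveform alone pulls the
# nominal 12 V secondary down to roughly 8.5 V.
```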

3. Heating Increases Under Distortion

The transformer core experiences:

  • Higher magnetizing current

  • Higher core losses (eddy + hysteresis)

  • Copper losses (I²R)

The result is:

More heat, less voltage, less efficiency

4. Load Type Matters

  • Halogen = resistive (stable with dimming)

  • LED = reactive + nonlinear (driver-dependent)

LED drivers can cause:

  • Flicker

  • Buzzing

  • Step-dimming instability

  • EMI noise

  • Additional transformer heating

Conclusion

Voltage drop under dimming is inherent due to waveform distortion and magnetic inefficiency under non-sinusoidal excitation. This is expected behavior, not a defect.