On March 12, 2018 at 19:22, edizzle said...
you mean what what you think is 0%
You say no small thing! The following is more than you want to know, but if you've ever measured the current and voltage of LEDs while they're running, it hints at why a five-watt LED doesn't draw five watts, and at other things that don't seem to make sense.
You have to apply a certain amount of voltage, somewhere around 2 volts, for an LED to light up at all. 12 V LEDs are typically three LEDs in series, so you could present as much as 6 volts to the string and the LEDs still won't turn on. If full power is 12 volts, then this math gives zero percent light at just under 50% voltage!
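A toy model makes the point. This is my own sketch, not any manufacturer's spec: the ~2 V per-LED threshold and the three-LED series string are the assumptions from the paragraph above.

```python
# Toy model of a "12 V" LED string: each LED is dark below its forward
# threshold, so the whole series string is dark until the supply exceeds
# the combined thresholds. Values are illustrative assumptions.
LED_FORWARD_V = 2.0   # assumed per-LED turn-on voltage
LEDS_IN_SERIES = 3    # typical "12 V" LED string

def string_is_lit(supply_volts):
    """True once the supply exceeds the combined forward voltage."""
    return supply_volts > LED_FORWARD_V * LEDS_IN_SERIES

for v in (0, 3, 5.9, 6.1, 12):
    print(f"{v:5.1f} V -> {'lit' if string_is_lit(v) else 'dark'}")
```

So "50% voltage" (6 V of 12 V) sits right at the edge of zero percent light.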
But wait, there's more!
If full power is 12 volts because you have a 120-to-12-volt transformer, then actual full power is NOT 12 volts DC. The 12 volts AC gets rectified to pulsating DC, losing two diode drops (typically 0.7 V each) along the way. But... every diagram I can find shows two half-waveforms that meet at zero volts, even though the diodes (almost completely) don't conduct when less than about 0.7 volts is applied across them. A period of no conduction (well, minuscule conduction) should be shown between half-pulses!
Now, this waveform with the (nearly) blank space between the bumps will have a peak voltage of the square root of two times 12 volts, minus two diode drops, or... about 15.6 V.
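Here's that arithmetic, plus the conduction gap, as a quick sketch. I'm idealizing the bridge as "zero output until the input beats two fixed 0.7 V drops," which is the simplification from above; real diode drops vary with current.

```python
import math

V_RMS = 12.0        # transformer secondary, RMS
DIODE_DROP = 0.7    # assumed fixed per-diode drop
V_PEAK_IN = V_RMS * math.sqrt(2)   # ~16.97 V peak of the AC sine

def bridge_output(t, freq=60.0):
    """Idealized full-wave bridge: output is zero until the instantaneous
    input exceeds two diode drops, hence the gap between half-pulses."""
    v_in = abs(V_PEAK_IN * math.sin(2 * math.pi * freq * t))
    return max(0.0, v_in - 2 * DIODE_DROP)

peak_out = V_PEAK_IN - 2 * DIODE_DROP
print(f"peak output = {peak_out:.2f} V")   # 15.57 V

# Sample one full cycle and measure how much of it has no conduction
samples = [bridge_output(t / 10000 / 60) for t in range(10000)]
dead = sum(1 for v in samples if v == 0.0) / len(samples)
print(f"no-conduction fraction of cycle = {dead:.1%}")
```

The dead time comes out to roughly 5% of the cycle with these numbers, which is why it ought to show up on a properly drawn diagram.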
If that voltage is fed to a capacitor and no current is drawn, the actual voltage will settle even higher, since tiny amounts of current flow through the diodes even with less than 0.7 volts across them. I've seen 16 volts under those conditions.
As soon as you draw current, though, the DC value drops. You'll get clean DC until you draw so much that the capacitors can't hold the voltage between pulse peaks up around 12 volts. Beyond that point, the load makes flicker possible.
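The usual back-of-envelope for that sag is ripple ≈ I / (C × f_ripple), with the cap recharged 120 times a second on full-wave-rectified 60 Hz mains. The capacitor value and load currents below are my own assumptions, just to show where "clean DC" gives way to "flicker possible."

```python
# Ripple estimate for a full-wave rectifier feeding a filter capacitor.
# Between recharges, the load discharges the cap by about I / (C * f).
F_RIPPLE = 120.0   # Hz: full-wave rectified 60 Hz mains
V_PEAK = 15.6      # volts at the cap, from the peak math above
V_MIN = 12.0       # flicker becomes possible once the valley sags below this

def ripple(load_amps, cap_farads):
    return load_amps / (cap_farads * F_RIPPLE)

cap = 4700e-6      # assumed filter cap, 4700 uF
for amps in (0.1, 0.5, 1.0, 2.0, 3.0):
    dv = ripple(amps, cap)
    status = "flicker possible" if V_PEAK - dv < V_MIN else "clean DC"
    print(f"{amps:.1f} A: ripple {dv:.2f} V -> {status}")
```

With these numbers the crossover lands somewhere past 2 A: light loads ride well above 12 V, heavy loads let the valleys dip below it.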
So, yeah, zero percent isn't necessarily zero percent. And 12 volts isn't twelve volts. Et cetera.