OK, so there is a lot being thrown around all of a sudden about electronics, mainly LEDs, and it has brought up some questions for me. My understanding of how to properly operate an LED, at least in series, is that you need to calculate the incoming voltage, figure out the forward voltage of the LED, and then wire an appropriate resistor on the negative side of the LED, which “knocks down” the excess voltage (really, it limits the current through the LED). So for example, I have a 14.8 V supply; it goes through my Electronic Speed Control (ESC) board, which causes a 0.7 V drop (due to a transistor, I believe), giving me 14.1 V on my lighting output. Now, the LED I am operating is rated at 3.6 V, so I wired in a resistor of the appropriate ohm rating on the negative pole of the LED (actually a bit larger, because I wanted to underpower it). The key to this equation is that each supply voltage would require a different resistor to fully power the LED, right? The same resistor would not work for 21 V or you would blow the LED, and 5 V would likely not power it at all, correct?
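For reference, here is the standard series-resistor math from the numbers above as a quick Python sketch. The 14.8 V, 0.7 V, and 3.6 V figures are from my setup; the 20 mA target current is an assumption on my part (a typical value for small LEDs, not something from my LED's datasheet).

```python
# Series resistor sizing: R = (V_supply - V_forward) / I_led
# Supply/LED numbers are from my setup; 20 mA is an assumed target current.

v_battery = 14.8   # volts, from the pack
v_esc_drop = 0.7   # volts lost across the ESC's transistor
v_led = 3.6        # LED forward voltage (its rating)
i_led = 0.020      # amps -- assumed 20 mA target

v_supply = v_battery - v_esc_drop          # 14.1 V at the lighting output
r_exact = (v_supply - v_led) / i_led       # ohms needed to hit 20 mA
p_resistor = (v_supply - v_led) * i_led    # watts the resistor must dissipate

print(f"Exact resistor: {r_exact:.0f} ohms")   # ~525 ohms
print(f"Resistor power: {p_resistor:.2f} W")   # ~0.21 W
```

Rounding up to the next standard value (560 ohms) underdrives the LED slightly, which is what I did on purpose.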
Question 1) So how are people selling prewired LEDs that cover everything from 9 to 21 volts; am I missing something? Or are they using a resistor sized for the highest voltage and then just allowing the LED to be underpowered in all other situations?
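To put numbers on my guess in Question 1: if the resistor is sized so the LED gets its full current at 21 V, here is what the same resistor would deliver at lower supply voltages (a sketch reusing the assumed 20 mA / 3.6 V LED from above).

```python
# My guess at how "9-21 V" prewired LEDs might work: size the resistor
# for the worst case (21 V) and accept a dimmer LED at lower voltages.
# The 20 mA / 3.6 V figures are assumptions carried over from above.

v_led = 3.6
i_target = 0.020
r = (21.0 - v_led) / i_target   # sized for the worst case: 870 ohms

for v_supply in (21.0, 14.1, 9.0, 5.0):
    i = max(0.0, (v_supply - v_led) / r)   # below ~3.6 V the LED barely conducts
    print(f"{v_supply:4.1f} V supply -> {i * 1000:4.1f} mA")
```

If that guess is right, you would get full brightness at 21 V, noticeably dimmer output at 9 V (~6 mA), and barely a glow at 5 V (~1.6 mA), which matches what I would expect.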
Question 2) I don’t understand why the resistor goes on the negative side of an LED; could someone explain that? And would that be true of any electrical component where you need to step down the source voltage?