First, thanks to all for your help, and I hope I am not wearing anyone's patience thin. I am learning a ton about electronics and electricity, subjects I know next to nothing about. So here are some more questions.
Thanks to what I have already learned from you guys and online, I understand the difference (at least vaguely) between voltage, amperage (current), wattage, and resistance, and how they play together through Ohm's law. To ask the question I will first describe what I am working with and what I would like to do, if possible. I have acquired a power supply out of an old computer: 115 V in, rated to produce 12 V at 8 A and 5 V at 20 A. Measured, I get 11.5 V and 5.1 V, so it is basically doing what it says it will do; I haven't measured the amps because I'm not sure how. The main purpose of this project is to learn and play with something unfamiliar (and try not to kill myself). Beyond that, I want to see if I can build a resistance soldering unit and have it double as a DC power station for testing or otherwise playing with stuff. Now, for the resistance soldering unit, I know that wattage is the number I am after in the power equation; that's the heat I am going to need. 12 V at 8 A produces 96 W, and 5 V at 20 A produces 100 W.
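To keep my own numbers straight, here is the back-of-the-envelope arithmetic (nothing fancy, just watts = volts × amps with the ratings above):

```python
# Power available from each rail, using P = V * I with the rated numbers
rails = {
    "12 V rail": (12.0, 8.0),   # (rated volts, rated amps)
    "5 V rail": (5.0, 20.0),
}

for name, (volts, amps) in rails.items():
    print(f"{name}: {volts} V x {amps} A = {volts * amps:.0f} W")
# 12 V rail: 12.0 V x 8.0 A = 96 W
# 5 V rail: 5.0 V x 20.0 A = 100 W
```

So the two rails come out within a few watts of each other, which is what prompted the first question.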
First question: Am I right in thinking that using the higher voltage and lower amperage to produce basically the same wattage would be preferable, because it would put less strain on the circuit and not require such heavy-duty components? Given the choice of the two sources, wouldn't the 12 V / 8 A rail be the better choice for the soldering unit?
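The hunch behind that question, with some throwaway numbers: if I understand it right, the heat wasted in the leads and connections goes as I² × R, so for the same delivered wattage the lower-current rail should stress the wiring less. The lead resistance below is a pure guess, not a measurement:

```python
# Heat wasted in the leads goes as I^2 * R_leads -- R_leads here is only a guess
r_leads = 0.01  # ohms, placeholder for a few feet of wire plus clips

for volts, amps in [(12.0, 8.0), (5.0, 20.0)]:
    lead_loss = amps ** 2 * r_leads
    print(f"{volts} V @ {amps} A rail: roughly {lead_loss:.1f} W lost in the leads")
```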
Second question: Does amperage (current) increase as resistance is added? Does voltage drop as resistance is added? I would think the answer to both questions is yes, but I really don't know why I think that. When I apply the current to the work I am soldering there will be added resistance (that's what we want so it will heat up, right?), and I am assuming this will increase the "load" and require more current (amps) to push the electrons through. Or am I totally off base? Basically, where I am going with this is: will I need to protect the circuit so that when the load is applied I won't get a spike in amperage and burn the system out? I am using all 15 A rated parts based on the 8 A rating, and I am thinking a fuse or circuit breaker on the soldering leads is in order, sized for, say, a 10 A load, but made slow-blow so it will take the momentary surge of soldering yet cut out if things go horribly wrong. To take it a step further, could, or should, a person put in a slow-blow 10 A circuit protector to allow the soldering, and then in series put in a 15 A fast-blow for instant protection in case of a large surge? After all, resistance soldering is creating a short on purpose, so we have to allow some of it, but we don't want it to go beyond the capability of the system, right?
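Here is the Ohm's law arithmetic I keep going back and forth on, assuming the supply simply holds its 12 V and the joint is the only resistance in the circuit (the joint resistances below are placeholders I made up, not measurements):

```python
# I = V / R and P = V * I for a few assumed joint resistances (placeholder values)
supply_v = 12.0
for r_joint in [0.5, 1.0, 2.0, 4.0]:  # ohms, just guesses to see how the numbers move
    current = supply_v / r_joint
    power = supply_v * current
    print(f"R = {r_joint} ohm -> I = {current:.0f} A, P = {power:.0f} W")
```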
Third question: Both for soldering and for a generic DC power supply, having the ability to vary the voltage would be nice, albeit for different reasons. I do know that lowering the incoming voltage to a transformer also lowers its output. In everything I have read about building these units there is mixed opinion on how to do this. As I am trying to use inexpensive items that are on hand or off the shelf at Home Depot, the idea of using a dimmer on the AC side to lower the AC input voltage seems like the ticket, but it has its detractors. Since I am going to protect the circuit for 15 A max, can I use a 15 A dimmer on the AC side to control the DC output voltage reliably?
Fourth question: This is the one I am most concerned with. Up to this point everything has revolved around the 12 V side of the equation. I have the 5 V side available and would like to use it as well, but at 20 A it is way more current than I would ever use for low-voltage applications, and I do not like the idea of such a high load on the system and its components when it is unnecessary. So can I maintain the voltage but reduce the amperage down to, say, only 3 or 4 amps, and what would be the best way to do it? Is it as simple as just adding a resistor? And if I do add a large enough resistor to knock down the current, how much, if any, voltage drop would I see across said resistor? Is there another relatively simple way? The other idea is to just not use the 5 V side at all and only use the 12 V side, dropping it down as mentioned before. But it is there, and it just seems like I should use it for something. Is there a variable resistor so one could control amps on demand?
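Here is how I have been trying to picture the series-resistor idea, with made-up values (a hypothetical 1 ohm resistor and a few example load currents), using V_drop = I × R:

```python
# Voltage lost across a hypothetical series resistor: V_drop = I * R (made-up values)
supply_v = 5.0
r_series = 1.0  # ohms, hypothetical
for load_amps in [1.0, 2.0, 4.0]:  # whatever current the load happens to draw
    drop = load_amps * r_series
    print(f"{load_amps} A through {r_series} ohm drops {drop:.1f} V, "
          f"leaving about {supply_v - drop:.1f} V at the load")
```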
Last question: measuring amperage. Does the power source need to have a load applied in order to measure it? Or is simply connecting the meter leads to each of the wires going to give me the amperage? Also, this goes back to the idea of adding resistance and changing the amperage. With no load on the system, will I be getting an accurate reading? Or would I need to drop in a resistor of some sort (a piece of brass) to create the load in order to measure it?
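For what it is worth, here is the picture in my head of what a meter wired in series with a load would read, using the 11.5 V I actually measure and a couple of made-up load resistances:

```python
# What I think an in-series ammeter would read for a few example loads (I = V / R)
supply_v = 11.5  # what the "12 V" rail actually measures on my meter

for r_load in [None, 4.0, 2.0]:  # None stands for nothing connected at all
    if r_load is None:
        print("No load connected: no complete circuit, so nothing to read")
    else:
        print(f"{r_load} ohm load: about {supply_v / r_load:.1f} A")
```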
I think that's all I have for now. Thanks ahead of time. I hope people view this as a chance to educate rather than as an annoyance.