Originally Posted by Shiranai_Baka
Ah ! Another use for the old Duck tape ! You been watching Red Green ?
I have to post a link :
http://www.redgreen.com/
Last edited by peat moss; 08-25-2005 at 02:22 AM.
Absolutely not true. That would be the case with higher voltage, not amperage.
Amperage is a measure of the amount of electrical current.
2 examples:
1. Taser guns. While they put out thousands of volts, the amperage is relatively low, making them non-lethal weapons. If that same taser gun put out a higher amperage, then it would become lethal.
2. Static electricity can be thousands of volts. But since the amperage is so small, it's truly harmless (as we all know).
Compare it to water flowing through a hose. The temperature of the water represents voltage. The amount of water represents amperage. A little splash of boiling water won't hurt you. A lot of boiling water can definitely do some damage.
Hence the old saying..."It's not the volts that kill you, it's the amps"
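To put some rough numbers on that saying: the current that actually flows depends on both the source voltage and the resistance of the path (Ohm's law, I = V / R). A quick Python sketch, where the skin-resistance figure is an illustrative ballpark assumption, not medical data (and note a static spark is also current-limited at the source, which this simple model doesn't capture):

```python
# Ohm's law: current (amps) = voltage (volts) / resistance (ohms).
def current_through(voltage_v: float, resistance_ohm: float) -> float:
    """Current in amperes through a resistance at a given voltage."""
    return voltage_v / resistance_ohm

dry_skin_ohm = 100_000  # assumed ballpark resistance of dry skin

# 120 V mains across dry skin -> about 1.2 mA in this toy model
print(current_through(120, dry_skin_ohm))
```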
Originally Posted by harrycary
Is that not Ohm's Law ?
http://hyperphysics.phy-astr.gsu.edu...ic/ohmlaw.html
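The relation on that page ties together the three quantities this thread keeps circling: V = I * R. Given any two, the third follows. A minimal sketch, with made-up example values:

```python
# Ohm's law, rearranged each way: V = I * R.
def voltage(current_a: float, resistance_ohm: float) -> float:
    return current_a * resistance_ohm

def current(voltage_v: float, resistance_ohm: float) -> float:
    return voltage_v / resistance_ohm

def resistance(voltage_v: float, current_a: float) -> float:
    return voltage_v / current_a

# A 12 V supply across a 6-ohm load draws 2 A:
print(current(12, 6))
```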
Last edited by peat moss; 08-26-2005 at 12:24 AM.
You're wrong on this one. High voltage may not kill you if the amperage is low, but a high voltage will destroy sensitive electronics. That's why a surge suppressor is designed to protect against spikes in voltage (not amperage).
You say static electricity won't kill you but it sure will kill memory, CPUs or other sensitive electronics.
With a power adaptor (transformer) the amperage rating is the maximum load that the coil in the transformer can take without overheating. So the higher rating just means that it can maintain the voltage. It doesn't mean more power is going into the device.
Increasing the voltage will overload the device being powered.
Providing enough, or more than enough, amperage will simply keep the power stable under load.
Why do you think we all say that a higher rating for a power supply is better?
It is the load that the device draws from. If there isn't enough there then you get a voltage drop and unstable performance.
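That point can be sketched in a few lines of Python, with made-up numbers: the device draws the current it needs, and the adapter's amp rating is the most it can deliver while holding its voltage. A rating at or above the draw leaves headroom; below it, the voltage sags under load.

```python
def has_headroom(adapter_rating_a: float, device_draw_a: float) -> bool:
    """True if the adapter's rated current covers the device's draw."""
    return adapter_rating_a >= device_draw_a

print(has_headroom(2.0, 1.0))  # 2 A adapter on a 1 A device: stable
print(has_headroom(0.5, 1.0))  # undersized: expect a voltage drop
```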
Anyway, I know I'm right.
Last edited by Virtualbody1234; 08-26-2005 at 02:16 AM.
Oh, be fucking careful. I got this router with an AC adapter; it needed a 1 amp 12 V one, so I looked around in all the old electronics and found one. The little plug on the end was too small. If you just buy the end at RadioShack, it was 5 bucks, and it had the polarity marking (to tell which was + and -) perpendicular to the prongs, so that without their 25 dollar adapter you couldn't tell which was which. Blew my router up.
...
Originally Posted by Virtualbody1234
With a power adaptor (transformer) the amperage rating is the maximum load that the coil in the transformer can take without overheating.
That is truly ridiculous. The voltage & amperage rating of an AC adapter is its rated output. It has nothing to do with overheating. It merely states what it is capable of outputting. Do you understand? Likewise, the electrical device you're powering will have a similarly rated input. (btw, Alternating Current and Direct Current are measured in the same way)
Hell, I wasn't going to respond to your post, but you're incorrect, and what you're stating can cause people plenty of problems.
I'm not trying to be mean. I'm just stating facts I've learned in school and have applied in real life. (my previous job was with the 2nd largest electrical distributor in the US where I worked on bids for large electrical projects like arenas, stadiums and other commercial applications)
peat moss, Ohms are a measure of resistance and do not apply in this situation.
Mods, now that Duffman has solved his dilemma, please do us all a favor and close this thread.
regards.
@Harrycary. Why close the thread? Are you concerned that someone else might tell you that you're wrong?
And btw the thread is to help Shiranai_Baka not Duffman.
@The other people reading this thread, if you need a power adaptor then get one that has the same voltage and amperage as specified. That's always the best choice.
Lol I don't mind this. Besides, this is on topic. I would like to know if higher or lower is better. (not saying anyone is wrong)
My point is that you should have the same voltage and the same or higher amperage adaptor.
harrycary is saying the same voltage and the same or lower amperage.
Isn't that what the disagreement is about?
A power adaptor is similar to a power supply. It seems easy to understand that a higher rated power supply is better.
I certainly wouldn't want to run my PC with an underpowered power supply.
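Putting the whole rule being argued here into one place: voltage (and polarity, per the blown-router story above) must match exactly, while the adapter's amp rating should meet or exceed the device's requirement. A minimal sketch; the parameter names and example values are hypothetical:

```python
def adapter_matches(adapter_volts: float, adapter_amps: float, adapter_polarity: str,
                    device_volts: float, device_amps: float, device_polarity: str) -> bool:
    """True if the adapter can safely power the device."""
    return (adapter_volts == device_volts          # voltage must match exactly
            and adapter_polarity == device_polarity  # wrong polarity can fry the device
            and adapter_amps >= device_amps)       # rating must cover the draw

# 12 V / 1 A center-positive router:
print(adapter_matches(12, 1.5, "+", 12, 1.0, "+"))  # higher amp rating: fine
print(adapter_matches(12, 0.5, "+", 12, 1.0, "+"))  # undersized: no
```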
I would also like to invite others to share their opinion about this.