So I decided to do a little experiment...
I've created a rig for testing the amperage of whatever I want to plug into it: an 1110 box with a receptacle, and a meter hooked up inline on the hot wire.
The current running through my computer is an average of 0.6 A at idle.
Watching a fullscreen HD video it averages 0.7 A and can get as high as 0.8 A.
Playing a game the average is 0.78 A, getting as high as 0.81 A.
Testing the line voltage gives me a reading of 121.8 volts.
So wattage = volts * amps = 121.8 * 0.8 = 97.44 watts.
Now I know that's not exact, because with AC the volts * amps product is only the apparent power (VA); the real wattage is that times the power factor, since the load isn't purely resistive. But it's still close...
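If anyone wants to play with the numbers, here's a rough sketch of what my meter reading actually gives me. The 0.95 power factor is just an assumption (supposedly typical for a PSU with active PFC), not something I measured:

```python
# What the volts*amps reading actually measures, plus an assumed power factor.
# NOTE: power_factor = 0.95 is a guess (said to be typical for active-PFC
# supplies); my meter only reads volts and amps, not real watts.
volts = 121.8        # measured line voltage
amps = 0.8           # roughly the peak current I saw while gaming
power_factor = 0.95  # assumed, not measured

apparent_power_va = volts * amps                  # this is all V*A gives you: VA
real_power_w = apparent_power_va * power_factor   # real watts, if PF really is ~0.95

print(f"Apparent power: {apparent_power_va:.2f} VA")                      # 97.44 VA
print(f"Real power (assuming PF={power_factor}): {real_power_w:.2f} W")   # ~92.57 W
```

Either way, the real draw can only be lower than the VA figure, so it doesn't change the conclusion.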
Who said we need 1000 watt power supplies? I could get away with a 150 watt one...
Unless someone can prove me wrong here. What am I missing?
BTW, to prove the meter is accurate I tested my basement's lighting circuit, which gave me ~460 watts. That's eight 50 watt pot lights (8 * 50 = 400 W), and they're low voltage, so they run off transformers, which adds some inefficiency.
I also tested a plug-in space heater which is rated at 15 amps; it gave a reading of ~10 amps once it was running (I didn't leave it on too long since I have a 6 amp fuse in this meter that I don't want to blow...).
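Here's the same kind of back-of-the-envelope math for those two sanity checks (again, this ignores power factor, so treat the lighting "overhead" as a rough lump of transformer loss plus whatever else):

```python
# Back-of-the-envelope math for the two sanity checks.
volts = 121.8

# Basement lighting: 8 x 50 W rated lamps vs ~460 W measured.
rated_w = 8 * 50                    # 400 W of lamps
measured_w = 460                    # what the meter implied
overhead_w = measured_w - rated_w   # ~60 W of transformer loss / measurement slop
print(f"Lighting overhead: {overhead_w} W ({overhead_w / rated_w:.0%} above rated)")

# Space heater: ~10 A measured once it was running.
heater_amps = 10
print(f"Heater draw: ~{volts * heater_amps:.0f} W")   # ~1218 W
```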
Other interesting readings I found:
My monitor (19-inch widescreen LCD from Samsung) = 0.18 A
My TV (32-inch widescreen LCD from Sharp) = 0.86 A
Xbox 360 = 0.8 A in low-power mode, 1.1 A in a game
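And for anyone who'd rather see ballpark watts instead of amps, here's the same conversion applied to all the readings (again just volts * amps, so really VA):

```python
# Ballpark watts for each reading, using volts * amps
# (so really VA, since power factor is ignored).
volts = 121.8
readings_amps = {
    "Computer (idle)": 0.6,
    "Computer (gaming, average)": 0.78,
    "19in Samsung LCD monitor": 0.18,
    "32in Sharp LCD TV": 0.86,
    "Xbox 360 (low-power mode)": 0.8,
    "Xbox 360 (in a game)": 1.1,
}
for device, amps in readings_amps.items():
    print(f"{device}: ~{volts * amps:.0f} W")
```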