Some time ago, a poster wrote about using a 500 Watt power supply in a PC, wrongly assuming that the machine consumed that much power all the time. I argued that a typical PC left powered up in an office would consume between 70 and 100 Watts. I finally got around to testing my workshop PC in actual use. I made no modifications to the machine and left all the cards and drives plugged in.
The computer comprises:
- Gigabyte GA-P35-DS3P motherboard
- Intel Core 2 Duo E6420 2.13GHz
- 2GB RAM
- Main video card - NVIDIA GeForce 8500GT
- Secondary video card - NVIDIA GeForce 8500GT
- 1 X 1TB SATA hard disk
- 3 X 500GB SATA hard disks
- 1 X floppy drive
- 2 X DVD/CD burners
- 1 X Sound Blaster sound card
- 1 X TV tuner card
- 1 X 350 Watt power supply
- 1 X case fan (120mm)
Here are the figures I measured (a rough running-cost calculation follows the list):
- Computer off (soft switch off) - 6 Watts
- Monitor off (soft switch off) - 4 Watts (Very ancient Sony 17" LCD - analogue input only)
- Computer operating (web browsing, spreadsheet, etc. - hard disks NOT asleep) - 139 Watts. When the hard drives power down, that figure should fall below 100 Watts.
- Monitor operating - 33 Watts
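To put those measurements in perspective, here is a minimal sketch of the annual running cost. The usage pattern (8 hours a day in use, 16 hours in soft-off standby) and the tariff of 0.15 per kWh are assumptions for illustration, not measured values:

```python
# Rough annual running-cost estimate from the measured figures.
# The usage pattern (8 h/day on, 16 h/day soft-off) and the tariff
# are illustrative assumptions, not part of the measurements.

TARIFF_PER_KWH = 0.15  # assumed electricity price, currency units per kWh

def annual_kwh(watts: float, hours_per_day: float) -> float:
    """Convert a steady power draw into kilowatt-hours per year."""
    return watts * hours_per_day * 365 / 1000

on_kwh = annual_kwh(139 + 33, 8)    # computer + monitor operating, 8 h/day
off_kwh = annual_kwh(6 + 4, 16)     # computer + monitor soft-off, 16 h/day

total_kwh = on_kwh + off_kwh
print(f"In use:  {on_kwh:.0f} kWh/year")
print(f"Standby: {off_kwh:.0f} kWh/year")
print(f"Total:   {total_kwh:.0f} kWh/year, "
      f"about {total_kwh * TARIFF_PER_KWH:.2f} per year at the assumed tariff")
```

Even the soft-off standby draw adds up to roughly 58 kWh a year under those assumptions, which is why the figures for "off" are worth measuring at all.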
I am confident that more modern computers, and those with on-board video, on-board sound, no TV card and a single hard disk, will consume considerably less energy. Standby power figures for newer monitors should be lower too. A typical hard disk draws more than 10 Watts, so dropping three of my four disks would cut consumption by more than 30 Watts. In fact, that is precisely what I intend to do: I will use USB-connected drives as and when I need extra space.
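For the disk change specifically, the arithmetic is simple. The 10 Watts per disk figure is the one quoted above; the 8 hours a day duty cycle is an illustrative assumption:

```python
# Savings from removing three of the four internal hard disks.
# 10 W per disk is the figure quoted above; the 8 h/day duty
# cycle is an illustrative assumption.

WATTS_PER_DISK = 10
DISKS_REMOVED = 3
HOURS_PER_DAY = 8

saved_watts = WATTS_PER_DISK * DISKS_REMOVED               # 30 W while running
saved_kwh_year = saved_watts * HOURS_PER_DAY * 365 / 1000  # ~88 kWh/year

print(f"Power saved while running: {saved_watts} W")
print(f"Energy saved per year:     {saved_kwh_year:.0f} kWh")
```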