Having operated my computer in accordance with some popular advice, I decided it was time to put the theory to the test. I plugged my home computing setup (a CPU, monitor, printer, and speakers) into a single power strip, plugged the strip into the Kill-A-Watt, and then fired the whole thing up to try to figure out the truth. As part of the trial, it was important to differentiate between startup power and running power. Here's my methodology:
There is no way for me to connect the Kill-A-Watt to the system after it has already booted, so I couldn't monitor energy consumption while it was already up and running. Instead, I logged the energy use during the first five minutes after startup, then monitored the total consumption over one hour and subtracted the energy consumed in those first five minutes to give me a conservative approximation of running power. Here's how it broke down:
Total energy used during the five-minute startup was 0.01 kWh. BUT a full hour of running all the equipment consumed 0.11 kWh. Subtracting the former from the latter gives a conservative estimate of about 0.10 kWh of energy consumed over the remaining 55 minutes of running time. So firing up my particular system uses only about one-tenth the energy needed to run the system for a single hour.
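If you want to sanity-check the arithmetic yourself, here's a quick back-of-the-envelope sketch in Python (the variable names are my own, and the figures are the ones from my trial):

```python
# Back-of-the-envelope check of the Kill-A-Watt readings from the trial above.
startup_kwh = 0.01       # energy logged during the five-minute startup
first_hour_kwh = 0.11    # total energy logged over the first hour

# Running energy: the hour's total minus the startup portion.
running_kwh = first_hour_kwh - startup_kwh   # ~0.10 kWh over the remaining 55 minutes

# Startup energy as a fraction of an hour's worth of running energy.
ratio = startup_kwh / running_kwh            # about one-tenth

print(f"running energy: {running_kwh:.2f} kWh")
print(f"startup/running ratio: {ratio:.1f}")
```

Nothing fancy, but it makes the one-tenth claim explicit: the startup burst is small compared with an hour of steady use.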
Clearly, the popular wisdom we were testing did not hold up in this trial. BUT I still wanted to give our microprocessing friend a fair shake...