Hiya, this is my second power consumption related thread. I just wanted to know how power consumption is affected when you overclock a CPU by 5%.


CPU: Intel Pentium III 600MHz, Katmai. 2.05V core voltage
Mobo: Abit BE6
HDD: Maxtor 8.4GB 5400RPM
Video: ATI Rage Fury Pro Vivo
CD: one generic 24x10x40 CD-RW drive
mouse: Logitech Optical
KB: Microsoft Natural Elite
230W generic PSU
160MB of generic PC100 memory at default latency

At the official frequency of 600MHz I got these results.

when idling at Windows desktop with no disk activity:

power consumption: 84W

more details for the tech-inclined people:
line voltage 122.5V
power factor=0.62

CPU fully loaded with Sandra CPU Multimedia test

power consumption: 93W
Sandra result: 3246 integer, 3971 floating point

When overclocked to 630MHz
Idle consumption= 86W

line voltage=122.6V
power factor=0.62

CPU loaded in same manner:
power consumption: 95W
Sandra result: 3402 integer, 4163 floating point
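Doing the arithmetic on the numbers above, a quick sketch (using only the measured values from this post) shows how a 5% clock bump compares to the changes in power draw and benchmark scores:

```python
# Quick arithmetic on the measurements above: how much did the 5%
# overclock (600 -> 630 MHz) change power draw and Sandra scores?

def pct_change(before, after):
    """Percent change from before to after."""
    return (after - before) / before * 100

clock      = pct_change(600, 630)    # +5.0% clock speed
idle_power = pct_change(84, 86)      # idle wall draw
load_power = pct_change(93, 95)      # loaded wall draw
sandra_int = pct_change(3246, 3402)  # Sandra integer score
sandra_fp  = pct_change(3971, 4163)  # Sandra floating-point score

print(f"clock:          +{clock:.1f}%")
print(f"idle power:     +{idle_power:.1f}%")
print(f"load power:     +{load_power:.1f}%")
print(f"Sandra integer: +{sandra_int:.1f}%")
print(f"Sandra FP:      +{sandra_fp:.1f}%")
```

Interesting that power at the wall only went up a little over 2% for a 5% overclock, while the benchmark scores scaled almost linearly with the clock (keep in mind the core voltage stayed at 2.05V, so the extra draw is from frequency alone).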

Section II:
Power supply capacity calculation. A generally accepted efficiency figure for a typical computer power supply unit is 70%. At 95W consumption at the plug, I'm getting about 7/10 of the input power at the output, which works out to 66.5W. Looks like this computer is only loading this particular 230W PSU to about 29%, leaving me with 163.5W of unused capacity. I can see how pre-built computers with 140W and 90W power supplies run fine. If this computer were built with a 140W PSU, you'd still have 73.5W of power available. 73.5W is enough for several HDDs, a CD-ROM drive, and a graphics accelerator.

My Athlon machine only uses 90W at idle. I don't see why a 180W power supply couldn't be used on it. People always say you need at least 300W and preferably 350W. I don't know what this hype is all about.