I was reading an article at the Inquirer here:

http://www.theinquirer.net/?article=16952

It talks about a few different articles on the power requirements and heat generated by the new Prescott core. The coolest part was the total system power draw (measured straight from the wall plug) for a P4 3.2C, a P4 3.2E, and an Athlon 64 3200+. He tested two setups, an average system and a top-of-the-line gaming machine, and gave the idle and full-load power draws, plus the Cool'n'Quiet draw for the Athlon 64 (of course):

Standard PC - No load/Full load/Cool'n'Quiet (watts)
P4 3.2 GHz Prescott - 114/192
P4 3.2 GHz Northwood - 76.5/144
Athlon 64 3200+ - 106/115/70.6

High-end PC - No load/Full load/Cool'n'Quiet (watts)
P4 3.2 GHz Prescott - 165/248
P4 3.2 GHz Northwood - 113/182
Athlon 64 3200+ - 158/168/120

Now that got me thinking, so I e-mailed the Inq (which I have done several times before), and this was my thought...

What about the cost to run these things? From what I have heard (and I could be wrong), electricity for businesses in the US costs about 5x what the home user pays, and power is even more expensive in most other countries than it is in the US. I have also heard that the power savings of an LCD will pay for its higher price within 1-2 years for a business.

Now look at the total system power for the "standard system": 114/192 W for the Prescott. Figure that a server sits idle about 50-75% of the time and you are looking at roughly 140 W on average. The older Northwood system at 76.5/144 W averages about 100 W. That is roughly a 40 W power premium PER server to use the new Prescott core, which means 40 Wh of extra energy for every hour it runs.

Imagine you were a business with 1,000 servers. That is 40,000 extra watts you are paying for around the clock. If my power bill is right, electricity costs a business about $0.67 per kWh. Take that rate (plus the cost of the AC to keep the server room cool), times 8,760 hours in a year, times the 40 W premium, times 1,000 servers, and you get roughly an extra $235,000 in one year to power those 1,000 servers, just by replacing the Northwood cores with Prescott cores. Maybe my concepts and calculations (and my electricity price) are way off, but it makes you think: $235,000 is a lot of money. For the average home user with one computer it would likely be less than $4 a month, and only if they ran it 24/7, but in, say, Tokyo, Japan, that could be as much as $20 a month extra.
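
Here is the quick Python script I used to sanity-check that server math. The two-thirds idle assumption and the $0.67/kWh rate are my own guesses, not anything from the Inquirer article, so take the exact dollar figure with a grain of salt:

# Back-of-the-envelope check of the Prescott vs. Northwood numbers above.
# The idle fraction and electricity rate are my assumptions, not measured data.

HOURS_PER_YEAR = 8760
RATE_USD_PER_KWH = 0.67    # my guess at a business rate; could be way off
SERVERS = 1000
IDLE_FRACTION = 2 / 3      # assume each server idles about two-thirds of the day

def average_watts(idle_w, load_w, idle_fraction=IDLE_FRACTION):
    """Time-weighted average draw for a box that idles most of the day."""
    return idle_fraction * idle_w + (1 - idle_fraction) * load_w

def annual_cost_usd(watts, servers=SERVERS):
    """Yearly electricity cost for the whole farm at the assumed rate."""
    kwh_per_year = watts * HOURS_PER_YEAR / 1000
    return kwh_per_year * RATE_USD_PER_KWH * servers

prescott  = average_watts(114, 192)    # ~140 W
northwood = average_watts(76.5, 144)   # ~99 W
premium_w = prescott - northwood       # ~41 W per server

print(f"Prescott average:  {prescott:.0f} W")
print(f"Northwood average: {northwood:.0f} W")
print(f"Extra cost, 1000 servers: ${annual_cost_usd(premium_w):,.0f}/year")
# Prints roughly $240,000 -- same ballpark as the ~$235,000 from the rounded 40 W figure.

If you move the idle assumption anywhere in the 50-75% range, the premium only shifts between about 40 W and 43 W, so the exact idle figure does not really change the conclusion.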

Now throw in AMD's Cool'n'Quiet. Those numbers are just impressive: 106/115/70.6 W. Let's do a similar server scenario (full load, say, a third of the day) and you are looking at about 85 W average. That is roughly 15 W LESS than the Northwood. The 1,000-server scenario then works out to a savings of about $88,000 in one year over the Northwood, or about $320,000 over the Prescott. That is about $320 per server per year! Now think of some of the hosting companies like Rackspace; they could save millions of dollars a year. And how much do you think it would cost to cool a room with 1,000 Prescott servers dumping about 140 watts each? This is not even based on the top dogs, either; it is based on average machines with these cores.
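
Here is the same kind of sanity check for the Cool'n'Quiet case; the full-load-a-third-of-the-day split is again my own assumption, not something from the article:

# Same back-of-the-envelope check for the Athlon 64 with Cool'n'Quiet.
# Assumes full load (115 W) a third of the day and the 70.6 W Cool'n'Quiet
# draw the rest of the time -- my guess at a light server workload.

HOURS_PER_YEAR = 8760
RATE_USD_PER_KWH = 0.67          # same assumed business rate as before
SERVERS = 1000
LOAD_FRACTION = 1 / 3

athlon64  = (1 - LOAD_FRACTION) * 70.6 + LOAD_FRACTION * 115   # ~85 W
northwood = 100                  # rounded average from the earlier estimate
prescott  = 140                  # rounded average from the earlier estimate

def annual_savings_usd(delta_watts):
    """Dollars saved per year across the whole farm for a given power gap."""
    return delta_watts * HOURS_PER_YEAR / 1000 * RATE_USD_PER_KWH * SERVERS

print(f"Athlon 64 average draw: {athlon64:.0f} W")
print(f"Savings vs Northwood: ${annual_savings_usd(northwood - athlon64):,.0f}/year")
print(f"Savings vs Prescott:  ${annual_savings_usd(prescott - athlon64):,.0f}/year")
# Roughly $86,000/year vs Northwood (a shade under my rounded 15 W figure)
# and about $320,000/year vs Prescott at these assumptions.

Even if you halve my $0.67/kWh rate, the gap against the Prescott is still six figures a year at this scale.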

So what is your opinion on this? Do my numbers and theories sound right? Or am I messed in the head? I should hear back from him soon.