IGP Power Consumption - 780G, GF8200, and G35
by Gary Key on April 18, 2008 2:00 AM EST- Posted in
- CPUs
Flight Simulator X Test
Whether you've been playing Solitaire since the Windows 3.0 days or flying around your favorite locale in the longest-running series on the PC, casual gaming is something just about everyone with a PC has done at one time or another. Our test today consists of a flight around the Honolulu harbor area and is something all three chipsets can handle with the appropriate settings and resolutions. The actual winner is the 780G chipset, but what we are concerned about is how many dollars your electric company will collect while you relax and play a game.
We listed minimum and maximum power draw during our six-minute trip around the harbor, but the charts show the average. The current version of Flight Simulator X with the Acceleration Pack is fully multi-core aware and provides a workout for both the CPU and GPU as you crank up the settings. Our AMD LE-1600 is a single-core CPU, so its power variations are not as great as those of the multi-core CPUs. This processor stayed near 100% utilization during FSX testing, although frame rates did not seem to suffer much at our 1024x768 medium settings.
The results are closer this time around between our two AMD platform chipsets. The GeForce 8200 once again finishes slightly in the lead, but our upcoming gaming tests will show it following in the footsteps of the AMD 780G. The two AMD chipsets essentially tie, but both hold a 25W advantage over G35 with the low-end processors, 2W and 13W improvements with the dual cores, and a soft landing of 2W and 8W differences with the quad cores.
First Thoughts
We told you it would be a short article. However, we think today's results provide a better indication of each platform's power requirements with the variety of CPUs likely to end up in these boards and applications that tend to stress those same processors. Unless the next NVIDIA driver set increases power consumption, we have a surprise winner among the latest IGP chipsets from the big three. The next question we will answer is performance per watt, and those results are likely to lead us down another path.
The power bill savings alone are not that large, of course. Running 24/7, 365 days per year, a 10W power difference works out to 87.6 kWh of energy over the course of a year. At a rate of 10 cents per kWh, that would work out to less than a dollar per month. In the bigger picture, however, the 5W to 25W difference between chipsets or platforms can be far more meaningful. If you want a silent - or at least very quiet - PC, every watt saved counts. Cooler running chipsets can also avoid the need for "noisy cricket" fans or monstrous cooling configurations. All other things being equal, we would definitely prefer less power-hungry components.
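The cost arithmetic above can be sketched as a small calculation. This is a minimal illustration of the math, not from the article; the function name and structure are ours:

```python
HOURS_PER_YEAR = 24 * 365  # 8760 hours for 24/7 operation

def energy_cost(watts, rate_per_kwh, hours=HOURS_PER_YEAR):
    """Electricity cost of a constant power draw over the given hours."""
    kwh = watts * hours / 1000  # convert watt-hours to kilowatt-hours
    return kwh * rate_per_kwh

# The article's example: a 10W difference at $0.10/kWh, running 24/7.
yearly = energy_cost(10, 0.10)
print(f"{10 * HOURS_PER_YEAR / 1000:.1f} kWh/year -> "
      f"${yearly:.2f}/year (${yearly / 12:.2f}/month)")
```

The same function reproduces the numbers in the comments below: a 100W PC running 24/7 costs $87.60 per year at $0.10/kWh and three times that at $0.30/kWh.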
44 Comments
Darth Farter - Saturday, April 19, 2008 - link
awesome, over here where it's US$ 30 cents/kWh you can understand that it will start to make a difference. The only thing I would like to see is undervolting, though that, like overclocking, depends on your mileage. I'm running a G1 Brisbane at 2GHz with 0.975 vcore on a 690G as a 24/7 download/internet box. I wonder what it costs me per month.
JarredWalton - Saturday, April 19, 2008 - link
Given our earlier calculations at $0.10/kWh, tripling the cost of energy means you're looking at savings of up to $30 per year for 24/7 use and a difference of 10W. If you're running a 100W PC 24/7 for a whole year, that PC would cost $262.80 at $0.30/kWh or $87.60 at $0.10/kWh.
royalcrown - Saturday, April 19, 2008 - link
Also, what is going on with the fried MOSFETs? We never did get that weekend update ;)
royalcrown - Saturday, April 19, 2008 - link
Why don't you have ANAND buy you guys some meters and, on EVERY GFX card or PROCESSOR review, list the actual wattage used by the systems? This NEEDING of at LEAST a 550 watt PSU is BS for those of us who will never use dual cards. I just calculated that my new system at FULL load should draw about 280 watts with an 8800GT, so a 400 watt supply with 450 peak is fine for me. I read that NVIDIA claims 125 watts on their page and the real draw is a lot less when they use the meter.
I for one am sick of these companies pushing monster PSUs when they AREN'T needed in every case, and sites like AnandTech should give us the scoop instead of plastering ads for 1200 watt PSUs and not telling readers that we may not even need 550.
Zaranthos - Saturday, April 19, 2008 - link
That's a fact. I'm so sick of seeing insanely large power supplies shoved down people's throats. I keep upgrading my computer and my 300W power supply keeps running it just fine. You'd think that wasn't even possible going by most of the reviews/ads/propaganda. I'd like to see tests showing what the minimum power supply requirements are.
JarredWalton - Saturday, April 19, 2008 - link
You mean like our PSU reviews where we repeatedly state that the only way you can even come near the point where a 1000W PSU is required is if you're heavily (i.e. water- or phase-cooling) overclocking your quad-core CPU and running 3-way or 4-way GPUs?
Most PSUs are at maximum efficiency around the 50% load mark, but even at 30% load the good PSUs are above 83% efficiency. Couple that with the fact that a 600W PSU is generally quieter delivering 150W than a 300W PSU delivering the same wattage, and there are reasons to buy higher-spec PSUs. The biggest reason to buy a higher-spec PSU, of course, is that it's very difficult to find good quality PSUs rated under 400W. (Seasonic and the Seasonic-built PSUs are about the only option.)
All that is totally overlooking the fact that *testing* with a highly-rated 520W PSU is not the same as saying the PSU is required. What's important is consistency, and here we are using the same PSU for all tests. It should have an 80-85% efficiency across the tested power requirements, which is well within the margin of error. If we drop to a 300W Seasonic, power draw might change slightly, but proportionately the results should be nearly identical to what we see in this article.
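Jarred's point that PSU efficiency differences land within the margin of error can be illustrated with a quick sketch. The efficiency figures are the ones he quotes above; the helper function itself is hypothetical:

```python
def wall_draw(dc_load_watts, efficiency):
    """AC power pulled from the wall for a given DC load and PSU efficiency."""
    return dc_load_watts / efficiency

# A 150W DC load through PSUs at the efficiencies quoted above:
for eff in (0.83, 0.85):
    print(f"{eff:.0%} efficient PSU: {wall_draw(150, eff):.1f}W at the wall")
```

At a 150W load, the gap between an 83% and an 85% efficient unit is only about 4W at the wall, which is why swapping the test PSU would shift the absolute numbers slightly but leave the chipset comparison intact.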
Perhaps Gary can chime in here with some comments; I know that he sent me an initial configuration table for this article on Thursday and then changed the PSU and case later that night. The original PSU was a Seasonic unit, so perhaps he ran into some difficulties. Again, not that it really makes a difference.
Wirmish - Saturday, April 19, 2008 - link
Flight Simulator X Test: nVidia vs AMD -> 0W to 3W, or ~2%.
OK... nVidia wins by 2%.
And "watt" about the FPS during these benchmarks?
Did the nVidia 8200 have -2% FPS vs the AMD 780G?
And if the 780G is faster, can you underclock it, or overclock the 8200?
Try it... just to compare the consumption at the same performance level.
Esben - Saturday, April 19, 2008 - link
Thanks for shedding light on the current IGP situation. It's great to see NVIDIA is still competitive in the IGP business, consumption-wise. Now we eagerly await the performance numbers.
Please keep writing about IGPs and power consumption. I'd find it very interesting if you wrote an article about maximizing performance per watt, and how far you can push IGP performance.
An IGP system fits most people's needs, so the interest is definitely there.
jacito - Friday, April 18, 2008 - link
The article is very well written, and this is going to sound rather stupid, but what does IGP stand for?
JarredWalton - Friday, April 18, 2008 - link
IGP = Integrated Graphics Processor