Gaming Benchmarks - Battlefield 2
For our gaming benchmarks, we ran the systems in two different modes: a lower quality mode that allowed the IGPs to get relatively playable frame rates, and a higher quality mode that was a more realistic setting for the discrete graphics cards. Most of the discrete graphics cards will be system/CPU limited in the low quality mode, but that information can be interesting as well. Obviously, the IGP solutions really aren't intended for serious gamers, but we wanted to see if they could still provide a playable gaming experience with lower detail settings.
The above screenshots show the settings that we used in Battlefield 2. Our low quality BF2 configuration uses the "Low" preset, with texture quality also downgraded to low. Anisotropic filtering and antialiasing are disabled, and we tested at 640x480, 800x600, 1024x768, and 1280x1024. (We don't consider 640x480 playable in BF2, as the text is unreadable, but it was included for reference.) The higher quality mode uses the "High" preset, with textures also set to high. Antialiasing is disabled, and anisotropic filtering is set to 8x. The 7800 GTX generally wouldn't have a problem with 4xAA on the tested resolutions, but realistically, we don't expect users to put a 7800 GTX in this type of system.
Starting with the low quality performance modes, even a stock 6600 is almost 3X as fast as the best IGP solutions. It's a bit surprising to see that the ATI chipset appears to be system limited to 25 FPS, while the 6150 scores significantly higher at lower resolutions. We would venture that the limitation is caused by the HP BIOS, as we have not seen this sort of behavior in previous testing on the ATI Xpress 200 chipset. It could also be something specific to Battlefield 2, as none of the other games show this behavior. At 1280x1024, the two IGPs are basically tied, though neither is really playable at 20 FPS or less. Surprisingly, I actually played through a map with the 800x600 setting on the ATI IGP, and I found that the game worked okay. Battlefield 2 doesn't really seem to require much more than about 25 FPS for casual gameplay, though higher performance is obviously preferred.
While the integrated graphics solutions struggle to maintain playable frame rates even at the lowest resolutions, sticking with low detail levels and stepping up to even a 6600 provides very good results. In fact, the 6600 is capable of running very well in the medium quality modes while still maintaining reasonable frame rates. With the more powerful graphics cards, we can see that the faster memory and lower latencies of our custom system end up winning out - even with the slower processor. The 3800+ Venice core tops out at 171 FPS, while the HP system is limited to 157 FPS.
Moving on to the higher-quality mode, the IGP solutions are way too slow. Both systems were providing single digit frame rates, so we halted benchmarking rather than waiting for each test to complete. With the other graphics cards, the custom system ends up leading by as much as 9%, but you really wouldn't notice the difference in practical use. It appears that the improved memory controller and cache subsystem on the Venice core make up for its smaller cache relative to the ClawHammer core. At that point, the lower latency RAM takes the lead.
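As a quick sanity check on that 9% figure, the lead can be computed directly from the CPU-limited ceilings reported above (the helper function name here is our own, purely for illustration):

```python
def percent_lead(fast_fps, slow_fps):
    """Relative lead of the faster system over the slower one, in percent."""
    return (fast_fps - slow_fps) / slow_fps * 100

# CPU-limited ceilings from the low quality tests:
# custom 3800+ Venice system (171 FPS) vs. the HP system (157 FPS)
print(f"{percent_lead(171, 157):.1f}%")  # ~8.9%, i.e. "as much as 9%"
```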
gibhunter - Wednesday, December 14, 2005 - link
That's not an X2 3800+. It's the standard 2.4GHz 3800+ single core. Regarding these HPs, I have a few of these at work. They are really great. For $500 and change you get an Athlon 64, 512MB of RAM, and WinXP Pro. Try to put together a system like that yourself and you'll spend just as much or more, and that's not counting the snazzy keyboard and mouse that come with the system. It really is a good deal.
JarredWalton - Thursday, December 15, 2005 - link
Just to reiterate, the linked 3800+ is indeed an X2: "We actually have an X2 3800+ Smart Buy, sku # pz635ua#aba....it might be listed incorrectly as a 3800+, but it's an X2. I'm in the process of getting that fixed."
That's from an HP representative, one of the marketing managers of the small-business division.
Googer - Thursday, December 15, 2005 - link
Equally impressive for the $500-ish range is this e-machine: http://e4me.com/products/products.html?prod=eMachi...
LoneWolf15 - Thursday, December 15, 2005 - link
If you're a business, e-Machines isn't equally impressive. Part of what you are paying for is the support. The system reviewed carries a three-year warranty (par for the course on business systems) and probably carries business-level support too. Most HP systems also use a fair number of brand-name parts (i.e., ASUS mainboards in most systems). I don't deny that eMachines has its place, but it comes nowhere near something that HP puts out. P.S. While I like most of HP's system configurations, even home ones, I haven't heard good things about home-level support. And one other thing, Jarred... why does the article say this system has a ClawHammer core CPU? I thought ClawHammers went the way of the dinosaur on Socket 939 long ago. Anything this new ought to have a Venice or San Diego core chip in it.
JarredWalton - Thursday, December 15, 2005 - link
Well, it does have a ClawHammer -- at least the system I have does. You have to remember that AMD only has one fab producing 90 nm parts, and they have an old fab that still produces 130 nm parts. Perhaps AMD gives them a better deal on the older chips? Or perhaps it's just that this model was made a little while ago? If it had used a San Diego core, I expect power draw would have dropped another 20 W at least.
mino - Saturday, December 17, 2005 - link
You are wrong on this. AMD publicly stated sometime in Q2 that they had converted all of their lines to 90nm production. Also, AMD does have only one fab - FAB30 - currently producing CPUs. While there is FAB25, it produces flash and is part of the Spansion division, and there is also FAB35 (or 36?) in the qualification process; the only fab producing AMD CPUs in volume is currently FAB30, on 200mm wafers.
JarredWalton - Friday, December 23, 2005 - link
Hmm... obviously I'm not paying close enough attention to AMD's fabs. I could have sworn they still had a 130nm fab making CPUs. I would have thought 130nm would be sufficient for a lot of stuff - better to keep what you have running instead of retrofitting old fabs, right? Then again, new fabs are getting more and more expensive.
mino - Sunday, January 1, 2006 - link
Well, Austin FAB25 was neither suitable nor meant for processes smaller than 180nm (for logic products). AMD thus made a cash cow out of it during the hard AthlonXP times. Also, the capacity of any fab is measured in wafers per unit time, not chips per unit time. In other words, AMD could make twice as many K8 CPUs on 90nm as on 130nm. Couple that with the huge capacity constraints AMD faced in 2005 and the fact that they had only one 200mm fab, and it becomes clear why not to produce on 130nm. Around this time FAB35 should come online, so the tight supply of the last quarter should not repeat for some time. Also, AMD's 90nm SOI process is pretty good, so don't expect a FAB30 phase-out anytime soon (90nm is the last logic process for FAB30). Shame FAB35 wasn't online in 2005; Intel would've had a way hotter year than it did.
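The "twice as many" figure in this comment follows from simple die-area scaling: to a first approximation, a linear shrink from 130nm to 90nm reduces die area by the square of the feature-size ratio, so the same wafer yields about twice as many dice. A rough sketch of that arithmetic (ignoring yield, edge effects, and real-world design differences; the function is illustrative, not an AMD figure):

```python
def shrink_factor(old_nm, new_nm):
    """Approximate increase in dice per wafer from a linear process shrink,
    assuming die area scales with the square of the feature size."""
    return (old_nm / new_nm) ** 2

print(f"{shrink_factor(130, 90):.2f}x")  # ~2.09x, roughly twice the chips per wafer
```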