It should come as little surprise that since the G210 is the cheapest card we have tested, it’s also the slowest. Rather than rattle off individual benchmarks, we’re just going to show one or two charts for each game in our test suite, including a chart comparing it to next-tier cards such as the GT 220.
| CPU | Intel Core i7-920 @ 3.33GHz |
| Motherboard | Intel DX58SO (Intel X58) |
| Chipset Drivers | Intel 9.1.1.1015 (Intel) |
| Hard Disk | Intel X25-M SSD (80GB) |
| Memory | Patriot Viper DDR3-1333 3 x 2GB (7-7-7-20) |
| Video Cards | AMD Radeon HD 5970, Radeon HD 5870, Radeon HD 5850, Radeon HD 5770, Radeon HD 5750, Radeon HD 5670 512MB, Radeon HD 5450 512MB, Radeon HD 4890, Radeon HD 4870 1GB, Radeon HD 4850, Radeon HD 4770, Radeon HD 4670 512MB, Radeon HD 4550 512MB, Radeon HD 3870; NVIDIA GeForce GTX 295, GeForce GTX 285, GeForce GTX 275, GeForce GTX 260 Core 216, GeForce GTS 250, GeForce 9600 GT, GeForce 8800 GT, GeForce GT 240, GeForce GT 220, GeForce 210 |
| Video Drivers | NVIDIA ForceWare 190.62, ForceWare 195.62; AMD Catalyst 9.9, Catalyst Beta 8.66, Catalyst Beta 8.66.6, Catalyst Beta 8.69 |
| OS | Windows 7 Ultimate 64-bit |
Generally speaking, the G210 is playable at 1024x768 at low settings, but little more than that. The exceptions are our less demanding games: Resident Evil 5, Batman, and Left 4 Dead. Resident Evil 5 can sustain 30fps at medium quality settings, while Batman and Left 4 Dead can sustain that framerate at high quality settings.
On paper the G210 should be slightly more power hungry and slightly warmer than the Radeon HD 5450 we tested last week. In practice it does better than that here, although we’ll note that some of this likely comes down to the throttling NVIDIA’s drivers engage when they detect FurMark and OCCT. At any rate the G210 has the lowest power consumption of any card we have tested, while its load temperatures are second only to the 5450’s. Idle power usage and idle temperature come in at 121W and 51C respectively, which is in line with other passively cooled cards of this nature.
hwhacker - Tuesday, February 16, 2010 - link
Hmm, maybe he knows something we don't? Last I heard circulating, AMD was going to get (sample?) product from both TSMC and GF on 32nm, but that got all borked when TSMC cancelled 32nm. As such, now they will transition to each company's respective 28nm process instead. This is said to have messed up Northern Islands' release, but may result in a better (not just smaller/faster) product. Who knows if that's true. All things being equal, I'm sure AMD would like to use 28nm bulk at GF.
As for nVIDIA, it's been interesting to watch. First they said absolutely not to GF, then 40nm at TSMC happened. After that Jensen was said to be in talks with GF, publicly said some good things about GF over TSMC (likely because they're angry about 40nm RE: Fermi and used it for intimidation), and that's all we know. All things being equal, I'm sure nVIDIA would like to use 28nm bulk at TSMC.
Natfly - Tuesday, February 16, 2010 - link
You're right, both companies canned their 32nm bulk processes. So either the author is insinuating that nVidia is going to switch to 32nm SOI or he means 28nm.

Ryan Smith - Tuesday, February 16, 2010 - link

He means 28nm.

Natfly - Tuesday, February 16, 2010 - link
My apologies, I would have referred to you by name if I wasn't too lazy to go back to the article from the comments page to check :P