NVIDIA’s GeForce GT 220: 40nm and DX10.1 for the Low-End
by Ryan Smith on October 12, 2009 6:00 AM EST - Posted in GPUs
A Better HTPC Card: MPEG-4 ASP Decoding & LPCM Audio
Along with the addition of DirectX 10.1 functionality, the latest members of NVIDIA’s GPU lineup have picked up a couple of new tricks specifically geared towards HTPC use.
The first of these is a newer video decoding engine. Officially NVIDIA is moving away from the VP* moniker, but for the time being we’re sticking to it as we don’t have a better way of easily differentiating the feature sets of various video decoding engines. NVIDIA’s vendors are calling this VP4, and so are we.
Successive VPs have focused on adding support for additional video formats. VP2 had full H.264 decoding, and VP3 (which never made it into a GTX 200 series part) added VC-1 decoding. For VP4, NVIDIA has added support for full decoding of MPEG-4 Part 2 Simple and Advanced Simple Profile (ASP), better known as DivX or XviD. With this addition, NVIDIA can now offload the decoding of most of the MPEG formats – the only thing not supported is MPEG-1, which, as the oldest of the bunch, is trivial to decode on a CPU anyhow.
To be frank, we’re a bit puzzled by this latest addition. By no means are we unhappy (we’ll always take more acceleration!), but MPEG-4 ASP isn’t particularly hard to decode. Even an underclocked Nehalem with only a single core (and no HT) enabled can handle HD-resolution MPEG-4 ASP with ease, never mind what even a low-end dual-core Pentium or Celeron can do. This would be a good match for the Atom, but those almost always use integrated graphics (and Ion isn’t slated to get VP4 any time soon). So while this addition is nice to have, it’s not the kind of game changer that H.264 and VC-1 acceleration were.
The unfortunate news here is that while the hardware is ready, the software is not, which caught us off-guard since these parts have been going to OEMs since July. NVIDIA has yet to enable MPEG-4 ASP acceleration in their drivers, and won’t be doing so until the release 195 drivers. So at this point we can’t even tell you how well this feature works. We’re not pleased with this, but we’re also not particularly broken up about it since, as we just mentioned, the CPU cost of decoding these formats isn’t very high in the first place.
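For readers curious how this support surfaces to software, below is a minimal sketch (our own C++ illustration, not NVIDIA or Microsoft sample code) of how a Windows application can ask the DXVA2 decoder service which hardware decode profiles the installed driver advertises. Today the GT 220 lists its MPEG-2, VC-1, and H.264 profiles this way; assuming NVIDIA exposes the new engine through DXVA like its other codecs, an MPEG-4 Part 2 profile (newer Windows SDKs name these GUIDs DXVA2_ModeMPEG4pt2_*) should join the list once the release 195 drivers ship.

```cpp
// Minimal sketch, not NVIDIA's code: enumerate the DXVA2 hardware decoder
// profiles the installed driver advertises. Once MPEG-4 ASP offload is
// enabled, an MPEG-4 Part 2 profile (newer SDKs name these
// DXVA2_ModeMPEG4pt2_*) should show up next to the MPEG-2/VC-1/H.264 entries.
#include <windows.h>
#include <d3d9.h>
#include <dxva2api.h>
#include <cstdio>

#pragma comment(lib, "d3d9.lib")
#pragma comment(lib, "dxva2.lib")
#pragma comment(lib, "ole32.lib")

int main() {
    // A throwaway D3D9 device bound to the desktop window is enough to
    // reach the video decoder service.
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;
    pp.hDeviceWindow = GetDesktopWindow();

    IDirect3DDevice9 *dev = NULL;
    if (FAILED(d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                 pp.hDeviceWindow,
                                 D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                                 &pp, &dev))) {
        d3d->Release();
        return 1;
    }

    // The decoder service hands back one GUID per decode profile the driver
    // exposes (H.264, VC-1, MPEG-2, and so on).
    IDirectXVideoDecoderService *svc = NULL;
    if (SUCCEEDED(DXVA2CreateVideoService(dev,
                                          __uuidof(IDirectXVideoDecoderService),
                                          reinterpret_cast<void **>(&svc)))) {
        UINT count = 0;
        GUID *guids = NULL;
        if (SUCCEEDED(svc->GetDecoderDeviceGuids(&count, &guids))) {
            for (UINT i = 0; i < count; ++i) {
                const GUID &g = guids[i];
                // Print the leading fields of each profile GUID.
                printf("Profile %2u: %08lX-%04hX-%04hX-...\n",
                       i, g.Data1, g.Data2, g.Data3);
            }
            CoTaskMemFree(guids);  // the service allocates the array; we free it
        }
        svc->Release();
    }

    dev->Release();
    d3d->Release();
    return 0;
}
```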
On a final note with video decoding, one of NVIDIA’s marketing pushes with this launch is that they have been working with Adobe to bring video decode acceleration to Adobe Flash 10.1, and that the GT 220/G 210 series are well suited for this. This will be absolutely fantastic to have, since Flash video is a CPU hog, but Flash 10.1 is still 6 months (or more) away from being released. More to the point, as far as we know this is being implemented via DXVA, which means everyone else will get acceleration too. And notably, this is only for H.264, as VP6 (the older Flash video codec) is not supported in hardware on any card.
Moving on, the other new HTPC feature is that NVIDIA has finally stepped up their game with respect to HDMI audio on cards with discrete GPUs. Gone is the S/PDIF cable connecting the card to an audio codec, which means NVIDIA is no longer limited to 2-channel LPCM or 5.1-channel DD/DTS for audio. Now they are passing audio over the PCIe bus, which gives them the ability to support additional formats. 8-channel LPCM is in, as are the lossy formats DD+ and 6-channel AAC. However, Dolby TrueHD and DTS-HD Master Audio bitstreaming are not supported, so it’s not quite the perfect HTPC card. Lossless audio is possible through LPCM, but there won’t be any lossless audio bitstreaming.
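To put the LPCM route in perspective, here is a quick back-of-the-envelope sketch (our own math, assuming HDMI’s 8-channel/24-bit/192kHz LPCM ceiling as the worst case, not figures from NVIDIA). The raw data rate involved is tiny, and since a lossless track decoded on the PC is bit-identical to what a receiver would produce from a bitstream, sending it as multichannel LPCM gives up nothing in fidelity.

```cpp
// Back-of-the-envelope sketch, our own numbers rather than anything from
// NVIDIA: the payload data rate of a worst-case 8-channel, 24-bit, 192 kHz
// LPCM stream over HDMI.
#include <cstdio>

int main() {
    const double channels = 8.0;
    const double bits_per_sample = 24.0;
    const double sample_rate_hz = 192000.0;  // HDMI's LPCM ceiling

    const double mbps = channels * bits_per_sample * sample_rate_hz / 1.0e6;
    printf("8ch / 24-bit / 192 kHz LPCM: %.1f Mbps of audio payload\n", mbps);  // ~36.9
    return 0;
}
```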
Finally, we’re still waiting to see someone do a passively cooled design for the GT 220. The power usage is low enough that it should be possible with a dual-slot heatsink, but the only cards we’ve seen thus far are actively cooled single-slot solutions with the heatsink protruding a bit.
80 Comments
Guspaz - Tuesday, October 13, 2009 - link
Errm, Valve's latest hardware survey shows that only 2.39% of gamers are using 2+ GPUs with SLI or Crossfire. ATI has a 27.26% market share.

Of those who did buy multi-GPU solutions, some may be "hidden" (GTX295, the various X2 solutions), in which case it had no impact whatsoever (since it's presented as a single card). Some may have used it as an upgrade to an existing card, in which case SLI/Crossfire may not have driven their decision.
It's true that SLI (2.14%) has greatly outsold Crossfire (0.25%), but that's such a tiny market segment that it doesn't amount to much.
ATI has managed to hold on to a respectable market share. In fact, their 4800 series cards are more popular than every single nVidia series except for the 8800 series.
So, I think I've sufficiently proven that SLI wasn't a knockout blow... It was barely a tickle to the market at large.
Seramics - Tuesday, October 13, 2009 - link
When Sli came out? Stop mentioning ancient news. Right now, Sli n Xfire r abt equally sucks. Heard of Hydra? Thats the cool stuff dude. And yeah nvidia is very innovative indeed, renaming old products to look new to deceive customers, shave half the spec of a products n keep the same name (9600gso), releasing crappy products n selling it overprice.... MAN! Thats really innovative dun u think?

Souleet - Tuesday, October 13, 2009 - link
Are you ignorant or something, ATI fanboy. GT220 is a 40nm and 9600GSO is a 65nm. How can you say they just changed the name? I thought so...
Seramics - Monday, October 12, 2009 - link
Let's face it. Nvidia is NOT competitive at every front at every single price point. From ultra low end to mid range to ultra high end, tell me, which price point is nvidia being competitive?

Well, of cos I believe Fermi will be something different. I truly believe so. In fact, given that HD5870's slightly below par performance for its spec (very likely bcos memory bandwith limited), and Fermi being on a much larger die and higher transistor count, I EXPECT nVidia next gen Fermi to easily outperform HD5870. Just like how GTX285 outperform HD4890. But by how much? For almost 100 USD more for juz 5-10% improvements? I believe this will likely be the case with Fermi vs 5870. Surely its faster, but ur mayb paying 100% more to get 25% extra fps.
CONCLUSION: Even if Nvidia retake the top single GPU performance crown, they were never a winner in price to performance at ANY price point. They care more about profits than they care about you.
Souleet - Monday, October 12, 2009 - link
I agree with your conclusion. Definitely on price point ATI has always been on top of their game, but NVIDIA's innovations are what set the two apart. But who knows, maybe one day ATI/AMD comes out with a CPU/GPU solution that will change the technology industry. That would be cool.

formulav8 - Monday, October 12, 2009 - link
NVidia brought out the FX5800 Ultra??
TRIDIVDU - Tuesday, September 21, 2010 - link
My son plays GTA, FIFA, POP, Tomb Raider, NFS etc. in my P4, 3.06 GHz WinXP m/c with N 9400 GT (MSI) 1GB card without any problem in a 19inch LCD monitor. Now that I am planning to exchange the 4 year old m/c with a new i5 650, 3.2 GHz, Win7 m/c fitted with GT220 1 GB card, please tell me whether he will find the new machine a better one to play games with.

Thatguy97 - Tuesday, June 30, 2015 - link
nvidias mid range was shit back then