NVIDIA's GeForce 8800 (G80): GPUs Re-architected for DirectX 10
by Anand Lal Shimpi & Derek Wilson on November 8, 2006 6:01 PM EST - Posted in GPUs
General Purpose Processing
With all the talk about how general purpose G80 is, can we expect it to replace our shiny new quad core desktop processor? Not at this point: most general purpose code is riddled with serial dependencies and exhibits little parallelism, so NVIDIA can't simply drop G80 into a motherboard and run Windows on it.
But there are general purpose tasks that lend themselves well to the parallelism of G80, and NVIDIA is enabling developers to take advantage of this via a technology they call CUDA (Compute Unified Device Architecture).
The major thing to take away from this is that NVIDIA will have a C compiler that can generate code targeted at its architecture. We aren't talking about OpenGL code contorted into doing math on graphics hardware; this will be C code written the way a developer would normally write C.
A programmer will be able to treat G80 like a hugely parallel data processing engine. Applications that require massively parallel compute power will see huge speedups when running on G80 as compared to the CPU. This includes financial analysis, matrix manipulation, physics processing, and all manner of scientific computations.
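To give a sense of what this looks like in practice, here is a minimal, hypothetical sketch of a CUDA C program: a SAXPY routine in which each hardware thread computes one array element. The kernel name, data sizes, and launch configuration are our own illustrative choices, not anything NVIDIA has published for G80.

#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

// Hypothetical SAXPY kernel: each thread scales and accumulates one element,
// the kind of data-parallel math that maps naturally onto G80's SPs.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one element per thread
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main(void)
{
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);

    // Fill the input arrays on the host (CPU) side.
    float *x = (float *)malloc(bytes);
    float *y = (float *)malloc(bytes);
    for (int i = 0; i < n; i++) { x[i] = 1.0f; y[i] = 2.0f; }

    // Copy the inputs into the card's onboard memory.
    float *dx, *dy;
    cudaMalloc((void **)&dx, bytes);
    cudaMalloc((void **)&dy, bytes);
    cudaMemcpy(dx, x, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, y, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

    // Copy the result back and spot-check it.
    cudaMemcpy(y, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", y[0]);  // expect 4.0

    cudaFree(dx); cudaFree(dy);
    free(x); free(y);
    return 0;
}

The only additions to ordinary C are the __global__ qualifier and the <<<blocks, threads>>> launch syntax; the same kernel body is executed across thousands of lightweight threads, which is what keeps the SPs fed.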
NVIDIA has written a totally separate driver for G80 that will be used to run compiled C code targeted at the chip. The reason they've done this is that the usage model for GPGPU programming is very different from that of graphics. Both the graphics driver and the CUDA driver can run on G80 at the same time, which may allow programmers to take advantage of CUDA for in-game physics on a single card. Under the CUDA driver, the conceptual layout of the GPU looks less like a graphics pipeline and more like a grid of thread processors, with each cluster of SPs paired with its own small on-chip memory.
This design, along with stream output capabilities, allows programmers to treat the GPU like a general purpose data processing engine. The 16 SPs within each cluster can share data with one another and perform multiple passes over that data without having to write out to and read back from onboard graphics memory, and developers are given the ability to manage these on-chip caches themselves.
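As a rough illustration of that developer-managed on-chip memory, below is a hypothetical CUDA sketch of a per-block sum reduction; the kernel name and the 256-thread block size are our own assumptions for the example. The __shared__ buffer stands in for the per-cluster cache: all of the intermediate passes happen on chip, and onboard graphics memory is touched only once on the way in and once on the way out.

#include <cuda_runtime.h>
#include <stdio.h>
#include <stdlib.h>

// Hypothetical per-block sum reduction. The __shared__ buffer is the
// developer-managed on-chip memory: every pass of the reduction happens
// there, so global (onboard) memory is read once and written once.
__global__ void block_sum(const float *in, float *out)
{
    __shared__ float buf[256];          // on-chip, visible to the whole block

    int tid = threadIdx.x;
    buf[tid] = in[blockIdx.x * blockDim.x + tid];  // one global read per thread
    __syncthreads();

    // Tree reduction: log2(256) = 8 passes, all within shared memory.
    for (int stride = blockDim.x / 2; stride > 0; stride >>= 1) {
        if (tid < stride)
            buf[tid] += buf[tid + stride];
        __syncthreads();
    }

    if (tid == 0)
        out[blockIdx.x] = buf[0];       // one global write per block
}

int main(void)
{
    const int blocks = 64, n = blocks * 256;
    float *h = (float *)malloc(n * sizeof(float));
    for (int i = 0; i < n; i++) h[i] = 1.0f;

    float *din, *dout;
    cudaMalloc((void **)&din, n * sizeof(float));
    cudaMalloc((void **)&dout, blocks * sizeof(float));
    cudaMemcpy(din, h, n * sizeof(float), cudaMemcpyHostToDevice);

    block_sum<<<blocks, 256>>>(din, dout);  // 256 threads/block to match buf[256]

    float sums[64];
    cudaMemcpy(sums, dout, blocks * sizeof(float), cudaMemcpyDeviceToHost);
    printf("block 0 sum = %f\n", sums[0]);  // expect 256.0

    cudaFree(din); cudaFree(dout); free(h);
    return 0;
}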
Will NVIDIA make an x86 CPU? Most likely not, but we may see NVIDIA produce even more general purpose processors for the handheld, CE, and integrated markets. NVIDIA may end up becoming a producer of system-on-a-chip solutions that leverage its graphics technology, simply expanding G80 to be more general purpose (and, obviously, dropping some of the SPs to lower costs).
Comments
yyrkoon - Thursday, November 9, 2006 - link
If you're using Firefox, get and install the extension "flashblock". I just did this myself today, tired of all the *animated* ads bothering me while reading articles. Sorry AT guys, but we've had this discussion before, and it's really annoying.
JarredWalton - Thursday, November 9, 2006 - link
Do you want us to be able to continue as a site? Because ads support us. Anyway, his problem is related to not seeing images, so your comment about blocking ads via flashblock is completely off topic.
yyrkoon - Thursday, November 9, 2006 - link
Of course I want you guys to continue on as a site, I just wish it were possible without annoying flashing ads in a section where I'm trying to concentrate on the article. As for the off topic part, yeah, my bad, I mis-read the full post (bad habit). Feel free to edit or remove that post of mine :)
archcommus - Thursday, November 9, 2006 - link
What browser are you using?
falc0ne - Thursday, November 9, 2006 - link
Firefox 2.0
JarredWalton - Thursday, November 9, 2006 - link
If Firefox, I know there's an option to block images not on the originating website. In this case, images come from image.anandtech.com while the article is on www.anandtech.com, so that may be the cause of your problems. IE7 and other browsers might have something similar, though I haven't ever looked. Other than that, perhaps some firewall or ad blocking software is to blame - it might be getting false positives?
archcommus - Thursday, November 9, 2006 - link
Wow to Anandtech - another amazing, incredibly in-depth article. It is so obvious this site is run by dedicated professionals who have degrees in these fields, versus most other review sites where the authors just take pictures of the product and run some benches. Articles like this keep the AT reader base very, very strong.
Also wow to the G80, obviously an amazing card. My question: is 450W the PSU requirement for the GTX only, or for both the GTX and GTS? I ask because I currently have a 400W PSU and am wondering if it will be sufficient for next-gen DX10 class hardware, and I know I would not be buying the highest model card. I also only have one HDD and one optical drive in my system.
Yet another wow goes out to the R&D monetary investment - $475 million! It's amazing that that amount is even acceptable to nVidia; I can't believe the sales of such a high end, enthusiast-targeted card are great enough to warrant it.
JarredWalton - Thursday, November 9, 2006 - link
Sales of the lower end parts which will be based off G80 are what make it worthwhile, I would guess. As for the PSU, I think that 450W is for the GTX, and more is probably a safe bet (550W would be in line with a high-end system these days, although 400W ought to suffice if it's a good quality 400W). You can see that the GTX tops out at just under 300W average system power draw with an X6800, so if you use an E6600 and don't overclock, a decent 400W ought to work. The GTS tops out around 260W average with the X6800, so theoretically even a decent 350W will work fine. Just remember to upgrade the PSU if you ever add other components.
photoguy99 - Thursday, November 9, 2006 - link
I just wanted to second that thought - AT articles have incredible quality and depth at this point - you guys are doing great work.
It's actually getting embarrassing for some of your competing sites; I browsed the Tom's article and it had so much fluff and retread I had to stop.
Please don't forget the effort is noticed and appreciated.
shabby - Wednesday, November 8, 2006 - link
It wasn't mentioned in the review, but what's the purpose of the 2nd SLI connector?