NVIDIA's Back with NV35 - GeForceFX 5900 Ultra
by Anand Lal Shimpi on May 12, 2003 8:53 AM EST, posted in GPUs
The Problem with Understanding Graphics
When NVIDIA introduced the GeForceFX, quite a ruckus ensued over the possibility that the NV30 did not have 8 distinct pixel rendering pipelines, but in fact only 4.
ATI quickly capitalized on the revelation, changing all of their marketing documents to point out that their R3x0 GPUs had twice as many pipelines as NVIDIA's flagship. The community cried foul and NVIDIA was chastised, but the irony of it all was that most of the stones being thrown were based on poor information.
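To see why the pipeline count caused such a stir, consider the back-of-the-envelope math reviewers lean on: peak pixel fill rate as pipelines multiplied by core clock. The sketch below is purely illustrative; the clocks are the shipping reference clocks for each card, and the one-pixel-per-pipe-per-clock assumption ignores texture units per pipe, memory bandwidth, and shader workload. Even so, it shows how a 4-pipeline NV30 at 500 MHz can trail an 8-pipeline R300 running at a considerably lower clock.

```python
# Illustrative sketch only: theoretical single-textured pixel fill rate
# modeled as pipelines x core clock. Real throughput also depends on
# texture units per pipe, memory bandwidth, and the workload itself,
# none of which this models.

def peak_fill_rate_mpixels(pixel_pipelines: int, core_clock_mhz: int) -> int:
    """Peak pixels per second (in millions): one pixel per pipeline per clock."""
    return pixel_pipelines * core_clock_mhz

gpus = {
    "NV30 (GeForceFX 5800 Ultra, 500 MHz)": (4, 500),
    "R300 (Radeon 9700 Pro, 325 MHz)": (8, 325),
}

for name, (pipes, clock) in gpus.items():
    print(f"{name}: {peak_fill_rate_mpixels(pipes, clock)} Mpixels/s")

# Output:
# NV30 (GeForceFX 5800 Ultra, 500 MHz): 2000 Mpixels/s
# R300 (Radeon 9700 Pro, 325 MHz): 2600 Mpixels/s
```

Of course, this simple model is exactly the kind of "counting pixels" analysis that the rest of this article tries to move beyond.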
In fact, the quality of information coming from both ATI and NVIDIA has deteriorated significantly in recent history. Whereas companies like AMD and Intel are very forthcoming with the details of their microprocessor architectures, ATI and NVIDIA are very cryptic when discussing their GPUs. The matter is further complicated by the introduction of marketing terms like "vertex engines," and by the habit of calling some parts of the GPU a "pipeline" while denying the label to others, when both are in fact pipelines.
Now that GPUs are becoming much more like CPUs, it is important that we understand the details of their architecture just as well. You will find discussions in our forums revolving around the Pentium 4's 20-stage pipeline, but the closest parallel in graphics discussions amounts to counting pixels.
We can understand why both ATI and NVIDIA are much less forthcoming with information than their counterparts in the CPU industry; remember that a new microarchitecture is introduced roughly every five years in the CPU world, whereas the same occurs in the GPU world every 6 to 12 months. ATI and NVIDIA have to be very protective of their intellectual property, as revealing too much could result in one of their innovations showing up in a competitor's product 6 months down the road.
With that said, in this article we decided to dive a little deeper into the GPU and begin drawing some parallels to what we know from our experience with CPUs. If you're not interested in learning how these GPUs work, feel free to skip right ahead; otherwise, grab some popcorn.