Whether we've got a low-end or high-end system, we all expect the realtime 3D revolution to continue until we achieve near parity with reality. The push forward is backed by many factors, including raw hardware performance and brilliant advances in techniques for better approximating what we see. But there's another side to the equation beyond hardware and developers: the graphics API.
Unlike CPUs, graphics processors (GPUs) do not have a common instruction set upon which tools and software can be built. To get the power of the hardware out to the public, we need a common interface that works no matter what GPU is underneath. It's left to the graphics hardware designer to take the code generated through this application programming interface (API) and translate it into something their chip can use. Because it's the developer's single point of contact, the graphics API is incredibly important: it defines how much flexibility programmers have in using the hardware and shapes the world of high performance realtime 3D graphics.
Some of the key work done through the graphics API is taking descriptions of 3D objects in a 3D world, sending those objects and other resources to the hardware, and then telling the hardware what to do with them. There is a step-by-step process that needs to be followed, which we generally call a pipeline. Graphics API pipelines have stages where different work is done. Here's the general structure of a 3D graphics pipeline:
First, vertex data (information about the positions of the corners of shapes) is taken in and processed. Those shapes can then be further manipulated and re-processed if needed. After this, the 3D shapes are broken down by projecting them into 2D fragments called pixels (this step is called rasterization), and each of these pixels is then processed by looking up texture information, applying lighting techniques, and so on. When the pixels are finished processing, they are output and displayed on the screen. And that's the mile-high overview of how 3D graphics work.
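To make that flow a little more concrete, here is a heavily simplified software sketch of those three stages in C++. Real hardware and real APIs do far more work at every step, and all of the names below are illustrative rather than part of any actual graphics API.

```cpp
// Minimal software sketch of the pipeline described above:
// vertex processing -> rasterization -> per-pixel shading.
#include <array>
#include <cstdio>
#include <vector>

struct Vec3  { float x, y, z; };
struct Pixel { int x, y; float r, g, b; };

const int kWidth  = 16;
const int kHeight = 16;

// Stage 1: vertex processing -- map normalized device coordinates [-1,1]
// to screen space. A real vertex shader would also apply world, view and
// projection transforms here.
Vec3 processVertex(const Vec3& v) {
    return { (v.x * 0.5f + 0.5f) * kWidth,
             (v.y * 0.5f + 0.5f) * kHeight,
             v.z };
}

// Signed-area (edge) test: which side of edge a->b the point (px, py) is on.
float edge(const Vec3& a, const Vec3& b, float px, float py) {
    return (px - a.x) * (b.y - a.y) - (py - a.y) * (b.x - a.x);
}

// Stage 2: rasterization -- break the projected triangle into the pixels
// (fragments) that it covers.
std::vector<Pixel> rasterize(const std::array<Vec3, 3>& tri) {
    std::vector<Pixel> fragments;
    for (int y = 0; y < kHeight; ++y) {
        for (int x = 0; x < kWidth; ++x) {
            float px = x + 0.5f, py = y + 0.5f;
            float e0 = edge(tri[0], tri[1], px, py);
            float e1 = edge(tri[1], tri[2], px, py);
            float e2 = edge(tri[2], tri[0], px, py);
            bool inside = (e0 >= 0 && e1 >= 0 && e2 >= 0) ||
                          (e0 <= 0 && e1 <= 0 && e2 <= 0);
            if (inside) fragments.push_back({x, y, 0.0f, 0.0f, 0.0f});
        }
    }
    return fragments;
}

// Stage 3: pixel (fragment) shading -- texture lookups and lighting would
// happen here; we just write a flat color.
void shadePixel(Pixel& p) { p.r = 1.0f; p.g = 0.5f; p.b = 0.25f; }

int main() {
    std::array<Vec3, 3> triangle = {{ {-0.8f, -0.8f, 0.0f},
                                      { 0.8f, -0.8f, 0.0f},
                                      { 0.0f,  0.9f, 0.0f} }};
    for (auto& v : triangle) v = processVertex(v);        // vertex stage
    std::vector<Pixel> fragments = rasterize(triangle);   // rasterizer
    for (auto& f : fragments) shadePixel(f);              // pixel stage
    std::printf("shaded %zu fragments\n", fragments.size());
    return 0;
}
```

The difference, of course, is that a real GPU runs these stages in parallel across thousands of vertices and pixels at once, which is where the hardware performance mentioned above comes in.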
For the past dozen years (it seems longer, doesn't it?), we've seen makers of 3D graphics hardware accelerate two very prominent APIs: OpenGL and DirectX.
We recently touched on advancements tangential to OpenGL in our OpenCL article, but today our focus will be on DirectX. Microsoft's DirectX graphics API is much more heavily used in game engines than OpenGL, in large part because DirectX tends to move more quickly and sets the bar for hardware in terms of feature set and flexibility. That always makes upcoming versions of DirectX exciting to talk about: they define the future capabilities of hardware and expose improved tools to developers. Upcoming DirectX versions are glimpses into our graphical future. Currently we have a lot of DirectX 9 and DirectX 10 games available and in development, but DirectX 11 looms on the horizon.
As usual, Microsoft will be trying to time the release of its next DirectX revision with the release of compatible graphics hardware. And as with DirectX 10 and Vista, DirectX 11 will be released alongside a new operating system: Windows 7. With the Windows 7 Beta already under way, we expect the OS to be done sometime this year.
Microsoft has been rather aggressive with Windows 7 scheduling in light of the rejection of Vista, so it appears the company is stepping up to the plate to get everything out sooner rather than later. There was a little more than four years between the releases of DirectX 9 and DirectX 10. Having hit the streets with Vista in January 2007, DirectX 10 has just turned two, and we are already anticipating its replacement in the very near future. As we will learn, this speedy transition should be very good for DirectX 11 adoption, as DirectX 10 hasn't even become pervasive yet: many games are still DirectX 9 only.
But let's take a closer look at what we are talking about before we go any further.
109 Comments
DerekWilson - Saturday, January 31, 2009 - link
Hi, thanks for the feedback ... I've already talked the Vista issue to death elsewhere in these comments, so I'll skip that, but ...

3) You are right that to get the most out of DX10 you need a renderer designed for DX10, not one ported from DX9. At the same time, there are things that can be done to make DX9 stuff faster by using DX10 capabilities that don't require an engine rewrite. Yes, this also requires development time, but it seems developers have opted to put time into adding effects with DX10 rather than increasing framerate. I'm not disappointed with that direction, but performance was an option.
I agree with what you said about OGL.
4) I know it's not going to happen, but it didn't have to be impossible. MS could have designed the API to expose new hardware features without requiring the driver model change. DX9 can run on XP or on Vista's new driver model, for example. They chose to tie DX10 (and subsequent versions) to the new driver model, making a back-port impossible, rather than designing around it. OpenGL exposes most of the features of DX10 to WinXP (and all of the things that are interesting to graphics programmers).
5) You are right -- it's not required to support multithreading but it is required if you want a performance benefit from that multithreading.
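To give an idea of what that looks like, here's a rough sketch of the deferred-context model Microsoft has described for D3D11: worker threads record commands into deferred contexts, and the render thread plays the resulting command lists back on the immediate context. Interface names follow what has been published so far and could still change before release; resource setup, draw calls, and most error handling are omitted, so treat it as an illustration rather than working renderer code.

```cpp
#include <cstdio>
#include <d3d11.h>   // link against d3d11.lib

// One worker thread's worth of work: record into a deferred context,
// then hand the finished command list back for playback.
void recordAndSubmit(ID3D11Device* device, ID3D11DeviceContext* immediate)
{
    ID3D11DeviceContext* deferred = nullptr;
    if (FAILED(device->CreateDeferredContext(0, &deferred)))
        return;

    // ... the worker would issue state changes and Draw*() calls on 'deferred' here ...

    ID3D11CommandList* commandList = nullptr;
    if (SUCCEEDED(deferred->FinishCommandList(FALSE, &commandList)))
    {
        // Back on the render thread: replay the recorded work in order.
        immediate->ExecuteCommandList(commandList, FALSE);
        commandList->Release();
    }
    deferred->Release();
}

int main()
{
    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* immediate = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION,
                                 &device, nullptr, &immediate)))
    {
        std::printf("no D3D11 device available\n");
        return 1;
    }

    recordAndSubmit(device, immediate);

    immediate->Release();
    device->Release();
    return 0;
}
```

The win is that the expensive work of building command buffers can be spread across cores, while the immediate context still submits everything from one place.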
bobvodka - Saturday, January 31, 2009 - link
Well, to be fair, many of my comments were directed at others in the thread :)

4) The thing is, OpenGL and DX9, specifically D3D9, live in different places. D3D9 had a lot more kernel-side code, which caused expensive switches when issuing certain commands; this is why D3D9 got all that instanced drawing stuff and OpenGL didn't, because on small batch sizes OpenGL could be between 1.4x and 2.3x quicker at executing the draw call than D3D9. OpenGL sits on the other side of the kernel calls and gives the implementers more control over when that switch occurs. D3D10 also sits on the other side, again not wasting that time.
There were also changes in the resource model, the driver model, and various other areas, some of which were needed to make the Vista windowing system possible. This resource model and everything about it was very much tied to D3D10 and how it does things. OpenGL's resource model was also fundamentally different from the D3D9 model, again with the implementer having a lot more control; you never suffered a 'lost device' in OpenGL, for example, and the runtime automatically managed allocated memory, unlike in D3D9. There are also some things OpenGL could do that D3D9 couldn't, such as on-card async memory copies and render-to-vertex-buffer (well, ATI had a hack for it, but the OpenGL method was cross-hardware).
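For anyone who hasn't had to deal with it, the D3D9 'lost device' dance looks roughly like this each frame (simplified sketch; the two resource hooks are placeholders for whatever the app keeps in D3DPOOL_DEFAULT):

```cpp
// Simplified sketch of D3D9 lost-device handling (e.g. after Alt-Tab out of
// fullscreen). Anything created in D3DPOOL_DEFAULT -- render targets,
// dynamic buffers, etc. -- has to be released and recreated by the app
// around Reset(). Link against d3d9.lib.
#include <d3d9.h>

// Hypothetical application hooks for default-pool resources.
void releaseDefaultPoolResources()  { /* release D3DPOOL_DEFAULT objects  */ }
void recreateDefaultPoolResources() { /* recreate D3DPOOL_DEFAULT objects */ }

// Returns true if it is safe to render this frame.
bool handleLostDevice(IDirect3DDevice9* device, D3DPRESENT_PARAMETERS& pp)
{
    HRESULT hr = device->TestCooperativeLevel();
    if (hr == D3DERR_DEVICELOST)
        return false;                     // device unusable -- skip this frame
    if (hr == D3DERR_DEVICENOTRESET) {
        releaseDefaultPoolResources();    // drop default-pool resources first
        if (FAILED(device->Reset(&pp)))   // try to reclaim the device
            return false;
        recreateDefaultPoolResources();   // then rebuild them
    }
    return true;
}
```

None of that bookkeeping exists on the OpenGL side; the driver keeps your resources valid across mode switches.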
Could they have done it? Well, of course they could have, but unlike previous DX updates it would have taken a lot more effort, time, and money, all for an OS that was five years old at the time and one they were hoping to start moving people off of.
Personally, I think this was a good move all in all; the problem was how it was presented to the general masses, who suddenly saw they had to pay a few hundred USD for a new DX version.
Dribble - Saturday, January 31, 2009 - link
The reason all our games are DX9c is that that's what the consoles support (yes, I know the PS3 uses OpenGL, but it has 9c feature support; it basically has a 7800GTX in it).

Most games are made for consoles as well as the PC, and the majority use some cross-platform renderer (e.g. the Unreal 3 engine) that will support DX9c. Cross-platform DX support won't change until the Xbox 720 and PS4 arrive.
I don't see how DX11 will change this.
DerekWilson - Saturday, January 31, 2009 - link
That's a really good point and something I should have considered.

I think the gap in performance between the PC and consoles will so heavily favor PCs that it will inspire developers to once again shift their focus to the PC. I could be wrong though.
ssj4Gogeta - Sunday, February 1, 2009 - link
That will change if Microsoft chooses Larrabee for the Xbox 720, because then it will support all DirectX versions (even unreleased ones) and all OpenGL versions. If MS chooses Larrabee, and it also becomes popular for PC gaming, we PC gamers may see a huge benefit, because then games will be built for the latest DX version.

bobvodka - Sunday, February 1, 2009 - link
It's all well and good saying that, but Larrabee is currently utterly unproven technology.

Don't get me wrong, I'd like to see it do well, if only because three players in the GPU race would be better than two from a technology and consumer standpoint. However, everyone seems to be pinning their hopes on this technology when there hasn't even been a working demo running DX9 at the same speed as NV/AMD, never mind DX10 or DX11.
ssj4Gogeta - Sunday, February 1, 2009 - link
You're right, but I'm really excited. It would be so nice if Intel could really pull off such a feat: we'd have 2 TFLOPS of general-purpose parallel processing power!

bobvodka - Saturday, January 31, 2009 - link
The problem is, the consoles are where the money is. Combine that with a fixed hardware platform (well, hard drives and different screen sizes notwithstanding) and it makes for a much easier time for devs.

piroroadkill - Saturday, January 31, 2009 - link
That's definitely a good point. Any game engine these days has to support the major consoles to be successful; even if that only means the 360, you're still hamstrung by DirectX 9, regardless of whether XP has it or not.

From a personal point of view, I'm still running XP because I run a clusterfuck of graphics cards that Vista would shit the bed thinking about.
William Gaatjes - Saturday, January 31, 2009 - link
"Many under-the-hood enhancements mean higher performance for features available but less used under DX10. "I must be remembering it wrong but the same thing was sad about dx10. However it turned out to be nothing more then getting people to buy vista. I wonder how much will come true this time.