DirectX 10
Visual changes aside, there are numerous changes under the hood of Vista, and for much of our audience, DirectX 10 will be the biggest of them. DirectX 10 has recently occupied an odd place in what amounts to computer mythology, as it has been in development for several years now while Microsoft has extended DirectX 9 to accommodate new technologies; prior to this, Microsoft had done quite well at providing yearly updates to DirectX.
Unlike previous iterations of DirectX, 10 will be launched in a different manner due to all the changes to the operating system needed to support it. Of greatest importance, DirectX 10 will be a Vista-only feature; Microsoft will not be backporting it to XP. DirectX 10 will not only include support for new hardware features, but also relies on some significant changes Microsoft is making to how Windows treats GPUs and interfaces with them, requiring the clean break. This may pose a problem for users who want to upgrade their hardware without upgrading their OS. It is likely that driver support will allow for DX9 compatibility, while new feature support could easily be exposed through OpenGL caps, but the exact steps ATI and NVIDIA will take to keep everyone happy will have to unfold over time.
There seems to be some misunderstanding in the community that DX9 hardware will not run with DirectX 10 installed or with games designed using DirectX 10. It has been a while, but this transition (under Vista) will be no different to the end user than the transition to DirectX 8 and 9, where users with older DirectX 7 hardware could still install and play most DX 8/9 games, only without the pixel or vertex shaders. New games which use DirectX 10 under Vista while running on older DX9 hardware will be able to gracefully fall back to the proper level of support. We've only recently begun to see games come out that refuse to run on DX8 level hardware, and it isn't likely we will see DX10-only games for several more years. Upgrading to Vista and DX10 won't absolutely require a hardware upgrade. The benefit comes in the advanced features made possible.
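The graceful fallback described above can be sketched as a simple capability check at startup. The sketch below is hypothetical (names and tiers are ours, not the DirectX API; real engines query device capabilities through DirectX itself), but it shows the idea of picking the best supported render path rather than refusing to run:

```python
# Hypothetical sketch of how a game picks its rendering path: take the
# highest shader model the hardware reports and fall back gracefully,
# rather than refusing to run. Names and tiers are illustrative only.

RENDER_PATHS = {4: "dx10", 3: "sm3", 2: "sm2", 1: "fixed-function"}

def pick_render_path(max_shader_model):
    """Return the best render path the reported hardware can handle."""
    for tier in sorted(RENDER_PATHS, reverse=True):
        if max_shader_model >= tier:
            return RENDER_PATHS[tier]
    return "unsupported"

print(pick_render_path(4))  # DX10-class hardware -> "dx10"
print(pick_render_path(2))  # older SM2.0 DX9 part -> "sm2"
```

A DX10 title running on DX9 hardware would simply land on one of the lower tiers, with the newest effects disabled.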
While we'll have more on the new hardware features supported by DirectX 10 later this year, we can talk a bit about what we know now. DirectX 10 will bring support for a new type of shader, the geometry shader, which allows triangles to be modified (and new primitives generated) at a fixed stage in the middle of the rendering pipeline. Microsoft will also be implementing some technology from the Xbox 360, enabling the practical use of unified shaders like we've seen on ATI's Xenos GPU for the 360. Although DirectX 10 compliance does not require unified hardware shaders, the driver interface will be unified. This should make things easier for software developers, while at the same time allowing hardware designers to approach things in the manner they see best. Pixel and vertex shading will also be receiving some upgrades under the Shader Model 4.0 banner.
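The geometry shader's defining trick is primitive amplification: one triangle in, several out. As a rough CPU-side sketch of the concept (plain Python for illustration only, not the Direct3D 10 API; all names here are ours), midpoint subdivision turns one input triangle into four:

```python
# CPU-side sketch of what a geometry shader can do: take one primitive
# (a triangle) and emit several. Real geometry shaders run on the GPU
# between the vertex and pixel stages; this is only an illustration.

def midpoint(a, b):
    """Average two 3D points component-wise."""
    return tuple((a[i] + b[i]) / 2.0 for i in range(3))

def amplify_triangle(tri):
    """Emit four triangles for one input triangle (midpoint subdivision),
    the kind of 1-in / N-out amplification geometry shaders allow."""
    v0, v1, v2 = tri
    m01 = midpoint(v0, v1)
    m12 = midpoint(v1, v2)
    m20 = midpoint(v2, v0)
    return [
        (v0, m01, m20),
        (m01, v1, m12),
        (m20, m12, v2),
        (m01, m12, m20),
    ]

tri = ((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))
print(len(amplify_triangle(tri)))  # 4 triangles emitted from 1
```

Doing this on the GPU, without a round trip to the CPU, is what makes the stage interesting for effects like fur, shadow volumes, and tessellation-style detail.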
DirectX 10 will also implement some significant optimizations to the API itself, as the continuous building of DirectX versions upon one another, along with CPU-intensive pre-rendering techniques such as z-culling and hidden surface removal, has placed a fairly large overhead on the CPU. Using their developer tools, ATI has estimated that the total CPU utilization spent working directly on graphics rendering (including overhead) can approach 40% in some situations, which has resulted in some games being CPU limited solely due to this overhead. With these API changes, DX10 should remove a good deal of the overhead, and while a significant amount of CPU time will still be required for rendering (20% in ATI's estimate), the 20% savings can ultimately be used to render more frames, or more complex ones. Unfortunately, these API changes work in tandem with hardware changes to support them, so these benefits will only be available on DirectX 10 class hardware.
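As a back-of-the-envelope illustration of what those numbers imply (using only the 40% and 20% figures quoted above; the assumption of a game limited purely by this overhead is ours, to show the best case):

```python
# Back-of-the-envelope math on ATI's figures: if rendering work plus API
# overhead eats 40% of CPU time under DX9 but only 20% under DX10, a
# CPU-limited game gets the other 20% back. Figures are from the article;
# the "purely overhead-limited" scenario is an illustrative assumption.

dx9_overhead = 0.40   # fraction of CPU time spent on rendering + overhead
dx10_overhead = 0.20

freed = dx9_overhead - dx10_overhead
print(f"CPU time reclaimed: {freed:.0%}")  # 20%

# In the extreme case of a game limited solely by this per-frame CPU cost,
# frame throughput scales inversely with it:
speedup = dx9_overhead / dx10_overhead
print(f"best-case overhead-limited speedup: {speedup:.1f}x")  # 2.0x
```

Real games are limited by many things at once, so the practical gain would land somewhere below that ceiling.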
The bigger story at the moment with DirectX 10, however, is how it also forms the basis of Microsoft's changes to how Windows treats and interfaces with GPUs. With current GPU designs and the associated treatment from the operating system, GPUs are treated as single-access devices; one application is effectively given sovereign access to and control of the GPU's 3D capabilities at any given moment. To change which application is utilizing these resources, a very expensive context switch must take place, swapping out the resources of the first application for those of the second. This can be clearly seen today when Alt+Tabbing out of a resource-intensive game, where it may take several seconds to go in and out, and it is also part of the reason some games simply don't allow you to Alt+Tab. Windowed rendering solves some of this problem, but it incurs a very heavy performance hit in some situations and is otherwise a less than ideal solution.
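A rough sketch of why these switches are so slow: each one means moving one application's GPU resources out and another's in across the bus. The function and every figure below are hypothetical, chosen only to show the order of magnitude:

```python
# Toy model of a GPU context switch under the current single-access model:
# swap one app's working set out and the other's in. All numbers are
# illustrative assumptions, not measurements of any real system.

def switch_time_s(resident_mb, bus_gb_per_s):
    """Seconds to move one app's working set across the bus, twice over
    (old app's resources out, new app's resources in)."""
    bytes_moved = 2 * resident_mb * 1024**2
    return bytes_moved / (bus_gb_per_s * 1024**3)

# e.g. a game holding ~256 MB of textures and buffers, over a ~1 GB/s
# effective transfer path:
t = switch_time_s(resident_mb=256, bus_gb_per_s=1.0)
print(f"~{t:.1f} s per switch")  # ~0.5 s
```

Add driver and OS housekeeping on top of the raw transfer and the multi-second Alt+Tab delays people see today stop being mysterious.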
With full 3D acceleration now used on the desktop by Aero, the penalties for these context switches become even more severe, which has driven Microsoft to redesign DirectX and how it interfaces with the GPU. The result is a new group of interface standards that Microsoft is calling the Windows Display Driver Model (WDDM), which replaces the XP Display Driver Model.
The primary change with the first iteration of the WDDM, which is what will ship with the release version of Vista, is that Microsoft is starting a multi-year plan to influence hardware design so that Windows can stop treating the GPU as a single-tasking device; as GPUs inevitably evolve toward CPU-like designs, the GPU will become a true multitasking device. WDDM 1.0 as a result is largely a clean break from the XP DDM: it is based on what current SM2.0+ GPUs can do, with the majority of the change being what the operating system can do to attempt multitasking and task scheduling with modern hardware. For the most part, the changes brought by WDDM 1.0 will go unnoticed by users, but they lay the groundwork for WDDM 2.0.
While Microsoft hasn't completely finalized WDDM 2.0 yet, what we do know at this point is that it will require a new generation of hardware, again likely the forthcoming DirectX 10 class hardware, built from the ground up to multitask and handle true task scheduling. The most immediate benefit will be that context switches become much cheaper, so applications utilizing APIs that work with WDDM 2.0 will be able to switch in and out in much less time. The secondary benefit is that when multiple running applications want to use the full 3D features of the GPU, such as Aero and an application like Google Earth, their performance will improve due to the faster context switches; at the moment, context switching overhead means that even with a perfectly split load neither application gets anywhere near 50% of the GPU's time (and thus both fall short of their potential performance in a multitasking environment). Even further in the future will be WDDM 2.1, which will implement "immediate" context switching. A final benefit is that the operating system should be able to make much better use of graphics memory, so it is conceivable that even lower-end GPUs with large amounts of memory will have a place in the world.
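To see why cheaper switches matter, consider a toy model (entirely our own; the millisecond figures are invented) in which two applications alternate fixed GPU time slices and every swap costs a fixed penalty:

```python
# Toy model: two apps alternate 10 ms GPU time slices, and each swap
# costs a fixed context-switch penalty. Cheaper switches (the WDDM 2.0
# goal) push each app's effective share back toward the ideal 50%.
# All numbers are invented for illustration.

def effective_share(slice_ms, switch_cost_ms):
    """Fraction of wall time one of two alternating apps holds the GPU."""
    period = 2 * (slice_ms + switch_cost_ms)  # one full A -> B -> A cycle
    return slice_ms / period

for cost in (5.0, 1.0, 0.1):  # expensive -> cheap -> near-"immediate"
    share = effective_share(10.0, cost)
    print(f"switch cost {cost:4.1f} ms -> {share:.1%} of the GPU each")
```

With a 5 ms switch each application gets only about a third of the GPU; drive the cost toward zero and both approach the full 50% they should be getting.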
In the meantime, Microsoft's development of WDDM comes at a cost: NVIDIA and ATI are currently busy building and optimizing their drivers for WDDM 1.0. The result is that, on top of Vista itself still being a beta operating system, their beta display drivers are in a very early state, resulting in, as we will see, very poor gaming performance at the moment.