DirectX 10
Visual changes aside, there are numerous changes under the hood of Vista, and for much of our audience the biggest of these will be DirectX 10. DirectX 10 has occupied an odd place in what amounts to computer mythology recently, as it has been in development for several years now while Microsoft has extended DirectX 9 to accommodate new technologies. Prior to this, Microsoft had done a fairly good job of providing yearly updates to DirectX.
Unlike the previous iterations of DirectX, version 10 will be launched in a different manner due to all of the changes to the operating system needed to support it. Of greatest importance, DirectX 10 will be a Vista-only feature; Microsoft will not be backporting it to XP. DirectX 10 will not only include support for new hardware features, but also relies on some significant changes Microsoft is making to how Windows treats and interfaces with GPUs, requiring the clean break. This may pose a problem for users who want to upgrade their hardware without upgrading their OS. It is likely that driver support will allow for DX9 compatibility, while new feature support could be exposed under XP through OpenGL extensions, but the exact steps ATI and NVIDIA will take to keep everyone happy will have to unfold over time.
There seems to be some misunderstanding in the community that DX9 hardware will not run with DirectX 10 installed or with games designed using DirectX 10. It has been a while, but this transition (under Vista) will be no different to the end user than the transitions to DirectX 8 and 9, where users with older DirectX 7 hardware could still install and play most DX8/DX9 games, just without the pixel and vertex shader effects. New games which use DirectX 10 under Vista will, when running on older DX9 hardware, gracefully fall back to the appropriate level of support. We've only recently begun to see games that refuse to run on DX8-level hardware, and it isn't likely we will see DX10-only games for several more years. Upgrading to Vista and DX10 won't absolutely require a hardware upgrade; the benefit comes in the advanced features made possible.
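As a rough illustration of how such a fallback works in practice, a game or engine can simply probe for a DirectX 10 device at startup and drop back to DirectX 9 if creation fails. The following is a minimal sketch against the D3D10 and D3D9 headers; the probe-then-release pattern and the enum are our own illustrative assumptions, not code from any shipping engine.

```cpp
// Minimal sketch: probe for D3D10 support, fall back to D3D9.
// Assumes the D3D10/D3D9 SDK headers; link d3d10.lib and d3d9.lib.
#include <d3d10.h>
#include <d3d9.h>

enum RenderPath { PATH_DX10, PATH_DX9, PATH_NONE };

RenderPath SelectRenderPath()
{
    // Try to create a hardware D3D10 device. On pre-DX10 GPUs (or
    // pre-WDDM drivers) this call simply fails, so it doubles as a
    // capability probe before committing to a render path.
    ID3D10Device* dev10 = NULL;
    HRESULT hr = D3D10CreateDevice(NULL, D3D10_DRIVER_TYPE_HARDWARE,
                                   NULL, 0, D3D10_SDK_VERSION, &dev10);
    if (SUCCEEDED(hr))
    {
        dev10->Release();   // Real code would keep the device.
        return PATH_DX10;
    }

    // Fall back to Direct3D 9, which DX9-class hardware and drivers
    // will continue to support under Vista.
    IDirect3D9* d3d9 = Direct3DCreate9(D3D_SDK_VERSION);
    if (d3d9 != NULL)
    {
        d3d9->Release();
        return PATH_DX9;
    }
    return PATH_NONE;
}
```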
While we'll have more on the new hardware features supported by DirectX 10 later this year, we can talk a bit about what we know now. DirectX 10 will bring support for a new type of shader, the geometry shader, which sits between the vertex and pixel shader stages and can create or modify whole primitives (points, lines, and triangles) in the middle of rendering. Microsoft will also be implementing some technology from the Xbox 360, enabling the practical use of unified shaders like we've seen in ATI's Xenos GPU for the 360. Although DirectX 10 compliance does not require unified hardware shaders, the driver interface will be unified; this should make things easier for software developers while still allowing hardware designers to approach things in the manner they see best. Pixel and vertex shading will also receive upgrades under the Shader Model 4.0 banner.
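For a sense of what the new pipeline stage looks like to a developer, here is a sketch of a trivial Shader Model 4.0 geometry shader, a pass-through that re-emits each triangle unchanged, compiled at runtime through the D3D10 shader compiler. The shader source and helper function are our own illustrative assumptions; a real geometry shader would amplify or modify the primitives rather than pass them through.

```cpp
// Sketch: compile and create a trivial pass-through geometry shader.
// Assumes an already-created ID3D10Device* (dev); error checks trimmed.
#include <d3d10.h>

static const char kGS[] =
    "struct V { float4 pos : SV_POSITION; };                       \n"
    // A geometry shader runs once per primitive and sees all of its
    // vertices at once; here it receives a full triangle (3 vertices)
    // and may emit up to 3 vertices of output.
    "[maxvertexcount(3)]                                           \n"
    "void GS(triangle V input[3], inout TriangleStream<V> stream)  \n"
    "{                                                             \n"
    "    for (int i = 0; i < 3; ++i)                               \n"
    "        stream.Append(input[i]);   // pass through unchanged  \n"
    "    stream.RestartStrip();                                    \n"
    "}                                                             \n";

ID3D10GeometryShader* CreatePassThroughGS(ID3D10Device* dev)
{
    ID3D10Blob *code = NULL, *errors = NULL;
    // Compile against the new gs_4_0 profile introduced with SM4.0.
    D3D10CompileShader(kGS, sizeof(kGS) - 1, "gs.hlsl",
                       NULL, NULL, "GS", "gs_4_0", 0, &code, &errors);
    if (errors != NULL)
        errors->Release();
    if (code == NULL)
        return NULL;

    ID3D10GeometryShader* gs = NULL;
    dev->CreateGeometryShader(code->GetBufferPointer(),
                              code->GetBufferSize(), &gs);
    code->Release();
    return gs;
}
```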
DirectX 10 will also implement some significant optimizations to the API itself. The continuous building of DirectX versions upon themselves, along with CPU-intensive pre-rendering techniques such as z-culling and hidden surface removal, has resulted in a fairly large overhead being placed on the CPU. Using their developer tools, ATI has estimated that the total CPU time spent working directly on graphics rendering (including overhead) can approach 40% in some situations, which has left some games CPU-limited solely due to this overhead. With these API changes, DX10 should remove a good deal of that overhead; while a significant amount of CPU time will still be required for rendering (around 20% in ATI's estimate), the other 20 percentage points can be spent rendering more frames, or more complex ones. Unfortunately, these API changes work in tandem with hardware changes that support them, so the benefits will only be available on DirectX 10 class hardware.
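One concrete example of how D3D10 trims this overhead is its replacement of D3D9's individually set and validated render states with immutable state objects, which are validated once at creation time and bound with a single cheap call thereafter. The sketch below uses the documented D3D10 state-object API; the particular states chosen are just an illustrative assumption.

```cpp
// Sketch: D3D10 immutable state objects vs. per-call D3D9 render states.
// Assumes an already-created ID3D10Device* (dev); error checks trimmed.
#include <d3d10.h>

ID3D10RasterizerState* g_solidCullBack = NULL;

void InitStatesOnce(ID3D10Device* dev)
{
    // All rasterizer settings are bundled into one descriptor and
    // validated a single time, when the object is created...
    D3D10_RASTERIZER_DESC rd = {};
    rd.FillMode = D3D10_FILL_SOLID;
    rd.CullMode = D3D10_CULL_BACK;
    rd.DepthClipEnable = TRUE;
    dev->CreateRasterizerState(&rd, &g_solidCullBack);
}

void PerFrame(ID3D10Device* dev)
{
    // ...so binding it per frame or per batch is one cheap call,
    // instead of a string of D3D9-style SetRenderState() calls, each
    // of which the runtime and driver had to re-validate on the CPU.
    dev->RSSetState(g_solidCullBack);
    // dev->Draw(...) / dev->DrawIndexed(...) would follow here.
}
```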
The bigger story at the moment with DirectX 10, however, is how it also forms the basis of Microsoft's changes to how Windows treats and interfaces with GPUs. Under current GPU designs and the operating system's treatment of them, GPUs are single-access devices: one application is effectively given sovereign control of the GPU's 3D capabilities at any given moment. To change which application is utilizing those resources, a very expensive context switch must take place, swapping out the resources of the first application for those of the second. This can be clearly seen today when Alt+Tabbing out of a resource-intensive game, where it may take several seconds to get in and out, and it is also part of the reason some games simply don't allow you to Alt+Tab. Windowed rendering solves some of this problem, but it incurs a very heavy performance hit in some situations and is otherwise a less than ideal solution.
With Aero now bringing full 3D acceleration to the desktop, the penalties for these context switches become even more severe, which has driven Microsoft to redesign DirectX and how it interfaces with the GPU. The result is a new group of interface standards, which Microsoft calls the Windows Display Driver Model (WDDM), replacing the display driver model used under XP.
The primary change with the first iteration of the WDDM, which is what will ship with the release version of Vista, is that Microsoft is starting a multi-year plan to influence hardware design so that Windows can stop treating the GPU as a single-tasking device; as GPUs inevitably evolve toward CPU-like designs, the GPU will become a true multi-tasking device. WDDM 1.0 is consequently a largely clean break from the XP display driver model: it is based on what current SM2.0+ GPUs can do, with the majority of the change being what the operating system can do to attempt multitasking and task scheduling with modern hardware. For the most part, the changes brought by WDDM 1.0 will go unnoticed by users, but they lay the groundwork for WDDM 2.0.
While Microsoft hasn't completely finalized WDDM 2.0 yet, what we do know at this point is that it will require a new generation of hardware (again, likely the forthcoming DirectX 10 class hardware) built from the ground up to multitask and handle true task scheduling. The most immediate benefit is that context switches will become much cheaper, so applications using APIs that work with WDDM 2.0 will be able to switch in and out in much less time. The secondary benefit is that when multiple running applications want the full 3D features of the GPU, such as Aero alongside an application like Google Earth, their performance will improve thanks to the faster context switches; at the moment, context-switching overhead means that even with a perfectly split load neither application gets anywhere near 50% of the GPU's time, and so both fall short of their potential performance in a multitasking environment. Even further in the future will be WDDM 2.1, which will implement "immediate" context switching. A final benefit is that the operating system should be able to make much better use of graphics memory, so it is conceivable that even lower-end GPUs with large amounts of memory will have a place in the world.
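To put rough numbers on that claim, consider a toy sharing model (our own construction, not Microsoft's) in which two applications alternate on the GPU with a time quantum q and a fixed context-switch cost s; each then gets an effective share of q / (2(q + s)), so the switch cost directly determines how far below the ideal 50% each application falls.

```cpp
// Toy model (our assumption, not Microsoft's numbers): two apps
// alternating on the GPU with quantum q and context-switch cost s.
#include <cstdio>

double EffectiveShare(double quantumMs, double switchMs)
{
    // Each full cycle takes 2*(q + s); each app does useful work for q.
    return quantumMs / (2.0 * (quantumMs + switchMs));
}

int main()
{
    // With a 10 ms quantum, a 5 ms switch cost drops each app from a
    // theoretical 50% of the GPU to ~33%; shrinking the switch cost
    // (WDDM 2.0's goal) pushes the share back toward 50%.
    std::printf("s=5ms: %.1f%%\n", 100.0 * EffectiveShare(10.0, 5.0));
    std::printf("s=1ms: %.1f%%\n", 100.0 * EffectiveShare(10.0, 1.0));
}
```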
In the meantime, Microsoft's development of the WDDM comes at a cost: NVIDIA and ATI are currently busy building and optimizing their drivers for WDDM 1.0. The result is that, on top of Vista itself still being a beta operating system, the beta display drivers are in a very early state, resulting (as we will see) in very poor gaming performance at the moment.
75 Comments
Pirks - Friday, June 16, 2006
Excuse me, what? How about this then:

http://www.macworld.com/2006/05/reviews/osxfirewal...

"The emphasis is on incoming. As it ships from Apple, the firewall does not monitor traffic that may be originating from your own computer. If your Mac gets possessed by a malware application that then attempts to attack or infect other computers via your Internet connection (a not-uncommon trick), OS X’s firewall won’t, by default, pay any attention. And, there’s no way to change this default setting from your System Preferences. To force the firewall to monitor outbound traffic, you must use Terminal’s command-line interface."

See - IT CAN monitor and block outbound traffic, contrary to what you say. It's just a matter of configuring it properly. You should at least correct your article and stop saying OSX ipfw CAN'T track outbound connections. You can say this: it's SET UP not to monitor outbound connections BY DEFAULT, but anyone can CONFIGURE it to monitor outbound connections, either through a third-party GUI like Flying Buttress or via the command line. Then you won't look like a liar to any Mac guy who cares to read your review.
Ryan Smith - Friday, June 16, 2006
I see your point, but I believe there's nothing in the article that needs changing. Tiger's firewall can't block outbound connections without having to drop to the terminal to muck with IPFW, and I don't classify that as an ability, any more than I classify Vista x64 as being friendly to amateur driver programmers (since you need to drop to the terminal to turn off the x64 integrity check). When a version of Mac OS X ships with a proper GUI for controlling outbound firewalling (as is the Apple way), then it will be capable by a reasonable definition. Right now it's nothing more than a quirk that results from using the BSD base.

Pirks - Friday, June 16, 2006
Excellent point! So, when (and if) Mac OS X sees its share of virii and malware, THEN Apple will incorporate outbound connection settings into the OS X GUI - right now it's not needed by Mac users, and the rare exceptions are easily handled with third-party apps and the command line. OK, got your point, agreed, issue closed. Thanks :)
bjtags - Friday, June 16, 2006
Vista x64

I have been pounding on it for 4 days and it never crashed or even farted once!!!

Have Half-Life 2 and CS running just great!!!

Had at one time 10 IE windows open, MediaPlayer, Steam updating, a driver downloading, Windows drivers updating, and 3 folder explorer windows open, all while transferring a 4GB movie to the HD!!!

Still ran fine... I do have an AMD 4800 X2 with 2GB...
Poser - Friday, June 16, 2006
Two questions:

1. What's the ship date for Vista supposed to be? Q4 of 2006?

2. I seem to remember that speech recognition would be included and integrated with Vista. Is it considered too much of a niche toy to even mention, not considered to be part of the OS, or am I just plain wrong about its inclusion?

It was an extremely well-written article. Very nice job.
Ryan Smith - Friday, June 16, 2006
1. Expected completion is Q4, with some business customers getting access to the final version at that time. It won't be released to the public until 2007, however.

2. You're right, speech recognition is included. You're also right in that, given the amount of stuff we had to cover in one article, it was too much of a niche; voice recognition so far is still too immature to replace typing.
ashay - Friday, June 16, 2006
"Dogfooding" is when a company uses their own new product (not necessarily beta) for internal use.(maybe even in critical production systems).Term comes from "eat your own dog-food". Meaning if you're a dog food maker, the CEO and execs eat the stuff. If they like it they dogs hopefully will.
http://en.wikipedia.org/wiki/Eat_one%27s_own_dog_f...">Wikipedia link
fishbits - Friday, June 16, 2006
Yes, I know it's still beta; we'll see. The UAC and signed drivers schemes sound like they'll be flops right out of the gate. The average user will quickly realize he can't install or use anything until he adopts a "just click 'Yes'" attitude, which will reward him with a functioning device/running program. I've lost count of how many drivers I've installed under XP that were for name-brand devices, yet didn't have the official seal of approval on them. Again, you get trained to "just click 'Yes'" in order to be able to do anything useful. Without better information given to the user at this decision point, all the scheme does is add a few mouse clicks and no security. Like when you install a program and your security suite gives a "helpful" warning like "INeedToRun.exe is trying to access feccflm.dll ... no recommendation."

As expected, it looks like the productivity gains of GPU acceleration were immediately swallowed up by GUI overhead. Whee! "The users can solve this through future hardware upgrades." Gotcha. For what it's worth, the gadgets/widgets look needlessly large and ugly, especially for simply displaying things like the time or CPU temp/usage. Then it sounds like we're going to have resource-hungry programs getting starved because of GPU sharing, or an arms race of workarounds as they grab the power they think they need.

Ah well, I've got to move to 64-bit for RAM purposes relatively soon. I think I'll wait a year or two after Vista x64 ships to let it get stable, faster, and better supported. Then hopefully the programs I'll need to upgrade can be purchased along the lines of a normal upgrade cycle. Games I'm actually not as worried about, as I expect XP/DX9 support to continue for a decent bit; I'll retain an XP box and install Vista on a brand new one when the time comes.
shamgar03 - Friday, June 16, 2006
I really hope that will mean BETTER GPU performance, not worse. I would really just like to be able to boot into a game-only environment where you have something like a GRUB interface to pick games, and it only loads the stuff needed for the game.

darkdemyze - Friday, June 16, 2006

Beta implies "still in development". Chances are very high that performance will see an increase by the time of release. I agree with your second statement, though.