Drilling Down: DX11 And The Multi-Threaded Game Engine
Even though multi-threaded programming has been around for decades, mainstream programmers didn't start focusing on parallel programming until multi-core CPUs came along. Much general-purpose code is straightforward to write as a single thread; extracting performance through parallel programming can be difficult and isn't always obvious. Even with talented programmers, Amdahl's Law is a bitch: the speedup you get from parallelization is limited by the fraction of code that is necessarily sequential.
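For the curious, the math behind that is simple enough to sketch out. The snippet below is purely illustrative (the 25% sequential figure is a made-up example, not a measurement from any engine), but it shows how quickly the sequential fraction caps the speedup no matter how many cores you throw at the problem:

```cpp
#include <cstdio>

// Amdahl's Law: maximum speedup on n cores when a fraction 's' of the
// work must remain sequential. Illustrative numbers only.
static double amdahl_speedup(double s, int n)
{
    return 1.0 / (s + (1.0 - s) / n);
}

int main()
{
    // Hypothetical example: if 25% of a frame is necessarily sequential,
    // even infinitely many cores can never deliver more than a 4x speedup.
    for (int cores = 2; cores <= 16; cores *= 2)
        printf("%2d cores, 25%% sequential: %.2fx\n", cores, amdahl_speedup(0.25, cores));
    return 0;
}
```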
Currently, in game development, rendering is one of those "necessarily" sequential tasks. DirectX 10 isn't set up to gracefully handle multiple threads all throwing commands at the GPU. That doesn't mean parallelization of renderers can't happen, but it does limit the speedup, because costly synchronization techniques or management threads need to be implemented to make sure nothing steps out of line. All this limits the benefit of parallelization and discourages programmers from trying too hard. After all, it's a better idea to put your effort into areas where performance can be improved more significantly. (John Carmack put it really well once, but I can't remember the quote... and I'm doing too much benchmarking to go look for it now. :-P)
No matter what anyone does, some parts of the renderer will need to be sequential. Programs, textures, and resources must be loaded up; geometry processing happens before pixel processing; draw calls intended to execute while a certain state is active must have that state set first and left unchanged until they complete. Even on such a massively parallel machine, order must be maintained for many things. But order doesn't always matter.
By making more things thread-safe through an extended device interface with multiple contexts, and by making much of the synchronization overhead the responsibility of the API and/or graphics driver, Microsoft has enabled game developers to more easily thread not only their rendering code, but their game code as well. These features will also work on DX10 hardware running on a system with DX11, though some missing hardware optimizations will reduce the performance benefit. But the fundamental ability to write code differently will go a long way toward getting programmers more used to, and better at, parallelization. Let's take a look at the tools DX11 provides to accomplish this.
First up is free-threaded asynchronous resource loading. That's a bit of a mouthful, but this feature gives developers the ability to upload programs, textures, state objects, and other resources in a thread-safe way and, if desired, concurrently with the rendering process. This doesn't mean that all of this will actually get pushed to the GPU in parallel with rendering, as the driver will manage what gets sent to the GPU and when, based on priority, but it does mean the developer no longer has to think about synchronizing or manually prioritizing resource loading. Multiple threads can start loading whatever resources they need whenever they need them. The fact that this can also be done concurrently with rendering could improve performance for games that stream in data for massive open worlds, in addition to opening up multi-threading opportunities.
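To make that concrete, here is a rough sketch of what free-threaded resource creation might look like, using the D3D11 interface names from the preview documentation; the texture description, pixel data, and surrounding thread management are placeholder assumptions rather than anything a particular engine does:

```cpp
#include <d3d11.h>

// Sketch only: resource creation through ID3D11Device is free-threaded in D3D11,
// so a loader thread can create textures while the render thread keeps drawing.
// The device pointer, 'pixels', width, and height are assumed to come from elsewhere.
void LoaderThread(ID3D11Device* device, const void* pixels, UINT width, UINT height,
                  ID3D11Texture2D** outTex)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width = width;
    desc.Height = height;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_IMMUTABLE;
    desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;

    D3D11_SUBRESOURCE_DATA init = {};
    init.pSysMem = pixels;
    init.SysMemPitch = width * 4;

    // Safe to call from any thread; the runtime/driver handles the synchronization.
    // (Check the returned HRESULT in real code.)
    device->CreateTexture2D(&desc, &init, outTex);
}
```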
To enable this and other threading, the D3D device interface is now split into three separate interfaces: the Device, the Immediate Context, and the Deferred Context. Resource creation is done through the Device. The Immediate Context is the interface for setting device state, issuing draw calls, and performing queries. There can be only one Device and one Immediate Context. The Deferred Context is another interface for state and draw calls, but many can exist in one program, and they can be used as the per-thread interface (each Deferred Context is itself not thread-safe, though). Deferred Contexts and free-threaded resource creation through the Device are where DX11 gets its multi-threaded benefit.
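In code, that split might look something like the sketch below (again using the documented D3D11 interface names, with error handling omitted): the device hands back the single immediate context, and the application creates one deferred context per worker thread.

```cpp
#include <d3d11.h>

// Sketch of the three-way split, assuming an already-created ID3D11Device.
// HRESULTs are ignored here for brevity.
void CreateContexts(ID3D11Device* device,
                    ID3D11DeviceContext** immediate,
                    ID3D11DeviceContext** deferredPerThread, UINT threadCount)
{
    // Exactly one immediate context exists; the device simply hands it back.
    device->GetImmediateContext(immediate);

    // One deferred context per worker thread; each is only ever touched by its own thread.
    for (UINT i = 0; i < threadCount; ++i)
        device->CreateDeferredContext(0, &deferredPerThread[i]);
}
```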
Multiple threads submit state and draw calls to their own Deferred Context, which compiles a display list that is eventually executed by the Immediate Context. Games will still need a render thread, and this thread will use the Immediate Context to execute state and draw calls and to consume the display lists generated by the Deferred Contexts. In this way, the ultimate destination of all state and draw calls is the Immediate Context, but fine-grained synchronization is handled by the API and the display driver so that parallel threads can better contribute to the rendering process. Deferred Contexts do have limitations: they cannot query the device, and they can't download or read back anything from the GPU. Deferred Contexts can, however, consume the display lists generated by other Deferred Contexts.
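A minimal sketch of that flow, assuming the documented D3D11 interfaces (the display list described above corresponds to an ID3D11CommandList in the API): a worker thread records into its deferred context and bakes the result into a command list, which the render thread later replays on the immediate context. The actual draw-call setup is elided.

```cpp
#include <d3d11.h>

// Worker thread: record state and draw calls into a deferred context,
// then bake them into a command list. The real rendering work is elided.
ID3D11CommandList* RecordWork(ID3D11DeviceContext* deferred)
{
    // ... deferred->IASetVertexBuffers(...), deferred->Draw(...), etc. ...

    ID3D11CommandList* commandList = nullptr;
    deferred->FinishCommandList(FALSE /*RestoreDeferredContextState*/, &commandList);
    return commandList;
}

// Render thread: replay the recorded calls on the one immediate context.
void SubmitWork(ID3D11DeviceContext* immediate, ID3D11CommandList* commandList)
{
    immediate->ExecuteCommandList(commandList, FALSE /*RestoreContextState*/);
    commandList->Release();
}
```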
The end result of all this is that the future will be more parallel-friendly. As two- and four-core CPUs become more and more popular, and 8- and 16-(logical-)core CPUs appear on the horizon, we need all the help we can get when trying to extract performance through parallelism. This is a good move for DirectX, and we hope it will help push game engines to more fully utilize more than two or even four cores when the time comes.
109 Comments
Logikal - Saturday, January 31, 2009 - link
Derek, for the most part I enjoyed your column. Some of it was over my head, but generally it was very informative. I believe it's "geek nature" to slander and pull bias in one direction or another regarding certain technology/software/hardware, but honestly, what you call "just information" is more personal opinion about your experiences with Vista than anything else. This is evident when you talk about "slow or poor adoption" or compare it to Win ME, as if you actually know for sure that was the case. I work for one of Canada's larger computer retail/corporate sales entities, and we had more people buying Vista at a faster pace than any other OS I can remember. The small number of technical issues that were monumentally blown out of proportion by... maybe... the competition (which you notably mentioned making larger strides in its technology :) caused more "slow adoption" of DX10 than MS's OS and its proportionately smaller problems themselves. I can admit bias, having jumped from Windows 2000 directly to Vista, in that I haven't had any real problems at all. Small ones, mind you (again, in proportion to the sheer vastness of what an OS needs to be today), sure. But honestly, Vista was a huge step in the right direction, and it did well all things considered. As an end user, if you had 2-year-old hardware, no, Vista was not going to run functionally. And if you were stuck in the past software-wise, or you were backward thinking in that way, then no, Vista wasn't going to work for you. But the step needed to be made, and today we are able to download Windows 7 (in its refined glory) and experience what the Vista motif can be. After all, look how long it took for them to polish up XP? :)
flexy - Saturday, January 31, 2009 - link
Logikal, an opinion is ALWAYS biased.
The interesting thing is that you want to counter his opinion, but you don't really give a solid argument; you simply point out that "...more people were buying Vista," which can hardly be taken as a serious argument, at least in regard to the technical details.
I share Derek's opinion. I am not "selling" Vista, but I have worked with all kinds of OSes for a long time already. The comparison to ME is legitimate, IMHO.
And yes, it's geek nature to be able to criticize, compare, and see shortcomings where the "common" man (the average customer) might not see any. That's a good thing.
flexy - Saturday, January 31, 2009 - link
that he is so open and just says it. Yes, Vista SUCKS <---
Related to the subject:
I haven't read the entire article (yet)... but tessellation reminds me somewhat of a few years ago when we had the Radeon 8500 with hardware tessellation features. What was it called again? TruForm.
Sadly, in practice this feature rather flopped.
ltcommanderdata - Saturday, January 31, 2009 - link
On the issue of OpenCL's potential in games being limited by the availability of DX11 Compute Shaders, I think one way to get OpenCL incorporated into games is to rewrite the Havok physics engine in OpenCL. For one thing, OpenCL was developed to run not only on GPUs but also on CPUs, so the Havok engine could be GPU-accelerated on newer hardware with a software CPU fallback for older GPUs, all with the same codebase. I'm not sure if DX11's Compute Shaders were designed to run on CPUs. OpenCL also has the largest target base, allowing Havok to maintain its portability across Windows (including XP), OS X, and Linux, compared to DX11, which would be limited to Vista and Windows 7. Havok itself is well placed to promote OpenCL: it's owned by Intel, who would support it for their increasingly multicore CPUs and Larrabee; it's actually backed by AMD over PhysX, and AMD is now also promoting OpenCL over their own Brook+ language; and using OpenCL would also allow compatibility with nVidia GPUs, unlike the proprietary CUDA-based PhysX. Havok already claims interoperability with 3ds Max, Maya, and XSI, which rely on OpenGL, so going to OpenCL would be a natural fit in the development ecosystem. The more general nature of OpenCL compared to DX11 Compute Shaders would also help expand Havok's market beyond gaming to simulation and science, which probably fits well within Intel's Visual Computing parent group. And of course, moving Havok to OpenCL may also encourage more game developers to use OpenGL, which probably isn't a bad thing.
I'm actually kind of surprised, with all the talk of DX10 not taking off because of the huge existing XP market, that developers didn't think to switch to OpenGL, since both nVidia and ATI exposed all the features of their DX10 GPUs in Windows XP through OpenGL extensions. The same will no doubt be true with DX11 GPUs, with features available in XP through OpenGL extensions.
bobjones32 - Friday, January 30, 2009 - link
Derek needs to stick with the hardware and stop making these ridiculous comments about software. "Rejected" Vista? Slow adoption? Porting DirectX 11 to XP? Give me a break. I thought it was common knowledge by now that the only people "rejecting" Vista are those buying into the sensationalist blogosphere that built the ridiculous perception in the first place. And since when is 150-200 million users of an OS in 2 years considered "slow adoption"?
And finally, I would have certainly expected Derek to understand how ridiculous the proposition of DX10 or DX11 on XP is. One of the fundamental design purposes of DX10 was, as Derek actually pointed out, to interface with Vista's overhauled driver model.
Are you really expecting Microsoft to spend millions in development resources to back-port a completely different driver model to an 8-year-old operating system just to make it possible for DX10/11 to run on there too?
*sigh* Please. Stick with the hardware. Anandtech's informative articles are not where I want to see butthurt opinions. Save it for your blog.
AlphaTango1 - Saturday, January 31, 2009 - link
Derek, great article and very informative; thanks for taking the time to put it together.
It's very interesting to see the changes in design and architecture from earlier DX versions, and improvements being made to assist in moving our current graphics forward.
It's hilarious to sit back and read through the replies and see the Vista purchasers barking out emotional comments, defending how they love using Vista on their own PC at home.
It's also funny to note that this is the main thing people have commented on from a multi-page article that goes into great depth about graphics and the future architecture our games will be utilising, and yet we have people still barking on about how much they love Vista.
Let’s forget your emotions for a minute, and the need to defend your purchase, or your love for Microsoft; and look at the actual outcomes to the user here in relation to graphics.
Derek's point about Vista is mainly that its deployment strategy, marketing, performance issues, and initial instability didn't help move DX10 forward to the point where we could significantly benefit from it. It also resulted in setting some tracks in place that actually continued the preference for DX9 over DX10.
I remember when DX10 was originally being marketed and pumped by Microsoft, and was going to deliver unparalleled graphics improvements to gamers. Comparison split-screen videos of games were shown, with DX9 and DX10 samples, and talk of "a field of individual blades of grass all moving individually," etc.
Well what has that actually provided you with today? What benefits are you seeing in your games now, versus the natural improvements developers have made over time by learning how to use DX9 even better? I've read countless articles comparing DX9 and DX10 versions of the same games, with screen shots in 1920x1280 and above, and you often literally have to sit there for a good minute to actually see some slightly extra textured dirt, or a slight increase in transparency in the water. "Oh, ok there's the DX10 feature...Hmmm was that worth the 20fps hit in performance?" These improvements are not substantial, and wouldn't be noticed during gaming anyhow.
Fact is, DX10 has not delivered on the promises and marketing hype; it will end up being purely a stepping stone in evolving DX11 into what it is.
For all of you now rushing to type replies of all those extra texture changes you've noted in your DX10 games, seriously don't bother. Even if you can come up with a few examples, you'd be fooling yourself to think DX10 has actually made a significant change to current games, versus the millions of dollars and thousands of person-hours spent developing it.
You'll also be madly trying to type a reply to defend yourself for the extra $$'s you've spent on hardware to get that DX10 game running, while your mate running the same game on XP with DX9 is laughing at you trying to defend your water transparency.
On that point, yes, XP was shaky in its first period, but that's simply why you never adopt a new OS until an SP or two has come out; otherwise you're just a lemming beta tester for MS. My gaming group has 20+ players, and those of us still on XP had many laughing and banter sessions during the first year or so of Vista. We'd regularly be in-game playing while the guys with Vista were still trying to get the sound driver working with their new game, or getting the fps to a playable level (and these were IT people). I was always rock solid on XP, with DX9, and my game looked just as good and ran faster, and that to me is a good way to measure the success of something. What improvement was Vista and DX10 going to give me, and why in the heck would I bother changing!?!
Yes, agreed! Vista is probably quite fine for most people to use now, and the issues will have been ironed out. But back to the graphics topic: has it been worth it in relation to the DX10 experience received? This is what this should be about, not how much you love Vista! The question is, has your Vista vehicle and the associated DX10 version provided you with anything substantial? (Besides some Aero graphics interface to make you feel better for buying it.)
From seeing and using both, I can say the answer would be a 'No', versus the cost you paid for Vista, the cost of extra hardware required, the marketing hype and spend; and then wrapping that up and comparing it to users on XP SP3 playing games on DX9. The justification ego of people and the effect of marketing can be a funny thing to watch.
I suggest some of you go and actually read some articles from industry researchers and advisors such as Gartner, and the view on Vista in the industry, and advice to business on selection. Sorry to say, but it's a little bigger than your experience on your little PC at home.
From reading Derek's article, I think he's trying to put forward the fact that DX11 has the potential to actually deliver on those DX10 promises, and that we may actually see some results and big changes in our gaming experiences. He is also saying that if the vehicle on which DX11 will be released (i.e., an OS; mainly Win 7 in this case) is a success, then we have a good chance of it also being adopted in greater numbers by the developers of the games we'll be playing.
DerekWilson - Saturday, January 31, 2009 - link
I NEVER said DX11 would or could be ported to WinXP as it is. However, there really is little reason that MS couldn't have chosen to design DX10 so that applications written to target DX10 could still run on XP. But they did not. For the sake of DirectX I think this was the wrong decision.
The change in driver model is largely an issue of how the graphics driver is implemented to handle DX10. Graphics IHVs developed new DX9 drivers for Vista using WDDM while they still had XP DX9 drivers using the old model. If MS had divorced the API from the driver model changes, they could have implemented DX10 on both XP and Vista with no problem, and we'd see higher adoption today.
There is NO technical reason that the functionality in DX10 that is useful for graphics programming (pipeline changes, numbers of registers and resource constraints, stream out, programmable AA, etc.) could not have been implemented without requiring a new driver model.
Let me paint the picture: there is no reason you need fine-grained power management and linked adapters to add the GS to the pipeline. Developers could have had a software interface that exposed the functionality of the hardware without the new driver model. Case in point: OpenGL on Windows XP is able to exploit all DX10 hardware functionality that is not tied to the driver model (read: everything that is really useful for programming a game).
So, yes, it would be stupid and wasteful to back port the driver model to XP. But it was stupid and counterproductive to design DX10 in a way that required the new driver model rather than being capable of running under both driver models (like DX9/DX9L).
Microsoft wanted DX10 to push upgrades. They wanted to give people a reason to buy a new OS. But by doing this they instead hampered the uptake of DX10.
LeStuka - Friday, January 30, 2009 - link
Vista adoption was slow. Driver support wasn't very good early on (not really Microsoft's fault). It's resource intensive and runs more slowly than people expected. It performed tasks more slowly than other OSes. SP1 fixed a lot of issues - people seem to forget what it was like pre-SP1.
Microsoft has admitted that Vista wasn't all it should have been. Why do you think Windows 7 is coming so quickly to market? Hint: It's not because Vista was a huge success and is still bringing in mountains of cash.
Also, the extended and re-extended support & supply of XP. Why? From a marketing perspective it makes no sense if Vista is a successful OS.
I'd like to see a link to this "150-200 million users" article/quote (that doesn't include bundled copies that were "down"graded to XP right?)
It'd also be interesting to see how many computers are running XP..
Derek sounds like he's just telling it like it is to me.
You sound like just another one of those internet tossers who take digs at other people's work for the sake of it, just because it's there.
Are you sure you want to post this reply?
Are you sure you're sure?
Are you suuure...?
bobjones32 - Saturday, January 31, 2009 - link
It doesn't matter what the OS *was*; the only thing that matters is what the OS *IS*. Today. On January 30th, 2009.
XP was an utter piece of crap for the first two years of its release. Unstable, insecure, and far worse compatibility and driver support than Vista ever dealt with. But of course, you're not judging XP today based on XP on release, are you? Of course not. On the contrary, you're comparing XP now to Vista on release. Hardly fair, and absolutely not relevant.
Why do I think Windows 7 is coming so quickly? Based on what? You realize that the Windows Vista --> Windows 7 timeframe (~3 years) is *longer* than any other release of Windows outside of XP-->Vista, right? You realize that's nearly twice as long as any subsequent release of OS X, right?
Extension of XP, why? Because people are dumb enough to continue to think they need it. Microsoft already lost the perception war, may as well milk the uneducated while they can. However, the 150-200 million number is absolutely accurate:
http://www.microsoft-watch.com/content/vista/vista...
Microsoft had shipped 150 million licenses of Vista as of May 2008. If you don't think that number of people are *using* Vista by now, 8 months later, then you are completely oblivious.
Derek is not "telling it like it is." He's buying into the anti-Vista sensationalist nonsense without using real facts and figures to back it up, making ridiculous assumptions in lieu of evidence, and making impossibly absurd requests like asking for DX10/11 on XP.
leexgx - Saturday, January 31, 2009 - link
Most of my customers prefer XP over Vista. With AHCI being used and no option to turn it off, it's a big fuss to get XP onto new OEM computers now. To run Vista you need 2GB of RAM, a good hard disk (250GB), and a dual-core CPU, yet OEMs are selling systems with single-core CPUs and 1GB of RAM with 256MB of shared video on an Intel 965 IGP that only needs 64MB for Aero. No one likes Vista on the lower-end laptops.
Also, this comment box has not been tested with Opera (the box is too small).