Beginnings of the Holodeck: AMD's DX11 GPU, Eyefinity and 6 Display Outputs
by Anand Lal Shimpi on September 10, 2009 2:30 PM EST - Posted in GPUs
The First Generation Holodeck by 2016
When AMD first briefed me on RV770, it shared another interesting story. For the past several years AMD (and ATI before it) has been obsessed with figuring out when it would be possible to render an image so convincing that it was mostly indistinguishable from reality.
Given the right art and a good technique to render the scene, this is totally possible not only within our lifetimes but within the next decade. Unfortunately, that's not enough.
Carrell estimates that the human eye can directly resolve around 7 million pixels, almost twice the resolution of a 30" display. But that's just what the eye is directly focusing on; add peripheral vision and the total climbs to around 100 million pixels. The Eyefinity demo I showed earlier was running at 24.5 million pixels on a single GPU. Extrapolate and this generation should manage about 50 million pixels with two GPUs, and one more generation from now we'll hit that 100 million pixel mark; call it two years for a single GPU to get there. Then give it a few more years before we can render that many pixels with enough scene complexity to actually look real.
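If you want to sanity-check those numbers, here's a quick back-of-the-envelope sketch (my own arithmetic, assuming GPU throughput simply doubles each generation) that gets from today's demo to the 100 million pixel mark:

```python
# Back-of-the-envelope pixel math (assumes throughput doubles per generation).
panels, width, height = 6, 2560, 1600
pixels = panels * width * height          # 24,576,000 -- the ~24.5M demo figure

for generation in range(3):
    print(f"generation {generation}: {pixels / 1e6:.1f} megapixels")
    pixels *= 2
# generation 0: 24.6 megapixels  (today, one GPU)
# generation 1: 49.2 megapixels  (~50M: two GPUs today, or one GPU next generation)
# generation 2: 98.3 megapixels  (close enough to the 100 million pixel mark)
```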
Rendering at the maximum resolution the human eye can resolve isn't enough, however; you have to feel immersed in the graphics. That's where Eyefinity comes in, or at least where it starts to.
Carrell believes that in seven years we can have the first generation Holodeck up and running. For those of you who aren't familiar with the Trek reference, Carrell believes it'll take seven years to be able to deliver a 180-degree hemispherical display (you're not completely surrounded by displays, but your forward and peripheral vision is covered) with positionally accurate and phase-accurate sound (both calculated by the GPU in real time). The GPU will also be used to recognize speech, track gestures and track eye movement/position.
It doesn't solve the issue of not being able to walk forward indefinitely, but again this is only the first generation Holodeck.
Eyefinity isn't anywhere close, but if you understand the direction, it's a start.
We're at six 2560 x 1600 displays today; is it too far-fetched to imagine a totally immersive display setup that renders at life-like resolutions?
First person shooters pretty much dictate that you'll need an odd number of displays to avoid your crosshairs spanning multiple monitors. With three displays you can begin to get the immersion effect, but buy five and you'll be completely surrounded by your game. And as I mentioned before, it doesn't require any special application or OS support; the drivers take care of everything, and it just appears as a single, large surface.
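To make that single-large-surface point concrete, here's a minimal sketch (my illustration, not AMD code; it assumes a Windows machine with an Eyefinity group active): an ordinary application that asks for the primary display size simply gets the combined surface back, because the driver presents the whole group as one logical monitor.

```python
# Minimal sketch: query the primary display size on Windows via ctypes.
# With an Eyefinity group active, the driver exposes the group as a single
# logical monitor, so this should report the combined resolution.
import ctypes

user32 = ctypes.windll.user32
SM_CXSCREEN, SM_CYSCREEN = 0, 1   # standard Win32 metric indices

width = user32.GetSystemMetrics(SM_CXSCREEN)
height = user32.GetSystemMetrics(SM_CYSCREEN)
print(f"Primary display: {width} x {height}")
# e.g. "Primary display: 7680 x 3200" for a 3x2 group of 2560x1600 panels
```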
It seems trivial, but honestly we haven't had the ability to easily drive the ridiculous display setups we always see in sci-fi movies. Eyefinity at least makes it look like we can build the PCs from The Matrix.
Will it succeed? Who knows. Does it sound gimmicky? Sure. Is it cool? Yeah, I'd say so.
If panel prices could drop far enough that putting together an Eyefinity display setup didn't cost more than the graphics card, I think it'd be a much easier sell. Obviously AMD's next-generation GPU is more than just Eyefinity, but you'll hear about the rest late this month.
137 Comments
strikeback03 - Thursday, September 10, 2009 - link
As I understand it, you can't just maximize windows into screens. The applications and OS don't know your desktop is spread across several screens, so using maximize will maximize the window to cover all available screens. Which kinda sucks if you want several different windows each fullscreen on their own monitor.
HelToupee - Thursday, September 10, 2009 - link
Can you have Windows manage the monitors instead of going through this AMD software trickery? I would imagine you would want them to appear to Windows as 6 separate monitors, but then turn on the AMD single-surface-whizbang when you launched an OpenGL or DirectX app.
I see they're touting Linux support for this. I hope they start taking their Linux drivers more seriously.
This will be huge news for guys using DMX and Chromium (not the Google browser, the other Chromium) to do the giant wall-o-monitors displays.
Lonyo - Thursday, September 10, 2009 - link
It says that you can manage windows within the software.
In the article it mentions taking 6 monitors and dividing them so 4 form one "screen" and the other two form a second "screen". I presume that means within each grouping applications would maximise as they would if you had 2 physical monitors and were letting Windows control them.
It's like a grid system (which already exists within at least the NV driver options) but spread across multiple monitors in groups, I would assume.
Windows will see whatever ATI's drivers show it, so if ATI lets you arrange monitors into groupings, that's what Windows will see.
Lonyo - Thursday, September 10, 2009 - link
This sort of setup isn't ideal for all games, and I doubt anyone would argue it is, but it is great for some titles.
In RTS games the borders don't matter.
In racing games the borders don't really matter, the added visual field is very advantageous if you are using an in-car view. Going from a 20" flanked by two 17" monitors to a single 24", I notice the loss of peripheral vision, and the monitor breaks weren't far off the roof pillar positions anyway.
In flight sims, as the article says, not a problem.
In FPS games maybe it will be more of a nuisance, but not every game is an FPS.
imaheadcase - Thursday, September 10, 2009 - link
I would think it would be a problem in ANY game. It even looks like it from the screenshots in the article.
Look, the best way to implement a multi-monitor setup is what has been done in games already. Supreme Commander is a good example: you make a monitor setup with each monitor acting independently of the others.
Open map on one monitor, game on other, stats on one, etc.
Having one large screen with bezels like that isn't impressive and doesn't work to the user's advantage in a game. Having a multi-monitor setup where the game outputs the scenes you want to each monitor would be far more impressive. There are so many more ways a game could take advantage of that.
A game that implemented those features well into its gameplay would drive sales of these setups in the gaming market. But until then, it's never going to take off.
Just my opinion :P
zebrax2 - Thursday, September 10, 2009 - link
I agree with Lonyo. The bezels aren't really that intrusive in some games. A 3x2 setup in an FPS would be a PITA, as that would mean my crosshair is cut in two, but a 3x1 setup is just dandy.
skrewler2 - Thursday, September 10, 2009 - link
Define 'maxed out'. Was multisampling maxed and triple buffering turned on? I can't imagine a single card could drive that resolution at 80fps. If so, wow, NVIDIA is in serious trouble and AMD is going to be getting -a lot- of customers.
skrewler2 - Thursday, September 10, 2009 - link
Ok, derr, they obviously didn't have triple buffering turned on if they were getting 80fps.
therealnickdanger - Friday, September 11, 2009 - link
Not necessarily. Simply put, triple buffering still lets the GPU push as many frames as it can, but frames that aren't synced to the display refresh get thrown out. So while the GPU may be rendering at 80fps internally, you only see 60fps (assuming a 60Hz display).
This explains it better:
http://www.anandtech.com/video/showdoc.aspx?i=3591...
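A toy simulation makes the point (my own sketch, not from the linked article; it ignores buffer-count details): the GPU completes frames at 80fps, each 60Hz refresh scans out the newest completed frame, and the frames in between are simply dropped.

```python
# Toy model of triple buffering: the GPU renders at its own rate because a
# spare back buffer is always available; each refresh shows the newest
# completed frame and earlier unshown frames are discarded.
GPU_FPS, REFRESH_HZ = 80, 60

finish_times = [i / GPU_FPS for i in range(GPU_FPS)]   # when each frame completes (1s window)
displayed = set()
for tick in range(REFRESH_HZ):
    t = tick / REFRESH_HZ
    ready = [f for f, ft in enumerate(finish_times) if ft <= t]
    if ready:
        displayed.add(ready[-1])   # scan out the most recently completed frame

print(f"rendered {GPU_FPS} frames, displayed {len(displayed)}")
# -> rendered 80 frames, displayed 60 (the other 20 are never shown)
```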
snakeoil - Thursday, September 10, 2009 - link
Did you notice that Mr. Shimpi is only impressed by anything Intel says? Like the Lynnfield review. Come on, "harder, stronger, longer," give me a break.
Anyway, not that I care.