Beginnings of the Holodeck: AMD's DX11 GPU, Eyefinity and 6 Display Outputs
by Anand Lal Shimpi on September 10, 2009 2:30 PM EST - Posted in GPUs
The First Generation Holodeck by 2016
When AMD first told me about the RV770, they told me another interesting story. For the past several years AMD (and ATI before it) has been obsessed with trying to figure out when it would be possible to render an image so convincing that it was (at least mostly) indistinguishable from reality.
Given the right art and a good technique to render the scene, this is totally possible not only within our lifetimes but within the next decade. Unfortunately, that's not enough.
Carrell estimates that the human eye can directly resolve around 7 million pixels, almost twice the resolution of a 30" display. But that's just what it's directly focusing on; add in all of the peripheral vision and the total comes up to around 100 million pixels. The Eyefinity demo I showed earlier was running at 24.5 million pixels on a single GPU; you can estimate that this generation we'll be able to do about 50 million pixels with two GPUs, and one more generation from now we'll get to that 100 million pixel marker. That's two years for a single GPU. Then give it a few more years to be able to render that many pixels but with enough complexity to actually look real.
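For the curious, here's a quick back-of-the-envelope check on those numbers (my own illustrative sketch in Python, not anything from AMD): six 2560 x 1600 panels work out to roughly the 24.5 million pixel figure above, and you can see how far that sits from Carrell's 7 million and 100 million pixel estimates.

```python
# Rough pixel-count arithmetic behind the figures above (illustrative only).

def megapixels(width, height, count=1):
    """Total pixels across `count` identical displays, in millions."""
    return width * height * count / 1e6

single_30in = megapixels(2560, 1600)        # one 30" panel: ~4.1 MP
eyefinity_demo = megapixels(2560, 1600, 6)  # six-panel demo: ~24.6 MP

foveal_estimate = 7        # Carrell's estimate for what the eye directly resolves (MP)
peripheral_estimate = 100  # estimate including peripheral vision (MP)

print(f"30-inch panel:       {single_30in:.1f} MP")
print(f"Six-panel Eyefinity: {eyefinity_demo:.1f} MP")
print(f"Foveal estimate:     ~{foveal_estimate} MP "
      f"({foveal_estimate / single_30in:.1f}x a 30-inch panel)")
print(f"Peripheral estimate: ~{peripheral_estimate} MP "
      f"({peripheral_estimate / eyefinity_demo:.1f}x the demo)")
```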
Rendering something at the max resolution that the human eye can resolve isn't enough, however; you have to feel immersed in the graphics. That's where Eyefinity comes in, or at least where it starts to come in.
Carrell believes that in seven years we can have the first generation Holodeck up and running. For those of you who aren't familiar with the Trek reference, Carrell believes it'll take seven years to be able to deliver a 180 degree hemispherical display (you're not completely surrounded by displays but at least your forward and peripheral vision is) with positionally accurate and phase accurate sound (both calculated by the GPU in real time). The GPU will also be used to recognize speech, track gestures and track eye movement/position.
It doesn't solve the issue of not being able to walk forward indefinitely, but again this is only the first generation Holodeck.
Eyefinity isn't anywhere close, but if you understand the direction, it's a start.
We're at six 2560 x 1600 displays today; is it too far-fetched to imagine a totally immersive display setup that renders at life-like resolutions?
First person shooters pretty much dictate that you'll need an odd number of displays to avoid your crosshairs spanning multiple monitors. With three displays you can begin to get the immersion effect, but buy five and you'll be completely surrounded by your game. And as I mentioned before, it doesn't require any special application or OS support; the drivers take care of everything, and it just appears as a single, large surface.
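To illustrate the odd-versus-even point, here's a hypothetical sketch (my own, not AMD's driver logic, with an assumed per-panel resolution) that computes the single large surface for a one-row display group and checks whether its center, where a shooter's crosshair sits, lands on a panel or on the seam between two of them.

```python
# Hypothetical illustration of why an odd display count keeps the crosshair
# off a bezel seam. Panel resolution and group layout are assumed values.

PANEL_W, PANEL_H = 2560, 1600   # per-panel resolution (pixels)

def surface_size(columns, rows=1):
    """Resolution of the single large surface the drivers expose."""
    return PANEL_W * columns, PANEL_H * rows

def crosshair_on_seam(columns):
    """True if the horizontal center of the surface falls between two panels."""
    width, _ = surface_size(columns)
    center_x = width / 2
    # Seams sit at every panel boundary: x = PANEL_W, 2*PANEL_W, ...
    return center_x % PANEL_W == 0

for cols in (2, 3, 4, 5):
    w, h = surface_size(cols)
    spot = "on a bezel seam" if crosshair_on_seam(cols) else "in the middle of a panel"
    print(f"{cols} x 1 group -> {w} x {h} surface, crosshair lands {spot}")
```

With an even column count the center always falls exactly on a panel boundary, which is why three or five displays are the sensible choices for gaming.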
It seems trivial but honestly we haven't had the ability to easily support the ridiculous display setups we always see in sci-fi movies. Eyefinity at least makes it look like we can build the PCs from the Matrix.
Will it succeed? Who knows. Does it sound gimmicky? Sure. Is it cool? Yeah, I'd say so.
If panel prices could drop significantly enough that putting together an Eyefinity display setup didn't cost more than the graphics card, I think it'd be a much easier sell. Obviously AMD's next-generation GPU is more than just Eyefinity, but you'll hear about the rest later this month.
137 Comments
MadMan007 - Friday, September 11, 2009 - link
Yes it's nuts and you can thank ATi for finally upping the texture units and ROPs. I think they've been the same in number, although they've gotten faster, since the x1k series!
Golgatha - Thursday, September 10, 2009 - link
No, I would agree that this resolution with those kinds of framerates is just nuts.
skrewler2 - Thursday, September 10, 2009 - link
Yeah, seriously, the performance is unreal. I'm wondering if the settings were really maxed out..
AznBoi36 - Thursday, September 10, 2009 - link
Probably were. But look, it's a lvl 1 toon wandering the world. I wanna see the performance in a 25 man raid or simply wandering around Dalaran. Oh and the shadows better be maxed too!
theslug - Thursday, September 10, 2009 - link
I could see it being practical if the bezel of each monitor wasn't visible. They would need the actual LCD panels attached to one another instead.
HelToupee - Thursday, September 10, 2009 - link
LCD's require control electronics around all 4 sides, making the bezel a necessity. It could easily be 1/4 the width of current monitors. I messed around with stitching the images from 3 rear-mounted projectors together. The image was seamless, but the price would be astronomical. That, and you have to have a VERY good screen to project on to, or all your wonderful resolution gets muddied.
USRFobiwan - Friday, September 11, 2009 - link
How about the Samsung 460UTn with just 4mm bezels...
mczak - Friday, September 11, 2009 - link
Or the Nec X461UN, which looks very similar (btw you don't need the 460UTn, the 460UT would do as there's no use for the built-in PC in this scenario). Those are really expensive (>5000 USD), are huge and low-res (1366x768). That's really for big video walls, not suitable for some monster gaming setup. But really, it shouldn't be much of a problem manufacturing 24 inch or so tfts with similar slim bezels. There just hasn't been a market for this up to now...
snakeoil - Thursday, September 10, 2009 - link
wow this is spectacular. intel is in big trouble because intel graphics are pretty much garbage while amd's graphics are real gems.
TA152H - Thursday, September 10, 2009 - link
You make a good point, but the other side of the coin is also true - Intel processors are very strong, and AMD processors suck by comparison. It's a pity ATI stopped making chipsets for Intel motherboards. They'd make money, Intel would still sell processors, and the only real loser would be NVIDIA. It's surprising how many chipsets they sell. I don't know many people who would buy NVIDIA chipsets, like most people, but it seems they sell them well with HP and Dell, where no one asks or knows the difference. ATI should really make chipsets for the Atom too. That would be a great combination.