Beginnings of the Holodeck: AMD's DX11 GPU, Eyefinity and 6 Display Outputs
by Anand Lal Shimpi on September 10, 2009 2:30 PM EST - Posted in GPUs
Wanna see what 24.5 million pixels looks like?
That's six Dell 30" displays, each with an individual resolution of 2560 x 1600. The game is World of Warcraft and the man crouched in front of the setup is Carrell Killebrew, his name may sound familiar.
Driving all of this is AMD's next-generation GPU, which will be announced later this month. I didn't leave out any letters, there's a single GPU driving all of these panels. The actual resolution being rendered at is 7680 x 3200; WoW got over 80 fps with the details maxed. This is the successor to the RV770. We can't talk specs but at today's AMD press conference two details are public: 2.15 billion transistors and over 2.5 TFLOPs of performance. As expected, but nice to know regardless.
The technology being demonstrated here is called Eyefinity and it actually all started in notebooks.
Not Multi-Monitor, but Single Large Surface
DisplayPort is gaining popularity. It's a very simple interface and you can expect to see mini-DisplayPort on notebooks and desktops alike in the very near future. Apple was the first to embrace it but others will follow.
The OEMs asked AMD for up to six DisplayPort outputs from their notebook GPUs: up to two internally for notebook panels, up to two externally for connectors on the side of the notebook, and up to two for use via a docking station. To meet these needs AMD had to build six DisplayPort outputs into its GPUs, driven by a single display engine. A single display engine could drive any two outputs, similar to how graphics cards work today.
Eventually someone looked at all of the outputs and realized that without too much effort you could drive six displays off of a single card - you just needed more display engines on the chip. AMD's DX11 GPU family does just that.
At the bare minimum, the lowest end AMD DX11 GPU can support up to 3 displays. At the high end? A single GPU will be able to drive up to 6 displays.
AMD's software makes the displays appear as one. This will work in Vista and Windows 7, as well as Linux.
The software layer makes it all seamless. The displays appear independent until you turn on SLS (Single Large Surface) mode. When it's on, they appear to Windows and its applications as one large, high resolution display. There's no multi-monitor mess to deal with; it just works. This is the way to do multi-monitor, both for work and games.
Note the desktop resolution of the 3x2 display setup
I played Dirt 2, a DX11 title, at 7680 x 3200 and frame rates were definitely playable. I played Left 4 Dead and the experience was much better. Obviously this new GPU is powerful, although I wouldn't expect it to run everything at super high frame rates at 7680 x 3200.
Left 4 Dead in a 3 monitor configuration, 7680 x 1600
If a game pulls its resolution list from Windows, it'll work perfectly with Eyefinity.
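To make that concrete, here's a minimal sketch (my own illustration, not AMD's or any game's actual code) of the Win32 mode enumeration many games use to build their resolution menus; with SLS enabled, the combined Eyefinity mode would simply show up as another entry in this list:

```cpp
// Enumerate the display modes of the primary display, the same data many
// games use to populate their resolution drop-down. Under an Eyefinity
// Single Large Surface the grouped panels report as one display, so the
// combined mode (e.g. 7680 x 3200) should appear here like any other.
#include <windows.h>
#include <cstdio>

int main() {
    DEVMODE dm = {};
    dm.dmSize = sizeof(dm);
    for (DWORD i = 0; EnumDisplaySettings(NULL, i, &dm); ++i) {
        printf("%lu x %lu @ %lu Hz\n",
               dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);
    }
    return 0;
}
```

Games that hardcode a fixed list of resolutions, on the other hand, presumably won't offer the SLS mode without a patch.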
With six 30" panels you're looking at several thousand dollars worth of displays. That was never the ultimate intention of Eyefinity, despite its overwhelming sweetness. Instead the idea was to provide gamers (and others in need of a single, high resolution display) the ability to piece together a display that offered more resolution and was more immersive than anything on the market today. The idea isn't to pick up six 30" displays but perhaps add a third 20" panel to your existing setup, or buy five $150 displays to build the ultimate gaming setup. Even using 1680 x 1050 displays in a 5x1 arrangement (ideal for first person shooters apparently, since you get a nice wrap around effect) still nets you a 8400 x 1050 display. If you want more vertical real estate, switch over to a 3x2 setup and then you're at 5040 x 2100. That's more resolution for less than most high end 30" panels.
Any configuration is supported; you can even group displays together. So you could turn a set of six displays into a group of four and a group of two.
It all just seems to work, which is arguably the most impressive part of it all. AMD has partnered up with at least one display manufacturer to sell displays with thinner bezels and without distracting LEDs on the front:
A render of what the Samsung Eyefinity-optimized displays will look like
We can expect brackets and support from more monitor makers in the future. Building a wall of displays isn't exactly easy.
137 Comments
strikeback03 - Thursday, September 10, 2009 - link
As I understand it, you can't just maximize windows into individual screens. The applications and OS don't know your desktop is spread across several screens, so using maximize will maximize the window to cover all available screens. Which kinda sucks if you want to have several different windows each fullscreen on their own monitor.
HelToupee - Thursday, September 10, 2009 - link
Can you have Windows manage the monitors instead of going through this AMD software trickery? I would imagine you would want them to appear to Windows as 6 separate monitors, but then turn on the AMD single-surface-whizbang when you launched an OpenGL or DirectX app. I see they're touting Linux support for this. I hope they start taking their Linux drivers more seriously.
This will be huge news for guys using DMX and Chromium (not the Google browser, the other Chromium) to do the giant wall-o-monitors displays.
Lonyo - Thursday, September 10, 2009 - link
It says that you can manage windows within the software. In the article it mentions taking 6 monitors and dividing them so 4 are one "screen" and the other two form a "second" screen. I presume that means within each grouping applications would maximise as they would if you had 2 physical monitors and were letting Windows control them.
It's like a grid system (which already exists within at least the NV driver options) but spread across multiple monitors in groups, I would assume.
Windows will see whatever ATI get their drivers to show it, so if ATI allow you to manipulate monitors into groupings, that's what Windows will see.
Lonyo - Thursday, September 10, 2009 - link
This sort of setup isn't ideal for all games, and I doubt anyone would argue it is, but it is great for some titles. In RTS games the borders don't matter.
In racing games the borders don't really matter, the added visual field is very advantageous if you are using an in-car view. Going from a 20" flanked by two 17" monitors to a single 24", I notice the loss of peripheral vision, and the monitor breaks weren't far off the roof pillar positions anyway.
In flight sims, as has been said in the article, not a problem.
In FPS games maybe it will be more of a nuisance, but not every game is an FPS.
imaheadcase - Thursday, September 10, 2009 - link
I would think it would be a problem in ANY game. It even looks like it from the screenshots in the article. Look, the best way to implement a multi-monitor setup is what has been done in games already. Supreme Commander is a good example: you make a monitor setup with each monitor acting independently of the others.
Open map on one monitor, game on other, stats on one, etc.
Having a large screen with bezels like that isn't impressive and doesn't work to the user's advantage in a game. Having a multi-monitor setup with the game outputting the scenes you want to each monitor would be far more impressive. There are so many more ways a game could take advantage of that.
A game that implemented those features well into its gameplay would drive sales of these setups in the gaming market. But until then it's never going to take off.
Just my opinion :P
zebrax2 - Thursday, September 10, 2009 - link
I agree with lonyo. The bezels aren't really that intrusive in some games. A 3x2 setup in an FPS would be a PITA, as that would mean my crosshair is cut in two, but a 3x1 setup is just dandy.
skrewler2 - Thursday, September 10, 2009 - link
Define 'maxed out'. Was multisampling maxed and triple buffering turned on? I can't imagine a single card could drive that resolution at 80fps. If so, wow, nvidia is in serious trouble and AMD is going to be getting -a lot- of customers.
skrewler2 - Thursday, September 10, 2009 - link
Ok, derr, they obviously didn't have triple buffering turned on if they were getting 80fps.
therealnickdanger - Friday, September 11, 2009 - link
Not necessarily. Simply put, triple buffering still allows the GPU to push as many frames as it can, but it throws out frames not synched to the display frequency. So while the GPU may be rendering at 80fps internally, you only see 60fps (assuming a 60Hz display). This explains it better:
http://www.anandtech.com/video/showdoc.aspx?i=3591...
snakeoil - Thursday, September 10, 2009 - link
Did you notice that Mr. Shimpi is only impressed by anything Intel says? Like the Lynnfield review; come on, "harder, stronger, longer," give me a break.
Anyway, not that I care.