ATI's X8xx CrossFire Graphics Arrive
by Derek Wilson on September 26, 2005 1:00 PM EST - Posted in GPUs
Super AA and CrossFire
By rendering the same frame on both cards with different subpixel sample patterns and blending the results, CrossFire can produce a smoother image at any given resolution than a single card can render. At the same time, as resolution increases and pixel size shrinks, antialiasing becomes less important. This is yet more support for the argument that resolutions above 1600x1200 should be supported on a high-end setup like this.

For those who will run CrossFire at 1280x1024 or even 1600x1200, Super AA will be a welcome addition to image quality. ATI already leads the industry in AA quality because it uses a programmable sample pattern when antialiasing a pixel, while NVIDIA uses a fixed ordered-grid approach. The difference really starts to add up when SLI AA and Super AA are compared.
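To make the mechanism concrete, here is a minimal sketch of the blending step in Python. The frame data and the software averaging are illustrative assumptions; in the actual hardware, the compositing engine on the CrossFire master card performs this merge, not software.

```python
import numpy as np

def super_aa_blend(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Average two frames rendered with offset subpixel sample patterns.

    Each card renders the same scene with its AA sample grid jittered to
    different subpixel positions; averaging the resolved images yields an
    effective sample pattern with twice as many points per pixel.
    """
    assert frame_a.shape == frame_b.shape
    # Sum in a wider type to avoid overflow when adding 8-bit channels.
    blended = (frame_a.astype(np.uint16) + frame_b.astype(np.uint16)) // 2
    return blended.astype(frame_a.dtype)

# Stand-in frames (random data here): blending two 6xAA renders gives the
# 12 geometry samples underlying the top Super AA mode discussed below.
master = np.random.randint(0, 256, (1200, 1600, 3), dtype=np.uint8)
slave = np.random.randint(0, 256, (1200, 1600, 3), dtype=np.uint8)
result = super_aa_blend(master, slave)
```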
From left to right: 4xAA, 8x SLI AA/10x Super AA, 16x SLI AA/14x Super AA. ATI is the top row.
The more even spread that ATI is able to maintain over a single pixel gives the CrossFire solution a better result. Even though NVIDIA's 16x SLI AA has 16 geometry sample points and 4 texture sample points compared to ATI's 12 geometry points and 2 texture points, the distribution of ATI's sample points provides more efficient coverage. Incidentally, if NVIDIA named its modes the way ATI does (counting texture samples as well as geometry samples), 16x SLI AA would be called 20xAA rather than 16xAA.
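The naming arithmetic is easy to verify from the sample counts above. A quick tabulation, using only the figures quoted in this article:

```python
# Sample counts per pixel for each vendor's top dual-card AA mode,
# as given in the text above.
modes = {
    # mode name: (geometry samples, texture samples)
    "ATI 14x Super AA":  (12, 2),
    "NVIDIA 16x SLI AA": (16, 4),
}
for name, (geo, tex) in modes.items():
    print(f"{name}: {geo} geometry + {tex} texture = {geo + tex} total")
# ATI's naming sums both kinds of samples (12 + 2 = 14x), while NVIDIA's
# counts geometry samples only; counted ATI-style, NVIDIA's 16x mode
# works out to 16 + 4 = 20 samples -- hence the "20xAA" figure.
```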
NVIDIA's 4 texture sample points (essentially supersampling/SSAA sample points) could provide better interior, texture, and transparent-surface antialiasing. Unfortunately, their arrangement limits their usefulness in this regard. Thus we have to declare ATI the clear winner in the AA department. Of course, Super AA mode does take quite a performance hit, as we will find out later. But take a look at what it can do to Half-Life 2 at 800x600 (no AA, 4xAA, 6xAA, 10xAA, 14xAA from top to bottom):
While differences beyond 4xAA are harder to spot, take a look at the antennae on the bottom-left of the images. You can see how the barely-visible parts are rendered better, particularly with the 10xAA and 14xAA modes. If you have the GPU performance to handle such features, they're a nice addition.
The downside of Super AA (aside from the performance hit) is that it only runs in full-screen applications. Windowed applications are still stuck with 2xAA, 4xAA, 6xAA, or no AA. Even if a Super AA mode is selected, only half the subpixel samples are used in a window. We are not sure whether this is a hardware or software limitation, but those of us who play MMORPGs in the background will need to be aware of this issue.
76 Comments
Pete - Monday, September 26, 2005 - link
Dangher, you won't find an article to support your claims. It was speculated (in many a forum, and possibly by Josh at Penstarsys) that AFR could double XF's single-link TMDS refresh rate or resolution by interleaving frames, but that's been ruled out, as apparently the RAMDAC must run at the TMDS engine's rate, and the CE doesn't have enough buffer to support RAMDAC refresh rates independent of the TMDS engine. So, I'd be surprised if you do.
And Derek won't be sued for libel unless he intentionally published false info. I'm sure much of his info came from ATI themselves, as well as hands-on experience (which shows a 16x12@60Hz limit across the review-site board).
JarredWalton - Monday, September 26, 2005 - link
I think there has been speculation about what could be done with additional low-level hardware and driver tweaks. For now, X8xx Crossfire does not appear to have any support for anything beyond 1600x1200@60Hz. That's terrible, in my opinion. I have a 9 year old 21" CRT that can run 1600x1200@75Hz. Anyone who has the money to buy Crossfire is highly likely to have a better monitor than that. Meanwhile, my 2405FPW may only run at 60Hz, but the lack of 1920x1200 output makes X850 Crossfire a definite no.

My only hope is that ATI has spent more effort on R520 Crossfire and will manage to support at least 2048x1536@85Hz. That's about where top quality CRTs max out, and there are far more 22" CRT owners than Apple 30" Cinema Display owners. :|
Fluppeteer - Tuesday, September 27, 2005 - link
I'm surprised that any single-link resolution isn't possible (so a digitally driven 2405FPW ought to work), but it's clear that there's a problem with CRTs. The R520's dual-link outputs would appear to solve the problem with reasonable headroom, coincidentally supporting dual-link monitors.

Dangher's post *could* make sense - by interleaving pixels one could, in theory, take two single-link images and produce a dual-link one. But the chips aren't really set up to render like that - it's certainly not one of the announced Crossfire modes. It would probably also be slower than the existing modes.
AFAIK there's very little intelligence in the CE (or in the SLi combiner) - the chip not producing output for the relevant bit of screen just outputs black, and the CE/SLi combiner just ORs the values from the two heads together. There's a bit of genlock involved, plus the DVI receiver and transmitter, but the amount of actual logic is tiny. Unless I'm wrong about how it works, but I don't see the need for more (except for the multi-card antialiasing, which presumably needs some blending support - I was a bit surprised that nVidia could retrofit this for that reason).

You could do all kinds of clever things if the SLi bridge/Crossfire connection was actually a general-purpose high-bandwidth link between the two cards, but to the best of my knowledge, it's not: it's video only, so you're limited to what the cards can drive on their digital video outputs when it comes to displaying the result, and uneven splitting won't help you - it's the peak rate of output which matters, not the average throughput.

On the plus side, with enough supersampling, 1280x1024 on a CRT might not look much worse than 1600x1200 with less...
Fluppeteer - Thursday, September 29, 2005 - link
I've belatedly picked up on something. Sorry if I'm being slow, but to confirm: the multi-card supersampling mode... is the frame from the secondary card sent over the PCI-e bus, rather than over the Crossfire link? If so, this would explain a large performance drop as it's implemented, but also explain how nVidia could implement the equivalent mode without having built blending directly into their SLi combiner in the first place (and also suggest that the Crossfire combiner doesn't need to be clever enough to blend). It might also explain why nVidia's implementation coincided with a bridgeless SLi capability (once you've done the work in the driver...).

If they *do* this, there's no reason for it to be limited to 1600x1200 (or single-link bandwidth), other than that the PCI-e bus will be limiting the refresh at some point.

Just wondering, and curious whether I'm imagining it.
--
Fluppeteer
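As a rough check of the PCI Express speculation above, here is a back-of-the-envelope sketch. All figures are assumptions for illustration: 32-bit color, one full frame shipped per displayed frame, and roughly 4 GB/s per direction for a PCIe 1.0 x16 link.

```python
def frame_traffic_gbps(width, height, refresh_hz, bytes_per_pixel=4):
    """Bandwidth in GB/s needed to ship one full frame per refresh."""
    return width * height * bytes_per_pixel * refresh_hz / 1e9

for w, h, hz in [(1600, 1200, 60), (1920, 1200, 60), (2048, 1536, 85)]:
    print(f"{w}x{h}@{hz}: {frame_traffic_gbps(w, h, hz):.2f} GB/s")
# ~0.46, ~0.55, and ~1.07 GB/s respectively -- well under a x16 link's
# ~4 GB/s, so the bus itself wouldn't cap the refresh until far higher
# resolutions and rates, consistent with the comment's reasoning.
```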
DerekWilson - Monday, September 26, 2005 - link
:-) I don't get offended easily. I'm certainly the first person who wants to know if I got something wrong. At the same time, it's my responsibility to get the facts across in the clearest way possible, so I'm also concerned when it seems that I haven't communicated them clearly enough.
Derek Wilson
erinlegault - Monday, September 26, 2005 - link
All of the reviews I've been reading today on Crossfire have been saying the same thing. Can you tell us how they are all wrong?

Leper Messiah - Monday, September 26, 2005 - link

The last table on the last page is missing, there's just a [table] tag.

DerekWilson - Monday, September 26, 2005 - link

sorry again ... I'll drop it in in a second.

Leper Messiah - Monday, September 26, 2005 - link

If this had been released 6 months ago, it would be good. Right now, with one 7800GTX beating it in some benchies and SLi GTs and GTXs trouncing it, this just doesn't cut it. Hopefully ATi has something amazing with the R520, otherwise they are heading back to the days of pre-R300.

sxr7171 - Monday, September 26, 2005 - link
Okay, I don't get this. I'm running a 24" widescreen monitor at 1920x1200@60Hz using single-link DVI. The limit for single-link DVI at 60Hz is said to be 2.6 megapixels, which is quite a bit higher than 1600x1200.
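A quick sanity check of that arithmetic, as a sketch: it assumes the 165 MHz DVI 1.0 single-link pixel-clock ceiling and roughly 12% overhead for reduced-blanking timings (both figures are illustrative assumptions).

```python
SINGLE_LINK_MHZ = 165  # DVI 1.0 single-link pixel clock ceiling

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.12):
    """Approximate pixel clock including reduced-blanking overhead."""
    return width * height * refresh_hz * blanking / 1e6

for w, h, hz in [(1600, 1200, 60), (1920, 1200, 60), (2048, 1536, 85)]:
    clk = pixel_clock_mhz(w, h, hz)
    verdict = "fits" if clk <= SINGLE_LINK_MHZ else "needs dual link"
    print(f"{w}x{h}@{hz}: ~{clk:.0f} MHz ({verdict})")
# 1600x1200@60 (~129 MHz) and 1920x1200@60 (~155 MHz) both fit within a
# single link, which is the commenter's point; 2048x1536@85 (~299 MHz)
# is dual-link territory, matching the earlier R520 discussion.
```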