ATI's Late Response to G70 - Radeon X1800, X1600 and X1300
by Derek Wilson on October 5, 2005 11:05 AM EST - Posted in GPUs
Budget Performance
For budget performance, we feel that 1024x768 is the proper target resolution. People spending near the $100 mark can't expect to achieve good performance at high resolutions, but with current hardware, we can play games at moderate resolutions without losing any features.
The X1300 is targeted at the budget market, but we focused on testing our X1300 Pro against slightly higher performing parts because of its pricing. The X1300 does quite well versus the traditional low-end 6200 TC and X300 parts, but can't really compete with the 6600 GT, which is priced near the $149 MSRP of the X1300 Pro.
Under Doom 3 (and many OpenGL applications), NVIDIA holds a lead over ATI hardware. While it is understandable that the X1300 Pro isn't able to match the performance of NVIDIA's $150 6600 GT, the $250 MSRP X1600 XT lags far behind as well. It is quite interesting to note that the X1600 closes that gap (and performs slightly better than the 6600 GT) when 4xAA and 8xAF are enabled at this resolution. But at such a low resolution, the better bet is to increase the setting to 1280x1024 with no AA, where the 6600 GT maintains about a 20% performance lead. Doom 3 is also a fairly low contrast game, meaning that jagged edges are already hard to see.
Under Valve's Day of Defeat: Source, the latest resurrection of a past title by Valve (and also the first to feature HDR), the 6600 GT and X800 perform on par with what we would expect, while the more expensive X1600 XT lags behind and the X1300 performs where a budget card should. Enabling 4xAA and 8xAF in this game closes the gap between the 6600 GT and X1600 XT: they both run at about 48 fps under this setting, followed by the X800 at an average of nearly 43 fps.
Far Cry provides a victory for the X1600 XT over the 6600 GT, but the comparatively expensive X1300 Pro still lags its closest cost competitor by a large margin.
EverQuest II in very high quality mode shows the X1600 XT leading this segment in performance. Current ~$100 parts perform horribly at this setting, scoring single-digit framerates. The X1300 Pro is definitely playable at very high quality at 1024x768 (which we would recommend over a lower quality setting at a higher resolution). Extreme quality still doesn't perform very well on any but the most expensive cards out there and really doesn't offer that much more in terms of visual quality.
When testing Splinter Cell: Chaos Theory, the new X1000 series of cards puts in a very good performance. This time around, the X800 and 6600 GT don't perform equally, and it looks as though the additions to the RV5xx architecture can make quite a difference depending on the game being played.
To see the continuing saga of the X1600 XT, we will take a look at midrange performance numbers at 1280x960.
103 Comments
Gigahertz19 - Wednesday, October 5, 2005 - link
On the last page I will quote: "With its 512MB of onboard RAM, the X1800 XT scales especially well at high resolutions, but we would be very interested in seeing what a 512MB version of the 7800 GTX would be capable of doing."
Based on the results in the benchmarks, I would say 512MB barely does anything. Look at the benchmarks on page 10: the GeForce 7800 GTX either beats the X1800 XT or loses by less than 1 fps. SCALES WELL AT HIGH RESOLUTIONS? Not really. Has the author of this article even looked at their own benchmarks? At 2048x1536 the 7800 GTX creams the competition, except in Far Cry, where it loses by 0.2 fps to the X1800 XT, and Splinter Cell, where it loses by 0.8 fps, so basically it's a tie in those two games.
You know why Nvidia does not have a 512MB version? Because look at the results... it does shit. 512MB is pointless right now, and if you argue that you'll use it in the future, then wait until future games actually use it and buy the best GPU then, not now. These new ATIs blow wookies; so much for competition.
NeonFlak - Wednesday, October 5, 2005 - link
"In some cases, the X1800 XL is able to compete with the 7800 GTX, but not enough to warrant pricing on the same level."From the graphs in the review with all the cards present the x1800xl only beat the 7800gt once by 4fps... So beating the 7800gt in one graph by 4fps makes that statement even viable?
FunkmasterT - Wednesday, October 5, 2005 - link
EXACTLY!! ATI's FPS numbers are a major disappointment!
Questar - Wednesday, October 5, 2005 - link
Unless you want image quality.
bob661 - Wednesday, October 5, 2005 - link
And the difference is worth the extra $100 PLUS the "lower" frame rates? Not good bang for the buck.
Powermoloch - Wednesday, October 5, 2005 - link
Not the cards.... Just the review. Really sad :(
yacoub - Wednesday, October 5, 2005 - link
So $450 for the X1800 XL versus $250 for the X800 XL, and the only difference is the new core that maybe provides a handful of additional frames per second, a new AA mode, and Shader Model 3.0? Sorry, that's not worth $200 to me. Not even close.
coldpower27 - Thursday, October 6, 2005 - link
Perhaps up to a 20% performance improvement, looking at pixel fillrate alone (a rough fillrate calculation is sketched after this comment).
Shader Model 3.0 Support.
ATI's Avivo Technology
OpenEXR HDR Support.
HQ Non-Angle Dependent AF User Choice
You decide if that's worth the $200 US price difference to you. Adaptive AA I wouldn't count, since apparently, through ATI's drivers, all R3xx hardware and higher now has this capability, not just R5xx derivatives, much like Temporal AA, which launched with the R4xx.
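To put the fillrate point above in context, here is a minimal Python sketch of the back-of-the-envelope math: theoretical pixel fillrate is simply the core clock multiplied by the number of pixels written per clock (ROPs), and the percentage gap between two cards falls straight out of that. The clock and ROP values in the example are placeholders for illustration, not the verified specifications of any particular card.

```python
def pixel_fillrate_mpix(core_clock_mhz: float, rops: int) -> float:
    """Theoretical pixel fillrate in Mpixels/s: pixels written per clock times clock rate."""
    return core_clock_mhz * rops


def percent_improvement(new: float, old: float) -> float:
    """Relative gain of the newer figure over the older one, in percent."""
    return (new - old) / old * 100.0


if __name__ == "__main__":
    # Placeholder clocks and ROP counts, chosen only to illustrate the math;
    # substitute the actual specifications of the cards being compared.
    older = pixel_fillrate_mpix(core_clock_mhz=400, rops=16)
    newer = pixel_fillrate_mpix(core_clock_mhz=480, rops=16)
    print(f"older: {older:.0f} Mpix/s, newer: {newer:.0f} Mpix/s, "
          f"gain: {percent_improvement(newer, older):.1f}%")
```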
yacoub - Wednesday, October 5, 2005 - link
So even if these cards were available in stores/online today, the best PCI-E card one can buy for ~$250 is still either an X800 XL or a 6800 GT. (Or an X800 GTO2 for $230, which you can flash and overclock.) I find it disturbing that they even waste the time to develop, let alone release, low-end parts that can't even compete price-wise. Why bother wasting the development and processing to create a card that costs more and performs less? What a joke those two lower-end cards are (the X1300 and X1600).
coldpower27 - Thursday, October 6, 2005 - link
The Radeon X1600 XT is intended to replace the older X700 Pro, not the stopgap 6600 GT competitors, the X800 GT and X800 GTO, which only came into being because ATI had leftover supplies of R423/R480 cores (and, for the X800 GTO only, R430 cores), and of course because the X700 Pro wasn't really competitive in performance with the 6600 GT in the first place, due to ATI's reliance on low-k technology for its high clock frequencies. I think these are successful replacements.
Radeon X850/X800 is replaced by Radeon X1800 Technology.
Radeon X700 is replaced by Radeon X1600 Technology.
Radeon X550/X300 is replaced by Radeon X1300 Technology.
The X700 is 156mm2 on 110nm; the X1600 is 132mm2 on 90nm (a rough die-area scaling comparison is sketched at the end of this comment).
The X550 and X1300 are roughly the same die size, under 100mm2.
The newer cards do, however, use more expensive memory types on their high-end versions.
These new parts also finally give ATI's entire family the same feature set, something I believe ATI has never done before: a high-end, mainstream, and budget core all based on the same technology.
Nvidia achieved this first with the GeForce FX line.
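As a rough sanity check on the die sizes quoted above, the sketch below scales the X700's 156mm2 at 110nm down to 90nm assuming a perfect optical shrink (area scales with the square of the feature-size ratio). That ideal shrink lands around 104mm2, so the X1600 coming in at 132mm2 suggests its core carries noticeably more logic than a straight shrink of the X700 would. Real designs never shrink this cleanly, so treat this purely as illustrative arithmetic.

```python
def ideal_shrink_area(area_mm2: float, old_node_nm: float, new_node_nm: float) -> float:
    """Die area after a perfect optical shrink: area scales with the square of the node ratio."""
    return area_mm2 * (new_node_nm / old_node_nm) ** 2


if __name__ == "__main__":
    x700_area_mm2 = 156.0   # at 110nm, figure quoted in the comment above
    x1600_area_mm2 = 132.0  # at 90nm, figure quoted in the comment above

    shrunk = ideal_shrink_area(x700_area_mm2, old_node_nm=110.0, new_node_nm=90.0)
    print(f"X700 ideally shrunk to 90nm: ~{shrunk:.0f} mm^2")
    print(f"Actual X1600 die: {x1600_area_mm2:.0f} mm^2 "
          f"(~{(x1600_area_mm2 / shrunk - 1) * 100:.0f}% larger than a perfect shrink)")
```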