Introduction
Next week (we are hearing July 5th), Ubisoft will release their second patch for CryTek's FarCry. This is the game that shows off the beautiful CryEngine renderer that CryTek has put together. The images and scenery are truly beautiful, and with the new patch comes a much needed update to run speed (~15%) and run duration (~30%). These new features make the game an even more enjoyable experience. But that's not the major update that we are here to talk about. The FarCry 1.2 patch will feature a new rendering path based on Shader Model 3.0 (Vertex and Pixel Shader 3.0), which is currently supported only by NVIDIA's 6800 series cards and not by ATI's X800 line of cards.
We are here today to test out the new patch on six different levels in FarCry and see whether the new methods CryTek was able to include in its new path offer any kind of advantage. Since the gameplay experience is meant to be the same no matter which card we're using, we'll clear the air before we start: there will be no new eye candy available through the SM3.0 path. The game should be rendered exactly the same way it was under SM2.0, and we will take a look at image quality as we go through our tests just to make sure we stay on track. This is a very important point to take away, as it means that regardless of whether you buy an ATI X800 or an NVIDIA 6800, the game will still look and play the same.
Well, if there are no new bells and whistles, why should the end user care? Because there are some performance increases that CryTek was able to squeeze out of the engine with their new render path. How much, we're about to find out, but first, let's take a look at what exactly has changed.
UPDATE: It has recently come to our attention that our 4xAA/8xAF benchmark numbers for NVIDIA 6800 series cards were incorrect when this article was first published. The control panel was used to set the antialiasing level, which doesn't work with FarCry unless set specifically in the FarCry profile (which was not done here). We apologize for the error, and have updated our graphs and analysis accordingly.
For a more positive update, after a discussion with CryTek about the new rendering path, we have learned that the lighting model implemented in the SM3.0 path is exactly the same as the one used in the SM2.0 path. The only difference is that CryTek used conditional rendering (branching in the pixel shader) to emulate multipass lighting in a single pixel shader. The performance gains we see actually indicate that PS3.0 branching does not carry as significant a performance hit as previously thought (and proves to be more efficient than using multiple pixel shaders in a scene).
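To make that idea concrete, here is a rough CPU-side analogy in plain C. This is not CryTek's shader code, and all names in it are our own hypothetical ones; it simply sketches why folding several lighting passes into one pass with a per-light branch (roughly what SM3.0 dynamic branching allows a pixel shader to do) can beat true multipass rendering, which touches the framebuffer once per light.

```c
/*
 * CPU-side analogy only -- NOT CryTek's shader code. All names here
 * (Light, shade_multipass, shade_single_pass, ...) are hypothetical.
 */
#include <stdio.h>

#define NUM_PIXELS 8
#define NUM_LIGHTS 3

typedef struct {
    float intensity;
    int   affects_pixel[NUM_PIXELS];  /* 1 if this light touches the pixel */
} Light;

/* SM2.0-style multipass: one full pass over the framebuffer per light,
 * accumulating each light's contribution with a read-modify-write. */
static void shade_multipass(float framebuffer[NUM_PIXELS],
                            const Light lights[NUM_LIGHTS])
{
    for (int l = 0; l < NUM_LIGHTS; ++l) {
        for (int p = 0; p < NUM_PIXELS; ++p) {
            if (lights[l].affects_pixel[p])
                framebuffer[p] += lights[l].intensity;
        }
    }
}

/* SM3.0-style single pass: the per-light conditional moves inside the
 * "shader", so every pixel is shaded and written exactly once. */
static void shade_single_pass(float framebuffer[NUM_PIXELS],
                              const Light lights[NUM_LIGHTS])
{
    for (int p = 0; p < NUM_PIXELS; ++p) {
        float color = 0.0f;
        for (int l = 0; l < NUM_LIGHTS; ++l) {
            if (lights[l].affects_pixel[p])   /* the dynamic branch */
                color += lights[l].intensity;
        }
        framebuffer[p] = color;
    }
}

int main(void)
{
    Light lights[NUM_LIGHTS] = {
        { 0.5f, {1, 1, 0, 0, 1, 0, 1, 1} },
        { 0.3f, {0, 1, 1, 1, 0, 0, 0, 1} },
        { 0.2f, {1, 0, 0, 1, 1, 1, 0, 0} },
    };
    float fb_multi[NUM_PIXELS]  = {0};
    float fb_single[NUM_PIXELS] = {0};

    shade_multipass(fb_multi, lights);
    shade_single_pass(fb_single, lights);

    /* Both paths produce identical results, matching CryTek's claim that
     * the lighting model (and thus image quality) is unchanged. */
    for (int p = 0; p < NUM_PIXELS; ++p)
        printf("pixel %d: multipass %.2f, single pass %.2f\n",
               p, fb_multi[p], fb_single[p]);
    return 0;
}
```

The output of both routines is identical; the potential saving in the single-pass version comes from running the shared per-pixel work and writing the framebuffer once instead of once per light, which lines up with the direction of the gains we measured.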
36 Comments
Anemone - Friday, July 2, 2004 - link
Am one of the increasing number of folks who do use 1600x1200 on everything that supports it, just FYI. Now it's an LCD, but before that my 19" CRT happily did that res too, and that's now many years old. Just for note only. :)
bearxor - Friday, July 2, 2004 - link
Yea, when are we even going to be able to buy an Ultra or "Ultra Extreme"? Heck, I never even heard of "Ultra Extreme" until this preview.
I guess when ATi releases new drivers, nVidia will have to launch the long-rumored and much-hyped Geforce 6800 Ultra-Extreme Hyper Edition.
Then, during the ATi refresh, we will all be greeted by the GeForce 6900, 6900 Ultra, 6900 Turbo and 6900 Ultra Hyper Fighting Edition.
They're getting as bad as Capcom these days...
Pete - Friday, July 2, 2004 - link
Whoa, some huge gains for nV. I honestly didn't expect to see such clear differences this early -- props to them. ATi's AA hit may be due to an under-performing programmable memory controller, per ATi ppl. We may see them improve memory-intense AA+AF numbers with newer drivers that better utilize the controller. Dunno if that can compensate for nV's huge SM3.0 gains, though.
I'm still a little baffled by the ever-faster "Ultra Extreme" models, though, considering we haven't seen one even up for presale (AFAIK) in the many weeks since the 6800U's launch.
TheSnowman - Friday, July 2, 2004 - link
well Jeff, that explains why ati's performance tanks, but it does nothing to explain why nvidia's doesn't.
Jeff7181 - Friday, July 2, 2004 - link
Very nice article guys. Only thing I'd like to see that I didn't was lower-res benchmarks, since I think it's safe to say that most people don't have monitors that support 1600x1200 at a decent refresh rate. Hell... mine can't do 1280x1024 at a decent rate.
Oh... and gordon151... I wonder if it could be because of the large number of objects to be anti-aliased. Grass, trees, etc. ... combine that with the HUGE draw distances and you've got quite a task on your hands. Just my theory anyway :)
gordon151 - Friday, July 2, 2004 - link
I've been wondering lately why performance tanks so much with the X800 series when AA is enabled in FarCry. Framerates almost cut in half when applying 4xAA, which is something you don't see in other games.