Ghost Recon Advanced Warfighter Tests
The short story is that the patch AGEIA released when we published our previous story didn't really do much to fix the performance issues. We did see an increase in framerate over our previous tests, but the results are less impressive than we had hoped (especially with regard to the extremely low minimum framerate). Here are the results from our initial test, as well as the updated results we collected:
There is a difference, but it isn't huge. We are impressed that AGEIA was able to release a driver so quickly after the performance issues were made known, but we would like to see better results than this. Perhaps AGEIA will have another trick up its sleeve in the future as well.
Whatever the case, after further testing, it appears our initial assumptions are proving more and more correct, at least with the current generation of PhysX games. There is a bottleneck in the system somewhere near and dear to the PPU. Whether this bottleneck is in the game code, the AGEIA driver, the PCI bus, or on the PhysX card itself, we just can't say at this point. The fact that a driver release did improve the framerates a little implies that at least some of the bottleneck is in the driver. The implementation in GRAW is quite questionable, and a game update could help to improve performance if this is the case.
Our working theory is that there is a good amount of overhead associated with initiating activity on the PhysX hardware. This idea is backed up by a few observations we have made. Firstly, the slowdown occurs right as particle systems or objects are created in the game. After the creation of the PhysX accelerated objects, framerates seem to smooth out. The demos we have which use the PhysX hardware for everything physics related don't seem to suffer the same problem when blowing things up (as we will demonstrate shortly).
We don't know enough at this point about either the implementation of the PhysX hardware or the games that use it to be able to say what would help speed things up. It is quite clear that there is a whole lot of breathing room for developers to use. Both the CellFactor demo (now downloadable) and the Unreal Engine 3 demo Hangar of Doom show this fact quite clearly.
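If per-object setup cost on the PPU is indeed the culprit, one common mitigation would be for developers to preallocate physics objects at load time and reuse them, rather than creating them mid-firefight. A minimal Python sketch of the idea (the pool class and creation cost are illustrative, not AGEIA's actual API):

```python
class ParticlePool:
    """Preallocate expensive physics objects once, reuse them at spawn time."""

    def __init__(self, size, create):
        # Pay the full creation cost up front, during level load
        self._free = [create() for _ in range(size)]

    def acquire(self):
        # Reusing a pooled object avoids a per-creation stall mid-frame
        return self._free.pop() if self._free else None

    def release(self, obj):
        # Return the object to the pool when the effect finishes
        self._free.append(obj)


def expensive_create():
    # Stand-in for whatever per-object initialization the hardware requires
    return {"particles": [0.0] * 64}


pool = ParticlePool(32, expensive_create)
effect = pool.acquire()   # cheap at explosion time
pool.release(effect)      # hand it back afterward
```

Whether GRAW could adopt something like this depends on details of the game and driver we can't see from the outside, but it is consistent with the observation that framerates smooth out after the objects exist.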
67 Comments
View All Comments
Tephlon - Wednesday, May 17, 2006 - link
Yeah, no. I know that Havok is doing generic physics. And light poles DO normally bend without the card. Cars do shake and explode. Cans can be kicked. All that stuff is normally there. I'm just saying the card seems to accentuate all of it. Not just more particles, but better explosions. Better ragdoll. Pots break a bit differently, etc.
It was definitely there before, but I think it all looks better with the PhysX. My roommate said he noticed the difference as well. I let him borrow it for a while while I was at work.
Again, I know I have no proof, at least not to show you atm... but to me it all seems better than before.
If I get a chance I'll fraps a run through a level once with and once without, and throw the links up here. I personally have seen several sites' comparison vids, but I don't feel they show everything very well.
Again, I'd heard it only adds particles to explosions, like you did, but I swear I can see the difference with everything.
Anyone ever heard Ageia say EXACTLY what difference there is for GRAW with their card?
DerekWilson - Wednesday, May 17, 2006 - link
Perception of experiences can be greatly affected by expectations. None of us is ever able to be 100% objective in all cases.

That being said, in your first post you mention not "feeling" the performance impact. If you take a look at our first article on PhysX (and the comments), you will notice that I reported the same thing. There aren't really huge slowdowns in GRAW, only one or two frames that suffer. If the errant frame(s) took a quarter of a second to render, we would definitely notice it. But while an AVERAGE of 15 frames per second can look choppy, having a frame or two take 0.066 seconds to render is not going to significantly impact the experience.
Minimum framerates are important in analyzing performance, but they are much more difficult to properly understand than averages. We want to see high minimum framerates because we see that as meaning less slowdown or stutter. But generally (in GPU limited situations) minimum framerates aren't outliers in the data set -- they mark a low point where the framerate dips for a good handful of frames. In the case of GRAW with PhysX, the minimum is really not contiguous with the performance of the rest of the frames.
CoV is another story. The framerate drops several times and we see stuttering. It's definitely something easily "felt" during gameplay. But CoV Issue 7 is still beta, so we might see some performance improvements when the code goes live.
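The distinction Derek draws between a low average and a single slow frame is easy to put into numbers. A quick sketch with made-up frame times:

```python
# 99 fast frames at 10 ms each, plus one 66 ms hitch
frame_times = [0.010] * 99 + [0.066]

avg_fps = len(frame_times) / sum(frame_times)
min_instant_fps = 1.0 / max(frame_times)

# The average stays high (~95 fps) even though the reported
# "minimum framerate" is ~15 fps -- one outlier frame, not a
# sustained slowdown the player would actually feel.
```

Contrast that with a game that averages 15 fps: there, every frame takes ~66 ms and the choppiness is constant, which is why a minimum alone can mislead.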
Tephlon - Wednesday, May 17, 2006 - link
Derek, I totally agree. I wasn't arguing about anything technical in the article, or the bearing minimum fps has on the 'feel' or 'playability'. It just doesn't seem like most readers (here and elsewhere) understand it. I also won't hide the fact that I DO WANT this tech to succeed, partly because I heard them speak at QuakeCon and I liked what I heard/saw, and partly because I've dropped $300 in good faith that my early adoption will help the cause and push us forward in the area of true physics in gaming. And even though my perception is undoubtedly affected by my expectations, it's not entirely misled either. Even with my bias I can be realistic and objective. If this thing did nothing for visuals/gameplay and killed my experience with crappy performance, I'd of course have a different opinion on the matter.

I was simply saying that readers seem to lose sight of the big picture. Yeah, it's in the rough stages. Yeah, it only works with a few games. I'm not here to pitch slogans and rants to make you buy it, I just wanted people to understand that the device 'as it is now' isn't without its charm. It seems the only defense brought in for the card is that the future could be bright. It DOES have some value now, if you're objective about it and not out to flame it immediately. I like what it does for my game, even if it's not revolutionary. I just hope that there are enough people objective enough to give this company/card a chance to get off the ground. I DO think it's better for the industry if the idea of a separate physics card can get off the ground.
I dunno, maybe I see too much of 3DFX in them, and it gets me nostalgic.
Again, Derek, I wasn't knocking on the report at all, and I hope it wasn't taken that way. I think it said just what it was supposed to or even could say. I was more trying to give the readers a balanced look at the card on the use side, since straight numbers send people into frenzies.
Did all that get to what I was trying to convey? I dunno, I confuse myself sometimes. I wasn't meant to be an author of ANYTHING. In any case, good luck to you.
Good luck to all.
DerekWilson - Wednesday, May 17, 2006 - link
lol, I certainly didn't take it as a negative commentary on anything I said. I was trying to say that I appreciate what you were saying. :-)

At a basic level, I very much agree with your perspective. The situation does resemble the 3dfx era with 3d graphics. Hardware physics is a good idea, and it would be cool if it ends up working out.
But is this part the right part to get behind to push the industry in that direction?
AnandTech's first and foremost responsibility is to the consumer, not the industry. If the AGEIA PhysX card is really capable of adding significant value to games, then its success is beneficial to the consumer. But if the AGEIA PhysX card falls short, we don't want to see anyone jump on a bandwagon that is headed over a cliff.
AGEIA has the engine and developer support to have a good chance at success. If we can verify their capabilities, then we can have confidence in recommending purchasing the PhysX card to people who want to push the agenda of physics hardware. There is a large group of people out there who feel the same way you do about hardware and will buy parts in order to benefit a company or industry segment. If you've got the ability and inclination, that's cool.
Honestly, most people who go out and spend $300 on a card right now will need to find value in something beyond what has been added in GRAW, CoV, and the near term games. If we downplayed the impact of the added effects in GRAW and CoV, it's because the added effects are nowhere near worth the $300 they cost. It is certainly a valid perspective to look towards the future. You have the ability to enjoy the current benefits of the hardware, and you'll already have the part when future games that make more compelling use of the technology come out.
We just want to make sure that there is a future with PhysX before we start jumping up and down screaming its praises.
So ... I'm not trying to say that anything is wrong with what you are saying :-)
I'm just saying that AnandTech has a heavy responsibility to its readers to be more cautious when approaching new markets like this. Even if we would like to see it work out.
Tephlon - Thursday, May 18, 2006 - link
True. I do get your point.

And again, you're right. With a more balanced perspective on the matter, I sure can't see you suggesting a 300 dollar piece of hardware on a hunch either. I do respect how your articles are based on what's best for the little guy. I think I'd honestly have to say, if you were to suggest this product now AS IF it were as good as sliced bread... I would be unhappy with my purchase based on your excitement for it.
teheh. Yeah, you made the right call with your article.
Touché, Derek. TOUCHÉ.
thehe. I guess not everyone can gamble the $300, and that's understandable. :-(
Like I said... here's hopin'. :-D
RogueSpear - Wednesday, May 17, 2006 - link
I'm not an expert on CPUs, but all of this has me wondering - isn't physics code exactly the kind of mathematical code that MMX and/or SSE and their follow-ons were supposed to accelerate? I'm sure physics was never mentioned way back then, but I do remember things like encryption/decryption and media encoding/decoding being targets for those technologies. Are game developers currently taking advantage of those technologies? I know that to a certain point there is parity between AMD and Intel CPUs as far as compatibility with those instruction sets.

apesoccer - Wednesday, May 17, 2006 - link
Seems like this was a pretty limited review... Were you guys working with a timetable? Like 4 hrs to use this card or something?

I think I would have tried more than just single core CPUs... since we're heading mostly towards multicore CPUs. I also would have run tests at the same level (if possible; it feels like we're intentionally being kept in the dark here) to compare software and hardware with the same number of effects, at different levels and resolutions... At low res, you're maxing the CPU out, right? Well, then if the PPU uses 15% of the CPU but outputs 30% more effects, you're being limited by the CPU even more... You should see greater returns the higher the resolution you go, since you're maxing your GPUs out more (rather than the CPUs) at higher res. All of this is moot if the overhead CPU usage by the PPU can be run on a second CPU core... since that's where the industry is headed anyway. And making software/hardware runs on a dual core should give us a better idea of whether or not this card is worth it.
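On the SSE/MMX question above: physics integration is exactly the kind of data-parallel arithmetic SIMD extensions target, since the same few operations are applied to thousands of independent positions and velocities. A hedged sketch of the idea, using NumPy vectorization as a stand-in for what a real engine would do with SSE intrinsics (four floats per instruction):

```python
import numpy as np

# Positions and velocities for a batch of particles, updated in one
# vectorized step -- conceptually the same work SSE spreads across
# 4-wide float registers on the CPU.
pos = np.zeros((1000, 3))          # 1000 particles, xyz
vel = np.ones((1000, 3))           # initial velocity (1, 1, 1)
gravity = np.array([0.0, -9.8, 0.0])
dt = 1.0 / 60.0                    # one 60 Hz timestep

# Semi-implicit Euler integration over the whole batch at once
vel += gravity * dt
pos += vel * dt
```

Whether any given game's physics path is actually compiled with SSE is up to the developer and middleware vendor; the instruction sets make it possible, but they don't make it automatic.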
peternelson - Wednesday, May 17, 2006 - link
To the people who say it's a decelerator: it is a little slower, but it is NOT doing the same amount of work. The visual feast is better in the hardware accelerated game than without the card. But we need a way to quantify that extra, as just "fps" ignores it.

Second, AnandTech, PLEASE get yourselves a PCI bus analyser; it need not be expensive. I want to know the % utilisation on the PCI bus. At 32 bit / 33 MHz the potential max is 133 MByte/sec.
How much of that is being used to talk to and from the PhysX card, and is it a bottleneck that would be solved by moving to PCI Express? Also, in your demo setups, considering what peripherals you are using, are you hogging some of the PCI bandwidth for (say) a PCI based soundcard etc., which would be unfair to the PhysX card?
ALSO one of the main purposes of THIS review I would say is to COMPARE the ASUS card with the BFG card. You don't seem to do that. So assuming I want a physx card, I still don't know which of the two to buy. Please compare/contrast Asus vs BFG.
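For reference, the 133 MByte/sec peak quoted above follows directly from the bus width and clock:

```python
# Classic PCI: 32-bit wide bus, ~33.33 MHz clock, at most one
# transfer per clock cycle at peak.
bus_bytes = 32 // 8            # 4 bytes per transfer
clock_hz = 33.33e6             # ~33 MHz
peak_mb_s = bus_bytes * clock_hz / 1e6   # ~133 MB/s theoretical peak
```

That peak is shared by every device on the bus, which is why contention from a PCI soundcard or NIC is a legitimate concern for a PCI physics card.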
DerekWilson - Wednesday, May 17, 2006 - link
Honestly, the ASUS and BFG cards perform identically, pull about the same amount of power, and produce similar levels of noise.

If you are trying to decide, buy the cheaper one. There aren't enough differences to make one better than the other (unless blue LEDs behind fans really do it for you).
We didn't do a more direct comparison because we have an engineering sample ASUS part, while our BFG is full retail. We generally don't like to make direct comparisons with preproduction hardware in anything other than stock performance. Heat, noise, power, pcb layout, and custom drivers can all change dramatically before a part hits retail.
We will look into the pci bus utilization.
peternelson - Wednesday, May 17, 2006 - link
Thanks, so, I will have to choose on features like the nice triangle box on the BFG ;-)
In gaming on older machines where both the sound and network and possibly other things are all on the same PCI bus, then either the physx or the other stuff could suffer from bus contention.
I hope you can either ask or do some analysing to watch the amount of traffic there is.