Playing Demos on PhysX
Even though we can't benchmark CellFactor or Hangar of Doom in any useful way, they can't be left out of a discussion of the usefulness of PhysX hardware. It is very clear that using AGEIA's PhysX technology in a game can yield some impressive results. It is just as clear that current production games, while adding compelling visual effects, suffer a performance penalty that is difficult to justify. What we don't know is just how much more physics a PhysX card can do than the CPU or GPU already in every computer. We just haven't found a real-world scenario in which to test the same large physics load on both the CPU and the PPU. None of the games that support PhysX include the ability to enable advanced physics features in the absence of hardware. The one small demo we do have that can run in either hardware or software mode does show a good improvement with hardware, but this diagnostic app isn't designed as a performance analysis tool (nor is it a real-world example of anything).
Now that the CellFactor demo is downloadable, there is a little more value in picking up the hardware. Even though there is only one level to play with, the CellFactor demo is quite enjoyable in a multiplayer situation. It's not $300 USD worth of goodness, but it is a step in the right direction. It is rather impressive on a technical level, but with the full version of the game nowhere near release, the success of PhysX can't rely on CellFactor. We have a short video (3.7MB) available, although you might prefer the 400 MB video available on the CellFactor web site.
Hangar of Doom is a demo based on Epic's Unreal Engine 3. This engine will power Unreal Tournament 2007, as well as a whole host of other games. UT2007 currently won't require PhysX hardware, but that shouldn't stop licensees from taking full advantage of it. While this demo isn't as complex as CellFactor, it demonstrates some neat ideas about the destructibility of objects in a game (planes fall apart when shot down). Again, we have a short video (2.1MB) available for download.
If you would like to try grabbing all six videos (13MB including the two from the original PhysX article) using a BitTorrent client, you may find that to be a faster solution (depending on how many people are seeding the files). Just download the torrent file if you're interested.
Unfortunately, even though these demos are very interesting and compelling, developers are not targeting this level of interactivity for the near future. With the current multiplayer trend, it doesn't make sense for developers to let gameplay rely on hardware that many users won't have: gameplay can't differ from player to player in a multiplayer environment. Effects are a different story, and thus the first games to support PhysX have a tacked-on feel to them.
Truly innovative uses of AGEIA's technology are out there, but we are stuck with a chicken-and-egg problem. Publishers don't want to require the hardware until a large install base exists, and end users won't buy the hardware until a good number of titles support it.
phusg - Wednesday, May 17, 2006 - link
> Performance issues must not exist, as stuttering framerates have nothing to do with why people spend thousands of dollars on a gaming rig.

What does this sentence mean? No, really. It seems to try to say more than just "stuttering framerates on a multi-thousand-dollar rig are ridiculous", or is that it?
nullpointerus - Wednesday, May 17, 2006 - link
I believe he means that the card can't survive in the market if it dramatically lowers framerates on even high-end rigs.

DerekWilson - Wednesday, May 17, 2006 - link
check plus ... sorry if my wording was a little cumbersome.

QChronoD - Wednesday, May 17, 2006 - link
It seems to me like you guys forgot to set a baseline for the system with the PPU card installed. From the picture that you posted in the CoV test, the number of physics objects looks like it can be adjusted when the AGEIA support is enabled. You should have run a benchmark with the card installed but keeping the level of physics the same. That would eliminate the loading on the GPU as a variable. Doing so would cause the GPU load to remain nearly the same, with the only difference being the CPU and PPU taking time sending info back and forth.

Brunnis - Wednesday, May 17, 2006 - link
I bet a game like GRAW actually would run faster if the same physics effects were run directly on the CPU instead of this "decelerator". You could add a lot of physics before the game would start running nearly as badly as with the PhysX card. What a great product...

DigitalFreak - Wednesday, May 17, 2006 - link
I'm wondering the same thing.

"We still need hard and fast ways to properly compare the same physics algorithm running on a CPU, a GPU, and a PPU -- or at the very least, on a (dual/multi-core) CPU and PPU."
Maybe it's a requirement that the developers have to intentionally limit (via the sliders, etc.) how many "objects" can be generated without the PPU in order to keep people from finding out that a dual core CPU could provide the same effects more efficiently than their PPU.
nullpointerus - Wednesday, May 17, 2006 - link
Why would ASUS or BFG want to get mixed up in a performance scam?

DerekWilson - Wednesday, May 17, 2006 - link
Or Epic with Unreal Engine 3?

Makes you wonder what we aren't seeing here, doesn't it?
Visual - Wednesday, May 17, 2006 - link
so what you're showing in all the graphs is lower performance with the hardware than without it. WTF?

yes i understand that testing without the hardware is only faster because it's running lower detail, but that's not clearly visible from a few glances over the article... and you do know how important the first impression really is.
now i just gotta ask, why can't you test both software and hardware with the same level of detail? that's what a real benchmark should show at least. Can't you request some complete software emulation from AGEIA that can fool the game into thinking the card is present, and turn on all the extra effects? If not from AGEIA, maybe from ATI or NVIDIA, who seem to have worked on such emulations that even use their graphics cards. In the worst case, if you can't get the software mode to have all the same effects, why not at least turn off those effects when testing the hardware implementation? In City of Villains, for example, why is the software test run with a lower "Max Physics Debris Count"? (though I assume there are other effects that get automatically enabled with the hardware present and aren't configurable)
I just don't get the point of this article... if you're not able to compare apples to apples yet, then don't even bother with an article.
Griswold - Wednesday, May 17, 2006 - link
I think they clearly stated in the first article that GRAW, for example, doesn't allow higher debris settings in software mode.

But even if it did, a $300 part that is supposed to be lightning fast and what not should be at least as fast as ordinary software calculations - at a higher debris count.
I really don't care much about apples and oranges here. The message seems to be clear: right now it isn't performing up to snuff, for whatever reason.