Exclusive: ASUS Debuts AGEIA PhysX Hardware
by Derek Wilson on May 5, 2006 3:00 AM EST - Posted in GPUs
ASUS PhysX Card
It's not as dramatic as a 7900 GTX or an X1900 XTX, but here it is in all its glory. We welcome the new ASUS PhysX card to the fold.

The chip and the RAM sit under the heatsink/fan, and there really isn't much else going on here. The slot cover on the card has AGEIA PhysX written on it, and there's a 4-pin Molex connector on the back of the card for power. We're happy to report that the fan doesn't make much noise and the card doesn't get very warm (especially compared to GPUs).
We did have an occasional issue when installing the card after the drivers were already installed: after powering up the system the first time, we couldn't use the AGEIA hardware until we powered the system completely off and then booted it up again. This didn't happen every time we installed the card, but it did happen more than once. It's probably not a big deal, and could easily be a symptom of early software paired with early hardware. Other than that, everything seemed to work well in the two pieces of software it's currently possible to test.
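For context, this is roughly what the hardware path looks like from a game's point of view. The sketch below follows the AGEIA PhysX 2.x SDK as we understand it; treat the exact identifiers and the hardware-detection call as our assumption rather than a confirmed listing.

```cpp
// Minimal sketch of requesting PPU-accelerated simulation through the
// AGEIA PhysX 2.x SDK. Identifiers follow the 2.x headers as we
// understand them (an assumption, not a verified listing).
#include "NxPhysics.h"
#include <cstdio>

int main()
{
    // Create the SDK object; returns NULL if the runtime is missing.
    NxPhysicsSDK* sdk = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
    if (!sdk) { std::printf("PhysX runtime not found\n"); return 1; }

    // Check for a PPU and fall back to software simulation without one.
    bool hasPPU = (sdk->getHWVersion() != NX_HW_VERSION_NONE);

    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.simType = hasPPU ? NX_SIMULATION_HW : NX_SIMULATION_SW;
    NxScene* scene = sdk->createScene(sceneDesc);

    // Per-frame loop: kick off the (possibly asynchronous) simulation,
    // then block until the results are ready to render.
    scene->simulate(1.0f / 60.0f);
    scene->flushStream();
    scene->fetchResults(NX_RIGID_BODY_FINISHED, true);

    sdk->releaseScene(*scene);
    NxReleasePhysicsSDK(sdk);
    return 0;
}
```

The interesting bit is that the application never addresses the PPU directly; it asks for hardware simulation and lets the driver route the work, which is why titles written against the SDK can still run (more slowly) without the card installed.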
Our test system is set up similarly to our graphics test systems, with the addition of a lower-speed CPU. We were curious to find out whether the PhysX card helps slower processors more than fast ones, so we set our FX-57 to a 9X multiplier to simulate an Opteron 144. Otherwise, the test bed is the same one we've used for recent GPU reviews:
AMD Athlon 64 FX-57
AMD Opteron 144 (simulated)
ASUS NVIDIA nForce4 SLI X16 Motherboard
2GB OCZ DDR RAM
ATI Radeon X1900 XTX
ASUS PhysX PPU
Windows XP SP2
OCZ PowerStream 600W PSU
Now let's see how the card actually performs in practice.
101 Comments
iNsuRRecTiON - Saturday, May 6, 2006 - link
Hey, the ASUS PhysX card already has 256MB of RAM instead of 128MB, compared to the BFG Tech card.
best regards,
iNsuRRecTiON
fishbits - Friday, May 5, 2006 - link
I want the physics card to be equivalent to a sound card in terms of standardization and how often I feel compelled to upgrade it. In other words, it would be upgraded far less often than graphics cards are. Putting the physics hardware on a graphics card means you would throw away (or sell at a loss) perfectly good physics capability just to get a faster GPU, or to get a second card for SLI/Crossfire. This is a bad idea for all the same reasons that putting sound card functionality on a graphics card is a bad idea.

Calin - Friday, May 5, 2006 - link
Yes, you could do all kinds of nice calculations on the physics board. However, moving geometry data from the video card to the physics board to be calculated, and then moving it back to the video card, would be shooting yourself in both feet.

I think this could run well as an accelerator for rendering images or for 3D applications... how soon until 3DStudio, PhotoShop and so on take advantage?
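(For illustration, the round trip Calin describes boils down to the per-frame pattern below. Every function here is a hypothetical stand-in invented for the sketch, not a real engine or SDK call; the point is simply how many times the data crosses a slow bus before anything is drawn.)

```cpp
// Hypothetical sketch of the GPU -> PPU -> GPU round trip. All of
// these functions are made-up placeholders for illustration only.
#include <vector>

struct Mesh { std::vector<float> vertices; };

void downloadFromGPU(Mesh&)   { /* read geometry back over PCIe */ }
void uploadToPPU(const Mesh&) { /* 32-bit/33MHz PCI tops out near 133MB/s, shared */ }
void simulateOnPPU()          { /* the fast on-chip part */ }
void downloadFromPPU(Mesh&)   { /* results back over the same PCI bus */ }
void uploadToGPU(const Mesh&) { /* re-upload the deformed geometry */ }
void draw(const Mesh&)        { /* render */ }

// One frame of physics on renderable geometry: four bus crossings
// before a single triangle is drawn.
void frame(Mesh& mesh)
{
    downloadFromGPU(mesh);   // GPU -> system RAM
    uploadToPPU(mesh);       // system RAM -> PPU
    simulateOnPPU();
    downloadFromPPU(mesh);   // PPU -> system RAM
    uploadToGPU(mesh);       // system RAM -> GPU
    draw(mesh);
}
```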
tonjohn - Friday, May 5, 2006 - link
The pre-production cards had both PCI and PCIe support at the same time. You simply flipped the card depending on which interface you wanted to use. So I believe that the PPU offers native PCIe support and that BFG and ASUS could produce PCIe boards today if Ageia would give them permission to.
Bad idea. Putting the PPU on board with a GPU means higher costs all around (longer PCBs, possibly more layers, more RAM). Also, the two chips would be fighting for bandwidth, which is never a good thing.
Higher costs and lower performance = a bad idea.
FYI: I have a BFG PhysX card.
saratoga - Friday, May 5, 2006 - link
Actually, putting this on the GPU core would be much cheaper. You'd save by getting rid of all the duplicated hardware: DRAMs, memory controller, power circuitry, PCI bridge, cooling, PCB, etc. Not to mention you'd likely gain a lot of performance by having a PCIe x16 slot and an on-die link to the GPU.
Calin - Monday, May 8, 2006 - link
I wonder how much of the 2TB/s internal bandwidth is actually used on the Ageia card... if enough of it, then the video card would have very little bandwidth left for its own operations (graphics rendering). However, if the cooling really needs that heatsink/fan combo, and the card really needs that power connector, you won't be able to put one on the highest-end video cards (for power and heat reasons).

kilkennycat - Friday, May 5, 2006 - link

"I have a BFG PhysX card"

Use it as a door-stop?
Pray tell me where you plug one of these if you have the following:-
Dual 7900GTX512 (or dual X1900 XTX)
and
Creative X-Fi
already in your system.
Walter Williams - Friday, May 5, 2006 - link
Actually, I use it to play CellFactor. You're missing out.
SLI and CrossFire are the biggest waste of money unless you are doing intense rendering work.
I hope people with that setup enjoy their little fps improvement per dollar while I'm playing CellFactor, which requires the PPU to run.
kilkennycat - Friday, May 5, 2006 - link
CellFactor is an MP tech demo... the full CellFactor is to be released in Q4 2007... maybe. Your PhysX card is going to be a little old by the time the full game is released. We should be up to quad-core CPUs, with plenty of cycles available for physics calculations, by that time.
I have recently been playing Oblivion a lot, like several million others. The Havok software physics are just great -- and you NEED the highest-end graphics for the optimum visual experience in that game -- see the current AnandTech article. Sorry, I care little about (er) "better particle effects" or "more realistic explosions", even when I play Far Cry. In fact, from my experiences with BF2 and BF1942, I find them more than adequately immersive with their great scenery graphics and their CURRENT physics effects -- even the old and noble BF1942.
On single-player games, I would far prefer to see additional hardware, or compute cycles, directed at advanced AI rather than physics. What's the point of fancy physics effects if the AI enemy has about as much built-in intelligence as a lump of Swiss cheese? It sure does not help the game's immersive experience. And tightly-scripted AI just does not work in open-area scenarios (cf. Unreal 2 and the dumb enemies easily snuck up on from behind -- somebody forgot to script that eventuality, amongst many others that can occur in an open play-area). The successful tightly-scripted single-player shooters like Doom 3, HL2, and FEAR all have overt or disguised "corridors". So the developers of open-area games like Far Cry and Oblivion chose an algorithmic *intelligent-agent AI* approach, with a simple overlay of scripting to set some broad behavioral and/or location boundaries. A distinct move in the right direction, but there are some problems with the AI implementation in both games. More sophisticated AI algorithms will require more compute power, which, if performed on the CPU, will need to be traded off against cycles available for graphics. Dual-core will help, but a general-purpose DSP might help even more... they are not expensive and are easily integrated into a motherboard.
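(To make the scripted-vs-agent distinction concrete, here is a purely illustrative contrast; every type, name, and weight below is made up for the sketch and comes from no real game engine.)

```cpp
// Illustrative contrast between scripted AI and a simple
// utility-scoring "intelligent agent". All names and weights are
// invented for the example.
#include <algorithm>

struct Agent { float health; float distanceToPlayer; bool canSeePlayer; };

enum class Action { Attack, TakeCover, Flank, Patrol };

// Scripted AI: one fixed trigger. It breaks as soon as the player does
// something the script's author never anticipated (e.g. sneaking up
// from behind).
Action scriptedAI(const Agent& a)
{
    return a.canSeePlayer ? Action::Attack : Action::Patrol;
}

// Agent AI: score each option against the current situation and pick
// the best. More compute per enemy, but it degrades gracefully in an
// open play-area instead of failing outright.
Action agentAI(const Agent& a)
{
    float attack = (a.canSeePlayer ? 1.0f : 0.2f) * a.health;
    float cover  = (1.0f - a.health) * 0.8f;
    float flank  = a.canSeePlayer ? 0.0f
                                  : 0.6f / (1.0f + a.distanceToPlayer);
    float best = std::max({attack, cover, flank});
    if (best == attack) return Action::Attack;
    if (best == cover)  return Action::TakeCover;
    return Action::Flank;
}
```

The extra scoring is exactly the "more compute power" trade-off in question: multiply that evaluation across dozens of active agents every frame and it starts competing with graphics for CPU time.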
Back to the immediate subject of the Ageia PPU and physics effects:-
I am far more intrigued by Havok's exercises with Havok FX, harnessing both dual-core CPU power and GPU power in the service of physics emulation. It would be great to have action games with a physics-adjustable slider, so that one can trade off graphics against physics effects in a seamless manner, just as one can trade off advanced-graphics elements in games today... which is exactly where Havok is heading. No need to support marginal added hardware like the PhysX. Now, if the PhysX engine were an option on every high-end motherboard for, say, no more than $50 extra, or as an optional motherboard plug-in at, say, $75 (like the 8087 of yore), and did not take up any precious peripheral slots, then I would rate its chances of becoming mainstream as pretty high. It seems as if Ageia should license their hardware DESIGN as soon as possible to nVidia or ATI at (say) no more than $15 a copy, and have them incorporate the design into their motherboard chipsets.
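(The physics slider floated above might look something like the sketch below in practice; this is our own illustrative example, with no real engine behind any of the names.)

```cpp
// Illustrative physics-detail slider: the setting scales how many
// debris bodies an explosion spawns, the same way a graphics slider
// scales texture or shadow quality. All names here are invented.
#include <algorithm>
#include <cstddef>

struct PhysicsSettings {
    float detail; // 0.0 = minimum effects, 1.0 = maximum effects
};

std::size_t debrisCount(const PhysicsSettings& s, std::size_t maxPieces)
{
    // Keep a floor of a few pieces so explosions never look inert,
    // and clamp to the art-defined maximum.
    const std::size_t minPieces = 8;
    std::size_t n = static_cast<std::size_t>(s.detail * maxPieces);
    return std::min(maxPieces, std::max(minPieces, n));
}
```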
The current Ageia card has three strikes against it, for cost, hardware-interface (PCI), and software-support reasons. The PhysX PPU certainly has NO hope at all as a peripheral device as long as it stays in PCI form; it must migrate to PCIe asap. Remember that an x1 or x4 PCIe card will happily work in a PCIe x16 slot, and there are still several million SLI and Crossfire motherboards with empty second video slots. Plus, even on a dual-SLI system with dual-slot-width video cards and an audio card present, it is more likely that you'll find a vacant PCIe x1 or x4 slot that does not compromise video-card ventilation than a PCI slot that is neither covered up by the dual-width video cards nor positioned where a card in it would block airflow to one or other of them.
So if a PCIe version of the PhysX ever becomes available... you will be able to sell your PCI version... at about the price of a doorstop. Few will want the PCI version if a used PCIe version is also available.
Hard on the wallet being an early adopter at times.....
tonjohn - Friday, May 5, 2006 - link
The developers did a poor job with how they implemented PPU support in GRAW.

CellFactor is a MUCH better test of what the PhysX card is capable of. The physics in CellFactor are MUCH more intense. When blowing up a load of crap, my fps drops by 2fps at most, and that is mainly because my 9800 Pro is struggling to render the actual effects of a grenade explosion.