
Forums - PC - PhysX Cards - Future or Flop?

This is an offshoot of the UT3 impressions thread, so I am quoting SSJ from that thread.....

 

ssj12 said:
Sqrl said:
PhysX cards are on the way out right now, mark my words.

And the 8800s have physics support; granted, a separate card would take some stress off the rest of the system, no doubt. But with current prices of PhysX cards from Ageia being around $135 and the performance increases being somewhat negligible in rigs similar to mine, I really don't see the point in it.

Actually, my rig is set up with a nice PSU and my motherboard is SLI-ready, so my next upgrade will probably be another 8800 GTS when they become available at sub-$200 prices.

I doubt it. It seems more devs are getting into the idea of having a separate card for physics. UT3 does install PhysX drivers, and UT3 will be a big push for Ageia. They are making a better model of their card that actually supports PCIe, which will improve its performance. I'm still waiting for a motherboard with 4 or 5 PCIe 2.0 slots. lol Quad-SLI with a PhysX card = dream.


 

PhysX cards have a lot of problems to overcome before they become viable...

1) Benchmarks showing that PhysX cards don't help but rather hurt performance.  This is partly due to some early driver issues but it still impacts public perception of the product.
2) Both NVidia and ATI are integrating competing technologies into their already very popular (and most importantly essential for gaming) video cards. 
3) The Ageia PhysX software is competing against one of the biggest names in software physics, which of course is the Havok engine, and I am pretty sure that is what NVidia is basing their technology on.
4) Not only do they need to compete on a hardware level, but also on a software level, where there are already many games using the Havok engine. That makes it a lot easier for people to justify the purchase, not only because it's integrated into their video card but because it is already being used in the games they own.
5) Developers are already familiar with the Havok engine, and it is known to work on all major gaming platforms (i.e. Windows, Xbox/Xbox 360, GC, Wii, PSX, PS2, PS3, PSP, OS X, and Linux).

In short, the known industry entities of ATI and NVidia are going to step in and absorb the market into their products. I don't see a reason to think this will be difficult for them, especially considering the support of Havok.



To Each Man, Responsibility

sooo.. that means my HD 2900 XT 1 gig is good, right? :>



Neos - "If I'm posting in this thread it's just for the lulz."
Tag by the one and only Fkusumot!


 

I am pretty sure the 2900 XT has hardware physics support, but like all hardware solutions, software needs to utilize it or it's worthless. That's sort of where all of these physics hardware solutions are right now: early adoption. Not a vast amount of support, but it's increasing.



To Each Man, Responsibility

Don't forget that Intel bought Havok, so future CPU and Havok builds will be heavily optimized for each other.

From the first moment I thought it was pointless to have such an expensive card just for physics; if they don't offer them below the $100 range, I'm sure they will fail.



I think a separate physics card isn't going to make it. Integration onto the 2D/3D card is the way to go, mainly because:

1. many people aren't willing to pay for it
2. it takes up a slot at a time when compactness and integration are the way to go



.


Hello

Ageia cards are somewhere between the Gizmondo flop and the Phantom flop. :)

Bye.



Zones : I still don't understand all the love for Blizzard, what was the last game they developed worth playing?

Wow, I didn't think there was this much anti-physics card sentiment out there. This post has only been up for like 10 minutes =P

 

Edit: I hate short posts, so I'm adding a bit to this:

 

I think the PPU idea will survive, but I don't think the future of the technology is as a separate solution.  I think it will live on as an integrated solution.  I see Ageia either dying a slow, painful death or being bought out by an AMD- or Intel-type company.  Ultimately I think physics will receive a dedicated core for its calculations, but it could be 2 years until that is widespread or it could be 10 years.  It all depends on what happens in the next 12 months, imo.



To Each Man, Responsibility
mrpapaye said:
Hello

Ageia cards are somewhere between the Gizmondo flop and the Phantom flop. :)

Bye.
  

Well, I don't see it as such a big flop. I think console makers are waiting for it to flop on the PC market as a separate card so they can buy the company or the tech.



.

It seems to me that the move to unified shader model cards should make it easier to do physics work on the GPU, since you should be able to allocate shader units to physics work without bottlenecking a step in the graphics pipeline. At least, that's the way I understand it.

It really doesn't make much sense to have two pieces of hardware dedicated to doing vector work when one can do the job. I suspect that in the future, graphics cards will have an increasing amount of slack capacity for doing odd jobs like physics calculations as their power continues to increase and the returns on investing more horsepower into graphics tasks diminish.
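
To make that concrete, here is a rough sketch of what that kind of vector work looks like when it runs on shader units: a tiny CUDA kernel doing an explicit Euler integration step, one thread per particle. This is just my own illustration, not anything from UT3, Ageia, or Havok, and the particle count, gravity, and timestep values are made up.

// Hypothetical example: the same shader units that draw frames can step a
// particle system. Each thread integrates one particle (explicit Euler).
#include <cuda_runtime.h>
#include <cstdio>

__global__ void integrate(float3 *pos, float3 *vel, float3 g, float dt, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    // Accelerate by gravity, then move the particle.
    vel[i].x += g.x * dt;  vel[i].y += g.y * dt;  vel[i].z += g.z * dt;
    pos[i].x += vel[i].x * dt;  pos[i].y += vel[i].y * dt;  pos[i].z += vel[i].z * dt;
}

int main()
{
    const int n = 1 << 16;                      // 65,536 particles (illustrative)
    float3 *pos, *vel;
    cudaMalloc(&pos, n * sizeof(float3));
    cudaMalloc(&vel, n * sizeof(float3));
    cudaMemset(pos, 0, n * sizeof(float3));
    cudaMemset(vel, 0, n * sizeof(float3));

    float3 gravity = make_float3(0.0f, -9.8f, 0.0f);
    int threads = 256;
    int blocks  = (n + threads - 1) / threads;
    integrate<<<blocks, threads>>>(pos, vel, gravity, 1.0f / 60.0f, n);  // one 60 Hz step
    cudaDeviceSynchronize();

    printf("Stepped %d particles on the GPU.\n", n);
    cudaFree(pos);
    cudaFree(vel);
    return 0;
}

The point is only that once shader units are general enough to run something like this, a separate physics card is competing with silicon you already bought for graphics.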



"Ho! Haha! Guard! Turn! Parry! Dodge! Spin! Ha! Thrust!" -- Daffy Duck

I don't see these cards getting A: broad dev support, or B: acceptance from customers who just paid $250+ for a good graphics card.

Not a snowman's chance in hell if you ask me.