Sparnage said:
I find it hard to believe a PPU would not change much. Games using the physics card will run far from their potential without it. It's not really debatable; when you have specialised hardware for the software, it's capable of doing more. I guess it comes down to whether you can justify it or not at this point in time.
Hahah, just the reaction I expected.
No offense Sparny, but you're as fervent as ever in your efforts to defend stupid causes, and you miss the point too. You may find it hard to believe, but in the first demo issued it didn't change much, if anything at all. You can go ahead, download it and test it yourself; there are articles about it all over the web. Now, since that game is specifically created to make use of that PhysX card, you can bet they'll try all they can as time goes by to make it unplayable without it, quite apart from the fact that using the card's capabilities should improve performance (otherwise it'd be completely useless). Did you know that their PhysX-based engine is too intense for current graphics cards, and that the developers of Cell Factor had to lower the game's graphics quality in order to fully implement the physics engine? Sounds like a good case of "you can't justify it at this point in time" to me.
Which is what I said in my post; did you read it?
"I don't think most games will require that kind of card before a while, if they ever do."
I stand by what I said, and I don't think your reply to it is very pertinent. And yeah, if they ever do, because ATI/AMD and NVIDIA aren't going to sit there idly while it happens.
Sparnage said:
Like Tiddlywinks said, it's like the Voodoo running graphics on separate hardware from the CPU. It's going to become mainstream one way or another eventually.
It's like that in a way. There are differences, even though you may not see them. I don't feel like going into details over this right now, but the context isn't quite the same.
Sparnage said:
And even if... IF most mainstream games don't require such a card in, oh say 3 to 5 odd years
Hahaha, great prediction there, buddy. How about 10 years? When you talk about "a while" in this field, it means a year, and even that is a long time; 3 to 5 years is enough time for anything to happen. It just moves that fast.
Sparnage said:
Crysis probably would've done well using more than a software PPU, but perhaps it started development before the concept was more widely recognised.
They could have done well using hardware physics acceleration? No way!?
No shit they could have; who couldn't? Hardware acceleration is always a plus by default, and it's pretty sad that you felt this was an argument. Nobody is contesting it. The point was that they didn't need it, and neither did the guys at Valve when they made HL2. Just nitpicking, but a processing unit can't be software, by the way.
Sparnage said:
The PS3 is investing in PPU for their system now.
No, Sony bought the rights to include the PhysX development kit, which is basically a big middleware library, in their own PS3 SDK. No "PPU" here. Since the PS3 is supposed to come out in 3 months, it'd be pretty sad if they were still "investing" in new hardware for it.
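For the record, this is roughly what "using PhysX" means on the developer side: you link against Ageia's library and call it like any other middleware, whether or not a PPU is sitting in the machine. The snippet below is from memory of the PhysX 2.x-era (ex-NovodeX) SDK, so treat the exact identifiers as approximate rather than gospel:

// Minimal sketch of a game calling Ageia's middleware (PhysX 2.x-era API,
// reproduced from memory; exact names may be slightly off).
#include "NxPhysics.h"

NxPhysicsSDK* gSDK = NULL;
NxScene* gScene = NULL;

void initPhysics()
{
    // Create the SDK object and a scene with standard gravity.
    gSDK = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
    gScene = gSDK->createScene(sceneDesc);

    // One of your beloved crates: a 1 m dynamic box.
    NxBodyDesc bodyDesc;
    NxBoxShapeDesc boxDesc;
    boxDesc.dimensions = NxVec3(0.5f, 0.5f, 0.5f);
    NxActorDesc actorDesc;
    actorDesc.shapes.pushBack(&boxDesc);
    actorDesc.body = &bodyDesc;
    actorDesc.density = 10.0f;
    gScene->createActor(actorDesc);
}

void stepPhysics(float dt)
{
    // Advance the simulation one step and collect the results.
    gScene->simulate(dt);
    gScene->flushStream();
    gScene->fetchResults(NX_RIGID_BODY_FINISHED, true);
}

That library of calls is what Sony licensed for the PS3 SDK; there's no extra chip involved.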
Sparnage said:
The only thing that would stop me buying this PPU is how much compatibility it will have with future games; ATi claim that you can use graphics cards to perform physics calculations just as well as, if not better than, Ageia. I have heard Nvidia also want to start putting PPU capabilities onto their cards in the future. So it's acknowledged that it has much more potential than physics software by itself.
Shouldn't it rather be that the only thing enticing you to buy this card is its compatibility with games and the performance gain (as well as a hypothetical and unlikely lack of competition)? That's what I was getting at from the beginning, and it's the reason I'm not buying one of these PhysX cards. You seem to agree, though, so I guess we're cool.
Sparnage said:
It could also mean many games end up supporting the future graphics cards with a PPU attached because the market responded better to something more familiar and convenient [...] It's likely games wouldn't run under both PPUs.
ATI was already developing hardware physics acceleration before Ageia came out of the woodwork, and it's Half-Life 2 that started the trend and made physics engines mainstream, with Havok. Also, Ageia requires the use of specific, proprietary technology and libraries, so I doubt games would run on both in the current context.
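To be concrete about why: a game written directly against Ageia's calls won't just run on whatever ATI or NVIDIA ship later. A studio that wanted to support both would have to write its own wrapper and one backend per vendor, something like the sketch below (every name in it is made up for illustration; it's the pattern that matters, not any real API):

#include <cstdio>

// Hypothetical wrapper a studio would need in order to target more than
// one physics backend; all names here are invented for illustration.
class IPhysicsBackend {
public:
    virtual ~IPhysicsBackend() {}
    virtual void addCrate(float halfExtent, float density) = 0;
    virtual void step(float dt) = 0;
};

// Backend that would wrap Ageia's proprietary library (stubbed out here).
class AgeiaBackend : public IPhysicsBackend {
public:
    void addCrate(float halfExtent, float density) { std::printf("Ageia: crate %.2f, density %.2f\n", halfExtent, density); }
    void step(float dt) { std::printf("Ageia: step %.4f s\n", dt); }
};

// Backend that would wrap a GPU vendor's future physics API (also stubbed).
class GpuBackend : public IPhysicsBackend {
public:
    void addCrate(float halfExtent, float density) { std::printf("GPU: crate %.2f, density %.2f\n", halfExtent, density); }
    void step(float dt) { std::printf("GPU: step %.4f s\n", dt); }
};

int main()
{
    // The game only ever talks to the interface; picking the backend is
    // where the per-vendor work (and cost) lives.
    IPhysicsBackend* physics = new AgeiaBackend();
    physics->addCrate(0.5f, 10.0f);
    physics->step(1.0f / 60.0f);
    delete physics;
    return 0;
}

Swap AgeiaBackend for GpuBackend and the game code doesn't change, but somebody still has to write and maintain both backends, and that's exactly the work most studios won't do for a card almost nobody owns.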
Have fun blowing up your 10,000 crates anyway; and sorry if I ruined your mood, it wasn't my intent.
Tiddlywinks said:
Aazealh, please. The decision to use Ageia's solution is because of developer ineptitude?
Well, developer ineptitude and budget costs, yes. They choose not to develop their own engine and instead use one pre-made for them (well, for the most part at least), and in the case of Cell Factor they get paid for it. It also gives a lot of publicity to their game, which otherwise would just be FPS_0021544.
It's really not all that different from people buying graphics engines... Remember Unreal? These guys had the balls and the skills to create their own engine and give id Software the finger. Same with Valve when they made Half-Life, and with the CryEngine, etc.
Tiddlywinks said:
I'm sure it has nothing to do with wanting to utilise the hardware, but I can't for the life of me figure out why these slack developers didn't just go with Havok. I mean, that way you don't alienate any potential customers when there's clearly no benefit to be had from hardware acceleration.
Actually, it's a valid point. There's a reason Ageia is paying for Cell Factor's development: otherwise no game would make use of their product in such a way that playing without it would kill the gaming experience. What they're basically trying to do is take over the market before the big companies (ATI/AMD & NVIDIA) create their own solutions. Honestly, I'm not sure they'll succeed, but I can't blame them for trying. They've also managed to sell their technology to Sony, so in that respect they've already done well. I don't really see them sticking around as the king of PPUs though; they'll probably get bought out, or other technologies will eclipse theirs. If ATI/AMD and NVIDIA develop their own thing fast enough and work with Microsoft to optimize DirectX, it'll be the end of Ageia's little enterprise.
Concerning Havok, though, I can understand the reluctance to use it, simply because you have to pay for it (and I'm sure Havok isn't selling it cheap). That doesn't change much as long as you have to pay for something else instead, but Cell Factor aside, the advantage of working with Ageia's solution is that it's currently less costly (and maybe easier, more complete, or less restrictive? I don't know). That certainly doesn't mean it's the end of software physics engines, but they'll be cheaper to make with hardware physics acceleration in the future, and cheaper to buy from others too.