You think Nvidia's going to have their answer to the 5000 series out before December? Everything I've ever heard around here says no.
I think there is a possibility.
...fixed that for you there.
If you think Nvidia isn't "encouraging" developers to design in ways that make games faster on their hardware while simultaneously making them slower on other brands, you're fooling yourself. It's business, and if they can make money off of it, you can bet they're doing it.
There is no such thing as a conspiracy in the business world if there's money to be made.
I'd rather the game run the same speed on ALL hardware, and not have to buy a certain company's products to get full speed.
That isn't true, and there isn't a single thing even indicating it. When you show me a sliver of proof, I'll talk to you about this, until then, don't spread lies.
It's simply because PhysX is an Nvidia trademark, and in case you haven't noticed, it is dying and will be dead and gone when DX11 games start to come out. Very few titles ever used PhysX; it was a non-starter from the get-go. The score from an Nvidia card running PhysX is not comparable to an ATI card of roughly the same performance, because the physics tests in Vantage were supposed to be done by the CPU and were not written specifically for the PhysX API. So an Nvidia card scoring 40% more than a comparable ATI card in Vantage is not representative of real-life benchmarks and games. It was a bug that got fixed, end of.
When DX11 comes out, I'll be glad, as a unified physics API is what the industry needed.
However, DX11 and PhysX dying has nothing to do with Vantage, a DX10 benchmark. Vantage was not just a benchmark of raw graphical performance; it was a benchmark of overall computer performance. The PhysX API was included, and meant to be used. Futuremark even allowed running it entirely on dedicated PhysX cards. All the world records were set with Ageia cards in the machines; it was perfectly acceptable.
The PhysX tests in Vantage were meant to test PhysX performance; it didn't matter where the work was calculated, until nVidia started doing it on the GPUs and the ATi fans started freaking out.
Wrong again, development my ass. What about the EXE names that could be changed to give ATI users the same performance that Nvidia users were getting in these so-called "optimised for Nvidia" games? It's marketing BS, and it is wrong for game devs to bend over for a quick buck from Nvidia when the games shouldn't run any worse on ATI hardware.
Show me this, because I doubt it ever happened. What you are saying isn't possible. Changing the EXE name improves performance for ATi for several reasons, but not the reason you are stating. Usually it is done to invoke driver optimizations that ATi applied for an older game on a newer one. It has nothing to do with the game devs at all. Renaming the EXE fools outside programs (the drivers), but it doesn't change what the program itself runs. The in-game optimizations won't change if you rename the EXE.
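To make the renaming point concrete, here's a minimal sketch of the idea: graphics drivers commonly keep per-game profiles keyed on the executable's file name, so renaming the EXE only changes which driver profile gets picked, not the game's own code. All names here (`oldgame.exe`, `newgame.exe`, the profile fields) are made up for illustration, not taken from any real driver.

```python
# Hypothetical sketch of a driver-side per-game profile lookup.
# The driver matches on the EXE file name; the game binary is unchanged.

PROFILES = {
    "oldgame.exe": {"shader_replacements": True, "aa_mode": "optimized"},
    "newgame.exe": {},  # no profile yet -> driver defaults apply
}

DEFAULTS = {"shader_replacements": False, "aa_mode": "app-controlled"}

def driver_profile(exe_name: str) -> dict:
    """Return the settings the driver would apply for this EXE name."""
    return {**DEFAULTS, **PROFILES.get(exe_name.lower(), {})}

# Renaming newgame.exe to oldgame.exe makes the driver apply the old
# game's optimizations; the game's in-engine code path is untouched.
print(driver_profile("newgame.exe"))
print(driver_profile("oldgame.exe"))
```

This is why the rename trick says nothing about what the game developers shipped: only the driver's name-based matching is being fooled.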
But I fear we are going way off topic, so I'll end it here. If you want to discuss it further, a dedicated topic would be best.