Assassin's Creed? TWIMTBP games?
That's not proof of anything, and I provided proof of that via PM because I didn't want to go off-topic. Basically, Nvidia didn't make Ubisoft retire DX10.1 from Assassin's Creed, because the ONLY DX10.1 feature that AC uses is multisample depth buffer readback, something Nvidia has always supported on its DX10 cards. Plain and simple.
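For context, here's roughly what that one feature amounts to in code. This is a minimal sketch, NOT Ubisoft's actual code; the function name and the device/width/height/samples parameters are my own assumptions. Only the D3D10.1 API calls, formats, and bind flags are real. The point is the last view creation: under plain DX10.0 you couldn't create a shader resource view over a multisampled depth texture, so the AA resolve pass had to re-render depth. DX10.1 made the combination legal, and Nvidia's DX10 hardware could already do it through the driver.

// Sketch: an MSAA depth buffer that shaders can read back, i.e. the
// one DX10.1 feature AC used. Names here are hypothetical.
#include <d3d10_1.h>

HRESULT CreateReadableMsaaDepth(ID3D10Device1* device,
                                UINT width, UINT height, UINT samples,
                                ID3D10Texture2D** tex,
                                ID3D10DepthStencilView** dsv,
                                ID3D10ShaderResourceView** srv)
{
    // Typeless format so the same memory can be viewed both as a
    // depth-stencil target and as a readable texture.
    D3D10_TEXTURE2D_DESC td = {};
    td.Width = width;  td.Height = height;
    td.MipLevels = 1;  td.ArraySize = 1;
    td.Format = DXGI_FORMAT_R24G8_TYPELESS;
    td.SampleDesc.Count = samples;          // multisampled
    td.Usage = D3D10_USAGE_DEFAULT;
    td.BindFlags = D3D10_BIND_DEPTH_STENCIL | D3D10_BIND_SHADER_RESOURCE;
    HRESULT hr = device->CreateTexture2D(&td, nullptr, tex);
    if (FAILED(hr)) return hr;

    // Depth-stencil view: written during the normal geometry pass.
    D3D10_DEPTH_STENCIL_VIEW_DESC dd = {};
    dd.Format = DXGI_FORMAT_D24_UNORM_S8_UINT;
    dd.ViewDimension = D3D10_DSV_DIMENSION_TEXTURE2DMS;
    hr = device->CreateDepthStencilView(*tex, &dd, dsv);
    if (FAILED(hr)) return hr;

    // Shader resource view: lets a later pass READ the individual depth
    // samples (Texture2DMS<float> in HLSL) instead of re-rendering depth.
    // This MSAA depth SRV is what DX10.0 forbade and DX10.1 allowed.
    D3D10_SHADER_RESOURCE_VIEW_DESC sd = {};
    sd.Format = DXGI_FORMAT_R24_UNORM_X8_TYPELESS;
    sd.ViewDimension = D3D10_SRV_DIMENSION_TEXTURE2DMS;
    return device->CreateShaderResourceView(*tex, &sd, srv);
}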
Also, there are no bribes under TWIMTBP, nor anything shady there, just as there isn't in any of the games under the less-known AMD program of the same nature. TWIMTBP games run better on Nvidia's cards because they were optimized better, thanks to on-site help from Nvidia engineers. ATI does the same, but with less dedication, except on the titles under its own program. The end result is that the games each company supported more heavily get better performance on that company's cards. But NEVER does either company's involvement in optimizing a game make that game run worse on the other's cards. It only seems to run worse by comparison, because it runs better on the first.
Optimization is very important and can improve performance by as much as 25-50%; just look at newer driver releases with game-specific optimizations. What makes you think in-game optimization is any less important, or easier to perform? It sometimes takes ATI (or Nvidia) many driver releases (months) to manage an improvement in an already-released game, and THEY KNOW better than anyone how their own cards work. Game developers do the best they can, but without that deep knowledge of the architecture there's very little they can do to climb that last rung. Through TWIMTBP, Nvidia does something ATI has never done: it gives developers part of that knowledge, a bit of the secrets inside its chips, with all the risk that entails, because Nvidia offers that info BEFORE the cards have been released. And not only that: Nvidia (ATI never did this either) also gives developers access to a lab with 500+ different PC configurations, with many different GPUs, mobos, CPUs, etc.
Game developers write a different rendering path for each brand because half the performance-critical code at the core of the engine is always written with a specific GPU architecture in mind. How deeply each GPU company gets involved in the development of that code, and how much key info it reveals, is what determines the future performance of its cards. But the other rendering path is not altered, unless in the process they find something that would increase performance in the other path too, which happens a lot, BTW. A rough sketch of what per-brand path selection looks like is below.
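To illustrate, this is one common way an engine picks a vendor path at startup. The SelectRenderPath function and RenderPath enum are hypothetical names of my own; only the DXGI adapter query and the PCI vendor IDs are real. A sketch under those assumptions, not any particular engine's implementation:

// Sketch: picking a vendor-specific render path at startup.
// 0x10DE = Nvidia, 0x1002 = ATI/AMD (real PCI vendor IDs).
#include <dxgi.h>

enum class RenderPath { Nvidia, Ati, Generic };

RenderPath SelectRenderPath()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return RenderPath::Generic;

    RenderPath path = RenderPath::Generic;
    IDXGIAdapter* adapter = nullptr;
    if (factory->EnumAdapters(0, &adapter) == S_OK) {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        if (desc.VendorId == 0x10DE)
            path = RenderPath::Nvidia;   // path tuned with Nvidia's input
        else if (desc.VendorId == 0x1002)
            path = RenderPath::Ati;      // path tuned with ATI's input
        adapter->Release();
    }
    factory->Release();
    return path;
}

Each path then uses the shaders and resource layouts that suit that architecture best, which is exactly where the vendor's engineers can help the most.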
I'm TIRED of the FALSE accusations made against various RESPECTABLE game developers. I couldn't care less if the bribery accusations only involved Nvidia, but they implicate a good 60%+ of game developers, and that is something I will never permit.
Plus, it's simply stupid to think that a company with $100-300 million in profits a year can bribe a game industry that earns more than $20 billion per year. It's stupid.