MS cut back DX10 at Nvidia's request; there have been a couple of articles about it online over the last couple of years. The G80 CAN'T do some of the stuff the original DX10 spec called for, so MS pulled those bits out, since at the time Nvidia was the only maker with a DX10 card (the 2900 wasn't available yet, as you full well know).
MS, I'm sure, hoped that by cutting back 10 and letting the G80 be a true DX10 card (by changing what DX10 really was) they'd be able to draw more people to Vista and DX10. It didn't work, mostly due to bad press and the fact that pre-SP1 Vista was a buggy pain in the ass to deal with.
You can compare the image quality of DX9, 10, and 10.1 in Assassin's Creed yourself and see that there's no problem there. You can also read the DX10.1 spec and see that what they referred to (the "missing rendering pass") is actually a SPECIFIC FEATURE of DX10.1, one that makes it more efficient than DX10 by letting the depth buffer be re-used instead of needing a second rendering pass.
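To make the "missing rendering pass" point concrete, here's a rough sketch of what that means. This is my own illustration, not Ubi's code; the function names are made-up placeholders, but it shows why the DX10.1 path is a whole geometry pass cheaper when AA is on:

```cpp
// Conceptual sketch only, not Ubisoft's actual renderer. Function names are
// invented placeholders; the point is the extra geometry pass the DX10.0 path
// needs versus the DX10.1 path, which can read the existing (multisampled)
// depth buffer from a shader and skip that pass.
#include <cstdio>

void renderScene()            { std::puts("render scene (color + depth)"); }
void renderDepthOnlyPass()    { std::puts("render scene again, depth only"); }   // the "missing" pass
void bindDepthAsShaderInput() { std::puts("bind existing depth buffer to the AA shader"); }
void runAntiAliasingResolve() { std::puts("run AA / post-process resolve"); }

void frameDX10_0() {              // DX10.0-class path
    renderScene();
    renderDepthOnlyPass();        // depth has to be regenerated so the AA pass can read it
    runAntiAliasingResolve();
}

void frameDX10_1() {              // DX10.1-class path
    renderScene();
    bindDepthAsShaderInput();     // DX10.1 allows reading the existing depth buffer directly
    runAntiAliasingResolve();     // no second geometry pass needed
}

int main() {
    std::puts("-- DX10.0 --"); frameDX10_0();
    std::puts("-- DX10.1 --"); frameDX10_1();
}
```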
Again, if you look at the statements Ubi made when interviewed about it, they don't hold up; they're vague, or they use double talk to avoid telling people what the real reason is.
To me it comes off as them saying whatever they have to in order to justify removing something that works fine for ATI owners.
It doesn't affect me directly, since through this whole time I've had a G92 card, yet you say I'm an ATI fanboi because I don't just accept the excuses Ubi and Nvidia put out for their actions.
Like Nvidia saying they didn't put 10.1 support in the GTX260 and GTX280 because "nobody's using 10.1". Then why bother supporting DX10 at all? NOBODY is truly using DX10 either, because it would cut off too large a portion of the market: the people running 2k/XP on DX9 hardware. By that logic they could have just made a really bitchin' DX9 card, since nobody's really using 10... but that would look really insane... (hell, it looks insane to me that they put out extremely high-priced cards with no DX10.1...)
But hey, you must be right, Nvidia can do no wrong after all...
Personally, I've seen the stuff nV has pulled over the years, and despite really liking my current card and being impressed by Nvidia's current driver development, I don't think they're what you seem to think they are. They're not flawless, and they're not above bribery and other dirty tricks to keep their lead in benchmarks.
I guess you also think the Doom 3 "conspiracy" was dreamed up by ATI fanboys?
To refresh your memory: Nvidia and id worked together and intentionally put in code that ran like SHIT on ATI hardware. They used texture lookups instead of shader math; Nvidia hardware did texture lookups insanely well back then, while ATI hardware did shader math insanely well. By editing one file and replacing the texture-lookup code with equivalent shader code, ATI cards became FASTER than Nvidia cards with no difference in quality (though those changes also slowed Nvidia cards down by more than the texture lookups had slowed ATI cards down).
In the end ATI put a fix in their drivers to work around the "problem". If you look at what they did, it clearly wouldn't have been hard to put both paths in the game and auto-detect ATI vs Nvidia to pick the proper path for each card, but they didn't...
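For anyone who never saw that tweak, here's the gist of it as a rough sketch. This isn't id's actual code and the exact curve is just an assumed example, but it shows how a baked lookup table and straight shader math can give the same result while hitting very different parts of the hardware:

```cpp
// Illustrative only, not id's shader code. The idea: one path fetches a
// falloff term from a precomputed lookup table (like sampling a LUT texture),
// the other computes the same curve directly (a pow() curve is assumed here
// just as an example). Same output, very different hardware cost: one
// vendor's cards of that era chewed through texture lookups, the other's
// chewed through shader arithmetic.
#include <cmath>
#include <cstdio>

// "Texture lookup" path: sample a baked 1D table.
float specularFromLUT(const float* lut, int size, float x) {
    int i = static_cast<int>(x * (size - 1));
    return lut[i];
}

// "Shader math" path: compute the same curve directly.
float specularFromMath(float x, float exponent) {
    return std::pow(x, exponent);
}

int main() {
    const int N = 256;
    float lut[N];
    for (int i = 0; i < N; ++i)                       // bake pow(x, 16) into the table
        lut[i] = std::pow(float(i) / (N - 1), 16.0f);

    const float x = 0.8f;
    std::printf("lookup: %f   math: %f\n",
                specularFromLUT(lut, N, x),
                specularFromMath(x, 16.0f));          // results match; only the cost differs
}
```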
This stuff has happened many times over the years.
Tiger Woods' first golf game, for example, wouldn't run in 3D mode on non-Nvidia cards; you could trick it into running in full 3D mode with all features by using an app to change the device ID to that of an Nvidia card.
And that was an early TWIMTBP title, and they've kept doing that kind of thing over the years. Hey, it's a good marketing move if you don't get caught, as they did with AC, Doom 3 and Tiger Woods (just three examples).
I mean, if you can keep your performance higher than the competition's for the first months of benchmarks, you're set; if you can keep it going longer, you're golden.
If you do get caught, you just have the company say the game/app needs a patch because of flawed code, or some other excuse.