
Intel Losing CPU Market-Share to AMD

No proof they didn't either, and their comments when interviewed don't lead me to believe they removed it for any reason other than that it gave ATI an advantage.

And the optimizations for Doom 3 took a user very little time to figure out; if you'd like, here's the post on MegaGames about it:
http://www.megagames.com/news/html/pc/doom3enhancetheexperiencept2.shtml



There are more advanced versions, but the MegaGames one is easy to find, which is why I use it :)

Fact is, as you can see, the changes were EASY to make and made a HUGE difference in performance for ATI cards, but id didn't include such shader/math-based code, because NVIDIA cards (at the time) did texture lookups faster than they did math.
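
For anyone curious what that kind of tweak boils down to, here's a rough sketch of the idea, written as plain C++ rather than the actual Doom 3 shader code; the function names, table size, and exponent are all made up for illustration. The stock approach reads a precomputed specular falloff out of a table (standing in for a texture lookup), while the tweak-style approach computes the same value with math, which is what ATI hardware of the time handled faster.

#include <cmath>
#include <cstdio>
#include <vector>

// "Texture lookup" path: read a precomputed specular falloff from a table.
float specular_via_lut(const std::vector<float>& lut, float n_dot_h) {
    int i = static_cast<int>(n_dot_h * (lut.size() - 1));
    return lut[i];
}

// "Math" path: evaluate the same falloff directly.
float specular_via_math(float n_dot_h, float exponent) {
    return std::pow(n_dot_h, exponent);
}

int main() {
    const float exponent = 16.0f;                // assumed specular power
    std::vector<float> lut(256);
    for (std::size_t i = 0; i < lut.size(); ++i) // precompute the table once
        lut[i] = std::pow(i / 255.0f, exponent);

    const float n_dot_h = 0.8f;                  // example N.H value
    std::printf("lut:  %f\n", specular_via_lut(lut, n_dot_h));
    std::printf("math: %f\n", specular_via_math(n_dot_h, exponent));
    return 0;
}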
And I guess innocent until proven guilty means very little to you?

Yeah, and that quote in no way goes against what I said about them optimizing for nV, but not sabotaging ATI. No matter how you look at it, it's not sabotage to NOT program for something's strong points. There is no conspiracy.
 

Dunno where this innocent-till-proven-guilty crap comes from; surely not the US legal system. I have enough experience with that to tell you it's guilty till proven innocent.

If you read all of the quote, basically they did something they knew would run poorly on ATI cards when they could have just included both. Optimizing for one vendor by doing something that will hamper performance on another is, in my eyes, bullshit.
 

Where does it say they left it out on purpose? And by adding that code, they would also have to add some sort of detection and switch routine. I don't see how that is sabotage.
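
For what it's worth, the "detection and switch routine" being talked about doesn't have to be large; a minimal sketch, with an entirely hypothetical enum, function, and vendor strings, might look like this. A real engine would query the driver (e.g. the GL vendor string) and handle many more combinations, which is where the extra dev work actually comes from.

#include <cstdio>
#include <string>

enum class SpecularPath { TextureLookup, MathHeavy };

// Pick a shader variant based on the reported GPU vendor (hypothetical).
SpecularPath pick_specular_path(const std::string& vendor) {
    if (vendor.find("ATI") != std::string::npos ||
        vendor.find("AMD") != std::string::npos)
        return SpecularPath::MathHeavy;     // math-heavy shader runs faster here
    return SpecularPath::TextureLookup;     // default / NVIDIA-friendly path
}

int main() {
    // In a real renderer the string would come from the driver,
    // e.g. glGetString(GL_VENDOR); here it is hard-coded for the demo.
    SpecularPath path = pick_specular_path("ATI Technologies Inc.");
    std::printf("using %s path\n",
                path == SpecularPath::MathHeavy ? "math-heavy" : "texture-lookup");
    return 0;
}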
 
That's like saying that games using Havok are fighting PhysX development, and therefore Nvidia.

If the programmers don't have the time or money to spend on optimizing, you can throw them some money to get it done. If ATI had helped with funding too, I am sure both would have been running on par.

I don't see why it has to be a conspiracy. ;)
 
I've known TWIMTBP titles to actually play extremely well on ATI parts.
I didn't call you a fanboy. I said fanboys made it up. Did you make it up?

And Ubi never said a render pass was missing, like the DX10.1 feature you are referring to. They said their implementation was buggy. If you want to take that as a conspiracy against ATI by nV and Ubi, be my guest.

And none of what you are saying has any solid backing in terms of evidence. No proof of motive exists. No, NV is not an angel of a company, nor is MS, Intel, or AMD. They are all guilty of something shady at any given point in time, but just because a game has the TWIMTBP tag on it does not mean that the developer is doing anything to hurt ATI. Yes, they optimize for nV, because nV provides them the means to do so, but they don't sabotage ATI like so many want to believe.
 
Okay, I wasn't sure what to think, and IMO you can call the Doom 3 coding whatever you want. Great support for Nvidia and/or conspiracy against ATI; either way, they could have easily fixed ATI's support in that game, but no matter now.

However, I found this article about the DX10.1 removal in Assassin's Creed interesting.

http://techreport.com/discussions.x/14707

So they responded that there were no image quality differences with DX10.1 compared to DX10, only performance improvements for compliant hardware. Then they state that they didn't want there to be a bad gaming experience. Why would increased performance lower the gaming experience? It just sounds like bullshit to me. We see benchmarks of great performance with ATI in a game that is TWIMTBP, then it's removed and no thought has been given since to reinstating DX10.1.
 
Then they state that they didn't want there to be a bad gaming experience. Why would increased performance lower the gaming experience?
Because the code for DirectX 10.1 was using a separate rendering path so fixing bugs in the DirectX 10/9 rendering path could easily cause complications in the DirectX 10.1 code. It's easier just to remove the DirectX 10.1 render path and focus on improving the DirectX 10/9 path. Ya know, fix it for the masses, not the few.
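
To make the "separate rendering path" point concrete, here's a hypothetical sketch of how such a split tends to look; none of these names are Ubisoft's actual code. The cost isn't the switch itself, it's that every bug fix then has to be written and verified for each path, so dropping the least-used one is the cheap way out.

#include <cstdio>

enum class FeatureLevel { DX9, DX10, DX10_1 };

// One function per API level: three code paths to write, test, and debug.
void render_frame_dx9()    { std::puts("rendering with the DX9 path");    }
void render_frame_dx10()   { std::puts("rendering with the DX10 path");   }
void render_frame_dx10_1() { std::puts("rendering with the DX10.1 path"); }

void render_frame(FeatureLevel level) {
    switch (level) {
        case FeatureLevel::DX10_1: render_frame_dx10_1(); break;
        case FeatureLevel::DX10:   render_frame_dx10();   break;
        default:                   render_frame_dx9();    break;
    }
}

int main() {
    // Dropping DX10_1 means one less branch to keep in sync with every fix.
    render_frame(FeatureLevel::DX10_1);
    return 0;
}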
 
*looks at Q9450 with a weird, undecided face*
 
Because the code for DirectX 10.1 was using a separate rendering path so fixing bugs in the DirectX 10/9 rendering path could easily cause complications in the DirectX 10.1 code. It's easier just to remove the DirectX 10.1 render path and focus on improving the DirectX 10/9 path. Ya know, fix it for the masses, not the few.

Says the person with the Nvidia card.
 
Why use an "inferior" path? I mean 10.1 runs better.
 

For a much smaller number of people. It would've meant extra dev time and money to perfect. It runs in DX10 on all of the modern cards; it only runs in DX10.1 for a small percentage. Some of their later games have it. It just wasn't a priority to add it back into AC.
 
Just don't update AC; you can play the game with 10.1 if you don't update it... and it works perfectly...
 
For a much smaller number of people. It would've meant extra dev time and money to perfect. It runs in DX10 on all of the modern cards; it only runs in DX10.1 for a small percentage. Some of their later games have it. It just wasn't a priority to add it back into AC.

So people with "bleeding edge" 10.1 get the shaft because Nvidia didn't develop accordingly?
 

nVidia didn't develop the title. Ubi did. Just like most of the major dev houses, they feel it's a waste to spend too much dev time on patching games unless they really need it. AC doesn't need 10.1 to run, so they felt their dev money was better spent elsewhere, especially considering that all modern cards can already run it in DX10 anyway. Why implement and debug DX10.1 for a minority of gamers, on a game that is perfectly stable without it, when they can use that dev time for newer titles?

Some of their newer titles do have 10.1 back in them.
 
I believe he was referring to Nvidia failing to implement DX10.1 on their GPUs, not the game itself. I agree, lol. Why can't a very successful company like Nvidia implement DX10.1 when ATI could? I do realize that ATI released their first DX10/10.1 card six months after the G80, but damn, Nvidia's had plenty of time to get DX10.1. It just amazes me how much pull Nvidia has over so many other large companies like Microsoft. I realize that it made sense for Microsoft to lower the DX10 requirements at first because Nvidia had the only DX10-compatible cards, but after SP1 it should have been strictly DX10.1; that could have turned the tables in an instant, and performance in games such as Crysis and STALKER might have been much better.
 