Wednesday, April 1st 2009

Intel Losing CPU Market-Share to AMD

With the introduction of the K8 architecture years ago, AMD made significant inroads into CPU market share at the expense of Intel. That growth ceased with Intel's introduction of the competing Core microarchitecture, after which AMD slid into deep financial trouble. The company recently spun off its manufacturing division to form GlobalFoundries, with investment from the Advanced Technology Investment Company.

With the introduction of the 45 nm Phenom II series of processors, however, demand for AMD has risen sharply, led by the Phenom II X3 700 series triple-core and Phenom II X4 920 quad-core desktop processors. The surge in demand follows recent price cuts by the company. Motherboard vendors forecast that AMD's overall global desktop-processor market share will grow by 30 percent in Q2 2009. With a conservative estimate of its current market share at around 20 percent, that growth would lift the figure to 26 percent. The company plans to further expand its desktop CPU lineup with an entry-level desktop platform before September.
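As a quick back-of-the-envelope check of that forecast (the 20 percent base is only a rough estimate, so the result is approximate), the math works out as follows:
Code:

# Rough sanity check of the forecast: 30% relative growth on an estimated 20% share.
current_share = 0.20       # conservative estimate of AMD's desktop CPU share
forecast_growth = 0.30     # relative growth forecast for Q2 2009

projected_share = current_share * (1 + forecast_growth)
print(f"Projected share: {projected_share:.0%}")   # prints "Projected share: 26%"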
Source: DigiTimes

115 Comments on Intel Losing CPU Market-Share to AMD

#76
[I.R.A]_FBi
FryingWeesel: good news.

And a little FYI for you all: the i5 is just a Core 2 chip with an IMC. The performance isn't that much better; in fact, many benches don't show any significant gain at all.

The i5 is also planned to be very hard to overclock. Intel wants to block overclocking unless you buy their "enthusiast" platforms; they have been working on ways to make the CPU fail if you overclock it (no joke).

So I will stick with my AMD rigs. I can't see AMD removing overclocking from their CPUs; hell, their Black Edition chips are a great buy IMHO.
Core 2 is still competitive, so no problems.
Posted on Reply
#77
pepsi71ocean
Thank god AMD is making a comeback. Hopefully it will cause Intel to cut their prices to remain competitive.
Posted on Reply
#78
[I.R.A]_FBi
pepsi71ocean: Thank god AMD is making a comeback. Hopefully it will cause Intel to cut their prices to remain competitive.
This is what everyone should look forward to
Posted on Reply
#79
alucasa
SparkyJJO: That surprising? :D
Well?

I thought all CEOs did was play golf and put stamps on documents. How naive of me!
j/k, of course. I know better than that.

Still, it's quite surprising that, once Hector was gone, AMD started to get better. Hector was the bad mojo, or... you know, he sucked at what he was supposed to do.

How he got his fame is beyond me.
Posted on Reply
#80
Wile E
Power User
wiak: Assassin's Creed's developer was gagged and slapped by NVIDIA when ATI owned NVIDIA in the game :P
You mean the whole 10.1 issue? No, it wasn't. 10.1 didn't give the gains; they accidentally left out part of the code in the patch, so the 10.1 cards weren't having to render as much as the 10.0 cards. That's where the performance gains came from. Once that code was corrected, the gains disappeared. The conspiracy was made up by ATI fanboys.
Posted on Reply
#81
HammerON
The Watchful Moderator
I remember the first computer I built with an Athlon 64 3200. Then a 3500, then a 3700, then a 4000, and finally an FX-53. Those were all great CPUs compared to Intel's (at the time). Then I switched to the Core 2 Duo.
It is nice to see AMD back in the "game" and applying pressure to Intel :toast:
Posted on Reply
#82
FryingWeesel
Wile E: You mean the whole 10.1 issue? No, it wasn't. 10.1 didn't give the gains; they accidentally left out part of the code in the patch, so the 10.1 cards weren't having to render as much as the 10.0 cards. That's where the performance gains came from. Once that code was corrected, the gains disappeared. The conspiracy was made up by ATI fanboys.
Wrong. One of the main features of 10.1 was/is to remove that extra rendering pass, so no code was left out and no effects were being missed. Fact is, that was an excuse to explain the patch, not a conspiracy made up by ATI fanboys. In this case it really is a fact that the game was patched to keep NVIDIA happy, since they had dumped money (or in this case hardware) into helping develop the game.

There are plenty of links about it, and those that go into depth explain it quite well: 10.1 removes the need for extra rendering passes for some effects, the same effects that gave the performance boost to ATI cards.

So you can read up about this and get the FACTS, not the excuses used by Ubi to placate NVIDIA.

techreport.com/discussions.x/14707
.....So we have confirmation that the performance gains on Radeons in DirectX 10.1 are indeed legitimate. The removal of the rendering pass is made possible by DX10.1's antialiasing improvements and should not affect image quality. Ubisoft claims it's pulling DX10.1 support in the patch because of a bug, but is non-commital on whether DX10.1 capability will be restored in a future patch for the game....
Basically it was removed to erase the advantage ATI had shown due to their cards supporting 10.1, when NOTHING NVIDIA had, or even has today, can support 10.1 (true DX10).
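To picture what that extra pass means in practice, here's a rough sketch of a frame with and without the DX10.1 depth-buffer reuse (the function names are just made-up stand-ins, nothing from Ubi's actual engine):
Code:

# Conceptual sketch only -- these functions are invented stand-ins, not Ubisoft code.
def render_geometry(scene):
    # main pass: produces the colour buffer and the multisampled depth buffer
    return "color", "msaa_depth"

def render_depth_only(scene):
    # extra depth-only pass, needed under plain DX10 because the MSAA depth
    # buffer can't be read back as a texture
    return "depth_copy"

def post_process(color, depth):
    # effects that need scene depth when AA is enabled
    return (color, depth)

def frame_dx10(scene):
    color, _ = render_geometry(scene)
    depth = render_depth_only(scene)       # the pass DX10.1 makes unnecessary
    return post_process(color, depth)

def frame_dx10_1(scene):
    color, depth = render_geometry(scene)  # reuse the existing depth buffer directly
    return post_process(color, depth)

print(frame_dx10("scene"), frame_dx10_1("scene"))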
Posted on Reply
#83
FryingWeesel
little more info:
10.1 in Assassin's Creed is actually legitimate because you can just reuse the depth buffer in DX10.1 instead of a second pass.

Again, something NVIDIA cards can't do, because NVIDIA didn't want to support true DX10 (hence MS cutting the DX10 specs and having to bring out DX10.1 later).

ATI on the other hand had true DX10 (now called 10.1) support with the HD 2000 cards, but... well, NVIDIA didn't want to follow MS's specs, and cried enough that MS backed down and removed the stuff NVIDIA couldn't/wouldn't support.

Mind you, I'm on an 8800 GTS 512... so don't say I'm an NVIDIA hater. I love this card, but I don't love the actions of the company behind it.

www.pcgameshardware.de/aid,645430/Ubisoft-No-Direct-3D-101-support-for-Assassins-Creed-planned/Assassins-Creed/News/
You might remember: The enormously successful Assassin's Creed was the first game to have DX 10.1 support. But the patch 1.02 removed this feature.

PCGH was able to get some more information about this business. Below you will find an email interview with Ubisoft.

PCGH: D3D 10.1 support in Assassin's Creed was a hidden feature. Why do you choose not to announce this groundbreaking technology?
Ubisoft: The support for DX10.1 was minimal. When investigating the DX10 performance, we found that we could optimize a pass by reusing an existing buffer, which was only possible with DX10.1 API.


PCGH: What features from Direct 3D 10.1 do you use with the release version? Why do they make Assassin's Creed faster? And why does FSAA work better on D3D 10.1 hardware?
Ubisoft: The re-usage of the depth buffer makes the game faster. However, the performance gains that were observed in the retail version are inaccurate since the implementation was wrong and a part of the rendering pipeline was broken.
This optimization pass is only visible when selecting anti-aliasing. Otherwise, both DX10 and DX10.1 use the same rendering pipeline.


PCGH: Why do you plan to remove the D3D 10.1 support?
Ubisoft: Unfortunately, our original implementation on DX10.1 cards was buggy and we had to remove it.


PCGH: Are there plans to implement D3D 10.1 again?
Ubisoft: There is currently no plan to re-implement support for DX10.1.
If that doesn't look like somebody just making excuses for patching out something that offers a benefit to the "other team", I dunno what you've been smoking...
Posted on Reply
#84
FordGT90Concept
"I go fast!1!11!1!"
Let me put this in bullets:

1) Most games are still designed for DirectX 9.0c so they don't lose the enormous customer potential of Windows 98 through Windows XP.

2) DirectX 10 support is usually coded as an alternate in software (it's easier on the hardware when run on Vista). That is, it is more or less the same as DirectX 9.0c. Very, very few developers go out of their way to focus on DirectX 10 support (ehm, games released exclusively for DirectX 10).

3) DirectX 10, being mostly useless from the sales and development standpoint, carries over to DirectX 10.1; however, even fewer people have DirectX 10.1 support than DirectX 10.

4) Ubisoft developed the game with DirectX 10.1 in mind. First, they saw that NVIDIA announced they had no plans to support DirectX 10.1. Then they ran into problems themselves with the DirectX 10.1 code path in the game after it was launched. They estimated that about 1 out of every 10 cards playing the game could handle DirectX 10.1 and decided it would cost too much to fix the botched code compared to just removing it altogether.

And that's pretty much it. It wasn't worth fixing so they removed it. NVIDIA's dominance and them saying they won't support DirectX 10.1 may have something to do with deciding it wasn't worth fixing but, as with most publishers, it comes down to cost. The cost to fix it exceeded the amount they were willing to pay so they just got rid of it.
Posted on Reply
#85
soryuuha
4) is just for the Assassin's Creed case, right?

Since Ubi put DX10.1 in Tom Clancy's HAWX.
Posted on Reply
#86
ShadowFold
HAWX runs awesome with DX10.1 on, and so does STALKER, but that's an AMD game, so... I really like the boosts I get with games that have 10.1; too bad NVIDIA doesn't use it. I don't get why they don't; it really is great...
Posted on Reply
#87
a_ump
Exactly. I mean, why in the hell would publishers even care what NVIDIA has to say? I sure as hell wouldn't. All TWIMTBP games carry that label because NVIDIA sponsors them and gives them cash, right? If games were DX10.1, I'd bet my entire system that ATI and NVIDIA would have switched roles, with ATI on top.
Posted on Reply
#88
FryingWeesel
FordGT90Concept: Let me put this in bullets: [...] And that's pretty much it. It wasn't worth fixing so they removed it. [...] The cost to fix it exceeded the amount they were willing to pay so they just got rid of it.
Read my posts + links. Fact is, there was no botched code; it was just an excuse to remove something that was making TWIMTBP look bad. There was no "need" to remove it; the need came from greed. NVIDIA was PISSED that an NVIDIA game was running better on ATI hardware due to a FEATURE of DX10 (the ability to avoid a second rendering pass by re-using the depth buffer).

I have personally seen the game on ATI hardware vs. my 8800 GTS; it looks/runs better on a 3850/3870 or even a 2900 XT than it runs for me on my 8800 GTS 512 MB (755/1900/2200) once AA is enabled.

The R600 and higher are TRUE DX10 (what's now called 10.1) cards; the 4000-series cards add back some features of DX9 cards (hardware AA support instead of doing it all in shaders).

Had NVIDIA not refused to support true DX10 and convinced MS to dumb it down, they would have benefited from one less rendering pass being needed. But NV refuses to support 10.1, and when it showed a benefit for ATI on a game NV supported (either with cash, advertising or hardware), NV was PISSED and got Ubi to remove it...

It's not a conspiracy theory, it's just business, and NV doing what I would call a dirty trick on the public at large, even their own customers.
Posted on Reply
#89
a_ump
lol, just the fact that they wouldn't get DX10.1 going for their hardware makes me laugh at them. I mean, I wonder how much better performance we would have in games like STALKER and Crysis if they were DX10.1, not 10. Not to mention that is probably what Microsoft had in mind when they said DX10 would run better than DX9; they were referring to what we call DX10.1, well, at least that's my theory. NVIDIA is so pathetic :roll:
Posted on Reply
#90
FryingWeesel
a_ump, that's exactly what they were referring to. There are features that can improve both performance and quality that were removed from "10" to placate NVIDIA. As such, we are not getting the best possible game experience; instead we get DX9c games with some DX10 shader effects tacked on, and when a company puts out a true DX10.1 path on a TWIMTBP title, giving 10.1 hardware better performance, NVIDIA has it removed because it makes them look bad.

Hell, the G80, G92, GT200, and we still don't see DX10.1 out of NVIDIA. They COULD do it, but it would take more work than just re-using stuff they already have :/
Posted on Reply
#91
Wile E
Power User
How do you know MS removed them to placate NV? How do you know there wasn't botched code in Assassin's Creed? Everything you are claiming has no solid evidence.

Plain and simple, it's a conspiracy theory made up by ATI fanboys to make themselves feel better. Nv never paid off the multitude of other vendors whose TWIMTBP titles ran better on ATI hardware.

It's all a bunch of BS.
Posted on Reply
#92
FryingWeesel
MS cut back 10 at NVIDIA's request; there have been a couple of articles about it online over the last couple of years. The G80 CAN'T do some stuff that the original DX10 specs called for, so MS pulled those features out, since at the time NVIDIA was the only maker with a DX10 card (the 2900 wasn't available yet, as you full well know).

MS, I'm sure, hoped that by cutting back 10 and allowing the G80 to be a true DX10 card (by changing what DX10 really was), they would be able to draw more people to Vista and DX10. It didn't work, mostly due to bad press and the fact that pre-SP1 Vista was a buggy pain in the ass to deal with.

You can compare the image quality of DX9, 10 and 10.1 in Assassin's Creed yourself and see that there's not a problem. You can read the DX10.1 specs and see that what they referred to (the "missing rendering pass") is also a SPECIFIC FEATURE of DX10.1 that makes it more efficient than DX10, by allowing the depth buffer to be re-used instead of needing a second rendering pass.

Again, if you look at the statements Ubi made when interviewed about it, they don't hold up; they are vague or use double talk to avoid telling people what the real reason is.

To me it comes off as them saying whatever they have to in order to justify removing something that works fine for ATI owners.

It doesn't affect me directly, as through this whole time I have had a G92 card, yet you say I'm an ATI fanboy because I don't just accept the excuses Ubi and NVIDIA put out for their actions.

Like NVIDIA saying they didn't put 10.1 support in the GTX 260 and GTX 280 cards because "nobody's using 10.1". Then why even bother supporting DX10 at all? NOBODY is truly using DX10, because it would cut off too large a portion of the market, those people running 2K/XP with DX9 hardware. They could have just made a really bitchin' DX9 card since nobody's really using 10... but that would look really insane... (hell, it looks insane to me that they put out extremely high-priced cards with no DX10.1...)

But hey, you must be right, NVIDIA can do no wrong after all... :rolleyes:

Personally, I have seen the stuff NV has pulled over the years, and despite really liking my current card and being impressed by NVIDIA's current driver development, I don't think they are what you seem to think they are. They are not flawless; they are not above bribery and other dirty tricks to keep their lead in benchmarks.

I guess you also think that the Doom 3 "conspiracy" was thought up by ATI fanboys?

To refresh your memory: NVIDIA and id worked together and intentionally put in code that would run like SHIT on ATI hardware. They used "texture lookups" instead of shader code; NVIDIA hardware did texture lookups insanely well back then, while ATI's hardware did shader work insanely well. By editing one file and replacing the texture-lookup code with equivalent shader code, ATI cards became FASTER than NVIDIA cards with no quality difference (but these changes also slowed NVIDIA cards down even more than texture lookups slowed ATI cards down).

In the end ATI put a fix in their drivers to get around the "problem". Clearly, if you look at what they did, it wouldn't have been hard to put both paths in the game and have it auto-detect ATI vs. NVIDIA and use the proper path for each card, but they didn't...

This stuff has happened many times over the years.

The first Tiger Woods golf game, for example, wouldn't run in 3D mode on non-NVIDIA cards; you could trick it into running in full 3D mode with all features by using an app to change the device ID to that of an NVIDIA card.
That was an early TWIMTBP title, and they have continued to do that kind of stuff over the years. Hey, it's a good marketing move if you don't get caught, as they did with AC, Doom 3 and Tiger Woods (just three examples).

I mean, if you can keep your performance higher than the competitor's for the first months of benching, you're set; if you can keep it going longer, you're golden.

If you get caught, you just get the company to say the game/app needs to be patched because of flawed code or some other excuse.
Posted on Reply
#93
Wile E
Power User
FryingWeesel: MS cut back 10 at NVIDIA's request; there have been a couple of articles about it online over the last couple of years. [...] If you get caught, you just get the company to say the game/app needs to be patched because of flawed code or some other excuse.
I didn't call you a fanboy. I said fanboys made it up. Did you make it up?

And Ubi never said a render pass was missing, like the DX10.1 feature you are referring to. They said their implementation was buggy. If you want to take that as a conspiracy against ATI by nV and Ubi, be my guest.

And none of what you are saying has any solid backing in terms of evidence. No proof of motive exists. No, NV is not an angel of a company, nor are MS, Intel, or AMD. They are all guilty of something shady at any given point in time, but just because a game has the TWIMTBP tag on it does not mean the developer is doing anything to hurt ATI. Yes, they optimize for nV, because nV provides them the means to do so, but they don't sabotage ATI like so many want to believe.
Posted on Reply
#94
HammerON
The Watchful Moderator
Well stated Wile E
Posted on Reply
#95
DaedalusHelios
People who buy Intel have had a good, competitive processor for the past year or three. The cutting-edge types still upgrade, though.

People who only buy AMD (for whatever reason) finally found a great, competitive processor in the Phenom II.

So the "AMD only" group wasn't very motivated until the Phenom 2, for an upgrade. Most upgraded from the dismal original phenom or the good old ground breaking X2 939 or AM2.

I buy AMD and Intel. Why would you limit yourself to only one or the other? It's not a sports team... it's a processor.


PS: I am just saying that Intel was ahead of the game by a lot from the Core 2 launch until the Phenom II finally caught up, though it's still behind Core i7.
Posted on Reply
#96
FryingWeesel
Some do and some don't do things to hamper performance on ATI/NVIDIA cards in titles tied to the other company's hardware, as you should full well know. Some companies are well known for it; most aren't so blatant about it, though.

Many times you see "unexplainable" performance issues with one or the other company's hardware for no apparent reason. I mean, HL2 vs. Doom 3: well, ATI is just better at D3D, and the game/engine was also optimized, at least at the time, for ATI, BUT it also had rendering-path optimizations that helped some NVIDIA cards run better as well. Doom 3 had a specific piece of code that ran VERY poorly on ATI hardware; somebody found the fix and posted it (then ATI's driver department figured out how to fix it in drivers with that info).

id is one of those companies I used to have nothing but respect for. They used to be very even-handed; they would add optimizations for most of the commonly available hardware: 3dfx, ATI, NVIDIA, hell, even PowerVR got support in Quake 1 and 2. Then came Doom 3...

There are things I will accept as optimizations and things I won't accept as purely being optimizations. Doom 3 is one title that was clearly coded with extreme bias toward NVIDIA (it would have been easy to put both code paths in). AC, well, from what I've read myself it's very clear that NV pressured Ubi to "fix" their problem, and the easiest fix was to just disable/remove DX10.1 and say it was flawed/borked...
Posted on Reply
#97
Wile E
Power User
FryingWeesel: Some do and some don't do things to hamper performance on ATI/NVIDIA cards in titles tied to the other company's hardware, as you should full well know. [...] AC, well, from what I've read myself it's very clear that NV pressured Ubi to "fix" their problem, and the easiest fix was to just disable/remove DX10.1 and say it was flawed/borked...
Adding all those optimizations costs more development money, something the parent companies of the devs take very seriously nowadays. Dev teams no longer get the time or budget they once did.

And again, still no proof exists that Ubi pulled 10.1 as a favor to nV.
Posted on Reply
#98
DaedalusHelios
Wile E: Adding all those optimizations costs more development money, something the parent companies of the devs take very seriously nowadays. Dev teams no longer get the time or budget they once did.

And again, still no proof exists that Ubi pulled 10.1 as a favor to nV.
The hardware DX10.1 compliance from ATI isn't compatible with the latest Microsoft DirectX 10.1 software provided to developers by Microsoft. I remember reading about it and thinking, no wonder nobody bothers with 10.1. :laugh:

I know that affected the 3xxx-series ATI cards, but maybe the 4xxx series fixed the mistake?
Posted on Reply
#99
FryingWeesel
No proof they didn't, either, and their comments when interviewed don't lead me to believe they removed it for any reason other than that it gave ATI an advantage.

And the optimizations for Doom 3 took a user very little time to figure out; if you would like, I could link the post on MegaGames about it...
www.megagames.com/news/html/pc/doom3enhancetheexperiencept2.shtml
Enhance the ATI Experience


It is, of course, a well known fact that Doom 3 is a game which performs best when using boards by nVidia. This has left ATI fans frustrated and eager for a driver update or some other fix. Since ATI has not yet responded, a way of improving the way Doom 3 handles on ATI cards has been posted on the Beyond3D forums. According to the author, the fix can raise the frame rate from 34 fps at 1280x1024 to 48 fps. Changes would, of course, depend on each individual set-up. A further suggestion from the forum is that the fix really kicks in if vsync is enabled. Please feel free to post your experience with the fix on the MegaGames Forums.

The fix involves changing some code which can be found in the Doom 3 pak000.pk4 file. For those not interested in the technical side of the fix, an already changed file is available by following the download tab above. Extract so that the shader file goes under doom3\base\glprogs. This replaces a dependent texture read with equivalent math, which runs better on ATI cards, but seems to run slower on NV boards, so only apply this if you got an ATI card.

...this should be good enough proof that ATI hardware can run Doom3 just as good if not better than nVidia, and that we can pass on all the "ATI suck in OpenGL", "ATI's drivers suck" etc. into the trashcan where it belongs.

The full, do-it-yourself, fix is as follows:

I picked up Doom 3 today and let me begin by saying it's a kickass game so far. A few minuses, like weapon reloading (which I find adds nothing to a game except annoyance, so I don't know why many devs keep adding it to their games), but overall much above my expectations.

Anyway, to the fun part, exploring the technology.
I think I've found the source of why this game runs comparably slow on ATI hardware vs. nVidia at the moment, and found a solution to the problem.

First, open your doom3\base folder. Double-click on the pak000.pk4 file. In the "Windows can't open this file... bla bla" dialog, go on and associate the file with an app like WinRAR. With this file open in WinRAR, go to the glprogs directory in the file. In there you'll find the shaders. The interaction.vfp file seems to be the main rendering shader. Altering this shader to output a constant color turns most objects into that constant color, except for stuff like computer screens etc.

So double-click the interaction.vfp file to open it (you may have to associate the .vfp extension with a text editor like Notepad or WordPad first, since we're going to edit the file). Scroll down to the fragment shader. You'll find these rows:

Code:

PARAM subOne = { -1, -1, -1, -1 };
PARAM scaleTwo = { 2, 2, 2, 2 };


Add this right below them:

Code:

PARAM specExp = { 16, 0, 0, 0 };


Now scroll down to this:

Code:

# perform a dependent table read for the specular falloff
TEX R1, specular, texture[6], 2D;


Comment out that line by adding a "#" to it, and add another line that will do the same thing with math instead, so it should look like this:

Code:

# perform a dependent table read for the specular falloff
# TEX R1, specular, texture[6], 2D;
POW R1, specular.x, specExp.x;


Save the file and close your text editor. WinRAR will ask if you want to update the file in the archive; select yes. Close WinRAR and enjoy about 40% higher performance in Doom 3. Haven't done extensive testing yet, but my performance went from 34 fps at 1280x1024 to 48 fps.

Conclusion and discussion:
I don't want to complain about Carmack's work, I still consider him to be the industry leader in graphics engines. Though when I read the shader it struck me how many texture accesses it did compared to the relatively short shader, even for stuff that could just as well be done with math for a small cost in instructions. Using a dependent texture lookup for POW evaluation makes a lot of sense for R200-level hardware due to instruction set limits, but for R300 and up it's much better to just spend the three cycles it takes to evaluate POW with math instead of risking texture cache thrashing with a dependent texture read, which may be much more costly, especially since the access pattern in this case will be far from linear. Also, using math improves the quality too, even though it may not be very noticeable in this game.

I should point out though that I'm not sure if the constant specular factor 16 that I chose is the one that the game uses, so output may be slightly different, but if this solution will be worked into the game in a future patch, then this is easily configurable by the game so that there won't be a difference, except a lot faster.

An interesting follow-up discussion may be why this dependent texture lookup is much slower on our hardware than on nVidia. Maybe there's an architectural difference that's to blame, or maybe something else? The main point here though is that this should be good enough proof that ATI hardware can run Doom3 just as good if not better than nVidia, and that we can pass on all the "ATI suck in OpenGL", "ATI's drivers suck" etc. into the trashcan where it belongs.
There are more advanced versions, but the MegaGames one is easy to find, that's why I use it :)

Fact is, as you can see, the changes were EASY to make and made a HUGE difference in performance for ATI cards, but id didn't include such shader/math-based code, because NVIDIA cards did texture lookups faster than they did math (at the time).
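If you want to play with the idea itself, here's a rough sketch of what the swap amounts to (the 256-entry table and the exponent of 16 are just illustrative guesses, same as the Beyond3D poster's):
Code:

import math

# Sketch of the idea only: model the specular-falloff texture as a table of pow(x, 16),
# then compare fetching from the table against evaluating the power function directly.
TABLE_SIZE = 256
falloff_lut = [(i / (TABLE_SIZE - 1)) ** 16 for i in range(TABLE_SIZE)]

def falloff_via_lookup(s):
    # original path: dependent texture read (fast on the NVIDIA hardware of the era)
    idx = min(int(s * (TABLE_SIZE - 1)), TABLE_SIZE - 1)
    return falloff_lut[idx]

def falloff_via_math(s):
    # patched path: POW R1, specular.x, specExp.x (fast on R300-class ATI hardware)
    return math.pow(s, 16.0)

for s in (0.25, 0.5, 0.9):
    print(s, round(falloff_via_lookup(s), 4), round(falloff_via_math(s), 4))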
Posted on Reply
#100
Wile E
Power User
FryingWeesel: No proof they didn't, either, and their comments when interviewed don't lead me to believe they removed it for any reason other than that it gave ATI an advantage.

And the optimizations for Doom 3 took a user very little time to figure out; if you would like, I could link the post on MegaGames about it...
www.megagames.com/news/html/pc/doom3enhancetheexperiencept2.shtml

There are more advanced versions, but the MegaGames one is easy to find, that's why I use it :)

Fact is, as you can see, the changes were EASY to make and made a HUGE difference in performance for ATI cards, but id didn't include such shader/math-based code, because NVIDIA cards did texture lookups faster than they did math (at the time).
And I guess innocent until proven guilty means very little to you?

Yeah, and that quote in no way goes against what I said about them optimizing for nV, but not sabotaging ATI. No matter how you look at it, it's not sabotage to NOT program for something's strong points. There is no conspiracy.
Posted on Reply