# Intel Losing CPU Market-Share to AMD



## btarunr (Apr 1, 2009)

With the introduction of the K8 architecture years ago, AMD made significant inroads into Intel's CPU market share. That growth ceased with Intel's introduction of the competing Core microarchitecture, after which AMD was pushed into deep financial trouble. The company recently spun off its manufacturing division to form GlobalFoundries, with investment from the Advanced Technology Investment Company.

With the introduction of the 45 nm Phenom II series of processors, however, demand for AMD has risen sharply, led by the Phenom II X3 700 series triple-core and the Phenom II X4 920 quad-core desktop processors. The surge in demand follows recent price cuts by the company. Motherboard vendors forecast the overall global market share of AMD desktop processors to grow by 30 percent in Q2 2009. With a conservative estimate of its current market share at around 20 percent, that growth would lift the figure to 26 percent. The company plans to further expand its desktop CPU lineup with an entry-level desktop platform before September.
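Note that the forecast is a 30 percent relative increase, not 30 percentage points; a quick sketch of the arithmetic, using only the figures quoted above:

```python
# Market-share arithmetic from the forecast above: a 30% relative
# increase on an estimated 20% share (not +30 percentage points).
current_share = 20.0    # percent, the conservative estimate cited
relative_growth = 0.30  # forecast growth for Q2 2009

projected = current_share * (1 + relative_growth)
print(f"{projected:.0f}%")  # 26%
```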

*View at TechPowerUp Main Site*


----------



## Darknova (Apr 1, 2009)

See? Everyone loves AMD really ^_^


----------



## Melvis (Apr 1, 2009)

Wow, I'm surprised, since the last few years haven't been too good for them. But geez, the new Phenom II is doing really well. That's good news indeed for AMD. Good to see.


----------



## aCid888* (Apr 1, 2009)

Am I the only one who sees this thread going downhill when the fanboys jump on it? Let's hope I'm wrong. 

Either way, it's good to see AMD doing well, not only because they need it, but because it keeps Intel in check too... more competition = better prices/products for us.


----------



## FordGT90Concept (Apr 1, 2009)

Remember, Core i7 is still priced beyond what most people care to spend.  It is still very much a Core 2 Duo/Quad (Penryn) versus Phenom II world, and Phenom II usually takes it in terms of price/performance.  OEMs reflect that price difference in their products, and that translates into sales.  We also can't forget the economic situation, where people are inclined to pinch pennies.

AMD may be miles behind Core i7 in terms of performance, but the majority of buyers aren't in the market for $1500+ computers.  AMD needs to watch out for Core i5 and the subsequent price reductions across Core 2 parts, however.


----------



## Nkd (Apr 1, 2009)

Well, I am thinking about building a new rig. Honestly, the features and price of AMD boards are really, really tempting, but honestly there is barely any difference between the Phenom IIs and i7s when it comes to gaming, other than Far Cry 2. Most people are going to go for the best bang for their buck; there are always going to be more budget-oriented people than the super high-end market. I guess numbers aren't everything. AMD is beating NVIDIA and Intel at the price/performance game, which is perfect for consumers. I already have an i7 for my personal build, but this one would be for the office.

AMD will have a whole new architecture in 2011, so I think the Phenom IIs and the success of the Radeon HD series have given them a little breathing room.


----------



## Imsochobo (Apr 1, 2009)

IT ALL COMES DOWN TO THIS:

Do you encode movies? Core i7

Encode and play games: Core i7.

Casual use: PH II

Gaming: PH II / Core i7.

And then there is the money.

The Core 2 series is outdated; no new CPUs are coming to the platform, which leaves AM2 and AM3 as good platforms.

Many users with AM2 boards from 2005-2006 can actually snatch up these CPUs and drop them in, yes, AM3 ones too, so you can use AM3 CPUs on an nForce 3 chipset. 

Core i7 is faster, but at what price? It's faster at number-crunching and encoding, gaming too, but not by much.

Spend the money on a GTX 295 or 4870 X2, or just a normal graphics card, or a second one, like I did.


----------



## Nkd (Apr 1, 2009)

Imsochobo said:


> IT ALL COMES DOWN TO THIS:
> 
> Do you encode movies? Core i7
> 
> ...



Well put. Encoding is certainly i7, no question; anything other than that will be Phenom II, with the money better spent elsewhere.


----------



## FordGT90Concept (Apr 1, 2009)

Imsochobo said:


> The Core 2 series is outdated; no new CPUs are coming to the platform, which leaves AM2 and AM3 as good platforms.


Four new Core 2 processors were recently launched.

Core i7 is more than enough for any modern game (and then some).


----------



## Weer (Apr 1, 2009)

Intel made a crucial mistake. They allowed AMD to launch Phenom II before launching Core i5.

But Phenom II still has nothing over Core i7. Anyone with the money is advised to buy Core i7.

That being said, it's nice to see AMD providing some competition, even if it's not in the highest tier.
Hopefully Intel won't be able to produce and ship Core i5 quickly enough, and will have to drop prices on i7.


----------



## Nkd (Apr 1, 2009)

FordGT90Concept said:


> Four new Core 2 processors were recently launched.
> 
> Core i7 is more than enough for any modern game (and then some).



Those were mobile processors, I believe.


----------



## BarbaricSoul (Apr 1, 2009)

Makes total sense to me: make a product that can match the performance of your competitor's main product (Core 2, for Intel) at the same or lower prices, and your market share will go up.


----------



## soryuuha (Apr 1, 2009)

Weer said:


> Intel made a crucial mistake. They allowed AMD to launch Phenom II before launching Core i5.
> 
> But Phenom II still has nothing over Core i7. Anyone with the money is advised to buy Core i7.
> 
> ...



We think alike. 

I wonder if Core i5 can take over the C2Q's territory.


----------



## PaulieG (Apr 1, 2009)

From my experience with i7, PII, and C2Q, I would have to say that for bleeding edge without budget constraints, i7 is the undisputed champion. I've never seen such a jump in performance when upgrading a platform before. However, I'd have to say that beyond i7, I would choose PII 100% of the time. Even though benchmarks have shown PII to be on par with C2Q, I found my system ran smoother with my 940 than it ever did with my Q9550. 

It is really nice to have AMD back, and I hope they continue to improve and increase market share.


----------



## ShadowFold (Apr 1, 2009)

Great to see. This is gonna pressure Intel to try and take back the mid-range and budget sectors now, which is gonna lower prices even more.


----------



## tkpenalty (Apr 1, 2009)

AMD has actually grown during the financial crisis, and quite a lot recently. Still much smaller than Intel, but that growth is a huge threat.


----------



## BazookaJoe (Apr 1, 2009)

Intel is still way ahead of AMD in technology, but AMD gaining market share is good for everyone: the closer the battle gets, the bigger the guns both teams will pull out to try and grab another 1% from each other, and that only translates to good news for the consumer, no matter what you buy.


----------



## 3870x2 (Apr 1, 2009)

Imsochobo said:


> IT ALL COMES DOWN TO THIS:
> 
> Do you encode movies? Core i7
> 
> ...


Outdated? I wouldn't say so, when clock for clock an E6750 beats a PII's SuperPi score... what is that, 1.5 years apart?
Still, the PII prices are amazing; not only tempting, but a good deal for anyone, primarily gamers, when that extra $120 you didn't spend on Core architecture goes toward bumping you up from an HD 4830 to an HD 4890.  Let's see... which would perform better in games: i7 + 4830 or PII + 4890?
I own Core architecture, /notafanboy.


----------



## Weer (Apr 1, 2009)

Paulieg said:


> From my experience with i7, PII, and C2Q, I would have to say that for bleeding edge without budget constraints, i7 is the undisputed champion. I've never seen such a jump in performance when upgrading a platform before. However, I'd have to say that beyond i7, I would choose PII 100% of the time. Even though benchmarks have shown PII to be on par with C2Q, I found my system ran smoother with my 940 than it ever did with my Q9550.
> 
> It is really nice to have AMD back, and I hope they continue to improve and increase market share.



Well, the fanboys seem to like your honesty.

But, what kind of a person interested in performance leaps would buy a 940 when he already has a Q9550? The performance leap that made waves in the past two years was the transition from 2 cores to 4 cores, and even 8 threads. I really don't trust your definition of smooth, no offense, of course.


----------



## 3870x2 (Apr 1, 2009)

Weer said:


> Well, the fanboys seem to like your honesty.
> 
> But, what kind of a person interested in performance leaps would buy a 940 when he already has a Q9550? The performance leap that made waves in the past two years was the transition from 2 cores to 4 cores, and even 8 threads. I really don't trust your definition of smooth, no offense, of course.



At the time, he wasn't interested in performance leaps; as an enthusiast, he decided to test another platform.
Sometimes I have to remind people that this is TechPowerUp! and that if you keep your hardware past 3-5 months, you are a DULLARD! YES! I SAID IT!


----------



## btarunr (Apr 1, 2009)

Weer said:


> But, what kind of a person interested in performance leaps would buy a 940 when he already has a Q9550?



A person who is comparing the two solutions.


----------



## MilkyWay (Apr 1, 2009)

I hope upgrading to an X3 720BE was the right choice, because Intel's Core i5 is just around the corner.

AMD is a good choice. I mean, okay, Phenom I was balls, but that X2 5000+ BE I had was actually okay; if only it had more cache.

That's the saving grace of the Core 2s: large cache and fast speeds. The FSB was holding it back, though.

The X3 720BE is already £99; combine that with a £100 motherboard and that's a cheap upgrade for some people. AMD could also make a killing in the OEM sector of the market, because these are great for offices that would benefit from having an ATI IGP instead of Intel's crap, plus the low cost. Like they do with AM2+: take some cheap 2GB of RAM and an 80GB hard drive (a DVD drive is like £10 now for a basic thing), combine that with a quad, dual, or tri core and a cheap under-£50 mATX board with IGP, and that is a good office PC.


----------



## mdm-adph (Apr 1, 2009)

3870x2 said:


> outdated? I wouldnt say as much when clock for clock e6750 beats a pII superpi score...what is that...1.5 years apart?



SuperPi isn't the be-all and end-all -- it's an old, simple benchmark.    A dual-core Pentium probably gets a better SuperPi score than a Phenom II 940, but does it really matter?


----------



## MTnumb (Apr 1, 2009)

MilkyWay said:


> I hope upgrading to an X3 720BE was the right choice, because Intel's Core i5 is just around the corner.
> 
> AMD is a good choice. I mean, okay, Phenom I was balls, but that X2 5000+ BE I had was actually okay; if only it had more cache.
> 
> That's the saving grace of the Core 2s: large cache and fast speeds. The FSB was holding it back, though.



I think you made the right choice; otherwise you would just end up waiting for that next great product that's "just around the corner". The 720BE is a great CPU. I tried it and compared it (to the Q9550, that is) and, well... it rocks. You should be happy, no worries. XD


----------



## jydie (Apr 1, 2009)

With the current state of the economy, the lower-priced AMD processors are MUCH more inviting... even if they offer less performance in some areas.  Heck, if I am encoding a movie, I really do not care whether it takes 15 minutes or 30 minutes... I usually walk off and do other things, or set up a bunch of conversions to run overnight.


----------



## MilkyWay (Apr 1, 2009)

MTnumb said:


> I think you made the right choice; otherwise you would just end up waiting for that next great product that's "just around the corner". The 720BE is a great CPU. I tried it and compared it (to the Q9550, that is) and, well... it rocks. You should be happy, no worries. XD



Yeah, I had this feeling before I bought it. I was like, fk, should I wait for the i5? But it's a good option. I mean, it's similar to my old X2 5000+ BE and performs similar to an E8600-E8200, but better in multithreaded/multi-core apps.

This is like back in the day when everyone wanted Durons because they were cool. More recently the Q6600 was a killer CPU; now it seems to be the X3 720BE.


----------



## johnnyfiive (Apr 1, 2009)

*Intel Losing CPU Market-Share to AMD*


----------



## 3870x2 (Apr 1, 2009)

mdm-adph said:


> SuperPi isn't the be-all and end-all -- it's an old, simple benchmark.    A dual-core Pentium probably gets a better SuperPi score than a Phenom II 940, but does it really matter?



It is a benchmark that measures the 100% true, raw power of a single core.  It doesn't matter how old or simple it is.


----------



## lepra24 (Apr 1, 2009)

Go AMD go, yes yes yes!


----------



## PCpraiser100 (Apr 1, 2009)

Go AMD!


----------



## HolyCow02 (Apr 1, 2009)

BOOOYA!!!!



GO AMD!!! I like this! Time to buy some stocks! Well, TBH, I should have bought when they were $1.83 a share... I'd be up a couple hundred bucks by now. Oh well!


----------



## btarunr (Apr 1, 2009)

3870x2 said:


> It is a benchmark that measures the 100% true, raw power of a single core.  It doesn't matter how old or simple it is.



Real-world applications matter, not synthetic tests. If an AMD processor is faster than an Intel processor in a given application, while being slower than it in SuperPi, it is still considered the better CPU... unless you're a performance enthusiast and a synthetic score is your dealbreaker.


----------



## farlex85 (Apr 1, 2009)

Good stuff, maybe we can get some nice price wars going in this sector too. Cheap PCs for everyone. 

And in response to everyone saying PII is a better choice than i7 for casual use, I'd say C2D is plenty, and PII is overkill. Rarely are four cores actually needed. Gaming, internet, normal media? C2D is still your best bang/buck.


----------



## mdm-adph (Apr 1, 2009)

3870x2 said:


> It is a benchmark that measures the 100% true, raw power of a single core.  It doesn't matter how old or simple it is.



...while being optimized to run on Intel chips, of course.  It's like saying Nvidia cards are "faster" at gaming because they always beat ATI's cards in Crysis benchmarks -- I don't believe that, either.


----------



## 3870x2 (Apr 1, 2009)

btarunr said:


> Real-world applications matter, not synthetic tests. If an AMD processor is faster than an Intel processor in a given application, while being slower than it in SuperPi, it is still considered the better CPU... unless you're a performance enthusiast and a synthetic score is your dealbreaker.



I'm guessing that has to do with the particular instruction sets given to a processor.  SuperPi is a very good test for raw processing power down to the core.


----------



## Imsochobo (Apr 1, 2009)

Please use wPrime 1.55 to test raw power.
Nothing's better at doing so, in my experience; I've done the calculations, I have them on paper.

They are never off by more than 10%.

SuperPi, on the other hand, is off by 10000000% (jk). Well, it isn't, but it loves cache. Is cache a real-world experience builder? No.
Intel is faster due to faster and larger cache, easy enough. wPrime gives a decent idea of how fast a CPU is, and AMD/Intel are pretty comparable; select the number of cores and you get a pretty good score.


I would like to see a P4 beat my PH II in a single-core game like Clear Sky.
Clock the P4 up to where its SuperPi time beats the PH II's, and let's see, guys!

My bet is on the PH II, or any other CPU that's not a P4.
SuperPi is just an ancient app some people tend to like, who knows why; it's kind of old-school, and there's a certain shine about it, watching the numbers roll down.

Personally, I don't like it, maybe because it took three ages before the next number came down when I was growing up.


----------



## suraswami (Apr 1, 2009)

Darknova said:


> See? Everyone loves AMD really ^_^



Of course. AMD really got me into building fast PCs: cheap, affordable, and reliable.

I hope they don't get greedy and raise the current champs' prices.


----------



## blkhogan (Apr 1, 2009)

Nice to see AMD rebounding from the brink of despair. Every bit of market share they recover keeps Intel competitive and honest. I would like to see AMD's share up around 30 to 35% of the total market. Competition keeps companies on their toes! With competition come advantages for the consumer (price, availability, and ever-evolving and improving products). Kudos to AMD for sticking out the hard times.


----------



## Imsochobo (Apr 1, 2009)

I will contribute a little experience here, from a longtime Intel fan who realized he got screwed buying P4s all the time and switched to AMD.

He had his X2 for so long, and he has now bought a PH II. So let's take the price: it's at 9000 NOK (roughly 900 UK pounds), which is like nothing for a system that runs games like that.

PH II 940 running stock, stock cooler.
4870 X2
Gigabyte MA790FX-DS5
Crucial Ballistix PC8500
Antec Sonata 3, with the PSU changed to a PC Power & Cooling 750W
And a 1TB Samsung drive.

He runs 1680x1050, and he hasn't seen a game lag yet, other than Crysis with AA and Cryostasis (which needs either a 4870 X2 in CF or a GTX 295, plus a PhysX graphics card, to run smoothly at max).

And he is so pleased, and he has had his system for a while now.

He couldn't have built that system so cheap without the PH II; it would have been 100-200 UK pounds more expensive with Intel Core 2 or Core i7, and would not have given a better experience!

So, do you game? Not a bad choice, but it's up to you, and how the pricing is in your country. ATI is hilariously cheap over here, while it's the other way around in other countries.

No wonder AMD is taking a piece of the cake; those tri-cores are just amazing for their price. 

They are also fun to OC, but not fun for encoding. Maybe that changes in the near future; it would be cool with better competition all across the board. 

Core i7 is still a masterpiece of a CPU, though.


----------



## TheMailMan78 (Apr 1, 2009)

MWAHAHAHAHAHAHAHAHAHAHA! Thank you guys so much. Now let's see the stocks reflect this already. Right now they are at 3.14.


----------



## btarunr (Apr 1, 2009)

3870x2 said:


> I'm guessing that has to do with the particular instruction sets given to a processor.  SuperPi is a very good test for raw processing power down to the core.



What I'm trying to say is that when an Intel E5200 performs better than a Phenom II 940 at SuperPi, that doesn't exactly make it a better CPU.


----------



## TheMailMan78 (Apr 1, 2009)

btarunr said:


> What I'm trying to say is that when an Intel E5200 performs better than a Phenom II 940 at SuperPi, that doesn't exactly make it a better CPU.



You shut up with your crappy "facts"; nobody wants to know the "truth".


----------



## a_ump (Apr 1, 2009)

btarunr said:


> What I'm trying to say is that when an Intel E5200 performs better than a Phenom II 940 at SuperPi, that doesn't exactly make it a better CPU.



I couldn't agree more. I mean, would I rather have a processor that performs great for 10-15 seconds, or buy something at an equal price that provides better performance in games? Who buys a computer system based on SuperPi? I can understand enthusiasts buying a system for benchmarks, but specifically SuperPi? No, lol. No one bases the quality of a CPU off SuperPi; that'd be like someone basing which GPU to buy on FurMark scores or something.


----------



## btarunr (Apr 1, 2009)

TheMailMan78 said:


> You shut up with your crappy "facts"; nobody wants to know the "truth".



You mean like how every Intel CPU and NVIDIA GPU you buy contributes to the Illuminati's secret economy?


----------



## laszlo (Apr 1, 2009)

Darknova said:


> See? Everyone loves AMD really ^_^



I think you're wrong.

Not everyone loves AMD; I think you don't need proof...

AMD has achieved this position with prices and its perf/price ratio, but...

I'd say 50% of buyers don't have a clue who AMD or Intel is when they buy a PC; they leave the decision to the seller, and only the final price matters.

Another 30% want to invest the minimum in a rig they will use for at least 2 years without upgrades... and again, price is the major factor in the decision.

The rest, 20%, know exactly what they are buying, and decide based on their knowledge, preferences (here come the fanboys), or performance.

To summarize: AMD now has the best price/perf on almost all products, but remember this is accomplished with minimal profit, so AMD is still in... s..t, you know what I mean. With all the debts they have, it's all about survival.


----------



## Easy Rhino (Apr 1, 2009)

I'm guessing it has to do with the slow economy. I hope AMD can capitalize on this.


----------



## sLowEnd (Apr 1, 2009)

suraswami said:


> I hope they don't get greedy and raise the current champs' prices.



Given their current financial situation, I can't see them doing that


----------



## TheMailMan78 (Apr 1, 2009)

btarunr said:


> You mean like how every Intel CPU and NVIDIA GPU you buy contributes to the Illuminati's secret economy?



Now see you're just being dumb. You and I both know Intel and Nvidia finance the space monkey mafia.


----------



## erocker (Apr 1, 2009)

HolyCow02 said:


> BOOOYA!!!!
> 
> 
> 
> GO AMD!!! I like this! Time to buy some stocks!



Yes, everyone please buy AMD stock.  It's dirt cheap and they are a company on the rise.  I thought $11.20 a share was a deal when I invested in them.  I need to make some money back.


----------



## aj28 (Apr 1, 2009)

Imsochobo said:


> IT ALL COMES DOWN TO THIS:
> 
> Do you encode movies? Core i7
> 
> ...



That's really poor logic and a pretty lame analysis of the situation... And btw, nForce 3 was a Socket 939 platform (DDR), which is in fact _not_ compatible with AM3. Only AM2/AM2+ boards can do that. I love AMD and think this is a great new development for them, sure to only get better when the 45nm dual-cores come out, but really, Intel bleeding market share doesn't mean much till they're down to about 60%, just because it's so much harder to command the last quarter of the market than the first two.


----------



## Valdez (Apr 1, 2009)

aj28 said:


> That's really poor logic and a pretty lame analysis of the situation... And btw, nForce 3 was a Socket 939 platform (DDR), which is in fact _not_ compatible with AM3. Only AM2/AM2+ boards can do that.



In fact, nForce 3 is a chipset, not a platform; you can use nForce 3 with an AM3 CPU if you'd like.
ASRock has an AM3-CPU-ready AM2-socket motherboard with the nForce 3 chipset.


----------



## Assimilator (Apr 1, 2009)

While this is good news for AMD - and the CPU industry as a whole - I can't help but wonder what's going to happen when Core i5 arrives at the same price as Phenom II and makes the K10 architecture look outdated.

Also, I'd like to see stats on which processors are giving the biggest boost to AMD - my bet is that the X3s are responsible. Selling not-quite-up-to-spec quad cores as triple-core CPUs was probably the best decision AMD have made in a long time - dual-core is perfect for gaming, quad-core is perfect for multimedia, and triple-core is a good compromise between both.


----------



## TheMailMan78 (Apr 1, 2009)

Assimilator said:


> While this is good news for AMD - and the CPU industry as a whole - I can't help but wonder what's going to happen when Core i5 arrives at the same price as Phenom II and makes the K10 architecture look outdated.
> 
> Also, I'd like to see stats on which processors are giving the biggest boost to AMD - my bet is that the X3s are responsible. Selling not-quite-up-to-spec quad cores as triple-core CPUs was probably the best decision AMD have made in a long time - dual-core is perfect for gaming, quad-core is perfect for multimedia, and triple-core is a good compromise between both.



You're assuming the i5 will blow away the PII's latest incarnation. That's a big assumption.


----------



## v12dock (Apr 1, 2009)

It has been a great year for AMD


----------



## a_ump (Apr 1, 2009)

TheMailMan78 said:


> You're assuming the i5 will blow away the PII's latest incarnation. That's a big assumption.



Yep, and I don't see the i5 blowing away the Phenom IIs, at least not in gaming. If you look at the benchmarks now, the i7 does blow away the competition in some games, yet its retail price is 1.5-3x as much as the Phenom II 940, which matches the Q9650 in most benchmarks and occasionally beats it, while usually trailing the i7 by only 2-5 fps, with a few exceptions. AMD did a good job on Phenom II, and if it's close on the i7's tail in gaming, then I doubt the i5 is going to blow it away. We shall see.


----------



## WarEagleAU (Apr 1, 2009)

Sweet news. Yes, let all of us Fanboys rejoice. Everyone was saying AMD was done for. Yet strangely this is good news. The new architecture may need to come before 2011 though.


----------



## ShadowFold (Apr 1, 2009)

WarEagleAU said:


> Sweet news. Yes, let all of us Fanboys rejoice. Everyone was saying AMD was done for. Yet strangely this is good news. The new architecture may need to come before 2011 though.



You'd think with all the money they must be raking in that they would hurry up and get that out... all I know is I want a 4GHz Phenom X6.


----------



## [I.R.A]_FBi (Apr 1, 2009)

Darknova said:


> See? Everyone loves AMD really ^_^



No, everyone loves norp, see sig


----------



## a_ump (Apr 1, 2009)

ShadowFold said:


> You'd think with all the money they must be raking in that they would hurry up and get that out... all I know is I want a 4GHz Phenom X6.



Eh, lol. They've had a hard time, and I don't doubt they're doing the best they can to create an architecture that can close the gap between them and Intel. A 4GHz Phenom X6, haha, that'd be interesting. Though I can see Intel doing an 8-core chip similar to the way they did the C2Q series, with two dies; dunno if that would work with an IMC or not.


----------



## nailzer (Apr 1, 2009)

Imsochobo said:


> IT ALL COMES DOWN TO THIS:
> 
> Do you encode movies? Core i7
> 
> ...



Yes, but my slower CPU is money in my pocket.


----------



## mtosev (Apr 1, 2009)

I really don't care about market share. The important thing for me is performance, and there Intel rules/dominates.


----------



## wiak (Apr 1, 2009)

Imsochobo said:


> I will contribute a little experience here, from a longtime Intel fan who realized he got screwed buying P4s all the time and switched to AMD.
> 
> He had his X2 for so long, and he has now bought a PH II. So let's take the price: it's at 9000 NOK (roughly 900 UK pounds), which is like nothing for a system that runs games like that.
> 
> ...


Hehe, you're nearly my copy. I also got a Gigabyte GA-MA790FX-DS5, an HD 4870 (non-X2), OCZ Gold PC6400 4GB dual channel, and 2x 1TB Samsung F1s. I also got a Phenom 9850 BE (gonna get a Phenom II 955 soon).

If you compare the i7 with the Phenom II at gaming, they are just as fast, unless you game at 640x480 resolution and uber-low details.

The Phenom II is faster than similarly priced Core 2 Quads (Phenom II 940 vs Q9400), and at encoding the Phenom II is as fast as a Q9550.

Phenom IIs have a higher minimum framerate, which translates into smoother gameplay in games.

Given that AMD's Phenom was ahead of its time, they couldn't add more L3 cache to it, so they just shipped it as it was: an okay budget CPU for anything.

The Phenom II has all the L3 cache it needs, plus many tweaks to fix some performance bottlenecks.

@Imsochobo, how fast does your stock HT run on that motherboard? Gigabyte removed 9850 BE support on my board after I bought it! 

I am using BIOS F5; it works fine with the 9850 BE, but anything higher just reboots the system. Damn Gigabyte!


----------



## Wile E (Apr 2, 2009)

mdm-adph said:


> ...*while being optimized to run on Intel chips, of course.*  It's like saying Nvidia cards are "faster" at gaming because they always beat ATI's cards in Crysis benchmarks -- I don't believe that, either.


No it isn't. It came out in the days of P4 and A64. A64 kicked P4's ass in it.

Seriously, I get so sick of people claiming that a benchmark is "optimized" for a particular CPU. That is almost never the case in actuality.


----------



## TheMailMan78 (Apr 2, 2009)

Wile E said:


> No it isn't. It came out in the days of P4 and A64. A64 kicked P4's ass in it.
> 
> Seriously, I get so sick of people claiming that a benchmark is "optimized" for a particular CPU. That is almost never the case in actuality.



How do you feel about GPUs? The way its meant to be played conspiracy?


----------



## Wile E (Apr 2, 2009)

TheMailMan78 said:


> How do you feel about GPUs? The way its meant to be played conspiracy?



It's just that, a conspiracy. Sure, nVidia may help them get it running smoother on their hardware, but not at the expense of ATI's performance. In fact, there have been quite a few TWIMTBP titles where ATI has reigned supreme.


----------



## eidairaman1 (Apr 2, 2009)

No kidding. UT2003/4 ran great on an AMD combo.


----------



## Bluefox1115 (Apr 3, 2009)

Sweet! AMD kicking ass again! 

I always said that their pricing and marketing techniques are what is going to drive Intel down. Their chips may not be as superior, but for a LOT less cash, they stand up to mighty Intel's beefy, expensive processors in performance for gamers and the everyday user.


----------



## wiak (Apr 4, 2009)

Imsochobo said:


> IT ALL COMES DOWN TO THIS:
> 
> Do you encode movies? Core i7
> 
> ...



More like this:
if you're rich = i7
if you're not rich = Phenom II


*Intel*
$288.99 = Core i7 920
$224.99 = MSI X58 Platinum 
$90.99 = OCZ 6GB (3 x 2GB) 240-Pin DDR3 SDRAM DDR3 1333 
$169.99 = HIS Radeon HD 4870 512MB 
= $774.96

*AMD Rig #1*
$225.00 = AMD Phenom II X4 940
$99.99 = JetWay JHA04-LF 790FX
$91.98 = OCZ 8GB DDR2 1066 (2x OCZ 4GB (2 x 2GB) 240-Pin DDR2 SDRAM DDR2 1066)
$249.99 = MSI Radeon HD 4890 OC 1GB
= $666.96

*AMD Rig #2*
$225.00 = AMD Phenom II X4 940
$99.99 = JetWay JHA04-LF 790FX
$45.99 =  OCZ 4GB (2 x 2GB) 240-Pin DDR2 SDRAM DDR2 1066
$429.99 = Sapphire Radeon HD 4870 X2 2GB
= $800.97

all prices are from newegg.com
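The list totals above can be checked with a quick sketch, summing the prices exactly as quoted:

```python
# Sum the parts lists quoted above (Newegg prices from the post).
builds = {
    "Intel":     [288.99, 224.99, 90.99, 169.99],   # i7 920 build
    "AMD Rig 1": [225.00, 99.99, 91.98, 249.99],    # PII 940 + HD 4890
    "AMD Rig 2": [225.00, 99.99, 45.99, 429.99],    # PII 940 + HD 4870 X2
}

for name, parts in builds.items():
    print(f"{name}: ${sum(parts):.2f}")
# Intel: $774.96
# AMD Rig 1: $666.96
# AMD Rig 2: $800.97
```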


----------



## wiak (Apr 4, 2009)

Wile E said:


> It's just that, a conspiracy. Sure, nVidia may help them getting it running smoother on their hardware, but not at the expense of ATI's performance. In fact, there have been quite a few TWIMTBP titles where ATI has reigned supreme.


The Assassin's Creed developer was gagged and slapped by NVIDIA when ATI owned NVIDIA in that game.


----------



## FryingWeesel (Apr 4, 2009)

Good news.

And a little FYI for you all: i5 is just a Core 2 chip with an IMC. The performance isn't that much better; in fact, many benches don't show any significant gain at all.

i5 is also planned to be very hard to overclock. Intel wants to block overclocking unless you buy their "enthusiast" platforms; they have been working on ways to make the CPU fail if you overclock it (no joke). 

So, well, I will stick with my AMD rigs. I can't see AMD removing overclocking from their CPUs; hell, their Black Edition chips are a great buy IMHO.


----------



## JATownes (Apr 4, 2009)

FryingWeesel said:


> And a little FYI for you all: i5 is just a Core 2 chip with an IMC. The performance isn't that much better; in fact, many benches don't show any significant gain at all.
> 
> i5 is also planned to be very hard to overclock. Intel wants to block overclocking unless you buy their "enthusiast" platforms; they have been working on ways to make the CPU fail if you overclock it (no joke).



Source??


----------



## FryingWeesel (Apr 4, 2009)

just what i've been told by a friend who works for intel; it was an easy "upgrade" adding an IMC to the core2 design, and the cores are cheaper than i7 cores.

as to the overclocking, intel wants people who are going to overclock to buy chips they have rated for it, on a platform they have rated for it... a significantly higher priced platform, mind you!!!

apparently intel canceled i3 and i4 (what were originally gonna be the core2-based chips with an IMC) due to OEMs not wanting so many different things/names to deal with. can u really blame companies like dell? they want most of their systems to be variations on a theme: most of their towers use the same matx/mbtx boards, just different chips and cases most of the time, maybe a different bios too. I have taken apart enough dells to know that many times the high-end setups use the same board/ram as the "high end" stuff; they just stick a better cpu, and sometimes better ram, into the higher-end board... (either way it still sucks )

Also, i5 requires ddr3, something amd rigs don't need; even AM3 setups don't show much if any gain by moving to ddr3, and currently ddr3 still costs more than equivalent ddr2 sets :/

intel already pushed back i5, and amd pushed back am3 (at least a bit), due to the economy and the fact that ddr3 costs more and doesn't give any real benefit over ddr2 currently.

we will see what happens. i know that i'm not in any rush to upgrade; this system's able to play all my games at 1600x1200 without any hiccups, so why upgrade it?

8800gts 512mb (at higher than 9800gtx/gtx+ clocks) 
4gb wintec ampx ddr2 800@960
2gb hynix ddr2 667@960
6000+@3.3xx

i have set up phenom systems for people, as well as 2 pII systems and 1 i7 rig. really, the differences in my day-to-day use would be small by moving up currently; i7 is overpriced, and phenom 1/2 are nice but not a big enough leap for me to bother buying a new chip yet (my board can take any am2/am2+/am3 chip amd makes)   

why upgrade when you don't need to?

sure, i could speed up my encoding, but... meh, i don't do as much of that as i used to, and my system's not too slow at that anyway. start a batch and go to bed, get up and it's done


----------



## alucasa (Apr 5, 2009)

I kinda expected that AMD would regain some market share. I mean, Core i7 isn't exactly cheap. It's a multi-task monster, but the price tag is just too high.

My main system is an i7 920, and using BOINC at only 50% cpu cores at 90% cpu usage gives me 9000 WCG points a day at stock speed.
There is little doubt to me that i7 is good. The problem is that the only chipset that can support it is X58, which is, let's face it, ridiculously expensive even for the lowest-end mobo.

At the moment, AMD's Phenom II is on par with (if not a little better than) the outdated Core 2. An AMD mobo is dirt cheap compared to an X58 mobo as well. It's a very attractive choice for gaming rigs.

*Hector is gone and AMD starts to blossom. lol !*


----------



## FryingWeesel (Apr 5, 2009)

alucasa said:


> I kinda expected that AMD would regain some market share. I mean, Core i7 isn't exactly cheap. It's a multi-task monster, but the price tag is just too high.
> 
> My main system is an i7 920, and using BOINC at only 50% cpu cores at 90% cpu usage gives me 9000 WCG points a day at stock speed.
> There is little doubt to me that i7 is good. The problem is that the only chipset that can support it is X58, which is, let's face it, ridiculously expensive even for the lowest-end mobo.
> ...



also something else to note about this: you're not gonna see any non-intel chipsets for i7 for at least a while, because intel is insisting that nVidia's license doesn't cover making chipsets for i7 (they are going to court over it). so intel's got you tied in; you WILL use an intel board and intel chipset, no other choice.....

with amd you can use:

an ati/amd chipset, an nvidia chipset, or sis (seen a couple boards for am2+ with sis's older sets). there are LOTS of chipsets to choose from: amd/ati 690, 740, 780, 790x, 790fx, 790gx, and the nvidia nf3/4/5/6/7 chipsets can all be used. really, there's a HUGE selection of boards that can take a phenom2   my ta770 can take a phenom2 and it only cost me like 45 bucks shipped from newegg a year or so ago  (overclocks like a beast too!!!)


----------



## alucasa (Apr 5, 2009)

I personally have no problem using intel-only motherboards. I've had terrible experiences with Nvidia chipsets. They can go bye-bye for me.
Though, as it stands now, it seems that Intel is trying to have everything their way. Perhaps the success of Core 2 has blinded them.

I use all the builds I create for BOINCing, so I prefer Intel CPUs since they are more efficient at it. If it weren't for that, I'd go for building AMD rigs.

I may not look it, but I was once a hardcore AMD supporter (a fanboy, as you may call it), but I got burned hard by the AMD Quad FX 4x4 platform and the AMD B2 Barcelona CPU fiasco. Ever since those incidents, I feel uncomfortable going for AMD rigs.

But, you know, ultimately everyone will have their ups and downs. Intel has enjoyed their ups for years now. It's time for AMD to have some, although, at this point, AMD is nowhere close to having the former glory of their Athlon XP days.


----------



## SparkyJJO (Apr 5, 2009)

alucasa said:


> *Hector is gone and AMD starts to blossom. lol !*



That surprising?


----------



## [I.R.A]_FBi (Apr 5, 2009)

FryingWeesel said:


> good news.
> 
> *and a little FYI for you all, i5 is just a core2 chip with IMC*, the performance isnt that much better, infact many benches dont show any signifigant gain at all.
> 
> ...



core2 is still competitive so no problems


----------



## pepsi71ocean (Apr 5, 2009)

thank god AMD is making a comeback; hopefully it will cause intel to cut their prices to remain competitive.


----------



## [I.R.A]_FBi (Apr 5, 2009)

pepsi71ocean said:


> thank god AMD is making a comeback; hopefully it will cause intel to cut their prices to remain competitive.



This is what everyone should look forward to


----------



## alucasa (Apr 5, 2009)

SparkyJJO said:


> That surprising?



Well? 

I thought all CEOs did was play golf and put stamps on documents. How naive of me!
j/k, of course. I know better than that.

Still, it's quite surprising that, once Hector is gone, AMD starts to get better. Hector was bad mojo, or... you know, he sucked at what he was supposed to do.

How he got his fame is beyond me.


----------



## Wile E (Apr 5, 2009)

wiak said:


> the Assassin's Creed developer was gagged and slapped by nVidia when ATI owned nVidia in that game



You mean the whole 10.1 issues? No, it wasn't. 10.1 didn't give the gains, they accidentally left out part of the code in the patch, so the 10.1 cards actually weren't having to render as much as 10.0 cards. That's where the performance gains came from. Once that code was corrected, the gains disappeared. The conspiracy was made up by ATI fanboys.


----------



## HammerON (Apr 5, 2009)

I remember the first computer I built with an Athlon 64 3200+. Then a 3500+, then a 3700+, then a 4000+, and finally an FX-53. Those were all great CPUs compared to Intel's (at the time). Then I switched to the Core 2 Duo.
It is nice to see AMD back in the "game" and applying pressure to Intel


----------



## FryingWeesel (Apr 5, 2009)

Wile E said:


> You mean the whole 10.1 issues? No, it wasn't. 10.1 didn't give the gains, they accidentally left out part of the code in the patch, so the 10.1 cards actually weren't having to render as much as 10.0 cards. That's where the performance gains came from. Once that code was corrected, the gains disappeared. The conspiracy was made up by ATI fanboys.



wrong. one of the main features of 10.1 was/is to remove that extra rendering pass, so no code was left out and no effects were missing. fact is, that was an excuse to explain the patch, not a conspiracy made up by ati fanboys; in this case it really is a fact that the game was patched to keep nvidia happy, since they had dumped money (or in this case hardware) into helping dev the game.

there are plenty of links about it; those that go into depth explain it quite well. 10.1 removes the need for extra rendering passes for some effects, the same effects that gave the perf boost to ati cards.

so you can read up about this and get the FACTS, not the excuses used by Ubi to placate nVidia.

http://techreport.com/discussions.x/14707



> .....So we have confirmation that the performance gains on Radeons in DirectX 10.1 are indeed legitimate.  The removal of the rendering pass is made possible by DX10.1's antialiasing improvements and should not affect image quality.  Ubisoft claims it's pulling DX10.1 support in the patch because of a bug, but is non-commital on whether DX10.1 capability will be restored in a future patch for the game....



basically, it was removed to erase the advantage ati had shown due to their cards supporting 10.1, when NOTHING nvidia had, or even has today, can support 10.1 (true dx10)


----------



## FryingWeesel (Apr 5, 2009)

a little more info:
10.1 in Assassin's Creed is actually legitimate, because you can just reuse the depth buffer in DX10.1 instead of doing a second pass.

again, something nvidia cards can't do, because nvidia didn't want to support true dx10 (hence ms cutting the dx10 specs and having to bring out dx10.1 later)

ATI on the other hand had true dx10 (now called 10.1) support with the hd2k cards, but... well, nvidia didn't want to follow ms's specs, and cried enough that ms backed down and removed the stuff nvidia couldn't/wouldn't support.

mind you, i'm on an 8800gts 512... so don't say i'm an nvidia hater. i love this card, but i don't love the actions of the company behind it.

http://www.pcgameshardware.de/aid,6...Assassins-Creed-planned/Assassins-Creed/News/



> You might remember: The enormously successful Assassin's Creed was the first game to have DX 10.1 support. But the patch 1.02 removed this feature.
> 
> PCGH was able to get some more information about this business. Below you will find an email interview with Ubisoft.
> 
> ...



if that doesn't look like somebody's just making excuses for patching out something that offers a benefit to the "other team", i dunno what you've been smoking.....
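the depth-buffer point above can be sketched as a toy model (a Python illustration only, not real Direct3D code; the function name and pass labels are invented for the example): under DX10 a multisampled depth buffer can't be read back in shaders, so depth data for AA-related effects costs an extra full-scene pass, which DX10.1's depth-buffer read removes.

```python
# Toy model of the rendering-pass argument (illustrative only; names are
# made up, and this is not real Direct3D API code).

def frame_passes(can_read_depth_buffer: bool) -> list[str]:
    """Return the full-scene passes one frame needs under each API."""
    passes = []
    if not can_read_depth_buffer:
        # DX10-style path: the multisampled depth buffer can't be sampled
        # directly, so the scene is rendered an extra time to produce it.
        passes.append("extra depth pass (re-render scene)")
    passes.append("main pass (shaders consume depth for AA/effects)")
    return passes

# DX10-style hardware needs two full-scene passes; DX10.1 gets away with one.
print(len(frame_passes(False)), len(frame_passes(True)))  # -> 2 1
```

on that model, the ATI gains the patch removed are just the cost of the second pass going away, which is the same thing the TechReport quote describes.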


----------



## FordGT90Concept (Apr 5, 2009)

Let me put this in bullets:

1) Most games are still designed for DirectX 9.0c so they don't lose the enormous customer potential of Windows 98 through Windows XP.

2) DirectX 10 support is usually coded as an alternate in software (it's easier on the hardware when run on Vista).  That is, it is more or less the same as DirectX 9.0c.  Very, very few developers go out of their way to focus on DirectX 10 support (ehm, games released exclusively for DirectX 10).

3) DirectX 10, being mostly useless from the sales and development standpoint, carries over to DirectX 10.1; however, even fewer people have DirectX 10.1 support than DirectX 10.

4) Ubisoft developed the game with DirectX 10.1 in mind.  First, they saw that NVIDIA announced they had no plans to support DirectX 10.1.  Then they ran into problems themselves with the DirectX 10.1 code path in the game after it was launched.  They decided that about 1 out of every 10 cards playing the game could handle DirectX 10.1 and decided it would cost too much to fix the botched code in comparison to just removing it altogether.

And that's pretty much it.  It wasn't worth fixing so they removed it.  NVIDIA's dominance and them saying they won't support DirectX 10.1 may have something to do with deciding it wasn't worth fixing but, as with most publishers, it comes down to cost.  The cost to fix it exceeded the amount they were willing to pay so they just got rid of it.


----------



## soryuuha (Apr 5, 2009)

4) is just for the Assassin's Creed case, right?

Since Ubi put DX10.1 in Tom Clancy's HAWX


----------



## ShadowFold (Apr 5, 2009)

HAWX runs awesome with DX10.1 on, and so does STALKER, but that's an AMD game, so. I really like the boosts I get in games that have 10.1; too bad nvidia doesn't use it. I don't get why they don't; it really is great...


----------



## a_ump (Apr 5, 2009)

exactly. i mean, why in the hell would publishers even care what nvidia had to say? i sure as hell wouldn't. all TWIMTBP games have that tag because nvidia sponsors them and gives 'em cash money, right? if games were DX10.1, i'd bet my entire system that ATI and Nvidia would have switched roles, with ATI on top.


----------



## FryingWeesel (Apr 5, 2009)

FordGT90Concept said:


> Let me put this in bullets:
> 
> 1) Most games are still designed for DirectX 9.0c so they don't lose the enormous customer potential of Windows 98 through Windows XP.
> 
> ...



read my posts + links. fact is that there was no botched code; it was just an excuse to remove something that was making TWIMTBP look bad. there was no "need" to remove it; the need came from greed. nvidia was PISSED that an nvidia game was running better on ATI hardware due to a FEATURE of dx10 (the ability to avoid a 2nd rendering pass by re-using the depth buffer) 

I have personally seen the game on ati hardware vs my 8800gts; it looks/runs better on a 3850/3870 or even a 2900xt than it runs for me on my 8800gts 512mb (755/1900/2200) once AA is enabled.

the r600 and higher are TRUE dx10 (what's now called 10.1) cards; the 4k cards add back some features of dx9 cards (hardware aa support instead of doing it all in shaders) 

had nvidia not refused to support true dx10 and convinced MS to dumb it down, they would have benefited from one less rendering pass being needed. but nv refuses to support 10.1, and when it showed a benefit for ATI on a game NV supported (either with cash, advertising or hardware), NV was PISSED and got ubi to remove it.....

it's not a conspiracy theory, it's just business, and nv doing what i would call a dirty trick on the public at large, even their own customers.


----------



## a_ump (Apr 6, 2009)

lol, just the fact that they wouldn't get dx10.1 going on their hardware makes me laugh at them. i mean, i wonder how much better performance in games like stalker and crysis we would have if they were dx10.1, not 10. not to mention that is probably what microsoft had in mind when they said dx10 would run better than dx9: they were referring to what we call dx10.1, well, at least that's my theory. Nvidia is so pathetic


----------



## FryingWeesel (Apr 6, 2009)

a_ump, that's exactly what they were referring to. there are features that can improve both perf and quality that were removed from "10" to placate nvidia; as such, we are not getting the best possible game experience. instead we get dx9c games with some dx10 shader effects tacked on, and when a company puts out a true dx10.1 path on a TWIMTBP title, giving 10.1 hardware better perf, nvidia has it removed because it makes them look bad.

hell, the g80, g92, gt200, and we still don't see dx10.1 out of nvidia. they COULD do it, but it would take more work than just re-using stuff they already have :/


----------



## Wile E (Apr 6, 2009)

How do you know MS removed them to placate NV? How do you know there wasn't botched code in Assassin's Creed? Everything you are claiming has no solid evidence.

Plain and simple, it's a conspiracy theory made up by ATI fanboys to make themselves feel better. Nv never paid off the multitude of other vendors whose TWIMTBP titles ran better on ATI hardware.

It's all a bunch of BS.


----------



## FryingWeesel (Apr 6, 2009)

ms cut back 10 at nvidia's request; there have been a couple of articles about it online over the last couple of years. the g80 CAN'T do some stuff that the original dx10 specs called for, so MS pulled it out, since at the time nvidia was the only maker with a dx10 card (the 2900 wasn't available yet, as u full well know) 

MS i'm sure hoped that by cutting back 10 and allowing the g80 to be a true dx10 card (by changing what dx10 really was), they would be able to draw more people to vista and dx10. it didn't work, mostly due to bad press and the fact that pre-sp1 vista was a buggy pain in the ass to deal with.

You can compare the image quality of dx9, 10 and 10.1 in Assassin's Creed yourself and see that there's not a problem. You can read the dx10.1 specs and see that what they referred to (the "missing rendering pass") is also a SPECIFIC FEATURE of dx10.1 that makes it more efficient than dx10, by allowing the depth buffer to be re-used instead of needing a 2nd rendering pass.

again, if you look at the statements that ubi made when interviewed about it, they don't hold up; they are vague or use double talk to avoid telling people what the real reason is.

to me it comes off as them saying whatever they have to in order to justify removing something that works fine for ati owners.

It doesn't affect me directly, as through this whole time I have had a g92 card, yet you say i'm an ati fanboi because I don't just accept the excuses ubi and nvidia put out for their actions.

like nvidia saying they didn't put 10.1 support in the gtx260 and gtx280 cards because "nobody's using 10.1". then why even bother supporting dx10 at all? NOBODY is truly using dx10, because it would cut off too large a portion of the market: those people running 2k/xp with dx9 hardware. they could have just made a really bitchin' dx9 card, since nobody's really using 10... but that would look really insane... (hell, it looks insane to me that they put out extremely high-priced cards with no dx10.1...) 

but hey, you must be right, nvidia can do no wrong after all..... 

Personally, I have seen the stuff nV has pulled over the years, and despite really liking my current card and being impressed by nvidia's current driver development, I don't think they are what you seem to think they are. they are not flawless; they are not above bribery and other dirty tricks to keep their lead in benchmarks.

I guess you also think that the doom3 "conspiracy" was thought up by ati fanboys?

to refresh your memory: nvidia and id worked together and intentionally put in code that would run like SHIT on ati hardware. they used "texture lookups" instead of shader code; nvidia hardware did texture lookups insanely well back then, while ati's hardware did shader work insanely well. by editing 1 file and replacing the texture-lookup code with equivalent shader code, ati cards became FASTER than nvidia cards with no quality difference (but these changes also slowed nvidia cards down even more than texture lookups slowed ati cards down) 

In the end ati put a fix in their drivers to get around the "problem". clearly, if you looked at what they did, it wouldn't have been hard to put both code paths in the game and have it auto-detect ati vs nvidia and use the proper path for that card, but they didn't..........

this stuff's happened many times over the years.

tiger woods' first golf game, for example, wouldn't run in 3d mode on non-nvidia cards; u could trick it into running in full 3d mode with all features by using an app to change the device id to that of an nvidia card.
and that was an early TWIMTBP title, and they have continued to do that kind of stuff over the years. hey, it's a good marketing move if you don't get caught, as they did with AC, doom3 and tiger woods (just 3 examples) 

I mean, if you can keep your perf higher than the competitors' for the first months of benching, you're set; if you can keep it going longer, you're golden.

if u get caught, you just get the company to say the game/app needs to be patched because of flawed code or some other excuse.


----------



## Wile E (Apr 6, 2009)

FryingWeesel said:


> ms cut back 10 at nvidia's request; there have been a couple of articles about it online over the last couple of years. the g80 CAN'T do some stuff that the original dx10 specs called for, so MS pulled it out, since at the time nvidia was the only maker with a dx10 card (the 2900 wasn't available yet, as u full well know)
> 
> MS i'm sure hoped that by cutting back 10 and allowing the g80 to be a true dx10 card (by changing what dx10 really was), they would be able to draw more people to vista and dx10. it didn't work, mostly due to bad press and the fact that pre-sp1 vista was a buggy pain in the ass to deal with.
> 
> ...


I didn't call you a fanboy. I said fanboys made it up. Did you make it up?

And Ubi never said a render pass was missing, like the DX10.1 feature you are referring to. They said their implementation was buggy. If you want to take that as a conspiracy against ATI by nV and Ubi, be my guest.

And none of what you are saying has any solid backing in terms of evidence. No proof of motive exists. No, NV is not an angel of a company, nor is MS, Intel, or AMD. They are all guilty of something shady at any given point in time, but just because a game has the TWIMTBP tags on it, does not mean that the developer is doing anything to hurt ATI. Yes, they optimize for nV, because nV provides them the means to do so, but they don't sabotage ATI like so many want to believe.


----------



## HammerON (Apr 6, 2009)

Well stated Wile E


----------



## DaedalusHelios (Apr 6, 2009)

People that buy Intel have had a good competitive processor for a year or three now. The cutting-edge types still upgrade though.

People that *only* buy AMD (for whatever reason) finally found a great competitive processor in Phenom 2.

So the *"AMD only"* group wasn't very motivated to upgrade until the Phenom 2. Most upgraded from the dismal original phenom or the good old ground-breaking X2 939 or AM2.

I buy AMD and Intel. Why would you limit yourself to only one or the other? It's not a sports team.... it's a processor.


PS I am just saying that Intel was ahead of the game by a lot from the Core2 launch until Phenom 2 finally caught up, but it is still behind Core i7.


----------



## FryingWeesel (Apr 6, 2009)

some do and some don't do things to hamper perf on ati/nvidia cards in titles that are tied to their hardware, as you should fully know. some companies are well known for it; most aren't so blatant about it tho.

many times you see "unexplainable" perf issues with one or the other company's hardware for no apparent reason. i mean, HL2 vs doom3: well, ati is just better at d3d, and also the game/engine was optimized, at least at the time, for ati, BUT it also had rendering-path optimizations that helped some nvidia cards run better as well. Doom3 had a specific piece of coding that ran VERY poorly on ati hardware; somebody found the fix and posted it (then ati's driver dept figured out how to fix it in drivers with that info) 

Id is one of those companies I used to have nothing but respect for. they used to be very even-handed; they would add optimizations for most common, readily available hardware: 3dfx, ati, nvidia, hell, even powervr got support in quake1 and 2. then came doom3........

there are things I will accept as optimizations and things I won't accept as purely being optimizations. doom3 is one title that was clearly coded with extreme bias toward nvidia (it would have been easy to put both code paths in). AC, well, from what i read myself it's very clear that nV pressured Ubi to "fix" their problem, and the easiest fix was to just disable/remove dx10.1 and say it was flawed/borked........


----------



## Wile E (Apr 6, 2009)

FryingWeesel said:


> some do and some don't do things to hamper perf on ati/nvidia cards in titles that are tied to their hardware, as you should fully know. some companies are well known for it; most aren't so blatant about it tho.
> 
> many times you see "unexplainable" perf issues with one or the other company's hardware for no apparent reason. i mean, HL2 vs doom3: well, ati is just better at d3d, and also the game/engine was optimized, at least at the time, for ati, BUT it also had rendering-path optimizations that helped some nvidia cards run better as well. Doom3 had a specific piece of coding that ran VERY poorly on ati hardware; somebody found the fix and posted it (then ati's driver dept figured out how to fix it in drivers with that info)
> 
> ...


Adding all those optimizations costs more development money, something the parent companies of the devs are taking very seriously nowadays. Dev teams no longer get the time or budget they used to be allotted.

And again, still no proof exists that Ubi pulled 10.1 as a favor to nV.


----------



## DaedalusHelios (Apr 6, 2009)

Wile E said:


> Adding all those optimizations costs more development money, something the parent companies of the devs are taking very seriously nowadays. Dev teams no longer get the time or budget they used to be allotted.
> 
> And again, still no proof exists that Ubi pulled 10.1 as a favor to nV.



The hardware DX10.1 compliance from ATi isn't compatible with the latest DirectX 10.1 software Microsoft provides to developers. I remember reading about it and thinking no wonder nobody bothers with 10.1. 

I know that affected the ATi 3xxx series, but maybe the 4xxx series fixed the mistake?


----------



## FryingWeesel (Apr 6, 2009)

no proof they didn't, either, and their comments when interviewed don't lead me to believe they removed it for any reason other than because it gave ati an advantage.

and the optimizations for doom3 took a user very little time to figure out. if u would like, i could link the post on megagames about it....
http://www.megagames.com/news/html/pc/doom3enhancetheexperiencept2.shtml



> Enhance the ATI Experience
> 
> 
> It is, of course, a well known fact that Doom 3 is a game which performs best when using boards by nVidia. This has left ATI fans frustrated and eager for a driver update or some other fix. Since ATI has not yet responded, a way of improving the way Doom 3 handles on ATI cards has been posted on the Beyond3D forums. According to the author, the performance increase can increase frame rate from 34fps in 1280x1024 to 48fps. Changes would, of course, depend on each individual set-up. A further suggestion from the forum is that the fix really kicks-in if vsync is enabled. Please feel free to post your experience with the fix on the MegaGames Forums.
> ...



there are more advanced versions, but the megagames one is easy to find; that's why i use it 

fact is that, as u see, the changes were EASY to make and made a HUGE difference in perf for ATI cards, but id didn't include such shader/math-based code, because nvidia cards did texture lookups faster than they did math (at the time)


----------



## Wile E (Apr 6, 2009)

FryingWeesel said:


> no proof they didn't, either, and their comments when interviewed don't lead me to believe they removed it for any reason other than because it gave ati an advantage.
> 
> and the optimizations for doom3 took a user very little time to figure out. if u would like, i could link the post on megagames about it....
> http://www.megagames.com/news/html/pc/doom3enhancetheexperiencept2.shtml
> ...


And I guess innocent until proven guilty means very little to you?

Yeah, and that quote in no way goes against what I said about them optimizing for nV, but not sabotaging ATI. No matter how you look at it, it's not sabotage to NOT program for something's strong points. There is no conspiracy.


----------



## FryingWeesel (Apr 6, 2009)

Wile E said:


> And I guess innocent until proven guilty means very little to you?
> 
> Yeah, and that quote in no way goes against what I said about them optimizing for nV, but not sabotaging ATI. No matter how you look at it, it's not sabotage to NOT program for something's strong points. There is no conspiracy.



dunno where this innocent-till-proven-guilty crap comes from; surely not the US legal system. i have enough experience with that to tell you, it's guilty till proven innocent.

if u read all of the quote: basically, they did something they knew would run poorly on ati cards when they could have just included both. optimizing for one by doing something that will hamper perf on another is, in my eyes, bullshit.


----------



## Wile E (Apr 6, 2009)

FryingWeesel said:


> dunno where this innocent-till-proven-guilty crap comes from; surely not the US legal system. i have enough experience with that to tell you, it's guilty till proven innocent.
> 
> if u read all of the quote: basically, they did something they knew would run poorly on ati cards when they could have just included both. optimizing for one by doing something that will hamper perf on another is, in my eyes, bullshit.



Where does it say they left it out on purpose? And by adding that code, they would also have to add some sort of detection and switch routine to the code. I don't see how that is sabotage.


----------



## DaedalusHelios (Apr 6, 2009)

That's like saying that games using Havok are fighting PhysX development, and therefore Nvidia.

If the programmers don't have the time or money to spend on optimizing, you can throw them some money to get it done. If ATi had helped with funding too, I am sure both would have been running on par. 

I don't see why it has to be a conspiracy.


----------



## eidairaman1 (Apr 6, 2009)

I've known TWIMTBP titles to actually play extremely well on ATI parts.


Wile E said:


> I didn't call you a fanboy. I said fanboys made it up. Did you make it up?
> 
> And Ubi never said a render pass was missing, like the DX10.1 feature you are referring to.They said their implementation is buggy. If you want to take that as a conspiracy against ATI by nV and Ubi, be my guest.
> 
> And none of what you are saying has any solid backing in terms of evidence. No proof of motive exists. No, NV is not an angel of a company, nor is MS, Intel, or AMD. They are all guilty of something shady at any given point in time, but just because a game has the TWIMTBP tags on it, does not mean that the developer is doing anything to hurt ATI. Yes, they optimize for nV, because nV provides them the means to do so, but they don't sabotage ATI like so many want to believe.


----------



## a_ump (Apr 6, 2009)

okay, i wasn't sure what to think, and imo you can call the Doom 3 coding whatever you want: great support for Nvidia and/or a conspiracy against ATI. either way, they could have easily fixed ATI's support in that game, but no matter now. 

However, i found this article about the Dx10.1 removal in Assassin's Creed interesting:

http://techreport.com/discussions.x/14707

So they responded that there were no image-quality differences with Dx10.1 compared to Dx10, only performance improvements for compliant hardware. then they state that they didn't want there to be a bad gaming experience. why would increased performance lower the gaming experience? it just sounds like bullshit to me. We see benchmarks of great performance with ATI in a game that is TWIMTBP, then it's removed and never thought about since; Dx10.1 was never reinstated.


----------



## FordGT90Concept (Apr 6, 2009)

a_ump said:


> then they state that they didn't want there to be a bad gaming experience, why would increased performance lower the gaming experience?


Because the code for DirectX 10.1 was using a separate rendering path so fixing bugs in the DirectX 10/9 rendering path could easily cause complications in the DirectX 10.1 code.  It's easier just to remove the DirectX 10.1 render path and focus on improving the DirectX 10/9 path.  Ya know, fix it for the masses, not the few.


----------



## Braveheart (Apr 6, 2009)

*looks at Q9450 with a weird, undecided face*


----------



## TheMailMan78 (Apr 6, 2009)

FordGT90Concept said:


> Because the code for DirectX 10.1 was using a separate rendering path so fixing bugs in the DirectX 10/9 rendering path could easily cause complications in the DirectX 10.1 code.  It's easier just to remove the DirectX 10.1 render path and focus on improving the DirectX 10/9 path.  Ya know, fix it for the masses, not the few.



Says the person with the Nvidia card.


----------



## Wile E (Apr 6, 2009)

TheMailMan78 said:


> Says the person with the Nvidia card.



I say it too.


----------



## TheMailMan78 (Apr 6, 2009)

Why use an "inferior" path? I mean 10.1 runs better.


----------



## Wile E (Apr 6, 2009)

TheMailMan78 said:


> Why use an "inferior" path? I mean 10.1 runs better.



For a much smaller number of people. It would've meant extra dev time and money to perfect. It runs in DX10 on all of the modern cards; it only runs in DX10.1 for a small percentage. Some of their later games have it. It just wasn't a priority to add it back into AC.


----------



## FryingWeesel (Apr 6, 2009)

Just don't update AC. You can play the game with 10.1 if you don't patch it, and it works perfectly.


----------



## TheMailMan78 (Apr 6, 2009)

Wile E said:


> For a much smaller number of people. It would've meant extra dev time and money to perfect. The game runs in DX10 on all modern cards; it only runs in DX10.1 on a small percentage of them. Some of their later games have it. It just wasn't a priority to add it back into AC.



So people with "bleeding edge" 10.1 get the shaft because Nvidia didn't develop accordingly?


----------



## Wile E (Apr 6, 2009)

TheMailMan78 said:


> So people with "bleeding edge" 10.1 get the shaft because Nvidia didn't develop accordingly?



nVidia didn't develop the title; Ubi did. Like most major dev houses, they feel it's a waste to spend too much dev time patching games unless they really need it. AC doesn't need 10.1 to run, so they felt their dev money was better spent elsewhere, especially since all modern cards can already run it in DX10 anyway. Why implement and debug DX10.1 for a minority of gamers, on a game that is perfectly stable without it, when they could use that dev time on newer titles?

Some of their newer titles do have 10.1 back in them.


----------



## a_ump (Apr 7, 2009)

I believe he was referring to Nvidia failing to implement DX10.1 in their GPUs, not the game itself. I agree, lol. Why can't a very successful company like Nvidia implement DX10.1 when ATI could? I realize ATI released its first DX10/10.1 card six months after the G80, but damn, Nvidia has had plenty of time to add DX10.1. It also amazes me how much pull Nvidia has over other large companies like Microsoft. I realize it made sense for Microsoft to lower the DX10 requirements at first, since Nvidia had the only DX10-compatible cards, but after SP1 it should have been strictly DX10.1. That could have turned the tables in an instant, and performance in games such as Crysis and STALKER might be much better.


----------

