# NVIDIA Kepler Yields Lower Than Expected



## TheMailMan78 (Feb 16, 2012)

NVIDIA seems to be playing the blame game, according to an article over at Xbit. This is what they had to say: "The chief executive officer of NVIDIA Corp. said that the continuously increasing capital expenditures the company has run into in recent months will be accompanied by lower-than-expected gross margins in the forthcoming quarter. The company blames low yields of the next-generation, code-named Kepler, graphics chips that are made at TSMC's 28nm node. 'The decline [of gross margin] in Q1 is expected to be due to the hard disk drive shortage continuing, as well as a shortage of 28nm wafers. We are ramping our Kepler generation very hard, and we could use more wafers. The gross margin decline is attributed almost entirely to the yields of 28nm being lower than expected. That is, I guess, unsurprising at this point,' said Jen-Hsun Huang, chief executive officer of NVIDIA, during a conference call with financial analysts."

NVIDIA's operating expenses have been increasing for about a year now, from $329.6 million in Q1 FY2012 to $367.7 million in Q4 FY2012, and the company expects OpEx to be around $383 million in the ongoing Q1 FY2013. At the same time, it expects its gross margin in Q1 FY2013 to decline below 50% for the first time in many quarters, to 49.2%. NVIDIA has very high expectations for its Kepler generation of graphics processing units (GPUs). The company claims that it has signed contracts to supply mobile versions of GeForce "Kepler" chips with every single PC OEM in the world. In fact, NVIDIA says Kepler is the best graphics processor ever designed by the company. "[With Kepler, we] won design wins at virtually every single PC OEM in the world. So, this is probably the best GPU we have ever built, and the performance and power efficiency is surely the best that we have ever created," said Mr. Huang.

Unfortunately for NVIDIA, yields of Kepler are lower than the company originally anticipated, and therefore its costs are high. The chief exec of NVIDIA remains optimistic and claims that the situation with the Fermi ramp-up was even worse than this. "We use wafer-based pricing now, so when the yield is lower, our cost is higher. We have transitioned to wafer-based pricing for some time, and our expectation, of course, is that the yields will improve as they have on previous-generation nodes, and as the yields improve, our output will increase and our costs will decline," stated the head of NVIDIA.

Kepler is NVIDIA's next-generation graphics processor architecture that is projected to bring considerable performance improvements and will likely make the GPU more flexible in terms of programmability, which will speed up development of applications that take advantage of GPGPU (general-purpose processing on GPU) technologies. Some of the technologies that NVIDIA promised to introduce in Kepler and Maxwell (the architecture that will succeed Kepler) include virtual memory space (which will allow CPUs and GPUs to use a "unified" virtual memory), pre-emption, an enhanced ability of the GPU to autonomously process data without the help of the CPU, and so on. Entry-level chips may not get all the features that the Kepler architecture will have to offer.
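The wafer-based pricing mechanics Huang describes are simple to sketch: the buyer pays for the whole wafer, so the effective cost of each good die scales inversely with yield. A minimal illustration (all numbers hypothetical, not NVIDIA's or TSMC's actual figures):

```python
def cost_per_good_die(wafer_price, gross_dies_per_wafer, yield_fraction):
    """Wafer-based pricing: the whole wafer is paid for, so only the
    good dies carry the cost. Halving the yield doubles the per-die cost."""
    return wafer_price / (gross_dies_per_wafer * yield_fraction)

# Hypothetical 28nm wafer: $5,000 per wafer, 200 candidate dies
print(cost_per_good_die(5000, 200, 0.50))  # → 50.0 dollars per good die
print(cost_per_good_die(5000, 200, 0.25))  # → 100.0 (same wafer, half the yield)
```

This is exactly why Huang expects costs to decline as yields improve: the wafer price stays fixed while the divisor grows.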

*View at TechPowerUp Main Site*


----------



## TheMailMan78 (Feb 16, 2012)

On a side note, this explains why the new AMD GPUs are so damn expensive. Also, thanks for the heads up, Crap Daddy!


----------



## LAN_deRf_HA (Feb 16, 2012)

So are we going to get stuck with 28nm as long as we were stuck with 40nm? With this much going into it and the transition being so costly I doubt anyone will be eager to move on to the next process.


----------



## jpierce55 (Feb 16, 2012)

I imagine we will be; it may have been better to wait another generation before moving on, as it is. This news is not surprising to me. Nvidia has been doing well financially, so I expected the problem was external.


----------



## Crap Daddy (Feb 16, 2012)

All this time we were blaming AMD for how greedy they are with the 7900 series. I'm really starting to think those 50 bucks or so over the right price may go directly to TSMC.


----------



## erocker (Feb 17, 2012)

But Nvidia said soon?! How disappointing.


----------



## ViperXTR (Feb 17, 2012)

nVidia: We expected more from 7900
AMD: We expected higher yield from Kepler 
_Problem nVidia? trollface.jpg_


----------



## theonedub (Feb 17, 2012)

It would be more surprising if there weren't issues. I'm looking forward to the next  Nvidia offering.


----------



## NC37 (Feb 17, 2012)

Every year it seems there is another production problem related to TSMC or GF. Really makes me want to ask an ATI exec if they factored that in when they decided to close their in house fabs and go with them. Bad publicity + delays like this have to cut into profits. Maybe the cheaper production outweighs this but I'd have for sure looked at this downside and considered that in the long run it would be better to keep things in house.


----------



## BlackOmega (Feb 17, 2012)

This is basically a message saying "Hey guys, I know you wanted a competitively priced GPU from us, but because our yields are total suck ass, they're going to be expensive as hell. Sorry."

So we can expect a GPU that _may_ outperform the 79xx series, but at quite the price premium.


----------



## erocker (Feb 17, 2012)

BlackOmega said:


> This is basically a message saying "Hey guys, I know you wanted a competitively priced GPU from us, but because our yields are total suck ass, they're going to be expensive as hell. Sorry."
> 
> So we can expect a GPU that _may_ outperform the 79xx series, but at quite the price premium.



...and they knew it all along. Their previous comments were nothing but to keep people from purchasing from their competition.


----------



## mastrdrver (Feb 17, 2012)

How is it that nVidia is the only one affected by the hard drive shortage? AMD never said its GPU sales were affected by the shortage.


----------



## BlackOmega (Feb 17, 2012)

erocker said:


> ...and they knew it all along. Their previous comments were nothing but to keep people from purchasing from their competition.



Of course they did. But the whole "wait for _x_ to come out" thing works really well.


----------



## radrok (Feb 17, 2012)

Looking forward to the finished product, 79xx didn't impress me.
Come on Nvidia, can't wait to get back to good OpenGL support.


----------



## BlackOmega (Feb 17, 2012)

erocker said:


> ...and they knew it all along. Their previous comments were nothing but to keep people from purchasing from their competition.



Actually, now that I think about it for more than a second, I'm wondering if this is just a fake shortage to falsely inflate their prices.


----------



## Fluffmeister (Feb 17, 2012)

radrok said:


> Looking forward to the finished product, 79xx didn't impress me.
> Come on Nvidia, can't wait to get back to good OpenGL support.



I agree, looking forward to seeing what Kepler offers.


----------



## badtaylorx (Feb 17, 2012)

Why make the comment in the first place???

We already knew that Kepler wouldn't be here for a couple more months???


----------



## eidairaman1 (Feb 17, 2012)

Sounds alot like Fermi. Who thinks Nv needs a corporate realignment?


----------



## newtekie1 (Feb 17, 2012)

erocker said:


> ...and they knew it all along. Their previous comments were nothing but to keep people from purchasing from their competition.



I'm pretty sure the whole "our mid-range cards will outperform their top end" was kind of a hint that kepler would be expensive as hell.  Basically saying the mid-range was going to start at $500...


----------



## entropy13 (Feb 17, 2012)

eidairaman1 said:


> Sounds alot like Fermi. Who thinks Nv needs a corporate realignment?



You mean "sounds alot (sic) like the first Fermi."


----------



## eidairaman1 (Feb 17, 2012)

entropy13 said:


> You mean "sounds alot (sic) like the first Fermi."



Ya took a revision to fix it. Eh basically a 2900/3870 deal


----------



## semantics (Feb 17, 2012)

Maybe they need to hire new experts to make these projections, which always seem wrong.


----------



## Inceptor (Feb 17, 2012)

Projections are almost always wrong.  Period.  Full stop.
Criticizing on the basis of a commentary which was, at best, cryptic, is just as wrong.

So, NV says it's all TSMC's fault.  Why? because they have to; they can't ship out a ton of GPUs now, if they say nothing and pretend it's all OK, it'll be a bit like the Bulldozer rollout for them.

"Hey guys, our 28nm yield is pretty bad, so uhh, you're gonna pay for it, and unless you fork over the money IMMEDIATELY to get the few cards we ship, you're gonna wait for it."
*NV guy waves his hand at you as you look at those AMD GPUs and says,*
"Those aren't the cards you're looking for, move along now. Here's a shiny brochure about the awesomeness you can have from us.... six months from now."


----------



## eidairaman1 (Feb 17, 2012)

Your last line makes me laugh hard.



Inceptor said:


> Projections are almost always wrong.  Period.  Full stop.
> Criticizing on the basis of a commentary which was, at best, cryptic, is just as wrong.
> 
> So, NV says it's all TSMC's fault.  Why? because they have to; they can't ship out a ton of GPUs now, if they say nothing and pretend it's all OK, it'll be a bit like the Bulldozer rollout for them.
> ...


----------



## NAVI_Z (Feb 17, 2012)

Inceptor said:


> Projections are almost always wrong.  Period.  Full stop.
> Criticizing on the basis of a commentary which was, at best, cryptic, is just as wrong.
> 
> So, NV says it's all TSMC's fault.  Why? because they have to; they can't ship out a ton of GPUs now, if they say nothing and pretend it's all OK, it'll be a bit like the Bulldozer rollout for them.
> ...



sike !...


----------



## omegared26 (Feb 17, 2012)


dont worry nvidia fanboys, Kepler will be out when AMD will have the 8xxx series


----------



## DannibusX (Feb 17, 2012)

omegared26 said:


> dont worry nvidia fanboys, Kepler will be out when AMD will have the 8xxx series



Wow. You're worthless.


----------



## omegared26 (Feb 17, 2012)


you think so? kid did i talked bad with you? maybe your good for nothing but i didnt sayed nothing to you so pls just shut up and cya DannibusX


----------



## ViperXTR (Feb 17, 2012)




----------



## hellrazor (Feb 17, 2012)

mastrdrver said:


> How is it that nVidia is the only one affected by the hard drive shortage? AMD never said it's GPU sales were affected by the shortage.



Because they have computers, and computers really like to have hard drives?


----------



## DannibusX (Feb 17, 2012)

Huh?


----------



## m1dg3t (Feb 17, 2012)

Nvidia never surprises me, NEVER


----------



## EpicShweetness (Feb 17, 2012)

I've heard this little line too many times from Nvidia people: "All your AMD cards are good at is gaming."
Why yes sir, you are correct, my AMD card is very competitive in that sense to your Nvidia one, and guess what? I bought my card for gaming, so why would I want a chip that has all these extra features that sound awesome but I hardly use? By making such a colossal chip, it's much more complex and much harder to develop. Stop wasting your time with it, Nvidia, and bring us gaming performance so I don't have to pay so much!


----------



## omegared26 (Feb 17, 2012)

EpicShweetness said:


> I've heard this little line to many times from Nvidia ppl "All your AMD cards are good at is gaming"
> Why yes sir you are correct my AMD card is very competitive in that sense to your Nvidia one, and guess what? I bought my card for gaming why would I want a chip that has all these extra features that sound awesome, but I hardly use! By making such a colossal chip it's much more complex and much harder to develop, stop wasting your time with it Nvidia, and bring us gaming performance so I don't have to pay so much!



I used in 10 years Ati cards and Nvidia cards and i saw big differences at collors and quality of image and my conclusion was Ati-Amd is better in all things (price,video quality.....)


----------



## Super XP (Feb 17, 2012)

EpicShweetness said:


> I've heard this little line to many times from Nvidia ppl "All your AMD cards are good at is gaming"
> Why yes sir you are correct my AMD card is very competitive in that sense to your Nvidia one, and guess what? I bought my card for gaming why would I want a chip that has all these extra features that sound awesome, but I hardly use! By making such a colossal chip it's much more complex and much harder to develop, stop wasting your time with it Nvidia, and bring us gaming performance so I don't have to pay so much!


NV has no choice but to design multi-use graphics cards that do a lot more than just gaming. This is needed so they can remain competitive, and not just in gaming.

If NV just stuck to simple designs that offered solid top-class gaming performance, they would have been close to filing for Chapter 11. IMO.


----------



## Rowsol (Feb 17, 2012)

omegared26 said:


> you think so? kid did i talked bad with you? maybe your good for nothing but i didnt sayed nothing to you so pls just shut up and cya DannibusX



mmmmk


----------



## Jetster (Feb 17, 2012)

Well


----------



## Recus (Feb 17, 2012)

omegared26 said:


> dont worry nvidia fanboys, Kepler will be out when AMD will have the 8xxx series



Since mid-range Kepler will be faster than all 7000s, *one card will rule them all*. Flagship Kepler will compete with 8000.


----------



## omegared26 (Feb 17, 2012)

Recus said:


> Since mid-range Kepler will be faster than all 7000s *one card will rule them all*. http://www.vanadrighem.eu/images/trollface.png Flagship Kepler will compete with 8000. http://i56.tinypic.com/2e1sp3m.gif



And loool since nvidia will cost 3xtimes more than amd u should buy 4 of those but i bet u dont have money for that


----------



## radrok (Feb 17, 2012)

omegared26 said:


> I used in 10 years Ati cards and Nvidia cards and i saw big differences at collors and quality of image and my conclusion was Ati-Amd is better in all things (price,video quality.....)



Could you please explain this "difference" in colours?
I'm really looking forward to what you will pull out now.


----------



## omegared26 (Feb 17, 2012)

radrok said:


> Could you please explain this "difference" in colours?
> I'm really looking forward to what you will pull out now.



man the collors on amd are better than nvidia, you will see that if you have 2xPc's with same monitors (one with amd card and one with nvidia card), amd has better collors, nvidia has pale collors and believe me when you will make that tests u will see im right


----------



## radrok (Feb 17, 2012)

You should calibrate the monitor every time you switch GPU before doing any analysis on colour.
If you just plug and forget then you can't really complain about colours.


----------



## Recus (Feb 17, 2012)

omegared26 said:


> And loool since nvidia will cost 3xtimes more than amd u should buy 4 of those but i bet u dont have money for that



While I'm buying 4 cards I don't care about the price.


----------



## Fluffmeister (Feb 17, 2012)

omegared26 said:


> man the collors on amd are better than nvidia, you will see that if you have 2xPc's with same monitors (one with amd card and one with nvidia card), amd has better collors, nvidia has pale collors and believe me when you will make that tests u will see im right



AMD does have awesome collors.


----------



## pr0n Inspector (Feb 17, 2012)

omegared26 said:


> I used in 10 years Ati cards and Nvidia cards and i saw big differences at *collors* and quality of image and my conclusion was Ati-Amd is better in all things (price,*video quality*.....)



Are you trying to reverse-troll?
Because the last thing I want is a video card meddling with my images and videos. Maybe that's why  I color-manage everything I can.


----------



## W1zzard (Feb 17, 2012)

colors are even better on LSD

i think the OP is talking about analog VGA outputs, which due to their analogness can suffer from picture quality degradation depending on the components used on the board. shouldn't happen for dvi/hdmi/dp


----------



## wolf (Feb 17, 2012)

Yields on big chips, on a new node, are low? really? noooooo....... 

it's happened before, it'll happen again, they will be competitive cards, I'm sure of that.


----------



## Red_Machine (Feb 17, 2012)

AMD has nVIDIA by the balls.  They've got an arrangement with TSMC to prioritise AMD chips, which leaves nVIDIA with very little fab time.


----------



## eidairaman1 (Feb 17, 2012)

Red_Machine said:


> AMD has nVIDIA by the balls.  They've got an arrangement with TSMC to prioritise AMD chips, which leaves nVIDIA with very little fab time.



You have to realize TSMC makes other chips as well, not just AMD's or Nvidia's.


----------



## the54thvoid (Feb 17, 2012)

eidairaman1 said:


> You have to realize TSMC does other chips aswell not just AMD or Nvidias.



Yar.  I think Apple has a deal with TSMC.  A big deal.


----------



## eidairaman1 (Feb 17, 2012)

the54thvoid said:


> Yar.  I think Apple has a deal with TSMC.  A big deal.



However, it seems TSMC has had a lot of teething problems over the last 5 years.


----------



## Benetanegia (Feb 17, 2012)

mastrdrver said:


> How is it that nVidia is the only one affected by the hard drive shortage? AMD never said it's GPU sales were affected by the shortage.



AMD GPU sales were affected by the shortage, and they probably did say so in their own report:

http://www.anandtech.com/show/5465/...ort-169b-revenue-for-q4-657b-revenue-for-2011



> Meanwhile the biggest loser here was the desktop GPU segment, thanks both to a general decrease in desktop sales and the hard drive shortage. Compared to CPU sales desktop GPU sales in particular are being significantly impacted by the hard drive shortage as fewer desktop PCs are being sold and manufacturers cut back on or remove the discrete GPU entirely to offset higher hard drive prices.



Also, while AMD has the bigger market share in laptops, in desktops Nvidia has about 60%, so it's more affected than AMD there. In any case, Nvidia's Q4 results were much better than AMD's Q4, so it's just a matter of explaining why their operating expenses were higher than before.


----------



## TheMailMan78 (Feb 17, 2012)

Benetanegia said:


> AMD GPU sales were affected by the shortage, and they probably did say so in their own report:
> 
> http://www.anandtech.com/show/5465/...ort-169b-revenue-for-q4-657b-revenue-for-2011
> 
> ...



That, and AMD has way more fab time than NVIDIA. There is a reason NVIDIA was downgraded. NVIDIA saying this is just telling you, "Get ready to pay out the ass for our new GPU." Stockholders are not fanboys. They play no favorites.


----------



## Prima.Vera (Feb 17, 2012)

BlackOmega said:


> This is basically a message saying "Hey guys, I know you wanted a competitively priced GPU from us, but because our yields are total suck ass, they're going to be expensive as hell. Sorry."



Exactly what I was thinking. Plus, you can add the obvious delay in launching the cards. Fermi all over again!


----------



## BlackOmega (Feb 17, 2012)

omegared26 said:


> And loool since nvidia will cost 3xtimes more than amd u should buy 4 of those but i bet u dont have money for that


Actually I have to agree with this. 


radrok said:


> Could you please explain this "difference" in colours?
> I'm really looking forward to what you will pull out now.



 He can't but I can .

Nvidia has sacrificed image quality for performance. 
Now this goes back a little bit, but back when I was using some 8800's in SLI, when I switched from the 175.19 driver to the 180.xx driver I noticed that my framerate doubled [in BF2142] but all of the colors washed out. At the time I was using a calibrated Dell Trinitron Ultra-Scan monitor, so I _immediately_ noticed the difference in color saturation and overall image quality. 
I actually switched back to the 175.19 driver and used it as long as I possibly could. Then I made the switch to ATi and couldn't have been happier. Image quality and color saturation were back, not to mention the 4870 I bought simply SMOKED my SLI setup.

EDIT:





Prima.Vera said:


> Exactly what I was thinking. Plus, you can add the obvious delay in launching the cards. Fermi all over again!


Makes me wonder if the same thing that happened when Fermi came out is going to happen again. People waited and waited; then Fermi debuted, was a flop, and all of the ATi cards sold out overnight.


----------



## alwayssts (Feb 17, 2012)

wolf said:


> Yields on big chips, on a new node, are low? really? noooooo.......



EXACTLY.

Compound this:

AMD has 32 CUs and really only needs slightly more than 28 most of the time. The 7950 is a fine design, and it doesn't really hurt the design if yields are low on the 7970. Tahiti is over-designed, probably for the exact reason mentioned: big chip on a new node. Even if GK104 did have the perfect mix of ROP:shader IPC, the wider bus and (unneeded) bandwidth of the 7950 should make up that performance versus a similar part with a 256-bit bus, because the 7950 is not far off that reality. Point AMD on flexibility to reach a certain performance level.

Again, I think the 'efficient/1080p/GK104-like' 32-ROP design will come with Sea Islands, when 28nm is mature and 1.5V 7Gbps GDDR5 is available: think something similar to a native 7950 with a 256-bit bus using higher clocks. Right now, that chip will be Pitcairn (24 ROPs) because it is smaller and lines up with market realities. Point AMD on being realistic.

nVIDIA appears to have 16 less-granular big units, which is itself a yield problem, like Fermi on a less drastic level because the die is smaller. If the shader design is 90% ppc (2 CUs versus 1 SM) or less versus AMD, every single SM is needed to balance the design. I wager that is either a reality or very close to it, considering 96sp, even with realistic use of the SFU, is not 90% of 128. Yeah, scalar is 100% efficient, but AMD's VLIW4/MIMD designs are not that far off on average. Add that Fermi should need every bit of 5GHz memory bandwidth per 1GHz core clock and 2 SMs (i.e. 32 ROP/16 SM/256-bit, 28 ROP/14 SM/224-bit) and you don't have any freaking wiggle room at all if your memory controller/core design over- or under-performs.

Conclusion:

So if you are nVIDIA, you are sitting with a large die, with big units that are all needed at their maximum level, to compete against the salvage design of the competition. As efficient as Fermi can be? Yes. Smart choices for this point in time? Not even close.

Design epic fail.
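The "big die on a new node" intuition running through this thread can be sketched with the textbook Poisson defect model, where die yield falls off exponentially with die area times defect density. The numbers below are purely illustrative, not TSMC's actual 28nm figures:

```python
import math

def poisson_yield(die_area_cm2, defects_per_cm2):
    """Classic Poisson yield model: Y = exp(-A * D0).
    Bigger dies and immature (high-D0) nodes both crush yield."""
    return math.exp(-die_area_cm2 * defects_per_cm2)

# Hypothetical defect density for an immature node: 1.0 defects/cm^2
print(poisson_yield(2.0, 1.0))  # small die → ~0.135
print(poisson_yield(3.5, 1.0))  # large die → ~0.030
```

Under this model a modest increase in die area costs a disproportionate amount of yield while the node is immature, which is why salvage parts like the 7950 matter so much early in a ramp.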


----------



## TheoneandonlyMrK (Feb 17, 2012)

radrok said:


> You should calibrate the monitor every time you switch GPU before doing any analysis on colour.
> If you just plug and forget then you can't really complain about colours.



if you just plug and forget with both cards, you have a reasonable comparison untweaked, and NV look poorer, simples


----------



## radrok (Feb 17, 2012)

theoneandonlymrk said:


> if you just plug and forget with both cards you have a reasonable comparison untweeked and nv look  poorer, simples



Do you realize it makes no sense to not optimize things? If default is fine for you then okay, be my guest.


----------



## TheoneandonlyMrK (Feb 17, 2012)

radrok said:


> Do you realize it makes no sense to not optimize things? If default is fine for you then okay, be my guest.


Read again, I never said that. I said if you plug and forget both, that would then be a fair comparison, and NV look worse... simples


----------



## cadaveca (Feb 17, 2012)

radrok said:


> Do you realize it makes no sense to not optimize things?



To me, it makes no sense TO optimize anything. The average user is going to do just that, so while "optimized" systems may be better, most users will do no such thing, just because it's a pain in the butt, or they do not know how.

For a professional, where colour matters, sure, calibration of your tools is 100% needed. But not all PC users use their PCs in a professional context, and most definitely not the gamer-centric market that find their way on to TPU.


You need to be able to relate the user experience, not the optimal, unless every user can get the same experience with minimal effort. When that requires education of the consumer, you can forget about it.


----------



## radrok (Feb 17, 2012)

I understand your point Dave, still I think that is a waste to not get self informed about things and get the best experience you can out of your purchases.




theoneandonlymrk said:


> read again ,i never said that i said if you plug and foreget both that would then be a fair comparison and NV look worse,,, simples



With all due respect, your sentence makes no sense to me, sorry.


----------



## Benetanegia (Feb 17, 2012)

AMD does not have "better" colors; it has "more saturated" colors. Oversaturated colors. Several studies have demonstrated that when people are presented two identical images side by side, one natural and the other oversaturated, they tend to prefer the oversaturated one; well, 70% of people do. But the thing is, it's severely oversaturated and the colors are not natural by any means. They are not the colors you can find in real life.

So what is "better"? What is your definition of better? I guess if you belong to the 70% of people whose definition of better is more saturated then I guess that AMD has a more appealing default color scheme. If your definition of better is "more close to reality, more natural" then you'd prefer Nvidia's scheme.

Saying that AMD has better color is like saying that fast food tastes better because they use additives to make it "taste more." I guess people who get addicted to fast food do think it tastes better, but in the end it's just a matter of taste, and so are colors.


----------



## TheMailMan78 (Feb 17, 2012)

Benetanegia said:


> AMD does not have "better" colors, it has "more saturated" colors. Oversaturated colors. Several studies have dmostrated that when people are presented 2 identical images side by side, one being natural and the other being oversaturated, they tend to prefer the oversaturated one, well 70% of people do. But the thing is it's severely oversaturated and colors are not natural by any means. They are not the colors you can find in real life.
> 
> So what is "better"? What is your definition of better? I guess if you belong to the 70% of people whose definition of better is more saturated then I guess that AMD has a more appealing default color scheme. If your definition of better is "more close to reality, more natural" then you'd prefer Nvidia's scheme.
> 
> Saying that AMD has better color is like saying that fast food tastes better, because they use additives to make it "taste more". I guess people who get addicted to fast food do think it tastes better, but in the end it's just a matter of taste and so is colors.



Having used AMD for years and just now using an NVIDIA card, I can say with full confidence what you just said is BS. They look the same. I didn't even have to recalibrate for process colors.


----------



## radrok (Feb 17, 2012)

TheMailMan78 said:


> Having using AMD for years and just now using a NVIDIA card I can say with full confidence what you just said is BS. They look the same. I didn't even have recalibrate for process colors.



I agree with you TheMailMan78, in fact no one has given us proof to strengthen their argument.
That's why I asked the person who brought the "colour" argument in the first place.


----------



## Benetanegia (Feb 17, 2012)

TheMailMan78 said:


> Having using AMD for years and just now using a NVIDIA card I can say with full confidence what you just said is BS. They look the same. I didn't even have recalibrate for process colors.



It was true some years ago at least, I honestly don't know if it's true now, but people still say the same. In any case my point was that there's no "better" color, just more saturated or less saturated color and it's all about what you prefer. The one truth is that most of the media we are fed nowadays is oversaturated anyway, so it's just a matter of what extent of oversaturation you really prefer.

And I find it kinda funny that you chose to call BS on my post and not any of the preceding ones.


----------



## TheMailMan78 (Feb 17, 2012)

Benetanegia said:


> It was true some years ago at least, I honestly don't know if it's true now, but people still say the same. In any case my point was that there's no "better" color, just more saturated or less saturated color and it's all about what you prefer. The one truth is that most of the media we are fed nowadays is oversaturated anyway, so it's just a matter of what extent of oversaturation you really prefer.
> 
> And I find kinda funny that you chose to call BS on my post and not any of the preceeding ones.



I call yours BS because I expect more out of you....

Dont sink to it man.


----------



## pr0n Inspector (Feb 17, 2012)

TheMailMan78 said:


> I call yours BS because I expect more out of you....
> 
> Dont sink to it man.



There used to be a 16-235 vs 0-255 issue. But that was dealt with long ago, and it was not the video card's job anyway.
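For reference, the 16-235 vs 0-255 issue is studio (limited) range versus full RGB range: if a driver sends limited-range levels to a full-range display, blacks and whites flatten and the picture looks washed out. A quick sketch of the standard level expansion (the usual BT.601/709 mapping):

```python
def limited_to_full(y):
    """Expand a studio-range (16-235) code value to full range (0-255)."""
    y = min(max(y, 16), 235)           # clamp to the legal studio range
    return round((y - 16) * 255 / 219)  # 219 studio steps → 255 full steps

print(limited_to_full(16))   # → 0   (studio black becomes full black)
print(limited_to_full(235))  # → 255 (studio white becomes full white)
```

Skipping this expansion is what produced the "pale colors" people blamed on the GPU vendor.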


----------



## LAN_deRf_HA (Feb 18, 2012)

BlackOmega said:


> Actually I have to agree with this.
> 
> 
> He can't but I can .
> ...



Nvidia sacrificed IQ with the 7xxx series; that was it. Still to this day I rag on people who bought 7xxx cards because it was empty framerates. It's the first time I can recall a new card gen having lower IQ than the previous one. The driver issue you talk about is well behind BOTH companies. Both got into the habit of releasing drivers around card-release time that had IQ errors that increased performance. Namely, I can think of this happening in Crysis 1 around the time the 3870/8800 GT were being compared, but the issue was always corrected in successive driver releases.



TheMailMan78 said:


> Having using AMD for years and just now using a NVIDIA card I can say with full confidence what you just said is BS. They look the same. I didn't even have recalibrate for process colors.



You're doing it wrong. You need screenshots. I've seen this a lot in AA quality comparison shots in reviews as recently as Metro 2033's release. AMD cards are more saturated, at least that recently.


----------



## sergionography (Feb 18, 2012)

alwayssts said:


> EXACTLY.
> 
> Compound this:
> 
> ...



Well, Nvidia did drop the hot clocks, which allowed more cores in the GPU, and they will no longer be limited in clocks since the shaders and the cores will run at the same frequency (before, with hot clocks, they always had scaling issues). They radically changed the Fermi makeup and seem to know what they are doing. As for the GTX 660, I read leaks that it was a 340mm2 chip, compared to the 365mm2 of the HD 7970, and is meant to compete with and come close to the HD 7970, which seems reasonable, though I'm not sure how they will pull off a GTX 680/670 (probably it will be like the GTX 470/480, with disabled hardware).

So while I agree with you, overall Nvidia isn't in such a bad place; only their biggest chip is.
So in the worst case Nvidia will end up with a top-end GPU that is 10-20% slower than AMD's top end, but I doubt that. Even with the 256-bit bandwidth that everyone is all crazy about, I don't think it should be a problem in most scenarios, especially considering the fact that most people buying Nvidia don't really do multiple-GPU setups, while for AMD it's almost a must for Eyefinity.

Also, I heard leaks that Nvidia was debating whether to call the GK104 GTX 660 or GTX 680, when the GK110 was supposed to be for that but isn't coming anytime soon. So I don't know whether the yield issues forced Nvidia to do so, or whether they think the GK104 is sufficient. Either way, we need competition already, and for cards with 340mm2 and 365mm2 die sizes they should be well in the $350-400 price range, and that's considering TSMC's 20% more expensive wafer prices.


----------



## pr0n Inspector (Feb 18, 2012)

LAN_deRf_HA said:


> Nvidia sacrificed IQ with the 7xxx series, that was it. Still to this day I rag on people who bought 7xxx cards because it was empty framerates. First time I can recall a new card gen having lower IQ than the previous one. The driver issue you talk about is well behind BOTH companies. Both got into the habit of releasing drivers around card release time that had IQ errors that increased performance. Namely I can think of this happening in Crysis 1 around the time the 3870/8800 GT were being compared, but the issue was always corrected in successive driver releases.
> 
> 
> 
> You're doing it wrong. You need screenshots. I've seen this a lot in AA quality comparison shots in reviews as recently as Metro 2033's release. AMD cards are more saturated, at least that recently.




I don't think we were talking about the image quality of 3D engines.


----------



## TheGuruStud (Feb 19, 2012)

I have my 7950, nvidia, so na na na boo boo. Go cry to mommy. We knew yields were low LAST YEAR (for both camps)!

Fantastic card, btw. Runs much better than the 6950s I had. At 1,175 core so far. Still testing.
With a non-reference cooler and OCed, it still won't go above the low 60s. The fans are still silent.


----------



## Inceptor (Feb 20, 2012)

sergionography said:


> especially considering the fact that most people buying nvidia dont really do multiple gpu setups while for amd its almost a must for eyefinity.



The other way around.


----------



## Wrigleyvillain (Feb 20, 2012)

Inceptor said:


> The other way around.



Uh idk...can't speak to multi-monitor really but offhand I know a lot more people running Crossfire than SLI and always have pretty much (if the opposite is in fact what you were saying).


----------



## erocker (Feb 20, 2012)

For Nvidia Surround (3 monitors) you need two cards. For AMD Eyefinity you only need one.


----------



## m1dg3t (Feb 20, 2012)

erocker said:


> For Nvidia Surround (3 monitors) you need two cards. For AMD Eyefinity you only need one.



And with Eyefinity you can run up to 6 screens.


----------



## sergionography (Feb 21, 2012)

erocker said:


> For Nvidia Surround (3 monitors) you need two cards. For AMD Eyefinity you only need one.



Which is why AMD graphics cards are more bandwidth-hungry than Nvidia's,
and explains why Nvidia is releasing a 256-bit card to compete with AMD's HD 7970.



Inceptor said:


> The other way around.



For multi-monitor on Nvidia you have to SLI, while for Eyefinity you can use one AMD card; that's what I was referring to. In other words, the HD 7970 needs that extra bandwidth more than the GK106.


----------

