Tuesday, February 24th 2015

"It Won't Happen Again:" NVIDIA CEO Breaks Silence on GTX 970 Controversy

In the wake of bad PR and a potentially expensive class-action lawsuit over the GeForce GTX 970 memory controversy, NVIDIA CEO Jen-Hsun Huang wrote a candid letter addressed to everyone concerned, explaining in the simplest possible language what went wrong in designing and marketing the chip, why it doesn't affect the product's design goals, quality, or stability, and how it could be misconstrued in a number of different ways.

Huang's explanation of the issue isn't much different from the one we already have, but it bears the company's final stamp of authority, especially given the spate of discrepancies between what NVIDIA representatives post on GeForce forums and what ends up being the company's official position. Huang's letter signs off with "we won't let this happen again. We'll do a better job next time."
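For a sense of what the segmented design means in practice, the arithmetic can be sketched in a few lines. The bandwidth figures below are assumptions drawn from coverage of the issue (roughly 196 GB/s for the 3.5 GB segment, and the roughly 28 GB/s widely cited for the final 512 MB), and the `effective_bandwidth` helper is purely illustrative, not anything NVIDIA published:

```python
# Back-of-the-envelope sketch of the GTX 970's segmented memory (illustrative
# only). Assumed figures, not official NVIDIA numbers: the 3.5 GB segment
# peaks near 196 GB/s, while the final 512 MB is limited to roughly 28 GB/s.

FAST_BW = 196.0  # GB/s, 3.5 GB segment (assumed)
SLOW_BW = 28.0   # GB/s, 0.5 GB segment (assumed)

def effective_bandwidth(slow_fraction: float) -> float:
    """Time-weighted (harmonic-mean) bandwidth when `slow_fraction`
    of the bytes transferred land in the slow segment."""
    return 1.0 / ((1.0 - slow_fraction) / FAST_BW + slow_fraction / SLOW_BW)

for frac in (0.0, 0.05, 0.125):
    print(f"{frac:5.1%} slow-segment traffic -> {effective_bandwidth(frac):6.1f} GB/s")
# Just 12.5% of traffic in the slow segment already drags the average
# down to 112 GB/s - roughly half of peak.
```

Because the average is time-weighted, even a small share of accesses hitting the slow segment drags overall throughput down sharply, which is one way to see why the driver tries to keep less frequently used data in that segment.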

The transcript of Huang's letter follows.

Hey everyone,

Some of you are disappointed that we didn't clearly describe the segmented memory of GeForce GTX 970 when we launched it. I can see why, so let me address it.

We invented a new memory architecture in Maxwell. This new capability was created so that reduced-configurations of Maxwell can have a larger framebuffer - i.e., so that GTX 970 is not limited to 3GB, and can have an additional 1GB.

GTX 970 is a 4GB card. However, the upper 512MB of the additional 1GB is segmented and has reduced bandwidth. This is a good design because we were able to add an additional 1GB for GTX 970 and our software engineers can keep less frequently used data in the 512MB segment.

Unfortunately, we failed to communicate this internally to our marketing team, and externally to reviewers at launch.

Since then, Jonah Alben, our senior vice president of hardware engineering, provided a technical description of the design, which was captured well by several editors. Here's one example from The Tech Report.

Instead of being excited that we invented a way to increase memory of the GTX 970 from 3GB to 4GB, some were disappointed that we didn't better describe the segmented nature of the architecture for that last 1GB of memory.

This is understandable. But, let me be clear: Our only intention was to create the best GPU for you. We wanted GTX 970 to have 4GB of memory, as games are using more memory than ever.

The 4GB of memory on GTX 970 is used and useful to achieve the performance you are enjoying. And as ever, our engineers will continue to enhance game performance that you can regularly download using GeForce Experience.

This new feature of Maxwell should have been clearly detailed from the beginning.

We won't let this happen again. We'll do a better job next time.

Jen-Hsun
Source: NVIDIA

140 Comments on "It Won't Happen Again:" NVIDIA CEO Breaks Silence on GTX 970 Controversy

#101
Sony Xperia S
FluffmeisterWith millions of cards sold giving record profits and capturing more market share than ever.... I think your boycott might be a little late.
It is rather a shame and a pity, not an occasion to vaunt. So many stupid and blind people buying products of the evil nvidia. :(
#102
newtekie1
Semi-Retired Folder
Sony Xperia SIt is rather a shame and pity and not occasion to vaunt. So many stupid and blind people buying products of the evil nvidia. :(
When the alternative is just as evil, arguably more so, ya can't really blame them.
#103
EarthDog
So many stupid and blind people buying products of the evil nvidia.
Bwaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaahahahhahalololol.

:shadedshu:
#104
Sony Xperia S
newtekie1When the alternative is just as evil, arguably more so, ya can't really blame them.
Oh really, then be so kind to tell all those blind people with nvidia cards HOW THE HELL to calibrate their image settings, so that the image on the screen has something to do with reality!
#105
the54thvoid
Super Intoxicated Moderator
Sony Xperia SOh really, then be so kind to tell all those blind people with nvidia cards HOW THE HELL to calibrate their image settings, so that the image on the screen has something to do with reality!
I have no issues. My image settings are all good and image is great. If you took off those red tinted glasses, things might appear more normal.
Of course, normal to me is crisp, smooth, colourful and immersive. If you require different parameters might I suggest a ZX Spectrum.
#106
Bugsy004
Has anyone returned their GTX970?
#107
newtekie1
Semi-Retired Folder
Bugsy004Has anyone returned their GTX970?
On the contrary I literally just ordered a second for SLI.
#108
RichF
FluffmeisterWith millions of cards sold giving record profits and capturing more market share than ever.... I think your boycott might be a little late.
Because apathy will surely teach them a lesson.

Every one of your posts that I've seen about this has been trying to minimize the issue. What do you suggest people do? Look at how profitable Nvidia has been lately and praise the company for their bait and switch marketing and flawed designs?
#109
RealNeil
Bugsy004Has anyone returned their GTX970?
newtekie1On the contrary I literally just ordered a second for SLI.
This. SLI 970s will be a good thing,..............

The sales of "refurbished & open box" 970s are already starting, so if you blame NVIDIA and can't see a way to live with your 970s, send them back so we po' boys can buy them for less money.
Nothing has changed the performance they showed in their launch reviews; they're still the same cards.
#110
HumanSmoke
RichFBecause apathy will surely teach them a lesson.
It's less a case of apathy than a statement of fact regarding the average consumer's mindset - most of them will never, ever read the thousands of posts on numerous websites concerning the matter.

You actually believe that a few outcries from a group of people (many of whom would never entertain owning an Nvidia product regardless of the issue), who represent a minuscule percentage of the consumer buying base, somehow wield enough power to cause a paradigm shift in OEM/ODM and consumer buying trends?
Were Nvidia and AMD taught any lesson when they colluded to price-fix discrete graphics? When was the last price war since the judgement? How many people speak out against the two companies' strategy of dovetailing prices and actually boycott both vendors' products? How many people said "fuck this shit, I'm boycotting both these companies and buying an S3 Chrome"? (Obviously no one, since the whole division went to HTC, with SiS/XGI and Matrox faring no better.)
Were Intel taught a lesson after litigating Cyrix and C&T out of existence and keeping their foot on AMD's throat? How many flocked to Motorola-based products? (Record revenues year after year say not a lot.)
Were AMD taught a lesson after they were caught cheating on benchmarks with fictitious processors and hobbling Intel numbers using out-of-date software results? How many boycotted AMD as a result? (Answer: not a lot. Many defended AMD because of their "underdog" status. Winning at all costs is acceptable if you're starting with a big enough handicap.)
Were Samsung, Toshiba, and LG taught a lesson when they paid settlement after settlement for their part in seven years of LCD price fixing? Where was the outcry, and why is it never mentioned when any of these companies launches a new product?
I think we can all agree that anti-trust violations, patent warfare, and widespread price fixing are more injurious to the consumer than the technical specification of a single SKU, yet none of these previous egregious (and more wide-ranging) instances - and many more besides - met with anything other than a metaphorical shrug of the consumer's shoulders in the greater scheme of things.
Consumers don't care even if a significant proportion are aware of the issue - which is very seldom the case even when it makes mainstream news, including international TV coverage. Fewer still allow morality to intrude upon their quest for the next newest widget... if it did, most, if not all, of the offending companies I mentioned above would have been blown into the weeds by consumer buying power.
#111
RichF
RealNeilThis. SLI 970s will be a good thing,..............

Nothing has changed the performance they showed in their launch reviews; they're still the same cards.
Not quite.

With the passage of time the 3.5 GB VRAM limit is going to become an increasing issue. I'm sure some thought that the 1.5 GB that the GTX 580 shipped with was more than enough.

When the 970 was reviewed it offered the promise of 4 GB of VRAM, not 3.5. That means more potential performance in the future. Games were also less likely to bump into the problematic partition back then than they are now and going forward.
#112
newtekie1
Semi-Retired Folder
RichFNot quite.

With the passage of time the 3.5 GB VRAM limit is going to become an increasing issue. I'm sure some thought that the 1.5 GB that the GTX 580 shipped with was more than enough.

When the 970 was reviewed it offered the promise of 4 GB of VRAM, not 3.5. That means more potential performance in the future. Games were also less likely to bump into the problematic partition back then than they are now and going forward.
Not likely. It handles today's top AAA titles on the highest possible settings at 4K. If I have to turn off AA when running 4K, I won't be too upset.

Also, the 1.5GB on the GTX580 was enough, in fact I had SLI 470s with 1.25GB. They worked perfectly fine, the memory amount wasn't an issue during their lifespan.
#113
EarthDog
500MB isn't going to make this card suddenly obsolete. 3.5GB is plenty for gaming at 2560x1440 and below for the next two years in 90% of titles.

Go ahead peeps. Sell em... I'll happily swipe a couple. :)
#114
RichF
EarthDog500MB isn't going to make this card suddenly obsolete. 3.5GB is plenty for gaming at 2560x1440 and below for the next two years in 90% of titles.

Go ahead peeps. Sell em... I'll happily swipe a couple. :)
The microstutter from 28 GB/s bandwidth and XOR contention is surely a bonus.
#115
EarthDog
When you use over 3.5gb in certain situations, yes. Which is why I qualified the statement..
#116
Eric_Cartman
RichFThe microstutter from 28 GB/s bandwidth and XOR contention is surely a bonus.
Well now nVidia has something to match the horrible micro-stuttering caused by Crossfire!
#117
HumanSmoke
RichFWith the passage of time the 3.5 GB VRAM limit is going to become an increasing issue.
Two points:
1. The vast majority of discrete graphics cards sold are 3GB and lower (2GB is probably the norm for mainstream gaming), so unless you see game developers, TWIMTBP, and Gaming Evolved moving to quickly alienate HD 6000, HD 7000, GTX 600, and GTX 700 series owners in the next year or so, 3.5GB should be ample for the most part - and where it isn't, it is pretty common practice to lower game image quality. And,
2. "With the passage of time" - say 12 months - the GTX 970 will be at the level the GTX 670/680/760/770 is at now: EOL'ed and a $200 purchase. Just as GF 104/114 gave way to GK 104, and GK 104 gave way to GM 204, it is a near certainty that GP 104/204 will relegate GTX Maxwell to the realms of the mainstream gamer.
RichFI'm sure some thought that the 1.5 GB that the GTX 580 shipped with was more than enough.
Well, it was for me four years ago. Are you expecting the GTX 970 to remain a performance-segment card for the next 3-4 years?
#118
heydan83
HumanSmokeTwo points:
1. The vast majority of discrete graphics cards sold are 3GB and lower (2GB is probably the norm for mainstream gaming), so unless you see game developers, TWIMTBP, and Gaming Evolved moving to quickly alienate HD 6000, HD 7000, GTX 600, and GTX 700 series owners in the next year or so, 3.5GB should be ample for the most part - and where it isn't, it is pretty common practice to lower game image quality. And,
2. "With the passage of time" - say 12 months - the GTX 970 will be at the level the GTX 670/680/760/770 is at now: EOL'ed and a $200 purchase. Just as GF 104/114 gave way to GK 104, and GK 104 gave way to GM 204, it is a near certainty that GP 104/204 will relegate GTX Maxwell to the realms of the mainstream gamer.

Well, it was for me four years ago. Are you expecting the GTX 970 to remain a performance-segment card for the next 3-4 years?
I think the observation these guys want to make is that maybe the VRAM won't be enough sooner than 970 buyers thought when making the purchase. And don't forget some games are starting to demand 3.5/4 GB at 1440p+, like Shadow of Mordor, and this card wasn't designed for 1080p...
#119
HumanSmoke
heydan83I think the observation these guys want to make is that maybe the VRAM won't be enough sooner than 970 buyers thought when making the purchase. And don't forget some games are starting to demand 3.5/4 GB at 1440p+, like Shadow of Mordor, and this card wasn't designed for 1080p...
I don't doubt that there are games that will peg the vRAM limit, and if you are a user buying specifically for those titles then you would definitely have cause for complaint, but those games aren't the norm, and likely won't be in a world dominated by console ports. A bigger cause for complaint would be for those looking for a cheapish (by Nvidia standards) SLI option, where doubling up on cards should translate to better image quality - though as an SLI and CFX user myself, it should be taken as read that multiplying the number of graphics cards doesn't automatically translate into a performance multiplier.
So for those running SLI, triple SLI, or buying predominantly for an RTS game, I would see the performance drop-off as a major disappointment. But is gaming heading, in the short term (say the next 12 months), to where 3.5+GB of vRAM is the price of entry for the majority of games at the majority of image quality levels for 19x10 (valid for those running at better than 60Hz) and 25x16/1440 gaming?
#120
GhostRyder
RealNeilThis. SLI 970s will be a good thing,..............

The sales of "refurbished & open box" 970s are already starting, so if you blame NVIDIA and can't see a way to live with your 970s, send them back so we po' boys can buy them for less money.
Nothing has changed the performance they showed in their launch reviews; they're still the same cards.
Time does make a difference, and VRAM requirements are going up at a very high rate right now, partially because the new consoles give more room for performance, so even console ports to PC use more resources than ever. Evolve, for instance, is already showing something is going on with the GTX 970, as it drops significantly down the list after being top dog at 1080p once you move up to 1600/1440p. That's just one game I have seen recently, but we have already seen Shadow of Mordor and a few others hitting that problem area, which is going to make itself more apparent when you put two together. While 3GB is enough for more than enough scenarios, cards like that were released a year or more ago and people can still get by; it's the long term that is going to suffer.
Eric_CartmanWell now nVidia has something to match the horrible micro-stuttering caused by Crossfire!
Good luck with that argument...Because CFX stutters horribly and people like punishing themselves...
heydan83I think the observation these guys want to make is that maybe the VRAM won't be enough sooner than 970 buyers thought when making the purchase. And don't forget some games are starting to demand 3.5/4 GB at 1440p+, like Shadow of Mordor, and this card wasn't designed for 1080p...
Bingo. The fact that you can already hit the VRAM limit now only means it will get worse with time. While 3.5GB is enough more often than not, it's not exactly comforting to be told by the company that you bought a 4GB card and should basically stop being ungrateful and enjoy it. Not really the type of thing I like to hear from a company, honestly...

What matters is the card's market position: a single one at 1440p is still going to be pretty good and serves well at the price point. It's just not going to be as good for as long, and it renders the idea of buying 2-3 a little less than optimal, since you're more likely to hit that 3.5GB limit with that much more power from the extra GPUs.
#121
rtwjunkie
PC Gaming Enthusiast
RichFNot quite.

With the passage of time the 3.5 GB VRAM limit is going to become an increasing issue. I'm sure some thought that the 1.5 GB that the GTX 580 shipped with was more than enough.
That's the situation for ALL GPUs. They aren't meant to be the last purchase you'll ever make. Every single top-end GPU is enough for the period in which it is released; then they fall behind. All have a finite life as games are developed that need more processing power, more VRAM, or both. The GTX 580's 1.5GB of VRAM was enough for when it was released as well. Nobody thought, "Hey, I have a 580 with 1.5GB of VRAM, I'm all set forever!"
#122
newtekie1
Semi-Retired Folder
heydan83I think the observation these guys want to make is that maybe the VRAM won't be enough sooner than 970 buyers thought when making the purchase. And don't forget some games are starting to demand 3.5/4 GB at 1440p+, like Shadow of Mordor, and this card wasn't designed for 1080p...
Here is the funny thing, the Ultra-HD Texture Pack that makes Shadow of Mordor use all that RAM actually needs 6GB of VRAM, and I hit the VRAM limit on my 4GB 290Xs and the stuttering was just as bad as with 970s. And the visual difference it makes wasn't really that noticeable when you are actually playing the game. Plus, for whatever reason, Shadow of Mordor doesn't use the extra 0.5GB of memory, once it hits the 3.5GB it starts paging out to system RAM right away, at least I've never seen it use the extra 0.5GB.
#123
heydan83
newtekie1Here is the funny thing, the Ultra-HD Texture Pack that makes Shadow of Mordor use all that RAM actually needs 6GB of VRAM, and I hit the VRAM limit on my 4GB 290Xs and the stuttering was just as bad as with 970s. And the visual difference it makes wasn't really that noticeable when you are actually playing the game. Plus, for whatever reason, Shadow of Mordor doesn't use the extra 0.5GB of memory, once it hits the 3.5GB it starts paging out to system RAM right away, at least I've never seen it use the extra 0.5GB.
Yes, and if that continues the same, we will need at least 6GB more quickly than anyone thought...
#124
HumanSmoke
heydan83Yes, and if that continues the same, we will need at least 6GB more quickly than anyone thought...
And yet, AMD's next flagship, the 390X, is set to arrive with a 4GB framebuffer. So is the prediction of 4GB+ wrong, or are AMD shooting themselves in the foot, or are they making some serious compromises with color compression and hoping that non-Gaming Evolved partners scale back their use of incompressible data (like that is going to happen in TWIMTBP titles if GM 200 arrives with 6GB of GDDR5)?
#125
RealNeil
Who knows what they're up to?
But I think it will be easier to get 8GB 390X cards for non-crazy prices. AMD seems to facilitate this with their partners.
You can buy Sapphire Vapor-X 8GB R9-290X (and other) GPUs now, and they're not that bad as to price.

8GB NVIDIA-based GPUs aren't even listed for sale at Newegg.com right now, but I know they exist. (Probably expensive as hell.)

I'm not sure that huge amounts of memory will help unless GPU memory ~bandwidth~ gets a wider highway for data throughput.