Wednesday, February 4th 2015

Specs Don't Matter: TechPowerUp Poll on GTX 970 Controversy

In the thick of the GeForce GTX 970 memory controversy, last Thursday (29/01) TechPowerUp asked its readers in its front-page poll whether the week's developments affected the way they looked at the card. The results are in, and our readers gave the card a big thumbs-up despite the controversy surrounding its specs.

In the week since the poll went up, and at the time of writing, 7,312 readers cast their votes. A majority of 61.4 percent (4,486 votes) say that the specs of the GTX 970 don't matter, as long as they're getting this kind of performance for its $329.99 price. A sizable minority of 21.2 percent (1,553 votes) are unhappy with NVIDIA, and said they won't buy the GTX 970 because NVIDIA lied about its specs. 9.3 percent had no plans to buy the GTX 970 to begin with. Interestingly, only 5.1 percent of respondents are fence-sitters, waiting for things to clear up. What's even more interesting is that the smallest group, at 3 percent (219 votes), said that they're returning their GTX 970 cards on grounds of false marketing. The poll data can be accessed here.

143 Comments on Specs Don't Matter: TechPowerUp Poll on GTX 970 Controversy

#51
CAPSLOCKSTUCK
Spaced Out Lunar Tick
So what is the actual return rate then? Even approximately. Actions speak louder than words.

Why would TPU manipulate results ?
I have more faith in TPU and its members giving me reliable info than any marketing or political points scoring exercise by fanboys from either camp.

Nvidia vs AMD ..........boring
Intel vs AMD ...........boring
Posted on Reply
#52
Serpent of Darkness
ShockGThese are DX12 cards which is the cool part.
DX12 is more of a "CPU optimizer" upgrade. It doesn't really force you to purchase a future generation of graphics cards from either camp to use it.

As far as I know, all current generations, NVidia 700 and AMD R#-200s for the most part, can run DX12.
ProtagonistOn that note it seems nvidia will be skipping 16nm in favor of 14nm Samsung, either way waiting for a die shrink then I'll purchase my next GPU, I have personally had enough of the 28nm GPUs.
Saw the news about that on Tweaktown.com. I don't feel optimistic that Samsung will continue to do business and produce 14 FF chips for NVidia after the current lawsuit. I'm not saying it isn't going to happen; I'm saying that I feel it won't happen unless the case is settled, or the agreement to produce said 14 FF chips for NVidia was signed before the patent infringement case. Of course money always talks, so it's possible if NVidia pays Samsung more $$$.
damric3.5GB is fine since it's not a 4K capable card anyway.
You're mistaken. VRAM isn't summed up into one pool when you add an additional graphics card to your system. If you have 3 GTX 970s in SLI with 4 GB of VRAM each, it doesn't mean you have a total of 12 GB of VRAM. It means that each GPU has its own dedicated 4 GB framebuffer; 3.5 GB each if you want to be technical, plus however many fewer ROPs per card than the manufacturer's stated specs. If you go 4K with a 3-way SLI 970 setup, and 3.8 GB of VRAM is needed in this hypothetical scenario, each card will try to use 3.8 GB of VRAM to store the data for the image being rendered and sent to the display.

I've read on Tweaktown.com that AMD is claiming, or at least hinting, that after DX12 (i.e., Win10) is released, AMD Mantle may eventually allow the VRAM from each GPU to be pooled as one. This hint was made alongside one about the GPU eventually being able to use CPU memory as well...
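The mirrored-vs-pooled distinction above can be sketched in a few lines. This is a hypothetical illustration of the arithmetic only, not a real driver API; `usable_vram_gb` and its pooled mode are made-up names for the concept:

```python
# Illustrative sketch: in SLI/AFR rendering, each GPU mirrors the same
# working set, so usable VRAM does not add up across cards.
def usable_vram_gb(per_card_gb, num_cards, pooled=False):
    """Return the VRAM budget a game could actually address."""
    if pooled:
        # Hypothetical pooled mode, as hinted for Mantle/DX12
        return per_card_gb * num_cards
    # Mirrored mode: every card holds a full copy of the frame data
    return per_card_gb

# Three GTX 970s in SLI: still ~3.5 GB of fast VRAM per frame, not 10.5
print(usable_vram_gb(3.5, 3))               # -> 3.5
print(usable_vram_gb(3.5, 3, pooled=True))  # -> 10.5 (hypothetical)
```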

@ the post,

Worst case scenario, NVidia could be faced with false-advertising claims for misleading their customers with a faulty product. Consumers could take that route. It'd be ironic if that were to happen after NVidia tried to sue Samsung and Qualcomm. Think about it; it's not surprising. First they came out with the GTX Titan-Z for $3,000 as a high-end gaming card with 64-bit floating-point capabilities, hoping for AMD to fail with the R9-295X. Now they've pretty much misled their consumer base, or claimed ignorance of the truth about their product. For consumers, the worst case is that you can't play future PC games with high textures, surround, or 4K HD. A heavily modded Skyrim is a big no for the GTX 970. So if this was an investment that's supposed to last you the next 5 years, you're totally screwed on any future PC game that requires more VRAM. You are out of luck. I suspect Star Citizen is going to be one of those games.

I don't feel it's right for AMD owners to degrade themselves or toss dirt at NVidia, but from another point of view, NVidia had it coming from the AMD camp, because NVidia fans pulled the same "QQ-cry-cry-cry" card over the frame time variance issues with the GPU king of 2013, the AMD 7990. AMD took its punches and improved from its mistakes in its own way. Now if you ask me, did NVidia deserve it? The answer is yes. Thank your consumers for that. It's basically a natural "eye for an eye" reaction from the AMD camp at NVidia. Is NVidia going to learn from this? I highly doubt it, because a majority of the base has basically rolled over, or given in to the compromise of "oh, it's only an issue / the card is only a piece of crap if I go above 3.5 GB of VRAM usage" comments. The real message being sent to NVidia by consumers is: I will pay more for less, just don't be truthful about it.

Sadly, I think NVidia consumers deserve better than this, even if NVidia consumers don't want to stand up for themselves and say, "hey, you sold me a faulty card; this isn't what I paid for when I purchased this product." Now a lot of members on this forum will make the argument that "when AMD R9-200 cards were throttling, AMD customers didn't cry to AMD about it, and the frame time variance on 7000 series cards in Crossfire was crap," and you know what, I think those issues are less significant than NVidia covering up a drop in ROPs or the lack of full-speed memory access on the GTX 970. The reasoning is that AMD didn't really, intentionally mislead their consumers. R9-290X cards would go up to a 1.0 GHz core clock until they started throttling because of the increased temperatures, and AMD fixed its frame time variance issues over time. In addition, AMD Catalyst drivers (both beta and WHQL) are not causing issues for AMD users as much as haters would believe. The question really isn't what NVidia consumers should say to counter AMD consumers about the GTX 970 issues. NVidia consumers are either going to return their cards, or stick with their purchase because it meets their needs or expectations. I think the better question NVidia consumers should ask is: are you willing to let NVidia keep providing you with products that don't meet their specs when they're supposed to have a track record, over AMD, of producing reliable, premium products? Are NVidia consumers going to pay $500, $600 for premium NVidia products in the future that don't live up to expectations anymore? A lot of NVidia users are saying this is acceptable behavior: that it's acceptable for NVidia to live down to AMD's failed standards. It won't upset you so long as you don't know, right? Once you've bought these faulty products, you're locked in until you buy more NVidia products down the line.

It really doesn't matter what this poll represents. The poll could be a misrepresentation of the truth, or it could be the truth, but either way it doesn't represent the full, bigger picture.
Posted on Reply
#53
W1zzard
the54thvoidAny chance of a 970 versus 980 bench up at 4k ultra IQ?
Just check my reviews. Not using AA for 4K though because it makes no sense for the performance hit.
Posted on Reply
#54
HumanSmoke
Serpent of DarknessAs far as I know, all current generations, NVidia 700 and AMD R#-200s for the most part, can run DX12.
Fermi, Kepler, Maxwell, and GCN architectures all have at least some preliminary DX12 support.
Serpent of DarknessSaw the news about that on Tweaktown.com. I don't feel optimistic that Samsung will continue to do business, produce 14 FF chips for NVidia after the current sue job.
If the contract is in the public arena now, it most certainly was signed some time ago. Also note that Samsung is more a collection of divisions than a single company. Samsung Electronics had a bitter patent dispute with Apple, but it doesn't stop Samsung from supplying Apple.
Serpent of DarknessI've read on Tweaktown.com that AMD is making a claim, or hinting one that after DX12 aka Win10 is released, AMD may eventually make it where AMD Mantle will allow VRam, from each GPU, to be summed up as one.
Many sites carried the story, and pooled memory is slated for both Mantle and DX12.
Posted on Reply
#55
RCoon
the54thvoidAny chance of a 970 versus 980 bench up at 4k ultra IQ?
W1zzardJust check my reviews. Not using AA for 4K though because it makes no sense for the performance hit.
In other words:

Potaters

Gonna

Potate
Posted on Reply
#56
rpsgc
Fanboyism is one hell of a drug.


The cult of NVIDIA is almost as bad as the cult of Apple. And they call themselves intelligent people....
Posted on Reply
#57
Sony Xperia S
CAPSLOCKSTUCKSo what is the actual return rate then ? even approximately. Actions speak louder than words.
Well, we will see by the end of February. :)
Posted on Reply
#58
john_
HumanSmokeThat sounds hyperbolic on your part. I think the reason that many sites didn't go straight into OH-NO-THE-SKY-IS-FALLING mode is because the performance is what it was on launch day. Sure, Nvidia misrepresented the specs, and sure, some people have had cause to regret their purchases - but for the most part, many people aren't affected because 1. they don't load the vRAM to its full extent, and 2. many have recourse to a refund. In the end, it is still a product that mostly works as advertised for most people. It doesn't blow up, it isn't made by child slave labour, and it isn't responsible for the decline of Western civilization. It's a graphics card that will be yesterday's news as soon as the next graphics card arrives.
Performance is not the issue here; I think I've explained this many times. It wasn't the issue with the 290X's GPU clock speed, and it's not in the case of the 970 either. A company should NEVER lie because the marketing department thinks it is a good idea.
The misleading was prior to launch. Viral marketing by John Fruehe (and AMD's unofficial "leaker" Donanimhaber) who vociferously stated that Bulldozer's IPC was significantly better than 10h. Hiding behind the "personal opinion" card, whilst proudly proclaiming his AMD Vice-President status. Sold quite a few 900 chipset boards, and basically locked people in to a Bulldozer purchase on the strength of dubious marketing.
Basically, shit happens - and AMD haven't been immune. Remember when AMD got caught falsely advertising that its flagship card had functioning hardware UVD? No? Nor does anyone else, because the issue was largely hand-waved away thanks to the card being less than popular. In the same time frame, AMD deliberately misled with their fictitious processor and bogus benchmarking. Somewhat higher profile, but largely excused by AMD fans because of their underdog status. The fact that in all probability you don't remember either just goes to show how quickly bad behaviour slips from the public consciousness. If that's too long ago, how about AMD and Nvidia's price fixing judgement, or the LCD panel price fixing scandal? In the end, the next shiny thing on the shelves trumps social conscience for the most part. If this sounds cynical, it's because this, and worse, happens time and time again.
Prior to launch? Does it count? All companies try to create a very positive image of a product they are about to show. But then the product comes out, everyone sees its performance, everyone knows what the product does and how it performs, and end of story. Donanimhaber? Who cares about Donanimhaber? Someone went and bought a new motherboard based on expectations? His mistake. I have said this many times in the past - I don't know if I had done it here. After reading the first Bulldozer review, I went to an online shop's page and ordered the Phenom 1055T that I am still using. No, I don't remember the UVD case. It looks like the same as this one with the 970 (didn't read the link).
But what you are trying to tell me reminds me of the political situation of the last 40 years in Greece that brought us where we are today. Two political parties, both corrupt, and their voters arguing not about how much better their own party is, but how much worse the other party is. You put two voters, one from each party, at a table, and they both had FACTS proving that both parties were corrupt and bad choices. The result? Everyone can see it today. If today half of us excuse lies from one company and the other half excuse lies from the other company, in a duopoly, tomorrow we will be reading mostly lies in the specs. Believe me.
Posted on Reply
#59
john_
W1zzardfixed that for you. (my personal view)
Well, in most articles the authors totally believed that excuse from Nvidia and were asking their readers to understand that it could also be the truth.
Posted on Reply
#60
the54thvoid
Super Intoxicated Moderator
rpsgcFanboyism is one hell of a drug.
Superbly cutting remark. It's quite novel to hear.
rpsgcThe cult of NVIDIA is almost as bad as the cult of Apple.
Informative, of course, there is no cult of AMD.
rpsgcAnd they call themselves intelligent people....
Hmm, now that's just supposition. You bombed out on that one, sorry.
Posted on Reply
#61
RCoon
john_Well, in most articles their authors totally believed that excuse from Nvidia and where asking their viewers to understand that it was totally possible to be also the truth.
GPU engineers are smart, GPU marketers doubly so. To think it was an honest mistake would be foolish, even I can see that. I'm not saying those authors were morons, just that taking information from the source of the problem as 100% fact isn't always a wise publishing decision.
Posted on Reply
#62
HumanSmoke
john_No I don't remember about UVD. That case looks like the same as this one with 970(didn't read the link).
No, they are actually quite different. As you've stated, you've read many articles about the 970 issue. The one article concerning the UVD issue you couldn't be bothered reading... but don't feel bad, many people back in the day couldn't be arsed either. The more things change, the more they stay the same. ;)
Posted on Reply
#63
Ferrum Master
It would be interesting to see if DX12 will have the same side effect as Mantle...

Increased VRAM usage... :laugh:
Posted on Reply
#64
HumanSmoke
Ferrum MasterIt would be interesting to see if DX12 will have the same side effect as Mantle...
Increased VRAM usage... :laugh:
Might be a long wait to find out. Game developers weren't in any great hurry to get DX 11 utilised, and I've yet to see any 11.1/11.2 options!
Posted on Reply
#65
JBVertexx
Keep in mind that Nvidia had >70% of the market in Q3 and most likely Q4 (hexus.net/tech/news/graphics/78209-nvidia-pulls-away-amd-graphics-card-market-share/)

So the fact that only 61% in the poll said that "specs don't matter" still poses a problem for Nvidia.

Also, the poll didn't separate out those actually in the market for a new GPU. Those not in the market for a new GPU are probably more likely to say the specs don't matter (plus it was the first answer in the poll, which always skews results).

For those actually in the market for a new GPU, I would bet my paycheck that more of them would be concerned, or at a minimum "fence-sitters".

I myself came close to pulling the trigger on a 970. The controversy caused me to pause, and I think I'll wait it out to see what AMD comes out with next year.
Posted on Reply
#66
Ferrum Master
HumanSmokeMight be a long wait to find out. Game developers weren't in any great hurry to get DX 11 utilised, and I've yet to see any 11.1/11.2 options!
DX11.1 and DX11.2 are supported by AMD only, ain't they? Thus not covering the market. Frostbite only supports DX11.1 as far as I remember.

It depends on how fast UT4 becomes the most-used game engine again, as UT3 was.

Or... how fast this thing pops up in the Xbone SDK, gets automatically ported to game engines, and then to PC... So I think not that long really... might even be late this year...

Where's Crytek? They've been quite silent lately, like a partisan... I guess the money crisis kind of broke them down, lol?
Posted on Reply
#67
Dalai Brahma
Yeah... 3.5GB is still a good spec and does a great job, I think... but if I pay for 4GB (working 100%), I shouldn't be getting 3.5GB...
Reminds me of ads for smartphones with "8GB"... "what?! I have only 5.1GB... my phone is defective..."
Posted on Reply
#68
Caring1
No different to buying a computer with a 500GB hard drive where only 450GB is usable...
Posted on Reply
#69
dj-electric
Why do so many people find it hard to believe that most people simply don't care about the last 512MB, as it's practically useless for 1080p and 1440p users? If it were only the last 256MB, would you care? What about the last 128MB, or 64MB?
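As a rough illustration of why only heavy VRAM loads are affected, here's a naive weighted-average sketch. The segment bandwidth figures (~196 GB/s for the fast 3.5 GB, ~28 GB/s for the slow 0.5 GB) are the widely reported ones, the helper name is made up, and real drivers try to keep hot data out of the slow segment entirely, so this is a worst-case toy model:

```python
# Toy model of the GTX 970's segmented memory: average bandwidth if
# accesses were spread evenly over all VRAM in use.
def avg_bandwidth_gbs(used_gb, fast_gb=3.5, fast_bw=196.0, slow_bw=28.0):
    """Naive weighted average of segment bandwidths over the used VRAM."""
    if used_gb <= fast_gb:
        return fast_bw  # workload fits entirely in the fast segment
    slow = used_gb - fast_gb
    return (fast_gb * fast_bw + slow * slow_bw) / used_gb

print(round(avg_bandwidth_gbs(3.0)))  # -> 196: stays in the fast segment
print(round(avg_bandwidth_gbs(4.0)))  # -> 175: dips once the slow 0.5 GB is touched
```

Below 3.5 GB of usage, which covers most 1080p/1440p games of the day, the model never touches the slow segment at all, which is the point being made.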
Posted on Reply
#70
RejZoR
Playing Battlefield 4 on Ultra, and I'm kinda asking myself if I even need to replace my HD7950 with anything else... It works smooth as butter through the Mantle API, lol. And so do all the other games I've tried so far...
Posted on Reply
#71
Nelly
The problem with these polls is that you get Nvidia fanboys praising the card even though they don't own a GTX 970.

If the vote were somehow restricted to people who actually own the GTX 970, how different would the poll be? Who knows...
Posted on Reply
#72
GhostRyder
the54thvoidAll I want to say is those that relentlessly accuse TPU of being an Nvidia shill really ought to find a new forum.
By all means condemn Nvidia but the frankly infantile responses to democratic polls from generally AMD owners is ironically a neon sign post to Red bias.
AMD, for context, sold stock 290 cards that were unable to stay at their PR-advertised boost clocks. In many cases they throttled well below 1 GHz. I don't recall quite so much hate, or as many posts, about that misleading sales pitch.
It took custom solutions, months later to let the cards fly free.
I wouldn't touch a 970 knowing the issue and knowing my gaming resolution would maybe in a small % of games cause problems. But it doesn't mean I need to pour such illogical hatred and conspiracy on TPU.
And 'specs don't matter' is a valid point. My 3gb cards outperform 4gb cards, even at some 4k settings. Game coding is more relevant to performance in many cases.
Have Nvidia been dishonest? Of course.
Does the card still suit the vast majority of owners? Apparently.
Should Nvidia do something honest about it? Yes.
Will they? No.
Will I still buy Nvidia? If they still perform better than AMD, yes.
Even if its a tad expensive? Probably.
Would I buy AMD? If their card is better.

All very simple....
A good summary of the issues at hand, though I will say this much: as an owner of reference R9 290X's (3 of them, overclocked to 1030 MHz from stock), they never throttled on the stock coolers in uber mode, though by the time I got them the driver update that "fixed" the fan speed variance issue had already been released.

I don't see how the results were not expected; the card is still a decent card and that has not changed, just its value perspective and where it sits is now a bit off. People who bought one GTX 970 are probably never going to experience the problem, at least until the point of upgrade down the line, because the GPU only has enough power to really run 1440p 60Hz effectively. 1080p 120/144Hz, or 1440p 60-144Hz, are hard enough to run as is (well, maybe not the 1080p option as much, and DSR is a different story), and it's going to require 2+ cards to run them effectively (also including 4K 60Hz), which just means using other options, or going for the gold if you want to run them properly. I don't see having 3.5GB as really a problem for this card; being told it has more than it can run effectively is the issue, and honestly people should have a problem with being told a lie. But you should not be forced to return a card you're satisfied with, as that makes no sense...
Posted on Reply
#73
Sony Xperia S
Caring1No different to buying a computer with a 500Gb hard drive and only 450Gb is usable....
Yeah, kind of.

It is like buying a 1TB drive, expecting it to be 1000 GB, but in reality only about 931 GiB is usable.

And that is exactly what happens, and no one sues Seagate or WD... :rolleyes:
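The drive-capacity analogy comes down to decimal versus binary units: manufacturers count 1 TB as 10^12 bytes, while operating systems usually report tebibyte-based figures. A quick sketch of the arithmetic (the helper name is made up for illustration):

```python
# Drive makers use decimal units (1 TB = 10**12 bytes), while many OSes
# report capacity in binary units (1 GiB = 2**30 bytes).
def advertised_tb_to_gib(tb):
    """Convert an advertised decimal-TB capacity to GiB as an OS reports it."""
    return tb * 10**12 / 2**30

print(round(advertised_tb_to_gib(1)))  # -> 931
```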



But why don't you concentrate on the upcoming Radeon R9 380X, which promises very significant performance improvements, low temperatures, and high durability???
Posted on Reply
#74
Jorge
The results (sadly) show that:

1. Many consumers are technically challenged
2. Many consumers don't mind being defrauded
3. Many consumers have no moral compass
4. Nvidia is unscrupulous
5. Nvidia intentionally deceived the sheeple
Posted on Reply
#75
GreiverBlade
Specs don't matter, maybe, but lies do... and it was a lie... oh well... the 970 is still a good 3.5GB card and the top of its segment, but still...


ah! whatever...
Posted on Reply