Wednesday, February 4th 2015

Specs Don't Matter: TechPowerUp Poll on GTX 970 Controversy

In the thick of the GeForce GTX 970 memory controversy, last Thursday (January 29) TechPowerUp asked its readers in a front-page poll whether the week's developments had affected the way they looked at the card. The results are in, and our readers gave the card a big thumbs-up, despite the controversy surrounding its specs.

In the week since the poll went up, and at the time of this writing, 7,312 readers have cast their votes. A majority of 61.4 percent (4,486 votes) say that the specs of the GTX 970 don't matter, as long as the card delivers the performance it does for its $329.99 price. A sizable minority of 21.2 percent (1,553 votes) are unhappy with NVIDIA and said they won't buy the GTX 970 because NVIDIA lied about its specs. 9.3 percent had no plans to buy the GTX 970 to begin with. Interestingly, only 5.1 percent of respondents are fence-sitters, waiting for things to clear up. What's even more interesting is that the smallest group of respondents, at 3 percent (219 votes), said that they're returning their GTX 970 cards on grounds of false marketing. The poll data can be accessed here.
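As a quick sanity check on those figures, here is a minimal sketch that recomputes the percentages from the raw vote counts quoted above (the option labels are paraphrased):

```python
# Recompute the poll percentages from the vote counts reported in the article.
total_votes = 7312
options = {
    "Specs don't matter at this performance and price": 4486,
    "Won't buy it, NVIDIA lied about the specs": 1553,
    "Returning the card over false marketing": 219,
}

for label, votes in options.items():
    print(f"{label}: {votes / total_votes:.1%}")
# Prints roughly 61.4%, 21.2% and 3.0%, matching the article.
```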

143 Comments on Specs Don't Matter: TechPowerUp Poll on GTX 970 Controversy

#76
Casecutter
HumanSmoke: product that mostly works as advertised for most people
The 970 is a product that... works, although not as advertised
As W1zzard said...
W1zzard: fixed that for you. (my personal view)
Posted on Reply
#77
Dalai Brahma
GreiverBlade: specs don't matter, maybe, but lies do ... and it was a lie ... oh well ... the 970 is still a good 3.5GB card and the top of its segment, but still ...


ah! whatever...
+1

For whoever mentioned HDDs... I know (maybe you do too) that 1000GB refers to unformatted space; that's the standard...
But did we know that 4GB of NVIDIA VRAM means 3.5GB fast + 0.5GB slower, that 64 ROPs means 56, and so on...??
That is a "little" difference...
Posted on Reply
#78
Casecutter
Sony Xperia S: But why don't you concentrate on the upcoming Radeon R9 380X, which promises very significant performance improvements, low temperatures and high durability???
Ah, who's saying that???
Posted on Reply
#79
R-T-B
Indeed. I have yet to hear anything concrete about its specs, let alone its "durability."
Posted on Reply
#80
64K
Sony Xperia S: Yeah, kind of.

It is like buying a 1TB drive, expecting it to be 1000 GB, but in reality only 930 GB is usable.

And that is exactly what happens, and no one sues Seagate or WD... :rolleyes:

But why don't you concentrate on the upcoming Radeon R9 380X, which promises very significant performance improvements, low temperatures and high durability???
There will be plenty of talk of the R9 380X as it gets closer to being available and more leaks come out.
If the rumors are true, it should have significant performance improvements, probably beating a GTX 980 by a fair bit. I don't think it will have low temps, since it's rumored to be a 300-watt card.
Posted on Reply
#81
RejZoR
Sony Xperia S: Yeah, kind of.

It is like buying a 1TB drive, expecting it to be 1000 GB, but in reality only 930 GB is usable.

And that is exactly what happens, and no one sues Seagate or WD... :rolleyes:

But why don't you concentrate on the upcoming Radeon R9 380X, which promises very significant performance improvements, low temperatures and high durability???
As much as I hate what NVIDIA did with the GTX 970, no one really said anything about low temperatures for the R9 380X. High performance, yes, but low temperatures, not really. They did mention that the R9 480X series would be very power efficient, but that is still very far away...
Posted on Reply
#82
Uplink10
HumanSmoke: Fermi, Kepler, Maxwell, and GCN architectures all have at least some preliminary DX12 support.
DirectX 12 also probably carries over elements of DX9 and DX11, so a lot of cards probably support DX12, but not fully.
Caring1: No different to buying a computer with a 500GB hard drive where only 450GB is usable....
Sony Xperia S: It is like buying a 1TB drive, expecting it to be 1000 GB, but in reality only 930 GB is usable.
You are both wrong: Windows computes the number (930) in binary but displays a decimal unit (GB), when it should display the binary unit (GiB) as well. This is not the HDD manufacturers' fault, it is Microsoft's fault. Microsoft for some reason won't change their units to binary. I hope they fix this in Windows 10.
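A quick way to see where the ~930 figure comes from (a minimal sketch of the unit mismatch described above):

```python
# A drive sold as "1 TB" holds 10^12 bytes (decimal units, as marketed).
capacity_bytes = 10**12

# Windows divides by 2^30 to get the size, but labels the result "GB"
# instead of the binary unit "GiB".
size_gib = capacity_bytes / 2**30
print(f"{size_gib:.0f} GiB")  # ~931, which Explorer then reports as "931 GB"
```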
Posted on Reply
#83
Kaotik
ShockG: These are DX12 cards, which is the cool part.
They're not any more "DX12 cards" than anything else currently on the market.
Yes, they support 4 new features included in DX12, but those are not the only new features of DX12.
When NVIDIA released the card and boasted about DX12 compatibility, the head of DirectX development at Microsoft said that no final conformance tests were yet available to hardware manufacturers.
Of those 4 features, Intel supports at least 1, possibly 2, 3 or 4. In fact, one of those features is nothing more than the DX version of Intel's PixelSync (Raster Ordered Views).
Of those 4 features, GCN supports at least 1 (Raster Ordered Views), very likely another (Volume Tiled Resources), and possibly the other 2 as well.
Posted on Reply
#84
Hood
The sub-group of humanity known as "gamers" has traditionally had only one sure way to be heard - voting with their wallets. But we love any chance to get vocal about "the enemy", even over petty lies by admen (like that's a real shock - who knew that our trusted advertisers could sink so low? People who rage about this have obviously never told any lies in their entire lives). So if this crap has your panties in a wad, go buy an AMD card; I'm sure they never lied about any of their crappy cards. Otherwise you should just keep running your 970 and be glad that it beats a 780 for $100 less (and that they both blow away any Radeon card). Who can blame AMD for taking a cheap shot? They are rapidly losing market share and heading for bankruptcy. I recently bought a brand-new EVGA Classified GTX 780 Ti for $400; how is AMD going to beat that?
Posted on Reply
#85
Animalpak
If anyone is giving away their GTX 970, I will take it!
Posted on Reply
#86
john_
HumanSmoke: No, they are actually quite different. As you've stated, you've read many articles about the 970 issue. The one article concerning the UVD issue you couldn't be bothered reading... but don't feel bad, many people back in the day couldn't be arsed either. The more things change, the more they stay the same. ;)
I have also read many articles and reviews about CrossFire problems and GPU throttling on the 290X. Probably many more than about NVIDIA's lies with the 970.
So what is your point about what I bothered reading? That I don't care to read about AMD's problems? If that is the case, don't create a false impression by picking only one line from my post and ignoring/deleting/downgrading the rest of it.
If this is not the case, ignore this comment and tell me what you really meant. :)
Posted on Reply
#87
Petey Plane
Xorium: It is very much a 4K-capable card in SLI. So no, 3.5GB is not "fine"
You're right, all the previous benchmarks are now invalid :rolleyes:.

At 4K in SLI, there would be virtually no performance difference between a true 4GB card and the 3.5GB the 970 has.
Posted on Reply
#88
bogami
As far as the RAM efficiency dilemma goes, the GTX 970 is yet more proof of misleading advertising and throwing sand in people's eyes!
Everything that goes unnoticed is grist to NVIDIA's mill. And we're talking about the lower mid-range here: a failed, cut-down processor that can't measure up to AMD's R9 290X.
The fair benchmark would only be the GTX 780! It's not 20 nm but 28 nm, a new generation of processors that should be sufficient for 4K gaming, which it isn't for all games, and it's overpriced by $120 anyway. All together it made for good advertising for NVIDIA.
And they're hiding the fact that they will set us up again with the TITAN X, the same error and the same cut-down-processor failure, for well over $1,000 at that :shadedshu:
Posted on Reply
#89
scorpion_amd13
Dalai Brahma: Yeah... 3.5GB is still a good spec, does a great job... I think... but... if I pay for 4GB (working 100%), I shouldn't get 3.5GB...
Reminds me of ads for smartphones with "8GB"... "What?! I only have 5.1GB... my phone is defective..."
Sure, as long as all the other smartphones ship with that 8GB of memory completely empty, then yeah, this is a good comparison.
Caring1: No different to buying a computer with a 500GB hard drive where only 450GB is usable....
Of course it is, as long as all the other 500GB hard drives out there offer 500GB of usable space. With the GTX 970, the full amount of memory IS there (as in, on the card). It's just that the last 512MB of it is accessed differently from the rest of the card's memory, which leads to lower (or drastically lower) performance, depending on the game's requirements.
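For context, here is a rough back-of-the-envelope sketch of that split, using the bus figures that were publicly reported after NVIDIA's clarification (these numbers are not from this thread): the fast 3.5GB pool sits behind 7 of the 8 32-bit memory controllers, the last 512MB behind the remaining one.

```python
# Approximate bandwidth of the GTX 970's two memory segments, assuming the
# publicly reported configuration: 7 Gbps GDDR5 on a 256-bit bus, with the
# 3.5 GiB pool on 7 of the 8 controllers and the 0.5 GiB pool on the last one.
total_bandwidth = 7.0 * 256 / 8         # 224 GB/s across the full bus
fast_segment = total_bandwidth * 7 / 8  # ~196 GB/s for the 3.5 GiB pool
slow_segment = total_bandwidth * 1 / 8  # ~28 GB/s for the 0.5 GiB pool

print(f"fast: {fast_segment:.0f} GB/s, slow: {slow_segment:.0f} GB/s")
# Once a game spills past ~3.5 GiB, some accesses land in the ~28 GB/s pool,
# which is where the "drastically lower" performance cases come from.
```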

This isn't the real problem, though. As has been said before, suddenly finding out that the card's memory configuration isn't what nVidia stated it was doesn't change the performance numbers, not in the slightest. The real problem is that nVidia LIED about it. There's NO WAY the decision-makers at nVidia didn't know about all this before the card was launched. It's not like they let TSMC figure out the specs for the batches of chips that would end up in GTX 970 cards.

Let's face it, if the marketing department had gotten the wrong spec sheet from the engineering teams about the GTX 970, considering that the whole of the internet has reported those numbers, they would have sent out a press release the next day (in the absolute worst-case scenario; the reviews are closely monitored to a level that would make the NSA jealous), saying that there had been a mix-up, along with a PDF with the real specs. That didn't happen. This means that nVidia's marketing department was either knowingly lying through their teeth, or someone in charge had ordered the engineering teams to feed them false information.

There is absolutely NO WAY the engineers didn't know the correct specs for the GTX 970, and even if there was a mix-up somewhere down the line, somebody would have noticed it before launch day. Launch-day press events held for both AMD and nVidia graphics cards usually contain all the technical specs the reviewers would ever be interested in knowing. That's just about all the technical detail you would normally read in one of W1zzard's reviews. Then there's a Q&A session, where the press gets to ask whatever they want to ask about the product. Now, unless the marketing staff have a technical background (usually meaning they moved from the engineering teams to the marketing department somewhere down the line), they are pretty much technically illiterate. In that case, they wouldn't be able to properly mount a graphics card inside a PC, and all they're ever able to do is quote from the presentation they've just gone through. In that case, though, they ALWAYS have an engineer with them. Said engineer handles the technical part of the briefing and, of course, the technically-oriented questions from the press. Whatever the case, two things stand out clearly: 1) the engineering and marketing teams do more than just exchange a single botched-up PDF, and 2) there's absolutely NO WAY this was an honest mistake, one of those things that routinely get overlooked, like, say, the plastic shroud not being black, but a very dark shade of grey.

Personally, I only care about this because they quite clearly lied about it willingly. I don't buy or recommend graphics cards because they're from nVidia or AMD; the only thing that actually matters to me is what a card offers for the money (performance, noise, overclocking, reliability, good drivers, etc.). I've had cards from both companies over the years and I've tested hundreds more. I don't care who's caught lying, I can always buy from the other guy after all. But I do care when both the press and (implicitly) the consumers are being lied to so blatantly. Such an event should never be treated casually by either side, because it sets the worst kind of precedent possible: it sends the liar a signal that it's OK to continue lying and that they can easily get away with it.
Hood: The sub-group of humanity known as "gamers" has traditionally had only one sure way to be heard - voting with their wallets. But we love any chance to get vocal about "the enemy", even over petty lies by admen (like that's a real shock - who knew that our trusted advertisers could sink so low? People who rage about this have obviously never told any lies in their entire lives). So if this crap has your panties in a wad, go buy an AMD card; I'm sure they never lied about any of their crappy cards. Otherwise you should just keep running your 970 and be glad that it beats a 780 for $100 less (and that they both blow away any Radeon card). Who can blame AMD for taking a cheap shot? They are rapidly losing market share and heading for bankruptcy. I recently bought a brand-new EVGA Classified GTX 780 Ti for $400; how is AMD going to beat that?
So, in your opinion, the fact that you got to buy nVidia's third best card (for its generation) for a price that used to get you a high-end flagship not so long ago is reason enough to overlook the fact that your favorite company intentionally lied to its customers? Fanboys never, ever cease to amaze...
Posted on Reply
#90
xorbe
I don't like it, but it's a first world problem, so I don't care much.
Posted on Reply
#91
64K
scorpion_amd13: So, in your opinion, the fact that you got to buy nVidia's third best card (for its generation) for a price that used to get you a high-end flagship not so long ago is reason enough to overlook the fact that your favorite company intentionally lied to its customers? Fanboys never, ever cease to amaze...
What? The GTX 780Ti was the flagship of that generation (Kepler), and the Classified version is still a very nice card. Why are you saying that you could get a flagship for that same price ($400)?

At launch

GTX 480 $500
GTX 580 $500
GTX 780 $650
GTX 780Ti $700
Posted on Reply
#92
HumanSmoke
john_: I have also read many articles and reviews about CrossFire problems and GPU throttling on the 290X. Probably many more than about NVIDIA's lies with the 970.
So what is your point about what I bothered reading? That I don't care to read about AMD's problems? If that is the case, don't create a false impression by picking only one line from my post and ignoring/deleting/downgrading the rest of it.
If this is not the case, ignore this comment and tell me what you really meant. :)
HumanSmoke: No, they are actually quite different. As you've stated, you've read many articles about the 970 issue. The one article concerning the UVD issue you couldn't be bothered reading... but don't feel bad, many people back in the day couldn't be arsed either. The more things change, the more they stay the same. ;)
I would have thought it was obvious. Consumers in general have short memories - especially so in markets with a high incidence of built-in obsolescence. Nothing in my previous post(s) was a personal attack aimed at you, yet you've chosen to see it as such - so that's something you're best answering yourself. As for singling out part of your post, it was done to highlight the general malaise with which consumers view tech. As I pointed out earlier:
HumanSmoke: In the end, the next shiny thing on the shelves trumps social conscience for the most part.
The rest of your post? Well, that - buyer beware, in essence - was stated as your opinion, and your opinion is as valid as anyone else's. Why would I argue opinion when the represented values have differing levels of impact from person to person?
My opinion is that if fugitive war criminals whose part-time hobby was spreading Ebola through orphanages decided to sell graphics cards at 30% of MSRP, people would get crushed in the stampede to buy them.
Posted on Reply
#93
xvi
I'm hoping everyone gets upset and the market gets flooded with barely used 970s. I'd buy one used (at a good price) knowing the 970 is a little gimped, but I'd be pretty miffed if I got less than what was advertised.

As many have pointed out, those of us who actually take VRAM usage into account when purchasing a card certainly have grounds to stand on here. We expected to be able to properly use 4GB of video memory and we're missing ~12% of it.

True, that doesn't make the benchmarks and reviews any less factual; the issue existed then, too. Still, consumers who purchased the card expecting it to perform well down the line, when VRAM usage is higher, will be disappointed.
Posted on Reply
#94
scorpion_amd13
64K: What? The GTX 780Ti was the flagship of that generation (Kepler), and the Classified version is still a very nice card. Why are you saying that you could get a flagship for that same price ($400)?

At launch

GTX 480 $500
GTX 580 $500
GTX 780 $650
GTX 780Ti $700
OK, second best if you don't count dual-GPU stuff. Either way, you're forgetting about one or two Titans (the Titan Black Edition and the Titan Z; the original Titan was slower than the 780 Ti).

As for price, you just need to go back a little further than the GTX 480. Namely, the GTX 285 had a $400 price tag attached to it at launch, and it was the single-GPU flagship of its generation.
Posted on Reply
#95
Ja.KooLit
The poll doesn't matter much because you can vote anonymously. The fact that NVIDIA lied makes me sick. If nobody had found out about the gimped 970, would they even have had the courage to say "oops, wrong specs"? They would just keep their mouths shut and hope nobody finds out. Of course fanboys will always defend them, even when it's too obvious that NVIDIA keeps milking consumers.
Posted on Reply
#96
efikkan
Complain all you want, but it's the real-world performance that counts. Slightly lower memory bandwidth and L2 cache don't matter much for the GTX 970, since it's bottlenecked by processing power long before memory bandwidth. Anyone who does the calculations will see the GTX 970 has more memory bandwidth per GFLOP, which means memory bandwidth is a smaller problem for the GTX 970. Tests like these show that when overclocking the GPU alone, the GTX 970 shrinks the performance gap to the GTX 980.

Most people don't realise that extra memory bandwidth offers little benefit for a given level of GPU performance, while increasing GPU performance also increases the need for memory bandwidth. This means a higher-performing GPU is more bottlenecked by memory bandwidth.

This is why the GTX 970 is a more "balanced" GPU than the GTX 980, and is also why the GTX 980 should in theory be ~32% faster but in reality is less than 15% faster.
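As a minimal sketch of that bandwidth-per-GFLOP calculation (the reference-spec numbers below are my own inputs, not figures from this thread):

```python
# Compare nominal memory bandwidth per unit of shader throughput, assuming
# reference specs: GTX 970 = 1664 shaders @ 1050 MHz base, GTX 980 = 2048
# shaders @ 1126 MHz base, both with 224 GB/s of rated bandwidth.
def gflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1000  # 2 FLOPs per shader per clock (FMA)

gtx970 = gflops(1664, 1050)  # ~3494 GFLOPS
gtx980 = gflops(2048, 1126)  # ~4612 GFLOPS
bandwidth = 224              # GB/s

print(f"GTX 970: {bandwidth / gtx970:.3f} GB/s per GFLOP")  # ~0.064
print(f"GTX 980: {bandwidth / gtx980:.3f} GB/s per GFLOP")  # ~0.049
print(f"980's theoretical compute lead: {gtx980 / gtx970 - 1:.0%}")  # ~32%
```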
Posted on Reply
#97
the54thvoid
Super Intoxicated Moderator
scorpion_amd13: Fanboys never, ever cease to amaze...
Says scorpion_amd13. Unless that's not the relevant AMD we're talking about in context with Nvidia?

And wtf? You've been a member 9+ years and posted 13 times? That must be a record you hermit - post more :D
Posted on Reply
#98
scorpion_amd13
efikkan: This is why the GTX 970 is a more "balanced" GPU than the GTX 980, and is also why the GTX 980 should in theory be ~32% faster but in reality is less than 15% faster.
You do realize that what you described has absolutely nothing to do with memory bandwidth, right? Upping the shader count never yields the same percentage in regards to real-life performance gain. Never ever. It's because of the way the GPU itself schedules tasks for its many shader groups, basically. The more shader groups, the more complex and hairy it all gets, so yeah, ~15% gain sounds about right considering the difference in frequency and shader count between the 970 and the 980.

Want a famous example? Radeon HD 3870 (320 VLIW5 shaders) versus Radeon HD 4870 (800 of the same VLIW5 shaders). The 4870 was about 56% more powerful, even though the shader count alone would indicate a massive jump in performance. And memory bandwidth had absolutely nothing to do with it.
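To put rough numbers on that example (a quick sketch; the reference clocks are my own assumption, not stated in the post):

```python
# Theoretical shader-throughput scaling vs. the real-world gain quoted above,
# assuming reference clocks: HD 3870 = 320 SPs @ 775 MHz, HD 4870 = 800 SPs
# @ 750 MHz, with 2 FLOPs per shader per clock.
def gflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1000

hd3870 = gflops(320, 775)  # ~496 GFLOPS
hd4870 = gflops(800, 750)  # ~1200 GFLOPS

print(f"on-paper scaling: {hd4870 / hd3870:.2f}x")  # ~2.42x
print("gain observed in practice (per the post): ~1.56x")
```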

@the54thvoid : Heh, long story, that part of the nickname actually has to do with the initials of my name. Not that hard to guess that I was rather young (although not 13) when I chose the nickname. And yeah, I've been around a really long time. I just rarely post. I enjoy the site and the community (gotta love those slug-fests between fanboys, eh?), I'm just rarely tempted to get involved.
Posted on Reply
#99
Ja.KooLit
the54thvoid: Says scorpion_amd13. Unless that's not the relevant AMD we're talking about in context with Nvidia?

And wtf? You've been a member 9+ years and posted 13 times? That must be a record you hermit - post more :D
Up for some long-term hibernation, I guess? :D
Posted on Reply
#100
xvi
scorpion_amd13: You do realize that what you described has absolutely nothing to do with memory bandwidth, right? Upping the shader count never yields the same percentage in regards to real-life performance gain. Never ever. It's because of the way the GPU itself schedules tasks for its many shader groups, basically. The more shader groups, the more complex and hairy it all gets, so yeah, ~15% gain sounds about right considering the difference in frequency and shader count between the 970 and the 980.

Want a famous example? Radeon HD 3870 (320 VLIW5 shaders) versus Radeon HD 4870 (800 of the same VLIW5 shaders). The 4870 was about 56% more powerful, even though the shader count alone would indicate a massive jump in performance. And memory bandwidth had absolutely nothing to do with it.

@the54thvoid : Heh, long story, that part of the nickname actually has to do with the initials of my name. Not that hard to guess that I was rather young (although not 13) when I chose the nickname. And yeah, I've been around a really long time. I just rarely post. I enjoy the site and the community (gotta love those slug-fests between fanboys, eh?), I'm just rarely tempted to get involved.
You don't post often, but when you do, you post well.

Posted on Reply