
NVIDIA Responds to GTX 970 Memory Allocation 'Bug' Controversy

Joined
Aug 20, 2007
Messages
21,541 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
The price will likely drop a bit anyway once AMD actually have a new product to sell, the current fire sale of products in a market already flooded with cheap ex-miners clearly doesn't make much difference.

Wait a second, ex-miners? That was so 2013. I haven't seen cards used in mining at any profitable margins since mid-2014, and even then nearly no one was doing it anymore.

There might still be a few on the market, but I doubt it.
 
Joined
Oct 22, 2014
Messages
14,170 (3.81/day)
Location
Sunshine Coast
System Name H7 Flow 2024
Processor AMD 5800X3D
Motherboard Asus X570 Tough Gaming
Cooling Custom liquid
Memory 32 GB DDR4
Video Card(s) Intel ARC A750
Storage Crucial P5 Plus 2TB.
Display(s) AOC 24" FreeSync 1 ms 75Hz
Mouse Lenovo
Keyboard Eweadn Mechanical
Software W11 Pro 64 bit
Am I reading into it wrongly, or is the 970 basically operating in 224-bit mode most of the time, and not 256-bit mode?
Yep, you're reading it wrong, that's the bandwidth in GB/s
 
Joined
Sep 7, 2011
Messages
2,785 (0.57/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
NOOOO NVIDIA IS OUR BULL GOD!!!!!!!
Seriously though, for most it's still a good deal.
Pretty much, the numbers might change but the performance hasn't magically altered since it launched.

Having said that, I wouldn't mind one bit if Nvidia dropped prices across the board, offered AAA title game keys, and intro'd the GM 200 early just to erase the bad taste.
 
Joined
Feb 14, 2012
Messages
2,356 (0.50/day)
System Name msdos
Processor 8086
Motherboard mainboard
Cooling passive
Memory 640KB + 384KB extended
Video Card(s) EGA
Storage 5.25"
Display(s) 80x25
Case plastic
Audio Device(s) modchip
Power Supply 45 watts
Mouse serial
Keyboard yes
Software disk commander
Benchmark Scores still running
Yep, you're reading it wrong, that's the bandwidth in GB/s

Are you sure? If it's only using 7 of 8 channels for 3.5GB that's 224-bit mode. Then the 8th channel is like 32-bit mode, and the two modes are exclusive. I thought AnandTech made it clear that it strides 1-2-3-4-5-6-7 - 1-2-3-4-5-6-7 and doesn't touch the remaining 8th channel until more than 3.5GB.
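If the AnandTech description is right, the arithmetic is easy to sketch. Here is a back-of-the-envelope model, assuming 7 Gbps effective GDDR5 and eight 32-bit channels (an illustration only, not vendor data):

```python
# Back-of-envelope model of the GTX 970's segmented memory, per the
# AnandTech description above. Assumes 7 Gbps GDDR5 and 32-bit channels;
# treat the figures as an illustration, not vendor specifications.

CHANNEL_WIDTH = 32    # bits per memory channel
DATA_RATE = 7.0       # Gbps per pin (effective GDDR5 rate, assumed)

def bandwidth_gbps(channels):
    """Peak bandwidth in GB/s for a given number of active channels."""
    return channels * CHANNEL_WIDTH * DATA_RATE / 8

fast = bandwidth_gbps(7)   # 3.5 GB segment: strides across channels 1-7
slow = bandwidth_gbps(1)   # 0.5 GB segment: the 8th channel alone
full = bandwidth_gbps(8)   # what a true 256-bit card would reach

print(fast, slow, full)    # 196.0 28.0 224.0
```

On those assumptions the 3.5 GB segment peaks at 196 GB/s against 224 GB/s for a full 256-bit card, and the last 0.5 GB is limited to 28 GB/s when accessed on its own.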
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,119 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850W Gold (ATX3.0)
Software W10
Ignorance stones? Only a fanboy in denial or a complete moron would take Nvidia's explanation as truth with zero skepticism. Performance at the time of review isn't the problem; the problem is that people were sold a GPU expecting its specifications to be the same four months later as they were at release. Also, people don't buy a GPU just to play the games on the market at the point of release, they buy with future performance in mind. You don't think some people might have decided to skip the 970 if they had known there could be problems addressing the full 4GB of RAM in the future, especially when console ports are using up to 3.5GB at 1080p now? What about the people who were worried about that 256-bit memory bus? Nvidia pointed out to reviewers that their improved L2 cache would keep memory requests to a minimum, allowing more "achieved" memory bandwidth even with a smaller memory interface. The point is they lied about the specs, and it's almost impossible to believe this is some kind of big misunderstanding between the engineering and marketing teams that they just happened to realize after 970 users discovered it.

Mmm. Read my posts much? I did say I don't believe NV were genuine. Yawn...
 
Joined
Sep 7, 2011
Messages
2,785 (0.57/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Wait a second, ex-miners? That was so 2013. I haven't seen cards used in mining at any profitable margins since mid-2014, and even then nearly no one was doing it anymore.
There might still be a few on the market, but I doubt it.
There are still more than a few out there. A quick browse of eBay shows multiple listings. This guy has 8 used R9 290s for sale, used "in an air conditioned environment", which kind of screams scrypt mining, even if the number of boards being sold isn't a big enough clue.
 
Joined
Aug 20, 2007
Messages
21,541 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
Ah. Must be some alt-coin craze I was unaware of after litecoin then. Wow, you'd think people would realize GPU mining profit wasn't all that great after that and cut it out way before now. *shrugs*

For what it's worth, when I sold my mining cards, I did so here and sold them dirt cheap with big warnings on them. Thanks for the correction, anyhow.
 
Joined
Jul 19, 2008
Messages
1,180 (0.20/day)
Location
Australia
Processor Intel i7 4790K
Motherboard Asus Z97 Deluxe
Cooling Thermalright Ultra Extreme 120
Memory Corsair Dominator 1866Mhz 4X4GB
Video Card(s) Asus R290X
Storage Samsung 850 Pro SSD 256GB/Samsung 840 Evo SSD 1TB
Display(s) Samsung S23A950D
Case Corsair 850D
Audio Device(s) Onboard Realtek
Power Supply Corsair AX850
Mouse Logitech G502
Keyboard Logitech G710+
Software Windows 10 x64
I'm waiting for an apology to the community from all the guys here that said there was no issue... or at least just say you were wrong.

Nvidia advertised the 970 as having the same memory subsystem as the 980. That is clearly a big fat lie, and it's not even a little different, it's completely different. I can't see how they could "overlook" that in the specs.

Edit: ...and why does GPU-Z show 64 ROPs when there are 56? Does GPU-Z actually detect them, or are the specs written in based on the model number?
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,963 (3.72/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
why does GPU-Z show 64 ROPs when there are 56? Does GPU-Z actually detect them, or are the specs written in based on the model number?
GPU-Z asks the NVIDIA driver. Even if I queried the hardware units, it would still show them as active, because they are not really disabled (they can be used for AA)
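A toy model of the behavior W1zzard describes: a query layer that enumerates addressable units will report 64 even when only 56 are fully fed in normal rendering. The class and method names here are invented for the sketch; this is not the actual NVIDIA driver interface.

```python
# Toy illustration of why a tool that asks the driver (as GPU-Z does)
# reports 64 ROPs on a GTX 970: the extra units are not fused off, only
# bypassed for normal rendering, so the driver still enumerates them.
# All names here are hypothetical, not a real driver API.

class FakeDriver:
    """Stand-in for a driver query layer that reports addressable units."""
    def __init__(self, physical_rops, usable_rops):
        self.physical_rops = physical_rops  # present and addressable (e.g. for AA)
        self.usable_rops = usable_rops      # fully fed in normal rendering

    def query_rop_count(self):
        # The driver enumerates addressable units, not the effective count.
        return self.physical_rops

gtx970 = FakeDriver(physical_rops=64, usable_rops=56)
print(gtx970.query_rop_count())  # 64, even though only 56 are fully fed
```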
 
Joined
Sep 7, 2011
Messages
598 (0.12/day)
Location
Pacific Rim
Processor Ryzen 3600
Motherboard B450
Cooling Scythe Ashura
Memory Team Dark Z 3200 8GB x2
Video Card(s) MSI 390
Storage WD 2TB + WD Green 640GB
Display(s) Samsung 40JU6600 @ 200% scaling
Case Coolermaster CM 690 II
Audio Device(s) Fiio E10K, Graham Slee Solo II SRG, Sennheiser HD6XX, AKG K7XX, ATH WS1100is
Power Supply Corsair HX650
Mouse Rival 700
Keyboard Corsair K70, Razer Tarantula
It's amusing that what many people want is:
a. an apology
b. a free game
c. a price drop

rather than anything that solves, negates, or prevents further problems.
 
Joined
Apr 30, 2012
Messages
3,881 (0.84/day)
GPU-Z asks the NVIDIA driver. Even if I queried the hardware units, it would still show them as active, because they are not really disabled (they can be used for AA)

If you're querying the driver and it's telling you 64, does that mean the marketing department is making drivers now?

We all just got told that the engineers knew but never communicated the correct specs to the marketing team.
 
Joined
Jul 19, 2008
Messages
1,180 (0.20/day)
Location
Australia
Processor Intel i7 4790K
Motherboard Asus Z97 Deluxe
Cooling Thermalright Ultra Extreme 120
Memory Corsair Dominator 1866Mhz 4X4GB
Video Card(s) Asus R290X
Storage Samsung 850 Pro SSD 256GB/Samsung 840 Evo SSD 1TB
Display(s) Samsung S23A950D
Case Corsair 850D
Audio Device(s) Onboard Realtek
Power Supply Corsair AX850
Mouse Logitech G502
Keyboard Logitech G710+
Software Windows 10 x64
GPU-Z asks the NVIDIA driver. Even if I queried the hardware units, it would still show them as active, because they are not really disabled (they can be used for AA)

Ok, I see... and GPU-Z doesn't detect cache, so that went unnoticed.

The more I think about this, the worse it looks for Nvidia. There is a very long thread on the Nvidia forums about this, starting just after the 970's release late last year. Nvidia only commented when the story was reported by major tech sites. I can't believe they didn't know about this much earlier... it's like they held off as long as they could to keep the hype going and get 970 sales over the Christmas period.

The 970's memory subsystem is unique; I'm not aware of a similar system, at least on mid-to-high-end graphics cards, unless they have been doing it and we weren't aware. I just don't accept that they overlooked a memory subsystem this unique and forgot to mention it... and I don't accept it took them three months to figure it out.
 
Joined
Sep 7, 2011
Messages
2,785 (0.57/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
The more I think about this, the worse it looks for Nvidia. There is a very long thread on the Nvidia forums about this, starting just after the 970's release late last year. Nvidia only commented when the story was reported by major tech sites. I can't believe they didn't know about this much earlier... and I don't accept it took them three months to figure it out.
It's actually quite believable.
What percentage of GTX 970 buyers encountered the problem? Sometimes people (especially the disgruntled) post multiple times over multiple forums, but for the most part, 970 owners don't seem that affected - even owners here and elsewhere say it hasn't impacted them personally (in fact the biggest outcry is from people who own AMD cards, go figure!). Nvidia stated they'd sold a million GTX 970s and 980s, and I'll go out on a limb and say the bulk of those sales were of the former. Hundreds of thousands of cards sold, and how many individual issues (as opposed to multiple postings by individuals) were reported?
You don't have to look very far for a precedent. AMD's Evergreen series owners started questioning the stability of the cards almost from launch in September 2009 (I returned two cards of mine personally and 3-4 from builds I was doing for others). It wasn't until the issue became better publicized in early 2010 that AMD started work on trying to locate and remedy the problem (PowerPlay state voltage settings).
It would be nice to entertain the thought that this kind of stuff is acted on as soon as it rears its head, but that seldom happens unless the issue is pervasive.
 
Joined
Feb 13, 2012
Messages
523 (0.11/day)
Are you sure? If it's only using 7 of 8 channels for 3.5GB that's 224-bit mode. Then the 8th channel is like 32-bit mode, and the two modes are exclusive. I thought AnandTech made it clear that it strides 1-2-3-4-5-6-7 - 1-2-3-4-5-6-7 and doesn't touch the remaining 8th channel until more than 3.5GB.


Another thing that was mentioned is that each SMM on Maxwell can output 4 pixels per clock, meaning that even though it has 56 ROPs, in reality it only has a 52 pixel/clock fillrate due to having 13 SMMs. In other words it's more like 208-bit, only able to feed 6.5 of the channels; had Nvidia enabled only 12 SMMs, a 192-bit bus would have fed the GPU just fine. With all this brought to attention, it only makes me realize how ignorant I've been about so many of the details of GPUs. How does this translate to other architectures like GCN, for example? Yes, Hawaii has a 512-bit bus and 64 ROPs, but do they get fed efficiently, or are they there more for compute rather than graphics? Because that's what it feels like. And with this said, everyone complained about the GTX 960 having only a 128-bit bus, but come to think of it, with 32 ROPs it gets fed exactly according to how much it can handle without any bottlenecks.
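The fill-rate reasoning above can be written out as a small sketch, taking the 970's 56 ROPs and 13 SMMs at 4 pixels/clock; the "equivalent bus width" here is just the 256-bit bus scaled by that ratio, an illustration of the argument rather than a real hardware quantity.

```python
# Sketch of the fill-rate argument: a Maxwell SMM can emit 4 pixels per
# clock, so the SMM count (not the ROP count) caps pixel throughput.
# "Equivalent bus width" scales the full 256-bit bus by the fraction of
# peak fill rate in use - an illustration, not a hardware figure.

def pixel_throughput(smm_count, rops, px_per_smm=4):
    """Pixels per clock actually achievable: capped by SMMs or ROPs."""
    return min(smm_count * px_per_smm, rops)

def equivalent_bus_bits(px_per_clock, rops=64, full_bus=256):
    """Bus width scaled to the fraction of peak fill rate being used."""
    return full_bus * px_per_clock / rops

gtx_970 = pixel_throughput(13, 56)            # 13 SMMs cap it at 52 px/clk
print(gtx_970, equivalent_bus_bits(gtx_970))  # 52 208.0
```

On these numbers, 13 SMMs x 4 = 52 pixels/clock against 56 ROPs, and 256 x 52/64 gives the 208-bit figure from the post.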
 
Joined
Feb 14, 2012
Messages
2,356 (0.50/day)
System Name msdos
Processor 8086
Motherboard mainboard
Cooling passive
Memory 640KB + 384KB extended
Video Card(s) EGA
Storage 5.25"
Display(s) 80x25
Case plastic
Audio Device(s) modchip
Power Supply 45 watts
Mouse serial
Keyboard yes
Software disk commander
Benchmark Scores still running
When I say 256-bit vs 224-bit, I'm referring to the effective VRAM width (not the SM/core/ROP guts of the GPU; that can vary a lot as usual). I'm surprised everyone is hung up on the 4 vs 3.5, and not the 256 vs 224.
 
Joined
Aug 11, 2011
Messages
4,357 (0.89/day)
Location
Mexico
System Name Dell-y Driver
Processor Core i5-10400
Motherboard Asrock H410M-HVS
Cooling Intel 95w stock cooler
Memory 2x8 A-DATA 2999Mhz DDR4
Video Card(s) UHD 630
Storage 1TB WD Green M.2 - 4TB Seagate Barracuda
Display(s) Asus PA248 1920x1200 IPS
Case Dell Vostro 270S case
Audio Device(s) Onboard
Power Supply Dell 220w
Software Windows 10 64bit
AMD's Evergreen series owners started questioning the stability of the cards almost from launch in September 2009 ( I returned two cards of mine personally and 3-4 from builds I was doing for others). It wasn't until the issue became better publicized in early 2010 that AMD started work on trying to locate and remedy the problem ( PowerPlay state voltage settings).

Yeah, but in AMD's case they didn't really know what was going on and had to investigate, and since it wasn't repeatable 100% of the time, that made things more difficult. In this case nVidia knew everything beforehand, and in fact they themselves caused the confusion by providing the press with wrong specs. I'm not saying AMD hasn't done things wrong in the past, just that the example you used isn't comparable IMO.

I mean, we all know AMD and nVidia cherry-pick benchmarks and stuff to make their products look better, but flat-out giving wrong specs is a different thing.
 
Joined
Apr 19, 2011
Messages
2,198 (0.44/day)
Location
So. Cal.
What stinks is that the more I read, the more we unearth the smell of rotten fish... Nvidia looks to have found plenty of chips with 3 defective SMMs that needed to be fused off, as planned. However, they also found a defective L2 partition on too many chips, and appear to have figured out a way to weasel around it because "nobody ever checks or questions L2" and we take their word (the specs). I don't care if it performs all right; they duped folks because their equipment wasn't as advertised.

Nvidia screwed the pooch on this by saying the GTX 970 ships with THE SAME MEMORY SUBSYSTEM AS OUR FLAGSHIP GEFORCE GTX 980, and I'm not buying that PR didn't get the message. Nvidia doesn't need to lower prices or give away games... it's the owners who purchased before this came out who should have a way to be compensated if they want. Nvidia should just step up, come out and say: if you want to return them we'll refund all your money, or you can apply for some form of settlement. If Nvidia can't bring themselves to do that, then a class action suit should be brought so the owners who were duped are provided some compensation.

What's so funny are the guys here defending Nvidia, saying it's not a big deal, or that they all do it... OMG! If this were AMD or Apple, those same folks would be calling for their heads. Allowing Nvidia to sweep this under the rug would just encourage other tech companies to deceive with impunity.
 
Joined
Sep 7, 2011
Messages
2,785 (0.57/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
Yeah, but in AMD's case they didn't really know what was going on and had to investigate, and since it wasn't repeatable 100% of the time, that made things more difficult. In this case nVidia knew everything beforehand, and in fact they themselves caused the confusion by providing the press with wrong specs. I'm not saying AMD hasn't done things wrong in the past, just that the example you used isn't comparable IMO.
I was using comparable in the sense that between the first signs of detection and the subsequent action there is a lag. If you wanted a more apples-to-apples comparison of a company's prior knowledge of a performance discrepancy while declining to publicize it, a more apt one would be AMD's advertising of Bulldozer as an eight-core processor, knowing full well that the limitations of shared resources meant a degradation in per-core performance.
I mean, we all know AMD and nVidia cherry pick benchmarks and stuff to make their products look better but flat out giving wrong specs is a different thing.
Agreed.
 