Friday, January 23rd 2015
GeForce GTX 970 Design Flaw Caps Video Memory Usage to 3.3 GB: Report
It may be the most popular performance-segment graphics card of the season, and offer unreal levels of performance for its $329.99 price, but the GeForce GTX 970 suffers from a design flaw, according to an investigation by power users. GPU memory benchmarks run on the GeForce GTX 970 show that the GPU is unable to address the last 700 MB of its 4 GB of memory.
The "GTX 970 memory bug," as it's now being called on tech forums, is being linked to user reports of micro-stutter on GTX 970 setups in VRAM-intensive gaming scenarios. The GeForce GTX 980, on the other hand, shows no signs of this bug; that card is able to address its entire 4 GB. When flooded with posts about the investigation on OCN, a forum moderator on the official NVIDIA forums responded: "we are still looking into this and will have an update as soon as possible."
Sources:
Crave Online, LazyGamer
192 Comments on GeForce GTX 970 Design Flaw Caps Video Memory Usage to 3.3 GB: Report
Great news for the owners, and gg nvidia :slap:
Really I think it goes deeper than AB for seeing the lack of memory usage, but hey :wtf: tjmo
I won't believe that engineers are so stupid these days that they don't know how to manage device memory (or at least how to make sure that it's accessible at runtime).
It seems they may have cut too much with the laser when making the 970, as the 980 doesn't suffer this issue.
The memory is still being utilized; it's just that when you hit 3000+ MB, it seems to take a nose dive to 25% and below, causing bottlenecks and leading to performance issues.
Cue E.D. jokes. Blue pill solution.
Sarcasm intended.
Either way this is a pretty big issue (if it is 100% confirmed) and something that hopefully can be addressed with a patch in drivers/bios.
Though it is curious that the 980s do not suffer the same problem, which does bring to mind the rumor of it having to do with the cut on the chip.
Not sure honestly what to make of this. It's interesting to say the least...
..oh wait a minute.
What bothers me is the naivete of some commenter above stating that nVidia would not send a flawed product out and that their engineers test it fully. I'm sorry, but are you sure you're not brain dead?! Companies send out bloody cars fully knowing they have f*'d up the production on that model and that there is a good chance it will blow up.
And you dare to think that they would not send out a flawed GPU? When in God's name has a GPU killed anybody? Cars kill people on a daily basis, and the companies making them have no problem cutting costs even if that risks lives. And you expect a company to give a shit about a GPU? Seriously?!
Tech forums : Where hysteria, viral marketing and the echo chamber form a singularity so dense that even common sense can't escape.
But yeahh GTX 970 still the best till now...
- Allocated 31 Chunks
- Allocated 3968 MiByte
(bandwidth for the last 1-3 chunks varies from test to test)
How this looks in actual gameplay is the question. What do we see, and what don't we see? These problems can be expected only at 4K resolution. Anyone who buys a GTX 970 with 4 GB of RAM for 4K gaming will be disappointed.
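For context on output like the above: the benchmark presumably allocates video memory chunk by chunk (31 chunks of 128 MiB comes to the 3968 MiB reported) and times access to each chunk, so a bandwidth collapse on the final chunks stands out. Below is a minimal CPU-side sketch of that per-chunk methodology only; the real test runs on the GPU (via CUDA), and all names, sizes, and numbers here are illustrative stand-ins, not the actual tool.

```python
import time

# Scaled-down stand-in: the GPU test reportedly uses 31 chunks of 128 MiB
# (31 * 128 = 3968 MiB); we use small chunks so this runs anywhere.
CHUNK_BYTES = 8 * 1024 * 1024
NUM_CHUNKS = 8

def measure_chunk_bandwidth(chunk_bytes):
    """Allocate one chunk, time a coarse read pass, return (chunk, GB/s estimate)."""
    chunk = bytes(chunk_bytes)        # stand-in for a GPU-side allocation
    start = time.perf_counter()
    checksum = sum(chunk[::4096])     # touch one byte per 4 KiB, forcing a sweep
    elapsed = time.perf_counter() - start
    return chunk, chunk_bytes / elapsed / 1e9

chunks = []  # keep every chunk resident, as the real test does, so later
             # allocations are pushed into progressively higher memory
for n in range(NUM_CHUNKS):
    chunk, gbps = measure_chunk_bandwidth(CHUNK_BYTES)
    chunks.append(chunk)
    print(f"chunk {n}: ~{gbps:.1f} GB/s")
```

On a healthy card every chunk should report similar bandwidth; the GTX 970 reports are notable precisely because the last few chunks deviate sharply.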
Regardless, from all that debate I'm picking up conjecture, conspiracy and 'stabs in the dark'.
The issue needs to be addressed formally, not by a few end users.
Nvidia will have a case to answer, though. Regardless of the badly run benchmarks, people are reporting tanking frame rates when memory usage crosses a threshold that is well below the card's hardware limit. Who knows, maybe the texture compression algorithm uses VRAM itself.