Wednesday, January 28th 2015
NVIDIA to Tune GTX 970 Resource Allocation with Driver Update
NVIDIA plans to release a fix for the GeForce GTX 970 memory allocation issue. In an informal statement to users of the GeForce Forums, an NVIDIA employee said that the company is working on a driver update that "will tune what's allocated where in memory to further improve performance." The employee also stressed that the GTX 970 is still the best performing graphics card at its price-point, and if current owners are not satisfied with their purchase, they should return it for a refund or exchange.
Source:
GeForce Forums
89 Comments on NVIDIA to Tune GTX 970 Resource Allocation with Driver Update
@ryun don't be so fast to dismiss this... people were complaining about the 970 on the NVIDIA forum since day 1 with no real answers.
Also, the PS4 and Xbox One do have 8 GB of memory inside, but some of it is reserved for system resources. Recently they have even stated they are/will be allowing more and more of it to be accessed by games.
Anyway, any update will help the GTX 970, but it will never remove the root cause completely. It might be better if they find a way to use the ~500 MB slow segment for different kinds of data, freeing the main 3.5 GB for the tasks that need it. That could be a good way to get some extra performance and make the 3.5 GB feel a bit larger, though that is just a random thought (a rough sketch of how the split shows up is below). Any performance work is good for current owners and for the future, so if they can help it at all, that is going to be nice.
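Purely illustrative, and not NVIDIA's tool or the exact benchmark from the forums (the chunk size and the 3.5 GB figure are just the numbers people have been quoting): the idea is to allocate VRAM in 128 MiB pieces, time a device-to-device copy into each new piece, and on a 970 you'd expect the reported bandwidth to fall off once the total passes roughly 3.5 GB. Build with nvcc (e.g. nvcc probe.cu -o probe).

// Sketch: probe VRAM copy bandwidth chunk by chunk (CUDA runtime API, host code only).
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

int main() {
    const size_t chunk = 128ull * 1024 * 1024;              // 128 MiB per allocation (arbitrary choice)
    void* src = nullptr;
    if (cudaMalloc(&src, chunk) != cudaSuccess) return 1;   // device-side source buffer
    cudaMemset(src, 0xAB, chunk);

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    std::vector<void*> chunks;
    for (int i = 0; ; ++i) {
        void* dst = nullptr;
        if (cudaMalloc(&dst, chunk) != cudaSuccess) break;   // stop once VRAM runs out
        chunks.push_back(dst);

        cudaEventRecord(start);
        cudaMemcpy(dst, src, chunk, cudaMemcpyDeviceToDevice);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);

        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        printf("after %4.1f GB allocated: %6.1f GB/s\n",
               (i + 2) * double(chunk) / 1e9,                // +2 counts the source buffer too
               (chunk / 1e9) / (ms / 1e3));                  // copied bytes per second for this chunk
    }

    for (void* p : chunks) cudaFree(p);
    cudaFree(src);
    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    return 0;
}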
Also, why are we discussing G-Sync vs. FreeSync in this thread? I don't see the relevance.
As for them finding any "across the board" FPS gains, that's highly doubtful outside of a few instances; perhaps they'll get it to be more usable for those with SLI. Either way, it seems NVIDIA is sweeping this under the rug.
I certainly have noticed micro-stutter in Mordor that I associate with this, and it sucks. It's very noticeable, even in benchmarking (where you see a very quick plummet in framerate). It would be great if I could use higher resolution/texture settings (memory-heavy options) and simply have a lower but consistent experience... I feel the resolution options are granular enough (say from 2560x1440 -> 2688x1512 -> etc.) that the space between 3.5 and 4 GB would be useful. I would also have appreciated the L2C/SYS/XBAR clocks being set to the GPC clocks out of the box (like previous generations?), as raising them seemed to help things quite a bit for me. There are simply quite a few NVIDIA and AIB (mostly EVGA) kerfuffles that have rubbed me the wrong way about this product since launch, and they are only being fixed/addressed now because of greatly appreciated community digging.
That all said, and this doesn't excuse it, these issues are being addressed... and that means something. Yes, NVIDIA screwed the pooch by not disclosing this information at launch, but not only does this revelation not change the product, it also holds their feet to the fire for optimizations more so than if it had been disclosed from the start. It also doesn't change the fact that when I run at 2560x1440 (etc.), which is really all I need from my viewing distance, I am still getting better performance than any other 9.5" card on the market. I feel that for the price I paid, the performance is fair. Had I paid more than I did, or conversely had they cut the product to 12 SMMs/192-bit, I would likely be disappointed. This is obviously a very well thought-out and well-placed product, and it still deserves praise for the weird amalgamation that it is.
Edit: added a pic of how I changed the L2C etc. clocks. I know this is a common mod in the community, but I'm curious whether it helped others stabilize things as much as it helped me.
And as for the memory capacity: it says 8 GB and, guess what, the PS4 has 8 GB of memory. Are you saying it isn't on the PCB? Oh wait... No one said the GTX 970 doesn't have 4 GB of memory; we all know that it does. But the way it accesses and utilizes it, that's the shitty part and the reason for the outrage beyond 3.5 GB. Get your facts straight, man...
Would you mind telling us where you got the 1.2 billion transistor count? From speculation about desktop Jaguar cores? The PS4 ain't a desktop system, you know; its chip is custom-designed specifically for the PS4 and based on desktop components...
The card does in fact have 64 ROPs. It just only uses 56 because using the others would actually make the card slower.
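(Back-of-the-envelope on why, if I have the figures from the press coverage right: with 13 SMMs each able to output at most 4 pixels per clock, the shader array can only feed about 13 x 4 = 52 pixels per clock, which is already below the 56 ROPs that are enabled, so the remaining 8 ROPs couldn't be kept fed anyway.)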
Holy shit. 290x got super cheap. Like...wow. I was thinking they were $360-370 (or whatever the last price drop was), in which case I think the premium over a 290 vanilla was (and is) still worth it. At $280 for a 290x though (!) you're totally right. If you can handle one of those beasts, they are one hell of a deal.
I still stand by the nicety that is a short card, though, as well as being able (even if only by BIOS mods) to draw a huge amount of power over 2x 6-pin. Having a high-ish-end card in a mid-range package is super nice... granted, it's totally outside all kinds of specs and perhaps bound one day to release the magic smoke. The fact that it exists and CAN do it, though, is worth the overall smallish premium on $/perf to me; others may see perf/W at stock as a boon. It's all relative (and in regard to PR-speak at that), but yeah... those 290Xs are a nice deal. :roll: :lovetpu:
While NVIDIA has never fully explained in depth how such memory allocation is handled on those cards, it worked and we knew it was there (perhaps we've now gotten closer to it than they ever wanted). NVIDIA should've at least presented a very top-level overview, saying this is engineering based on multiple generations of experience. This time I feel they didn't want it known that they'd implemented it on such a high-performance card. It's not that big a deal given the price/performance; trade-offs are made for yields and for offering the enormous volume they needed...
But the fact is engineering crafted it (as customers finally unearthed), while marketing, I really believe, couldn't bring themselves to admit the gritty truth, and they deserve the blame. I've said there should be some type of restitution, but as I'm not affected, others can figure that out.
Enlightenment Entitlement :roll:
*AMD acquired ATI in October 2006; the 2900 XT launched in May 2007.
It's wrong and not something anyone should take lightly or slight.