Thursday, November 5th 2015
Black Ops III: 12 GB RAM and GTX 980 Ti Not Enough
This year's installment in the Call of Duty franchise, Black Ops III, has just hit stores, and is predictably flying off shelves. As with every annual release in the series, Black Ops III raises the visual presentation standards for the franchise. There is, however, one hitch in the way the game deals with system memory amounts as high as 12 GB and video memory amounts as high as 8 GB. This hitch could be the reason behind the stuttering issues many users are reporting.
In our first play-through of the game at its highest possible settings on our personal gaming machine - equipped with a 2560 x 1600 pixel display, a Core i7 "Haswell" quad-core CPU, 12 GB of RAM, a GeForce GTX 980 Ti graphics card, NVIDIA's latest Black Ops III Game Ready driver 358.87, and Windows 7 64-bit to top it all off - we noticed that the game was running out of memory. A peek at Task Manager revealed that at "Ultra" settings (and 2560 x 1600 resolution), the game was maxing out our 12 GB of system memory, not counting the 1.5-2 GB used up by the OS and essential lightweight tasks (such as antivirus). We also saw the game crash as little as 10 seconds into gameplay on a machine with 8 GB of system memory and a GTX 980 Ti.

What's even more interesting is the game's video memory behavior. The GTX 980 Ti, with its 6 GB of video memory, developed a noticeable stutter. That stutter disappeared on the GTX TITAN X, with its 12 GB of video memory, on which video memory load shot up from the maxed-out 6 GB of the GTX 980 Ti to 8.4 GB. What's more, system memory usage dropped with the GTX TITAN X, down to 8.3 GB.

On the Steam forums, users report performance issues that don't necessarily point at low FPS (frames per second), but at stuttering, especially at high settings. Perhaps the game needs better memory management. Once we installed 16 GB of RAM in the system, the game ran buttery-smooth with our GTX 980 Ti.
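The figures above come straight from Task Manager. For readers who want to pull the same system-wide and per-process numbers programmatically, here is a minimal sketch, assuming a Win32 build environment; it inspects its own process purely for illustration.

```cpp
// Minimal sketch: report system-wide and per-process memory use on Windows,
// roughly what Task Manager shows. Assumes a Win32 build environment;
// links against psapi.lib.
#include <windows.h>
#include <psapi.h>
#include <cstdio>

#pragma comment(lib, "psapi.lib")

int main()
{
    // System-wide picture: total and available physical RAM.
    MEMORYSTATUSEX ms = {};
    ms.dwLength = sizeof(ms);
    GlobalMemoryStatusEx(&ms);
    std::printf("Physical RAM: %.1f GB total, %.1f GB available (%lu%% in use)\n",
                ms.ullTotalPhys / (1024.0 * 1024.0 * 1024.0),
                ms.ullAvailPhys / (1024.0 * 1024.0 * 1024.0),
                ms.dwMemoryLoad);

    // Per-process picture: working set and private commit of this process.
    // To inspect another process (e.g. the game), open its PID with
    // OpenProcess(PROCESS_QUERY_INFORMATION | PROCESS_VM_READ, FALSE, pid).
    PROCESS_MEMORY_COUNTERS_EX pmc = {};
    GetProcessMemoryInfo(GetCurrentProcess(),
                         reinterpret_cast<PROCESS_MEMORY_COUNTERS*>(&pmc),
                         sizeof(pmc));
    std::printf("Working set: %.1f MB, private commit: %.1f MB\n",
                pmc.WorkingSetSize / (1024.0 * 1024.0),
                pmc.PrivateUsage / (1024.0 * 1024.0));
    return 0;
}
```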
168 Comments on Black Ops III: 12 GB RAM and GTX 980 Ti Not Enough
On a side note, anyone fancy a BF4 Conquest game? I just put this new 60Hz server up on PC:
-=THE EAGLES=- 60Hz |No Limits|Conquest 24/7|4FUN|Free 4 All
Anyone from the EU (and surrounding areas) is very welcome to join!
cheers lads! :)
Ridiculous!!
Ultra = the most hardware-taxing visual display a game can muster, optimization be damned. And a $1,000 machine shouldn't be able to handle it, at least not based on the precedents set by games like Doom 3, Quake 4, Crysis, heavily modded Skyrim, etc. You want eye candy? It's going to cost you.
Firstly, MSAA did exist long before 2005. I didn't bother with a very extensive search, but the GeForce 3 back in 2001 had MSAA support. Not to mention the fact that anti-aliasing had been in use long before that. I know my GeForce 2 MX and GeForce 2 Pro had anti-aliasing, which I used extensively even back then, probably around the year 2000.
Secondly, providing superior visual fidelity by not using ANY optimizations is just a stupendous waste of resources if you can barely tell the difference in the end, or even worse, if you can't tell the difference at all! That's why Far Cry used poly-bump mapping, which gives the player the illusion of a 3-million-polygon model when in reality you're only using 3,000 polygons. Sure, some took it a bit too far, which resulted in square-ish heads in stills, but frankly, in-game you rarely noticed it even at such extremes.
And this isn't the only optimization. Megatextures, texture streaming, LOD, tessellation, etc. - all this stuff means you can spend resources on the things that matter and cleverly obscure those that are less important. It's why we can have vast worlds: the engine is cleverly balancing the graphics card's capabilities between things that matter and things that don't. If you spend all your resources on a character you can't even admire properly, you've just wasted resources that could have been spent on 500 trees in the distance. Sure, there are people who say faking something is not the same, but when the faked effect is nearly impossible to distinguish from the reference one, does it even matter at that point?
Besides, the Ultra setting often doesn't mean you're disabling optimizations; it just means you're pushing the settings to start tessellating items sooner and to switch to distant LODs later, which is logically more demanding...
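As a rough illustration of that last point (and not the game's actual implementation), here is a minimal sketch of distance-based LOD selection, where a hypothetical "Ultra" preset simply holds the full-detail mesh out to a longer distance than "Medium" does; the numbers are made up for the example.

```cpp
// Minimal sketch of distance-based LOD selection. The preset distances are
// hypothetical, not taken from Black Ops III or any specific engine.
#include <cstddef>

struct LodPreset {
    float fullDetailDistance;   // use LOD 0 (full mesh) up to this distance
    float lodStepDistance;      // each further step drops one LOD level
};

// Illustrative values only: "Ultra" keeps full detail out to a longer range.
const LodPreset kMedium = { 30.0f, 25.0f };
const LodPreset kUltra  = { 80.0f, 60.0f };

// Pick a mesh LOD index: 0 = full detail, higher = simpler mesh.
std::size_t SelectLod(float distanceToCamera, const LodPreset& preset,
                      std::size_t lodCount)
{
    if (distanceToCamera <= preset.fullDetailDistance)
        return 0;
    std::size_t level =
        1 + static_cast<std::size_t>((distanceToCamera - preset.fullDetailDistance)
                                     / preset.lodStepDistance);
    return level < lodCount ? level : lodCount - 1;
}
```

With these (made-up) presets, an object 70 units away would still render at LOD 0 on "Ultra" but already be at LOD 2 on "Medium", which is exactly why the higher preset costs more GPU time and memory.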
Quincunx Anti-Aliasing is not MSAA
It's a precursor, or rather something NV tried in order to get better AA performance when the hardware couldn't handle it. I do see MSAA as part of the OpenGL 1.5 standard from late 2003, as well as DirectX 9.0c from August 2004. So you are correct that it was around in 2005, but SSAA (game settings just listed it as AA) was still the standard then. Either way, the Ultra setting wasn't just about AA: draw distance/FOV, shadows, HDR, water effects, other animations like arrow trails, bullet effects, and so on.
So thanks for zeroing in on some random points in my post. The point was that Ultra was not, as he described, locked in with an 8x MSAA boost.
Secondly, who says it has to look the same? You? Are you ranting against yourself? As described above, Ultra typically adds a ton of different effects to a game, many of which are quite noticeable, though ideally the game will still look and play fine with them off for the majority of people.
Some people are just fine adding an area rug and calling it a day. Some people pay millions to interior decorators to make their homes look like a palace or the set of their fav sci-fi show.
Ultra mode is for the latter group.
The rest will do fine on Medium or High or better yet their own custom preset of the effects they care about and none that they don't.
But chewing up 12 GB of RAM isn't a feat these days, and as good as a single 980 Ti is, you can still go better (SLI, tri-SLI, SLI TITAN Zs).
Drivers map system memory into the GPU address space, but in W7/8.1 it's a bit broken, as there's no Dynamic Resources or reclaim support.
This means the maximum available is mapped to the GPU... usually equivalent to the VRAM on your card.
It was supposed to be fixed in 8.1, but MS postponed it till W10... no biggie, as Wiz said, with 16 GB it's all good... :)
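For the curious, here is a minimal sketch of how an application can inspect those memory budgets under the newer residency model; it assumes Windows 10 with DXGI 1.4 (IDXGIAdapter3) and, for simplicity, just uses the first adapter enumerated.

```cpp
// Minimal sketch: query the local (VRAM) and non-local (system memory visible
// to the GPU) budgets under the WDDM 2.0 residency model. Requires Windows 10
// and DXGI 1.4; links against dxgi.lib.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter)))   // first adapter, simplified
        return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3)))
        return 1;   // IDXGIAdapter3 (and these budgets) only exists on Windows 10

    DXGI_QUERY_VIDEO_MEMORY_INFO local = {}, nonLocal = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &local);
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_NON_LOCAL, &nonLocal);

    std::printf("Local (VRAM)       budget %.1f GB, in use %.1f GB\n",
                local.Budget / 1073741824.0, local.CurrentUsage / 1073741824.0);
    std::printf("Non-local (sysmem) budget %.1f GB, in use %.1f GB\n",
                nonLocal.Budget / 1073741824.0, nonLocal.CurrentUsage / 1073741824.0);
    return 0;
}
```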
[Screenshots: memory usage in the Beta vs. the Final build]
Scaling in resource requirements should go downwards, to work on less powerful machines (by reducing render elements/quality, etc.). Scaling upwards should always be capped by what the game can offer; go any higher and it's a case of poor optimization. I thought this was fixed with Windows 7 (and D3D10/11)?
but it's not fixed, and never will be in W7... kernel issue..
Here's some info on DX if you want to read ....
msdn.microsoft.com/en-us/library/windows/desktop/dn899121(v=vs.85).aspx
And welcome to TPU, fellow Tennessean. :toast:
Exception being Superfetch imo. :P Never did like it lol
You are right. Running a game from RAM is better, but that still wouldn't justify caching data for segments that won't be needed for minutes or hours to come, or ones that aren't needed any more. What matters is what's being displayed now and what will be needed in the very near future - in other words, what the game "needs". Then it's simply a matter of balancing when to cache newer data and when to scrub older data. I'll take your word for it for now; my experience with programming hasn't reached D3D yet >_>
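As a generic sketch of that balancing act (and not a claim about how this particular engine manages memory), an LRU-style budget is one common way to keep recently needed assets cached and scrub the oldest ones once a limit is hit.

```cpp
// Minimal sketch of "cache newer data, scrub older data": an LRU budget for
// streamed assets. Generic illustration only, not Black Ops III's streaming code.
#include <cstddef>
#include <list>
#include <string>
#include <unordered_map>

class AssetCache {
public:
    explicit AssetCache(std::size_t budgetBytes) : budget_(budgetBytes) {}

    // Mark an asset as needed right now; "load" it if absent, then make it
    // most-recently-used. Evict least-recently-used assets while over budget.
    void Touch(const std::string& assetId, std::size_t sizeBytes) {
        auto it = index_.find(assetId);
        if (it != index_.end()) {
            lru_.splice(lru_.begin(), lru_, it->second);  // move to front
            return;
        }
        lru_.push_front({assetId, sizeBytes});            // cache the new asset
        index_[assetId] = lru_.begin();
        used_ += sizeBytes;
        while (used_ > budget_ && !lru_.empty()) {        // scrub the oldest entries
            const Entry& victim = lru_.back();
            used_ -= victim.size;
            index_.erase(victim.id);
            lru_.pop_back();
        }
    }

private:
    struct Entry { std::string id; std::size_t size; };
    std::list<Entry> lru_;                                // front = most recently used
    std::unordered_map<std::string, std::list<Entry>::iterator> index_;
    std::size_t budget_ = 0;
    std::size_t used_ = 0;
};
```

Real engines layer prediction on top of this (pre-fetching what the player is about to see), but the core trade-off is the same: keep what's needed now or soon, and give back the rest.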
2. Because Batman AK used over 7 GB at launch and now uses around 5.5 GB max. GTA V Online also had memory leak issues, which have since been fixed.
It's a constantly moving finish line, and the natural evolution of things. It's unrealistic to think or hope that requirements will stand still.