# Black Ops III: 12 GB RAM and GTX 980 Ti Not Enough



## btarunr (Nov 5, 2015)

This year's installment in the Call of Duty franchise, Black Ops III, has just hit stores and is predictably flying off shelves. As with every annual release, Black Ops III raises the visual presentation standards for the franchise. There is, however, one hitch with the way the game handles system memory amounts as high as 12 GB and video memory amounts as high as 8 GB. This hitch could be the reason behind the stuttering issues many users are reporting.

In our first play-through of the game at its highest possible settings on our personal gaming machine (equipped with a 2560 x 1600 display, a Core i7 "Haswell" quad-core CPU, 12 GB of RAM, a GeForce GTX 980 Ti graphics card, NVIDIA's latest Black Ops III Game Ready driver 358.87, and Windows 7 64-bit), we noticed that the game was running out of memory. A peek at Task Manager revealed that at "Ultra" settings (and 2560 x 1600 resolution), the game was maxing out our 12 GB of memory, not counting the 1.5-2 GB used up by the OS and essential lightweight tasks (such as antivirus).
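To put those numbers in perspective, here is a quick back-of-the-envelope budget in Python. The 12 GB total and 1.5-2 GB OS overhead are the figures from our test; the helper function itself is purely illustrative:

```python
def game_headroom_gib(total_gib: float, os_overhead_gib: float) -> float:
    """System RAM left over for a game once the OS and background
    tasks (antivirus, etc.) have taken their share."""
    return total_gib - os_overhead_gib

# With 12 GiB installed and 1.5-2 GiB consumed by Windows and
# lightweight background tasks, the game has only ~10-10.5 GiB
# to itself before allocations start failing or paging to disk.
print(game_headroom_gib(12, 2.0))   # 10.0
print(game_headroom_gib(12, 1.5))   # 10.5

# Moving to 16 GiB raises that ceiling to ~14 GiB of headroom.
print(game_headroom_gib(16, 2.0))   # 14.0
```

Seen this way, a game footprint that approaches 10 GB leaves a 12 GB system with essentially no margin, which is consistent with the stutter disappearing once more RAM is installed.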







We also noticed game crashes as early as 10 seconds into gameplay on a machine with 8 GB of system memory and a GTX 980 Ti.




What's even more interesting is the game's video memory behavior. The GTX 980 Ti, with its 6 GB of video memory, developed a noticeable stutter. The stutter disappeared on the GTX TITAN X, with its 12 GB of video memory, on which video memory load shot up from the maxed-out 6 GB of the GTX 980 Ti to 8.4 GB. What's more, system memory usage dropped with the GTX TITAN X, down to 8.3 GB.




On the Steam forums, users report performance issues that don't necessarily point at low FPS (frames per second), but at stuttering, especially at high settings. Perhaps the game needs better memory management. Once we installed 16 GB of RAM in the system, the game ran buttery-smooth with our GTX 980 Ti.

*View at TechPowerUp Main Site*


----------



## TheDeeGee (Nov 5, 2015)

Didn't know the CoD Series was still a thing.


----------



## GhostRyder (Nov 5, 2015)

Are you serious???  There is no way they need that much Vram to run that game realistically...


----------



## Jborg (Nov 5, 2015)

TheDeeGee said:


> Didn't know the CoD Series was still a thing.



It's not really; it's just a money grab at this point.

Definitely not even thinking about picking this game up, especially after seeing this crap lol.

Certainly not upgrading anything in my system to play a new Call of Duty game.

The originals were amazing games. But like I said, it's just a money grab now.

I would like to see a remake of a WW2 game; the only problem is they will never make another Call of Duty with the CoD2 layout. I WANT a custom server browser..... Not some lame matchmaking nonsense.



GhostRyder said:


> Are you serious???  There is no way they need that much Vram to run that game realistically...



Agreed, this is just absurd.


----------



## Szb84 (Nov 5, 2015)

Activision must be jealous of the "attention" WB got with Batman: AK, if they allowed this to be released...


----------



## TheDeeGee (Nov 5, 2015)

Jborg said:


> I would like to see a remake of a WW2 game, only problem is they will never make another call of duty with the COD2 layout. I WANT a custom server browser..... Not some lame matchmaking non-sense.



Ye... the last CoD I played was the very first one.

I remember playing the Demo over and over at the time, was such an epic experience. As bad as WW2 was, it just has an awesome vibe to it.


----------



## RejZoR (Nov 5, 2015)

One thing is pushing the limits; another is just being lazy. Cramming a gazillion textures and polygons into an engine and requiring a supercomputer is just moronic. It just means they put ZERO effort into designing the engine; they just crammed everything into it. That's why Unreal Engine 4 runs on pretty much anything, most likely with the same fidelity as this CoD.

Because if those are the specs, not many people can actually run it. I mean, I have 32 GB of RAM so I don't care, but my graphics card is still "just" 4 GB (GTX 980). If you want to target the broadest possible audience, you can't have specs like this. Or it has to look like nothing we've seen to date. Frankly, I doubt that will be the case...


----------



## buildzoid (Nov 5, 2015)

How on earth did they pull this off? Did they just decide to load the entirety of the game into the RAM and VRAM?

COD BLOPS3: the new Crysis.


----------



## Kissamies (Nov 5, 2015)

What about Windows 10 or even 8.1?

It would be fun to try it with my i3/8GB/GTX670/W10Pro PC. 



buildzoid said:


> COD BLOPS3: the new Crysis.



Crysis wasn't so hungry for RAM/VRAM, just GPU horsepower.


----------



## Deleted member 67555 (Nov 5, 2015)

When I played the beta I did better on an AMD 7850 with an R9 280X
than I did with an i5 3570K with an R9 280X...
I'm not sure why... it stuttered like crazy with the 3570K. I just figured it was a beta bug.


----------



## RejZoR (Nov 5, 2015)

buildzoid said:


> How on earth did they pull this off? Did they just decide to load the entirety of the game into the RAM and VRAM?
> 
> COD BLOPS3: the new Crysis.



But in all honesty, while Crysis is still super demanding, it looked spectacular at the time and is in a way still a benchmark for visual fidelity. Mostly because it's SO old but looks like it was released a year or two ago...


----------



## buildzoid (Nov 5, 2015)

I know, I know. I'm just abusing the "can it run Crysis" joke.

COD BLOPS3: needs more RAM than Chrome!

sorry I'm in a meme mood


----------



## Estaric (Nov 5, 2015)

Why are people calling this the new Crysis? Won't this just be patched in no time, and then it will be just another Call of Duty game?


----------



## Vayra86 (Nov 5, 2015)

Call of Shitty. Avoid like the plague. Fun for consoles, because they lack so many things that PC gamers do have access to. Otherwise...

moving along


----------



## 64K (Nov 5, 2015)

TheDeeGee said:


> Didn't know the CoD Series was still a thing.



Sales of COD games have dropped off since Black Ops 2, which generated $500 million in sales on the first day of release, but the franchise is still pretty strong. Last year COD Advanced Warfare was the best-selling video game.


----------



## peche (Nov 5, 2015)

RejZoR said:


> But in all honesty, while Crysis is still super demanding, it looked spectacular at time and is in a way still a benchmark for visual fidelity. Mostly because it's SO old but looks like it was released a year or two ago...


Crysis never crashed on me... not even with a mid-range video card!


----------



## uuuaaaaaa (Nov 5, 2015)

RejZoR said:


> But in all honesty, while Crysis is still super demanding, it looked spectacular at time and is in a way still a benchmark for visual fidelity. Mostly because it's SO old but looks like it was released a year or two ago...



Use a custom cfg and an HD texture pak and it looks absolutely (even more) epic! Crysis was one of those games that will go down in history for everything it meant in terms of pushing visual fidelity. The game aged pretty well and still looks awesome even by today's standards.


----------



## GreiverBlade (Nov 5, 2015)

Well, wasn't COD: Ghosts RAM-hungry (for the standard at the time)? Around 6 GB needed or it wouldn't run?

That one puts the bar a bit higher though (ABOUT TWICE AS HIGH????  )

Edit... about VRAM... HELL!?!? 6 GB is not enough... hello, 8 GB 390/390X/290/290X...
Well, I have a 980 4 GB, but I am not interested in COD games anymore (although Ghosts was fun when I played it )


----------



## Ferrum Master (Nov 5, 2015)

Holy cow... where do they find the code monkeys? Or are they blind during the testing phase?


----------



## GhostRyder (Nov 5, 2015)

GreiverBlade said:


> well wasn't COD:Ghost RAM hungry (for the standard at the time) around 6gb needed or it wouldn't run?
> 
> that one put the challenge a bit higher tho (ABOUT TWICE HIGHER????  )
> 
> ...


 So basically we are stuck with about 5 video cards with enough video memory to run this game at ultra...


----------



## W1zzard (Nov 5, 2015)

GhostRyder said:


> So basically we are stuck with about 5 video cards with enough video memory to run this game at ultra...



or have 16 GB RAM, which makes GTX 980 Ti work flawlessly


----------



## Ferrum Master (Nov 5, 2015)

W1zzard said:


> or have 16 GB RAM, which makes GTX 980 Ti work flawlessly



Does the game show logos during the intro, like "sponsored by Micron, Nanya or Samsung"?

It would explain a lot.


----------



## flyingpussy (Nov 5, 2015)

activision is now trying to surpass wb


----------



## kiddagoat (Nov 5, 2015)

Yup Call of Dooty.... and I do mean Dooty....  I really wish they would just go back to keeping it simple like with CoD4.... it almost seems like they are the Apple of video games.... they add all these "features" that other games have used and executed better in previous titles but once CoD does it, it is all ohhhhhh mmerrrr gawwwddd melting my face...... etc etc... They are just very lazy.  

 Console kiddies will gobble this up as always as that's the bigger market... but still... this series just needs to go away for awhile to give itself time to reinvent.  They are just snagging and grabbing from other titles now.


----------



## Jborg (Nov 5, 2015)

GhostRyder said:


> So basically we are stuck with about 5 video cards with enough video memory to run this game at ultra...





kiddagoat said:


> Yup Call of Dooty.... and I do mean Dooty....  I really wish they would just go back to keeping it simple like with CoD4.... it almost seems like they are the Apple of video games.... they add all these "features" that other games have used and executed better in previous titles but once CoD does it, it is all ohhhhhh mmerrrr gawwwddd melting my face...... etc etc... They are just very lazy.
> 
> Console kiddies will gobble this up as always as that's the bigger market... but still... this series just needs to go away for awhile to give itself time to reinvent.  They are just snagging and grabbing from other titles now.



I foresee this affecting the sales of this game big time (PC sales; consoles always do well with the kids). I mean, it's already a money-grab series; most people who enjoyed the original ones are not buying the new ones because of how stupid the gameplay now is... Also, I am pretty sure that most people who want this game are younger kids who want to play another Call of Duty run-n-gun, spray-n-pray game.

All these perks/attachments/kill streaks are just lame; you don't need any actual skill to get first place, it's basically who can out-cheese the other player. This is the main reason I stopped playing. The original games actually took teamwork/skill.
If only they would just make a simplistic barebones shooter like the originals with a custom server browser... I would buy it in a heartbeat.


----------



## GhostRyder (Nov 5, 2015)

W1zzard said:


> or have 16 GB RAM, which makes GTX 980 Ti work flawlessly


I thought I read it stutters on the GTX 980 Ti but that the stutter disappears with the TITAN X?

Oh, I missed that bottom comment about 16 GB of RAM and the GTX 980 Ti working fine... OK, so we now have 6 cards that can run it efficiently on Ultra.



Jborg said:


> I foresee this affecting the sales of this game big time.... I mean its already a money grab series, most people who enjoyed the original ones, are not buying the new ones because of how stupid the gameplay now is... Also too, I am pretty sure that most people who want this game will be younger kids who want to play another Call Of Duty run n gun spray n pray game
> All these perks/attachments/killing streaks are just lame, you don't need any actual skill to get first place, its basically who can out cheese the other player, this is the main reason I stopped playing. The original games actually took teamwork/skill
> If only, they would just make a simplistic barebones shooter like the originals with a custom server browser.... I would buy it in a heartbeat.


Eh, yea, I actually like the Black Ops/WaW games just for the zombies personally, as the multiplayer got too boring and pretty easy if you know the right ways to play it (meaning right guns, positions, etc.).


----------



## Vayra86 (Nov 5, 2015)

kiddagoat said:


> this series just needs to go away



Fixed 

A timeout won't help, this is Activision at its finest and they will never give up their holy grail


----------



## dorsetknob (Nov 5, 2015)

Guess I won't be able to play it on my Win ME system and Riva 128 4 MB graphics card then.


----------



## DeNeDe (Nov 5, 2015)

stupid console port =]]


----------



## Jborg (Nov 5, 2015)

GhostRyder said:


> Eh yea, I actually like the Black Ops/WaW games just for the zombies personally as the multiplayer got to boring and pretty easy if you know the right ways to play it (Meaning right guns, positions, etc).



Yeah, I agree. I didn't stop playing after CoD4 MW... I also played WaW and the 1st and 2nd Black Ops a little bit, and I also have Modern Warfare 2... Modern Warfare 2 was when I stopped playing. I didn't actually play the standard multiplayer for MW2; in that game you could still create custom matches, even though there was no server browser.

So it was still possible to have restricted 5v5 S&D and Capture the Flag matches in MW2. I played in a couple of online paid tournaments and actually took 2nd place in an MW2 tournament... After this game, there was no way to create a custom match restricted to specific weapons/perks/attachments. This is basically when the COD competitive community died for good. Back in vCoD/CoD2/CoD4 there was such a massive competitive community, with many paid LANs around the world. Hell, CoD2 was in the WSVG in 2007; if anybody wants to see the best players go at it, YouTube search "WSVG CoD2" and watch teams like TEK-9 and Team Pandemic... the level of their gameplay is just


----------



## natr0n (Nov 5, 2015)

Whatever it takes for victory.

In this case mad ram and beast gpu.


----------



## RCoon (Nov 5, 2015)

Probably did this on purpose for press. Bad press is still press.


----------



## Slizzo (Nov 5, 2015)

CoD doesn't need any press, it will sell a lot of copies regardless.

During the beta I had no issues with performance; it ran maxed out. However, I do have 16 GB of RAM.


----------



## KarymidoN (Nov 5, 2015)

Ferrum Master said:


> Does the game during intro show some logos like sponsored by Micron, Nanya or Samsung?
> 
> It would explain a lot .



Also by NVIDIA: TITAN X 12 GB VRAM



GhostRyder said:


> I thought I read it stutters on the GTX 980ti but that disappears with the Titan X?
> 
> Oh I missed that bottom comment about 16gb of ram and the GTX 980ti working fine...  Ok so we now have 6 Cards that can run it efficiently on Ultra.
> 
> ...



That's right.


----------



## matar (Nov 5, 2015)

OMG, this reminds me of Crysis in 2007, when I had an Intel Core 2 Quad Q6700, XFX GTS 8800 640 MB in SLI, and 8 GB of RAM. I ran the game at full settings @1680x1050 and got like 1 or 2 FPS.

Looks like this will happen again. Thanks, but not buying it, because back then I spent $1,500 on three 8800 GTX cards in 3-way SLI to run Crysis, right when NVIDIA first introduced 3-way SLI support.


----------



## NoRest4Wicked1 (Nov 5, 2015)

I think the last Activision game I purchased was Pitfall for the Atari 2600.  I'm done with pre-ordering games from all studios since they pretty much all seem to think it's acceptable to release broken crap and attempt to fix it afterwards.  I don't buy any new games until at least a few weeks after launch to judge whether or not it's worth my time so soon after release.  If it's in crappy condition a few weeks after release, I wait until it's worth my time.  There are some games that will never be in good enough condition to bother with.  (I'm looking at you, Batman).  I'm really looking forward to Fallout 4 but even with Bethesda (which IMO is a fairly solid game studio) behind it, I'm still waiting the aforementioned few weeks.  I won't allow myself to buy it until at least the middle of December.


----------



## truth teller (Nov 5, 2015)

this is call of doody blac kops 4-1, released in november 2015





this is unreal tournament 3, released in november 2007

one was released 8 years ago and works perfectly fine on 2 GB of RAM and 0.5 GB of VRAM
the other is a pile of coder vomit that requires 8 GB on a professional graphics card paired with 12 GB of RAM to minimally function, at least until the next crash

if your kid asks for this for christmas you are failing at parenting


----------



## DeNeDe (Nov 5, 2015)

GOD help us when Candy Crush needs 12 GB or more of RAM/VRAM (they just acquired the company behind the game)


----------



## ASOT (Nov 5, 2015)

What a joke )) Don't believe that.


----------



## Sasqui (Nov 5, 2015)

Small memory leak?


----------



## Kissamies (Nov 5, 2015)

Well, one solution is just not to run it with max details..

Also, as said in previous posts, it would be a miracle if they don't fix this issue with a patch.


----------



## 64K (Nov 5, 2015)

truth teller said:


> if your kid asks for this for christmas you are failing at parenting



It's not just the COD base games that Activision makes a killing with. They had something like 40 DLCs for COD Advanced Warfare. lol 40 damn DLCs!


----------



## P4-630 (Nov 5, 2015)

Let's hope Bethesda's Fallout 4 does better


----------



## The Terrible Puddle (Nov 5, 2015)

Had stutter in the beta too. Fixed it by setting the application to highest priority in Task Manager.


----------



## rooivalk (Nov 5, 2015)

Ha! who said 8GB VRAM on 390 useless xD


----------



## HumanSmoke (Nov 5, 2015)

Ferrum Master said:


> Holy cow... where they do find the coder monkeys? Or are they blind during testing phase?


Coding specifically for a niche genre. For the masses there are First Person Shooters..... for the elite studios there is First Person Shoot Own Foot.


----------



## n-ster (Nov 5, 2015)

An interesting test would be how it behaves with the Fury and its HBM.


----------



## GoldenX (Nov 5, 2015)

And you can play MGS V on an A4-4000 APU + 4GB of RAM (1GB for integrated GPU) without any performance problem.

Do they even code, or are they using RPG Maker?


----------



## yogurt_21 (Nov 5, 2015)

Crysis finally replaced.

yeah but can it play COD Black Ops 3 at 4k?


----------



## ZeppMan217 (Nov 5, 2015)

matar said:


> OMG this reminds me in 2007 Crysis when I had intel core 2 quad q6700 XFX GTS 8800 640MB in SLi and 8GB i ran the game at full setting @1680x1050 and I got like 1 or 2 fps
> 
> looks like this will happen again thanks not buying it because I spend then to run Crysis $1500 on 3 8800GTX in 3-way sli and that's when nVidia first introduced 3 -way sli support.


Crysis came out in 2007 and was unique in many ways. It's 2015; BLOPS3 is not breaking any ground, aside from sales.


----------



## Shihab (Nov 5, 2015)

truth teller said:


> this is call of doody blac kops 4-1, released in november 2015
> 
> 
> 
> ...



And why are you comparing these two, exactly?
The game may be ridiculously unoptimized; that doesn't mean it has 2007 graphics.


----------



## ElNiko (Nov 5, 2015)

I remember playing COD MW2 on a Pentium 4, 1 GB of DDR, and my first dedicated video card, an HD 5450. That P4, though, melted three 4-pin 12 V PSU connectors...

MW3 started stuttering if maxed out anyway, so now that I've got my i5-2500/8 GB/HD 7870 and thought I could run almost everything on my old 720p screen, this comes along... F*ck you, Activision! Down here in Argentina things cost triple what they do there!


----------



## hhumas (Nov 5, 2015)

wtf, COD BO III. First flop of Black Ops.


----------



## KarymidoN (Nov 5, 2015)

rooivalk said:


> Ha! who said 8GB VRAM on 390 useless xD






n-ster said:


> An interesting test to do is how it behaves with Fury and its HBM


Probably won't be better. From what has been shown so far, this game needs a lot of memory, not faster memory or greater bandwidth (as with HBM). The TITAN X (12 GB) and R9 390 (8 GB) should be good.


----------



## Serpent of Darkness (Nov 5, 2015)

Szb84 said:


> Activision must be jealous to the "attention" what WB got with Batman: AK if they allowed to release this...



WB recently posted that they can't fix the issues found in Batman: Arkham Knight. So whether this "might" translate to increased sales because of the publicity, or stand as an acknowledgement of failure, I honestly don't think it's going to help WB in the least. The same could be said for Black Ops 3. The issue could very well be a memory leak of some type. Others have mentioned Crysis 3: on its release day, it needed a patch because it was stuttering on Ultra or Very High settings, and the patch fixed it. So this could be the typical release-day hiccup that a lot of games experience.




GhostRyder said:


> So basically we are stuck with about 5 video cards with enough video memory to run this game at ultra...



I fail to see the relevant point.  We are getting to a point where games are being pushed to the extremes.  The "wishes" of many PC gaming enthusiasts are coming true: hardware fully utilized for the best gaming experience that could possibly be provided.  When the masses get what they want, they cry at the cost that comes with it.  Imagine how much TPU and others are going to cry when Star Citizen is released.  How many discrete graphics cards are you going to count when you have to render ships and cockpits that use more than 7 digits' worth of points, not including baked high-res texture maps, not including displacement and AO maps, just so you get the eye-candy experience you've been demanding?  I find it irrelevant because if we want better PC games, and I'm not talking about 100% functioning games, we have to pay for it not with money alone.  We have to pay for it in our systems to process the work necessary to play them.  The money we invest in our systems determines in some proportion the level of experience we'll get as an output.  If Black Ops 3 requires 16 GB of system RAM just to push higher-resolution textures, particle effects, shadows, and other crap at decent FPS, then either deal with it or don't.  Just take into account that this is the future... We've lived in a PC era where the demands weren't high, and our systems could easily over-kill the requirements.  Now that we want more, and we are getting more, the over-kill factor is slowly shrinking.




Estaric said:


> why are people calling this the new crysis? Wont this just be patched in no time and then it will be just another call of duty game.



They are calling it the new Crysis because of its consumption.  It requires around 16 GB of system RAM to run on the highest settings, and it uses more than 6 GB of VRAM on the graphics card just to push the highest level of graphics and play.  Simple answer: it's pushing the bar.  Basically, it has a lot of information being stored on both the GPU and CPU side.  Some are saying this could be memory leaks, and I can see where they are coming from.  I suspect higher-resolution baked textures are one thing.  Another has less to do with the game engine itself and more to do with the level of detail on the models in the game.  Highly detailed models equate to high use of polygons.  High use of polygons per model equates to higher system demands and memory usage just to store the information.  Add multiple high-poly models in a scene and you increase the demands on the CPU and GPU side.  A lot of players don't realize that on the 3D side of the spectrum, a lot of games use lower-poly quad models for characters.  Take for example Planetside 2.  We are probably looking at roughly 7,000-point character models, per player, times 50 to 100 players in a given region, not including the terrain in the area or the buildings (assuming, of course, you're looking at them so they're being rendered), plus a bunch of other crap like explosions, tanks, flying vehicles, tracers, etc.  That's what made the game so demanding...  I know for a fact that one of the spaceships from Star Citizen, I think it's the Mustang, had a poly count of over 100,000 points.  It could be more than that.  Typically, if you want to add more detail to your models, you increase the point count.  For PC games, you want the point count to be low so you can render the other 3D objects in the scene faster.
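As a rough sanity check on the geometry side of that argument, here is a sketch of how much raw vertex data such poly counts actually occupy. The interleaved position/normal/UV layout is an assumption; real engines add index buffers, tangents, skinning weights, and LODs on top:

```python
def mesh_vertex_bytes(vertex_count: int,
                      position_bytes: int = 12,   # 3 x 32-bit floats
                      normal_bytes: int = 12,     # 3 x 32-bit floats
                      uv_bytes: int = 8) -> int:  # 2 x 32-bit floats
    """Raw vertex-buffer size for one mesh under a minimal,
    assumed interleaved layout (no indices, no tangents)."""
    return vertex_count * (position_bytes + normal_bytes + uv_bytes)

# A 100,000-point ship model, as in the Star Citizen example:
print(mesh_vertex_bytes(100_000))        # 3200000 bytes, ~3 MB

# 100 Planetside 2-style ~7,000-point characters in one region:
print(100 * mesh_vertex_bytes(7_000))    # 22400000 bytes, ~22 MB
```

Even under these generous assumptions, raw geometry measures in single-digit to tens of megabytes, which suggests that multi-gigabyte footprints come mostly from textures and render targets rather than polygon data alone.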


----------



## ZeppMan217 (Nov 5, 2015)

Serpent of Darkness said:


> Simple answer, it's pushing the bar.


Does it? Crysis had unmatched visual fidelity in 2007; what's so special about BLOPS3?


----------



## chr0nos (Nov 5, 2015)

This is just lazy coding at worst (or best, IYKWIM )

People wanted games that used the most resources; that doesn't mean they have to use them wisely.


----------



## GhostRyder (Nov 5, 2015)

Serpent of Darkness said:


> I fail to see the relevant point.  We are getting to a point where games are being pushed to the extremes.  The cries of many PC Gaming Enthusiast "wishes" are coming true to fully utilize hardware for the best gaming experience that could possibly be provided.  When the masses get what they want, they cry at the cost that comes with it.  Imagine how much TPU and others are going to cry when Star Citizens is released.  How many discrete graphic cards are you going to count when you have to render ships and cockpits that use more than 7 digits worth for points, not including baked high-res texture maps, not including displacements and AO maps, just so you get the eye-candy experience you've been demanding?  I find it irrelevant because if we want better PC Games, and I'm not talking about 100% functioning games, we have to pay for it not with money alone.  We have to pay for it in our systems to process the work necessary to play them.  The money we invest in our systems determines in some proportion the level of experience we'll get as an output.  If Black OPs 3 requires 16 GBs CPU Framebuffer just to push higher resolutions of texture, particle effects, shadows, and other crap at decent FPS, then either deal with it or don't.  Just take into account that this is the future... We've lived in a PC Era were the demands weren't high, and our systems could easily over-kill on the requirements.  Now that we want more, and we are getting more, the over-kill factor is slowly shrinking.


What exactly are you implying with that comment???

First of all, yes, pushing the envelope requires that technology advance, and it is a good thing when done correctly, as it delivers a next-level experience to gamers, which in turn requires more hardware to keep up.  However, this is not one of those cases, and as we have all seen from the screenshots and betas for this game, it's not anything to write home about graphics-wise.  When we compare games like GTA 5, The Witcher 3, or any recent AAA title, and compare their performance and requirements to this, it becomes pretty obvious how ridiculous the requirements are.

If this game were delivering breathtaking visuals, AI that is smarter than average, and/or set pieces bigger than what we have seen before, that would be a different story.  What this is, is either a bug in the system (likely a memory leak), lazy coding, or something else...



ZeppMan217 said:


> Does it? Crysis had unmatched visual fidelity in 2007, what's so special about BLOPS3?


^Bingo


----------



## Filip Georgievski (Nov 5, 2015)

First of all, I've played (almost) all of the COD series (single and multiplayer).
You can see my config in my details.
I have never had problems with COD (not even with Advanced Warfare running on high settings).
I think this is too much for Activision to ask, as not all of us have the money to afford those parts (TITAN X 12 GB = 500 EUR+).
My opinion is poor coding, since I did play Crysis 3 on medium to high with this config and it does 50 FPS tops, 35 FPS min.


----------



## Ubersonic (Nov 5, 2015)

A CoD game that's really, really badly coded? Whatever next...


----------



## Filip Georgievski (Nov 5, 2015)

Just the case of old engine, new game.....

You all remember Mafia 2, right? Of course you do...
Toughest game ever for most PCs, besides Crysis 3...
It was released in 2011 as I recall (correct me if I'm wrong).
Mafia 3 is scheduled to be released in 2016.
5 years of working on a game? Even a Commodore 64 would be able to run this game with a frame or 2.
COD AW - 2014 to COD BO3 - 2015.
Really? Just 1 year to get the game out?


----------



## Bytales (Nov 5, 2015)

truth teller said:


> this is call of doody blac kops 4-1, released in november 2015
> 
> 
> 
> ...


 
LOOOOL, "coder vomit", you couldn't have said it better if you wanted.
They probably tested on their server with 256 GB of RAM and 16/12 GB FirePro/Quadro cards. Oh look, it works flawlessly; that means it's ready, let's start selling it.

CoD-(Er Vo-Mit) looooool


----------



## 64K (Nov 5, 2015)

Filip Georgievski said:


> COD AW - 2014 to COD BO3 - 2015
> Really? Just 1 year to get the game out?



There were two different developers for those games published by Activision. The developer Treyarch spent a couple of years making Black Ops 3.


----------



## yogurt_21 (Nov 5, 2015)

Filip Georgievski said:


> Just the case of old engine, new game.....
> 
> You all remember Mafia 2 right? Of course you do...
> Toughest game ever for most PCs, beside Crysis 3....
> ...



Multiple teams typically work on the series, more than likely taking far longer than a year on each game. Black Ops 2 was released in 2012; I'd imagine this is the same group that worked on that game.

Also, we are talking Ultra, which used to be a setting reserved for only the highest graphics solutions and CPU/memory setups.

A 980 Ti and 12 GB of memory is nicer than most setups, but it's not a Haswell-E rig with 128 GB of memory and 980 Ti SLI.

I seem to remember Doom 3 on Ultra at 2048x1536 being beyond my rig's capabilities, and also Quake 4 on Ultra.

I remember a 7800 GTX not being enough even in SLI as you ran out of VRAM, and a single X1800 XT being laughed off as having enough VRAM but not enough ROPs, so X1800 XT CrossFire with that annoying master card was the only thing that could run it, along with an Athlon 64 clocked to over 3 GHz (the FX-57 was the fastest at stock at 2.8 GHz, and even it struggled). This was the single-core days, when 2 GB kits ruled the roost and the game really wanted 4 GB.

The 7800 GTX 512 MB came out in limited quantities; it made Ultra playable at 1080p, but you still needed SLI for 2048x1536. Really, Ultra on these games didn't become playable until the next series of graphics cards and CPUs. By then you had dual-core CPUs, 4 GB kits of DDR2-800 (as opposed to DDR-400), and the X1900 XTX and 7900 GTX. Even then, at 2048x1536 and Ultra, you had to CrossFire or SLI.

Lately, Ultra = a $400 graphics card + a $500 CPU/mem/mobo combination. That's not Ultra; that's medium at best.


----------



## iSkylaker (Nov 5, 2015)

Well, isn't that obvious? This is what happens when you try to push things above the average. Isn't 1080p @ 120 FPS enough?

Tomorrow, I'll bet, people will tell me 8 GB of RAM isn't enough for gaming... :rolleye:


----------



## iSkylaker (Nov 5, 2015)

yogurt_21 said:


> Lately Ultra = 400$ graphics card + 500$ cpu/mem/mobo combination. That's not Ultra, that's medium at best.


You couldn't be more wrong with that statement. Ultra doesn't necessarily mean having 8x MSAA, which is what taxes a game's performance the most when you select the "Ultra" preset. I'm pretty sure sample quality isn't part of a developer's goal when targeting the final visual results in a video game; anything above 2x MSAA, heck, even any post-processing anti-aliasing technique, is pretty much a complement for more sharpness in the image.


----------



## Parn (Nov 5, 2015)

This kind of VRAM requirement is simply absurd.

If they do a survey on Steam, they will find that 980 Ti owners are in the minority, let alone TITAN X owners. And let's not forget the Fury X, which is the top dog from AMD but is only equipped with 4 GB of VRAM. If the 980 Ti and Fury X cannot even run the game smoothly, guess how many average gamers with a 970 or 290 are going to buy this game?


----------



## rtwjunkie (Nov 5, 2015)

Ahem.....I've been saying for the last year, stop advising new PC builders they only need 8GB of RAM.  16GB will quickly become the new 8.  But everyone keeps laughing at me on here when I say it.


----------



## iSkylaker (Nov 5, 2015)

Parn said:


> This kind of VRAM requirement is simply absurd.
> 
> If they do a survey on Steam, they will find that 980 Ti owners are in the minority, let alone TITAN X owners. And let's not forget the Fury X, which is the top dog from AMD but is only equipped with 4 GB of VRAM. If the 980 Ti and Fury X cannot even run the game smoothly, guess how many average gamers with a 970 or 290 are going to buy this game?



It's not like it's a VRAM requirement; it's probably that the game is not optimized to scale well at those resolutions. I assume the "Ultra" preset also includes some taxing AA technique like MSAA with a high sample count, and the render buffer can be four or eight times bigger depending on the sample count.
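The sample-count math is easy to sanity-check. A minimal sketch (my own illustration, not anything from the game's renderer), assuming a 32-bit color buffer and ignoring depth/stencil and extra render targets:

```python
def framebuffer_mb(width, height, samples, bytes_per_pixel=4):
    """Rough size of a single MSAA color buffer in MiB.

    Before the resolve step, each pixel stores `samples` color values,
    so the buffer grows linearly with the sample count.
    """
    return width * height * samples * bytes_per_pixel / (1024 ** 2)

# The review rig's 2560x1600, color buffer only:
no_aa = framebuffer_mb(2560, 1600, 1)   # 15.625 MiB
msaa8 = framebuffer_mb(2560, 1600, 8)   # 125.0 MiB, 8x bigger
```

So even at 8x the render targets are nowhere near gigabytes; multi-GB usage has to come mostly from textures, streaming buffers, and however many copies the engine keeps around.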


----------



## iSkylaker (Nov 5, 2015)

rtwjunkie said:


> Ahem.....I've been saying for the last year, stop advising new PC builders they only need 8GB of RAM.  16GB will quickly become the new 8.  But everyone keeps laughing at me on here when I say it.


It didn't take long, it seems, like 20 minutes. Check my comments above... and yeah, I'm currently laughing at you.


----------



## rtwjunkie (Nov 5, 2015)

iSkylaker said:


> It didn't take long, it seems, like 20 minutes. Check my comments above... and yeah, I'm currently laughing at you.



Sorry, I couldn't wait for my "I told you so" as soon as I read the first post.  And I'm on record in these forums for the last year advising people 8GB is not "enough" anymore.

Also, the moderators frown deeply on double and triple posting.


----------



## iSkylaker (Nov 5, 2015)

rtwjunkie said:


> Sorry, I couldn't wait for my "I told you so" as soon as I read the first post.  And I'm on record in these forums for the last year advising people 8GB is not "enough" anymore.
> 
> Also, the moderators frown deeply on double and triple posting.


To be fair and with no offence, I'm afraid you will keep telling people that for the next 2 years.

And sorry for multiple posting.


----------



## Basard (Nov 5, 2015)

W1zzard said:


> or have 16 GB RAM, which makes GTX 980 Ti work flawlessly



So how much system RAM do I need if my card is only 1280MB?


----------



## rtwjunkie (Nov 5, 2015)

iSkylaker said:


> To be fair and with no offence, I'm afraid you will keep telling people that for the next 2 years.
> 
> And sorry for multiple posting.



You are likely right.  Sooner or later though, people will think the sky is falling.  Then all those that haven't upgraded to Skylake or higher will be scrambling to buy the dwindling stockpiles of DDR3.  The two combined will make for a very expensive upgrade.

But...people tend to get set in their ways.

About the postings, just trying to help.   Hopefully it will save you some grief later.


----------



## InhaleOblivion (Nov 6, 2015)

This is just facepalm inducing.  Oh well another game to avoid at launch.


----------



## Dieinafire (Nov 6, 2015)

The only reason you need that much RAM is you're using heavy AA and supersampling.


----------



## Aquinus (Nov 6, 2015)

rtwjunkie said:


> Sorry, I couldn't wait for my "I told you so" as soon as I read the first post. And I'm on record in these forums for the last year advising people 8GB is not "enough" anymore.


Regardless of how badly coded it is, do people really expect VRAM use to stay under 4 GB forever? It didn't stay under 512 MB, 1 GB, or 2 GB, so why should this be any different? Even if everything is cleaned up, the question is not whether it will fill 8 GB or 12 GB of VRAM, but rather whether it will still exceed 4 GB. I suspect in the next year or two we'll see more and more titles use more and more VRAM now that it's becoming more common, just as more games are eating up more system memory and pushing usage past 8 GB.


Dieinafire said:


> The only reason you need that much RAM is you're using heavy AA and supersampling.


AA doesn't tend to increase memory usage by a huge amount, as it's really just post-processing the already-rendered frame after rasterization. Resolution and the quality and quantity of textures have a bigger impact on memory usage.
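To put rough numbers on the texture side, here is a back-of-envelope sketch I'm adding for illustration (generic sizes, nothing taken from the game's actual assets):

```python
def texture_mb(width, height, bytes_per_texel=4, mipmaps=True):
    """Approximate VRAM footprint of one uncompressed RGBA texture in MiB.

    A full mip chain adds roughly one third on top of the base level
    (the geometric series 1 + 1/4 + 1/16 + ...). Block compression
    (DXT/BC) would cut these numbers by 4-8x.
    """
    size = width * height * bytes_per_texel
    if mipmaps:
        size = size * 4 // 3
    return size / (1024 ** 2)

# One uncompressed 4096x4096 texture with mips is ~85 MiB,
# so a few dozen of them dwarf any AA buffer.
```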


----------



## rtwjunkie (Nov 6, 2015)

@Aquinus I noticed you were saying VRAM throughout....were you meaning RAM on the larger numbers?

This game actually is a double-edged sword.  Based on W1z and bta comments, it seemed to take however much VRAM was presented to it, so I suspect 4GB VRAM cards will still be ok. 

The other edge of the sword was the actual amount of system RAM needed, which is what I was commenting on.


----------



## Aquinus (Nov 6, 2015)

rtwjunkie said:


> @Aquinus I noticed you were saying VRAM throughout....were you meaning RAM on the larger numbers?
> 
> This game actually is a double-edged sword.  Based on W1z and bta comments, it seemed to take however much VRAM was presented to it, so I suspect 4GB VRAM cards will still be ok.
> 
> The other edge of the sword was the actual amount of system RAM needed, which is what I was commenting on.


Both, it's unrealistic to expect one to grow and the other not to. All memory is the same, if you use more than you have, performance starts suffering very quickly. It's why I hate swap space and the page file. If I can't run everything in memory, then I need to upgrade is the way I look at it. I'm not one to close everything I have open just because I want to play a game and I don't want swap space to hide that fact from me.


----------



## mab1376 (Nov 6, 2015)

Aquinus said:


> Both, it's unrealistic to expect one to grow and the other not to. All memory is the same, if you use more than you have, performance starts suffering very quickly. It's why I hate swap space and the page file. If I can't run everything in memory, then I need to upgrade is the way I look at it. I'm not one to close everything I have open just because I want to play a game and I don't want swap space to hide that fact from me.



+1 to that. I have 18 GB of RAM, and my next PC will probably have 24 GB, since I build one every 4-5 years.


----------



## ThE_MaD_ShOt (Nov 6, 2015)

kiddagoat said:


> Console kiddies will gobble this up as always as that's the bigger market... but still... this series just needs to go away for awhile to give itself time to reinvent.  They are just snagging and grabbing from other titles now.





Jborg said:


> consoles always do well with the kidz)




If you have a PS3 or Xbox 360, you're not going to be too happy when you find out you got a severely cut-down version.

The last generation (Xbox 360 + PS3) versions of Black Ops III *will not contain*...

1. DLC support (no map packs, DLC weapons, new Specialists, Zombie maps, etc.)
2. Ground War
3. Paintshop  (Camo Creator/Editor)
4. Campaign Mode

Just thought I would add that.


----------



## flmatter (Nov 6, 2015)

So I wonder how well it will run on a 4K monitor, with 16 GB of system RAM and an R9 390 with 8 GB?


----------



## AlwaysHope (Nov 6, 2015)

Aquinus said:


> Both, it's unrealistic to expect one to grow and the other not to. All memory is the same, if you use more than you have, performance starts suffering very quickly. It's why I hate swap space and the page file. If I can't run everything in memory, then I need to upgrade is the way I look at it. I'm not one to close everything I have open just because I want to play a game and I don't want swap space to hide that fact from me.



Have to agree with this... +1, besides that, RAM is so cheap these days anyway..


----------



## arbiter (Nov 6, 2015)

iSkylaker said:


> You couldn't be more wrong with that statement. Ultra doesn't necessarily mean 8x MSAA, which is what taxes a game's performance the most when you select its "Ultra" preset. I'm pretty sure sample quality isn't part of a developer's goal when targeting the final visual result of a video game; anything above 2x MSAA, heck, even any post-processing anti-aliasing technique, is pretty much just a complement for extra sharpness in the image.


Yeah, you probably could use the Ultra preset on a 4 or 6 GB card if you turned off AA, which the High and Ultra presets generally do use. I personally don't care about jagged edges, since I'm rarely standing around looking at them. They're the sort of thing I barely notice in-game.


----------



## Uplink10 (Nov 6, 2015)

RejZoR said:


> But in all honesty, while Crysis is still super demanding, it looked spectacular at time and is in a way still a benchmark for visual fidelity. Mostly because it's SO old but looks like it was released a year or two ago...


I played all three Crysis games with low-end graphics cards, and they all worked great and looked stunning. The Crysis series should be held up as an example, as it can be played on a wide range of systems and looks relatively great on all of them.



Aquinus said:


> It's why I hate swap space and the page file. If I can't run everything in memory, then I need to upgrade is the way I look at it.


Windows has a lot to do with that. I have 8 GB of RAM, and when I am using only 3 GB or 4 GB I keep seeing hard faults, which is kind of weird, since with so much RAM available this should not be happening.

Maybe Windows is also guilty of this. A while back I was constantly getting "Close programs to prevent information loss" after leaving Firefox or some other program open for a few hours. I was getting this on Windows 8.1, but it never happened to me on Windows 7, well, except when I formatted an external USB drive and Explorer started eating all the memory because of a memory leak (thank you Microsucks for not fixing that, idiots).


----------



## Tsukiyomi91 (Nov 6, 2015)

Gonna run this game on High @ 1080p & see if it still eats VRAM & system RAM or not.


----------



## Xzibit (Nov 6, 2015)

Tsukiyomi91 said:


> Gonna run this game on High @ 1080p & see if it still eats VRAM & system RAM or not.



Well, if you're on a 970, NVIDIA recommends sticking to 1080p @ High settings.



			
Nvidia said:

> However, if you’re looking to gear up for Black Ops III and maximize your experience for 60 FPS and *high graphics settings*, we’ve got a set of recommended NVIDIA GPUs for you.



			
Nvidia said:

> For _Black Ops III_, we're recommending the GeForce GTX 970, which will enable you to experience the fast-paced shooter at 60 FPS at 1920x1080, at a *high level of detail*.



All the recommended GPUs @ resolution are for *HIGH SETTINGS*. They aren't recommending anything past that.


----------



## Slizzo (Nov 6, 2015)

64K said:


> There were two different developers for those games published by Activision. The developer Treyarch spent a couple of years making Black Ops 3.



They actually had 3 years to develop the game, as there are now 3 developers making Call of Duty games.


----------



## Easo (Nov 6, 2015)

"Optimisation".


----------



## Ikaruga (Nov 6, 2015)

I'm really confused about what is happening in this thread.

*1,* Any decent application or operating system will use as many free resources as it can if doing so makes things faster. This is the same story as 2 GB vs. 4 GB video cards, where a game that uses 3 GB on a 4 GB card runs 99.9% the same as on a 2 GB card.

*2,* This game might have a "simple" memory leak bug, so perhaps drawing conclusions without proof is questionable.

Personal opinion: the last time COD was good was when it ran on id Tech.


----------



## SNM (Nov 6, 2015)

In short, I cannot play this title on my rig..... and well, I love the CoD: BO series...


----------



## RejZoR (Nov 6, 2015)

Hell, even original Far Cry is


Xzibit said:


> Well, if you're on a 970, NVIDIA recommends sticking to 1080p @ High settings.
> 
> 
> 
> ...



Screw "high" setting. I always use the highest one. Which i think is Very High or Ultra...


----------



## hapkiman (Nov 6, 2015)

Sounds like poor coding/optimization, and memory leaking.


----------



## SIGSEGV (Nov 6, 2015)

Another crappy game story. Well, it's meant to be played, isn't it? lol


----------



## Xzibit (Nov 6, 2015)

hapkiman said:


> Sounds like poor coding/optimization, and memory leaking.








Final benchmarks should be available soon for a comparison.


----------



## Filip Georgievski (Nov 6, 2015)

Sounds to me like an Nvidia based game.....


----------



## Vlada011 (Nov 6, 2015)

Only rare smart people saw the trap back when new GTX 980 Ti customers laughed at TITAN X owners, because for $300 less they got similar performance.
The TITAN X will remain useful long after the 980 Ti becomes limited by its video memory.
This is especially bad for people who plan to buy one card first, and a second one later when the price drops.
By autumn 2016, GTX 980 Ti owners will be in the same position GTX 780 Ti owners are in now.
The worst thing to tell someone who bought such an expensive card is to disable this filter or that setting because it uses too much video memory.
People who pay $600-700 don't want to do such things. Once some time passes, 980 Ti owners will be sorry they didn't save another $200-250 for a TITAN X, especially those who paid for a custom GTX 980 Ti. Of course, the Fury X will become a stronger card than the GTX 980 Ti within several months of NVIDIA launching Pascal, and maybe even as strong as the TITAN X, the same way the R9 290X became better than Kepler in some situations even though it was an obviously inferior card at launch.
NVIDIA, unlike AMD, doesn't want to keep improving the performance of its most expensive cards so that someone who paid 1000 euro for a card can install a driver after two years and see improvements in new games. No, NVIDIA only improves performance for about 12 months; after a new series shows up, there are good indications that they even try to sabotage the performance of older cards, while AMD, which improves constantly, suddenly reaches similar performance.


----------



## john_ (Nov 6, 2015)

So, only 390/X and Titan X have enough VRAM for this one?


----------



## vega22 (Nov 6, 2015)

Any plans for rerunning the tests with an AMD GPU for comparison?


----------



## las (Nov 6, 2015)

TheDeeGee said:


> Didn't know the CoD Series was still a thing.



So you didn't know the best-selling games were still a thing? Strange.


----------



## Aquinus (Nov 6, 2015)

Uplink10 said:


> Windows has a lot to do with that. I have 8 GB of RAM, and when I am using only 3 GB or 4 GB I keep seeing hard faults, which is kind of weird, since with so much RAM available this should not be happening.
> 
> Maybe Windows is also guilty of this. A while back I was constantly getting "Close programs to prevent information loss" after leaving Firefox or some other program open for a few hours. I was getting this on Windows 8.1, but it never happened to me on Windows 7, well, except when I formatted an external USB drive and Explorer started eating all the memory because of a memory leak (thank you Microsucks for not fixing that, idiots).


A hard fault is any time the "page file" is accessed, so you can be idling and Windows will decide to move stuff out of memory to make more room. You can disable the page file and still get hard faults; the difference is that those faults aren't hitting a real drive but, rather, essentially a RAM disk instead of a real disk.

You don't need to fill system memory in Windows for it to start hard faulting; it just does it a lot more often when you start running out of space. Another reason why I run without a page file.
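For anyone who wants to watch this on their own box: Unix-likes expose the same counters per process (a small sketch using the standard `resource` module; Windows calls a major fault a "hard fault" and shows it in Resource Monitor instead):

```python
import resource

def fault_counts():
    """Return (minor, major) page-fault counts for this process.

    A major fault means the page had to be fetched from disk (swap,
    the page file, or a memory-mapped file); a minor fault was
    satisfied from RAM alone, e.g. by remapping an already-cached page.
    """
    usage = resource.getrusage(resource.RUSAGE_SELF)
    return usage.ru_minflt, usage.ru_majflt

minor, major = fault_counts()
```

Watching the major count climb while RAM is half empty is exactly the behavior being described above.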


----------



## wickedcricket (Nov 6, 2015)

Huh?

On side note, anyone fancy a BF4 Conquest game? I just put this new 60Hz server up on PC:

*-=THE EAGLES=- 60Hz |No Limits|Conquest 24/7|4FUN|Free 4 All*

Anyone from the EU (and surrounding areas) is very welcome to join! 

cheers lads!


----------



## Prima.Vera (Nov 6, 2015)

It's awesome! Let's have some uncompressed textures, put them inside the game and be done with it. No wonder the game is more than 50GB....
Ridiculous!!


----------



## LiveOrDie (Nov 6, 2015)

Must be the shitty old engine code they're still using. They should have just paid Unreal 5% and used Unreal.


----------



## Tsukiyomi91 (Nov 6, 2015)

@Xzibit Will do on my main rig (Rig 1), and I'll also see if there's any impact on Rig 2. I'll run without those anti-aliasing methods to gauge the performance impact.


----------



## xorbe (Nov 6, 2015)

Latest game stresses latest hardware on top-shelf "ultra" benchmark settings.  News at 11.


----------



## Legacy-ZA (Nov 6, 2015)

Well, we had better see some killer number-crunching graphics cards from NVIDIA or AMD. It would seem most of the latest game titles are very demanding. I am sure Fallout 4 and Deus Ex: Mankind Divided will be, too.


----------



## yogurt_21 (Nov 6, 2015)

iSkylaker said:


> You couldn't be more wrong with that statement. Ultra doesn't necessarily mean 8x MSAA, which is what taxes a game's performance the most when you select its "Ultra" preset. I'm pretty sure sample quality isn't part of a developer's goal when targeting the final visual result of a video game; anything above 2x MSAA, heck, even any post-processing anti-aliasing technique, is pretty much just a complement for extra sharpness in the image.


MSAA didn't exist in 2005. It has nothing to do with an Ultra preset.

Ultra = the most hardware-taxing visual display a game can muster, optimization be damned. A $1000 machine shouldn't be able to handle it, at least not based on the precedents set by games like Doom 3, Quake 4, Crysis, heavily modded Skyrim, etc. You want eye candy? It's going to cost you.


----------



## RejZoR (Nov 6, 2015)

That's a load of bollocks @yogurt_21 and you know it.

Firstly, MSAA did exist long before 2005. I didn't bother with that extensive a search, but the GeForce 3 back in 2001 had MSAA support, not to mention the fact that anti-aliasing in general had been used long before that. I know my GeForce 2 MX and GeForce 2 Pro had anti-aliasing, which I used extensively even back then, probably around the year 2000.

Secondly, providing superior visual fidelity without using ANY optimizations is just a stupendous waste of resources if you can barely tell the difference in the end, or worse, can't tell the difference at all! That's why Far Cry used poly-bump mapping, which gives the player the illusion that a model uses 3 million polygons when in reality it's only using 3,000. Sure, some took it a bit too far, which resulted in square-ish heads in stills, but frankly, in-game you rarely noticed it even at such extremes.

And this isn't the only optimization. Megatextures, texture streaming, LOD, tessellation, etc.; all this stuff means you can spend resources on things that matter and cleverly obscure those that are less important. It's why we can have vast worlds: the engine cleverly balances the graphics card's capabilities between things that matter and things that don't. If you spend all your resources on a character you can't even admire properly, you've just wasted resources that could have been spent on 500 trees in the distance. Sure, there are people who say faking something is not the same, but when the faked effect is nearly impossible to distinguish from the reference one, does it even matter at that point?

Besides, the Ultra setting often doesn't mean you're disabling optimizations; it just means you're pushing the settings to start tessellating items sooner and to switch to distance LOD later. Which is logically more demanding...


----------



## truth teller (Nov 6, 2015)

Shihabyooo said:


> And why are you comparing these two, exactly?
> The game may be ridiculously unoptimized, doesn't mean it has 2007 graphics.


you are right, ut3 actually looks better than bloopers3, all those mushy textures, yuck, i shouldnt have compared it to a 07 game, next time i might do comparison between blondes&blacks3 and ut2003



Filip Georgievski said:


> Sounds to me like an Nvidia based game.....


oh its based alright, higher ram usage than a kite


----------



## yogurt_21 (Nov 6, 2015)

RejZoR said:


> That's a load of bollocks @yogurt_21 and you know it.
> 
> Firstly, MSAA did exist long before 2005. I didn't bother to do that much extensive search, but GeForce 3 back in 2001 had MSAA support. Not to mention the fact that "anti-aliasing" has been used even long before that. I know my GeForce 2MX and GeForce 2 Pro had anti-aliasing which I extensively used even back then. Probably year 2000 or so.
> 
> ...



http://www.tomshardware.com/reviews/anti-aliasing-nvidia-geforce-amd-radeon,2868.html

*Quincunx Anti-Aliasing is not MSAA*

It's a precursor, or rather something NV tried in order to get better AA performance when the hardware couldn't handle it. I do see MSAA as part of the OpenGL 1.5 standard from late 2003, though, as well as DirectX 9.0c from August 2004. So you are correct that it was around in 2005, but SSAA (game settings just labeled it "AA") was still the standard then. Either way, the Ultra setting wasn't just about AA: draw distance/FOV, shadows, HDR, water effects, other animations like arrow trails, bullet effects, and so on.

So thanks for zeroing in on some random points in my post. The point was that Ultra was not, as he described, locked in with an 8x MSAA boost.

Secondly, who says it has to look the same? You? Are you ranting against yourself? As described above, Ultra typically adds a ton of different effects to a game, many of which are quite noticeable, though ideally the game will still look and play fine with them off for the majority of people.

Some people are just fine adding an area rug and calling it a day. Some people pay millions to interior decorators to make their homes look like a palace or the set of their fav sci-fi show.

Ultra mode is for the latter group.

The rest will do fine on Medium or High, or better yet their own custom preset with the effects they care about and none that they don't.

But chewing up 12 GB of RAM isn't a feat these days, and as good as a single 980 Ti is, you can still go better (SLI, tri-SLI, SLI TITAN Zs).


----------



## Aquinus (Nov 7, 2015)

yogurt_21 said:


> But chewing up 12 GB of RAM isn't a feat these days, and as good as a single 980 Ti is, you can still go better (SLI, tri-SLI, SLI TITAN Zs).


Absolutely, but it seems that it is also hungry enough for VRAM that the 4 GB versus 6/8 GB argument starts becoming more of a thing.


btarunr said:


> Once we installed 16 GB RAM in the system, the game ran buttery-smooth with our GTX 980 Ti.


So tell me, how do the 390 and 390X behave with it? This article seems to be 100% "omg, NVIDIA cards can barely run it," which means nothing if not pitted against the competition.


----------



## Pill Monster (Nov 7, 2015)

Blame MS, not the developers. Any rig with a 4 GB GPU will need 16 GB when running a DX11 title..... it's the way WDDM works... (or doesn't work, lol)

Drivers map system memory into GPU address space, but in W7/8.1 it's a bit broken, as there's no dynamic resource or reclaim support.
This means the maximum available is mapped to the GPU... usually equivalent to the VRAM on your card.

It was supposed to be fixed in 8.1, but MS postponed it till W10..... no biggie, as Wiz said, with 16 GB it's all good...
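If that mirroring claim holds, the arithmetic behind the article's numbers is simple. A hypothetical back-of-envelope (the overhead figures are my own illustrative guesses, not measurements):

```python
def rough_ram_budget(vram_gb, game_working_set_gb, os_overhead_gb=2.0):
    """System RAM needed if the driver mirrors up to a full copy of
    VRAM in system address space, on top of the game's own working
    set and the OS. Purely illustrative arithmetic.
    """
    return vram_gb + game_working_set_gb + os_overhead_gb

# 6 GB card + ~8 GB game working set + ~2 GB OS = 16 GB,
# which lines up with the game only behaving once 16 GB is installed.
```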


----------



## Xzibit (Nov 7, 2015)

Comparison

*Beta*

*Final*


----------



## Ikaruga (Nov 7, 2015)

Perhaps it's just an oversight/bug, and they probably didn't think about 12 GB systems and simply treated them as 16 GB ones? As I wrote earlier, the rest is fine: any AAA game should use as many resources as it can. It's actually a good thing to fill up system and video memory as much as possible.


----------



## Shihab (Nov 7, 2015)

Ikaruga said:


> any AAA game should use as much resources as it can, it's actually a good thing to fill up the system and the video memory as much as possible.



Any game should use as many resources as it _*needs.*_
Scaling in resource requirements should be downwards, to work on less powerful machines (by reducing render elements/quality, etc.). Scaling upwards _should_ always be capped by what the game can offer. Go higher than that, and it's a case of poor optimization.



Pill Monster said:


> Drivers map system memory into GPU address space, but in W7/8.1 it's a bit broken, as there's no Dynamic Resources or reclaim.
> This basically means the maximum available is mapped into the GPU address space... usually equivalent to the VRAM on your card.



I thought this was fixed with Windows 7 (and D3D 10/11)?


----------



## Pill Monster (Nov 7, 2015)

Shihabyooo said:


> Any game should use as many resources as it _*needs.*_
> Scaling in resource requirements should be downwards, to work on less powerful machines (by reducing render elements/quality, etc.). Scaling upwards _should_ always be capped by what the game can offer. Go higher than that, and it's a case of poor optimization.
> 
> 
> ...


They're talking about DWM and zero-copy between RAM and VRAM. I'm referring to unified addressing, where RAM is mapped into GPU space......

but it's not fixed, and never will be in W7... kernel issue..



Here's some info on DX if you want to read ....

https://msdn.microsoft.com/en-us/library/windows/desktop/dn899121(v=vs.85).aspx


----------



## Sephil Slyfox (Nov 7, 2015)

ZeppMan217 said:


> Crysis came out in 2007 and it was unique in many ways. It's 2015, BLOPS3 is not breaking any grounds, aside from sales.



I do think it looks cool; however, I also liked it when it was Call of Duty 2, dunno why.


GhostRyder said:


> I thought I read it stutters on the GTX 980ti but that disappears with the Titan X?
> 
> Oh I missed that bottom comment about 16gb of ram and the GTX 980ti working fine...  Ok so we now have 6 Cards that can run it efficiently on Ultra.
> 
> ...


I di too but... Wish


P4-630 said:


> Let's hope Bethesda's Fallout 4 does better



Hope so just pre-ordered


----------



## 64K (Nov 7, 2015)

Sephil Slyfox said:


> I do think it looks cool; however, I also liked it when it was Call of Duty 2, dunno why.



Probably because COD 2 was better. COD 1, 2 and MW1 were the most fun to me but I haven't given up on the series. I'm way behind though. I've still got half of Black Ops 1 to finish and then on to MW3 later on. I'm in no hurry. Newer COD games aren't a priority for me. I still like to go back and play 1, 2 and 4 every couple of years.

And welcome to TPU, fellow Tennessean.


----------



## Ikaruga (Nov 7, 2015)

Shihabyooo said:


> Any game should use as many resources as it _*needs.*_
> Scaling in resource requirements should be downwards, to work on less powerful machines (by reducing render elements/quality, etc.). Scaling upwards _should_ always be capped by what the game can offer. Go higher than that, and it's a case of poor optimization.


Absolutely not. Windows should cache as much as it can, and the game engine should also cache/prefetch/prebake/etc. as much as it can on any given system. Loading stuff from storage vs. having it in RAM or VRAM is an easy choice.


----------



## Pill Monster (Nov 7, 2015)

Ikaruga said:


> Absolutely not. Windows should cache as much as it can, and the game engine should also cache/prefetch/prebake/etc. as much as it can on any given system. Loading stuff from storage vs. having it in RAM or VRAM is an easy choice.


Yep for sure.....




Exception being SuperFetch, imo. Never did like it lol


----------



## Shihab (Nov 7, 2015)

Ikaruga said:


> Absolutely not. Windows should cache as much as it can, and the game engine should also cache/prefetch/prebake/etc. as much as it can on any given system. Loading stuff from storage vs. having it in RAM or VRAM is an easy choice.



IIRC, Windows has been doing similar stuff since XP and Vista (Prefetcher and SuperFetch, respectively), and I can't count the times I had to disable the latter because it was screwing up the system.

You are right, running a game from RAM is better, but that still wouldn't justify caching data for segments that won't be needed for minutes or hours to come, or ones that aren't needed any more. What matters is what's being displayed now and what will be needed in the very near future; in other words, what the game "needs". Then it's simply a matter of balancing when to cache newer data and when to scrub older data.
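That cache-newer/scrub-older balance is essentially an LRU policy. A toy sketch (the names and sizes are mine, purely to illustrate the idea; nothing engine-specific):

```python
from collections import OrderedDict

class AssetCache:
    """Minimal LRU asset cache: keep as much as the budget allows,
    evict whatever was used longest ago when room is needed."""

    def __init__(self, budget):
        self.budget = budget            # capacity in arbitrary units
        self.used = 0
        self.entries = OrderedDict()    # asset name -> size, oldest first

    def request(self, name, size):
        """Fetch an asset, evicting least-recently-used entries if needed."""
        if name in self.entries:
            self.entries.move_to_end(name)      # mark as recently used
            return "hit"
        while self.used + size > self.budget and self.entries:
            _, old_size = self.entries.popitem(last=False)  # scrub oldest
            self.used -= old_size
        self.entries[name] = size
        self.used += size
        return "miss"
```

A real engine layers prefetching on top, pulling in assets it predicts the player will need, but the eviction side looks much like this.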




Pill Monster said:


> Here's some info on DX if you want to read ....
> 
> https://msdn.microsoft.com/en-us/library/windows/desktop/dn899121(v=vs.85).aspx



I'll take your word for now. My experience with programming hasn't reached d3d yet >_>


----------



## Am* (Nov 7, 2015)

And this is exactly why I bought a TITAN X over the 980 Ti... so I won't be forced to wait through months of patches before they fix the memory leaks, in case I want to play one of these crappy ports (not that I do; COD is the last game I'd consider buying)...


----------



## rtwjunkie (Nov 7, 2015)

I have to laugh... several people have talked about patches to fix the amount of RAM used... why do you assume there will be patches for that? Why do you assume Activision thinks it is a bug? The game uses up that much system RAM because it was DESIGNED that way.


----------



## Am* (Nov 7, 2015)

rtwjunkie said:


> I have to laugh... several people have talked about patches to fix the amount of RAM used... why do you assume there will be patches for that? Why do you assume Activision thinks it is a bug? The game uses up that much system RAM because it was DESIGNED that way.



1. Because the game looks like crap...
2. Because Batman AK used over 7GB at launch, now uses around 5.5GB max. GTA V online also had memory leak issues, which have been fixed since.


----------



## rtwjunkie (Nov 7, 2015)

You are ignoring that games progress. Nothing stands still in the gaming world. Gaming, more than anything, has consistently forced the advancement of computer parts to bigger, better, faster as requirements increase.

It's a constantly moving finish line, and the natural evolution of things.  It's unrealistic to think or hope that requirements will stand still.


----------



## Uplink10 (Nov 7, 2015)

rtwjunkie said:


> You are ignoring that games progress. Nothing stands still in the gaming world. Gaming, more than anything has consistently forced the advancement of computer parts to bigger, better, faster as requirements increase.
> 
> It's a constantly moving finish line, and the natural evolution of things. It's unrealistic to think or hope that requirements will stand still.


Programs should consume resources in smart ways, and only because they need to. Just take a look at textures: distributors do not compress textures in order to lower piracy, and it isn't working. Why would they make COD eat up all the RAM, so that consumers buy more or costlier components? Time will tell, but until it does I suggest people turn to better-coded games on PC, like The Witcher 3.


----------



## rtwjunkie (Nov 7, 2015)

Uplink10 said:


> Programs should consume resources in smart ways, and only because they need to. Just take a look at textures: distributors do not compress textures in order to lower piracy, and it isn't working. Why would they make COD eat up all the RAM, so that consumers buy more or costlier components? Time will tell, but until it does I suggest people turn to better-coded games on PC, like The Witcher 3.



First I agree with you that people should go play a better game, like The Witcher 3.  But that's not the issue.

Look back on the history of gaming.  There have always been games that pushed the envelope, and thus pushed the advancement of hardware.  I am all for any game that pushes that envelope.  Others will follow suit, and I am glad.  Otherwise, hardware would stagnate where it is.  

Forward thinking has gotten us where we are today, and I don't wish to see it stop there.


----------



## Aquinus (Nov 7, 2015)

Uplink10 said:


> I suggest people turn to better coded games on PC like Witcher 3.


Failworks? O rly? There are a lot of things I would call The Witcher 3: a great game, a lot of fun, pretty good-looking. But I wouldn't call it the pinnacle of well-coded games. It has its issues like many others.


----------



## Pill Monster (Nov 7, 2015)

We should all go back to playing board games imho...

Monopoly, Cluedo, Trivial Pursuit...very stable without crashes or bugs....lag, glitches etc.. 






Shihabyooo said:


> IIRC Windows' been doing similar stuff since XP and Vista (Prefetcher and Superfetch, respectively), and I can't count the times I had to disable the latter because it was screwing up the system.
> 
> You are right. Running a game from ram is better, but that still wouldn't justify caching data for segments that won't be needed for minutes/hours to come, or ones that aren't needed any more. What matters is what's being displayed now and what will be in the very near future (for the game), in other words: what the game "needs". Then it's simply a matter of balancing when to cache newer data and when to scrub older ones.


Superfetch and Prefetch cache programs, iirc.
But Windows caches regardless - look in Task Manager at the cache amount; it's mostly made up of memory-mapped files, files on the HDD... lots of system32 files, and pics, documents, music... all that stuff.
But that's caching...
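The file-cache behaviour described above - files touched once ending up in the system cache as memory-mapped pages - can be sketched with a memory-mapped read. This is a minimal, hypothetical example; the file path and contents are made up:

```python
import mmap
import os
import tempfile

# Write a small file to stand in for a game asset (hypothetical data).
path = os.path.join(tempfile.gettempdir(), "asset.bin")
with open(path, "wb") as f:
    f.write(b"texture-data" * 1024)

# Memory-map it: the OS lazily pages the file into the system file cache,
# which is why Task Manager's "cached" figure grows as files are touched,
# even though no process explicitly "allocated" that memory.
with open(path, "rb") as f:
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
        header = mm[:12]   # slicing faults the needed pages in on demand
        print(header)      # b'texture-data'

os.remove(path)
```

That cached memory is reclaimable: the OS drops cold pages when applications need the RAM, which is why a large "cached" number is not by itself a problem.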

WDDM and DX are being developed to unify GPU and CPU memory, virtually or otherwise. Similar to consoles and hUMA. Early stages atm.

There's also a bit of hype here imo, as I realised when testing W10... what difference does it make to a gamer if 16GB of RAM is used or 16GB of VRAM???? You still need more, lol.

So personally I don't see the advantage of unified memory in WDDM 2.0...









Shihabyooo said:


> I'll take your word for now. My experience with programming hasn't reached d3d yet >_>



Oh, me either, lol. I mainly skip the code stuff (except for the errors, which are handy). But there's a lot of plain-English material documenting how DX works in relation to the OS; makes for interesting reading imho.

Maybe I gave you the wrong link...


----------



## 64K (Nov 7, 2015)

rtwjunkie said:


> First I agree with you that people should go play a better game, like The Witcher 3.  But that's not the issue.
> 
> Look back on the history of gaming.  There have always been games that pushed the envelope, and thus pushed the advancement of hardware.  I am all for any game that pushes that envelope.  Others will follow suit, and I am glad.  Otherwise, hardware would stagnate where it is.
> 
> Forward thinking has gotten us where we are today, and I don't wish to see it stop there.



I don't know @rtwjunkie. There have been games in the past that pushed the PC envelope, versus the consoles' leftover slop. There are reasons publishers think we are idiots.


----------



## Uplink10 (Nov 7, 2015)

rtwjunkie said:


> Look back on the history of gaming. There have always been games that pushed the envelope, and thus pushed the advancement of hardware. I am all for any game that pushes that envelope. Others will follow suit, and I am glad. Otherwise, hardware would stagnate where it is.


Most games aren't so resource-hungry and they still look great, but what does COD:BO3 offer to justify its greediness?



Aquinus said:


> Failworks? O rly? There are a lot of things I would call The Witcher 3: a great game, a lot of fun, pretty good-looking. But I wouldn't call it the pinnacle of well-coded games. It has its issues like many others.


Not a perfect example; I meant VRAM usage.


----------



## deemon (Nov 8, 2015)

RejZoR said:


> Because if those are the specs, not many people can actually run that. I mean, I have 32GB RAM so I don't care, but graphic card is still "just" 4GB (GTX 980). If you want to target the broadest target public, you can't have specs like this. Or it has to look like nothing we've seen to date. Frankly, I doubt that will be the case...



No one said you have to play the game with maximum textures. That's why they gave us game settings, so you can SET things right for your hardware. Simple, really. I am quite sure you can play this game with 2GB of VRAM too, with low textures at 1080p.


----------



## hat237 (Nov 8, 2015)

Just a memory leak, no need to worry, it will be fine.


----------



## RejZoR (Nov 8, 2015)

deemon said:


> No one said you have to play the game with maximum textures. That's why they gave us game settings, so you can SET things right for your hardware. Simple, really. I am quite sure you can play this game with 2GB of VRAM too, with low textures at 1080p.



Playing games at anything less is unacceptable for me. Also, why is the game not DX12-ready?


----------



## natr0n (Nov 8, 2015)

Call Of Disaster : Blatantly Optimized


----------



## so11ex (Nov 8, 2015)

I see no point in upgrading from 8 to 16 GB of system RAM or looking for 8-12 GB video cards... I'm not a CoD series fan; I have tried maybe half of the games in the series. To be honest, I got the game just to check whether it really is THAT badly optimised...

My rig: i7 4770K / iChill 4GB 980 from InnoVISION / 8 (4x2) GB 1866 RAM / Plextor M5S SSD

Game settings: 1080p, ultra textures, high (not ultra) shadows, all the rest set to max/ultra, SMAA 2x (cinema) - running almost perfectly! Video RAM load 3-4 GB (GPU-Z), system RAM load 6.3-7 GB of 8 (taskmgr).

Overall performance: solid 60 FPS with occasional drops to 53-55. No problems, no crashes; sometimes microstutter occurs when the FPS drops to 53-55 (I guess a RAM issue when the game reads something from the SSD), but I can't say it's badly freezing. I guess if I set textures to "high" instead of ultra there would be no problems at all...


----------



## Rowsol (Nov 9, 2015)

Wow... that is absurd.


----------



## laszlo (Nov 9, 2015)

Fck them; they develop new games with hardware producers' money just to force us to buy their shitty new stuff and force the upgrade.

A lame conspiracy on their part.


----------



## Vayra86 (Nov 9, 2015)

rtwjunkie said:


> First I agree with you that people should go play a better game, like The Witcher 3.  But that's not the issue.
> 
> Look back on the history of gaming.  There have always been games that pushed the envelope, and thus pushed the advancement of hardware.  I am all for any game that pushes that envelope.  Others will follow suit, and I am glad.  Otherwise, hardware would stagnate where it is.
> 
> Forward thinking has gotten us where we are today, and I don't wish to see it stop there.



But CoD: Blops 3 taxing systems like it does today is not about pushing any envelope in terms of graphical fidelity. None. Whatsoever. Textures are stale and washed out, built for lower resolutions; 90% of the game's graphics is post-processing junk. You compare this to TW3 - how can you defend these system requirements if you put the two games side by side?

If this game pushes any envelope, it is merely the envelope of how badly you can fuck up a console port and how much of a cash grab you can make the product itself. Bad coding is the ultimate example of laziness, because there are tons of games that do it better, and there are tons of games that look better too.

If these developers did any forward thinking themselves, they would have considered better optimization of the PC quality settings. I don't really get where you're coming from with this argument at all, actually. Even a blind man can see this game has nothing groundbreaking to offer, and I honestly don't get how you can justify the current system requirements for it. Not least because CoD: Ghosts had similar memory ridiculousness, and we all know Treyarch is the second-rate developer for this series.

Last but not least, a console port is not, and has never been, about forward thinking or pushing envelopes. Console ports are built for the lowest common denominator. CoD has never pushed envelopes. IW/Treyarch have never pushed envelopes. We have had 7 years of standstill because of these console games. Get real...


----------



## rtwjunkie (Nov 9, 2015)

@Vayra86 I'm not defending the game. Perhaps I'm being misunderstood, because there have been precisely 3 good CoD games: CoD, CoD 2, and CoD 4: MW.

My point is simply about the number of people butt-hurt in general because system requirements are increasing. It is simply not realistic to think games and hardware should stagnate forever. If people had always thought that way, we would still be riding the horse and buggy.

EDIT: Remember, everyone here started complaining as soon as they read the real-world requirements from the TPU staff, which was before anyone had a copy to find out it is a pile of dung.


----------



## Aquinus (Nov 9, 2015)

rtwjunkie said:


> My point is simply the number of people butt-hurt in general because System requirements are increasing. It is simply not realistic to think games and hardware should stagnate forever. If people had always thought that way, we would still be riding horse and buggy.


...or I would still be using my trusty old Radeon 9200 and a NetBurst Celeron. The simple fact is this:


rtwjunkie said:


> It is simply not realistic to think games and hardware should stagnate forever.


----------



## Vayra86 (Nov 9, 2015)

rtwjunkie said:


> @Vayra86 I'm not defending the game.  Perhaps I'm being misunderstood, because there have been precisely 3 good CoD games: CoD, CoD 2, and CoD 4: MW.
> 
> My point is simply about the number of people butt-hurt in general because system requirements are increasing.  It is simply not realistic to think games and hardware should stagnate forever.  If people had always thought that way, we would still be riding the horse and buggy.



If you look objectively at games, I understand completely why people are butt-hurt. The vast majority of these releases are console ports with an HD 7870-equivalent GPU as the baseline. You simply cannot defend a PC with hardware at least 30% faster being unable to run them at higher settings - evidenced by the small number of games that do get released with proper coding and optimization, like GTA V. Case in point: this thread, which is all about top-end hardware running a mediocre game. You mentioned it yourself: TW3 is a great example of a game that uses the hardware well and takes understandable performance hits from the different graphics settings. Nobody is complaining about 30-40 fps in that game, and rightly so; people aren't that stupid.


----------



## Aquinus (Nov 9, 2015)

Vayra86 said:


> You mentioned it yourself: TW3 is a great example of a game that uses the hardware well and takes understandable performance hits from the different graphics settings. Nobody is complaining about 30-40 fps in that game, and rightly so; people aren't that stupid.


I wouldn't call TW3 that stable, and for how much VRAM it uses and how it looks, I would actually expect it to run better. Far Cry 4 is another game that isn't the best-coded, but it looked great and ran smoothly in Surround on my machine, using almost 4GB of VRAM all the while. Run TW3 in Surround and it sucks beyond belief (all the while using around 2GB of VRAM). There have also been instances where, despite a frame-rate cap, TW3 will consume 100% of my GPU even though it's running at a flat 60FPS, almost like it's just throwing away extra rendered frames. Either way, people should stop using TW3 as an example of a well-coded game, because its 3D performance, for what the game is, is crap. The only reason TW3 is good is that the game itself doesn't blow. I can deal with poor performance if the game is good, but most won't care how good a game looks if everything else about it is crap.

So if we're talking about just the rendering engine and not the game itself, TW3 is actually pretty mediocre in comparison. It caters to GPUs with a lot of pixel-pumping power, because a lot of it is effects, not texturing.


----------



## Vayra86 (Nov 9, 2015)

Aquinus said:


> I wouldn't call TW3 that stable, and for how much VRAM it uses and how it looks, I would actually expect it to run better. Far Cry 4 is another game that isn't the best-coded, but it looked great and ran smoothly in Surround on my machine, using almost 4GB of VRAM all the while. Run TW3 in Surround and it sucks beyond belief (all the while using around 2GB of VRAM). There have also been instances where, despite a frame-rate cap, TW3 will consume 100% of my GPU even though it's running at a flat 60FPS, almost like it's just throwing away extra rendered frames. Either way, people should stop using TW3 as an example of a well-coded game, because its 3D performance, for what the game is, is crap. The only reason TW3 is good is that the game itself doesn't blow. I can deal with poor performance if the game is good, but most won't care how good a game looks if everything else about it is crap.
> 
> So if we're talking about just the rendering engine and not the game itself, TW3 is actually pretty mediocre in comparison. It caters to GPUs with a lot of pixel-pumping power, because a lot of it is effects, not texturing.



Meh, I respectfully disagree on that. The game has a lot of interesting elements that you don't see much elsewhere, and they are tied into graphics/performance. SpeedTree is an example of this, coupled with the generous view distance. But it is also a game that (and this is mostly what I was alluding to) takes performance hits from higher settings in ways a user can understand. All the quality settings actually work, they all add something, and they all incur a performance hit without introducing weird amounts of stutter if the resources are there. And to top it off, it does so without requiring a boatload of system RAM/VRAM. Note, all of this excludes Hairworks from the story.

The other example I pointed out, GTA V, is similar. If we can't agree on TW3, then let's take that one as the well-optimized game to talk about. A great example, because it is also very transparent about VRAM usage in the options menu. CoD is miles away from this, and that is what people see.


----------



## Aquinus (Nov 9, 2015)

Vayra86 said:


> A great example because it is also very transparent in terms of VRAM usage in the options menu.


I don't have GTA V so I can't speak from experience, but if my memory serves me correctly, that game uses extra VRAM for caching, so it could very well be just like TW3 in that respect; I wouldn't go making any assumptions to that end.

I've played TW3 maxed out without Hairworks. In fact, I didn't find Hairworks to impact performance all that much; I feel it's just a slow engine for what it is doing. As for GTA V, there was a discussion several months ago about how it appeared to cache stuff in VRAM when VRAM was available, so its usage number wasn't reflective of how much memory was being actively used at any given time.

I don't want to go too far into that, but my simple point is that it's not realistic to expect games' requirements to regress while the games keep getting better, and I honestly don't think GTA V looks better than TW3 or Far Cry 4, judging from the screenshots I've seen.


----------



## 64K (Nov 9, 2015)

If it's loading up RAM just because it's there, then no problem. If it's loading up RAM and the engine has to start hitting storage because the RAM is full of unnecessary data, that's not OK - that's why the game stutters. The following isn't about RAM, but it suggests their engine wastes resources anyway. Take a look at this concerning VRAM usage:



Spoiler: VRAM Usage

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/33.html



Crysis 3 uses 2 GB max, while COD: Advanced Warfare uses 7.3 GB.

This game runs on current-gen consoles that only have 8 GB of RAM total, of which 4-5 GB is available to the game. I understand that PC gamers always need better hardware to run the same game as on a console, but _this much_? _12 GB RAM_? This game isn't breaking new ground for PCs. It's just sloppy coding.
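The distinction above - caching opportunistically versus filling RAM with data that is never evicted - can be sketched as a budgeted LRU cache. This is a hypothetical illustration (asset names and sizes are made up), not how the CoD engine actually works: keep recently used assets in RAM, evict the least-recently-used once a budget is hit, rather than growing until the engine has to go back to storage mid-frame.

```python
from collections import OrderedDict

class AssetCache:
    """Minimal LRU asset cache sketch: stay within a RAM budget by
    evicting the least-recently-used entries, instead of growing
    unbounded (the stutter scenario described above)."""

    def __init__(self, budget_bytes):
        self.budget = budget_bytes
        self.used = 0
        self.entries = OrderedDict()  # asset name -> size in bytes

    def load(self, name, size):
        if name in self.entries:           # cache hit: mark as recently used
            self.entries.move_to_end(name)
            return
        # Scrub the oldest entries until the new asset fits the budget.
        while self.used + size > self.budget and self.entries:
            _, old_size = self.entries.popitem(last=False)  # evict LRU
            self.used -= old_size
        self.entries[name] = size
        self.used += size

cache = AssetCache(budget_bytes=100)
cache.load("menu_tex", 40)
cache.load("level1_tex", 40)
cache.load("level2_tex", 40)   # evicts menu_tex to stay within budget
print(list(cache.entries))     # ['level1_tex', 'level2_tex']
```

An engine that skips the eviction step ends up where this thread started: whatever RAM exists gets filled, and once it runs out, stale data forces reads back from disk and the frame time spikes.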


----------



## rtwjunkie (Nov 9, 2015)

64K said:


> If it's loading up RAM just because it's there, then no problem. If it's loading up RAM and the engine has to start hitting storage because the RAM is full of unnecessary data, that's not OK - that's why the game stutters. The following isn't about RAM, but it suggests their engine wastes resources anyway. Take a look at this concerning VRAM usage:
> 
> 
> 
> ...


 
You may have some valid points, from what I've read by users.  However, if you'll note my edit from above, people started their outcry about higher requirements BEFORE anyone had the game, right after the TPU staff put forth the real-world requirements.  Think about that a second and let it digest. 

People were butt-hurt in general about hardware requirements increasing.  That kind of backward thinking will eventually let consoles have more performance than the PC, if everyone starts expecting that hardware requirements will never increase.

It was not backward thinking like that which got us such massively improved PCs over the last 15 years.


----------



## 64K (Nov 9, 2015)

rtwjunkie said:


> You may have some valid points, from what I've read by users.  However, if you'll note my edit from above, people started their outcry about higher requirements BEFORE anyone had the game, right after the TPU staff put forth the real-world requirements.  Think about that a second and let it digest.
> 
> People were butt-hurt in general about hardware requirements increasing.  That kind of backward thinking will eventually let consoles have more performance than the PC, if everyone starts expecting that hardware requirements will never increase.
> 
> It was not backward thinking like that which got us such massively improved PCs over the last 15 years.



If people are bitching about games in general using too many resources and having to upgrade, then I don't agree with them. PC gaming is a never-ending-upgrade kind of hobby. That's a fact, and it can get pretty expensive if you want the highest settings. 

If they're bitching because this particular game requires 12 GB of RAM when it's just another COD using the same engine as before, then I agree with them. I think the game probably has a memory leak, and Treyarch will probably eventually fix it, or whatever else is causing this game to require 12 GB of RAM.


----------



## Vayra86 (Nov 9, 2015)

rtwjunkie said:


> You may have some valid points, from what I've read by users.  However, if you'll note my edit from above, people started their outcry about higher requirements BEFORE anyone had the game, right after the TPU staff put forth the real-world requirements.  Think about that a second and let it digest.
> 
> People were butt-hurt in general about hardware requirements increasing.  That kind of backward thinking will eventually let consoles have more performance than the PC, if everyone starts expecting that hardware requirements will never increase.
> 
> It was not backward thinking like that which got us such massively improved PCs over the last 15 years.



Oh, I agree on that. We should also be careful not to jump on the bandwagon too quickly, but in the case of CoD, where there's smoke, there's fire. Even more so because CoD is one of the (many) reasons PC games have *not* improved vastly over the past console generation. We have just had one of the longest periods of standstill, and today's console grunt is hardly anything to write home about.


----------



## Pill Monster (Nov 9, 2015)

Aquinus said:


> I don't have GTA V so I can't speak for experience but, if my memory serves me correctly, that game uses extra VRAM for caching so it could very well be just like TW3 in that respect, so I wouldn't go making any assumptions to that end.
> 
> I've played TW3 maxed out without Hairworks. In fact I didn't find Hairworks to impact performance by all that much, I feel that it's just a slow engine for what it is doing. I don't have GTA V so I can't talking about it but, there was a discussion several months ago about how it appeared that GTA V was caching stuff in VRAM if it was available and wasn't reflective of how much memory is being actively used at any given time.


Yeah, prior to W10, memory could stay allocated even if not in use. Beginning with W10 (I think it's 10, you can look it up), any VRAM not actively in use, even with data residing in it, must be given up so other apps can use it - Memory Reclaim. 
More of a Windows memory/driver-management matter than an application one, afaik.

Affects games from DX11 onward.


----------



## Deleted member 67555 (Nov 9, 2015)

Vayra86 said:


> Oh, I agree on that. We should also be careful not to jump on the bandwagon too quickly, but in the case of CoD, where there's smoke, there's fire. Even more so because CoD is one of the (many) reasons PC games have *not* improved vastly over the past console generation. We have just had one of the longest periods of standstill, and today's console grunt is hardly anything to write home about.


CoD is the reason why PC games haven't what?
Oh no, LOL...
I'd say that several game studios are far behind what CoD does.
CoD has some of the best mechanics, hands down.
It is a visually good game... yes, it could be better.
Online play needs improvement, but it is in the top 5 best... for what you're able to do.

No... most studios need to catch up to CoD...
I don't have BlOps 3 yet, but I did play the beta, and I can say with certainty that the biggest problem people have with the game is not being able to play in an almost-3D environment.
They don't like it and struggle, but it's still good.


----------



## Fx (Nov 9, 2015)

hat237 said:


> Just a memory leak, no need to worry, it will be fine.



The game might have a memory leak, but that isn't the primary culprit. I have my settings set as high as possible at 1080p. I have an EVGA 980 Ti FTW card and 12GB of memory.

The game crashes as soon as the campaign begins.



jmcslob said:


> No... most studios need to catch up to CoD...
> I don't have BlOps 3 yet, but I did play the beta, and I can say with certainty that the biggest problem people have with the game is not being able to play in an almost-3D environment.
> They don't like it and struggle, but it's still good.



That isn't my problem with the latest games in the series. For me, the problem with Ghosts was that they made normal mode feel like pseudo-hardcore; you ended up dying too fast. Secondly, they took out CTF mode, which is what I play 98% of the time.

For Advanced Warfare, I simply didn't like all of the air boosting.

My favorite games in the whole series are MW2, MW3 and BO2.

Unfortunately, people cry and whine for change, and consequently the devs gave us change. I really never wanted change except for maybe new maps and enhanced graphics. The realistic 3D environment does make the game feel different and harder, but I am fine with that.


----------



## NC37 (Nov 12, 2015)

BWAHHAHAHA!!

"Games won't go over 4GB in 1080," people said...."a 970 is all you'd need," they said..."the 3.5GB limit is a non issue," they said...."8GB VRAM is overkill for 1080," they said...

Who's laughing now!! Oh right, me! HA HA HA!

Man how many times I told people...memory pool size does not = resolution. There is no set limit. Games never settle on limits for long. I've seen this for years and people still profess that,"you'll never need more than..." this amount or that amount. 

Sure the game is likely badly coded and buggy right now but still, give it time. Limits are meant to be broken. Game complexity increases. Even 8GB will be surpassed. Heck they're already looking past 16GB for next year. Even if it's high end cards...if they make it, devs will try to take advantage of it. Specially since PC gaming is often times not very optimized and can be sloppy.


----------



## Vayra86 (Nov 13, 2015)

jmcslob said:


> CoD is the reason why PC games haven't what?
> Oh no LOL...
> I'd say that several game studios are far behind what CoD does.
> CoD has some of the best mechanics hands down.
> ...



Gun mechanics? CoD didn't invent those at all, or set any kind of standard in that regard besides the use of iron sights for aiming, and that was already many, many versions of the game ago. If you really want to talk about mechanics, look at UT '99 and you will see how archaic the whole shooter formula really is. It is a type of game that has seen almost zero change in over ten years. What CoD does had better be good, because if it isn't in this day and age, the developer is a complete failure. Shooters are as old as PCs, and CoD as a shooter hasn't brought anything new to the table, ever, at all. CoD: AW is also just riding the bandwagon of the return to arena-styled shooters, a returning trend that refers strongly to the upcoming UT4.


----------



## Frick (Nov 13, 2015)

RejZoR said:


> But in all honesty, while Crysis is still super demanding, it looked spectacular at the time and is in a way still a benchmark for visual fidelity. Mostly because it's SO old but looks like it was released a year or two ago...



AND you could run it on quite weak systems if you lowered the settings, and it still looked good. I played the multiplayer beta on an Athlon 3000+ and an X1950 Pro with most settings on High (1280x1024).


----------



## RejZoR (Nov 13, 2015)

I had the same setup back then, and while it looked pretty good even at lower settings, once you see it at the highest settings, it's hard to go back.

I usually crank all the settings up and then turn off the non-essential ones that hardly affect visuals but give back good performance. Though these days I just crank everything to max in any game and never look back.


----------



## Slizzo (Nov 13, 2015)

So I installed the game on Monday, played a round of Zombies with some friends, and did notice some framerate issues. Went back to playing Fallout 4...


----------



## Prima.Vera (Nov 14, 2015)

Too much hate in this thread. To be honest, this latest COD has *THE BEST graphics on the market right now.* It's only natural that older cards have issues playing it, because of EVOLUTION. The game is using all available RAM and VRAM not because of bad coding, but actually because of good coding. Yes, it is caching into all available memory, because that is what it was coded to do. And that is good. I had big stutter on my 780 Ti card with 3GB of VRAM at 1080p, but I fixed that just by lowering the textures from Ultra to High. Now it runs like butter with all other details maximised. Averaging 70 FPS everywhere, I still think it's a good game, especially considering the graphics. And no other bugs noticed so far.

Peace.


----------



## Uplink10 (Nov 14, 2015)

I haven't played it, so I can't comment on the graphics, but the story is very bad. I mean, seriously, do they just sit around and try to make the most unbelievable and impossible game they can think of?

I'd put this game in the psychological sci-fi category, because the story fits it. It felt like a bad sci-fi movie with all the mind scenes, and the combat looked more like Crysis armour mode - going out with armour mode on and shooting everything and everyone up.

Good thing I watched a game movie and didn't buy the game, and I find it hard to believe that Black Ops 3 is selling like hot cakes. Does no one appreciate the story anymore? It's all flash, no photo for me.


----------



## Prima.Vera (Nov 15, 2015)

Yeah, the story is pretty bad and unbelievable, and the technical facts are ridiculously stupid...


----------



## ManofGod (Nov 15, 2015)

The PC Whiny Master Race strikes again!  Just upgrade your RAM already, since it is so cheap. If you do not, then it is your own fault when you cannot play something on Ultra smoothly. I cannot understand why anyone uses less than 16GB of RAM and claims to be a PC Master Race expert.


----------



## Prima.Vera (Nov 16, 2015)

Stuttering is not because of system RAM, but mostly because of Video RAM on GPUs...


----------



## Her3tic (Dec 18, 2015)

Currently my PC is set up like this: 
VGA: GTX 680 MSI Twin Frozr III
CPU: Core i7 3770K
RAM: 8 GB
Windows 10 - latest DirectX and VGA drivers are installed.

After playing for about 10 minutes, memory usage reaches 800 MB and climbs to around 1400 MB for CoD. The problem is that Task Manager shows the system using up to 2200 MB alongside COD, and I'm not sure why. 
Is it the game that causes this, or is something else wrong?

Graphics settings: High on texture quality, texture filtering and mesh quality; dynamic shadows on, subsurface on, the rest Medium.


----------



## johnspack (Dec 19, 2015)

Get a 980 Ti - the only thing that will run this.  My 970 will run circles around your 680, but I'd need a second one to run this game.  
Turning your settings down is all you can do.  I'll wait for an affordable Pascal card before I get this game.


----------



## xXDS_K1ll3rXx (Apr 30, 2016)

This is NOT true at all! I have 16 GB of RAM, an Intel i7-4790K and a GTX 970. I have played the game for about 300 hours on Steam and I have NEVER had any lag or frame drops. The game takes about 50-60% CPU, 5-7 GB of RAM and 2.5-3.3 GB of VRAM at 1080p and max settings. I never had less than 60 FPS. Maybe the 980 Ti isn't good enough because it was used for bitcoin mining or something like that. And also it has got 6 GB of VRAM. Or maybe some CoD hater just wrote this. NOT TRUE AT ALL!!


----------



## rtwjunkie (Apr 30, 2016)

xXDS_K1ll3rXx said:


> Maybe the 980 Ti isn't good enough because it was used for bitcoin mining or something like that.



So let me get this straight, without knowing that the site owner W1zzard does all the GPU testing (and his testing methods are respected around the world), you assume that this respected man was bitcoin mining on his 980Ti, and THAT might be why it wasn't good enough?

Hold on....

Wow, talk about reaching.


----------



## Estaric (Apr 30, 2016)

xXDS_K1ll3rXx said:


> This is NOT true at all! I have 16 GB of RAM, an Intel i7-4790K and a GTX 970. I have played the game for about 300 hours on Steam and I have NEVER had any lag or frame drops. The game takes about 50-60% CPU, 5-7 GB of RAM and 2.5-3.3 GB of VRAM at 1080p and max settings. I never had less than 60 FPS. Maybe the 980 Ti isn't good enough because it was used for bitcoin mining or something like that. And also it has got 6 GB of VRAM. Or maybe some CoD hater just wrote this. NOT TRUE AT ALL!!


Well, there are also things called updates that increase performance. That was the case at the time this was posted; it may very well have been patched since.


----------



## Octopuss (May 1, 2016)

Guys don't feed the necrotroll.


----------

