# Call of Duty Modern Warfare Benchmark Test, RTX & Performance Analysis



## W1zzard (Nov 5, 2019)

Modern Warfare is the latest iteration in the Call of Duty series. It pairs amazing graphics with captivating gameplay and has NVIDIA RTX support from day one. We take a closer look at performance, comparing 23 graphics cards at three resolutions. In a separate section, we compare RTX on vs. off and look at its FPS cost, too.



----------



## londiste (Nov 5, 2019)

One of the best, if not the best, RT inclusions in a game so far. The RT shadows (and these are from point lights only) are a minor but occasionally noticeable effect, for a sizable but acceptable performance hit. COD:MW being otherwise optimized to an absolutely excellent degree helps as well.


----------



## TKnockers (Nov 5, 2019)

it looks like nvidia is slowly degrading gtx 1080 ti performance...


----------



## W1zzard (Nov 5, 2019)

TKnockers said:


> it looks like nvidia is slowly degrading gtx 1080 ti performance...


It rather looks like they very much optimized the game for Turing concurrent FP+Int


----------



## TKnockers (Nov 5, 2019)

...in other words, they didn't care much about older GPUs... the RX 580 is quite solid compared to the GTX 1070 @ 1080p


----------



## Theliel (Nov 5, 2019)

I believe it's all faked by nvidia. This was probably a paid promotion.


----------



## claylomax (Nov 5, 2019)

Theliel said:


> I believe it's all faked by nvidia. This was probably a paid promotion.


Your comment says a lot about you.


----------



## B-Real (Nov 5, 2019)

Good to see AMD destroying their equal costing NV counterparts (RX 5700 - 2060, 5700 XT - 2060S) in an NV supported title. 350$ RX 5700 = 400$ 2060S, 400$ RX 5700 XT = 500$ 2070S.


----------



## ShurikN (Nov 5, 2019)

londiste said:


> One of the best, if not the best RT inclusion in the game so far.


Really?
The shadows look nicer, but the entire game looks soft and blurry with RTX ON.


----------



## EarthDog (Nov 5, 2019)

B-Real said:


> Good to see AMD destroying their equal costing NV counterparts (RX 5700 - 2060, 5700 XT - 2060S) in an NV supported title. 350$ RX 5700 = 400$ 2060S, 400$ RX 5700 XT = 500$ 2070S.


Performance appears to be similar with 5700 and 2060s as well as the xt and 2070s. Like always (as of late), AMD takes the price to performance crown, loses in power to performance.


----------



## Mescalamba (Nov 5, 2019)

ShurikN said:


> Really?
> The shadows look nicer, but the entire game looks soft and blurry with RTX ON.



That's the result of the effect being rendered at a much lower resolution. It's the only way to have some sort of ray tracing in real time; regular ray tracing would be frames per hour, not per second. Still looks okay, and it saves on AA.



W1zzard said:


> It rather looks like they very much optimized the game for Turing concurrent FP+Int



Which also makes it very good for AMD's computation-heavy GPUs. Not so good for older regular GPUs.


----------



## tfdsaf (Nov 5, 2019)

EarthDog said:


> Performance appears to be similar with 5700 and 2060s as well as the xt and 2070s. Like always (as of late), AMD takes the price to performance crown, loses in power to performance.


Hardly, the RX 5700 actually consumes a tad less than the RTX 2060; the fact that it performs about 15% higher means it's destroying the 2060 by 15% in performance per watt.

The RX 5700 XT is a tad more power hungry than the RTX 2060S, about 10-15W, but the fact it's destroying it by 15% in performance means it's also winning the performance-per-watt metric.


----------



## EarthDog (Nov 5, 2019)

tfdsaf said:


> Hardly, the RX 5700 actually consumes a tad bit lower than the RTX 2060, the fact that it performs about 15% higher means its destroying the 2060 by 15% in performance per watt.
> 
> The RX 5700XT is a tad bit more power hungry than the RTX 2060s, about 10-15W, but the fact its destroying it by 15% in performance, means its also winning in the performance per watt metric as well.


Yes, power use is close, with the NV cards eking it out, for what little that is worth.

I wouldn't call a ~25% difference in power use "a bit more power hungry" (225W vs 175W). With that, it isn't winning the performance/W metric either... or am I math challenged this morning?


----------



## W1zzard (Nov 5, 2019)

ShurikN said:


> The shadows look nicer, but the entire game looks soft and blurry with RTX ON.


not sure i'm seeing it. do these two images look sharp? (direct links, without our comparison viewer)

https://tpucdn.com/review/call-of-duty-modern-warfare-benchmark-test-performance-analysis/images/rtx-1-on.jpg

https://tpucdn.com/review/call-of-duty-modern-warfare-benchmark-test-performance-analysis/images/rtx-1-off.jpg

----------



## tfdsaf (Nov 5, 2019)

EarthDog said:


> Yes, power use is close with the Nv cards eeking it out for what little that is worth.
> 
> I wouldn't call a ~25% difference in power use "a bit more power hungry" (225W vs 175W). With that, it isn't winning the performance /W metric either... or am I math challenged this morning?


I mean sure, but realistically, in various reviews of custom AIB cards the 2060S usually consumes in the 200W region; even the 2060S reference design consumed about 5-10W more than advertised, while the RX 5700 XT consumed about 5W less than advertised.

Most custom 5700 XTs are consuming about 235W, about 10W more than reference, with custom Nvidia cards likewise generally consuming more than reference due to higher clock profiles and better coolers. So it's about a 35W difference realistically; if you look at total system power consumption that number is usually a tad lower as well, and with the RX 5700 XT being on average 7% faster at 1440p it ends up a tie in terms of performance per watt.

Now we can take extremes where the 5700 XT is 25% faster than the 2060S and count only that, or take extremes where the 2060S is 7-8% faster and count only that, and the performance per watt will look completely different. But overall, across a range of 30+ games, the RX 5700 XT is pretty much equal to the 2060S in performance per watt. Maybe a tad weaker depending on the games, but on the other hand it's much better value.

Ultimately in this game, though, it seems the RX 5700 XT is beating the 2060S in terms of performance per watt.



W1zzard said:


> not sure i'm seeing it. do these two images look sharp? (direct links, without our comparison viewer)
> 
> 
> 
> ...


RTX on does seem blurrier and softer for sure. I'm not sure how noticeable it is in game, but you can easily spot it in the images.


----------



## fancucker (Nov 5, 2019)

Great write up

So essentially you're missing out on an important technical feature by buying AMD, and the best Ray tracing implementation so far. And the 2060 super on other sites has already been proven to be plenty for 1440p RTX gameplay.


----------



## ShurikN (Nov 5, 2019)

W1zzard said:


> not sure i'm seeing it. do these two images look sharp? (direct links, without our comparison viewer)
> 
> 
> 
> ...


On those images not so much, but:

Image 3: Body and the yellow lines around it
Image 4: Text under the fire extinguisher
Image 6: Mostly ashtray, but a lot of stuff around it too
Image 7: Poster on the wall, texture of the door on the ground
Image 8: Hookah, textures of tables, tapestry, tho to a lesser extent
Image 9: Coffee pot
Image 10: Every single texture on the screen. Guns and the big bag on the table are the biggest offenders
Image 11: Brick textures on the right
Image 13: Blue cans on the shelf in the back


----------



## Cheeseball (Nov 5, 2019)

What's funny is that if you're going to play multiplayer competitively (to rank up and unlock upgrades), you would most likely reduce the graphics settings to hit that sweet 144 Hz and 240 Hz goodness.

The single player campaign looks gorgeous with all the settings on max though.


----------



## EarthDog (Nov 5, 2019)

fancucker said:


> Great write up
> 
> So essentially you're missing out on an important technical feature by buying AMD, and the best Ray tracing implementation so far. And the 2060 super on other sites has already been proven to be plenty for 1440p RTX gameplay.


HGH..... studies show!


----------



## sutyi (Nov 5, 2019)

Hmm... the only noteworthy difference I can see is that soft shadows and a good ambient occlusion implementation go out the window wherever that magical RTX logo appears.

Also, with DLSS enabled it's enhanced vaseline graphics.



TKnockers said:


> it looks like nvidia is slowly degrading gtx 1080 ti performance...



Just Pascal uarch showing its age and its shortcomings.


----------



## Lionheart (Nov 5, 2019)

fancucker said:


> Great write up
> 
> So essentially you're missing out on an important technical feature by buying AMD, and the best Ray tracing implementation so far. And the 2060 super on other sites has already been proven to be plenty for 1440p RTX gameplay.



Your opinion & existence on this site in a nutshell...


----------



## W1zzard (Nov 5, 2019)

sutyi said:


> Also with DLSS enabled it's enhanced vaseline graphics.


Modern Warfare does not support DLSS?



ShurikN said:


> Mostly ashtray, but a lot of stuff around it too








there is literally no difference between the ashtray left and right of the separator, except for the shadow up top, and that's correct now?


----------



## the54thvoid (Nov 5, 2019)

It's amazing how people put on their special 'bias' vision (either shade) and spout a whole heap of crap. The RT is very good; it's subtle, but you get solidity. RT does not affect the 'blurriness'. That's those 'bias' spectacles folk have on. Likewise, it's not a killer effect, not yet. Certainly not something that will ruin the game for AMD. Jeez, peeps, go and get frenzied over a sporting conflict instead.


----------



## Xuper (Nov 5, 2019)

Omg, VRAM: 8842 MB?!!! I still play The Outer Worlds and the MSI Afterburner overlay reports 6685 MB!!


----------



## mouacyk (Nov 5, 2019)

W1zzard said:


> It rather looks like they very much optimized the game for Turing concurrent FP+Int


This is the first time an x60 SKU has managed to match the performance of the previous gen's x80 Ti. The 960 was 50% behind the 780 Ti and the 1060 was 30% behind the 980 Ti. It was about time for the intersection to happen. It's probably time for the x80 Ti SKU to die.


----------



## Fluffmeister (Nov 5, 2019)

Looks like a nice game, and it certainly looks pretty! And with good performance across the board.

The fact you can currently get it free with an RTX card purchase is a bonus.


----------



## moob (Nov 5, 2019)

the54thvoid said:


> It's amazing how people put on their special 'bias' vision (either shade) and spout a whole heap of crap. The RT is very good; it's subtle but you get solidity. *RT does not affect the 'blurryness'.* That's those 'bias' spectacles folk have on. Likewise, it's not a killer effect, not yet. Certainly not something that will ruin the game for AMD. Jeez, peeps, go and get frenzied over a sporting conflict instead.


Yes. It does. I didn't even pay much attention to it when I first saw the comparison and I mostly just focused on the improved shadows, but after W1zz posted those larger images, it's clearly blurrier. Switching between the larger RTX On/Off images while focusing on the guy crouching at the door is literally making my eyes hurt because of how much blurrier it is, though granted I'm somewhat sensitive to that.


----------



## candle_86 (Nov 5, 2019)

mouacyk said:


> This is the first time an x60 SKU managed to match performance with an x80 TI of the previous gen.  960 was 50% behind 780 Ti and 1060 was 30% behind the 980 Ti.  It was about time for the intersection to happen.  It's probably time for the x80 Ti SKU to die.



Not quite, if we take Ti = Ultra:


Ti 4200 = Ti 500
6600 GT = FX 5950 Ultra
7600 GT = 6800 Ultra

It used to happen. It all went bad with the 8600 GTS losing to the 7900 GX2; the 9600 GT lost to the 8800 Ultra, the GTX 260 lost to the 9800 GX2, etc.

We all just forgot that last year's $500 card used to be matched or beaten by the $200 card from the next generation.


----------



## the54thvoid (Nov 5, 2019)

moob said:


> Yes. It does. I didn't even pay much attention to it when I first saw the comparison and I mostly just focused on the improved shadows, but after W1zz posted those larger images, it's clearly blurrier. Switching between the larger RTX On/Off images while focusing on the guy crouching at the door is literally making my eyes hurt because of how much blurrier it is, though granted I'm somewhat sensitive to that.



You know what? I'll give you that; you're right. But I think there is a valid procedural reason. The sharper image is false. RT will apply light as it illuminates a surface. Try looking at something that is not under direct illumination; it is always less distinct. By using RT, it's likely the detail you would see artificially is being smoothed out. It's all covered by Marr's theory of vision, where shapes are derived first and detail follows. In correct lighting scenarios, detail will always be lost in low (or indirect) light.

But yes, you are correct: RT will create areas of less defined detail. Which is realistic.


----------



## moob (Nov 5, 2019)

the54thvoid said:


> You know what? I'll give you that - you're right. But I think there is a valid procedural reason. The sharper image is false. RT will apply light as it illuminates a surface. Try looking at something that is not under direct illumination- it is always less distinct. By using RT, it's likely the detail you would see artificially, is being smoothed out. It's all covered by Marr's theory of vision, where shapes are derived first and detail follows. In the correct lighting scenario's, detail will always be lost in low (or indirect) light.
> 
> But yes, you are correct - RT will create areas of less defined detail. Which is realistic.


Motion Blur is supposedly more "realistic" as well yet it's the first option I turn off when I jump into a new game. I'd take the "false" image with the sharper details and worse shadows over the "real" image with better shadows any day. Or better yet, a combination of both with the better RT shadows with the clearer non-RT image.


----------



## Chrispy_ (Nov 5, 2019)

sutyi said:


> Hmm... the only note worthy difference I can see is that softshadows and a good ambient occlusion implementation go out the window wherever that magical RTX Logo appears.
> 
> Also with DLSS enabled it's enhanced vaseline graphics.



Yeah, and even if I look past the DLSS subsampling blur, I can't help but notice the mistakes that RTX makes too.

They are DIFFERENT, for sure but neither is always 'right'.

The gun rack example (image 10) - the very soft shadows look awful with RTX, as there are hard shadows on the floor, and it's obviously a single point-source, one-bulb room. IMO the RTX off variant is less visually-wrong, though it still has some issues with vertical surfaces like the ammo box.

In the last image, RTX seems to be too low-precision to even add the light rays from the window onto the steel cage. It just skips them altogether! The developer intent (with the actual god-ray shaders thrown in for good measure) is obviously as per the non-RTX example. In the RTX variant, it's even weirder to see the god-ray shader effect throwing _shadow_ on the steel cage it hits.

I still currently have both a 5700XT and an RTX 2060 at home but I don't care about RTX because I need to finish _Call of Duty: Kevin Spacey's the Bad Guy_  first. I'm WAAAAAY behind the curve with these games!


----------



## moob (Nov 5, 2019)

Chrispy_ said:


> I still currently have both a 5700XT and an RTX 2060 at home but I don't care about RTX because I need to finish _Call of Duty: Kevin Spacey's the Bad Guy_  first. I'm WAAAAAY behind the curve with these games!


I own/played them all except for this, WWII, and CoD 3 since it wasn't released for PC. They're a guilty pleasure of mine. I usually wait for them to go on sale for $15 or so and plow through the single player (haven't cared for the multiplayer since CoD4). I'm just glad the engine finally looks like a modern engine. I'll definitely be getting it at some point and hopefully WWII drops in price.


----------



## W1zzard (Nov 5, 2019)

I'm exploring a new chart for these articles, showing how the cards stack up in performance against the average fps result from our vga reviews.

thoughts?


----------



## candle_86 (Nov 5, 2019)

W1zzard said:


> I'm exploring a new chart for these articles, showing how the cards stack up in performance against the average fps result from our vga reviews.
> 
> thoughts?


Any chance of older cards? I'm sure plenty of people still have an R9 290/390 or 280/380, or a 960 or 760


----------



## john_ (Nov 5, 2019)

I was looking at Red Dead Redemption 2 on Guru3D just before coming here, and the charts look identical, meaning that AMD cards there also perform really well compared to the equivalent Nvidia models. This is a good day for the red team.


----------



## W1zzard (Nov 5, 2019)

candle_86 said:


> Any chance of older cards, I'm sure plenty of people still have r9 290/390 and 280/380 or 960 or 760


No plans to include older cards in my game articles. This is the same FPS data as in the original article, just relative to average performance, so you can draw some additional conclusions


----------



## Xzibit (Nov 5, 2019)

the54thvoid said:


> You know what? I'll give you that - you're right. But* I think there is a valid procedural reason. The sharper image is false*. RT will apply light as it illuminates a surface. Try looking at something that is not under direct illumination- it is always less distinct. By using RT, it's likely the detail you would see artificially, is being smoothed out. It's all covered by Marr's theory of vision, where shapes are derived first and detail follows. In the correct lighting scenario's, detail will always be lost in low (or indirect) light.
> 
> But yes, you are correct - RT will create areas of less defined detail. Which is realistic.



The denoiser.

Nvidia said:

> Different denoisers are used for point lights, spot lights, directional lights, and rectangular lights. All pull from the G-Buffer and use hit distance, scene depth, normal, light size and direction to guide the filtering.


----------



## QUANTUMPHYSICS (Nov 5, 2019)

I'm playing Modern Warfare on a Core i9 Extreme, 32GB DDR4, 2080Ti and SSD storage.

The game is at its absolute best during the close-quarters missions, which are obviously designed to resemble ZERO DARK THIRTY. The stealth missions are good too.

I played the game in REALISM. I kept getting killed, but I made it all the way to the final mission before getting completely overwhelmed and forced to drop the difficulty. 

I stopped playing CoD after Advanced Warfare and I skipped Ghosts. 

This game was FREE with my 2080Ti Black as a download so it cost me nothing technically. 

I am enjoying it. It feels like classic CoD, but the action is on a smaller scale so it's more like the original Modern Warfare. 

It doesn't match the impact of the original Modern Warfare - which came out around the same time as Crysis - but it definitely is worthwhile and redefines the franchise.


----------



## I'm here to throw (Nov 6, 2019)

B-Real said:


> Good to see AMD destroying their equal costing NV counterparts (RX 5700 - 2060, 5700 XT - 2060S) in an NV supported title. 350$ RX 5700 = 400$ 2060S, 400$ RX 5700 XT = 500$ 2070S.



Where were you when Nvidia was destroying their equal-costing AMD counterparts in AMD-supported titles like BL3 and The Outer Worlds?


----------



## Xzibit (Nov 6, 2019)

W1zzard said:


> not sure i'm seeing it. do these two images look sharp? (direct links, without our comparison viewer)
> 
> 
> 
> ...



What's up with the guy's boot on the far right?
With RTX OFF you can see the tread on the bottom of his boot, but with RTX ON it's like motion blur was over-applied to it.

The soldier at the window: his gear is clearer with RTX OFF; switch to RTX ON and the backpack and the wraps around his waist aren't as detailed, with a soft/blurrier effect. Denoiser or post-processing effect. Same for the guy kneeling next to the door: his gear, helmet, and his uniform around his right arm and shoulder are noticeable.


----------



## W1zzard (Nov 6, 2019)

Xzibit said:


> What up with the guys boot on the far right?
> RTX OFF you can see the soldiers tread on the bottom of his boot but on RTX On its like it over applied motion blur to his boot.


That is motion blur, he moved the foot right before I took the screenshot


----------



## Xzibit (Nov 6, 2019)

W1zzard said:


> That is motion blur, he moved the foot right before I took the screenshot



Okay. I thought you were running the graphics settings on the page which had them both disabled.


----------



## W1zzard (Nov 6, 2019)

Xzibit said:


> Okay. I thought you were running the graphics settings on the page which had them both disabled.


Since this is one of my first screenshots, it's possible that I hadn't disabled it yet


----------



## Calmmo (Nov 6, 2019)

TKnockers said:


> it looks like nvidia is slowly degrading gtx 1080 ti performance...



nvidia are simply optimizing for their latest gpus. happened with the 900 series, it's gonna happen with the RTX 2000 series next year, etc etc.


----------



## 64K (Nov 6, 2019)

Most of the optimizing has already been done for Pascal.

Taking into account that Nvidia is pushing ray tracing hard with their RTX GPUs then you can expect now and in the future that for games that support ray tracing they will optimize for the RTX GPUs first and foremost.


----------



## pandemonium (Nov 6, 2019)

ShurikN said:


> Really?
> The shadows look nicer, but the entire game looks soft and blurry with RTX ON.



This is exactly what I'm seeing.  While the shadows are more realistically defined according to light sources, the entire scene is more blurred with RTX on.  I like crisp, clean details.  I very much dislike post-processing that reduces image quality (call it soften or what have you) and RTX is doing this on the entire scene.  I don't understand how the rest of you aren't seeing this.

Take a look at the teacup (go back to the review if you don't trust my image taken from it below). The drinking edge is softened with it on. Also, the print is washed out and the colors aren't as vibrant. The edges defining each part of the pattern are also softened.




It's made clear when you zoom in, but also very apparent looking at the entire scene.  There is an overall softening going on with RTX on.  Is that just the way RTX functions, by diffusing detail due to light refraction?  If that's the case, I _really _don't care for it.


----------



## londiste (Nov 6, 2019)

pandemonium said:


> Is that just the way RTX functions, by diffusing detail due to light refraction?  If that's the case, I _really _don't care for it.


It is not. RTX does not touch any of those aspects. What RTX is used for in this game is shadows: specifically, shadowing from point lights, and the generated shadows are merged with other shadows (from shadow mapping and other usual methods).


----------



## pandemonium (Nov 6, 2019)

Can we explain why RTX on looks softer on its clarity then?


----------



## londiste (Nov 6, 2019)

Motion blur on? COD:MW has a lot of effects on screen most of the time. Getting 1:1 screenshots is not trivial here.

The other suspicion I have is that the game currently seems to have (minor) issues with asset streaming and memory management. 8GB is not quite enough and from what I saw, both detailed models and detailed textures were not loaded or not fully loaded in certain situations.

Either way, RTX causing this makes no sense at all.


----------



## Theliel (Nov 6, 2019)

TKnockers said:


> it looks like nvidia is slowly degrading gtx 1080 ti performance...


Absolutely right. That's what every sane person believes. Nvidia has been degrading the performance of discontinued cards through their drivers since forever.


----------



## Vayra86 (Nov 6, 2019)

moob said:


> Motion Blur is supposedly more "realistic" as well yet it's the first option I turn off when I jump into a new game. I'd take the "false" image with the sharper details and worse shadows over the "real" image with better shadows any day. Or better yet, a combination of both with the better RT shadows with the clearer non-RT image.



This is indeed an issue with RT in games. Many situations may be 'real' but they certainly are not playable that way. It will always have to be tweaked, I think.

A few years ago we were keen on comparing games with movies, because CGI got so good and games gained graphical fidelity, mocap, and other stuff similar to what's used in film. But people seem to have forgotten that movies _get edited too._ The raw camera roll is often pretty horrible to look at on its own.

Question remains how much truly realistic RT will be left when all is said and done. That said I do like the low-key approach, noticing those finer details I think is where RT can shine, the technology needs to pick its battles, both for fidelity and performance.

And yeah... motion blur, chromatic aberration and vignetting... whoever thought those were good ideas needs a punch in the face.


----------



## Mescalamba (Nov 6, 2019)

Ray tracing may be rendered into the same resolution, but the traced picture doesn't have the same effective resolution. It's impossible right now to have real-time ray tracing that would actually give you even half of 4K.

RTX is a significantly reduced version of full-fat ray tracing, 'cause nothing can render that many rays in real time.

At full HD it might not be that striking a difference, and in two generations it will be superior to our normal rendering.

Right now it ain't there; it's just usable. If you want a pin-sharp image, go with regular rendering. Btw, you can't see as much detail as 4K in real life either. So RTX is sorta quite realistic. In being a bit fuzzy.


----------



## londiste (Nov 6, 2019)

Mescalamba said:


> RayTracing is rendered into maybe same resolution, but original picture doesnt have same resolution. Its impossible right now to have real time ray tracing that would actually give you even half of 4k.
> RTX is significantly reduced version of full fat ray tracing, cause nothing can render that much rays in real time.
> At full HD, it might not be that striking difference and in two generations it will be superior to our normal rendering.
> Right now it aint there, its just usable. If you want pin sharp image, go with regular rendering. Btw. you cant see as much details as 4k in real life either. So RTX is sorta quite realistic.  In being a bit fuzzy.


You have misunderstood how raytracing is used in COD:MW. It is not used to render the image. Image overall is still rendered the same way with and without RTX.
Raytracing is only used to render certain part of shadows that is then merged with shadows from other (more usual, rasterization) methods.
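The merging described above can be sketched as a toy. This is purely illustrative (the min-merge rule, function names, and values are assumptions, not the game's actual pipeline); the point is that RT only supplies one visibility term, while the rest of the frame renders as usual:

```python
# Conceptual sketch (assumption, not the game's real code): a ray-traced
# point-light visibility term is merged with conventional shadow-map
# visibility, so RTX replaces only one part of the shadow pipeline.
def shade(base_light, shadow_map_vis, rt_point_light_vis, rtx_on):
    # Without RTX, point-light shadowing falls back to the shadow map.
    point_vis = rt_point_light_vis if rtx_on else shadow_map_vis
    # Merge: the darkest (most occluded) term wins, a common approach.
    visibility = min(shadow_map_vis, point_vis)
    return base_light * visibility

# A pixel the shadow map misses but RT catches only darkens with RTX on:
print(shade(1.0, 1.0, 0.3, rtx_on=True))   # 0.3
print(shade(1.0, 1.0, 0.3, rtx_on=False))  # 1.0
```

Under this model, the rest of the image (textures, AA, post-processing) is identical either way, which is why blaming RTX for overall softness doesn't add up.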


----------



## moob (Nov 6, 2019)

Vayra86 said:


> And yeah... Motion blur, chromatic abberation and vignetting... whoever thought those were good ideas needs a punch in the face.


Holy hell yes. The trifecta of awful visual effects.

Though I do enjoy a good implementation of DoF. I know a lot of people hate that as well.


----------



## Assimilator (Nov 6, 2019)

Expected a bunch of idiots who haven't played the game with RTX, whining that RTX looks wrong and/or blurry in static screenshots of said game. Was not disappointed.


----------



## Xzibit (Nov 6, 2019)

pandemonium said:


> Can we explain why RTX on looks softer on its clarity then?



Remember, Metro Exodus had the same softness issue with DLSS. Later, a sharpening filter was applied in an "optimization" patch.


----------



## Fluffmeister (Nov 6, 2019)

Assimilator said:


> Expected a bunch of idiots who haven't played the game with RTX, whining that RTX looks wrong and/or blurry in static screenshots of said game. Was not disappointed.



Pretty sure the softness just boils down to the AA implementation, as Nvidia noted in their performance guide:



			
Evil but fun said:

> _Modern Warfare_ exclusively uses SMAA anti-aliasing to counteract jagged and flickery lines and edges on surfaces and moving game elements. On the low end of things is SMAA 1X, a super fast, albeit basic implementation that lacks a temporal component, meaning flickering and shimmering on moving elements can still be seen.
> 
> To tackle those unsightly blighters, you need a good dose of temporal anti-aliasing, which SMAA T2X delivers in spades. So, what's the final option, "Filmic SMAA T2X"? Simply, *it's a softer implementation* that's intended to closer mirror reality by introducing an additional post-processing pass that improves AA accuracy and further reduces the visibility of aliasing, giving you a near-perfect, aliasing-free picture,* at the cost of some sharpness*.











*Call of Duty: Modern Warfare PC Graphics and Performance Guide* (www.nvidia.com): Get the inside line on Modern Warfare’s PC-exclusive ray-traced shadows, discover which settings impact performance, see how these settings affect image quality, and get the lowdown on all of the other PC tech enhancements.

As such...



			
MEANIE nVIDIA said:

> When enabling SMAA T2X or Filmic SMAA T2X, you can configure the Anti-Aliasing Filmic Strength slider. *By default, the strength is set to max. Turning it down, detail becomes sharper*, but aliasing and temporal aliasing can creep back in, most noticeably on long-distance detail ...











				




But yes, presumably you can just apply the sharpen filter, if you're the kind of Call of Duty gamer that likes to obsessively zoom in on teacups whilst you play.


----------



## Xzibit (Nov 6, 2019)

Fluffmeister said:


> Pretty sure the softness just boils down the AA implementation, as Nvidia noted in their performance guide:
> 
> But yes, presumably you can just apply the sharpen filter, if your the kind of Call of Duty gamer that likes to obsessively zoom in to teacups whilst you play.



If that's the case, those issues should appear with both RTX On and Off, unless we are being presented with different settings under each.  *Page 3*


----------



## Fluffmeister (Nov 6, 2019)

Xzibit said:


> If thats the case those issues should appear on both the RTX On and Off unless we are being presented with different settings under each.



I guess, what is your personal experience with the game?


----------



## rtwjunkie (Nov 7, 2019)

Theliel said:


> Absolutely right. That's what every sane person believes. Nvidia have been degrading performance of discontinued cards through their drivers since history.


Just...no.  They have already had every bit of performance squeezed out with drivers.  Beyond that, when more intensive and complex games get made, older GPU families cannot help but be left behind.


----------



## wolf (Nov 7, 2019)

EarthDog said:


> Yes, power use is close with the Nv cards eeking it out for what little that is worth.
> 
> I wouldn't call a ~25% difference in power use "a bit more power hungry" (225W vs 175W). With that, it isn't winning the performance /W metric either... or am I math challenged this morning?



All of a sudden power consumption matters to AMD flag wavers now that they're competitive, when Vega/Polaris etc were much worse "nobody cares about power consumption" because of price : performance ratio etc etc. It's entertaining how fast it flips.



Theliel said:


> Absolutely right. That's what every sane person believes. Nvidia have been degrading performance of discontinued cards through their drivers since history.



The tinfoil hat is strong with this one. They don't degrade performance; total myth. They just pour much more optimization effort into the latest generation. Various tests over the years have shown that, in the same titles, performance generally improves slightly or stays the same. Let me google it for you.

To be on topic for a moment, I am very keen to try out this title as I loved the Original, excellent coverage as always *W1zzard*


----------



## EarthDog (Nov 7, 2019)

wolf said:


> All of a sudden power consumption matters to AMD flag wavers now that they're competitive, when Vega/Polaris etc were much worse "nobody cares about power consumption" because of price : performance ratio etc etc. It's entertaining how fast it flips.


AMD flag raiser???? Hahaha, good one.


----------



## Vayra86 (Nov 7, 2019)

moob said:


> Holy hell yes. The trifecta of awful visual effects.
> 
> Though I do enjoy a good implementation of DoF. I know a lot of people hate that as well.



DoF is a bit like DX9's Bloom: it used to suck monkey balls, but nowadays you barely ever have to turn it off. I used to disable it all the time, but recently I've found myself leaving it on more often.



wolf said:


> All of a sudden power consumption matters to AMD flag wavers now that they're competitive, when Vega/Polaris etc were much worse "nobody cares about power consumption" because of price : performance ratio etc etc. It's entertaining how fast it flips.



Did you just assume his gender?


----------



## Mescalamba (Nov 7, 2019)

londiste said:


> You have misunderstood how raytracing is used in COD:MW. It is not used to render the image. Image overall is still rendered the same way with and without RTX.
> Raytracing is only used to render certain part of shadows that is then merged with shadows from other (more usual, rasterization) methods.



Dunno, 'cause the output in those pics looks exactly like full RTX. Also, the shadows look different enough to make me doubt the merging with the normal ones too.


----------

