
Wolfenstein II: The New Colossus Getting Support for NVIDIA Turing's Adaptive Shading

Raevenlord

News Editor
Joined
Aug 12, 2016
Messages
3,755 (1.15/day)
Location
Portugal
System Name The Ryzening
Processor AMD Ryzen 9 5900X
Motherboard MSI X570 MAG TOMAHAWK
Cooling Lian Li Galahad 360mm AIO
Memory 32 GB G.Skill Trident Z F4-3733 (4x 8 GB)
Video Card(s) Gigabyte RTX 3070 Ti
Storage Boot: Transcend MTE220S 2TB, Kingston A2000 1TB, Seagate IronWolf Pro 14 TB
Display(s) Acer Nitro VG270UP (1440p 144 Hz IPS)
Case Lian Li O11DX Dynamic White
Audio Device(s) iFi Audio Zen DAC
Power Supply Seasonic Focus+ 750 W
Mouse Cooler Master Masterkeys Lite L
Keyboard Cooler Master Masterkeys Lite L
Software Windows 10 x64
You may remember that we covered in detail the new technologies being implemented in NVIDIA's new brainchild, Turing, back when the architecture and its whitepaper were initially announced. One of the pieces of technology we talked about back then was Content Adaptive Shading, a new technique that allows for smart trade-offs in image quality for added performance - potentially enabling higher overall rendering resolutions at a much lower performance cost.

The tech is now known simply as Adaptive Shading, and it essentially works as a post-process step that analyzes previous frames to determine the quality requirements for the next one: low-detail areas such as skies, flat walls, or even shadowed portions of objects require less shading detail, so their shading rate can be reduced from one shading operation per pixel to one per four pixels. And this new feature, which was originally showcased on MachineGames' Wolfenstein II: The New Colossus, will finally be implemented in working form in that particular game, via a patch being released on November 19th. This is the first title to make use of this technology - and hopefully, it isn't the last.
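To give a rough idea of what such a step might look like, below is a minimal, purely illustrative C++ sketch of the content-analysis side: it scans tiles of the previous frame's luma and flags low-contrast tiles for coarser shading. The tile size, contrast metric, and threshold are assumptions made for this example, not the game's or the driver's actual heuristic, and a real implementation would run on the GPU and feed the hardware's variable rate shading machinery rather than return a CPU-side vector.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

// Conceptual sketch only: derives a coarser shading rate for "flat" screen
// regions by looking at the previous frame. Tile size, contrast metric and
// threshold are illustrative assumptions, not the actual heuristic.
enum class ShadingRate : std::uint8_t {
    PerPixel,     // one shading invocation per pixel (full rate)
    Per2x2Block,  // one shading invocation per 2x2 pixels (quarter rate)
};

// Classify each tile of the previous frame's luma buffer and pick a shading
// rate for the matching screen region of the next frame.
std::vector<ShadingRate> buildShadingRateMap(const std::vector<float>& luma,
                                             int width, int height,
                                             int tileSize = 16,
                                             float contrastThreshold = 0.05f)
{
    const int tilesX = (width + tileSize - 1) / tileSize;
    const int tilesY = (height + tileSize - 1) / tileSize;
    std::vector<ShadingRate> rates(tilesX * tilesY, ShadingRate::PerPixel);

    for (int ty = 0; ty < tilesY; ++ty) {
        for (int tx = 0; tx < tilesX; ++tx) {
            const int yEnd = std::min((ty + 1) * tileSize, height);
            const int xEnd = std::min((tx + 1) * tileSize, width);
            float maxDiff = 0.0f;
            // Measure horizontal luma contrast inside the tile.
            for (int y = ty * tileSize; y < yEnd; ++y) {
                for (int x = tx * tileSize; x + 1 < xEnd; ++x) {
                    const float diff =
                        std::fabs(luma[y * width + x + 1] - luma[y * width + x]);
                    maxDiff = std::max(maxDiff, diff);
                }
            }
            // Flat regions (sky, bare walls, deep shadow) tolerate coarser shading.
            rates[ty * tilesX + tx] = (maxDiff < contrastThreshold)
                                          ? ShadingRate::Per2x2Block
                                          : ShadingRate::PerPixel;
        }
    }
    return rates;
}
```

In practice the resulting per-tile rates would end up in a screen-space shading rate image that the GPU consults while rasterizing the next frame.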



View at TechPowerUp Main Site
 
Nope. No amount of new tech will be good enough to get me to replay that POS game. Worst FPS I've played in a long time. Horrible storytelling, short campaign, and full of those "in-your-face" agendas. F**k this shit.
 
Nope. No amount of new tech will be good enough to get me to replay that POS game. Worst FPS I've played in a long time. Horrible storytelling, short campaign, and full of those "in-your-face" agendas. F**k this shit.

Not only that, I just can't see the difference at all. I may be blind. Who knows.
 
Nope. No amount of new tech will be good enough to get me to replay that POS game. Worst FPS I've played in a long time. Horrible storytelling, short campaign, and full of those "in-your-face" agendas. F**k this shit.
This game seems to be another in the "love it or hate it" category. I've seen reviews and player opinions that shower it with praise and others that scorn it at length. I actually like the game, though I admit I never got too far into it. If they add RTRT effects, I'm getting it, though.
 
This game seems to be another in the "love it or hate it" category. I've seen reviews and player opinions that shower it with praise and others that scorn it at length. I actually like the game, though I admit I never got too far into it. If they add RTRT effects, I'm getting it, though.

Lex

Would you be so kind as to run benchmarks and tests with your T3500 that is sporting your RTX 2080? I am very intrigued by it.

Plus, which RAM are you using?
 
As I said, I don't own the game ATM, but if/when I get it I'd be happy to. Although I just posted this video in another thread, it applies to your request:

Take the average of those scores and deduct about 15%, and that's what I'm generally getting at ultra settings for benchmarking. I do get better than those scores in general gameplay because I completely turn off AA and lower shadows for actual playing.
 
Not bad.

I was pleasantly surprised by the power of the Dell T3500 with the W3680 processor. Very impressed with it. Figured I should have just purchased a second one for myself, but I do love the ITX format and want to go even smaller...

Oh, the difficulty of decision making.
 
Not only that, I just can't see the difference at all. I may be blind. Who knows.
That's the idea. It's a technique to lower quality in places where you won't notice it.
I see slightly worse AA, but it could be that it's not the same frame.

New Order was a lot better.
 
Not bad.

I was pleasantly surprised by the power of the Dell T3500 with the W3680 processor. Very impressed with it. Figured I should have just purchased a second one for myself, but I do love the ITX format and want to go even smaller...

Oh, the difficulty of decision making.
For gaming with modern games like Wolf2, the X58 CPUs, especially when OC'd, are still surprisingly viable.

New Order was a lot better.
TNO was fun, but I enjoyed "The Old Blood" a bit more.
 
That's the idea. It's a technique to lower quality in places where you won't notice it.
I see slightly worse AA, but it could be that it's not the same frame.

New Order was a lot better.

OK, so it is to help improve performance then?

If that is the case, shouldn't the RTX 2070 to 2080 Ti not need to worry about this then? Why just Turing?

For gaming with modern games like Wolf2, the X58 CPUs, especially when OC'd, are still surprisingly viable.


TNO was fun, but I enjoyed "The Old Blood" a bit more.

X58 is such a good platform and has really lasted well.
 
OK, so it is to help improve performance then?

If that is the case, shouldn't the RTX 2070 to 2080 Ti not need to worry about this then? Why just Turing?
Because marketing.
And it sets the bar for when/if we get low-end Turing cards.
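For what it's worth, Wolfenstein II runs on Vulkan, and on that API the Turing hardware feature behind Adaptive Shading is exposed through NVIDIA's VK_NV_shading_rate_image device extension, which pre-Turing GeForce cards don't advertise - presumably that is what the patch builds on. A minimal sketch of how a renderer could check for it (assuming a VkPhysicalDevice is already in hand; error handling omitted):

```cpp
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

// Returns true if the GPU reports the NVIDIA variable rate shading extension.
// Pre-Turing GeForce GPUs (Pascal and earlier) do not list it.
bool supportsShadingRateImage(VkPhysicalDevice gpu)
{
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> extensions(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, extensions.data());

    for (const auto& ext : extensions) {
        if (std::strcmp(ext.extensionName, "VK_NV_shading_rate_image") == 0)
            return true;
    }
    return false;
}
```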
 
A T3500 is a fine piece of computer hardware :=)

I have one with an E5640 as a spare that I used for 14 days the last time I redid my water setup on my primary PC. I upgraded the PSU to a Corsair TX850 (fits "perfectly"), put my 1080 Ti in it, and I was flying :)
 
Nope. No amount of new tech will be good enough to get me to replay that POS game. Worst FPS I've played in a long time. Horrible storytelling, short campaign, and full of those "in-your-face" agendas. F**k this shit.
Yeah, the game's story is so damn horrible! TNO and TOB are so damn good, and yet somehow they f***ed this up REAL bad. Can't be bothered to play this anymore, or whatever games come after.

It's sad, since it got the most gains from patch to patch; AMD cards, especially Vega, got a good boost in performance, and now even the green camp gets something too.
 
Not only that, I just can't see the difference at all. I may be blind. Who knows.
This one isn't about differences, it's about lowering the burden on the GPU. The good news is you're not blind ;)
 
Yeah Turing already eats the game for breakfast, but more performance is always welcome.
 
Yeah Turing already eats the game for breakfast, but more performance is always welcome.
Plus, I expect the main point here is that adaptive shading has been added to the engine. Not entirely unexpected; these guys are always at the forefront of what's happening in computer graphics.
 
OK, so it is to help improve performance then?

If that is the case, shouldn't the RTX 2070 to 2080 Ti not need to worry about this then? Why just Turing?

It was introduced with Turing. It would be nice to see this implemented in a game that actually needs the boost tho.
 
Not only that, I just can't see the difference at all. I may be blind. Who knows.

LoL... I don't think you're supposed to easily see any difference... it's a "small" sacrifice in image quality for increased performance. o_O

"... their shading can be reduced from a per-pixel shading to four pixels per shading ratio."

So overall graphics quality should be slightly worse; you may not notice it, but it should run faster.
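As a purely hypothetical back-of-the-envelope example (the 40% coverage figure below is an assumption, not a measurement), here's how the savings add up when part of the frame drops to the coarser rate:

```cpp
#include <cstdio>

int main()
{
    const double coarseFraction = 0.40; // assumed share of the screen flat enough for the 2x2 rate
    const double perBlockCost   = 0.25; // one shading invocation covers four pixels

    // Fragment shading work relative to full per-pixel shading.
    const double relativeWork =
        (1.0 - coarseFraction) * 1.0 + coarseFraction * perBlockCost;

    std::printf("relative shading work: %.2f (about %.0f%% fewer invocations)\n",
                relativeWork, (1.0 - relativeWork) * 100.0);
    return 0;
}
```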
 
Keep spending money on devs to support useless stuff. I hope NVIDIA spends itself to death.
 
Keep spending money on devs to support useless stuff. I hope NVIDIA spends itself to death.

 