
Call of Duty and Cyberpunk 2077 Getting More NVIDIA RTX Love

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,233 (7.55/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
NVIDIA is celebrating 500 games and applications being part of the RTX ecosystem, and Activision has a large set of feature announcements to match. To begin with, Call of Duty: Warzone Season 1: Urzikstan debuts with DLSS 3 Frame Generation support on December 6th. Call of Duty: Modern Warfare III adds full ray tracing and the DLSS 3.5 Ray Reconstruction feature to all online multiplayer lobbies, also starting December 6th.

Meanwhile, CD Projekt Red is launching Cyberpunk 2077 Ultimate Edition this week, which debuts with enhanced ray tracing and DLSS support in Update 2.1 (available to both Ultimate and Standard edition players). This includes the Ray Tracing: Overdrive mode, which exits preview-feature status and further increases the number of ray-traced surfaces. On NVIDIA RTX 40-series GPUs, the mode takes advantage of the Shader Execution Reordering and Opacity Micromaps features. The game also gets a major global illumination upgrade with Reservoir-based Spatiotemporal Importance Resampling Global Illumination (ReSTIR GI). Lastly, Cyberpunk 2077 now fully implements DLSS 3.5 Ray Reconstruction on NVIDIA RTX 40-series "Ada" and 30-series "Ampere" GPUs, which vastly improves the quality of ray-traced elements, such as reflections, with supersampling enabled.
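As an aside for anyone wondering what "Reservoir-based Spatiotemporal Importance Resampling" actually refers to: the core primitive is weighted reservoir sampling, in which a stream of candidate light samples is reduced to a single kept sample chosen with probability proportional to its weight. A minimal single-sample sketch in Python (illustrative only; the `Reservoir` and `resample` names are mine, not from any NVIDIA or CDPR API, and a real renderer streams millions of these per frame on the GPU):

```python
import random

class Reservoir:
    """Streaming weighted reservoir that keeps exactly one sample.

    This is the building block of resampled importance sampling (RIS),
    which ReSTIR-style GI is built on: candidates stream in with weights,
    and the reservoir retains one sample, selected with probability
    proportional to its weight, using O(1) memory.
    """
    def __init__(self):
        self.sample = None   # the surviving candidate
        self.w_sum = 0.0     # sum of all weights seen so far
        self.count = 0       # number of candidates streamed in

    def update(self, candidate, weight, rng=random):
        self.w_sum += weight
        self.count += 1
        # Replace the kept sample with probability weight / w_sum;
        # over the whole stream this yields selection proportional
        # to each candidate's weight.
        if self.w_sum > 0 and rng.random() < weight / self.w_sum:
            self.sample = candidate

def resample(candidates, weight_fn, rng=random):
    """Pick one candidate with probability proportional to weight_fn(c)."""
    r = Reservoir()
    for c in candidates:
        r.update(c, weight_fn(c), rng)
    return r.sample, r.w_sum
```

The "spatiotemporal" part of ReSTIR comes from merging each pixel's reservoir with reservoirs from neighboring pixels and from the previous frame, which this sketch omits.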



 
Joined
Nov 27, 2023
Messages
2,329 (6.40/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (23H2)
“full ray tracing”
“online multiplayer lobbies”
Bold strategy, let’s see if it pays off for ‘em.
 

FreedomEclipse

~Technological Technocrat~
Joined
Apr 20, 2007
Messages
24,050 (3.74/day)
Location
London,UK
System Name DarnGosh Edition
Processor AMD 7800X3D
Motherboard MSI X670E GAMING PLUS
Cooling Thermalright AM5 Contact Frame + Phantom Spirit 120SE
Memory 2x32GB G.Skill Trident Z5 NEO DDR5 6000 CL32-38-38-96
Video Card(s) Asus Dual Radeon™ RX 6700 XT OC Edition
Storage WD SN770 1TB (Boot)| 2x 2TB WD SN770 (Gaming)| 2x 2TB Crucial BX500| 2x 3TB Toshiba DT01ACA300
Display(s) LG GP850-B
Case Corsair 760T (White) {1xCorsair ML120 Pro|5xML140 Pro}
Audio Device(s) Yamaha RX-V573|Speakers: JBL Control One|Auna 300-CN|Wharfedale Diamond SW150
Power Supply Seasonic Focus GX-850 80+ GOLD
Mouse Logitech G502 X
Keyboard Duckyshine Dead LED(s) III
Software Windows 11 Home
Benchmark Scores ლ(ಠ益ಠ)ლ
Going to be real honest here: I thought the article was going to be about Nvidia giving away copies of CoD with their new GPUs...

A free copy of that game isn't adding any extra value to the purchase. You probably wouldn't even be able to re-sell it because it's such a garbage title. Bundling a copy of the game with the card is like a double negative: not only are the GPUs stupidly priced, but they devalue them even further by bundling a garbage game with them.
 
Joined
Jan 8, 2017
Messages
9,436 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 MHz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
They've redefined and revolutionized ray tracing reconstruction frame generating whatever the hell so many times that I've lost count by now.
 
D

Deleted member 229121

Guest
So "gifting" features that should be standard for AAA titles...
That's gotta be the most arrogant piece of PR trash I've seen in a while.
 
Joined
Oct 18, 2019
Messages
413 (0.22/day)
Location
NYC, NY
I was fortunate that I was able to get a 3090 during Cyberpunk's release week. I replaced my 2080Ti and saw immediate framerate improvements on "Psycho mode".

Now, many years later, this game is finally ready for mainstream and even the metro-rail works with the latest patches.

I played through the game twice experiencing minor bugs - but nothing game breaking.

Easily the best-looking game, with the most immersive music, that I'd ever played.

Once the game is fully finished, I'll play through one more time on my 14900k / 4090/ 64GB DDR5.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,175 (1.27/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Gonna need to give CP2077 another boot-up to see these enhancements in play. Not holding my breath for anything insane; adding yet more reflective surfaces seems a bit lol, as the game is crazy reflective as it is already.
 
Joined
Feb 15, 2019
Messages
1,658 (0.79/day)
System Name Personal Gaming Rig
Processor Ryzen 7800X3D
Motherboard MSI X670E Carbon
Cooling MO-RA 3 420
Memory 32GB 6000MHz
Video Card(s) RTX 4090 ICHILL FROSTBITE ULTRA
Storage 4x 2TB Nvme
Display(s) Samsung G8 OLED
Case Silverstone FT04
Translation: Moar $$ pumped into the 2077 tech demo.

tbh,
I've played 2077 for one playthrough and never touched it again.
This game has gone too far toward being an Nvidia tech demo,
and they don't even bother to put in essential features like a proper New Game+ mode.
I know 2.0 is like a whole new game and the DLC looks nice,
but with my 100 hours of playtime, I can't get anything out of it because there isn't a New Game+ mode.

Nope,
you can sit right there, bottom of my Steam library.
 
Joined
Dec 25, 2020
Messages
6,744 (4.71/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Translation: Moar $$ pumped into the 2077 tech demo.

tbh,
I've played 2077 for one playthrough and never touched it again.
This game has gone too far toward being an Nvidia tech demo,
and they don't even bother to put in essential features like a proper New Game+ mode.
I know 2.0 is like a whole new game and the DLC looks nice,
but with my 100 hours of playtime, I can't get anything out of it because there isn't a New Game+ mode.

Nope,
you can sit right there, bottom of my Steam library.

Cyberpunk isn't an Nvidia tech demo. At launch it was horribly broken, and arguably a game that missed its mark (and wasn't what was initially promised; I hated it for the longest time, although I'm beginning to forgive it as it really shows how much the devs have worked on it). But as it got fixed and optimized, all it does is badly expose how crude the Radeon cards are when it comes to latest-generation graphics techniques. AMD just doesn't compete here.
 
Joined
Feb 15, 2019
Messages
1,658 (0.79/day)
System Name Personal Gaming Rig
Processor Ryzen 7800X3D
Motherboard MSI X670E Carbon
Cooling MO-RA 3 420
Memory 32GB 6000MHz
Video Card(s) RTX 4090 ICHILL FROSTBITE ULTRA
Storage 4x 2TB Nvme
Display(s) Samsung G8 OLED
Case Silverstone FT04
Cyberpunk isn't an Nvidia tech demo. At launch it was horribly broken, and arguably a game that missed its mark (and wasn't what was initially promised; I hated it for the longest time, although I'm beginning to forgive it as it really shows how much the devs have worked on it). But as it got fixed and optimized, all it does is badly expose how crude the Radeon cards are when it comes to latest-generation graphics techniques. AMD just doesn't compete here.
All it does is throw all the latest and horribly optimized visuals into one giant cappuccino.
They did it when it launched, and they're still doing it today.

It isn't an AMD vs. Nvidia thing either.

As an Nvidia user, I feel deceived seeing them keep throwing all these artificial performance barriers at us,
just to keep our latest graphics cards at the lowest FPS possible.

At the same time, they're lacking basic features like a proper New Game+ mode.

And btw, about the tech demo thing: it's almost common sense by now that 2077 is THE Nvidia tech demo.
It's too obvious to be worth arguing.
 
Joined
Dec 25, 2020
Messages
6,744 (4.71/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
All it does is throw all the latest and horribly optimized visuals into one giant cappuccino.
They did it when it launched, and they're still doing it today.

It isn't an AMD vs. Nvidia thing either.

As an Nvidia user, I feel deceived seeing them keep throwing all these artificial performance barriers at us,
just to keep our latest graphics cards at the lowest FPS possible.

At the same time, they're lacking basic features like a proper New Game+ mode.

And btw, about the tech demo thing: it's almost common sense by now that 2077 is THE Nvidia tech demo.
It's too obvious to be worth arguing.

Without getting into the merits of that, as I always considered Cyberpunk to be a bad game: there are no artificial barriers, just extremely costly graphical improvements that aren't worth it on current-generation hardware.

It's no tech demo, really. It's a game that makes extensive use of the very latest rendering techniques; that's why it performs so badly on AMD. It exemplifies the 95% rule right now.

We've reached the comfy point in photorealism vs. performance relatively recently, but further improvements will require exponentially more compute performance backed by ever more complex graphics drivers.
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,048 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850W Gold (ATX3.0)
Software W10
We've reached the comfy point in photorealism vs. performance relatively recently but further improvements will require exponentially more compute performance backed by ever more complex graphics drivers.

Hmm, that wasn't the Cyberpunk 2077 I played. I'd say there were better games out there, many using 'old-school' techniques to create immersive realism. Too many dull slab surfaces in the game and low res details to be called a comfy point in photo-realism versus performance, at least for me.

Games like RE Village are surprisingly impressive. I loved Days Gone. The major improvement in CP 2077, IMO, is, ironically, all the flashy neon lights. But overall, I don't think the game is visually any better than a lot of others from the past 5 years or so. Hell, the first game that made me scrutinise the screen was Doom 3, way back in 2004. It was visceral.

RT/PT will maybe one day be the norm, but I don't think test driving it in games is ideal. Once someone comes up with a proper and efficient hardware solution, then it would be better to see. Until then, I feel it's still very much like putting candy sprinkles on top of a cake. FWIW, I was an early defender of CP 2077 and played through with very few bugs, albeit at about 40fps on my GSync monitor.
 
Joined
Dec 25, 2020
Messages
6,744 (4.71/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Hmm, that wasn't the Cyberpunk 2077 I played. I'd say there were better games out there, many using 'old-school' techniques to create immersive realism. Too many dull slab surfaces in the game and low res details to be called a comfy point in photo-realism versus performance, at least for me.

Games like RE Village are surprisingly impressive. I loved Days Gone. The major improvement in CP 2077, IMO, is, ironically, all the flashy neon lights. But overall, I don't think the game is visually any better than a lot of others from the past 5 years or so. Hell, the first game that made me scrutinise the screen was Doom 3, way back in 2004. It was visceral.

RT/PT will maybe one day be the norm, but I don't think test driving it in games is ideal. Once someone comes up with a proper and efficient hardware solution, then it would be better to see. Until then, I feel it's still very much like putting candy sprinkles on top of a cake. FWIW, I was an early defender of CP 2077 and played through with very few bugs, albeit at about 40fps on my GSync monitor.

That's the whole idea behind the 95% rule I mentioned. Another game showing this is Alan Wake 2: it has such high system requirements, yet it doesn't really look any more photorealistic than a lot of other games we've already played. That's the mark to overcome, but the challenges in doing so are immense and, more often than not, aren't worth the investment, or worth limiting one's audience to owners of the latest-generation hardware.

Developers got really good at traditional raster graphics, even doing atmospherics that way. Supposedly, the adoption of full-blown path-traced graphics is intended to simplify the development cycle while somewhat increasing realism, albeit at a massive compute performance cost.

I don't think pushing the boundaries makes a game a tech demo, but I also recognize it's a massive outlier, especially since it receives all of the very latest rendering techniques and supports them fairly well. Perhaps that's my point.
 