
Alan Wake II System Requirements Released, Steep RT Requirements Due to Path Tracing

TheNightLynx

New Member
Joined
Oct 17, 2023
Messages
10 (0.03/day)
This is a reflection of the terrible state of PC games. Needing DLSS/FSR Performance mode to get playable framerates is ridiculous. I don't know how well a game that runs badly on most systems will sell. In fact, a PS5 game like Spider-Man 2 seems to offer very good visuals on very low-end hardware in 2023 without resorting to upscaling from a miserable resolution. DLSS/FSR at Performance mode equals a very low internal resolution.
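To put numbers on that last point, here's a back-of-the-envelope sketch (my own, not from any vendor SDK; the per-axis scale factors are the commonly published DLSS 2 / FSR 2 preset values, and Balanced differs slightly between the two):

```cpp
#include <cstdio>

int main() {
    // Assumed per-axis render scale for common upscaler presets:
    // Quality ~1/1.5, Balanced ~1/1.7, Performance 1/2.0.
    struct Preset { const char* name; double scale; };
    const Preset presets[] = {
        {"Quality", 1 / 1.5}, {"Balanced", 1 / 1.7}, {"Performance", 1 / 2.0},
    };
    const double outW = 3840, outH = 2160;  // 4K output
    for (const Preset& p : presets)
        std::printf("%-11s -> %4.0f x %4.0f internal (%.0f%% of output pixels)\n",
                    p.name, outW * p.scale, outH * p.scale,
                    100 * p.scale * p.scale);
    // Performance mode at 4K renders 1920x1080 natively -- the
    // "1080p upscale" the requirements table refers to.
    return 0;
}
```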
Consider that PC hardware and console hardware have never been so similar: same CPUs, same GPUs, and in Microsoft's case the same development environment and APIs. Imho the real problem is not the multiplatform dev style.
I think the problem is that the videogame business has become so big.
Why optimize if the title sells anyway?
Bugs? If any are annoying enough, we'll release some patches; otherwise, the bugs stay.
So we buy eternally unfinished games that need patches for years to become what they should have been on day one. And after 2-3 years of patches, even mid-range hardware can run them well.
 
Joined
Jun 29, 2023
Messages
538 (1.18/day)
Location
Spain
System Name Gungnir
Processor Ryzen 5 7600X
Motherboard ASUS TUF B650M-PLUS WIFI
Cooling Thermalright Peerless Assassin 120 SE Black
Memory 2x16GB DDR5 CL36 5600MHz
Video Card(s) XFX RX 6800XT Merc 319
Storage 1TB WD SN770 | 2TB WD Blue SATA III SSD
Display(s) 1440p 165Hz VA
Case Lian Li Lancool 215
Audio Device(s) Beyerdynamic DT 770 PRO 80Ohm
Power Supply EVGA SuperNOVA 750W 80 Plus Gold
Mouse Logitech G Pro Wireless
Keyboard Keychron V6
VR HMD The bane of my existence (Oculus Quest 2)
That's because upscaling is every gamer's dream, apparently.

Where are all the DLSS fans now?
We've woken up - or at least I have.

I also know that a good number of DLSS fans don't want to use DLSS to attain playable framerates; instead, they want to use DLSS as a way to supercharge performance, which we know is no longer the developers' intention.

Or you have people like me who realized how nice playing a game at native resolution is: free of most motion artifacts and as sharp as can be.
 
Joined
May 11, 2018
Messages
1,152 (0.49/day)
"The way it's meant to be played!"

In the era of zero or even negative price/performance increases with each new generation, there is very little demand to upgrade - unless your old card suddenly doesn't cut it even for measly 1080p at 60 Hz without upscaling.

I wonder how many "badly optimized" new games have actually been intentionally made "to be the new Crysis", without the actual generational leap in image quality that Crysis brought...
 
Joined
Jan 14, 2019
Messages
11,547 (5.55/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
... a good number of DLSS fans don't want to use DLSS to attain playable framerates; instead, they want to use DLSS as a way to supercharge performance ...
I think that's stupid, but each to their own, I guess.

Or you have people like me who realized how nice playing a game at native resolution is: free of most motion artifacts and as sharp as can be.
I realised this the very first moment I tried DLSS in Cyberpunk 2077. Whatever anybody says, upscaling will always be a last resort to get acceptable performance, in my opinion.
 
Joined
Mar 28, 2020
Messages
1,723 (1.05/day)
RAM consumption increases significantly on GPUs with little VRAM available, if this is pushed to the limit. Also, games with large, expansive environments, such as open worlds, need to load and keep information about large areas of the game in memory. This may increase RAM consumption.

So yes, sometimes there is an impact on RAM consumption, depending on the configuration and hardware.
Yes, VRAM usage goes up, especially in open-world games, but I don't believe Remedy makes open-world games. Alan Wake and Control are not open world as far as I can recall.

While it is known that Remedy tends to push graphical requirements with each game release, I cannot help but feel that they have gone way over the limit here. They are not selling a game anymore; they are selling a graphics showcase. What is the point of releasing a "beautiful game" when only a small handful of people can truly enjoy the eye candy? It is good to test the boundaries somewhat, but this is way overboard when you see 4K path tracing requiring an RTX 4080 @ DLSS Performance mode (aka a 1080p upscale) in order to play @ 60 FPS. Seriously... I am voting with my wallet and not buying this game, because it just means I'd need to shell out even more money for a better graphics card.
 
Joined
Oct 12, 2010
Messages
32 (0.01/day)
Location
Genova, Italy
System Name Too old for this
Processor Threadripper 3960X
Motherboard Asus Zenith II Extreme
Cooling EK ZIIE Monoblock
Memory 8x8gb G.Skill Trident Z Neo 3600mhz C16
Video Card(s) Asus 3090 EK
Storage Sabrent Rocket 4 1tb + Rocket 3 2tb
Display(s) LG 65G26LA + Asus PG35VQ
Case Asus Helios
Audio Device(s) Creative AE9 + Edifier Luna + Marantz Cinema 40 + Monitor Audio Silver 500
Power Supply Asus Thor 1200W
Mouse Corsair M85 Elite
Keyboard Corsair K70 Mk.2
That's because upscaling is every gamer's dream, apparently.

Where are all the DLSS fans now?

Maybe those fans don't exist, and they're actually just evangelists working for businessmen who can no longer sell a perpetual increase in resolution or FPS/Hz, so the new plan is to sell the old resolutions, upscaled, as the new future - effectively killing low- to medium-budget PC gaming.
Yesterday you played 1080p at 1080p; today you play 1080p at 4K; tomorrow you'll play 720p at 4K; the day after tomorrow, 800×600 at 8K, and so on.
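That progression is easy to quantify; a tiny sketch of my own, just dividing pixel counts, shows how small the natively rendered share becomes:

```cpp
#include <cstdio>

int main() {
    // Share of displayed pixels actually rendered natively in each
    // scenario from the post above (internal vs. output resolution).
    struct Case { const char* label; double inW, inH, outW, outH; };
    const Case cases[] = {
        {"1080p @ 1080p", 1920, 1080, 1920, 1080},
        {"1080p @ 4K",    1920, 1080, 3840, 2160},
        {"720p @ 4K",     1280,  720, 3840, 2160},
        {"800x600 @ 8K",   800,  600, 7680, 4320},
    };
    for (const Case& c : cases)
        std::printf("%-13s -> %5.1f%% of output pixels rendered\n",
                    c.label, 100.0 * (c.inW * c.inH) / (c.outW * c.outH));
    return 0;  // prints 100.0%, 25.0%, 11.1%, 1.4%
}
```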

It's so nostalgic remembering the old days when I couldn't play some games (Civ3 is the first I clearly remember) because my monitor (or video card drivers, idk) was stuck at 800×600 and the minimum required resolution was 1024×768... Now the tables are turning...
 
Joined
Apr 19, 2018
Messages
1,227 (0.52/day)
Processor AMD Ryzen 9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Arctic Liquid Freezer II 420
Memory 32Gb G-Skill Trident Z Neo @3806MHz C14
Video Card(s) MSI GeForce RTX2070
Storage Seagate FireCuda 530 1TB
Display(s) Samsung G9 49" Curved Ultrawide
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G502
Keyboard Cherry MX
Software Windows 11
And everyone here was bashing me a week ago because I said the RTX 40-series is not powerful enough for the latest games, and a "Super" variant is a waste of money for just 10% more perf. Not so great when you're only getting 4K at 20 FPS - but now you can have 4K at 23 FPS for just $1,200!

The RTX 40-series is soooo over!
 
Joined
Jan 14, 2019
Messages
11,547 (5.55/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
And everyone here was bashing me a week ago because I said the RTX 40-series is not powerful enough for the latest games, and a "Super" variant is a waste of money for just 10% more perf. Not so great when you're only getting 4K at 20 FPS - but now you can have 4K at 23 FPS for just $1,200!

The RTX 40-series is soooo over!
It's equally ironic to get bashed for saying that "DLSS is not a QoL feature" and then look at all the booing Remedy gets for requiring DLSS for AW2 to run. I mean, whatever happened to DLSS being a QoL feature? If it really is one, then having to enable it for decent framerates isn't a problem, surely? :rolleyes:
 
Joined
Apr 19, 2018
Messages
1,227 (0.52/day)
Processor AMD Ryzen 9 5950X
Motherboard Asus ROG Crosshair VIII Hero WiFi
Cooling Arctic Liquid Freezer II 420
Memory 32Gb G-Skill Trident Z Neo @3806MHz C14
Video Card(s) MSI GeForce RTX2070
Storage Seagate FireCuda 530 1TB
Display(s) Samsung G9 49" Curved Ultrawide
Case Cooler Master Cosmos
Audio Device(s) O2 USB Headphone AMP
Power Supply Corsair HX850i
Mouse Logitech G502
Keyboard Cherry MX
Software Windows 11
It's equally ironic to get bashed for saying that "DLSS is not a QoL feature" and then look at all the booing Remedy gets for requiring DLSS for AW2 to run. I mean, whatever happened to DLSS being a QoL feature? If it really is one, then having to enable it for decent framerates isn't a problem, surely? :rolleyes:
DLSS looks awful (compared to native) on a large 4K display. I also consider it cheat marketing: at best, it should only be acceptable for a 4070 or lower card to require upscaling to reach 1440p 60 FPS; a high-end gaming card should not require DLSS to meet performance expectations. If I wanted to play at 1080p 60 FPS, then that's the kind of monitor I would have.

I can't wait for the 50-series to launch. Let's hope Nvidia doesn't hike the prices again, or hobble the performance of everything below the 90-class card, like they did to the 4080 this time.
 
Joined
Jul 5, 2013
Messages
26,646 (6.50/day)
It seems to be wrong information; the gap between the 3060 and 3070 is big... and where is the 3080, or 1440p at 60 FPS? Yet they prefer 4K?
I have to agree; that graph seems a bit off.
Nothing here makes sense; no wonder people in r/nvidia are roasting them to hell.
This is not Nvidia's problem, it's the folks at Remedy. Besides, Redditors are over-whiny to start with anyway.

View attachment 318395

I hate this, can we stop doing this
Yes, they really should stop doing that.
 
Joined
Apr 14, 2022
Messages
724 (0.81/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Acer Nitro XV271UM3B IPS 180Hz
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Keyboard Asus ROG Falchion
Software Windows 11 64bit
Obviously most of you do not realize how games are developed, and that improving one thing at the scale of a whole game environment can require multiple times the performance of a GPU.
Remedy has always pushed the boundaries in graphics, and right now the only way to do this is by using NVIDIA's tech.

GPU performance increases by 20-40% every generation, while just one reflection at slightly higher resolution may require 12 times the performance of the fastest GPU available. Just for a reflection - no real-time shadows, no global illumination, no ambient occlusion, no complete path-traced rendering.
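Taking that 20-40% per-generation figure at face value, a quick compound-growth check (my own arithmetic, with 12x as the hypothetical target from above) shows how many generations such a jump represents:

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // How many GPU generations of compound gains until raw performance
    // grows by the hypothetical 12x a higher-res reflection would need?
    const double target = 12.0;
    for (double perGen : {0.20, 0.30, 0.40}) {
        double gens = std::log(target) / std::log(1.0 + perGen);
        std::printf("%2.0f%% per gen -> %4.1f generations to reach %gx\n",
                    perGen * 100, gens, target);
    }
    return 0;  // prints ~13.6, ~9.5, and ~7.4 generations
}
```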

So, yes, it's completely justified that a 4080 runs this game at 1080p upscaled to 4K, if you understand what has to be calculated in the background.
Am I happy with it? No! But we cannot beat physics and go from 6 nm to 0.000003 nm, or improve IPC by 300K%, in one or two gens.

To end: I'm happy when companies push the boundaries and take advantage of the best available hardware, and I'm OK if I cannot play it right away. But gamers have to read, think, and understand why this happens before posting complaints about requirements, resolutions, etc.

Alan Wake 2 is one of the first games to require DX12 Ultimate's Mesh Shaders (dsogaming.com)

In short, Alan Wake 2 will be one of the first games to take full advantage of DX12 Ultimate.

Ironically, a lot of PC gamers have been wondering when they'd see a game that supports Mesh Shaders. And now that a game requires them, the exact same people are crucifying Remedy.

For what it's worth, Remedy has constantly pushed the graphical boundaries of PC games. Remember Quantum Break? That game came out in 2016, and it took four years until it was playable at 4K. CONTROL was also one of the best-looking games of its time. And now Alan Wake 2 will have path tracing effects.

Again, it's really funny witnessing PC gamers constantly asking for a new "Crysis" game. And when a "Crysis" game does appear on the horizon, those same folks start calling it an "unoptimized mess". Here is a fun fact: Crysis WAS unoptimized when it came out (due to its awful CPU utilization, as it was largely single-threaded).
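On the mesh shader requirement specifically, here is a minimal sketch (assuming the stock D3D12 API; this is not Remedy's actual code) of the capability check an engine can run to see whether a GPU supports mesh shaders at all. Cards reporting no tier (roughly pre-Turing GeForce and pre-RDNA2 Radeon) simply cannot run a renderer built on them:

```cpp
// Build on Windows with a recent SDK: cl /EHsc meshcheck.cpp d3d12.lib
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    // Default adapter; the feature level alone does not guarantee mesh shaders.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_1,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No suitable D3D12 device found.");
        return 1;
    }
    // Mesh shader support is reported via the OPTIONS7 feature struct.
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    const bool meshShaders =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &opts7, sizeof(opts7))) &&
        opts7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1;
    std::printf("Mesh shaders: %s\n", meshShaders ? "supported" : "NOT supported");
    return meshShaders ? 0 : 1;
}
```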
 
Joined
Aug 21, 2013
Messages
1,873 (0.46/day)
Obviously most of you do not realize how games are developed, and that improving one thing at the scale of a whole game environment can require multiple times the performance of a GPU.
Remedy has always pushed the boundaries in graphics, and right now the only way to do this is by using NVIDIA's tech.

GPU performance increases by 20-40% every generation, while just one reflection at slightly higher resolution may require 12 times the performance of the fastest GPU available. Just for a reflection - no real-time shadows, no global illumination, no ambient occlusion, no complete path-traced rendering.

So, yes, it's completely justified that a 4080 runs this game at 1080p upscaled to 4K, if you understand what has to be calculated in the background.
Am I happy with it? No! But we cannot beat physics and go from 6 nm to 0.000003 nm, or improve IPC by 300K%, in one or two gens.

To end: I'm happy when companies push the boundaries and take advantage of the best available hardware, and I'm OK if I cannot play it right away. But gamers have to read, think, and understand why this happens before posting complaints about requirements, resolutions, etc.

Alan Wake 2 is one of the first games to require DX12 Ultimate's Mesh Shaders (dsogaming.com)
Crysis also looked like nothing else back in the day, with proper physics and amazing graphics.

These days, games get released with absurd requirements, but when you look at the graphics they offer, you're left scratching your head as to why the requirements are so high when the visuals are nowhere near the wow factor Crysis delivered back in the day.
 
Joined
Apr 14, 2022
Messages
724 (0.81/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Acer Nitro XV271UM3B IPS 180Hz
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Keyboard Asus ROG Falchion
Software Windows 11 64bit
Crysis also looked like nothing else back in the day, with proper physics and amazing graphics.

These days, games get released with absurd requirements, but when you look at the graphics they offer, you're left scratching your head as to why the requirements are so high when the visuals are nowhere near the wow factor Crysis delivered back in the day.
I think only the UE games look like shxt (not all, obviously), because the developers, and even Epic, don't have a clue how to make the engine work consistently well.
Most in-house game engines deliver exceptional graphics: the A Plague Tale series, Cyberpunk, all the Remedy games, etc.
Even the plasticky graphics from Sony's engines (PS) are OK, and better than any Crysis.

The thing is that games were far simpler back then: a bump in texture resolution, cube maps or screen-space reflections, or slightly more detailed models, and they were transformed. We are past that. That's why raster graphics are dead: we have accomplished the mission of having extremely good models and textures. Lighting has always been the problem, and although we knew the solution, we didn't have the hardware to run it. After decades, we can run RT or PT games. Yes, at 1080p, and that's a success.

If gamers don't like that, we can go back, revive the MXs, and stop using pixel shaders in games altogether.
 
Joined
Aug 21, 2013
Messages
1,873 (0.46/day)
I think only the UE games look like shxt (not all, obviously), because the developers, and even Epic, don't have a clue how to make the engine work consistently well.
UE5 graphics look great in demos and showcases, but we have very few UE5 games to really compare.
Most in-house game engines deliver exceptional graphics: the A Plague Tale series, Cyberpunk, all the Remedy games, etc.
The strength of in-house engines is that they're best suited to the task - not their graphics.
Even the plasticky graphics from Sony's engines (PS) are OK, and better than any Crysis.
Comparing to a game that came out in 2007 is not much of a benchmark to pass.
That's why raster graphics are dead: we have accomplished the mission of having extremely good models and textures.
Raster is not dead and won't be for a long time. Even the games you bring up as examples here are still hybrids of raster and RT, with some PT experiments.

Also, this works both ways. Since raster has been perfected so much over the years, people can't always tell the difference, or they may actually prefer raster to more accurate methods. For example, RT/PT often makes a scene darker. Yes, it may be more realistic, but people are not always playing games for realism.
Lighting has always been the problem, and although we knew the solution, we didn't have the hardware to run it.
Lighting is the least of the problems that impact the realism of games. Animations and physics matter much more than accurate reflections or light bounces.
What good is a PT game when a dumb, robotic NPC stumbles onto the scene and opens its mouth with bad lip syncing? Just like that, the "magic" is gone.
Or you throw a grenade at a coffee cup, and all that happens is a small black stain on an intact coffee cup.
If gamers don't like that, we can go back, revive the MXs, and stop using pixel shaders in games altogether.
That's a BS argument. So if I don't like games running at 720p 30 FPS on a reasonably priced midrange card, I should go back to the pre-pixel-shader era?
How about I go "back" to native 4K 240 FPS, with a barely noticeable difference in some graphical elements (like shadows) but with much higher resolution and framerate?

Yes, I get that eventually games will incorporate more RT and PT, but that's a long way off, even with various upscaling and frame generation hacks to get them running at reasonable framerates. Also, the hardware capable of doing this needs to come down in price, not go up like Nvidia's (and AMD's too) has. That is the ONLY way things go mainstream. Closed ecosystems and higher prices are exactly what killed VR; yes, it's still there, but it's not mainstream.
The same will happen to RT and PT if things continue as they are.
 
Joined
Oct 16, 2023
Messages
143 (0.41/day)
System Name -
Processor 5800x3D
Motherboard AsRock Extreme 4
Cooling Air
Memory 32GB
Video Card(s) 7900 XT
Storage 7.5 TB of SSD storage
Display(s) BenQ XL2746S DyaC+ 240hz . Phillips Momentum 1440P 165hz
Case 4000D AF modded.
Audio Device(s) External Fio K3
Power Supply 850W
Mouse XTRFY MZ1
Keyboard Mechanical
VR HMD -
Software Linux Nobara. Windows 11.
Benchmark Scores A few points.
Obviously most of you do not realize how games are developed, and that improving one thing at the scale of a whole game environment can require multiple times the performance of a GPU.
Remedy has always pushed the boundaries in graphics, and right now the only way to do this is by using NVIDIA's tech.

GPU performance increases by 20-40% every generation, while just one reflection at slightly higher resolution may require 12 times the performance of the fastest GPU available. Just for a reflection - no real-time shadows, no global illumination, no ambient occlusion, no complete path-traced rendering.

So, yes, it's completely justified that a 4080 runs this game at 1080p upscaled to 4K, if you understand what has to be calculated in the background.
Am I happy with it? No! But we cannot beat physics and go from 6 nm to 0.000003 nm, or improve IPC by 300K%, in one or two gens.

To end: I'm happy when companies push the boundaries and take advantage of the best available hardware, and I'm OK if I cannot play it right away. But gamers have to read, think, and understand why this happens before posting complaints about requirements, resolutions, etc.

Alan Wake 2 is one of the first games to require DX12 Ultimate's Mesh Shaders (dsogaming.com)
Bro... this is how FM ran on recommended hardware for ultra RT + 4K - at 1080p...

An update later, and it's still jank, dropping to 40 FPS at 1080p, let alone 4K.

Then look at the visuals and tell us... is it truly worlds apart visually, enough to justify this kind of bad performance on top-performing hardware?
 
Joined
Sep 25, 2023
Messages
137 (0.37/day)
Location
Finland
System Name RayneOSX
Processor Ryzen 9 5900x 185w
Motherboard Asus Strix x570-E
Cooling EKWB AIO 360 RGB (9 ekwb vardar RGB fans)
Memory Gskill Trident Z neo CL18 3600mhz 64GB
Video Card(s) MSI RTX 3080 Gaming Z Trio 10G LHR
Storage Wayy too many to list here
Display(s) Samsung Odyssey G5 1440p 144hz 27 x2 / Samsung Odyssey CRG5 1080p 144hz 24
Case LianLi 011D white w/ Vertical GPU
Audio Device(s) Sound Blaster Z
Power Supply Corsair RM1000x (2021)
Mouse Logitech G502 Hyperion Fury
Keyboard Ducky One 2 mini Cherry MX silent
VR HMD Quest 2
Software Win10 Pro / Ubuntu 20.04
Benchmark Scores Timespy 17 219 https://www.3dmark.com/spy/49036100
I have to agree; that graph seems a bit off.
Time will tell, tbh. Once it's out, the benchmarks will drop, and if it's a badly optimized game... they will get bombed one way or another.

This is not Nvidia's problem, it's the folks at Remedy. Besides, Redditors are over-whiny to start with anyway.
Oh no, totally, it's not an Nvidia issue; I didn't mean to make it seem that way. People often discuss new game releases and many other things in r/nvidia. It seems to be a good place for debate most of the time, compared to other tech subs.
 
Joined
Jul 13, 2016
Messages
3,180 (1.06/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
This is not Nvidia's problem, it's the folks at Remedy. Besides, Redditors are over-whiny to start with anyway.

Can anyone say conclusively that Nvidia sponsoring this game or other titles isn't pushing requirements up? Game sponsorship in general is bad for the industry. Both Nvidia and AMD use it as a tool to control the performance and features of games. AMD was very likely limiting DLSS in Starfield prior to the blowback, and Nvidia is without a doubt using sponsorships to manipulate games in its favor and to increase sales. Nvidia has gotten Gigabyte, ASUS, and MSI to limit their top SKUs to Nvidia-only cards. I'm willing to bet they could easily get game devs to target 30 FPS for the base game unless people enable DLSS. Now that sells cards.
 
Joined
Sep 17, 2014
Messages
21,997 (6.01/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
RAM consumption increases significantly on GPUs with little VRAM available, if this is pushed to the limit. Also, games with large, expansive environments, such as open worlds, need to load and keep information about large areas of the game in memory. This may increase RAM consumption.

So yes, sometimes there is an impact on RAM consumption, depending on the configuration and hardware.
Sure, but how do you capture that in a spec sheet that also specifies the amount of VRAM you need? Go under it and you get shit perf. Done.

Fact is, you can make a case for more RAM any time of day, but you simply don't need it if you have a well-balanced system. For gaming, the norm seems to be 16 GB, and 24 or 32 won't help you. And from personal experience... the only games that chew RAM are city sims that you take far into the late game. That's an outlier.

Can anyone say conclusively that Nvidia sponsoring this game or other titles isn't pushing requirements up? Game sponsorship in general is bad for the industry. Both Nvidia and AMD use it as a tool to control the performance and features of games. AMD was very likely limiting DLSS in Starfield prior to the blowback, and Nvidia is without a doubt using sponsorships to manipulate games in its favor and to increase sales. Nvidia has gotten Gigabyte, ASUS, and MSI to limit their top SKUs to Nvidia-only cards. I'm willing to bet they could easily get game devs to target 30 FPS for the base game unless people enable DLSS. Now that sells cards.
Neh... not sure I totally agree there. It does feed dev budgets. It has also pushed ahead some features we later got everywhere. And at the same time, it's a bit like a purchased review.

Also, game sponsorship is inherently developers and GPU manufacturers having 'the dialogue', which I think is great for an industry. The world revolves around money, and money opens doors.

Yes, I get that eventually games will incorporate more RT and PT, but that's a long way off, even with various upscaling and frame generation hacks to get them running at reasonable framerates. Also, the hardware capable of doing this needs to come down in price, not go up like Nvidia's (and AMD's too) has. That is the ONLY way things go mainstream. Closed ecosystems and higher prices are exactly what killed VR; yes, it's still there, but it's not mainstream.
The same will happen to RT and PT if things continue as they are.
Yep... as above... the world revolves around money. RT is clearly a corporate push to extract more of it. And look where Nvidia is right now, money-wise. Look where gaming is, in tandem. Not looking good. Can't last.

But this has been my line of thinking ever since RT was announced and AMD responded to it: I think AMD's strategy is sound, and I think Nvidia's is a gamble. Since that moment, I've only been reinforced in my stance. AMD is still selling console GPUs, they created a chiplet GPU, and they've got a stable driver environment going, while largely keeping pace with Nvidia's performance movement. In the meantime, they're expanding on that technology lead by fusing CPU and GPU together on chips with much better yields. This is as steady as it gets. RT? Whatever. RDNA3 runs it too, but doesn't rely on it to sell.

"The way it's meant to be played!"

In the era of zero or even negative price / performance increases with new generation there is verry little demand for upgrade - unless your old card suddenly doesn't cut it even for measly 1080p at 60 Hz without upscaling.

I wonder how many" badly optimized" new games have actually been intentionally made "to be the new Crysis", without the actual visual generational leap in image quality Crysis brought...
Yeah... put this nonsense next to what Crysis was in terms of advancements. Both 1 and 3. Its hilarious. They still haven't really, convincingly surpassed it, have they. Sure, resolution go up. But what else? FPS go down :p
 
Joined
Apr 14, 2022
Messages
724 (0.81/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Acer Nitro XV271UM3B IPS 180Hz
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Keyboard Asus ROG Falchion
Software Windows 11 64bit
UE5 graphics look great in demos and showcases, but we have very few UE5 games to really compare.

The strength of in-house engines is that they're best suited to the task - not their graphics.

Comparing to a game that came out in 2007 is not much of a benchmark to pass.

Raster is not dead and won't be for a long time. Even the games you bring up as examples here are still hybrids of raster and RT, with some PT experiments.

Also, this works both ways. Since raster has been perfected so much over the years, people can't always tell the difference, or they may actually prefer raster to more accurate methods. For example, RT/PT often makes a scene darker. Yes, it may be more realistic, but people are not always playing games for realism.

Lighting is the least of the problems that impact the realism of games. Animations and physics matter much more than accurate reflections or light bounces.
What good is a PT game when a dumb, robotic NPC stumbles onto the scene and opens its mouth with bad lip syncing? Just like that, the "magic" is gone.
Or you throw a grenade at a coffee cup, and all that happens is a small black stain on an intact coffee cup.

That's a BS argument. So if I don't like games running at 720p 30 FPS on a reasonably priced midrange card, I should go back to the pre-pixel-shader era?
How about I go "back" to native 4K 240 FPS, with a barely noticeable difference in some graphical elements (like shadows) but with much higher resolution and framerate?

Yes, I get that eventually games will incorporate more RT and PT, but that's a long way off, even with various upscaling and frame generation hacks to get them running at reasonable framerates. Also, the hardware capable of doing this needs to come down in price, not go up like Nvidia's (and AMD's too) has. That is the ONLY way things go mainstream. Closed ecosystems and higher prices are exactly what killed VR; yes, it's still there, but it's not mainstream.
The same will happen to RT and PT if things continue as they are.

I was talking about UE4, not 5. Although the engine is capable of some great results, it's extremely inconsistent. I don't blame it that much, though; UE4 is an old engine, and asking it to include RT, PT, or whatever 2023 tech is already too much.

I agree, the in-house engines are best suited to the task, but the result is the graphics.
Like in A Plague Tale: you couldn't get millions of rats running around in a general-purpose game engine. You have to reinvent part of the engine, or develop a new one from scratch, so it can handle it.

The fact that raster has been perfected does not mean we stay with prebaked lighting. We advance, and we want more dynamic elements: first the lighting, then the physics, etc.

The fact that we have hybrid RT/PT games shows how difficult it is to calculate these things in real time, and that's how the requirements are justified. As long as there is a normal setting in the game, so gamers can actually play it, I'm fine with the extreme requirements for the RT/PT level of settings.
 
Joined
Jul 13, 2016
Messages
3,180 (1.06/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Neh... not sure I totally agree there. It does feed dev budgets.

No one knows the exact amount a sponsorship contributes to a game's budget. One thing is for sure, though: the larger the contribution a sponsorship makes to a game, the more those devs are beholden to that money. It does feed the dev budget, but at the cost of features being tailored towards what AMD/Nvidia want and not what the devs/gamers want. It doesn't take a leap of faith to see that AMD/Nvidia will tailor features to sell cards.

It has also pushed ahead some features we later got everywhere. And at the same time, it's a bit like a purchased review.

You can't make this statement definitively; it's impossible to tell whether a feature would or would not have been added sans sponsorship. The industry has innovated the vast majority of game features without the need for sponsorships. There are far more instances of sponsorships leading to low-performance or low-quality features compared to what the industry is already capable of: FXAA, HairWorks, and PhysX, among others. PhysX worked perfectly fine via a CPU code path until Nvidia nuked it and sponsor-forced the neutered version into a bunch of games.

Also, game sponsorship is inherently developers and GPU manufacturers having 'the dialogue', which I think is great for an industry. The world revolves around money, and money opens doors.

I don't see how Nvidia and AMD telling devs what to implement is better dialogue than devs plainly speaking to either company. It's not like that communication can only happen over contracts; both companies have engineers dedicated to game optimization for their GPUs.
 
Joined
Jun 14, 2020
Messages
3,275 (2.09/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
No one knows the exact amount a sponsorship contributes to a game's budget. One thing is for sure, though: the larger the contribution a sponsorship makes to a game, the more those devs are beholden to that money. It does feed the dev budget, but at the cost of features being tailored towards what AMD/Nvidia want and not what the devs/gamers want. It doesn't take a leap of faith to see that AMD/Nvidia will tailor features to sell cards.

You can't make this statement definitively; it's impossible to tell whether a feature would or would not have been added sans sponsorship. The industry has innovated the vast majority of game features without the need for sponsorships. There are far more instances of sponsorships leading to low-performance or low-quality features compared to what the industry is already capable of: FXAA, HairWorks, and PhysX, among others. PhysX worked perfectly fine via a CPU code path until Nvidia nuked it and sponsor-forced the neutered version into a bunch of games.

I don't see how Nvidia and AMD telling devs what to implement is better dialogue than devs plainly speaking to either company. It's not like that communication can only happen over contracts; both companies have engineers dedicated to game optimization for their GPUs.
Can you give me some examples of Nvidia sponsorship harming a game? Please don't go 10 years back; something recent.

AFAIK, all Nvidia-sponsored games run pretty great on competitors' cards; they are well optimized, they look good, they support competitors' features, etc. None of that can be said about AMD-sponsored games like Godfall, Forspoken, Jedi, Immortals, etc.
 