
Several AMD RDNA 4 Architecture Ray Tracing Hardware Features Leaked

Joined
Apr 15, 2020
Messages
409 (0.24/day)
System Name Old friend
Processor 3550 Ivy Bridge x 39.0 Multiplier
Memory 2x8GB 2400 RipjawsX
Video Card(s) 1070 Gaming X
Storage BX100 500GB
Display(s) 27" QHD VA Curved @120Hz
Power Supply Platinum 650W
Mouse Light² 200
Keyboard G610 Red
Joined
Jun 14, 2020
Messages
3,457 (2.13/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
In 2023, Nvidia had a $7.4 billion R&D budget, almost the entirety of which was spent on GPU development.

In 2023, AMD had a $5.8 billion R&D budget that was primarily spent on x86 development, as that is by far their largest revenue stream.

Now, I want everybody in these comments who just assumes that AMD should be able to match Nvidia to explain to me how that is to be accomplished. Because the way 99% of you speak about it, you act like these two companies are on a level playing field, that they have access to the same sort of resources, and that for AMD it's just a problem of "not pricing videocards cheap enough", while completely ignoring the fact that the current capitalist paradigm is stock price above all and quarterly earnings above all.... Tell me how AMD is supposed to go out there, undercut Nvidia at each tier by $150+, and still keep stock prices up and investors happy while quarterly profits decrease.... PLEASE explain that to me. If I remember correctly, Intel sold Alder Lake at a diminished profit margin; how has that worked out for them? Oh that's right, AMD surpassed them in value.

The other lot of you act like it's merely a willpower problem, that AMD just doesn't "want it bad enough". Well, please explain to me why AMD should be focusing on videocards when they make the overwhelming majority of their money from x86?
If AMD spends a fraction of Nvidia's R&D, shouldn't their cards also cost a fraction of the price?
 
Joined
Jan 8, 2017
Messages
9,436 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
I want everybody in these comments who just assumes that AMD should be able to match Nvidia to explain to me
AMD can always just build a huge chip on the smallest node available, like Nvidia does all the time; it's not that difficult. The issue is that people still wouldn't buy it in the same quantity that they would an equivalent Nvidia GPU, so it simply does not make sense from a business standpoint.

Those are the averages.
Right because picking the extremes is more realistic :roll:

As usual it's from bad to worse with you every time you post something.
 
Joined
Jun 14, 2020
Messages
3,457 (2.13/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
AMD can always just build a huge chip like Nvidia does all the time; it's not that difficult. The issue is that people still wouldn't buy it in the same quantity that they would an equivalent Nvidia GPU, so it simply does not make sense from a business standpoint.


Right because picking the extremes is more realistic :roll:

As usual it's from bad to worse with you every time you post something.
If you want a 4080 Super, you can get it for $999. If you want an XTX, you can also get it for $999. You lied and suggested otherwise; let's get over it.
 
Joined
Jan 5, 2006
Messages
18,584 (2.69/day)
System Name AlderLake
Processor Intel i7 12700K P-Cores @ 5Ghz
Motherboard Gigabyte Z690 Aorus Master
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut Extreme + 5 case fans
Memory 32GB DDR5 Corsair Dominator Platinum RGB 6000MT/s CL36
Video Card(s) MSI RTX 2070 Super Gaming X Trio
Storage Samsung 980 Pro 1TB + 970 Evo 500GB + 850 Pro 512GB + 860 Evo 1TB x2
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p
Case Be quiet! Silent Base 600 - Window
Audio Device(s) Panasonic SA-PMX94 / Realtek onboard + B&O speaker system / Harman Kardon Go + Play / Logitech G533
Power Supply Seasonic Focus Plus Gold 750W
Mouse Logitech MX Anywhere 2 Laser wireless
Keyboard RAPOO E9270P Black 5GHz wireless
Software Windows 11
Benchmark Scores Cinebench R23 (Single Core) 1936 @ stock Cinebench R23 (Multi Core) 23006 @ stock
Joined
Apr 15, 2020
Messages
409 (0.24/day)
System Name Old friend
Processor 3550 Ivy Bridge x 39.0 Multiplier
Memory 2x8GB 2400 RipjawsX
Video Card(s) 1070 Gaming X
Storage BX100 500GB
Display(s) 27" QHD VA Curved @120Hz
Power Supply Platinum 650W
Mouse Light² 200
Keyboard G610 Red
Will most likely do an Arrow Lake platform upgrade at the end of this year, though.
Wait for BTL (to be released in Q3 2025).

With just a BIOS update you could get a performance uplift that matches the 9800X3D, so don't make the upgrade to the ARL platform just yet (a new LGA1851 motherboard would also be required).
 
Joined
Mar 21, 2016
Messages
2,508 (0.79/day)
RT boils down to where you place the bar of expectations. I'm far more excited about AI. It's more readily usable, at least parts of it. That said, it's contingent on developers leveraging it in the right ways. I can foresee a lot of cool stuff coming down the pike with AI and games. It'll be fascinating to see what triple-A developers do with AI in the coming years, because it's absolutely on the horizon; it's much too powerful a tool not to be inserted and used creatively. Like you can already run Python scripts for Minecraft, as one example, which should send your mind and imagination into a tailspin if you grasp that.
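As a concrete illustration of the Minecraft point, here's a minimal sketch in Python using the mcpi library; it assumes Minecraft Pi Edition, or a Java server running the RaspberryJuice plugin, exposing the scripting API on its default port (4711):

```python
# Minimal sketch: scripting Minecraft from Python via the mcpi library.
# Assumes Minecraft Pi Edition or a Java server with the RaspberryJuice plugin
# listening on the default API port (4711).
from mcpi.minecraft import Minecraft
from mcpi import block

mc = Minecraft.create()                 # connects to localhost:4711 by default
mc.postToChat("Hello from Python!")     # message appears in the in-game chat

pos = mc.player.getTilePos()            # player's current block coordinates
# Build a small 5x5 stone platform directly under the player
for dx in range(-2, 3):
    for dz in range(-2, 3):
        mc.setBlock(pos.x + dx, pos.y - 1, pos.z + dz, block.STONE.id)
```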
 
Joined
May 31, 2016
Messages
4,437 (1.43/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Still too early for RT in my opinion, but let the companies compete.
For me, the level of RT performance is still not enough on the mid-range cards to consider it an improvement worth paying so much for,
because it does not add as much to image quality and the gameplay itself as it takes away in FPS.
I will watch closely how it goes, though.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,171 (1.27/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
As usual it's from bad to worse with you every time you post something.
Oh the irony, I damn near spat out my drink :roll:
RT boils down to where you place the bar of expectations
I think this is key. Some are happy to adopt early and play with bleeding edge features, some want it to be thoroughly normalised and be able to buy much lower end hardware and still enjoy a modern feature set. Neither side of that coin should make blanket statements as if it should or does apply to everyone.
 
Joined
Apr 15, 2020
Messages
409 (0.24/day)
System Name Old friend
Processor 3550 Ivy Bridge x 39.0 Multiplier
Memory 2x8GB 2400 RipjawsX
Video Card(s) 1070 Gaming X
Storage BX100 500GB
Display(s) 27" QHD VA Curved @120Hz
Power Supply Platinum 650W
Mouse Light² 200
Keyboard G610 Red
For me, the level of RT performance is still not enough on the mid-range cards to consider it an improvement worth paying so much for,
because it does not add as much to image quality and the gameplay itself as it takes away in FPS.
The same.
 
Joined
Jun 27, 2019
Messages
2,109 (1.07/day)
Location
Hungary
System Name I don't name my systems.
Processor i5-12600KF 'stock power limits/-115mV undervolt+contact frame'
Motherboard Asus Prime B660-PLUS D4
Cooling ID-Cooling SE 224 XT ARGB V3 'CPU', 4x Be Quiet! Light Wings + 2x Arctic P12 black case fans.
Memory 4x8GB G.SKILL Ripjaws V DDR4 3200MHz
Video Card(s) Asus TuF V2 RTX 3060 Ti @1920 MHz Core/@950mV Undervolt
Storage 4 TB WD Red, 1 TB Silicon Power A55 Sata, 1 TB Kingston A2000 NVMe, 256 GB Adata Spectrix s40g NVMe
Display(s) 29" 2560x1080 75Hz / LG 29WK600-W
Case Be Quiet! Pure Base 500 FX Black
Audio Device(s) Onboard + Hama uRage SoundZ 900+USB DAC
Power Supply Seasonic CORE GM 500W 80+ Gold
Mouse Canyon Puncher GM-20
Keyboard SPC Gear GK630K Tournament 'Kailh Brown'
Software Windows 10 Pro
You definitely don't need a 4090 to play RT games. It entirely depends on your resolution. I'm playing with RT on my 3060 Ti, for example in Hogwarts.

DLSS only drops quality for AMD users; whoever has an Nvidia card prefers it over native. Heck, I activate it even in games that can easily run native.
Same here with a 3060 Ti, I've played and finished multiple games with RT on and it was perfectly enjoyable for me with DLSS on Quality.
Control, Cyberpunk with tweaked settings and some RT stuff on Ultra, Ghostwire Tokyo, and some other smaller RT games.

Also the same with DLSS in general: I enable it even when I don't need the extra performance because, if nothing else, it at least fixes the flickering issues and gets rid of the crappy TAA, which looks worse to me in most games.
 
Joined
Jan 8, 2017
Messages
9,436 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Oh the irony
Irony is when you post factually correct information and the average TPU discourse still never goes above double-digit-IQ takes. Keeping it classy.
 
Joined
Feb 20, 2019
Messages
8,280 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
AMD lost a generation by not improving RT performance with the 7000 series. Who knows, maybe they didn't have enough time and resources to do so back then.
As a counterpoint, how many games on a mid-range RTX card are actually better with RT?

I've played several RTX showcase titles on a 3090 and a 4070S, and for the most part they look worse. Sure, in a pretty static screenshot they arguably look better, but RT in motion on current-gen hardware is kind of ugly. RT reflections are an improvement over screen-space reflections IMO, but shadows and lighting done with RT instead of shaders are truly awful: there simply aren't enough samples per frame to give any kind of reasonable effect, so shadows crawl like they have lice, and lighting that is supposed to be smooth and static is lumpy and wriggles around. Any time you pan your view, the newly revealed part of the scene can take a good 20-30 frames to look even remotely correct as the denoiser and temporal blender get their shit together to make it even vaguely the right colour and brightness.

If you don't believe me, find your favourite RT title, take some screenshots of areas rich in RT shadows or occlusion as you pan the camera around, and then try to match one of those in-motion screenshots when you're not moving. The difference is stark, and the image quality of the actual in-motion RT you experience in real gameplay is absolute dogshit.
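To put a rough number on that 20-30 frame settling time, here's a minimal sketch (illustrative only, not any engine's actual denoiser) of the exponential temporal accumulation that RT pipelines typically lean on; with a history weight around 0.9, a freshly revealed pixel needs dozens of frames before its stale or empty history stops dominating:

```python
# Minimal sketch of temporal accumulation used to stabilise noisy ~1-sample-per-pixel
# ray-traced lighting. Not any engine's actual code; all numbers are illustrative.
import random

ALPHA = 0.9          # weight of the accumulated history (typical values ~0.8-0.95)
TRUE_VALUE = 1.0     # the lighting value the pixel should converge to
NOISE = 0.5          # per-frame noise from having very few ray samples per pixel

history = 0.0        # freshly disoccluded pixel: no valid history yet
for frame in range(1, 31):
    sample = TRUE_VALUE + random.uniform(-NOISE, NOISE)   # this frame's noisy RT sample
    history = ALPHA * history + (1.0 - ALPHA) * sample    # exponential moving average
    print(f"frame {frame:2d}: accumulated = {history:.3f}")

# With ALPHA = 0.9 the initial (wrong) history still contributes ~4% after 30 frames
# (0.9**30 ≈ 0.042), which lines up with the 20-30 frame settling time described above.
```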
 
Joined
Apr 14, 2022
Messages
745 (0.78/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Acer Nitro XV271UM3B IPS 180Hz
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Keyboard Asus ROG Falchion
Software Windows 11 64bit
If you have a mid-range GPU, just try RT reflections or shadows.
It's pointless to try any GI or indirect lighting on a mid-range card unless you know what you're doing with the other settings and you know what to expect.
PC gamers should already know that; that's the reason they have PCs instead of stupid consoles:
the ability to recognize the bottleneck, the heavy parts of a game, etc.

RT is always better when you know what and when to enable it.
If you expect miracles, just get a PS5 and call it a day.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,171 (1.27/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
keeping it classy.
It certainly wouldn't be the same without you.
rather focus on FSR.
The reality is, if they want market share from Nvidia they need to double down on both RT and FSR, and at least the signs are positive that it will be the case for both too, so here's hoping.
 
Joined
Jan 8, 2017
Messages
9,436 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
but shadows and lighting done with RT instead of shaders is truly awful
It's not that they look awful, but they radically change the look and feel of the game in a negative way, for something that you're meant to play in real time. Lighting in a game is not supposed to be ultra-realistic, in the same way lighting on a movie set isn't either; it's artificial and carefully controlled, because otherwise movies would look terrible. If you then try to manipulate light sources to work around this in a game, you've made the use of RT redundant, as it's no longer realistic.
 
Joined
Jan 14, 2019
Messages
12,337 (5.76/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Ahh yeah fair enough, no big regrets then like the other thread?
Regrets? No, not at all. :) Although part of me misses the 7800 XT, I would regret not being able to spend to my heart's content when out and about a lot more.

Interesting, I didn't do either of those, it wasn't extortionate, and no RT games have been a slide show. I guess it's all relative.

Exactly this, my 3080 has allowed me to enjoy many games with RT effects on, and even taste playable path tracing. As you say, a 7900 XTX can do this too, but that's yesteryear's performance, so it's not all that impressive.
Because you said numerous times that you play with DLSS. Personally, I don't call anything with DLSS/FSR on "decent performance". If this is my only choice, then I'd rather play at native resolution with RT off. I prefer sharpness to pretty lights.
 
Joined
Dec 24, 2022
Messages
78 (0.11/day)
In 2023, Nvidia had a $7.4 billion R&D budget, almost the entirety of which was spent on GPU development.

In 2023, AMD had a $5.8 billion R&D budget that was primarily spent on x86 development, as that is by far their largest revenue stream.

Now, I want everybody in these comments who just assumes that AMD should be able to match Nvidia to explain to me how that is to be accomplished. Because the way 99% of you speak about it, you act like these two companies are on a level playing field, that they have access to the same sort of resources, and that for AMD it's just a problem of "not pricing videocards cheap enough", while completely ignoring the fact that the current capitalist paradigm is stock price above all and quarterly earnings above all....

Tell me how AMD is supposed to go out there, undercut Nvidia at each tier by $150+, and still keep stock prices up and investors happy while quarterly profits decrease, AND all the while LITERALLY paying the same or even higher costs than Nvidia for the materials used to make the card (Nvidia probably gets components cheaper due to larger volume).... PLEASE explain that to me. If I remember correctly, Intel sold Alder Lake at a diminished profit margin; how has that worked out for them? Oh that's right, AMD surpassed them in value.

The other lot of you act like it's merely a willpower problem, that AMD just doesn't "want it bad enough". Well, please explain to me why AMD should be focusing on videocards when they make the overwhelming majority of their money from x86? Why should they dump money into videocards when you consumers have proven in the past, numerous times, that even when they make a product that is OBJECTIVELY a better value, 90% of you STILL buy the Nvidia card (that's right, you're not as rational as you think you are, and research into consumer psychology has proven this time and time again)? If I was a business, that wouldn't sound like a good investment to me...

We literally live in a world where money and profit dictate reality, yet in over a decade of observing these "discussions" I honestly cannot think of a single instance where anyone even addressed the fact that Nvidia just plain has more resources to compete with, which is arguably the MOST determinant factor in this competition.

The other part of it that seemingly everybody ignores is the fact that the overwhelming majority, 99% of all consumers, including ALL OF YOU, make purchasing decisions based on IRRATIONAL factors like how the product makes you "feel", and we KNOW that's true for videocards because even when AMD offers a compelling videocard that on paper is an OBJECTIVELY better value, the Nvidia competitor still outsells it 10 to 1.

I'm sure so much of this is motivated by FOMO, as well as the fact that some of you probably just don't like the idea of coming online to forums like this and saying you have an AMD GPU, so you buy the Nvidia one because you want to be associated with the "winning side"... and don't laugh, because there are literally decades of consumer psychology research proving the existence and primacy of these phenomena. How are you going to get irrational consumers to switch to a competitor based on something rational, like a product being a "better value"?
Thank you, it's well said. I'm amazed how AMD, being the smaller company, is able to compete with Intel and Nvidia. The hard truth is, given the chance, AMD could engage in the same anticompetitive practices as the other two. Like you said, it's all about making the investors happy.
 
Joined
Jan 14, 2019
Messages
12,337 (5.76/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
If AMD spends a fraction of Nvidia's R&D, shouldn't their cards also cost a fraction of the price?
Because manufacturing and shipping cost nothing? Have you seen what TSMC charges for wafers recently?
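For a rough sense of why the silicon alone isn't cheap, here's a back-of-the-envelope sketch; every number in it is an assumption for the sake of the arithmetic, not actual TSMC pricing, real yields, or any specific GPU's die size:

```python
# Rough, illustrative die-cost estimate. All figures are assumptions made up for this
# example, not actual TSMC pricing or any particular GPU's die size or yield.
import math

wafer_cost = 17_000    # USD per 300 mm wafer (assumed, order-of-magnitude figure)
wafer_diam = 300.0     # mm
die_area   = 300.0     # mm^2 (assumed mid/high-end GPU die)
yield_rate = 0.70      # fraction of dies that are usable (assumed)

# Standard gross-dies-per-wafer approximation (accounts for edge loss)
gross_dies = math.pi * (wafer_diam / 2) ** 2 / die_area \
           - math.pi * wafer_diam / math.sqrt(2 * die_area)
good_dies = gross_dies * yield_rate
print(f"~{gross_dies:.0f} gross dies, ~{good_dies:.0f} good dies, "
      f"~${wafer_cost / good_dies:.0f} silicon cost per good die")
# And that is silicon only: memory, PCB, cooler, assembly, shipping, and the margins of
# AIB partners and retailers all stack on top, regardless of how small the R&D share is.
```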
 
Joined
Apr 15, 2020
Messages
409 (0.24/day)
System Name Old friend
Processor 3550 Ivy Bridge x 39.0 Multiplier
Memory 2x8GB 2400 RipjawsX
Video Card(s) 1070 Gaming X
Storage BX100 500GB
Display(s) 27" QHD VA Curved @120Hz
Power Supply Platinum 650W
Mouse Light² 200
Keyboard G610 Red
The reality is, if they want market share from Nvidia they need to double down on both RT and FSR, and at least the signs are positive that it will be the case for both too, so here's hoping.
That's the way.
 
Joined
Nov 27, 2023
Messages
2,321 (6.38/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (23H2)
@Chrispy_ @Vya Domus
This reminds me of a news article from a couple of months back, which I think I actually discussed with you, @Chrispy_, when Diablo IV added RT support and especially touted awesome new RT shadows with included screenshots… and in every instance RT looked worse. Sure, it was "realistic", but it was obvious that the original crisp, high-contrast shadows were deliberate on the part of the game artists, to create a certain mood and make the scene readable at a glance from the pseudo-iso perspective. The RT shadows just looked muddy and undefined instead and, funnily enough, made the whole image look lower-res overall.

I don't have anything against RT, I just feel that most current implementations are analogous to tasteless over-saturated ENB packs and 8K textures for old games: they miss the point and mostly make things look worse, but gamers with no taste lap them up because they're ostensibly new and high-tech, so they must be better.
 
Joined
Jan 14, 2019
Messages
12,337 (5.76/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
@Chrispy_ @Vya Domus
This reminds me of a news article from a couple of months back, which I think I actually discussed with you, @Chrispy_, when Diablo IV added RT support and especially touted awesome new RT shadows with included screenshots… and in every instance RT looked worse. Sure, it was "realistic", but it was obvious that the original crisp, high-contrast shadows were deliberate on the part of the game artists, to create a certain mood and make the scene readable at a glance from the pseudo-iso perspective. The RT shadows just looked muddy and undefined instead and, funnily enough, made the whole image look lower-res overall.

I don't have anything against RT, I just feel that most current implementations are analogous to tasteless over-saturated ENB packs and 8K textures for old games: they miss the point and mostly make things look worse, but gamers with no taste lap them up because they're ostensibly new and high-tech, so they must be better.
Interesting thought. Sure, RT makes a scene look more realistic, but I wonder how much of a factor realism is when making a game like Minecraft, for example. Probably not much. And if that's the case, then how does tilting the game's overall appearance away from the artist's original intention, towards a kind of forced realism, add to the game's value? I'll let everyone answer that for themselves.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,171 (1.27/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Personally
Like I said, it's all relative. And given the resolution and hardware you've had, I can absolutely see why you make the choices you do; those parts lacked RT performance, and FSR needs a lot of work. Personally, I don't get too bogged down in how the sausage is made, I just enjoy the meal when it tastes great. Even you've said 4K makes the most sense for DLSS, and indeed it does. Rendering at native res isn't really a consideration or goal for me (if going for sharpness I aim for supersampling); I prefer next-generation visuals to yesteryear's graphics slightly sharper. I really hope AMD can deliver that and tempt me across. I really want them to succeed.
 