
AMD RDNA4 Architecture to Build on Features Relevant to Gaming Performance, Doesn't Want to be Baited into an AI Feature Competition with NVIDIA

Joined
May 31, 2016
Messages
4,412 (1.47/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
I'm trying to wrap my head around making the GPU more independent of the CPU in graphics rendering. Why would that be an issue here? There are tasks the CPU has to perform alongside the GPU while rendering. Is this to mitigate the bottlenecks a CPU can cause when paired with a powerful GPU? There are so many variables here, and yet AMD is addressing this particular issue (if you can even call it that).
 
Joined
Feb 1, 2019
Messages
3,025 (1.50/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
This is why AMD sits at an all-time low of 10% dGPU market share; they fail to read the room!
They power more gaming devices than Nvidia; the dGPU market is smaller than the console market.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,153 (2.87/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
They power more gaming devices than Nvidia
I'm not a fan of nVidia, but the Nintendo Switch accounts for a very large chunk of gaming consoles in the wild, and that's powered by an nVidia chip. However, when it comes to non-portable gaming consoles, AMD practically has a monopoly on custom-designed SoCs for these devices. dGPU market share doesn't properly describe the diversity of AMD's portfolio, to be completely honest.
 
Joined
Dec 10, 2022
Messages
481 (0.78/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
Here's an idea for AMD...

Instead of talking about RDNA4, how about actually releasing a new RDNA3 card? The RX 7900 cards came out over two months ago, and we haven't gotten much more than a sniff of what's to come. Thus far, nVidia has released the RTX 4090, 4080 and 4070 Ti, is primed to release three versions of the 4070, and has been talking about the 4060 for a while now. We've heard a few things about the RX 7800 XT, but that was a long time ago. We haven't heard anything about when the 7800 cards will be available, let alone the 7700 and 7600 cards!

Maybe AMD should fix the RDNA3 situation before talking about what comes next. I'm sure that people care more about what's happening now than about what will happen a few years from now.
 
Joined
Oct 4, 2017
Messages
703 (0.28/day)
Location
France
Processor RYZEN 7 5800X3D
Motherboard Aorus B-550I Pro AX
Cooling HEATKILLER IV PRO , EKWB Vector FTW3 3080/3090 , Barrow res + Xylem DDC 4.2, SE 240 + Dabel 20b 240
Memory Viper Steel 4000 PVS416G400C6K
Video Card(s) EVGA 3080Ti FTW3
Storage XPG SX8200 Pro 512 GB NVMe + Samsung 980 1TB
Display(s) Dell S2721DGF
Case NR 200
Power Supply CORSAIR SF750
Mouse Logitech G PRO
Keyboard Meletrix Zoom 75 GT Silver
Software Windows 11 22H2
They power more gaming devices than Nvidia; the dGPU market is smaller than the console market.

Completely irrelevant, but since you want to go there, please compare dGPU margins to iGPU margins...
 
Joined
Sep 17, 2014
Messages
21,717 (6.00/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
I'm not a fan of nVidia, but the Nintendo Switch accounts for a very large chunk of gaming consoles in the wild, and that's powered by an nVidia chip. However, when it comes to non-portable gaming consoles, AMD practically has a monopoly on custom-designed SoCs for these devices. dGPU market share doesn't properly describe the diversity of AMD's portfolio, to be completely honest.
This is a good point, but it's an ARM system that also shares few, if any, games with other platforms or the PC. You could say Nintendo carved out its own market.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
This is a good point, but it's an ARM system that also shares few, if any, games with other platforms or the PC. You could say Nintendo carved out its own market.
Definitely, but it's a huge slice of the market. It doesn't really matter that it's ARM or that almost everything on it is exclusive to Nintendo. The reality is that there are a lot of them out there in the wild, so while it might be a carved out market of their own, it's a pretty big slice of the pie.
 
Joined
Feb 1, 2019
Completely irrelevant but since you want to go there , please compare dGPU margins to iGPU margins ....
It's relevant in terms of influence on game design and AI features, which is the subject of this thread.

Developers don't care about margins; that's not relevant unless...

Is this an Nvidia vs AMD troll discussion, or a discussion on whether AMD is right not to invest in AI hardware?
 
Joined
Oct 15, 2010
Messages
951 (0.19/day)
System Name Little Boy / New Guy
Processor AMD Ryzen 9 5900X / Intel Core I5 10400F
Motherboard Asrock X470 Taichi Ultimate / Asus H410M Prime
Cooling ARCTIC Liquid Freezer II 280 A-RGB / ARCTIC Freezer 34 eSports DUO
Memory TeamGroup Zeus 2x16GB 3200Mhz CL16 / Teamgroup 1x16GB 3000Mhz CL18
Video Card(s) Asrock Phantom RX 6800 XT 16GB / Asus RTX 3060 Ti 8GB DUAL Mini V2
Storage Patriot Viper VPN100 Nvme 1TB / OCZ Vertex 4 256GB Sata / Ultrastar 2TB / IronWolf 4TB / WD Red 8TB
Display(s) Compumax MF32C 144Hz QHD / ViewSonic OMNI 27 144Hz QHD
Case Phanteks Eclipse P400A / Montech X3 Mesh
Power Supply Aresgame 850W 80+ Gold / Aerocool 850W Plus bronze
Mouse Gigabyte Force M7 Thor
Keyboard Gigabyte Aivia K8100
Software Windows 10 Pro 64 Bits
We've heard a few things about the RX 7800 XT
I'm pretty sure that one is going to be a big disappointment. The 7900 XT is $900. The performance gap between it and the 6800 XT is small (33%), but the price gap is huge (56%). My guess is it will be barely faster than the 6800 XT, by about 16%, splitting the difference between the 6800 XT and the 7900 XT. But at what price: $800, $750, $700? If it's $750, then it's 15% more expensive for 16% more performance, which means price/performance stagnation.
That $750 price would put it on par with the 6950 XT in performance. The headline will be "last-gen $1,100 performance for only $750! What a steal!"
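The stagnation claim above is easy to sanity-check. Here's a minimal Python sketch using only the percentages from the post; the $750 RX 7800 XT is the post's guess, not a confirmed price:

```python
# Relative perf-per-dollar change for a successor card vs. its predecessor,
# using the post's own figures (not official MSRPs).
def perf_per_dollar_change(price_increase, perf_increase):
    return (1 + perf_increase) / (1 + price_increase) - 1

# Hypothetical $750 RX 7800 XT: +16% performance for +15% price
print(f"{perf_per_dollar_change(0.15, 0.16):+.1%}")  # +0.9%: stagnation
# RX 7900 XT vs RX 6800 XT: +33% performance for +56% price
print(f"{perf_per_dollar_change(0.56, 0.33):+.1%}")  # -14.7%: a regression
```

A sub-1% perf/$ gain after a two-year generation gap is exactly what "price/performance stagnation" means.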
 

Ahhzz

Moderator
Staff member
Joined
Feb 27, 2008
Messages
8,836 (1.47/day)
System Name OrangeHaze / Silence
Processor i7-13700KF / i5-10400 /
Motherboard ROG STRIX Z690-E / MSI Z490 A-Pro Motherboard
Cooling Corsair H75 / TT ToughAir 510
Memory 64Gb GSkill Trident Z5 / 32GB Team Dark Za 3600
Video Card(s) Palit GeForce RTX 2070 / Sapphire R9 290 Vapor-X 4Gb
Storage Hynix Plat P41 2Tb\Samsung MZVL21 1Tb / Samsung 980 Pro 1Tb
Display(s) 22" Dell Wide/24" Asus
Case Lian Li PC-101 ATX custom mod / Antec Lanboy Air Black & Blue
Audio Device(s) SB Audigy 7.1
Power Supply Corsair Enthusiast TX750
Mouse Logitech G502 Lightspeed Wireless / Logitech G502 Proteus Spectrum
Keyboard K68 RGB — CHERRY® MX Red
Software Win10 Pro \ RIP:Win 7 Ult 64 bit
Stick to the topic. Trolling is not allowed. ;)
 

mathohardo

New Member
Joined
Feb 23, 2023
Messages
5 (0.01/day)
Well, as someone with a high-end PC, I bought a PS5 to play the new God of War on my 75-inch TV, and overall the game is super, super pretty (using the 60 fps mode). That's a 10.3-teraflop GPU with 16 GB of total shared system and GPU memory. The 7900 XTX is 61 teraflops, and the new RDNA4 will be on a better process node than that, so even mid-range cards will obviously run 2K native like melted butter. I still hope AMD is working on an AI mode, but the jump from FSR 1 to FSR 2.1 is kind of amazing. We're closing in on the point where, if you don't use ray tracing or 4K, your GPU is massively overkill. 60 fps was a hard target for over a decade, and now games (especially at 1080p and in e-sports) can break 600 fps. If 2K were by law the maximum resolution and ray tracing were banned, the next-gen cards would exceed the human eye's ability to care, at 300 fps for intensive games and 400+ for less taxing titles. So it kind of doesn't matter, other than whether companies are overcharging, which Nvidia absolutely has; and they only own up to bad decisions after an outcry, like the last naming scandal.

Keep in mind Nvidia released a 1030 with DDR4 years after the GDDR5 version. The GDDR5 1030 was already bad, and the DDR4 one took a massive performance hit, running 30-40% worse under the same name, which deceived the average tech user; most people couldn't tell you the difference between GDDR5 and DDR4 if asked that as a straight question.
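For reference, the teraflop figures quoted here fall out of a simple formula: FP32 TFLOPS = 2 FLOPs per FMA × shader count × clock (GHz). A quick sketch using approximate public specs; note that RDNA 3's headline number counts dual-issue FP32, so TFLOPS across architectures are not directly comparable:

```python
def fp32_tflops(shaders, boost_ghz, issue_width=1):
    # 2 FLOPs per fused multiply-add per lane; RDNA 3 can dual-issue FP32
    return 2 * shaders * boost_ghz * issue_width / 1000

print(f"PS5 GPU:     {fp32_tflops(2304, 2.23):.1f} TFLOPS")                # ~10.3
print(f"RX 7900 XTX: {fp32_tflops(6144, 2.5, issue_width=2):.1f} TFLOPS")  # ~61.4
```

The 6x gap looks dramatic on paper, but since half of the XTX's figure depends on the compiler actually finding dual-issue opportunities, real game performance is nowhere near 6x a PS5.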
 
Joined
Dec 10, 2022
I'm pretty sure that one is going to be a big disappointment. The 7900 XT is $900. The performance gap between it and the 6800 XT is small (33%), but the price gap is huge (56%).
Well, that's similar to the relationship between the RX 6800 XT and RX 6900 XT. The RX 6900 XT was an extra $350 for a "whopping" 9% increase in performance. People only bought the RX 6900 XT because it was often all that they could find. AMD's going to learn this the hard way.
My guess is it will be barely faster than the 6800 XT, by about 16%, splitting the difference between the 6800 XT and the 7900 XT. But at what price: $800, $750, $700? If it's $750, then it's 15% more expensive for 16% more performance, which means price/performance stagnation.
That $750 price would put it on par with the 6950 XT in performance. The headline will be "last-gen $1,100 performance for only $750! What a steal!"
I'm actually expecting the RX 7800 XT to be about $550-$600. I think that this would be fair because it would be more performance than the RX 6800 XT for a lower MSRP. If AMD does that, it's game over for nVidia this generation as people will go nuts for it. What AMD needs to focus on is availability because no matter how good of a deal something is, if you don't have it, you can't sell it and it's worthless to you.
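As a rough sketch of this pricing argument (assuming the post's +16% performance estimate and the RX 6800 XT's $649 launch MSRP), the break-even price for perf/$ parity is easy to compute:

```python
old_price, perf_gain = 649, 0.16   # RX 6800 XT launch MSRP; estimated uplift
break_even = old_price * (1 + perf_gain)
print(f"perf/$ parity at ${break_even:.0f}")   # ~$753; anything below is a real improvement
for price in (550, 600):
    print(f"${price}: {break_even / price - 1:+.0%} perf/$ vs RX 6800 XT")
```

So the $550-$600 range floated above would be roughly a 25-37% perf/$ improvement over the RX 6800 XT, which is what would make it compelling.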
 
Joined
Jun 16, 2021
Messages
49 (0.04/day)
System Name 2rd-hand Hand-me-down V2.0
Processor Ryzen R5-5500
Motherboard ASRock X370
Cooling Wraith Spire
Memory 2 x 8Gb G.Skill @ 2933Mhz
Video Card(s) Sapphire RX 5600 XT
Storage 500 Gb Crucial MX500, 2Tb WD SA510
Display(s) Acer 24.0" CB2 1080p
Case (early) DeepCool
Audio Device(s) Ubiquitous Realtek
Power Supply 650W FSP
Mouse Logitech
Keyboard Logitech
VR HMD What?
Software Yes
Benchmark Scores [REDACTED]
Regardless of AI involvement or not, I don't like the concept of upscaling or frame-generation in games. If a game can't be run at a favorable frame-rate, especially with higher tier hardware, without resorting to these magic tricks, then it seems to me the game needs to be better tuned. Or, as many say, "optimized."

Major game projects are generally in development for several years, or so we've been told. If that is accurate, then what sort of hardware is in use at the developer's studio during that long development time? Is everyone using the x86 Cray in the basement? Tell me, how do you construct a game that will be dependent on hardware and drivers that don't yet exist that the game will require to run well upon release? (Yes, that is a rhetorical question.) But it seems that frame-generation and upscaling are being employed as the, "Get Out Of Jail Free" card to smooth over game development shortcomings.

As for RT, it won't be a thing for me personally until a $300 card made by IDon'tCare can run a heavily ray-traced AAA game at Very High IQ settings at 1080p and never drop below 60 fps, without resorting to any sort of upscaling or frame-generation sleight-of-hand. Until then, I can't be bothered. I refuse to be Captain Ahab chasing the globally illuminated White Whale, Moby Dick, all over an Atlantic Ocean that is thousands of dollars deep. For what? To play Assassin's Creed AI 2077? Forget that.

I don't know precisely what AMD's plans are, but I hope they see the obvious place where they could move some serious product; namely, the lower mid-tier range where most PC gamers live. Something akin to the RX 6750 XT class, decently built, maybe with the new encoder and HDMI standard, and with enough AIB incentives to ensure a $329-$349 USD retail price. We don't need no stinkin' AI.
 
Joined
Sep 17, 2014
Regardless of AI involvement or not, I don't like the concept of upscaling or frame-generation in games. If a game can't be run at a favorable frame-rate, especially with higher tier hardware, without resorting to these magic tricks, then it seems to me the game needs to be better tuned. Or, as many say, "optimized."

Major game projects are generally in development for several years, or so we've been told. If that is accurate, then what sort of hardware is in use at the developer's studio during that long development time? Is everyone using the x86 Cray in the basement? Tell me, how do you construct a game that will be dependent on hardware and drivers that don't yet exist that the game will require to run well upon release? (Yes, that is a rhetorical question.) But it seems that frame-generation and upscaling are being employed as the, "Get Out Of Jail Free" card to smooth over game development shortcomings.

As for RT, it won't be a thing for me personally until a $300 card made by IDon'tCare can run a heavily ray-traced AAA game at Very High IQ settings at 1080p and never drop below 60 fps, without resorting to any sort of upscaling or frame-generation sleight-of-hand. Until then, I can't be bothered. I refuse to be Captain Ahab chasing the globally illuminated White Whale, Moby Dick, all over an Atlantic Ocean that is thousands of dollars deep. For what? To play Assassin's Creed AI 2077? Forget that.

I don't know precisely what AMD's plans are, but I hope they see the obvious place where they could move some serious product; namely, the lower mid-tier range where most PC gamers live. Something akin to the RX 6750 XT class, decently built, maybe with the new encoder and HDMI standard, and with enough AIB incentives to ensure a $329-$349 USD retail price. We don't need no stinkin' AI.
Someone gets it.

Except games are not developed to run on tomorrow's hardware; more often than not, games are in fact tuned to the hardware of yesteryear, because that's where the market is. Like you say yourself, the midrange, ~300 EUR card is the moment a performance level is truly 'enabled' in gaming. But the idea of tuning to old hardware conflicts with the approach of using shiny graphics to impress and attract an audience for a game. Upscaling technologies kind of solve that conflict.

RT is a funny one though, much like numerous heavy post-processing effects, and that's where the FSR/DLSS approaches truly shine. Anything with volumetric or dynamic lighting, whether raster- or RT-based, simply incurs a heavy performance hit because it's dynamic, in real time. Anything calculated in real time takes render time, continuously, and costs FPS, no matter how fast the GPU is. Upscaling technologies claw back some of that lost time and, in a way, enable more effects rather than applying a lower-quality pass to make the game playable. Now you can run through Cyberpunk's areas without losing half your FPS to a volumetric patch of fog; the dip is less pronounced, even though the rest of the game might run at a solid 60 FPS.

As for AI... I think it's a dangerous development in many ways. It doesn't inspire creativity, but rather tunnel vision, much like typical algorithms do: in the end, whether for programmed or trained 'entities', human input is the basis for their decision-making ability, and as such they are limited by it. We've already seen some shining examples from ChatGPT. For gaming, I see a purpose in game development, but not in the consumer space, and there is the risk of an overall decline in quality titles as the barrier to entry gets lower.
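The "render time claws back FPS" argument above can be put in numbers. A back-of-the-envelope frame-time model (all figures invented for illustration), assuming shading cost scales roughly with pixel count while some per-frame cost is resolution-independent:

```python
def frame_ms(shade_ms_native, fixed_ms, scale=1.0):
    # scale = internal resolution as a fraction of output resolution, per axis
    return shade_ms_native * scale**2 + fixed_ms

native   = frame_ms(14.0, 4.0)              # 18.0 ms -> ~56 fps
upscaled = frame_ms(14.0, 4.0, scale=0.67)  # ~10.3 ms -> ~97 fps, before upscale overhead
print(f"native: {1000/native:.0f} fps, 67% internal: {1000/upscaled:.0f} fps")
```

The ~8 ms reclaimed per frame is exactly the budget a heavy volumetric or RT effect would otherwise eat, which is the "enables more effects" point above.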
 

duladrop

New Member
Joined
Mar 7, 2023
Messages
9 (0.02/day)
My concern about this is whether game devs will adopt the platform. Just picture this: AMD's AI accelerators can work on a number of features, like RT, NPCs, bots, the game environment and image processing. Here's the big "if": if a game adopts it, it's more fun for gamers and can be more immersive, right? But what happens to NVIDIA, with their Tensor cores only used for image processing and RT? They are not optimized or designed for bots and NPCs.

If you say, yeah, they have RT, well, AMD's RDNA 3 can do RT as well; their AI accelerators can do that.

But what happens to other things, like NPC and bot optimization? So we are looking at which platform is more appealing to game devs for making games more fun and immersive.
 
Joined
Jan 31, 2022
Messages
59 (0.06/day)
If done wrong, it will go the same way as PhysX: a side note that nobody cares about because not everyone can use it. We could have gotten super-realistic physics simulation (liquids, fabric, dust, etc.) a long time ago, but its GPU acceleration is Nvidia-only, so nobody used it for anything meaningful.

Do the same with any of the AI processing, make it vendor-specific, and you'll lose out. To sell your game well, you'll need to make sure as many people as possible can run it. And since games are mostly developed once and ported over, that applies not just on PC but also on consoles.
So we're back to RDNA2 compatibility for the time being.
Any developer even remotely interested in using hardware to improve live AI in games would be smart to do it in a way that runs on all of them without hitting performance too much.

Real-time ray tracing has to be looked at the same way. If it doesn't run sufficiently well on the midrange cards of both (or now all three) vendors, it won't be more than an interesting gimmick.
In the current market, that would be an RTX 3060, RX 6650 XT and A750.
Make it run well there at high settings without upsampling, and we're back where things used to be for many years.
The sub-$350 cards always used to be able to run the then-current games at then-common resolutions at high settings. And without upsampling, because there was no upsampling.

In short, no developer would bring out a game in 2023 that uses all the AI and RT stuff only to set the hardware requirements to "RDNA3/Ada Lovelace or higher".
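The vendor-neutral approach argued for here boils down to gating features on capabilities rather than vendor names. A hypothetical sketch (all type and function names are invented for illustration; a real engine would query D3D12/Vulkan feature caps instead):

```python
from dataclasses import dataclass

# Hypothetical capability flags; never branch on vendor or product names.
@dataclass
class GpuCaps:
    ray_tracing: bool
    matrix_units: bool   # tensor/AI accelerators from any vendor
    fp16_rate: float     # FP16 throughput relative to FP32

def pick_upscaler(caps: GpuCaps) -> str:
    if caps.matrix_units:
        return "ml_upscaler"       # DLSS/XeSS-class path
    if caps.fp16_rate >= 2.0:
        return "compute_upscaler"  # FSR2-class path
    return "spatial_upscaler"      # fallback that runs anywhere

print(pick_upscaler(GpuCaps(ray_tracing=True, matrix_units=False, fp16_rate=2.0)))
# -> compute_upscaler
```

Because every branch has a fallback, the same build runs on an RTX 3060, an RX 6650 XT and an A750, which is the whole point of the argument above.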
 
Joined
Dec 10, 2022
If done wrong, it will go the same way as PhysX: a side note that nobody cares about because not everyone can use it. We could have gotten super-realistic physics simulation (liquids, fabric, dust, etc.) a long time ago, but its GPU acceleration is Nvidia-only, so nobody used it for anything meaningful.
Yep. That's why Havok ended up winning that war. I think the same way when I think of DLSS and FSR. Hell, nVidia blocks its own cards from using those features and now they're saying that DLSS3 only works on the RTX 40-series. All I can do is shake my head at all the people who keep buying nVidia and getting screwed like this. When FSR works on ALL cards, even old nVidia cards that nVidia won't let use DLSS, you know that something's rotten in Santa Clara.
Do the same with any of the AI processing, make it vendor specific, and you'll lose out. To sell your game well, you'll need to make sure as many people as possible can run it. And since games are mostly developed once and ported over that applies not just on PC but also on consoles.
So we're back to RDNA2 compatibility for the given time.
Any developer even remotely interested in using hardware to improve live AI in games would be smart to do it in a way that runs on all of them without hitting performance too much.

Real-time ray tracing has to be looked at the same way. If it doesn't run sufficiently well on the midrange cards of both (or now all three) vendors, it won't be more than an interesting gimmick.
In the current market, that would be an RTX 3060, RX 6650 XT and A750.
Yeah, none of those cards can run HW-accelerated RT properly. It honestly makes me wonder why people are paying extra for mid-to-low nVidia cards when they can't run RT at that level anyway.
Make it run well there at high settings without upsampling and we're back where it used to be for many years.
Ohhh, they don't want that! Jensen wants RT to seem like a "pie-in-the-sky-holy-grail-must-pay-hundreds-for-it" kind of magic.
The sub 350 cards always used to be able to run the then current games at then common resolutions at high settings. And without upsampling, because there was no upsampling.
Which is how it should be.
In short, no developer would bring out a game in 2023 that uses all the AI and RT stuff only to set the hardware requirements to "RDNA3/Ada Lovelace or higher".
I agree with you... well, unless Jensen finances them enough, like he did with Control, Minecraft RTX and Portal with RTX. :laugh:
 
Joined
Jul 9, 2015
Messages
3,413 (1.03/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Completely irrelevant but since you want to go there , please compare dGPU margins to iGPU margins ....
Welp, and?

Last quarter:

greedia GPU business: $1.57 billion
amdia GPU + console business: $1.6 billion

How were the margins, cough?
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,043 (1.27/day)
System Name MightyX
Processor Ryzen 5800X3D
Motherboard Gigabyte X570 I Aorus Pro WiFi
Cooling Scythe Fuma 2
Memory 32GB DDR4 3600 CL16
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
none of those cards can run HW-accelerated RT properly. It honestly makes me wonder why people are paying extra for mid-to-low nVidia cards when they can't run RT at that level anyway.
I've been playing around with my new A2000 6GB (~3050 performance) in my 2nd 5600G rig, and I have played several games at high settings (or higher) with RT+DLSS on, and what I call a good-to-great graphics experience at 1080p: Metro Exodus 60+ fps, Doom Eternal ~120 fps, Control 60+ fps, CP2077 ~50 fps, Spider-Man+MM 75+ fps. Naturally the rebuttal here is that DLSS is fake resolution, so it's not even real 1080p, but with the 2.5.1 DLL dropped in, all of these games look and play fantastic by my measure, with IQ highly comparable to native, along with state-of-the-art, sometimes transformative, visuals. Now the rebuttal may be that your 'measure' sets the bar higher than that (fake resolution, gimmicks and so on); I can't argue against what you deem proper and acceptable, but for my taste, in my first-hand experience, even the lowest-end RTX cards can absolutely play games with RT on while looking and playing great.

FWIW I paid 'extra' for the performance per form factor on this one, didn't want an ageing 1650 or anaemic RX6400.
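For context on what "RT+DLSS at 1080p" means in terms of rendered pixels: DLSS 2's per-axis scale factors (Quality 2/3, Balanced 0.58, Performance 0.5, per Nvidia's published figures) give these internal resolutions at a 1080p output:

```python
# Internal render resolution behind each DLSS 2 mode at a 1080p output.
modes = {"Quality": 2/3, "Balanced": 0.58, "Performance": 0.5}
out_w, out_h = 1920, 1080
for mode, s in modes.items():
    print(f"{mode}: {round(out_w * s)}x{round(out_h * s)} internal")
```

At Quality mode the GPU shades roughly 44% of the output pixels, which is where the RT headroom on a 3050-class card comes from.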
All I can do is shake my head at all the people who keep buying nVidia and getting screwed like this.
Curious, from your unique perspective, is there a use case/buying case in your mind where it is acceptable to buy Nvidia without the head shake? Or does every buyer fit in the same bucket. I'm not after an argument, I'm genuinely curious what you think.
 
Joined
Dec 10, 2022
I've been playing around with my new A2000 6GB (~3050 performance) in my 2nd 5600G rig, and I have played several games with high settings (or higher), RT+DLSS on
I said "not potent enough to properly use RT" and properly using RT means not using DLSS. Jeez, I bet that Jensen just loves you because he's successfully lowered your expectations by making you think that using DLSS is the same as playing at native resolution (except that it's not).

"Sure, let's make it prettier with RT but uglier with DLSS or FSR!" <- Somehow that makes no sense.
 

wolf

Better Than Native
Joined
May 7, 2007
I said "not potent enough to properly use RT" and properly using RT means not using DLSS
Sorry, I didn't realise you get to define what "properly" means.

I alluded to this exactly in my post, lol, except you cut all of that out, including my rationale, given that I don't draw an arbitrary line where I deem the tech unacceptable despite the results and just conclude it's ugly. Well done. Second question ignored too, nice.
 

Ahhzz

Moderator
Staff member
Joined
Feb 27, 2008
I said "not potent enough to properly use RT" and properly using RT means not using DLSS. Jeez, I bet that Jensen just loves you because he's successfully lowered your expectations by making you think that using DLSS is the same as playing at native resolution (except that it's not).

"Sure, let's make it prettier with RT but uglier with DLSS or FSR!" <- Somehow that makes no sense.

Sorry, I didn't realise you get to define what "properly" means.

I alluded to this exactly in my post, lol, except you cut all of that out, including my rationale, given that I don't draw an arbitrary line where I deem the tech unacceptable despite the results and just conclude it's ugly. Well done. Second question ignored too, nice.
I'm going to stop you both right here, for your own sakes. Find the topic, stick to it, and act like adults.
 