
AMD Radeon RDNA3 Graphics Launch Event Live-blog: RX 7000 Series, Next-Generation Performance

Joined
Feb 22, 2022
Messages
109 (0.11/day)
System Name Lexx
Processor Threadripper 2950X
Motherboard Asus ROG Zenith Extreme
Cooling Custom Water
Memory 32/64GB Corsair 3200MHz
Video Card(s) Liquid Devil 6900XT
Storage 4TB Solid State PCI/NVME/M.2
Display(s) LG 34" Curved Ultrawide 160Hz
Case Thermaltake View T71
Audio Device(s) Onboard
Power Supply Corsair 1000W
Mouse Logitech G502
Keyboard Asus
VR HMD NA
Software Windows 10 Pro
But I believe most consumers who will pay over $800 for a new graphics card will be looking at that parameter especially.
I won't, and I'm expecting to buy a 7950XTX. Still in the dabble phase with RT, so will take a look at a few titles and assume next generation again will provide some useful leaps.
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I've just watched the announcement. It's nice to see some beautifully rendered space images instead of the kitchen of some big boss. :p

Jokes aside - very professional, very lean, and no mention of Nvidia, or even GPU specs whatsoever. We'll have to wait for reviews to know anything for certain, I guess.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,246 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
Really need to see the final numbers bear out on these cards; RT is a big selling point for me, as is reconstruction.

If the 7900XTX is indeed a hell of a beast at 4K, and it can do RT with an equal or better relative performance drop than my 3080, it's a contender for sure, especially at such a better price vs the 4090.
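The "relative performance drop" comparison can be sketched like this; all fps numbers below are invented placeholders, not benchmark results:

```python
def rt_hit(raster_fps: float, rt_fps: float) -> float:
    """Fraction of performance lost when RT is enabled."""
    return 1.0 - rt_fps / raster_fps

# Hypothetical numbers, not benchmarks: the card with the higher raw fps
# isn't automatically the better RT buy; the relative hit matters too.
old_card = rt_hit(raster_fps=90.0, rt_fps=55.0)   # ~39% drop
new_card = rt_hit(raster_fps=140.0, rt_fps=90.0)  # ~36% drop
print(f"old: {old_card:.0%} drop, new: {new_card:.0%} drop")
```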

FSR3 FG I yearn for information on!

Ray Tracing is still a fancy doodad you enable to go OH THIS THIS IS COOL, in a small handful of triple A games, then turn it off after it slashes FPS or introduces all sorts of nasty motion smoothing, artifacts, and ghosting when DLSS or FSR is enabled.
Couldn't disagree more! No nasty motion smoothing, no artefacts, no ghosting, just the pinnacle of graphical fidelity in games, and now that I've had a taste of it, it's a must-have.

Not for you? Awesome, power to you; always leave it off and shout it from the rooftops apparently, but for me and many others like me, we love this stuff and it's a must-have.
 
Joined
Nov 13, 2007
Messages
10,845 (1.74/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42GHZ
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 1:1 CL30-36-36-76 FCLK 2200
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
I'm super torn... On one hand, ray tracing is the future and it looks and feels cool to use. On the other hand, MCM dies are the future, and I love the design, power, and size of this thing...

I'm gonna repaste my 3080, play some games, and wait for reviews.
 
Joined
Dec 22, 2011
Messages
3,890 (0.82/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Im super torn... on one hand Ray Tracing is the future and it looks and feels cool to use it. On the other hand MCM dies are the future, and I love the design , power, size of this thing...

Im gonna repaste my 3080, play some games and wait for reviews.

I wouldn't worry, these cards will be sold out day 0, and remain overpriced from that day onwards.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,246 (1.28/day)
Im gonna repaste my 3080, play some games and wait for reviews.
Love my 3080, and am continually impressed, even 2 years in, by how bloody good this card is at 4K heavily undervolted.

Like you, I'm weighing up that big leap to the next card; currently it feels like only the 4090 or 7900XTX could be that upgrade, but I need independent reviews first.
 
Joined
Jun 27, 2011
Messages
6,772 (1.37/day)
Processor 7800x3d
Motherboard Gigabyte B650 Auros Elite AX
Cooling Custom Water
Memory GSKILL 2x16gb 6000mhz Cas 30 with custom timings
Video Card(s) MSI RX 6750 XT MECH 2X 12G OC
Storage Adata SX8200 1tb with Windows, Samsung 990 Pro 2tb with games
Display(s) HP Omen 27q QHD 165hz
Case ThermalTake P3
Power Supply SuperFlower Leadex Titanium
Software Windows 11 64 Bit
Benchmark Scores CB23: 1811 / 19424 CB24: 1136 / 7687
Am I in the minority who thinks RT isn't important? Too few games use it. Most GPUs do not do it well yet. It needs to become far more mainstream and performant on lower-end hardware before I will care.
 
Joined
Jun 21, 2021
Messages
3,121 (2.44/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
Am I in the minority who thinks RT isn't important? Too few games use it. Most gpu's do not do it well yet. It needs to become far more mainstream and performative on lower end hardware before I will care.

Real-time ray tracing is promising technology for gaming but it's still in the early stages of adoption. RT hardware support really needs to show up in consumer hardware (smartphones, game consoles) for it to be "important."

AI-assisted upscaling technology is more important at this time.

But with each passing week, there are more and more games that support ray tracing. Some of the most popular titles now have it (Fortnite, Minecraft); it's no longer a niche benefit for obscure games.

It's important for AMD to figure out how to do ray tracing well, because the next-generation videogame consoles will most likely have the feature, and if AMD wants to contend as the supplier of those chips, it needs a good RT implementation and development environment.
 
Joined
Oct 12, 2005
Messages
713 (0.10/day)
The main difference between Nvidia and AMD is that Nvidia wants to dedicate a lot of die space to RT and AI, while AMD wants to do it with its compute units.

The upside for Nvidia is that when RT is used, they get a very large performance boost and compute performance is unaffected. On the other hand, when no ray tracing is used, those units sit idle.

The upside for AMD is that their compute units are used all the time, but performance suffers when ray tracing is on, because compute units are spent on it that could have been used for something else.

In the past, when we had a similar paradigm (general compute units versus dedicated hardware), the vendor with the dedicated hardware won initially, but in the long run the more flexible architectures won; think of fixed-function transform and lighting giving way to programmable shaders.

That doesn't mean that in 5 years Navi 31 will be faster at ray tracing than AD102, not at all, but in a few architectures it's quite possible that a wider RDNA with many more compute units becomes faster than an architecture with dedicated units.

So I see why AMD could be going in that direction right now, even if it means they don't win. They want to perfect it for later.

It's also quite possible that in a few generations, the compute performance for running shaders will be more than enough, and the spare power could go to ray tracing, or vice versa. Right now there is still too much to do in each compute unit, but we can imagine what a larger, 400 mm²+ RDNA 4.x on N3 with many more compute units could look like.

But right now, it looks like if you want to turn on RT at 4K, you'd better go green for now.
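The tradeoff above can be put in a toy model; the unit counts and the specialization factor are invented purely for illustration, not real hardware figures:

```python
SPECIALIZATION = 4.0  # assumed speedup of a dedicated RT unit over a generic CU

def frame_cost_dedicated(shade_work: float, rt_work: float,
                         cu: int = 80, rt_units: int = 20) -> float:
    # Shading runs on CUs while RT runs concurrently on dedicated units;
    # with RT off, the dedicated units simply sit idle (wasted die area).
    shade_time = shade_work / cu
    rt_time = rt_work / (rt_units * SPECIALIZATION)
    return max(shade_time, rt_time)

def frame_cost_shared(shade_work: float, rt_work: float,
                      cu: int = 100) -> float:
    # Same die area spent on more general CUs; all work contends for them.
    return (shade_work + rt_work) / cu

# RT off: the flexible design wins, because no silicon sits idle.
print(frame_cost_dedicated(1000, 0), frame_cost_shared(1000, 0))      # 12.5 10.0
# RT on: the dedicated design wins, because RT doesn't steal shader time.
print(frame_cost_dedicated(1000, 600), frame_cost_shared(1000, 600))  # 12.5 16.0
```

Which design wins flips depending on how much of the frame is ray traced, which is exactly the bet each vendor is making.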
 
Joined
Nov 4, 2005
Messages
12,013 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
The 4090 has 40% less performance when using RT vs not. How is that a performance boost, unless they're upscaling too? It applies to either brand, and RT is a generation away from becoming mainstream in games and hardware. For now it's like tessellated concrete barriers.
 
Joined
Apr 14, 2022
Messages
758 (0.77/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Acer Nitro XV271UM3B IPS 180Hz
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Keyboard Asus ROG Falchion
Software Windows 11 64bit
After the Ampere and RDNA2 series, I thought the need for more rasterisation wouldn't exist anymore. The fps have reached ridiculous and pointless levels.

Personally I would be happy with a 2080Ti Super with the same level of raster performance and 5-10x the RT.

All the AAA titles that are coming have RT, and a few have GI. And as previously stated, we pay for a new card in order to play the new games. If I want to play Tomb Raider and Star Wars, a 5700XT is more than enough.
It will be disappointing if they just match a 3080 in RT.
NVIDIA could kill the whole series with a single card.
 
Joined
Sep 6, 2013
Messages
3,392 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500
Motherboard X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2)
Cooling Aigo ICE 400SE / Segotep T4 / Νoctua U12S
Memory Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200
Video Card(s) ASRock RX 6600 + GT 710 (PhysX) / Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Naaah.

The very first thing is something that looks like a good deal.

This looks like a really good deal. There IS RT performance, let's face it. It will be playable, the gap today isn't even up to a point where it is a total no deal for RT on AMD. So you'll play your five games on it, and that's that.

Let's see the gaming library progress a bit on the RT front before we start calling it a must have feature. Its really not. Games look pretty without it. Games run like shit with it, in a relative sense. The technology is still sub par in both camps.
At the high end, many will choose Nvidia as more future-proof because of its performance in RT. More future-proof with a DisplayPort 1.4a connector. Nvidia is a master at offering cards with high specs today and a nicely hidden killer for those same cards, whether that's limited memory capacity or bandwidth, or now the video-out connectors.

My point is that at the high end, people will be looking at RT performance because they are spending a significant amount of money. In the mid range and low end (I have a hard time calling cards that will cost over $200 "low end", but anyway), AMD's RT performance will be problematic. The RX 7800 will be slower at RT, the RX 7700 slower still, and the RX 7600 slower again. So, if the RX 7600 costs $500 and is worse in RT than Nvidia's RTX 3070, or later the RTX 4060, people will keep blindly buying Nvidia cards even if AMD's cards are butchering them in raster. We see it today. I keep repeating that we already see it everywhere: in the Steam survey, in sales, in market share, in Nvidia's confidence that it can sell a worse card than AMD's equivalent model at a huge premium. Nvidia was doing the same 12 years ago, when it was selling cards that were much worse in gaming than AMD's offerings, by using PhysX and CUDA, plus friendlier promotion from tech sites, as strong marketing points.
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
We'll see. Historically there was a lot more going on, and if anything, what it shows is that these pushed technologies tend to just die off. What they did temporarily for market share is another thing, but RT is nowhere near a real presence in the market, despite what the push might want you to believe. The vast majority of implementations are utter crap and offer no benefit.

Still, I get the pull you describe about having a card with full support for everything; I talk about feature parity too, more often than not. But it's still part of a package, and RT is not a substantial part of it compared to price/perf in raster, availability, TDP, perf/W, and overall experience/quality. Nvidia's Ada has good perf/W and an RT lead, but everything else is substantially worse.
 
Joined
Sep 6, 2013
Messages
3,392 (0.82/day)
Location
Athens, Greece
What we'll see in the future is, I hope, me being completely, hilariously wrong.

But another thing I see lately is people buying based on performance today and features promoted today. And AMD is losing in both categories. AMD is promoting a platform built for longevity, CPUs with equal-performance cores, CPUs with high efficiency; and who is winning? Intel, because it offers more cores today, wins benchmarks today, and offers a cheaper platform today. People don't care that most of those Intel cores are efficiency cores. They don't care that the Ryzen 9 7950X will butcher the Intel i9-13900K in applications or games that need more than 8 P-cores in 5 years. They don't care that over those 5 years the AM5 owner will just be swapping CPUs instead of replacing the whole platform. And they don't care about efficiency.

Lately people care only about TODAY. And in GPUs, today is RT. In 5 years we might have something else, or RT may be dead, or programmers may start using RT efficiently enough that it doesn't kill performance. But today it does sell cards: overpriced cards that are worse than the competition's equivalent, better-performing-in-raster cards. Nvidia knows that; that's why it is throwing in fake frames in a still beta-testing form.

Above image from here.

People only look at bars; they only understand bars. And when in reviews they reach the page comparing models on RT performance, they will have the best excuse to go with the much more expensive Nvidia option. I mean, it's already baked into their brains that AMD sucks at drivers, right? RT performance will be another example that Nvidia is superior. They will totally disregard all those raster performance charts, because there even the slower Nvidia card will be good enough. They will focus on the RT charts, because there the AMD card will just be bad. So they will choose the "good enough at everything" card over the "better at most things, but bad at that new feature we heard transforms our games into reality" card.
Ironically, the higher price of the Nvidia model will be one more proof that RT is important and matters. Because "more expensive is better", right? I believe that's how the average consumer thinks: more expensive equals better, especially when the brand is much stronger.
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine

I really don't think RT is any kind of player in this game. This has been going on for 20 years. AMD recovered fine from the initial Turing release, as you can see, but the trend never changed for the last, what, 16-18 years. There is far more going on than just having feature X or Y. One might wonder whether the consumer even has enough influence on these duopolies ;)

The same thing happened wrt AMD/Intel.

[attached image]
 
Joined
May 17, 2021
Messages
3,043 (2.31/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Perless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
I guess stupid prices are here to stay. Not even AMD saved us from them.
 
Joined
Sep 6, 2013
Messages
3,392 (0.82/day)
Location
Athens, Greece
I really don't think RT is any kind of player in this game. This is 20 years ongoing. AMD recovered fine from the initial Turing release, as you can see, but the trend never changed for the last, what, 16-18 years.

Looking at that chart, Nvidia won significant market share twice based on features and once based on performance.

With the GeForce 8000 series in 2007, it offered high performance, but mostly new features like CUDA and PhysX. CUDA mattered back then for video transcoding, and PhysX promised amazing physics in games. AMD was offering some marvellous options in the mid range, like the HD 3850 and later the amazing HD 4870 and HD 4850, but never really recovered. Talk about bad drivers and the lack of CUDA and PhysX, combined with a tech press friendlier to Nvidia, was the reason. Even after introducing the amazing GCN-based HD 7000 series, people were still grumbling about AMD's bad drivers. When AMD introduced the R9 290X and R9 290, it was beating Nvidia, but tech sites drove readers' focus away from that fact and onto the over-90-Celsius temperatures of the reference models.

The GeForce 900 and GeForce 1000 series are the examples where Nvidia offered value and performance. It was the clear winner there over an AMD that was struggling to remain competitive. Looking at the chart, AMD did manage to close the gap with Polaris, because it offered high raster performance and great value, combined with excellent mining performance, at a time when people were asking for raster and/or mining performance at a great price. But look what happens when Nvidia comes out with the RTX 2000 series and ray tracing: market share jumps in its favour, because people rush to buy the new cards with that amazing new feature. The RTX 2080 Ti is obviously an overpriced card, but at the same time it is the perfect marketing vehicle for Nvidia to declare superiority and sell the cheaper models like hot cakes. After that, we have another mining period, where wafer capacity and mining performance probably dictate who wins market share.
 
Joined
Jan 14, 2019
Messages
12,577 (5.80/day)
Location
Midlands, UK
Am I in the minority who thinks RT isn't important? Too few games use it. Most gpu's do not do it well yet. It needs to become far more mainstream and performative on lower end hardware before I will care.
You're not in the minority, though opinions differ a lot on the topic. I personally think RT is important because some of the games I like use it, and its adoption is slowly but surely increasing year by year. Also, we already have plenty of raster performance (you can play non-RT games on an Nvidia Pascal GPU from 2016 just fine) - where we're lacking performance is RT without the DLSS/FSR upscaling bullshit.
 
Joined
Oct 12, 2005
Messages
713 (0.10/day)
The 4090 has 40% less performance when using RT vs not. How is that a performance boost unless they upscaling too? It applies to either brand, and RT is a generation away from becoming mainstream in games and hardware. For now it’s like tessellated concrete barriers.
The wording was not the best, I agree. Let's say they take a smaller performance hit for activating the feature.

I think most games can run some kind of RT right now, and if they do it well in conjunction with other effects, you can even do it on Turing/RDNA2. It all depends on how far you go. RT reflections in areas where screen-space reflections can't help already run well on those GPUs (like Far Cry 6).

It's when you want to do everything with RT (like Quake II RTX or Minecraft RTX) that performance becomes an issue.

If a game wants to use one of the RT effects in a raster pipeline (global illumination, reflections, or shadows), it can get away with it. Among them, I think global illumination has the most effect on visuals in most scenarios.

The thing is, game devs have got really good at faking effects, to the point that the visual gain is hard to see in a lot of cases. But those fakes are not realistic (although some can look better, because they were carefully crafted).

For rendering that aims to be photorealistic, RT will be needed.

But not every game must go that path; there are still plenty of styles around.
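The "one RT effect in a raster pipeline" idea can be sketched as a per-effect toggle; the names and cost multipliers here are made up for illustration and don't come from any real engine:

```python
from dataclasses import dataclass

@dataclass
class RTSettings:
    # Hypothetical per-effect toggles; a hybrid pipeline can enable just one.
    reflections: bool = False          # helps where screen-space reflections fail
    global_illumination: bool = False  # arguably the biggest visual win
    shadows: bool = False

    def estimated_cost(self) -> float:
        """Made-up relative frame-time multiplier vs pure raster."""
        cost = 1.0
        if self.reflections:
            cost *= 1.20
        if self.global_illumination:
            cost *= 1.35
        if self.shadows:
            cost *= 1.10
        return cost

hybrid = RTSettings(global_illumination=True)  # GI only, raster for the rest
everything = RTSettings(True, True, True)      # the "Quake II RTX" end of the scale
print(hybrid.estimated_cost(), everything.estimated_cost())
```

The point of the sketch is just that costs compound: enabling a single effect is a modest hit, while enabling everything multiplies up quickly.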
 