
AMD Reportedly Readying RX 6900 XTX, Bringing the Battle to NVIDIA RTX 3090

Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Quite a few people, actually.
Quantify that :) You really can't. People buy the latest GPU, so you don't even know if people who bought RTX even care. It's guesswork, yours as much as anyone else's.

I think it's much more realistic to look at economic realities and actual content, both of which aren't rosy for RT.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,246 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
How about a different approach:
Cyberpunk 2077 with RT on and DLSS on, at 4K high settings on a 3090, runs below 60 FPS.
Now my question again is: is the hardware for RT sufficient for games now, considering CP2077 is not fully ray traced?
So your personal measure of success for hardware or game implementation is CP2077, a game where even rasterization-only cards perform below where you'd expect, or fully RT games? That was never my argument, and I've already agreed RT hardware capability needs to and will increase dramatically over coming generations.
Now tell me, when people see this happening on a card for $3k, would they still take your side?
Current market conditions aside, not only is it a $1500 card, but a card with ~90% of its performance has an MSRP of $700, so you don't need to spend $3k to get good RT perf. And again, I said RT performance generally wasn't a major factor in my choice, but it has sufficient grunt in the here and now for most of today's games with RT.
If you expect me to find a link or proof that the 'MAJORITY OF PEOPLE' are saying it as I've expressed it, then you are mistaken. There is no link for 'the majority of people' saying this.
When you say it like it's a fact, I'd have assumed that notion comes from somewhere other than your opinion.
So here is my question for you, since you don't agree with my statement 'majority of people say RT is good but the hardware lacks': is the hardware we currently have enough for fully ray traced games? Because the games will get more demanding for sure, on both the RT side and the rasterization side.
Your measure of success, not mine. I'm saying a good hardware pairing can make for great experiences today.
And one more question: how do you want me to give you a link for 'majority of people say'? I think the conclusion is the one you seek, not a link or a source for 'majority of people say'.
Cite a source? A TPU frontpage poll, another tech community poll of some sort, I dunno, something more than how you feel, packaged as how most people feel.

I think it's much more realistic to look at economic realities and actual content, both of which aren't rosy for RT.
Such a shame it's here to stay and a major part of the future of real-time graphics, eh. Both of those realities will catch up.
 
Joined
May 31, 2016
Messages
4,440 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Such a shame it's here to stay and a major part of the future of real-time graphics, eh. Both of those realities will catch up.
Will catch up but not for current hardware.
When you say it like it's a fact, I'd have assumed that notion comes from somewhere other than your opinion.
Isn't it a fact? What fact do you have that you can support with a source?
So your personal measure of success for hardware or game implementation is CP2077, a game where even rasterization-only cards perform below where you'd expect, or fully RT games? That was never my argument, and I've already agreed RT hardware capability needs to and will increase dramatically over coming generations.
No it isn't, it's an example. Do you have a better one, say for a 3060 or 3070? Or are we going to focus on a $3k card? MSRP is meaningless at this point in time, I'm sure you know that, right?
Current market conditions aside, not only is it a $1500 card, but a card with ~90% of its performance has an MSRP of $700, so you don't need to spend $3k to get good RT perf. And again, I said RT performance generally wasn't a major factor in my choice, but it has sufficient grunt in the here and now for most of today's games with RT.
Why current market conditions aside? You want me to drop the market conditions because they don't suit your argument, but you knock me for choosing CP2077 as an example of insufficient hardware for RT? What about other cards like the 2070 or 3070 and below? Do those not matter as well?
Your measure of success, not mine. I'm saying a good hardware pairing can make for great experiences today.
Yes, and it doesn't have to be RT for a great experience, especially if it doesn't bring much to the game really.
Cite a source? A TPU frontpage poll, another tech community poll of some sort, I dunno, something more than how you feel, packaged as how most people feel.
Do you have a link or a source that contradicts what I said? Just curious.
 
Joined
Nov 11, 2016
Messages
3,459 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
Do you have a link or a source that contradicts what I said? Just curious.

HUB poll
[image: raytrace.png, HUB poll on how buyers weigh ray tracing vs rasterization]


So you would belong to the minority (28%) who don't care about RT, yet speak the loudest?
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Such a shame it's here to stay and a major part of the future of real-time graphics, eh. Both of those realities will catch up.
Not at all! See, I'm not opposed to the technology or the advancement... I'm opposed to the way it is being rolled out, the timing of it, and the effect on GPU pricing and quality.

It feels like one step forward, two steps back. I'll happily jump in on it when the market deals with it in a healthy way, but it never has and isn't looking to anytime soon. And in that, RT(X) is different from previous big changes in GPU hardware and capabilities. Today it's a tech for the happy few - a handful even - because GPUs in general are unobtanium.

HUB poll
[attachment 215079: HUB ray tracing poll image]

So you would belong to the minority (28%) who don't care about RT, yet speak the loudest?

Ahem, selective reading much? A VAST majority actually cares about raster BEFORE RT. And that is exactly the choice you have in the market right now, too. 51 + 28%. Not 28%.

I also believe that aligns well with what most topics on the subject contain. A few happy campers being all over RT, and a vast majority just waiting it out while using raster only, but looking forward to seeing where it goes next.

The clear minority is those who advocate RT progress before, or even equal to, raster progress: 13 + 5 + 3%. This poll simply offers five degrees of how much you care about one compared to the other, weighing them all.
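
To make that bucket math explicit, here is a minimal tally of the poll as I read it; a sketch where the shares are the percentages quoted in this thread, and the labels are my paraphrase of the poll options, not HUB's exact wording:

```python
# Tally of the HUB ray tracing poll as quoted in this thread.
# Bucket labels are paraphrased from context, not HUB's exact wording.
poll = {
    "raster only, RT not considered": 28,
    "raster first, RT secondary":     51,
    "RT and raster weighted equally": 13,
    "RT first, raster secondary":      5,
    "RT is a must":                    3,
}

raster_first = poll["raster only, RT not considered"] + poll["raster first, RT secondary"]
rt_equal_or_first = 100 - raster_first

print(f"raster before RT:  {raster_first}%")       # 79%
print(f"RT equal or above: {rt_equal_or_first}%")  # 21%
```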

You know what would kick RT adoption up in a major way? Nvidia making sure an RT capable GPU lands in the hands of every gamer. Now, look at the market again ;)
 
Joined
Nov 11, 2016
Messages
3,459 (1.17/day)
Ahem, selective reading much? A VAST majority actually cares about raster BEFORE RT. And that is exactly the choice you have in the market right now, too. 51 + 28%. Not 28%.

I also believe that aligns well with what most topics on the subject contain. A few happy campers being all over RT, and a vast majority just waiting it out while using raster only, but looking forward to seeing where it goes next.

The clear minority is those who advocate RT progress before, or even equal to, raster progress: 13 + 5 + 3%. This poll simply offers five degrees of how much you care about one compared to the other, weighing them all.

You know what would kick RT adoption up in a major way? Nvidia making sure an RT capable GPU lands in the hands of every gamer. Now, look at the market again ;)

LMAO, so between 2 GPUs with equal rasterization performance and cost per frame (rasterized), where one is superior in RT to the other, which do you think people are buying? Wouldn't that make 72% of people pick Ampere and the other 28% pick RDNA2? This is without factoring in DLSS, which skews buyers towards Ampere even further.

You sure know the market so tightly that you would deny any market study made by professionals, so I wouldn't bother sourcing any :roll:.

Btw Intel is also dead set on Ray Tracing and XeSS, which will make Ray Tracing more accessible in the future
 
Joined
May 31, 2016
Messages
4,440 (1.42/day)
Btw Intel is also dead set on Ray Tracing and XeSS, which will make Ray Tracing more accessible in the future
Intel is not dead set on RT, but they can't afford to be behind even if the majority doesn't really care, because that would be bad marketing, and no company will go that route.
You sure know the market so tightly that you would deny any market study made by professional so I wouldn't bother sourcing any :roll:.
So what is the market study according to professionals, and who are they?
 
Joined
Nov 11, 2016
Messages
3,459 (1.17/day)
Intel is not dead set on RT, but they can't afford to be behind even if the majority doesn't really care, because that would be bad marketing, and no company will go that route.

So what is the market study according to professionals, and who are they?

Huh, why would Intel focus on something the majority of people don't care about? (Is 28% a majority? :roll:) Dedicated RT cores cost die size, you know.

Jon Peddie Research? Nvidia sure sold a hell of a lot of GPUs in Q1/Q2 compared to AMD, and the Steam Hardware Survey counts those Ampere users nicely, but I guess someone here doesn't trust the HWS all that much.
 
Joined
Feb 3, 2017
Messages
3,822 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
If I were to make an educated guess, RTX 4000 will offer the same improvement to both rasterization and RT compared to RTX 3000, meaning the perf cost with RT on will remain relatively constant between Turing, Ampere and Ada.
To be fair, the improvement in RT cores from Turing to Ampere was quite minimal in performance. They were mostly just reorganizing and moving things around.
 
Joined
Nov 11, 2016
Messages
3,459 (1.17/day)
To be fair, the improvement in RT cores from Turing to Ampere was quite minimal in performance. They were mostly just reorganizing and moving things around.

It's a trade-off between rasterization and RT: if Nvidia adds more RT cores, it takes away available space for CUDA cores. The current DXR is hybrid rasterization and RT, so reducing rasterization to boost RT would just come out to roughly the same overall FPS number.
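
To illustrate, here is a toy frame-time model; a sketch under the simplifying assumptions that the raster and RT passes run serially and that each pass's time scales inversely with the die area given to its units (the numbers are made up, not measurements):

```python
# Toy model of a hybrid DXR frame: a raster pass plus an RT pass, run serially.
# Assumption: each pass's time scales inversely with the die area
# allocated to its units. Illustrative numbers only.

def frame_time_ms(raster_area, rt_area, raster_ms=10.0, rt_ms=6.0):
    """raster_ms and rt_ms are the pass times at area = 1.0."""
    return raster_ms / raster_area + rt_ms / rt_area

balanced = frame_time_ms(raster_area=1.0, rt_area=1.0)  # 16.0 ms
rt_heavy = frame_time_ms(raster_area=0.9, rt_area=1.1)  # ~16.6 ms

for name, t in (("balanced die", balanced), ("more RT, less raster", rt_heavy)):
    print(f"{name}: {t:.1f} ms -> {1000 / t:.1f} FPS")
# Shifting area speeds up the RT pass but slows the raster pass,
# so the overall FPS barely moves.
```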
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,246 (1.28/day)
Will catch up but not for current hardware.
Sure, current hardware when compared to 1,2 ...3 generations forward will be far less performant at RT, I still agree, but you get to ride that wave, which has already started.
Isn't it a fact? What fact do you have that you can support with a source?
Dude, I was curious; sources do exist for these sorts of things, one just got posted, more on that later. If you want to talk perceptions I'll gladly give you mine, but I won't claim that my experience of what people say and like is representative of a majority; by definition it's far too small a sample size.
No it isn't, it's an example. Do you have a better one, say for a 3060 or 3070? Or are we going to focus on a $3k card? MSRP is meaningless at this point in time, I'm sure you know that, right?
Why current market conditions aside? You want me to drop the market conditions because they don't suit your argument, but you knock me for choosing CP2077 as an example of insufficient hardware for RT? What about other cards like the 2070 or 3070 and below? Do those not matter as well?
Most of those cards (AMD and Nvidia) can be bought at or close to MSRP (the caveat being some manufacturers' MSRPs are what they are; you'll pay more for a ROG STRIX than you will a TUF, for example) if you are determined enough, but sure, the market is what it is. Again, I'm not claiming a majority of people are buying RTX or even RT-capable cards primarily for RT games, but when it's a strong secondary or tertiary factor, that's nice too, as can be other features or talking points, like potentially more VRAM, power efficiency, software suite etc.
Yes, and it doesn't have to be RT for a great experience, especially if it doesn't bring much to the game really.
That's cool too. It's not essential, but it's fun and fascinating.
Do you have a link or a source that contradicts what I said? Just curious.
The one that got posted by @nguyen is interesting, in the way that we can already see in these comments there are multiple interpretations of the data.

My take would be that ~20% put a fair bit of stock in it for a current purchase, and another 50% still regard it as a meaningful feature. Being that it is the future, this notion is only going to increase.
 
Joined
Sep 15, 2011
Messages
6,762 (1.40/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
I miss the days when the top dogs were actually affordable and easy to get. This is now just like watching a battle between a Ferrari and a Lambo.
Interesting, but that's about it.
 
Joined
May 31, 2016
Messages
4,440 (1.42/day)
The one that got posted by @nguyen is interesting, in the way that we can already see in these comments there are multiple interpretations of the data.

My take would be that ~20% put a fair bit of stock in it for a current purchase, and another 50% still regard it as a meaningful feature. Being that it is the future, this notion is only going to increase.
To be fair, what our colleague here posted is not what this is all about. It is not about the opinion of people who think RT is currently OK or would rather stay with rasterization.
He's completely disregarded the fact that the hardware is barely capable of RT and you need a pile of money to make it happen in a respectable way. The opinion of people on whether to invest in RT now or not is clearly leaning towards rasterization anyway. RT is great and it will be the future, but it won't happen tomorrow considering the pace of hardware advancement, keeping in mind what ray tracing can really do.
That's cool too. It's not essential, but it's fun and fascinating.
Sure, it may be fascinating, but it's still a long way down the road for me.
Most of those cards (AMD and Nvidia) can be bought at or close to MSRP (the caveat being some manufacturers' MSRPs are what they are; you'll pay more for a ROG STRIX than you will a TUF, for example) if you are determined enough, but sure, the market is what it is. Again, I'm not claiming a majority of people are buying RTX or even RT-capable cards primarily for RT games, but when it's a strong secondary or tertiary factor, that's nice too, as can be other features or talking points, like potentially more VRAM, power efficiency, software suite etc.
Is it really possible to get a card for MSRP? I know the 2000 series cards cost more now than they did at launch.

Sure, current hardware when compared to 1,2 ...3 generations forward will be far less performant at RT, I still agree, but you get to ride that wave, which has already started.
Actually, you don't have to. If it doesn't bring much to the game and causes a huge performance impact, you can simply wait, because the money you have to put in versus the performance you get is not a good trade.
Dude, I was curious; sources do exist for these sorts of things, one just got posted, more on that later. If you want to talk perceptions I'll gladly give you mine, but I won't claim that my experience of what people say and like is representative of a majority; by definition it's far too small a sample size.
Which source have you provided yourself to support your point of view? Our colleague's post is beside the point in our argument.
 
Joined
Jan 27, 2008
Messages
25 (0.00/day)
System Name Home
Processor Ryzen 5 5600X@4.8Ghz
Motherboard Gigabyte Aorus Pro AXi 550B
Cooling Custom water cooling CPU+GPU (dual 240mm radiators)
Memory 32GB DDR4 3600Mhz CL16
Video Card(s) Gigabyte RTX2080 Super Waterforce
Storage Adata M.2 SX8200Pro 1TB + 2TB Crucial MX500 2.5" SSD + 6TB WD hdd
Display(s) Acer Nitro XF252Q FullHD 240hz + 1440p 144hz AOC
Case CoolerMaster NR200 white
Audio Device(s) SteelSeries Arctis 9 Wireless headset
Power Supply Corsair SF600 Platinum
Mouse Logitech G Pro Wireless Superlight
Keyboard Logitech G915 TKL
Software Windows 10
HUB poll
[attachment 215079: HUB ray tracing poll image]

So you would belong to the minority (28%) who don't care about RT, yet speak the loudest?

This poll shows that you and other RT diehards are 8%! 'No compromises' gaming, as has been said here.
Then there are the 13% who think it's 50-50. If we even add them to the 'RT is a must' camp, you get 21% of the poll.
79% do not CARE about RT!

This is what I call a majority. I have seen similar percentages, give or take, on other techtuber polls.

If we can play RT games without any performance loss (60 FPS minimum with everything maxed out) and without the need for FSR, DLSS, XeSS etc., then I can say it is not a gimmick.
Right now, every time I play and try to enjoy it, there is weird shimmering when using DLSS. I noticed it in Metro EE, CP2077 and Call of Duty: Warzone. No bueno.
Unless DLSS improves on this, I will not be using it, and that automatically means I cannot use RT at resolutions higher than 1440p without it.
1080p with RT turned on is fine on most RT-capable cards.
 
Joined
May 31, 2016
Messages
4,440 (1.42/day)
It's a trade-off between rasterization and RT: if Nvidia adds more RT cores, it takes away available space for CUDA cores. The current DXR is hybrid rasterization and RT, so reducing rasterization to boost RT would just come out to roughly the same overall FPS number.
And that, my friend, clearly shows ray tracing will have a hardware problem running well enough on all hardware in all given scenarios. Not to mention the advancement of ray tracing in games is not helping either if you have to make trade-offs; you bet on a horse and now must support it with the marketing. That is how I see it today.
 
Joined
Feb 3, 2017
Messages
3,822 (1.33/day)
This is getting really off topic.
And that my friend is clearly showing Ray Tracing will have hardware problems to make it run good enough on all hardware in all given scenarios. Not to mention the advancement in Ray Tracing in games is not helping either if you have to make trade-offs, you bet on a horse and will must support it with the marketing. That is how I see it today.
This is a classic chicken and egg problem. Developing RT effects does not make sense if there is no hardware to support it. Adding hardware to support RT does not make sense if there are no RT effects in software. RT Cores take up ~3% on Turing dies. A relatively small investment for a new capability, giving the industry time to adjust and use/show it off in a still meaningful way. Scaling them up should not be difficult but there is the same question of cost vs benefit.

In addition to optimizing the very specifically ray tracing side of things, RT advancements in games and software are very much around all the supporting areas that are not hardware accelerated. Notably, building and updating the BVH and other data structures.

Game development is a multi-year process, and largely only this year have the major game engines implemented RT effects in a real (and non-beta) way. APIs have somewhat matured in their first incarnation.
Similarly, with AMD having hardware support for RT and Intel also getting some hardware support for RT, this is already becoming industry-wide.

Clamoring for full path tracing is one possible viewpoint, but dismissing RT entirely until then is very short-sighted IMO. RT has obvious benefits for the use cases it is already being used for - basically everything to do with lighting, but in practice mostly GI, shadows, and some aspects of transparency. Reflections too, but that probably has more to do with us not having found better ways to do proper reflections than with the current strengths of RT.
 
Joined
Nov 11, 2016
Messages
3,459 (1.17/day)
This poll shows that you and other RT diehards are 8%! 'No compromises' gaming, as has been said here.
Then there are the 13% who think it's 50-50. If we even add them to the 'RT is a must' camp, you get 21% of the poll.
79% do not CARE about RT!

This is what I call a majority. I have seen similar percentages, give or take, on other techtuber polls.

If we can play RT games without any performance loss (60 FPS minimum with everything maxed out) and without the need for FSR, DLSS, XeSS etc., then I can say it is not a gimmick.
Right now, every time I play and try to enjoy it, there is weird shimmering when using DLSS. I noticed it in Metro EE, CP2077 and Call of Duty: Warzone. No bueno.
Unless DLSS improves on this, I will not be using it, and that automatically means I cannot use RT at resolutions higher than 1440p without it.
1080p with RT turned on is fine on most RT-capable cards.

Dude, saying 'I don't care about RT' is very different from saying 'I care about raster but RT is also important', it's not that nuanced.
Idk about you, but I would turn on RT reflections in CP2077 and tweak other settings for ~60 FPS (including DLSS).

Also, I'm seeing less shimmering in CP2077 with DLSS as opposed to native rendering; DLSS is just black magic really :D

As for RT, it can make a game look like a different, more advanced version of the same game, The Ascent for example.
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
LMAO, so between 2 GPUs with equal rasterization performance and cost per frame (rasterized), where one is superior in RT to the other, which do you think people are buying? Wouldn't that make 72% of people pick Ampere and the other 28% pick RDNA2? This is without factoring in DLSS, which skews buyers towards Ampere even further.

You sure know the market so tightly that you would deny any market study made by professionals, so I wouldn't bother sourcing any :roll:.

Btw Intel is also dead set on Ray Tracing and XeSS, which will make Ray Tracing more accessible in the future

Don't move the goalposts if you stand corrected now, mate. That's not fair, is it? You completely misread that poll, end of story, and I assume it's because of tinted glasses - not necessarily bad, but just be honest about it. You are entitled to your opinion as much as we all are.

I'll be the first to admit that if RT gains traction, it gains traction. Really. But it hasn't and doesn't - and no, the games don't look like different games either. Most of them look like somebody was too busy cleaning everything meticulously. Reflections are overdone, lighting passes generally lack balance, but yes, they all move all the time, fully dynamic - that is true. Is it a different experience in terms of gaming? Absolutely not, it just looks like an extra post effect. In Metro Exodus, some scenes even detract from being a good experience. You linked The Ascent; the first thing I noticed was the stuttering in the RT half of that video.

What we have now is 16-20% of the GPU floorplan reserved to boost RT performance (extra cache also serves it). Those cores are working hard to produce somewhat flashier reflective surfaces and somewhat more dynamic lighting and shadow. That is all she wrote so far, and often just picking one of them - never all at once. Imagine how much more you need hardware-wise to make it all tick.

Also... it's good Intel is stepping into this race too. But all these generations amount to attempts at finding a good equilibrium between marketable RT and not-too-shitty raster. The balancing act is ongoing, and every company is going to try and carve out a unique selling point. AMD really does it too, even by not offering hardware for it - if they find ways to accelerate RT performance on their regular GPU floorplan, they have a winner that Nvidia can never get close to again. Alternatively, if Nvidia proves they can do fantastic RT (they haven't, yet) at good performance levels on marketable dies without per-title support being required, they also have a winner. So far, though, all Nvidia has shown us is price hikes for a minimal RT perf advantage, and exploding TDPs.

So far, not one single colour/camp has the best solution, and it is anyone's guess where this might go. The economy around GPUs plays a major role in it too, and it's not looking very good currently. That's a huge part of what colours that poll you linked, too. If it's affordable, people will happily adopt.
 
Joined
Feb 3, 2017
Messages
3,822 (1.33/day)
What we have now is 16-20% of the GPU floorplan reserved to boost RT performance (extra cache also serves it).
RDNA2 Infinity Cache is ~15% of the die and it is huge. Pretty sure it has more benefits than just aiding AMD's RT performance.
 
Joined
May 31, 2016
Messages
4,440 (1.42/day)
This is a classic chicken and egg problem. Developing RT effects does not make sense if there is no hardware to support it. Adding hardware to support RT does not make sense if there are no RT effects in software. RT Cores take up ~3% on Turing dies. A relatively small investment for a new capability, giving the industry time to adjust and use/show it off in a still meaningful way. Scaling them up should not be difficult but there is the same question of cost vs benefit.

In addition to optimizing the very specifically ray tracing side of things, RT advancements in games and software are very much around all the supporting areas that are not hardware accelerated. Notably, building and updating the BVH and other data structures.

Game development is a multi-year process, and largely only this year have the major game engines implemented RT effects in a real (and non-beta) way. APIs have somewhat matured in their first incarnation.
Similarly, with AMD having hardware support for RT and Intel also getting some hardware support for RT, this is already becoming industry-wide.

Clamoring for full path tracing is one possible viewpoint, but dismissing RT entirely until then is very short-sighted IMO. RT has obvious benefits for the use cases it is already being used for - basically everything to do with lighting, but in practice mostly GI, shadows, and some aspects of transparency. Reflections too, but that probably has more to do with us not having found better ways to do proper reflections than with the current strengths of RT.
I'm not dismissing anything. Ray tracing is cool, really, but the hardware can't support it the way it should. Every company teases its customers with the new features and capabilities, but that doesn't change the fact that RT has a huge gap in hardware requirements which is hard to compensate for.
As for the better ways, as you probably know, there are demos that don't use RT. Reflections, lighting, shadows are all there, and you couldn't tell the difference. RT is great, no doubt, but for now it is mostly a marketing point.
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
RDNA2 Infinity Cache is ~15% of the die and it is huge. Pretty sure it has more benefits than just aiding AMD's RT performance.

I believe part of it is compensating for the choice of GDDR6, isn't it?

I wasn't referring to AMD's 16-20% - I was referring to Nvidia's, where shader efficiency hasn't improved since Pascal; instead the shaders were chopped up into smaller bits, and the cache is likely a byproduct of that too. And they needed that change to cater to their additional RT/Tensor patchwork on top. Not to improve raster.

Meanwhile:

6900XT = 520 mm²
3090 = 628 mm²

That's about 20% bigger right there. Despite the changes since Turing, they still have a net die size increase of 20% for similar rasterized perf. Albeit on a slightly bigger node.

Stay on topic guys. This is about the possible XTX variant, not about RT.

True actually.
 
Joined
May 31, 2016
Messages
4,440 (1.42/day)
Meanwhile:

6900XT = 520 mm²
3090 = 628 mm²

That's about 20% bigger right there. Despite the changes since Turing, they still have a net die size increase of 20% for similar rasterized perf. Albeit on a slightly bigger node.
I wonder how much bigger Samsung's 8nm is vs TSMC 7nm. Something tells me that if you shrank the 3090 die to TSMC 7nm, it would still be bigger than the 6900 XT's die.
Almost 100 mm² is a lot of a difference.
 
Joined
Feb 3, 2017
Messages
3,822 (1.33/day)
I believe part of it is compensating for the choice of GDDR6, isn't it?

I wasn't referring to AMD's 16-20% - I was referring to Nvidia's, where shader efficiency hasn't improved since Pascal; instead the shaders were chopped up into smaller bits, and the cache is likely a byproduct of that too. And they needed that change to cater to their additional RT/Tensor patchwork on top. Not to improve raster.

Meanwhile:
6900XT = 520 mm²
3090 = 628 mm²

That's about 20% bigger right there. Despite the changes since Turing, they still have a net die size increase of 20% for similar rasterized perf. Albeit on a slightly bigger node.
As you well understood, my post was deliberately ignoring Nvidia there :)

There is a reddit post where detailed Turing die shots were analyzed. What he came up with seems correct enough; Tensor cores and FP16 capabilities may be more nuanced, but RT Cores are distinguishable and straightforward. RT Cores make up about 6% of a TPC and about 3% of total die size. The increase for Tensor cores and/or FP16 capability concurrent with FP32 has more/most uses outside RT, same for cache. The implementation for AMD and Intel should not be too different in terms of transistor and area cost, possibly less.

I wish there were good/readable enough die shots for RDNA2 and Ampere, but apparently there are none so far. We would also need comparisons without RT, and in the case of RDNA, where RT capability is part of some other block (TMU?), it is probably impossible to read.

3090 is on Samsung 8N, 6900XT is on TSMC N7:
- 3090 die is 28.3B transistors on 628 mm² - 45 MTr/mm²
- 6900XT die is 26.8B transistors on 520 mm² - 51 MTr/mm²
This highlights the differences in manufacturing processes more than anything.
In terms of the transistor/area cost of the latest improvements, RDNA2 has a huge number of transistors in Infinity Cache (at least 6.4B plus some control logic, which is 24% of total transistors), and Ampere no doubt has a lot of transistors in the doubled ALUs in its shaders.

More cache has been the go-to improvement for a few generations before RDNA and Turing. More likely than not adding more and more cache (at different levels) would happen with or without RT.

I wonder how much bigger Samsung's 8nm is vs TSMC 7nm. Something tells me that if you shrank the 3090 die to TSMC 7nm, it would still be bigger than the 6900 XT's die.
Almost 100 mm² is a lot of a difference.
Assuming a similar transistor density to the 6900 XT, a 3090 on N7 would be about 5.5% larger than the 6900 XT die, about 30 mm².
That assumption is obviously suspect though. Without Infinity Cache the 6900 XT die would be noticeably less dense. On the other hand, there is A100 on TSMC's N7 with 54.2B transistors and 826 mm², making the density out to 65.6 MTr/mm².
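
For the record, those density figures and the hypothetical shrink reduce to a few lines of arithmetic; a quick check using only the transistor counts and die areas quoted in this post:

```python
# Densities from the transistor counts (billions) and die areas (mm^2)
# quoted above.
dies = {
    "RTX 3090 (Samsung 8N)": (28.3, 628),
    "RX 6900 XT (TSMC N7)":  (26.8, 520),
    "A100 (TSMC N7)":        (54.2, 826),
}

density = {name: tr * 1000 / area for name, (tr, area) in dies.items()}
for name, d in density.items():
    print(f"{name}: {d:.1f} MTr/mm^2")   # ~45.1, ~51.5, ~65.6

# Hypothetical: a 3090 ported to N7 at the 6900 XT's achieved density.
shrunk_mm2 = 28.3 * 1000 / density["RX 6900 XT (TSMC N7)"]
print(f"3090 at 6900 XT density: {shrunk_mm2:.0f} mm^2, "
      f"{shrunk_mm2 / 520 - 1:+.1%} vs the 6900 XT die")   # ~549 mm^2, +5.6%
```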
 
Joined
Jun 5, 2021
Messages
284 (0.22/day)
As you well understood, my post was deliberately ignoring Nvidia there :)

There is a reddit post where detailed Turing die shots were analyzed. What he came up with seems correct enough; Tensor cores and FP16 capabilities may be more nuanced, but RT Cores are distinguishable and straightforward. RT Cores make up about 6% of a TPC and about 3% of total die size. The increase for Tensor cores and/or FP16 capability concurrent with FP32 has more/most uses outside RT, same for cache. The implementation for AMD and Intel should not be too different in terms of transistor and area cost, possibly less.

I wish there were good/readable enough die shots for RDNA2 and Ampere, but apparently there are none so far. We would also need comparisons without RT, and in the case of RDNA, where RT capability is part of some other block (TMU?), it is probably impossible to read.

3090 is on Samsung 8N, 6900XT is on TSMC N7:
- 3090 die is 28.3B transistors on 628 mm² - 45 MTr/mm²
- 6900XT die is 26.8B transistors on 520 mm² - 51 MTr/mm²
This highlights the differences in manufacturing processes more than anything.
In terms of the transistor/area cost of the latest improvements, RDNA2 has a huge number of transistors in Infinity Cache (at least 6.4B plus some control logic, which is 24% of total transistors), and Ampere no doubt has a lot of transistors in the doubled ALUs in its shaders.

More cache has been the go-to improvement for a few generations before RDNA and Turing. More likely than not adding more and more cache (at different levels) would happen with or without RT.

Assuming a similar transistor density to the 6900 XT, a 3090 on N7 would be about 5.5% larger than the 6900 XT die, about 30 mm².
That assumption is obviously suspect though. Without Infinity Cache the 6900 XT die would be noticeably less dense. On the other hand, there is A100 on TSMC's N7 with 54.2B transistors and 826 mm², making the density out to 65.6 MTr/mm².
So TSMC's 7nm is advertised at 91 MTr/mm², but only achieves 51 MTr/mm² on the RX 6900 XT. So TSMC's scaling of SRAM and logic is less than advertised?
 