
AMD Radeon RX 8800 XT RDNA 4 Enters Mass-production This Month: Rumor

Joined
Jul 31, 2024
Messages
398 (3.13/day)
The 7800XT was not an upgrade. I have a 7800XT and I check the performance charts for my card quite often. The 6800XT was very often better than the 7800XT. Personally I see the 7800XT 5-10% behind the 6800XT. That's why some called the card a renamed 7700XT.

Other aspects mattered more to me than just the performance difference between the 7800XT and the 6800XT / 6950XT.
 
Joined
Nov 4, 2023
Messages
101 (0.25/day)
They probably had many bugs with RDNA3, and not just with the high-end models. The high power consumption during video playback is one example.
I am expecting them to remain out of the high end market as long as they see that consumers are unwilling to pay for their cards.
Seeing as the lower-end models were monolithic dies, I'm willing to bet those bugs, such as the high power consumption in video playback, were more software than hardware. And since higher power consumption during video playback isn't much of a concern for a desktop GPU, it was probably a lower priority to fix. The high end was chiplet-based and underperformed expectations even at AMD. Chiplets for a GPU are relatively new, so there is a larger chance of a severe hardware design flaw being uncovered at release.

The 7800XT was not an upgrade. I have a 7800XT and I check the performance charts for my card quite often. The 6800XT was very often better than the 7800XT. Personally I see the 7800XT 5-10% behind the 6800XT. That's why some called the card a renamed 7700XT.

Other aspects mattered more to me than just the performance difference between the 7800XT and the 6800XT / 6950XT.
If you look at the MSRP and adjust it for inflation, the 7800 XT was the upgrade for the 6700 XT, over which it was around 45% faster.

7900 xtx = 6900 xt
7900 xt = 6800 xt
7900 gre = 6800
7800 xt = 6700 xt
7700 xt = 6700
7600 xt = 6650 xt
7600 = 6600 xt
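A minimal sketch of the inflation argument above, using the cards' launch MSRPs ($479 for the 6700 XT in 2021, $499 for the 7800 XT in 2023); the ~13% cumulative 2021-2023 inflation figure is an assumed round number for illustration, not an official CPI value:

```python
# Hypothetical illustration: express the 6700 XT's launch MSRP in 2023
# dollars and compare it against the 7800 XT's launch MSRP.
INFLATION_2021_TO_2023 = 0.13  # assumed cumulative rate, illustration only

msrp_6700xt = 479  # 2021 launch MSRP (USD)
msrp_7800xt = 499  # 2023 launch MSRP (USD)

adjusted_6700xt = msrp_6700xt * (1 + INFLATION_2021_TO_2023)
# ~541 in 2023 dollars, i.e. above the 7800 XT's 499, so the 7800 XT
# lands in the same (or a slightly cheaper) price tier one generation later.
print(round(adjusted_6700xt))  # 541
```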
 
Joined
Apr 18, 2019
Messages
2,374 (1.15/day)
Location
Olympia, WA
System Name Sleepy Painter
Processor AMD Ryzen 5 3600
Motherboard Asus TuF Gaming X570-PLUS/WIFI
Cooling FSP Windale 6 - Passive
Memory 2x16GB F4-3600C16-16GVKC @ 16-19-21-36-58-1T
Video Card(s) MSI RX580 8GB
Storage 2x Samsung PM963 960GB nVME RAID0, Crucial BX500 1TB SATA, WD Blue 3D 2TB SATA
Display(s) Microboard 32" Curved 1080P 144hz VA w/ Freesync
Case NZXT Gamma Classic Black
Audio Device(s) Asus Xonar D1
Power Supply Rosewill 1KW on 240V@60hz
Mouse Logitech MX518 Legend
Keyboard Red Dragon K552
Software Windows 10 Enterprise 2019 LTSC 1809 17763.1757
I will be pleasantly surprised if AMD manages to even match the 7900XTX in raster.
Hoping for another 2900->3870 deal, but I'm doubtful
 
Joined
Dec 6, 2016
Messages
155 (0.05/day)
System Name The cube
Processor AMD Ryzen 5700g
Motherboard Gigabyte B550M Aorus Elite
Cooling Thermalright ARO-M14
Memory 16GB Corsair Vengeance LPX 3800mhz
Video Card(s) Powercolor Radeon RX 6900XT Red Devil
Storage Kingston 1TB NV2| 2x 1TB 2.5" Hitachi 7200rpm | 2TB 2.5" Toshiba USB 3.0
Display(s) Samsung Odyssey G5 32" + LG 24MP59G 24"
Case Chieftec CI-02B-OP
Audio Device(s) Creative X-Fi Extreme Audio PCI-E (SB1040)
Power Supply Corsair HX1200
Mouse Razer Basilisk X Hyperspeed
Keyboard Razer Ornata Chroma
Software Win10 x64 PRO
Benchmark Scores Mobile: Asus Strix Advantage G713QY | Ryzen 7 5900HX | 16GB Micron 3200MHz CL21 | RX 6800M 12GB |
You don't need to convince me on this. RTX 3050 sells 5-10 times better than RX 6600.
But at least AMD's cards will look more competitive, which will force the tech press to be less of a promoter of Nvidia hardware; that could be the first step toward a mentality change in the market.

I noticed this and it baffles me. The 3050 is so much slower than the 6600, but it still outsells it. I believe OEMs are mostly to blame for this, since as a consumer, it's not hard to google 6600 vs 3050.
 
Joined
Sep 6, 2013
Messages
3,358 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years latter got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Νoctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
I noticed this and it baffles me. The 3050 is so much slower than the 6600, but it still outsells it. I believe OEMs are mostly to blame for this, since as a consumer, it's not hard to google 6600 vs 3050.
The consumer will google that and end up with many posts saying "AMD drivers are trash, FSR is trash, only buy Nvidia cards" and will go and buy the Nvidia card.
 
Joined
Jan 27, 2024
Messages
227 (0.73/day)
Processor AMD
Motherboard AMD chipset
Cooling Cool
Memory Fast
Video Card(s) AMD/ATi Radeon | Matrox Ultra high quality
Storage Lexar
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Deepcool Gold 750W
Mouse Yes
Keyboard Yes
VR HMD No
Software Windows 10
Benchmark Scores Yes
If you look at the MSRP and adjust it for inflation, the 7800 XT was the upgrade for the 6700 XT, over which it was around 45% faster.

7900 xtx = 6900 xt
7900 xt = 6800 xt
7900 gre = 6800
7800 xt = 6700 xt
7700 xt = 6700
7600 xt = 6650 xt
7600 = 6600 xt

Performance upgrade:

7900 xtx = 6950 xt 36%
7900 xtx = 6900 xt 47%
7900 xt = 6800 xt 36%
7900 gre = 6800 31%
7800 xt = 6750 xt 40%
7800 xt = 6700 xt 48%
7700 xt = 6700 42%

These four are rebrands of one and the same thing:
7600 xt = 6650 xt 4%
7600 = 6600 xt 10%
 
Joined
Apr 2, 2011
Messages
2,820 (0.56/day)
So...lots of Nvidia and AMD hate in this thread. I support neither company, I buy a GPU that I can support.

That said, if the 8800 is equivalent to the 7900, it seems like the normal incremental improvement: +1 generation ~= +1 performance segment. That's usually coupled with more power draw... so putting out a lower-power-draw (and presumably cooler) version of the card would be great. The RT performance doesn't exactly concern me. As far as I'm concerned it's another TressFX. You remember that, don't you? The newest thing that was going to make hair rendering super realistic. The thing it seems like nobody actually remembers about that Tomb Raider game...

Seriously though, it's stupid to support a brand. I bought a 5070, a 3080, and haven't been able to justify any new GPU purchase since the prices went from high to utterly silly. 3060s selling in 2024 for $300 is utterly silly, and as long as the market tolerates that, Nvidia will continue to sell goods at eye-watering prices. I don't support AMD when they make bad products... and I didn't support Intel with their current crop of driver-crippled GPUs... but telling Nvidia that they can spoon-feed you slop and charge for filet mignon is frustrating. I hope the 8000 series helps to rectify that... but I see too many people who sprang for the 4060 to believe that the blatant price gouging will stop any time soon.
 
Joined
Jan 27, 2024
Messages
227 (0.73/day)
Processor AMD
Motherboard AMD chipset
Cooling Cool
Memory Fast
Video Card(s) AMD/ATi Radeon | Matrox Ultra high quality
Storage Lexar
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Deepcool Gold 750W
Mouse Yes
Keyboard Yes
VR HMD No
Software Windows 10
Benchmark Scores Yes
May I ask for the line for 7800XT = 6800XT?

It's more or less the same.

[attached performance chart]

 
Joined
Jun 1, 2010
Messages
388 (0.07/day)
System Name Very old, but all I've got ®
Processor So old, you don't wanna know... Really!
Low compared to what? 3090 Ti RT perf was not a good enough improvement from AMD?
I'll also remind you that Nvidia themselves did not massively increase RT performance from the 30 series.


Why would a 256bit 16GB G6 card with 7900XT raster and 4080 RT perf be priced above 750?
It would make zero sense. I can already buy a 4070 Ti Super for less, which equals the 7900XT in raster and beats it in RT.

For the 8800XT to have a chance, it must not be more expensive than the 7800XT was at launch. Meaning around 500. 550 at most.
The less it costs the better deal it will be. 450 would be good. 400 would be amazing. I doubt it will be less than 400.
I know I'll be beaten for what I'm about to write here. But still...

It's logically correct. But the truth is, AMD is not interested in such generosity of "giving away" VGAs for any less than nVidia. No matter how much "slower" RDNA4 is going to be compared to nVidia GPUs, AMD for the last five years (or even more) have shown by all their actions that they will price their (top) card similarly to nVidia's (top) "counterparts", even if there's an entire performance gulf between them. This is sad, but it feels like AMD is going to extract every last penny, much like nVidia.

This happened with the 5700XT, when AMD tried to mark up their raw and unfinished architecture and were simply forced to bring down the prices when public outrage exploded. They did that with X570/B550 MBs. They did that with Zen3, and with RDNA3 as well. Like when they priced the RX7700XT (which is an RX7600XT in reality) at a whopping $449. A 192-bit card for almost half a grand. The hubris and arrogance has no limits.


3090ti RT was not enough, no.

4090 RT isn't clearly enough either.

This whole thing is a farce.
Exactly! This was an nVidia game, only to inflate the prices of graphics cards.

I still think that unless GPU vendors start making RTRT hardware as separate AICs that scale akin to GPUs, there's no way GPUs will be able to push RT to any reasonable level. GPUs simply have no room for RT to scale. This is jack of all trades, master of none, since both the RT and raster parts share the silicon space and the thermal envelope, and neither can "breathe".
If GPUs were raster-only, they would have a much smaller footprint, both in size and in power. And everyone who wants to tinker with / enjoy the masochistic pleasure of limited RT capabilities would be able to add an RTRT card.
 
Joined
Aug 21, 2013
Messages
1,908 (0.46/day)
The whole RDNA2 line-up failed to outperform their MSRP counterparts from NVIDIA by a significant (15+ %) margin, at least. Also no DLSS, no CUDA, and non-existent RT performance on top of that + insane power spikes.
Also, 6500 XT. Not as bad as GT 1630 but still ridiculous.
"At least"? TPU shows the 6900XT losing only 10% to the 3090 while costing 500 less. That's an even better deal than the 7900XTX is today compared to the 4090.
The 6950XT was arguably an even better deal, increasing performance over the 6900XT by a further 7% while costing 100 more.
The 3090 Ti increased performance by 8% over the 3090 but added another 500, extending the price gap to 900.

We're also talking about 2020/2021 here. DLSS had just gotten to the point it was actually worth using but availability was still very limited and thus the fact that AMD did not have an answer at the time did not matter much. As for CUDA - well if Nvidia made it open source then AMD cards could run it no problem. You're also calling out non-existent RT perf. The same non-existent perf that applied to 3090 Ti...
RDNA3 is trickier:
7900 XTX looked somewhat attractive in comparison with the 4080; however, at this price point a gamer expects more than just raw raster performance. They want to enable everything, and you can't do that on a 7900 XTX. Thus, it had to be launched significantly below 1200. $850 tops.
7900 GRE is just an abomination and a half. At 600 dollars, it's a meager 10 to 20 % boost over the 4070 at the cost of being worse in power draw and in scenarios that aren't pure-raster gaming.
7800 XT is the same story as the 7900 XTX. NOT CHEAP ENOUGH TO CONVINCE. The 4070 is more feature-rich, and the performance difference is only visible with FPS graphs enabled; the $100 premium is low enough to swallow.
7700 XT is also an abomination.
7600... Don't even get me started, it's awful.
And can 4080 users enable everything and still enjoy high-refresh-rate gaming at 1200? Or would a sane person look at the 4080's price and conclude that if they're already ready to spend 1200, why not jump to the 4090?
AMD, unlike Nvidia, did not increase their top card's price. The 7900XTX launched at the same MSRP as the 6900XT.
7900 GRE was and is an odd release. Probably for dumping defective N31 dies.
7800 XT and 7700 XT were the most popular RDNA3 cards, I believe.
The 7600 may have been an awful 8GB card, but at least, unlike Nvidia's, it was not priced at 400 with another 100 charged for the clamshell 16GB version.
Not to mention Nvidia not even releasing a successor to the 3050 (a truly awful card that does not even have 8GB).
That's why I'm not delusional. I'm just strongly pessimistic, because AMD seem to live in a fairy tale where nothing NVIDIA makes beyond the 2080 Ti exists.
A strongly pessimistic person expects 550, or 600 at most. Not over 750. You realize that if the 8800XT really ended up costing 750+, then AMD would not be able to sell any, because the 7900XT and 7900XTX would be so much better deals?
The Radeon 5000 and Geforce GTX 1000 series were priced just fine. The Geforce RTX 2000 series introduced us to ray tracing where pricing started to get out of hand. Pricing went insane with the Geforce RTX3000, Geforce RTX4000, Radeon 6000 and Radeon 7000 series.
You're confusing something here. Yes, the 20 series was a massive price hike for very little substance, but the 30 series was very well priced thanks to a cheaper node. The 6000 and 7000 series had roughly the same prices, with some outliers. The 40 series was again a price hike.
GTX 480 was 250W, and it was not called "efficient". It was a disaster.
Couldn't you find anything more recent than a 15-year-old card?
Nvidia also had the 250W 780, 780Ti, 980Ti and 1080Ti. The 980Ti was praised for its power efficiency and the 1080Ti is legendary.

Also, you do not account for the fact that the 480 was a single-fan blower card and its performance was underwhelming.
Cooling 270W today is a far cry from cooling 250W fifteen years ago. The coolers are much bigger and can easily handle it.
Not to mention tolerable noise levels now vs then.
Forget what is playable. This is marketing. Someone pays $2000 for an RTX 4090, someone pays $1000 for the RX 7900XTX, and one gets 60fps and the other 15fps (I don't remember the exact framerates, but I think path tracing on those cards is like that). You know what you have? Not a playable game, but the first "proof" for the buyer of the RTX 4090 that their money was spent well. It's marketing, and Nvidia is selling cards because of RT and DLSS.
A playable framerate is not marketing. It is essential. A person buying a 7900XTX is not buying it for a 60fps tech demo.
Playing one tech demo at a barely playable framerate (these days I expect a high-refresh-rate experience at 90+) is not what I call "money spent well".
They might, and then what is AMD going to do? Lower the price to $500? Then to $450, and then to $400? Then in their financial results the gaming department will be deeper in the red than Intel's. From a gamer/consumer perspective we all love low prices. But with Nvidia having all the support in the world, with countless people out there educated to love Nvidia products and hate AMD products, and with countless people willing to spend more money on a worse Nvidia product over a better AMD product, aggressive pricing could end up a financial disaster for AMD. So they need to be careful. Now, if RDNA4 is a marvel of an architecture that they know Nvidia can't counter, and if we assume they have secured enough wafers to cover the high demand we could expect from a positive consumer reaction, then and only then will AMD price their products aggressively. Putting an MSRP of $400 and failing to cover demand, or scalpers driving the price to $600, will do AMD no good, only harm.
Nvidia lowering prices while manufacturing costs go up and the new G7 is also more expensive? Never gonna happen. The best we can expect is the same price, and that's assuming they're feeling generous and cut into their margins.
AMD won't start a price war with Nvidia because they don't have the cash reserves and the capacity.
Nvidia won't start a price war with AMD because they want to grow their cash reserves.
Time — yeah, potentially RT can be faster since you don’t have to manually set up lighting.
That's patently false. It's actually double the work for devs now, since they still have to do manual lighting and RT on top of that.
Only games that fully rely on RT, where it can't be disabled, can claim a workload reduction.
Absolutely. I find it amusing how people now look at cards that are nearly double the TDP and it’s apparently fine, no problem there.
But it *IS* fine, because we have much better coolers that don't sound like fighter jets on an afterburner.
Even if this thing does indeed have 45% better RT performance or whatever, it won't make a difference to the market share situation.
And people like john_ will still complain that AMD "only" manages 4080S RT performance. Nothing new here.
Conveniently ignoring the fact that Nvidia themselves do not give 4090 RT performance for 1/4th the price.

Nvidia does not really care about RT availability or market penetration. They only care how much more they can charge for this on their top cards.
If they truly cared (like they claim) they would do everything in their power to produce cheap mainstream cards with good RT perf.

Ironically, it's AMD who has managed to bring RT to the masses, even on consoles. TBH I did not think consoles would get RT so soon and at this level of performance.
8800 XT, 220 watt card?
Nasty! /s.
This happened with the 5700XT, when AMD tried to mark up their raw and unfinished architecture and were simply forced to bring down the prices when public outrage exploded. They did that with X570/B550 MBs. They did that with Zen3, and with RDNA3 as well. Like when they priced the RX7700XT (which is an RX7600XT in reality) at a whopping $449. A 192-bit card for almost half a grand. The hubris and arrogance has no limits.
X570 was justified because it had a Gen4-capable chipset in 2019, something Intel introduced a whole two years later (with fewer lanes).
Today's AM5 prices regardless of the chipset are way more arrogant.

Zen 3 also had a massive performance increase. RDNA3 had some bad examples, but the top card did not increase in price.
The 7700XT may have been that, but at least it was 12GB. Meanwhile, Nvidia asked 400 for an 8GB card and a whopping 500 for 16GB, despite AMD proving with the 7600XT that going from 8GB to 16GB does not add 100 to the price. Not to mention that I remember the 7700XT being out of stock because people bought it up, compared to the 7800XT.
I still think that unless GPU vendors start making RTRT hardware as separate AICs that scale akin to GPUs, there's no way GPUs will be able to push RT to any reasonable level.
I agree, but practically I don't see this happening. The overhead of moving data over PCIe is so large that for real-time rendering this would introduce a whole host of problems that were prevalent in the SLI/CF days, including the dreaded micro-stutter. Maybe future Gen6 or similar speeds can mitigate this somewhat, but that still leaves the extra-slot problem: most motherboards do not have an extra x16 electrical (not just physical) slot to plug in that RT card.
 
Joined
Jan 8, 2017
Messages
9,464 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Another thing to consider is that it seems UE5 is becoming an industry standard for better or worse (mostly worse) and most games are going to feature software RT in the form of Lumen with hardware RT being either optional or not really bringing much to the table, this will further muddy the waters.

There is a serious non-zero chance that dedicated RT hardware will fade away in favor of more general-purpose performance; in computing this has happened very often historically.
 
Joined
Jun 2, 2017
Messages
9,252 (3.37/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
This post eerily describes me as well right down to the 7900 XT, my current GPU.
I also have a 7900XT and feel exactly the same way.

Another thing to consider is that it seems UE5 is becoming an industry standard for better or worse (mostly worse) and most games are going to feature software RT in the form of Lumen with hardware RT being either optional or not really bringing much to the table, this will further muddy the waters.

There is a serious non-zero chance that dedicated RT hardware will fade away in favor of more general-purpose performance; in computing this has happened very often historically.
Check Freesync vs Gsync.
 
Joined
Nov 27, 2023
Messages
2,402 (6.42/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (24H2)
That's patently false. It's actually double the work for devs now, since they still have to do manual lighting and RT on top of that.
Only games that fully rely on RT, where it can't be disabled, can claim a workload reduction.
…do you struggle with the word “potentially”? Nothing I said is false, “patently” or not.
And no, even with the current limited RT usage it is not even close to being double work, especially seeing the industry converging onto UE5 where RT implementation is baked into the pipeline.

But it *IS* fine, because we have much better coolers that don't sound like fighter jets on an afterburner.
A better cooler doesn’t change the fact that more power equals more heat dumped into your case, out of it and into ones room. I already said that I don’t care one way or the other, my personal preferences are just that. If people are willing to accept 500 watt GPUs - more power to them.

oh hey that’s actually a good joke
 
Joined
Jan 8, 2017
Messages
9,464 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Epic made a really bizarre choice: instead of hardware Lumen simply being a faster version of software Lumen, it also features more complex ray tracing and runs worse than the software version. This will lead to the hilarious situation where consumers face a choice between ray tracing and slower ray tracing; at a glance the two will look similar, but they'll be bewildered by the fact that one runs worse.
 
Joined
Aug 21, 2013
Messages
1,908 (0.46/day)
…do you struggle with the word “potentially”? Nothing I said is false, “patently” or not.
And no, even with the current limited RT usage it is not even close to being double work, especially seeing the industry converging onto UE5 where RT implementation is baked into the pipeline.
So 1.5 times the work and it's somehow better?
A better cooler doesn’t change the fact that more power equals more heat dumped into your case, out of it and into ones room. I already said that I don’t care one way or the other, my personal preferences are just that. If people are willing to accept 500 watt GPUs - more power to them.
270W of heat is not uncomfortable or unmanageable, even during summer heatwaves. I have 375W now, and that is approaching uncomfortable levels in summer.
 
Joined
Jan 27, 2024
Messages
227 (0.73/day)
Processor AMD
Motherboard AMD chipset
Cooling Cool
Memory Fast
Video Card(s) AMD/ATi Radeon | Matrox Ultra high quality
Storage Lexar
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Deepcool Gold 750W
Mouse Yes
Keyboard Yes
VR HMD No
Software Windows 10
Benchmark Scores Yes
8800 XT, 220 watt card?

With those specs, maybe +5-10% over the aging RX 6800 XT (year 2020). The only thing that could save AMD's face is the price: if it is $399, it will sell; if it is $499-599, it won't.
 
Joined
Jan 17, 2018
Messages
435 (0.17/day)
Processor Ryzen 7 5800X3D
Motherboard MSI B550 Tomahawk
Cooling Noctua U12S
Memory 32GB @ 3600 CL18
Video Card(s) AMD 6800XT
Storage WD Black SN850(1TB), WD Black NVMe 2018(500GB), WD Blue SATA(2TB)
Display(s) Samsung Odyssey G9
Case Be Quiet! Silent Base 802
Power Supply Seasonic PRIME-GX-1000
Low compared to what? 3090 Ti RT perf was not a good enough improvement from AMD?
I'll also remind you that Nvidia themselves did not massively increase RT performance from the 30 series.
The problem with AMD's 7000 series RT capability is that its pure-raster vs RT performance was only around 2-4% better than the 6000 series in games with more than superficial RT (see Control). They said that RDNA3 was better at ray tracing, but that 'better' was basically a meaningless improvement. The only real difference was the pure rasterization capability, which gave the cards better performance overall.

It would be nice to see the 8000 series get a more significant ray tracing boost than the 7000 series did. RDNA4 really needs a regression of no more than around 48% (which would still put it behind Nvidia) in ray tracing vs pure raster to make it worth considering ray tracing, even in older titles, at anything more than 1080p.
 
Joined
Jan 8, 2017
Messages
9,464 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
The problem with AMD's 7000 series RT capability is that its pure-raster vs RT performance was only around 2-4% better than the 6000 series in games with more than superficial RT (see Control). They said that RDNA3 was better at ray tracing, but that 'better' was basically a meaningless improvement.
You need to brush up on your math; that's not how those percentages work. Those are performance regressions for each card with RT on, and you can't just subtract percentages like that. If you do the math correctly, RDNA3 is about 8% better, and that's about the same improvement Nvidia achieved this generation as well.

Suppose 100 is the baseline: -61% leaves 39, -58% leaves 42, and 42/39 ≈ 1.08, i.e. about 8%.
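The ratio arithmetic can be sketched in a few lines of Python (the 61% and 58% RT-on regression figures are the ones quoted in this exchange, used purely for illustration):

```python
def retained(regression_pct: float, baseline: float = 100.0) -> float:
    """Performance left over after an RT-on regression, e.g. 61 -> ~39."""
    return baseline * (1.0 - regression_pct / 100.0)

rdna2 = retained(61)  # ~39
rdna3 = retained(58)  # ~42

# The improvement is the ratio of retained performance, not the
# difference of the regression percentages (58 - 61 would suggest 3%).
improvement = rdna3 / rdna2 - 1.0
print(f"{improvement:.1%}")  # 7.7%
```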
 
Joined
Aug 21, 2013
Messages
1,908 (0.46/day)
With those specs, maybe +5-10% over the aging RX 6800 XT (year 2020). The only thing that could save AMD's face is the price - if it is $399, it will sell, if it is $499-599, it won't.
Faster than that. It is reported to be at 7900 XT (raster) performance, which is 36% faster than the 6800 XT, and way faster than the 6800 XT in RT (4080S perf).
It will sell fine even at 499, unless the 7900 XT drops to 499 in clearance sales (which it won't).
599 is a much tougher sell. Personally I don't think it will be either 399 or 599; both are unrealistic. I'm betting on 499.
 
Joined
Jan 17, 2018
Messages
435 (0.17/day)
Processor Ryzen 7 5800X3D
Motherboard MSI B550 Tomahawk
Cooling Noctua U12S
Memory 32GB @ 3600 CL18
Video Card(s) AMD 6800XT
Storage WD Black SN850(1TB), WD Black NVMe 2018(500GB), WD Blue SATA(2TB)
Display(s) Samsung Odyssey G9
Case Be Quiet! Silent Base 802
Power Supply Seasonic PRIME-GX-1000
You need to brush up on your math, that's not how those percentages work, those are performance regressions for each card with RT on, you can't just subtract percentages like that. If you do the math correctly RDNA3 is about 8% better and that's about the same Nvidia achieved this generation as well.

Suppose 100 is baseline, -61% is 39, - 58% is 42, 42/39 ~= 8%.
Math wrong, noted.

That being said, an 8% improvement is not enough when you're already ~33% behind.
 
Joined
Nov 27, 2023
Messages
2,402 (6.42/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (24H2)
So 1.5 times the work and it's somehow better?
Are you arguing with ghosts now? You do realize that I said this in this very thread:
Real-time RT as it is in GAMES today is little more than a gimmick.
I am not saying that RT based engines aren’t the way forward - they inevitably are as essentially THE holy grail of real-time rendering. But the push started way, waaaaay too early.
Do I need to further elaborate on my position or nah?

270W of heat is not uncomfortable or unmanageable, even during summer heatwaves. I have 375W now, and that is approaching uncomfortable levels in summer.
Okay? Your point being? Are you trying to explain why me preferring lower TDP cards as a personal choice is wrong somehow or…?
 
Joined
Dec 25, 2020
Messages
6,835 (4.74/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
When RX 7000 came out I was screaming about the low RT performance. I was called an Nvidia fanboy back then.
A few years later, and probably with Sony pushing AMD in that direction, the rumors talk about a new RX 8000 series that mostly increases ray tracing performance.
Better late than never....

Don't worry, it's never been about RT itself but more about "Radeons are bad at X and Y so X and Y do not matter, because it exposes a deficiency in my favorite brand"

Better late than never, sadly RTX 50 series will walk all over it

I noticed this and it baffles me. The 3050 is so much slower than the 6600, but it still outsells it. I believe OEMs are mostly to blame for this, since as a consumer, it's not hard to google 6600 vs 3050.

Raw performance is half the story. By giving even the lowly RTX 3050 full access to the entire RTX Studio suite, heavily investing in day-one game-ready drivers, providing years of updates, etc., NV captures this value-sensitive market. Besides, a 3050 will run eSports and most F2P phenomenon games just fine at very high settings and good frame rates.
 