
AMD Radeon RX 9070 XT Alleged Benchmark Leaks, Underwhelming Performance

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,984 (6.72/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2x BitFenix 230mm Spectre Pro LED (Blue, Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Good for AMD then that you aren't leading it; you'd run it into the ground in 6 months lol

RT is the single biggest leap in photorealism in real-time rendering, EVER. You either use AMD and can't enjoy the feature properly, needing FSR Performance, or you are simply trolling. It's here to stay; games are starting to use RT features that can't even be disabled anymore. AMD is lagging behind sorely because of RT, upscaling, and the other bells and whistles. RDNA2 was very competitive in raster and often faster, and yet it still got beat, because the vast majority wants these new features; only a very tiny minority on the Internet moans about RT being a gimmick.

Upscaling has surpassed native in more and more ways ever since DLSS 2.0 launched; if you still get artifacting or shimmering, then you are an FSR user.

And anyone who calls it a gimmick has his brain shoved where the sun doesn't shine.

RT, upscaling, and AI are the future, and any company that does not implement better solutions will be left in the dust.
Delusional much?
 
Joined
Feb 24, 2021
Messages
175 (0.12/day)
System Name Upgraded CyberpowerPC Ultra 5 Elite Gaming PC
Processor AMD Ryzen 7 5800X3D
Motherboard MSI B450M Pro-VDH Plus
Cooling Thermalright Peerless Assassin 120 SE
Memory CM4X8GD3000C16K4D (OC to CL14)
Video Card(s) XFX Speedster MERC RX 7800 XT
Storage TCSunbow X3 1TB, ADATA SU630 240GB, Seagate BarraCuda ST2000DM008 2TB
Display(s) AOC Agon AG241QX 1440p 144Hz
Case Cooler Master MasterBox MB520 (CyberpowerPC variant)
Power Supply 600W Cooler Master
Using a path tracing benchmark? What AMD card supports path tracing?
They all do. They're just not very good at it.
Path tracing isn't magic; it's mathematics, just like any other method of generating computer graphics. You can ray trace or path trace on a GPU with no RT cores, or even on a CPU. It won't run very well, but you can do it. The indie game "Teardown" is fully ray-traced (not rasterised - its use of voxels allows ray tracing to work at a low ray count without looking like complete ass), doesn't use RT cores, and is playable (albeit only at relatively low resolutions and frame rates) on old GPUs like the RX 580 and GTX 1060. Nvidia's RT cores are just much better at path tracing than AMD's, and RDNA4 will hopefully change that.
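To make the "it's just mathematics" point concrete, here's a toy CPU ray caster in Python. It's purely an illustration I made up for this post (nothing to do with Teardown's actual renderer): one sphere, one ray per pixel, a ray-sphere intersection solved as a quadratic, printed as ASCII. No RT cores, no GPU, just arithmetic:

```python
# Toy CPU "ray tracer": one ray per pixel, one sphere, ASCII output.
# Purely illustrative -- real renderers trace bounces, sample materials, etc.

def hit_sphere(center, radius, origin, direction):
    # A ray hits the sphere where |origin + t*direction - center|^2 = radius^2.
    # Expanding gives a quadratic in t; a non-negative discriminant means a hit.
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    return b * b - 4.0 * a * c >= 0.0

WIDTH, HEIGHT = 60, 30
for y in range(HEIGHT):
    row = ""
    for x in range(WIDTH):
        # Map the pixel to [-1, 1] and fire a ray straight down the -z axis.
        origin = (2.0 * x / WIDTH - 1.0, 2.0 * y / HEIGHT - 1.0, 0.0)
        row += "#" if hit_sphere((0.0, 0.0, -3.0), 0.8, origin, (0.0, 0.0, -1.0)) else "."
    print(row)
```

Scale the ray count, bounce depth, and scene complexity up by a few orders of magnitude and you see why dedicated hardware matters: the maths stays the same, the throughput doesn't.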

I hate to be that guy, but literally nobody cares about path tracing when it comes with that much of a performance penalty. Not even the most die-hard Nvidia zealots running 4090s. Ask anyone running one if they'd rather run this game at 1080p 60 FPS path traced, or at 4K 60 FPS with RT and DLSS Quality on their 4K monitor, when actually playing the game and not benchmarking.
You don't need to hate anything; I completely agree.
I used Cyberpunk 2077 with PT as an example because I wanted a situation that was as close to a measure of pure ray/path-tracing performance as possible. The overwhelming majority of games which use RT or PT are primarily rasterised, and only overlay the tracing for reflections, lighting, and shadows as an additional effect or embellishment on top of the rasterised image.
My choice of example was intended to show a situation where tracing performance is the primary factor in the performance result, and rasterisation isn't significant.
I fully understand that this isn't representative of the difference in performance in realistic gaming scenarios, and I apologise if my choice of example was misleading.

In a more realistic situation, of a primarily rasterised game which uses some traced effects, an RX 7900 GRE is much closer to the performance of an RTX 4070 Ti, and an RX 7900 XTX is often faster overall. The point I was trying to make is that the Nvidia GPU will lose much less performance when ray/path tracing is enabled compared to when it's disabled, and that RDNA4 having 3x the tracing performance of RDNA3 would allow them to close this gap. For example, rasterisation might be 85% of the frame time for an RTX 4070 Ti, with the remaining 15% being ray tracing, while an RX 7900 XTX might need to spend 45% of its frame time on ray tracing; so even though its rasterisation performance is much higher, it might not be much faster overall in games that use ray tracing.
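To put rough numbers on that (using the illustrative 45%/15% splits above, not measured data), this is just Amdahl's law applied to a frame: only the traced portion of the frame time benefits from the rumoured 3x uplift. The function name here is mine, purely for illustration:

```python
# Illustrative frame-time arithmetic -- the percentage splits are the
# hypothetical figures from the paragraph above, not benchmark data.
def speedup_with_faster_rt(rt_share, rt_speedup):
    # New frame time = raster portion unchanged + RT portion divided by the uplift.
    new_frame_time = (1.0 - rt_share) + rt_share / rt_speedup
    return 1.0 / new_frame_time  # overall FPS gain

print(f"{speedup_with_faster_rt(0.45, 3.0):.2f}x")  # ~1.43x -- RT-heavy frame (7900 XTX-like split)
print(f"{speedup_with_faster_rt(0.15, 3.0):.2f}x")  # ~1.11x -- raster-heavy frame (4070 Ti-like split)
```

So even in an RT-heavy frame, a 3x RT uplift turns into roughly a 1.4x overall gain, which is exactly why it reads as "catching up" rather than "taking the lead".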

Also, over the next few years, more games will make use of more intensive ray/path-traced effects, so tracing performance will become even more important over time. Even so, I don't expect that examples as extreme as Cyberpunk 2077 with PT will be directly relevant to the average gamer any time soon, but it is still indirectly relevant, as an indication of ray/path-tracing performance as a component of total gaming performance.

I was trying to highlight the point that RDNA4 having 3 times the ray tracing performance of RDNA3 would neither be impossible, nor would it give AMD a performance lead over competing Nvidia GPUs with similar rasterisation performance. It would merely be AMD catching up with Nvidia. 3 times the ray tracing performance is not equivalent to 3 times the performance in every game that uses ray tracing.
 

Am*

Joined
Nov 1, 2011
Messages
337 (0.07/day)
System Name 3D Vision & Sound Blaster
Processor Intel Core i5 2500K @ 4.5GHz (stock voltage)
Motherboard Gigabyte P67A-D3-B3
Cooling Thermalright Silver Arrow SB-E Special Edition (with 3x 140mm Black Thermalright fans)
Memory Crucial Ballistix Tactical Tracer 16GB (2x8GB 1600MHz CL8)
Video Card(s) Nvidia GTX TITAN X 12288MB Maxwell @1350MHz
Storage 6TB of Samsung SSDs + 12TB of HDDs
Display(s) LG C1 48 + LG 38UC99 + Samsung S34E790C + BenQ XL2420T + PHILIPS 231C5TJKFU
Case Fractal Design Define R4 Windowed with 6x 140mm Corsair AFs
Audio Device(s) Creative SoundBlaster Z SE + Z906 5.1 speakers/DT 990 PRO
Power Supply Seasonic Focus PX 650W 80+ Platinum
Mouse Logitech G700s
Keyboard CHERRY MX-Board 1.0 Backlit Silent Red Keyboard
Software Windows 7 Pro (RIP) + Winbloat 10 Pro
Benchmark Scores 2fast4u,bro...
I was trying to highlight the point that RDNA4 having 3 times the ray tracing performance of RDNA3 would neither be impossible, nor would it give AMD a performance lead over competing Nvidia GPUs with similar rasterisation performance. It would merely be AMD catching up with Nvidia. 3 times the ray tracing performance is not equivalent to 3 times the performance in every game that uses ray tracing.
I do agree -- but in my opinion, ray tracing is not where AMD's priority should be... rather, it's feature parity with CUDA, DLSS, RTX Video, and RTX HDR. Those are the only features I've actually missed moving from my RTX 3060 to the 7800 XT -- and by tackling those, they'll get a 10x better return on their money than investing in anything related to ray tracing, since it would also finally showcase what their cards can do in AI workloads for professionals, which is still where the current investor gold rush is. Even if AMD beats Nvidia in ray tracing by 20% at every equivalent SKU, the market will still pick Nvidia's GPUs over AMD's for DLSS or CUDA alone.

So long as AMD are in both of the higher-end consoles (relative to Nintendo, at least) from Microsoft and Sony, ray tracing is going nowhere for mainstream gaming and will remain an afterthought -- at least until next-gen consoles launch. That much is evident now, considering we've had GPUs with this capability for almost 7 years with barely any progress, compared to past generations like the GTX 400 series, where mass feature adoption happened within about 3-5 years of launch and almost everyone got upgraded to hardware capable of the latest feature set, like decent tessellation performance. By fragmenting the market with multiple different versions of DLSS, selling SKUs incapable of ray tracing like the GTX 16-series cards, the scalping/unavailability of GPUs for about 2 years, and the lack of VRAM progression, Nvidia have been their own worst enemy in slowing down mass adoption of ray-tracing-capable GPUs. And on top of this, there is the huge number of people running old integrated graphics or GPUs several generations old with no ray tracing capability -- money no game developer is willing to turn down voluntarily, especially when so many games are being re-released/ported from last-gen consoles with not much besides some minor visual improvements.
 
Joined
Feb 2, 2022
Messages
11 (0.01/day)
System Name CryBaby The God
Processor 9800x3d @ 5.6Ghz
Motherboard MSI MEG x870e GODLIKE
Cooling MSI MEG s360 Cooler w/ IPS
Memory 32gb G.Skill Neo ROYAL 8000c34 Platinum
Video Card(s) Asus Tuf OC RTX 4090
Storage Crucial t705 Gen5 M.2 1TB (Win11), Samsung 990 Pro Gen4 M.2 2TB (Games)
Display(s) 34" MSI MPG 34" OLED 3440x1440 @ 240Hz + 48" LG cx OLED @ 4k/120Hz
Case Asus ProArt PA602 Wooden Edition
Audio Device(s) Godlike>Supra Excalibur USB>Schiit Yggdrasil A2 DAC>Flux Mentor Amp > Norne Vykari > Hifiman He1000s
Power Supply Superflower LeadexVII 1200w Platinum pcie5.1 ATX3.1 PSU
Mouse Logitech G502
Keyboard Razer Huntsman (SE-GOW)
Known this for about a year; the big move in late 2025 into '26 is UDNA.
IF they launched UDNA late this year, I'd be shocked; I don't expect that until fall of the following year.
That said, I HOPE I am wrong and that UDNA is imminent. It is so weird how they just plan to release a 70-tier card and call it 'good' for an entire GPU generation.
That shitz Whack AF, y'all.
 