
AMD Announces RDNA 3 GPU Launch Livestream

Joined
Jul 19, 2006
Messages
43,592 (6.67/day)
Processor AMD Ryzen 7 7800X3D
Motherboard ASUS TUF x670e
Cooling EK AIO 360. Phantek T30 fans.
Memory 32GB G.Skill 6000Mhz
Video Card(s) Asus RTX 4090
Storage WD m.2
Display(s) LG C2 Evo OLED 42"
Case Lian Li PC 011 Dynamic Evo
Audio Device(s) Topping E70 DAC, SMSL SP200 Headphone Amp.
Power Supply FSP Hydro Ti PRO 1000W
Mouse Razer Basilisk V3 Pro
Keyboard Tester84
Software Windows 11
Wouldn't it be safer and easier to go down to your local brick and mortar store and talk to them and see if they can reserve a card for you?
Those don't exist anywhere near me anymore.
 
Joined
Feb 11, 2009
Messages
5,431 (0.97/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb NVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
I hope they just deliver solid GPUs with an eye (a feeling of responsibility) on power consumption.

There's no need to compete with the ridiculous RTX 4090; just make something even better than the RX 6800 (XT) while consuming barely any extra power and I'll be very happy... well, if the price is right, of course
 
Joined
Sep 15, 2016
Messages
482 (0.17/day)
As excited as I am for all tech launches, I'm very doubtful that AMD can compete with NVIDIA in Ray Tracing this year. If I'm wrong I'll return my 4090, but I'm not expecting much.
 
Joined
Feb 11, 2009
Messages
5,431 (0.97/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb NVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
As excited as I am for all tech launches, I'm very doubtful that AMD can compete with NVIDIA in Ray Tracing this year. If I'm wrong I'll return my 4090, but I'm not expecting much.

The problem is, it's all in development; it's hard to say if Nvidia's extra RT performance would even be worth it, really, and even that depends on the person.
Imagine if future RT is optimized more for the consoles, which makes sense, and AMD cards can run it just fine without too much of a performance hit (instead of 100 fps, you get 80) while Nvidia cards can run it without a performance hit (100 fps). Is that then that much of a factor?

We don't know what the future holds or how it will develop; even if Nvidia holds a large RT performance advantage, who is to say it really matters?

It's all speculation, we will see. I do really like what RT is doing for the world of gaming, so I also hope it's improved significantly, but the consoles run what they run and games will be made for those, which will have an effect on RT development
 

AsRock

TPU addict
Joined
Jun 23, 2007
Messages
18,922 (3.05/day)
Location
UK\USA
Processor AMD 3900X \ AMD 7700X
Motherboard ASRock AM4 X570 Pro 4 \ ASUS X670Xe TUF
Cooling D15
Memory Patriot 2x16GB PVS432G320C6K \ G.Skill Flare X5 F5-6000J3238F 2x16GB
Video Card(s) eVga GTX1060 SSC \ XFX RX 6950XT RX-695XATBD9
Storage Sammy 860, MX500, Sabrent Rocket 4 Sammy Evo 980 \ 1xSabrent Rocket 4+, Sammy 2x990 Pro
Display(s) Samsung 1080P \ LG 43UN700
Case Fractal Design Pop Air 2x140mm fans from Torrent \ Fractal Design Torrent 2 SilverStone FHP141x2
Audio Device(s) Yamaha RX-V677 \ Yamaha CX-830+Yamaha MX-630 \Paradigm 7se MKII, Paradigm 5SE MK1 , Blue Yeti
Power Supply Seasonic Prime TX-750 \ Corsair RM1000X Shift
Mouse Steelseries Sensei wireless \ Steelseries Sensei wireless
Keyboard Logitech K120 \ Wooting Two HE
Benchmark Scores Meh benchmarks.
Hey man, this is the Internet. TPU worships at the Altar of the Almighty Pageview.

You clicked the link, just like everyone else who read this thread.

Clearly you don't spend much time online.

Returning to the topic, I did not know the stream would start at 1pm PDT; this article confirmed that. A more normal start time for Pacific Time Zone events is 10am PT. This goes back to the printed periodical era (pre-2000s), when journalists had PM deadlines for East Coast-based media companies.

I read AMD and that's all it had to say for me to click it; after all, I have been waiting over 2 years for a new video card. But I guess I should have known better than to just somewhat blindly click, haha. And you never know - with the rumors of a delay, it might have changed.

And you clearly know nothing about me ;).

It's been a long wait; I just wish they'd get it over and done with already, ha.
 
Joined
Dec 28, 2012
Messages
3,553 (0.85/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
They got hacked and their closed-source Linux drivers were leaked. In their true anti-capitalist crony style, they failed in their attempt to look benevolent by waiting until everyone forgot about it two months later. They also capitalized on their mindshare, which was only possible because Intel had been suppressing AMD for most of its existence.
People are still using the "mindshare" excuse for AMD's inability to straighten anything out of their own accord for over a decade?

Would like to find out if we get a midrange (x700XT) series of GPUs right at launch and what their availability date would be.
Gonna guess almost no availability. The last few GPU launches from AMD have been paper launches.

Wouldn't it be safer and easier to go down to your local brick and mortar store and talk to them and see if they can reserve a card for you?
In America the only store like that left is Micro Center, and most Americans live a minimum of 3+ hours away from one. The cost of gas will eat up whatever savings you'd get. Anything local disappeared years ago; the only things left are fly-by-night computer repair shops that I wouldn't trust with a Raspberry Pi, let alone anything expensive.
 
Joined
Mar 10, 2010
Messages
11,878 (2.28/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
I can't wait, tbh. I need some extreme OT at work, or a bit of lottery or bingo luck, though. :D


Choices yay.
 
Joined
Jun 21, 2021
Messages
2,923 (2.69/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
Software macOS Sonoma 14.5 (with latest patches)
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
Wouldn't it be safer and easier to go down to your local brick and mortar store and talk to them and see if they can reserve a card for you?

A lot of those local mom-and-pop PC stores have steadily shuttered in recent years as their Baby Boomer owners who started their businesses in the Eighties have reached retirement age with no one to pick up the reins.

I am grateful that there are still a few great mom-and-pop PC stores in my area.

Computers are commodities now; people buy and throw them away (er, recycle them) every few years. Hell, even Microsoft belatedly accepted the fact that most people don't upgrade Windows, which is why you can get an OEM key for Windows 10 Pro for $7 these days.

The number of people who open up a PC case to remove a component and install a new one is a very, very small portion of the consumer userbase. Joe Consumer is just going to buy a Dell, HP, or Alienware box and when it starts running "too slow" they'll just buy a new one and put the old computer in a kid's room.
 
Joined
May 2, 2017
Messages
7,762 (2.99/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I like the highlighting of efficiency - which has of course been a part of their promises for RDNA3 since its first concrete mention, but still - and it also makes me wonder what we can expect here.

- Not quite 4090 performance, but noticeably lower power?
- Noticeably slower than the 4090, but also much lower power?
- Matching or beating the 4090, at lower power?

That's pretty much in the order I consider most likely - they've promised +50% perf/W over RDNA2, after all, which would place a 330W RDNA3 GPU at beating the 4090 at 1440p and trailing slightly at 2160p (using the 6950XT in TPU's reviews as a baseline). If they stick to 300W like the 6900XT, that would fit pretty well with that first suggested scenario. Definitely going to be interesting!
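
For anyone who wants to play with that kind of back-of-the-envelope estimate, here's a minimal sketch of the perf/W extrapolation. The board-power and relative-performance figures below are illustrative placeholders I've assumed, not TPU's measured data:

# Rough perf/W extrapolation sketch; all baseline figures are
# illustrative placeholders, not measured review data.
baseline_power_w = 335       # assumed 6950XT board power
baseline_rel_perf = 1.00     # 6950XT taken as the reference point
perf_per_watt_gain = 1.5     # AMD's promised ">50%" perf/W uplift, taken at face value

rdna3_power_w = 330          # hypothetical RDNA3 board power
rdna3_rel_perf = baseline_rel_perf * perf_per_watt_gain * (rdna3_power_w / baseline_power_w)
print(f"Estimated RDNA3 performance: {rdna3_rel_perf:.2f}x a 6950XT")  # ~1.48x with these inputs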
 
Joined
Oct 6, 2021
Messages
1,557 (1.59/day)
I like the highlighting of efficiency - which has of course been a part of their promises for RDNA3 since its first concrete mention, but still - and it also makes me wonder what we can expect here.

- Not quite 4090 performance, but noticeably lower power?
- Noticeably slower than the 4090, but also much lower power?
- Matching or beating the 4090, at lower power?

That's pretty much in the order I consider most likely - they've promised +50% perf/W over RDNA2, after all, which would place a 330W RDNA3 GPU at beating the 4090 at 1440p and trailing slightly at 2160p (using the 6950XT in TPU's reviews as a baseline). If they stick to 300W like the 6900XT, that would fit pretty well with that first suggested scenario. Definitely going to be interesting!

Well, the slide says ">50%", not just 50%; the question is how much more than 50%?

Something like a 60% improvement in efficiency would put the 7900XT in a favorable situation against the competing 4090, perhaps forcing the launch of a marginally better 4090 Ti consuming twice as much power.
 
Joined
Jun 21, 2021
Messages
2,923 (2.69/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
Software macOS Sonoma 14.5 (with latest patches)
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
Well, the slide says ">50%", not just 50%; the question is how much more than 50%?

It will likely be heavily dependent on how much they can optimize the video driver software.

Of course, it will depend on the benchmark/software title. When new hardware comes out, the performance uplift is never the same across the board over all metrics.

They could cherry-pick through benchmarks to come up with a maximum number. Would that impress some people? For sure. However, there would be some disappointed people who run a different -- possibly more pertinent -- benchmark that shows less improvement.

My guess is that there's a little "under promise, over deliver" going on here. It's better for AMD to say +54% and have average people get +57% rather than +51%.

Remember that there's also some variance in individual samples. Saying 53.96% might sound more accurate but it might not be meaningful in the context of describing a general performance increase, so saying >50% is probably less likely to get them into trouble.
 
Joined
Apr 22, 2021
Messages
249 (0.22/day)
Interesting timing, huh? /s

I am ~97.999%+ sure that IF AMD had any solution to compete with, match, or even surpass the 4090 (yeah, right... I'm still waiting for this "Dr." to show up at AMD; I guess Jensen should be labeled PROFESSOR Jensen, right? :laugh:), then - days after the 4090 release date, and I'm also sure that AMD's team has a 4090 in their labs for testing purposes, etc. - a so-called leak (yeah, right) would have reached the MSM, etc. waaaAAAaaay by now, showing how the new RDNA3 outperforms the 4090 in the latest benchmarks.

That has not happened. It won't happen because AMD's Dr uhm... crew is still not in. :roll:

 
Joined
May 2, 2017
Messages
7,762 (2.99/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Well, the slide says ">50%", not just 50%; the question is how much more than 50%?

Something like a 60% improvement in efficiency would put the 7900XT in a favorable situation against the competing 4090, perhaps forcing the launch of a marginally better 4090 Ti consuming twice as much power.
My assumption with corporate marketing is always to assume the worst with whatever wording they use. "Launching in H1"? That's June, May at best, no earlier. "Sub-$1000"? $999. So when AMD says >50%, I assume 50.1%, while being happy to be proven wrong. But I never assume more than what they're stating explicitly.

Of course, there's tons of leeway here. Are they being complete assholes and doing a worst-vs-best comparison, or are they comparing at the same wattage, the same product tier, or some average/geomean of the whole range? And regardless of which of these, during which workloads?

Hence me starting from an assumption of 50%. That's what they've promised - the "more than" sign is a vague promise with too many ways out of it to matter.

Interesting timing, huh? /s

I am ~97.999%+ sure that IF AMD had any solution to compete with, match, or even surpass the 4090 (yeah, right... I'm still waiting for this "Dr." to show up at AMD; I guess Jensen should be labeled PROFESSOR Jensen, right? :laugh:), then - days after the 4090 release date, and I'm also sure that AMD's team has a 4090 in their labs for testing purposes, etc. - a so-called leak (yeah, right) would have reached the MSM, etc. waaaAAAaaay by now, showing how the new RDNA3 outperforms the 4090 in the latest benchmarks.

That has not happened. It won't happen because AMD's Dr uhm... crew is still not in. :roll:
So ... you understand how academic titles work, right? You go through a doctorate, do whatever research project you're working on, write a dissertation, have it approved, and you are awarded the title of Dr., which you then have for life (unless you do something really egregious and have your home institution strip you of your degree). You somehow finding that funny is ... well, it just makes you look dumb, whatever the reason. It's pretty hard not to just assume sexism from the get-go - especially given the complete nonsense the rest of that post consists of - but that's irrelevant really. Dr. Su has been AMD's best CEO for quite some time, and her tenure has been massively successful in many ways, including bringing the GPU branch back from a rapid decline towards irrelevance in the mid-2000s to surpassing Nvidia's efficiency and matching them in absolute performance for the past generation (outside of massively power-hungry top SKUs at 2160p).

Also, did you miss the fact that this launch date was already announced quite a while ago? Or the fact that AMD launched RDNA2 two years ago, and have been working on RDNA3 since well before that launch? Is it really surprising that they're launching a new generation close to Nvidia? For anyone following the PC hardware space even slightly, it really shouldn't be surprising at all. This is how this business operates.

The lack of leaks is somewhat unusual, but then AMD tends to have a lot fewer leaks than Nvidia - no doubt because they're an overall smaller operation, and there's more interest in Nvidia leaks to begin with.
 
Joined
Jan 27, 2015
Messages
1,658 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
Time to root for AMD.

I hope the 7700 XT comes in at or under 250W and can match a 3080. If it can, they should get such a card out quickly and snatch up the midrange from Nvidia.
 
Joined
Jan 17, 2018
Messages
389 (0.17/day)
Processor Ryzen 7 5800X3D
Motherboard MSI B550 Tomahawk
Cooling Noctua U12S
Memory 32GB @ 3600 CL18
Video Card(s) AMD 6800XT
Storage WD Black SN850(1TB), WD Black NVMe 2018(500GB), WD Blue SATA(2TB)
Display(s) Samsung Odyssey G9
Case Be Quiet! Silent Base 802
Power Supply Seasonic PRIME-GX-1000
As excited as I am for all tech launches, I'm very doubtful that AMD can compete with NVIDIA in Ray Tracing this year. If I'm wrong I'll return my 4090, but I'm not expecting much.
I mean, in most games the 4090 is only ~5% better at raytracing efficiency than the 3090 Ti - that being the original framerate versus the raytraced framerate - so it wasn't much of an improvement in the actual efficiency of raytracing. AMD, on the other hand, have said they will be dedicating a lot more resources to raytracing in their RDNA 3 architecture, so the actual efficiency of RDNA3 should be better this generation, supposedly ~2x better, which would put it just about on par with Nvidia in terms of raytracing efficiency.

You've already spent almost $2000 on a GPU, and by the time the AMD GPUs are released you probably won't be eligible for a return. I wouldn't really expect AMD to be well past Nvidia in performance, though; I'd expect ±5-10%. Would you really return a 4090 for way less than you paid for it to get an extra 5-10% performance? Seems like a massive waste of money.
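
To be concrete about what "raytracing efficiency" means in that sense, here's a tiny sketch of the metric - the share of raster performance a card keeps once RT is turned on. The framerates are made-up placeholders, not benchmark results:

# "RT efficiency" as used above: the fraction of the original framerate
# that survives once ray tracing is enabled. Numbers are invented examples.

def rt_efficiency(raster_fps: float, rt_fps: float) -> float:
    """Fraction of raster performance retained with ray tracing on."""
    return rt_fps / raster_fps

# Hypothetical cards: one drops from 100 fps to 80 fps (keeps 80%),
# the other from 120 fps to 60 fps (keeps only 50%).
for name, (raster, rt) in {"Card A": (100, 80), "Card B": (120, 60)}.items():
    print(f"{name}: {rt_efficiency(raster, rt):.0%} of raster performance retained")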
 
Joined
Apr 19, 2017
Messages
71 (0.03/day)
My assumption with corporate marketing is always to assume the worst with whatever wording they use. "Launching in H1"? That's June, May at best, no earlier. "Sub-$1000"? $999. So when AMD says >50%, I assume 50.1%, while being happy to be proven wrong. But I never assume more than what they're stating explicitly.

Of course, there's tons of leeway here. Are they being complete assholes and doing a worst-vs-best comparison, or are they comparing at the same wattage, the same product tier, or some average/geomean of the whole range? And regardless of which of these, during which workloads?

Hence me starting from an assumption of 50%. That's what they've promised - the "more than" sign is a vague promise with too many ways out of it to matter.


So ... you understand how academic titles work, right? You go through a doctorate, do whatever research project you're working on, write a dissertation, have it approved, and you are awarded the title of Dr., which you then have for life (unless you do something really egregious and have your home institution strip you of your degree). You somehow finding that funny is ... well, it just makes you look dumb, whatever the reason. It's pretty hard not to just assume sexism from the get-go - especially given the complete nonsense the rest of that post consists of - but that's irrelevant really. Dr. Su has been AMD's best CEO for quite some time, and her tenure has been massively successful in many ways, including bringing the GPU branch back from a rapid decline towards irrelevance in the mid-2000s to surpassing Nvidia's efficiency and matching them in absolute performance for the past generation (outside of massively power-hungry top SKUs at 2160p).

Also, did you miss the fact that this launch date was already announced quite a while ago? Or the fact that AMD launched RDNA2 two years ago, and have been working on RDNA3 since well before that launch? Is it really surprising that they're launching a new generation close to Nvidia? For anyone following the PC hardware space even slightly, it really shouldn't be surprising at all. This is how this business operates.

The lack of leaks is somewhat unusual, but then AMD tends to have a lot fewer leaks than Nvidia - no doubt because they're an overall smaller operation, and there's more interest in Nvidia leaks to begin with.
Excellent post. Not sure what that guy was talking about; seems like he has some serious issues lol. However, there always seems to be more interest/hype around AMD leaks than Nvidia ones from what I've seen.
 
Joined
Oct 18, 2013
Messages
5,659 (1.46/day)
Location
Everywhere all the time all at once
System Name The Little One
Processor i5-11320H @4.4GHZ
Motherboard AZW SEI
Cooling Fan w/heat pipes + side & rear vents
Memory 64GB Crucial DDR4-3200 (2x 32GB)
Video Card(s) Iris XE
Storage WD Black SN850X 4TB m.2, Seagate 2TB SSD + SN850 4TB x2 in an external enclosure
Display(s) 2x Samsung 43" & 2x 32"
Case Practically identical to a mac mini, just purrtier in slate blue, & with 3x usb ports on the front !
Audio Device(s) Yamaha ATS-1060 Bluetooth Soundbar & Subwoofer
Power Supply 65w brick
Mouse Logitech MX Master 2
Keyboard Logitech G613 mechanical wireless
Software Windows 10 pro 64 bit, with all the unnecessary background shitzu turned OFF !
Benchmark Scores PDQ
Livestream, my stream, your stream....it won't matta much, 'cause I've already bought every friggin RDNA3 card on the planet, and the 126.91 that were available off-world too, so this livestream will mainly be to "announce" that yes, the cards are officially launching, but no, there won't be any available to actually buy until late next year, at the earliest :)

/s
 
Joined
Dec 22, 2011
Messages
3,890 (0.85/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Livestream, my stream, your stream....it won't matta much, 'cause I've already bought every friggin RDNA3 card on the planet, and the 126.91 that were available off-world too, so this livestream will mainly be to "announce" that yes, the cards are officially launching, but no, there won't be any available to actually buy until late next year, at the earliest :)

/s

No sarcasm required; they're going to be fast and efficient AMD fanboys' wet dreams, and they'll be gobbled up by the scalpers too.
 
Joined
Dec 26, 2006
Messages
3,591 (0.56/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
Would like to find out if we get a midrange (x700XT) series of GPUs right at launch and what their availability date would be.
What about the 7400?? ;)
 
Joined
Sep 28, 2005
Messages
3,179 (0.47/day)
Location
Canada
System Name PCGR
Processor 12400f
Motherboard Asus ROG STRIX B660-I
Cooling Stock Intel Cooler
Memory 2x16GB DDR5 5600 Corsair
Video Card(s) Dell RTX 3080
Storage 1x 512GB Mmoment PCIe 3 NVME 1x 2TB Corsair S70
Display(s) LG 32" 1440p
Case Phanteks Evolve itx
Audio Device(s) Onboard
Power Supply 750W Cooler Master sfx
Software Windows 11
Wouldn't it be safer and easier to go down to your local brick and mortar store and talk to them and see if they can reserve a card for you?
I tried that with Memory Express and the guy laughed, called me names, then hung up. Maybe he called me back to also say he banged my mom.

I then realized I can't get pre-orders or reserve cards.

And I can't fight the bots. Can't get an overpriced 4090. Won't be able to get one of these.
 
Joined
Oct 27, 2020
Messages
789 (0.60/day)
The logical thing with these specs (12288 SP, etc.) that Navi31 has: given the right TBP, full Navi31 should be faster in 4K raster than the RTX 4090 in a Zen4 X3D/13900KS testbed (especially with the V-cache-enabled model, whenever and if it launches), but by how much is anyone's guess - and certainly not with a ≤350W TBP.
Regarding raytracing performance potential: if the die-size leaks are correct the dies are small, the die area allocated to specific RT performance improvements seems to be nowhere near Ada's level, and it probably won't have L2 dedicated to raytracing; but doubling the L2 cache in each Shader Engine, for example, will certainly help if utilised properly.
Ampere was better than Turing in RT, but not by much. I would be happy if Navi31 can match Turing's 2080 Ti regarding the % performance hit when RT is enabled, because for me that will be enough with the titles we already have, especially if FSR 2.0 (3.0?) is utilised (but not with some future titles that will be RT showcases).

 
Joined
Sep 15, 2016
Messages
482 (0.17/day)
I mean, in most games the 4090 is only ~5% better at raytracing efficiency than the 3090 Ti - that being the original framerate versus the raytraced framerate - so it wasn't much of an improvement in the actual efficiency of raytracing. AMD, on the other hand, have said they will be dedicating a lot more resources to raytracing in their RDNA 3 architecture, so the actual efficiency of RDNA3 should be better this generation, supposedly ~2x better, which would put it just about on par with Nvidia in terms of raytracing efficiency.

You've already spent almost $2000 on a GPU, and by the time the AMD GPUs are released you probably won't be eligible for a return. I wouldn't really expect AMD to be well past Nvidia in performance, though; I'd expect ±5-10%. Would you really return a 4090 for way less than you paid for it to get an extra 5-10% performance? Seems like a massive waste of money.
I have until Jan 30th for a full return.

I'll be gaming at 4K ray traced, so that's what's important to me. Rasterization is less of an impact when you can already reach 120 Hz.
 
Joined
Apr 22, 2021
Messages
249 (0.22/day)
AMD, on the other hand, have said they will be dedicating a lot more resources to raytracing in their RDNA 3 architecture, so the actual efficiency of RDNA3 should be better this generation, supposedly ~2x better, which would put it just about on par with Nvidia in terms of raytracing efficiency.
Define "a lot more resources": uhm... 65%? Or 104%? Or just ~8.5%? Because any increase could VALIDATE such a claim, and if memory serves me, AMD definitely refused to attach any solid value to that expected ++ PR statement ++. :shadedshu:

All the "shoulds & woulds" used above are not a stable AMD or NVIDIA business-model forecast, especially for consumers.
 
Joined
Apr 16, 2019
Messages
632 (0.34/day)
I really doubt RDNA3 will beat a 4090 at 4K gaming, but I think it may match it at 1440p. Plus you have DLSS 3, and let's face it, AMD just won't be able to pull something like that off. Doubling the frames at 4K with AI? I just don't see AMD having that technical capability.

But I still plan to buy RDNA3 because I only game at 1440p.
Even 1440p seems unlikely; 1080p, though, probably.
 
Joined
Apr 21, 2005
Messages
172 (0.02/day)
My assumption with corporate marketing is always to assume the worst with whatever wording they use. "Launching in H1"? That's June, May at best, no earlier. "Sub-$1000"? $999. So when AMD says >50%, I assume 50.1%, while being happy to be proven wrong. But I never assume more than what they're stating explicitly.

Of course, there's tons of leeway here. Are they being complete assholes and doing a worst-vs-best comparison, or are they comparing at the same wattage, the same product tier, or some average/geomean of the whole range? And regardless of which of these, during which workloads?

Hence me starting from an assumption of 50%. That's what they've promised - the "more than" sign is a vague promise with too many ways out of it to matter.

For RDNA1 they claimed a 50% perf/watt gain over Vega. This was done by comparing the V64 to the 5700XT with both parts at stock.
For RDNA2 they claimed a 50% perf/watt gain in early released slides, but at the reveal event they claimed 54% and 64%. The 54% was 5700XT vs 6800XT at 4K in a variety of games (listed in the footnotes of their slide). The 64% was 5700XT vs 6900XT at 4K in the same games. This was further confirmed in some reviews, but it heavily depended on how they tested perf/watt. Those sites that use the power and performance data from one game saw very different results: TPU saw about a 50% gain whereas Techspot / HUB saw a 70%+ gain, because HUB used Doom Eternal, where the 5700XT underperformed, and TPU used CP2077, where the 6900XT underperformed. If you look at the HUB average uplift of the 6800XT and 6900XT, then it actually matches up really well with AMD's claimed improvements.

So the AMD method seems to be to compare SKU to SKU at stock settings, measure the average frame rate difference across a suite of titles, and then work out the perf/watt delta.
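
As a rough sketch of that method (the framerates and board powers below are invented placeholders, not AMD's or any reviewer's figures):

# Sketch of the SKU-vs-SKU perf/W comparison described above.
# All inputs are placeholder values for illustration only.

def perf_per_watt_delta(old_fps, old_power_w, new_fps, new_power_w):
    """Ratio of average-fps-per-watt of the new SKU over the old one."""
    old_ppw = (sum(old_fps) / len(old_fps)) / old_power_w
    new_ppw = (sum(new_fps) / len(new_fps)) / new_power_w
    return new_ppw / old_ppw

# e.g. a hypothetical old SKU at 225W vs a new SKU at 300W across three titles
delta = perf_per_watt_delta([60, 75, 90], 225, [95, 120, 140], 300)
print(f"perf/watt gain: {delta - 1:+.0%}")  # roughly +18% with these made-up numbers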

With the >50% I do agree with using 50% as a baseline, but I feel confident that they are not doing a best-vs-worst comparison, because that is not something AMD have done before under the current leadership.

What it does do, though, is give us some numbers to play with. If the Enermax numbers are correct and top N31 is using 420W, then you get the following numbers.

Baseline   | TBP (W) | Power delta | Perf/watt multi                                     | Performance multi | Estimate vs 4090 in raster
6900XT     | 300     | 1.4x        | 1.5x                                                | 2.1x              | +10%
6900XT     | 300     | 1.4x        | 1.64x (to match 6900XT delta; extreme upper bound!) | 2.3x              | +23%
Ref 6950XT | 335     | 1.25x       | 1.5x                                                | 1.88x             | +15%
Ref 6950XT | 335     | 1.25x       | 1.64x (again, extreme upper bound!)                 | 2.05x             | +25%

Now, the assumption I am making here is pretty obvious: that the design goal of N31 was 420W to begin with, which would mean it is wide enough to use that power in the saner part of the f/v curve. If it was not 420W to begin with and has been pushed to this through increasing clocks, then obviously the perf/watt will drop off and the numbers above will be incorrect.

The other assumption is that the Enermax numbers are correct. It is entirely possible that the reference TBP for N31 will be closer to 375W, which with these numbers would put it about on par with the 4090.

My view is that the TBP will be closer to 375-400W rather than 420W, in which case anywhere from about equal to 5% ahead of the 4090 seems to be the ballpark I expect top N31 to land in. But there is room for a positive surprise, should AMD's >50% claim be like their >5GHz claim or the >15% single-thread claim in the Zen 4 teaser slide and turn out to be a rather large underselling of what was actually achieved. Still, I await actual numbers on that front, and until then I am assuming something in the region of +50%.
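
If it helps, here's a small sketch of the arithmetic behind the table above. The 4090-vs-baseline ratio is an assumption you'd fill in from whichever review data you trust; it is not a figure from this post:

# Reproduces the table's arithmetic: performance multiplier = power delta x perf/W multiplier.
# The 4090-vs-baseline ratio is an assumed placeholder, not measured data.

def n31_estimate(baseline_tbp_w, n31_tbp_w, perf_per_watt_multi, r4090_vs_baseline):
    """Return (N31 perf multiplier vs baseline, N31 delta vs the 4090)."""
    power_delta = n31_tbp_w / baseline_tbp_w        # e.g. 420 / 300 = 1.4x
    perf_multi = power_delta * perf_per_watt_multi  # e.g. 1.4 * 1.5 = 2.1x
    vs_4090 = perf_multi / r4090_vs_baseline - 1.0
    return perf_multi, vs_4090

# Example: 6900XT baseline (300W), 420W N31, +50% perf/W, and an assumed 4090
# at ~1.9x a 6900XT in 4K raster.
perf, delta = n31_estimate(300, 420, 1.5, 1.9)
print(f"{perf:.2f}x the 6900XT, {delta:+.0%} vs the 4090")  # ~2.10x, about +11%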
 