Thursday, October 20th 2022
AMD Announces RDNA 3 GPU Launch Livestream
It's hardly a secret that AMD will announce its first RDNA 3 based GPUs on the 3rd of November, and the company has now officially announced that it will hold a livestream starting at 1:00 pm (13:00) Pacific Daylight Time. The event goes under the name "together we advance_gaming". AMD didn't share much in terms of details about the event; all we know is that "AMD executives will provide details on the new high-performance, energy-efficient AMD RDNA 3 architecture that will deliver new levels of performance, efficiency and functionality to gamers and content creators."
Source:
AMD
104 Comments on AMD Announces RDNA 3 GPU Launch Livestream
I expect AMD to talk about all of the cards but only launch the top two, leaving the rest until most of the current inventory is depleted. In this sense, it's actually close to NVIDIA's approach.
RDNA3 AT MSRP WILL BE MINE!!! EAT SHIT BOTS!!! :rockout: :rockout: :rockout: :rockout: :rockout: :rockout: :rockout: :rockout: :rockout:
But I still plan to buy RDNA3 because I only game at 1440p.
However, if you don't care about power consumption, or if you have use cases for Tensor cores and RT cores, you might be happy with what you have.
Either way, you'll have to wait a bit. There probably won't be third-party RDNA3 reviews with actual performance figures on AMD's November 3 launch day, so it may be the second half of that month before you have enough data points to make a more informed assessment.
Speculating about unannounced/unreleased products can be fun for some, but it doesn't provide any useful information for making a well-guided purchase decision.
I somehow missed this info. I don't know if it was published here; I've known about it for a week or two.
Wait until all of them are gone, then shit will be massively overpriced compared to now.
I don't know how much time you spend online, but internet articles have been written for far less. Be grateful for what this TPU article is.
You clicked the link, just like everyone else who read this thread.
Clearly you don't spend much time online.
Returning to the topic: I did not know the stream would start at 1 pm PDT; this article confirmed that. A more typical start time for Pacific Time zone events is 10 am PT. This goes back to the printed-periodical era (pre-2000s), when journalists had PM deadlines at East Coast-based media companies.
But yes, DLSS3 and raytracing are among the reasons I chose the 4090. I do care about power consumption, which is why I'll be using the power target slider in MSI Afterburner and maybe undervolting the card. But it seems the 4090 doesn't take to undervolting as well as Ampere did. Undervolting my 3080 saved me 50 to 80 watts without losing performance.
I do have use for DLSS3 and raytracing in games; DLSS3 seems to come in handy. Yes, I ordered a card Friday last week and got an RTX 4090 Monday this week, so this weekend I'm going to play with my 4090.
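If you'd rather quantify those undervolting savings than eyeball them, here's a minimal sketch using NVIDIA's NVML Python bindings (assuming the nvidia-ml-py package, imported as pynvml, is installed); the 60-sample, 1-second window is an arbitrary choice for illustration, not anything tied to MSI Afterburner.

```python
# Minimal power-draw logger via NVML (pip install nvidia-ml-py).
# Run the same workload before and after an undervolt / power-target
# change and compare the averages; the sampling window is arbitrary.
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

samples = []
for _ in range(60):  # ~60 seconds at one sample per second
    milliwatts = pynvml.nvmlDeviceGetPowerUsage(handle)  # reported in mW
    samples.append(milliwatts / 1000.0)
    time.sleep(1.0)

print(f"avg: {sum(samples) / len(samples):.1f} W  peak: {max(samples):.1f} W")
pynvml.nvmlShutdown()
```

Running the same workload for both passes matters more than the exact sampling window; the delta between the two averages is your real-world saving.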
I care less about power efficiency with my 3080 Ti; I really only use that build for gaming. When I'm not gaming (which is most of the time), the system is powered down. For performance-per-watt, I'm better off using my Mac mini 2018 or my iPad mini.
Anyhow, it's unlikely that AMD RDNA3 will be an NVIDIA RTX/DLSS 3 killer. My guess is that RDNA3 can beat Ada Lovelace on performance-per-watt and performance-per-dollar metrics in traditional rasterization, but once you factor in RT and ML use cases, the situation is murkier. It really comes down to how much any given individual values RT and image upscaling performance, at least from a gaming perspective.
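For what it's worth, those two metrics are just ratios, so once reviews land the comparison is simple arithmetic. A quick sketch with made-up placeholder numbers (no leaked benchmarks here):

```python
# Toy performance-per-watt / performance-per-dollar comparison.
# Every number below is a made-up placeholder, not real benchmark data.
def efficiency(fps: float, watts: float, dollars: float) -> tuple[float, float]:
    """Return (fps per watt, fps per dollar) for a card."""
    return fps / watts, fps / dollars

# hypothetical cards: (average fps, board power in W, street price in $)
cards = {
    "hypothetical_card_a": (120.0, 350.0, 999.0),
    "hypothetical_card_b": (140.0, 450.0, 1599.0),
}

for name, (fps, watts, price) in cards.items():
    per_watt, per_dollar = efficiency(fps, watts, price)
    print(f"{name}: {per_watt:.3f} fps/W, {per_dollar:.4f} fps/$")
```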
There are non-gameplay situations, though, even for Joe Consumer. GeForce cards might have a slight advantage in hardware video encoding tasks, and Radeon has nothing like Tensor cores for AI-assisted real-time image replacement (NVIDIA Broadcast, for example).
Like I wrote earlier, even if you watch the livestream event on November 3rd, it'll most likely be a month before the general public sees more concrete performance figures for RDNA3 and how they stack up versus Ada Lovelace.
If you end up regretting your 4090 purchase, you can probably sell it used for more than the current MSRP based on current scalper activity.
I REALLY hope it's good! I don't really care about it equalling a 4090 if it's $400 cheaper, tbh. All I ask for is better prices than Ada.
AMD has cards to play here (no pun intended) by offering "high" but down-to-earth pricing compared to NVIDIA and a reasonably powerful and efficient GPU.
I do expect NVIDIA to shred AMD on ray tracing, but honestly, is that really the most important thing in a GPU? Not at all; for me, it's raw performance without RT, DLSS, or FSR that counts the most.
The difference is that it's faster and more accurate. Implementing a similar feature on AMD cards would be 100% possible; the problem lies in how good it would be.
Not sure I want frame interpolation myself, though. I'd prefer they introduce features that increase performance without negatives, or just straight up push for more RT/raster performance.
For some titles, frame interpolation comes at a cost that isn't objectionable. That's typical of many improvements, since there's usually some sort of tradeoff being made. Heck, today's cards have more pure rasterization performance, and a lot of it comes from increased power consumption.
A free lunch in technology products (or any product for that matter) is pretty rare. There are usually some negatives: increased cost is a frequent one.
A wonderful thing about DLSS is that you can turn it off. Or you can buy Radeon cards, save the space the ML cores would take up on the die, and use it for raster cores. That's essentially what Radeon customers are doing.
:)