
AMD to Skip RDNA 5: UDNA Takes the Spotlight After RDNA 4

Joined
Nov 5, 2012
Messages
42 (0.01/day)
Location
France
System Name Game computer
Processor AMD Ryzen 7 5800X3D 4.35 GHz
Motherboard ASRock X470 Taichi
Cooling AMD Wraith cooler
Memory 32768 MB DDR4-3200 G.Skill CL16
Video Card(s) AMD Radeon RX 7900 GRE
Storage SSD Samsung 970 EVO M.2 250 GB, Samsung 970 EVO M.2 500 GB, Samsung 850 EVO SATA 500 GB, Toshiba 4 TB
Display(s) AOC 24" 1440p 144 Hz DisplayPort + Acer KG251Q 24" 1080p 144 Hz DisplayPort
Case NZXT Phantom Black
Audio Device(s) Corsair Gaming VOID Pro RGB Wireless Special Edition
Power Supply BeQuiet Straight Power 11 1000W
Mouse Roccat Kone XTD
Keyboard BTC USB
Software Windows 11 Pro x64
Do sponsorships and have better advertising.


RDNA is not GCN, GCN Ended with Vega.

Use this link here and look up GCN architecture.
Not quite true. As for RDNA, yes, it is different.
But GCN did not end with Vega. CDNA is GCN, but with the ROPs and TMUs removed to make way for matrix cores. In fact, CDNA3 still carries the GFX9 ISA version, like Vega (GCN5).
=>GCN5/CDNA/CDNA2/CDNA3/CDNA4: GFX9
=>RDNA/RDNA2 (GFX10) - RDNA3 (GFX11) - RDNA4 (GFX12)

It's logical to think that UDNA will be GFX13 (or GFX14 if they're superstitious) for both Radeon and Instinct.
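As a side note, that generation mapping can be written down concretely via compiler target names. Here's a minimal sketch in Python; the gfx* IDs for shipped parts are real LLVM AMDGPU targets, but the UDNA row is pure speculation on my part:

```python
# Minimal sketch: AMD architecture -> ISA generation, with example LLVM
# AMDGPU target names. The UDNA entry is speculation, not a known target.
GFX_MAP = {
    "GCN5 (Vega)": ("GFX9",  "gfx900"),   # Vega 10
    "CDNA":        ("GFX9",  "gfx908"),   # MI100
    "CDNA2":       ("GFX9",  "gfx90a"),   # MI200
    "CDNA3":       ("GFX9",  "gfx942"),   # MI300
    "RDNA":        ("GFX10", "gfx1010"),  # Navi 10
    "RDNA2":       ("GFX10", "gfx1030"),  # Navi 21
    "RDNA3":       ("GFX11", "gfx1100"),  # Navi 31
    "RDNA4":       ("GFX12", "gfx1200"),
    "UDNA (?)":    ("GFX13 (speculative)", None),
}

for arch, (gen, target) in GFX_MAP.items():
    print(f"{arch:<12} -> {gen:<20} {target or '-'}")
```

These gfx* IDs are what ROCm and LLVM use when compiling kernels, which is why the CDNA parts still identifying as GFX9 matters in practice.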
 
Joined
Jun 26, 2022
Messages
239 (0.26/day)
Processor 7950X, PBO CO -15
Motherboard Gigabyte X670 AORUS Elite AX (rev. 1.0)
Cooling EVGA CLC 360 w/Arctic P12 PWM PST A-RGB fans
Memory 64GB G.Skill Trident Z5 RGB F5-6000J3040G32GA2-TZ5RK
Video Card(s) ASUS TUF Gaming GeForce RTX 3070
Storage 970 EVO Plus 2TB x2, 970 EVO 1TB; SATA: 850 EVO 500GB (HDD cache), HDDs: 6TB Seagate, 1TB Samsung
Display(s) ASUS 32" 165Hz IPS (VG32AQL1A), ASUS 27" 144Hz TN (MG278Q)
Case Corsair 4000D Airflow
Audio Device(s) Razer BlackShark V2 Pro
Power Supply Corsair RM1000x
Mouse Logitech M720
Keyboard G.Skill KM780R MX
Software Win10 Pro, PrimoCache, VMware Workstation Pro 16
Joined
Feb 1, 2019
Messages
3,664 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Seems like really bad news. They're going to repeat what happened with Vega, where the architecture was skewed too heavily toward compute with not enough ROPs, because there's really no other way to unify these two different sets of requirements into one architecture.

I don't understand why they're doing this. According to them, Instinct cards will generate some $5 billion in revenue this year alone, so they can clearly make a lot of money from the compute side; why ruin it? It made sense with Vega because they really didn't have the funds, but now it could be a disaster.


Delusional to think an RX 7900 XTX should be priced at $500. I really don't understand these obnoxious Nvidia fanboys' takes. Why should AMD charge peanuts while Nvidia inflates their prices with each generation? Do you people have a masochistic fetish or something? Why do you want to pay more?
Dunno who this is aimed at, but I am a long-time Nvidia user, so I'll bite.

I am not happy with Nvidia's prices, and I make that clear in many posts. I'm also not happy with how Nvidia under-specs VRAM in so many cards.

The reason I haven't gone out and bought a cheaper AMD card is that they don't have feature parity. With GPUs it's about software as well. SGSSAA is a deal-breaker for me, DLSS is the best modern upscaler, and as it turns out I now like RTX Video. On top of this, AMD's encoder is apparently even worse than NVENC.

Also, the reason I have said AMD need to drop prices is that they are the ones chasing market share; that's typically what you need to do to gain it. Of course, one effect of AMD doing that is it can also affect Nvidia's pricing.

I am no fanboy though; I never understood the mindset of falling in love with a corporation. I don't particularly like Nvidia either: too much proprietary stuff, in addition to the issues mentioned above. But ultimately, if I dislike a company, it doesn't stop me buying their products. My decisions are not based on emotions, life is too short for that; I buy what suits my needs at the moment.

I agreed with AMD's initial stance of providing more VRAM instead of silly novelty RT, but sadly Nvidia has managed to infect the AAA market with it, so it looks like AMD are having to change tack on that with future hardware.
 
Joined
Oct 5, 2024
Messages
121 (1.59/day)
Location
United States of America
Dunno who this is aimed at, but I am a long-time Nvidia user, so I'll bite.

I am not happy with Nvidia's prices, and I make that clear in many posts. I'm also not happy with how Nvidia under-specs VRAM in so many cards.

The reason I haven't gone out and bought a cheaper AMD card is that they don't have feature parity. With GPUs it's about software as well. SGSSAA is a deal-breaker for me, DLSS is the best modern upscaler, and as it turns out I now like RTX Video. On top of this, AMD's encoder is apparently even worse than NVENC.

Also, the reason I have said AMD need to drop prices is that they are the ones chasing market share; that's typically what you need to do to gain it. Of course, one effect of AMD doing that is it can also affect Nvidia's pricing.

I am no fanboy though; I never understood the mindset of falling in love with a corporation. I don't particularly like Nvidia either: too much proprietary stuff, in addition to the issues mentioned above. But ultimately, if I dislike a company, it doesn't stop me buying their products. My decisions are not based on emotions, life is too short for that; I buy what suits my needs at the moment.

I agreed with AMD's initial stance of providing more VRAM instead of silly novelty RT, but sadly Nvidia has managed to infect the AAA market with it, so it looks like AMD are having to change tack on that with future hardware.
I used to think like you, and I hope AMD does do this so that my next GPU would be cheaper. But history has shown that simply lowering prices is not good for the company: it doesn't gain much if any market share, and all that happens is that profits evaporate, hurting future product development. Gamers consistently put a premium on the Nvidia brand, even in years when both products are roughly equal, so until AMD catches up in software and marketing, a lower price won't bring enough sales to justify the loss in profit.
 
Joined
Jan 27, 2024
Messages
291 (0.88/day)
Processor Ryzen AI
Motherboard MSI
Cooling Cool
Memory Fast
Video Card(s) Matrox Ultra high quality | Radeon
Storage Chinese
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Chinese
Mouse Chinese
Keyboard Chinese
VR HMD No
Software Android | Yandex
Benchmark Scores Yes
But history has shown that simply lowering prices is not good for the company: it doesn't gain much if any market share, and all

History shows the opposite - when AMD offered lower prices, it had decent market share.

Radeon HD 4890 = $250 in 2009
Radeon HD 5870 = $400 in 2010

Market share:

[chart: AMD vs. Nvidia discrete GPU market share in the HD 4000/5000 era]


Today, when the RX 7900 XTX is $1,000, AMD's share has gone down from 44.5% to 12%:

[chart: current discrete GPU market share]


 
Joined
Jan 8, 2017
Messages
9,502 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Also the reason I have said AMD need to drop prices is they are the ones chasing market share
Well, what can I say, enjoy paying more.

AMD cannot compete if their margins turn to dust. Something many of you also can't seem to comprehend: if AIB margins get too low, the AIBs will simply drop AMD. The end result? You'll pay even more for that Nvidia card you've been waiting to buy.
 
Joined
Jan 20, 2019
Messages
1,590 (0.74/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
I used to think like you, and I hope AMD does do this so that my next GPU would be cheaper. But history has shown that simply lowering prices is not good for the company: it doesn't gain much if any market share, and all that happens is that profits evaporate, hurting future product development. Gamers consistently put a premium on the Nvidia brand, even in years when both products are roughly equal, so until AMD catches up in software and marketing, a lower price won't bring enough sales to justify the loss in profit.

What's more alarming is the extortionate pricing paired with widespread "mindshare compliance," where market dominance dulls the perception of value and creates a sense of inevitability around Nvidia products. As consumers, we shouldn't be concerned that "lowering prices is not good for the company" when the general perception is that GPUs are way overpriced for the mainstream consumer. I keep hearing people banging on about "product development" as if Nvidia were operating on razor-thin margins with no choice but to raise prices to fund innovation. The reality is that, among PC components, GPUs stand out as the most profitable segment for manufacturers, and those profits have soared to record levels in recent years. This clear gap between cost and value shows how far GPU pricing has strayed from being fair or necessary.

No matter how it's spun (preferences, brand loyalties, or personal justifications), the bottom line remains the same: the pricing is outright ridiculous. From a consumer perspective, this issue should be at the forefront of every discussion about the industry's future. No level of feature sets, dominance, marketing ploys, strategic affiliations/partnerships, or mind/market share should have consumers justifying corporate goals that amount to nothing short of unethical exploitation for profit.
 
Joined
May 10, 2023
Messages
347 (0.59/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
What's more alarming is the extortionate pricing paired with widespread "mindshare compliance," where market dominance dulls the perception of value and creates a sense of inevitability around Nvidia products. As consumers, we shouldn't be concerned that "lowering prices is not good for the company" when the general perception is that GPUs are way overpriced for the mainstream consumer. I keep hearing people banging on about "product development" as if Nvidia were operating on razor-thin margins with no choice but to raise prices to fund innovation. The reality is that, among PC components, GPUs stand out as the most profitable segment for manufacturers, and those profits have soared to record levels in recent years. This clear gap between cost and value shows how far GPU pricing has strayed from being fair or necessary.

No matter how it's spun (preferences, brand loyalties, or personal justifications), the bottom line remains the same: the pricing is outright ridiculous. From a consumer perspective, this issue should be at the forefront of every discussion about the industry's future. No level of feature sets, dominance, marketing ploys, strategic affiliations/partnerships, or mind/market share should have consumers justifying corporate goals that amount to nothing short of unethical exploitation for profit.
GPU pricing is indeed ridiculous, and the trend is for it to keep getting worse over time, as long as there's no real competition and these companies keep profiting far more in other markets.
Even if AMD improves their products, it'll just mean they follow Nvidia's pricing strategy for those sweet margins; and if their UDNA plan follows through, they'll be 100% able to copy that strategy (full focus on the data center, leftovers for the consumer market).

Maybe Intel can provide some good-value products; we shall see. What I think will happen is the death of discrete components for most casual gamers, with integrated solutions becoming more common, since they allow products in a smaller envelope without the limitations of our usual ATX formats. Strix Halo is a good example of that.
 
Joined
May 10, 2023
Messages
347 (0.59/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
I didn't read through his previous nonsense, I just responded to what he said to me.
I believe you took "orange fingers" as being related to Trump in some way, right?
It's actually meant to be about folks who eat Cheetos all day long.
 
Joined
Jul 26, 2024
Messages
190 (1.28/day)
It doesn't matter if AMD (ATI) puts a better product out. The Nvidia mindshare is unreal, beyond Apple. AMD offers extremely good products right now, like the RX 7800 XT and 7900 GRE, or even the 7700 XT, yet Nvidia is selling boatloads of 4060s. They can slap their logo on just about anything and your stereotypical diabetic with Cheetos fingers and greasy balding hair is going to buy it.
Get the facts straight.
Agreed that the 4060 sucks, but what AMD sells in the same price range is the 7600, a rebranded 6600 XT, not a 7800 XT. And it's not like Nvidia doesn't sell a 4070 Super that's faster and more efficient than the 7900 GRE. The retail price difference between them is about 40 euros; if you say that's not worth the RT performance, the efficiency, and not having to use the shimmering mess that FSR is as an upscaler at higher resolutions, you're frankly just an AMD fanboy/apologist. I've played 300 hours of RDR2 at 5K DSR with DLSS Performance, and personally that alone would be enough for me to take the 4070S over the 7900 GRE if I were choosing again.
It very much matters what they offer; you can't blame everything on "mindshare" if there are actual disadvantages to owning their products compared to the other brand.
I do a lot of work on the PC too, using dual high-refresh monitors, and after owning a 6800 I know for a fact that it's a mess on AMD. With two monitors on, and god forbid you want to play a YouTube video in the background, the 6800 just ramps up to 40W+ of power draw. My 4070S is sitting at 9W right now doing exactly that. When you do 10-20 hours of such work every week, it adds up on your power bill, which will pretty much nullify any price advantage AMD has within a year or less.
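For what it's worth, you can check the driver-reported board power on the Nvidia side yourself. A minimal sketch, assuming `nvidia-smi` is installed and on PATH (AMD would need its own SMI tool instead):

```python
import subprocess

# Ask the Nvidia driver for the current board power draw.
# Assumes nvidia-smi is installed and on PATH.
result = subprocess.run(
    ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())  # e.g. "9.43 W" on an idle multi-monitor desktop
```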

[screenshot: computerbase.de multi-monitor power-draw chart]
 
Joined
Jul 20, 2020
Messages
1,151 (0.71/day)
System Name Gamey #1 / #3
Processor Ryzen 7 5800X3D / Ryzen 7 5700X3D
Motherboard Asrock B450M P4 / MSi B450 ProVDH M
Cooling IDCool SE-226-XT / IDCool SE-224-XTS
Memory 32GB 3200 CL16 / 16GB 3200 CL16
Video Card(s) PColor 6800 XT / GByte RTX 3070
Storage 4TB Team MP34 / 2TB WD SN570
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / TT Versa H18
Power Supply EVGA 650 G3 / EVGA BQ 500
I do a lot of work on the PC too, using dual high-refresh monitors, and after owning a 6800 I know for a fact that it's a mess on AMD. With two monitors on, and god forbid you want to play a YouTube video in the background, the 6800 just ramps up to 40W+ of power draw. My 4070S is sitting at 9W right now doing exactly that. When you do 10-20 hours of such work every week, it adds up on your power bill, which will pretty much nullify any price advantage AMD has within a year or less.

This seems exaggerated for effect.

If you're paying $2.92/kWh (€2.77/kWh), then I can see that difference, but the average price in Europe is an order of magnitude lower, and that's going by the single new 6800 I can find today at $520. Using the competitive $400 price it was at before stock ran out, you'd need to be paying $8/kWh to make up the difference between the 4070S and the 6800 within a year.
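To make that arithmetic explicit, here's a minimal sketch with assumed round numbers: a ~$200 price gap (roughly a $600 4070S vs. the $400 6800) and the ~31W draw gap claimed above, both illustrative rather than measured:

```python
# Break-even electricity price for a GPU purchase-price gap,
# given a claimed idle/multi-monitor power-draw gap.
price_gap_usd  = 600 - 400   # assumed 4070S vs. pre-sellout 6800 prices
power_gap_w    = 40 - 9      # claimed multi-monitor draw gap (watts)
hours_per_week = 15          # middle of the quoted 10-20 h/week
weeks_per_year = 52

extra_kwh_per_year = power_gap_w * hours_per_week * weeks_per_year / 1000
breakeven_rate = price_gap_usd / extra_kwh_per_year

print(f"Extra energy: {extra_kwh_per_year:.1f} kWh/year")  # ~24 kWh
print(f"Break-even rate: ${breakeven_rate:.2f}/kWh")       # ~$8.27/kWh
```

At a typical European rate of around €0.30/kWh, those ~24 kWh come to roughly €7 a year, which is the point: the draw gap is annoying, not financially decisive.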

I don't like high power use for simple tasks; it bugs me, and Nvidia cards are better behaved in this regard. But the financial impact in most use cases is minimal.
 
Joined
Aug 3, 2006
Messages
141 (0.02/day)
Location
Austin, TX
Processor Ryzen 6900HX
Memory 32 GB DDR4LP
Video Card(s) Radeon 6800m
Display(s) LG C3 42''
Software Windows 11 home premium
Wasn't the video-playback power consumption fixed months ago? I get that the 4070 Super is a good card for you, but limiting the argument to ray tracing is disingenuous when we're arguing about whether the 7900 GRE, 7800 XT and 7700 XT are fantastic buys all around. Your graph shows percentages, not frame numbers, so when you show 30% more performance at 1440p, you're probably showing me 40 fps vs 30 fps. You are a mindshare zombie; you can't even admit they are great cards. I mean, look at your sig: you're simping for them for free.

I think the DLSS-vs-FSR comparison is flawed half the time. Nvidia spends zilch on optimizing their cards for FSR, yet comparing FSR to DLSS on an Nvidia card is done far too often. I don't see any major difference between DLSS and FSR 3.0, and I think tech sites exaggerate the difference to absurdity. A control group vs. placebo group test might be an ego-buster for GeForce owners.
 
Joined
Jul 26, 2024
Messages
190 (1.28/day)
Wasn't the video-playback power consumption fixed months ago?
I don't know; it wasn't when I had the card in 2022 (for about half a year, bought in April, sold in October). The card was two years old then, and the multi-monitor/video-playback power draw was still not fixed (two 1440p 170 Hz panels). I guess I should have waited another two years to make the comparison fair.
Limiting the argument to ray tracing is disingenuous when we're arguing about whether the 7900 GRE, 7800 XT and 7700 XT are fantastic buys all around. Your graph shows percentages, not frame numbers, so when you show 30% more performance at 1440p, you're probably showing me 40 fps vs 30 fps.
It's mid-70s vs. mid-50s, actually:
[screenshot: computerbase.de benchmark chart]
You are a mindshare zombie; you can't even admit they are great cards. I mean, look at your sig: you're simping for them for free.
It doesn't matter what I put in my signature, as long as I'm quoting facts. There is a night-and-day difference between saying certain things because you prefer X over Y (fanboyism) and preferring X over Y because of what you can say about how they compare. And hey, look at yours...
I think the DLSS-vs-FSR comparison is flawed half the time. Nvidia spends zilch on optimizing their cards for FSR.
I have never seen anyone claim FSR looks different on AMD than on Nvidia. Can you prove it? Sounds made up to me.
Nvidia makes DLSS, not FSR; it's not on them to tinker with the FSR implementation. No one willingly chooses the other option when they have a better solution available. Just look at TPU's reviews of FSR 2/3: still the worst of the upscalers. This is the latest, from STALKER 2, and it's not as if other FSR 2/3 games fare better against DLSS 3. Also, DLSS 3.5 includes an RT denoiser, which AMD simply doesn't have:

The FSR 3.1 upscaling implementation is extremely underwhelming in this game. At 4K resolution in its "Quality" mode, the small details in tree leaves, vegetation and of thin steel objects are noticeably degraded, the overall image looks very blurry, even when standing still, and this is clearly visible in our screenshots compared to other upscaling solutions, where even the TSR image looks a lot better, pretty unusual. The FSR 3.1 image is also suffering from disocclusion artifacts and ghosting, mainly around player weapons in motion, especially when interlacing with the grass. The shimmering in vegetation in motion is an issue as well, especially on the grass, and the visibility of these artifacts is more apparent at 1440p resolution. Speaking of 1080p resolution, the FSR 3.1 image is completely broken, producing simply a wrong image quality with extreme loss of all details, it looks like an oil painting.
The DLSS Super Resolution implementation is excellent at 4K resolution, producing a very crisp, detailed and stable image in motion, without shimmering in vegetation or ghosting issues. Compared to native TAA solution, it's a night and day difference across all resolutions in terms of overall image quality and stability. Things are a bit different at 1440p and 1080p resolutions as the DLSS image tends to have small breakups in motion, specifically on the edges of tree leaves, however, the shimmering in vegetation is not an issue in the DLSS image, even at 1080p resolution. With DLAA enabled, the overall image quality improvement is even higher, offering the best graphical experience overall when compared to the native TAA solution, FSR 3.1, DLSS or XeSS.

I don't see any major difference between DLSS and FSR 3.0.
"Tech sites report it, but I don't see it" is not an objective point of view to begin a discussion. Refer to what I wrote about saying things because of brand preference, you're doing the exact things you accuse me of.
I think tech sites exaggerate the difference to absurdity. A control group vs. placebo group test might be an ego-buster for GeForce owners.
I guess W1zzard has just been posting Nvidia/Intel/Epic/Sony-sponsored content in his DLSS/XeSS/UE5 upscaler (can't remember the name)/FSR comparison reviews for years, then.
BTW, FSR 3.0/3.1 is available in only a handful of games, while DLSS 3 is in hundreds.

Get back to me when you're ready to discuss actual facts.
 