
AMD to Skip RDNA 5: UDNA Takes the Spotlight After RDNA 4

Joined
Jan 27, 2024
Messages
291 (0.89/day)
Processor Ryzen AI
Motherboard MSI
Cooling Cool
Memory Fast
Video Card(s) Matrox Ultra high quality | Radeon
Storage Chinese
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Chinese
Mouse Chinese
Keyboard Chinese
VR HMD No
Software Android | Yandex
Benchmark Scores Yes
@evernessince True, but CUDA and AI are features aimed at the enterprise - workstations, not the average Joe that plays on Steam.

If AMD continues to offer an inferior product, they will continue to not sell.

The RX 7900 XTX is not inferior to the RTX 4070, for example. The obstacle AMD put in their own way is the price itself. No one wants to pay more than $500 for an RX 7900 XTX.
 
Joined
Nov 26, 2021
Messages
1,704 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
RDNA was a big step up from GCN in performance per watt and performance per square mm. If they go the unified route, they would have to bring some of the RDNA improvements into CDNA.
 
Joined
Jul 13, 2016
Messages
3,328 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
@evernessince True, but CUDA and AI are features aimed at the enterprise - workstations, not the average Joe that plays on Steam.

The Stable Diffusion subreddit is one of the largest on Reddit, and there is a huge hobbyist AI market. I wouldn't call this workstation or enterprise per se. The 4090 has sold so well precisely thanks to this market. Aside from AI, I think you are vastly underestimating the number of people who do more than just gaming on their PCs.

It's not necessarily important whether everyone uses these features or not; people do factor a lack of certain features or qualities into their purchasing decisions.
 

Bouleou

New Member
Joined
Nov 19, 2024
Messages
2 (0.06/day)
Does this mean that they will replace the 'old' graphical hardware pipeline with compute shaders and thus get parallelisation that scales better?
 
Joined
Jan 8, 2017
Messages
9,500 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Seems like really bad news. They're going to do the same thing as with Vega, where they made the architecture too granular, with a lot of compute and not enough ROPs, because there's really no other way to unify these two different sets of requirements into one architecture.

I don't understand why they're doing this. According to them, Instinct cards will generate some $5 billion in revenue this year alone, so they can clearly make a lot of money from the compute side; why ruin it? It made sense with Vega because they really didn't have the funds, but now it could be a disaster.

The RX 7900 XTX is not inferior to the RTX 4070, for example. The obstacle AMD put in their own way is the price itself. No one wants to pay more than $500 for an RX 7900 XTX.
Delusional to think an RX 7900 XTX should be priced at $500. I really don't understand these obnoxious Nvidia fanboy takes; why should AMD charge peanuts while Nvidia inflates their prices with each generation? Do you people have a masochistic fetish or something? Why do you want to pay more?
 
Joined
Dec 25, 2020
Messages
7,001 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
I don't understand why they're doing this. According to them, Instinct cards will generate some $5 billion in revenue this year alone, so they can clearly make a lot of money from the compute side; why ruin it? It made sense with Vega because they really didn't have the funds, but now it could be a disaster.

Because they still don't have the funds and can't maintain two architectural lines anymore. While the CPU business at AMD has been remarkably successful and has earned everyone's respect and recognition, they have much, and I mean much, work to do in their graphics business. Meanwhile, research and development has become extremely expensive, wafer prices are at an all-time high due to the extremely advanced nodes being used, and their resources are spread thin. They're having issues producing results: the gaming business made only $462 million last quarter, down 69%. There have been layoffs; people lost their jobs.

Even if their gaming business weakens due to being derived from the datacenter lineup, this is actually the smartest move and the better outcome for the company's bottom line, and I cannot fault them for focusing on the businesses that make money: the datacenter and client CPU segments. By contrast, NVIDIA's gaming business made $3.3 billion, up 15% from last quarter, even at the tail end of a generation. NV's gaming business made almost the same money as AMD's datacenter business, and that's just insanity.

Things really, really aren't good for Radeon right now. I'm not sure if you even noticed, but AMD hasn't released a driver this month, and it's already November 20. Meanwhile, NVIDIA released the STALKER 2-optimized Game Ready driver 8 days in advance, early last week.

Instinct was repositioned as a datacenter product, and this segment's revenue was $3.5 billion according to their latest earnings call. Most of it will be immediately sunk into research and development costs for the next-generation products. Nvidia's datacenter revenue for the last fiscal quarter was $30.8B by comparison, and even that will end up being largely spent on R&D.

Delusional to think an RX 7900 XTX should be priced at $500. I really don't understand these obnoxious Nvidia fanboy takes; why should AMD charge peanuts while Nvidia inflates their prices with each generation? Do you people have a masochistic fetish or something? Why do you want to pay more?

$500 for a 7900 XTX-level card may actually become a reality within the next 6 months as the RTX 50 series comes out. And depending on the features Blackwell offers and its power consumption figures - as long as it has 16+ GB of RAM - even at $500 they might end up rather undesirable...
 
Joined
Jan 8, 2017
Messages
9,500 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Because they still don't have the funds and can't maintain two architectural lines anymore.
There is no way a revenue stream of billions of dollars is not sufficient for that; besides, if need be, it makes more sense to just keep RDNA for the time being. Something is suspect: either they expect all this datacenter AI demand to crater in the coming years, which is entirely possible, or maybe they really don't know what they're doing.
 
Joined
Dec 25, 2020
Messages
7,001 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
There is no way a revenue stream of billions of dollars is not sufficient for that; besides, if need be, it makes more sense to just keep RDNA for the time being. Something is suspect: either they expect all this datacenter AI demand to crater in the coming years, which is entirely possible, or maybe they really don't know what they're doing.

I can easily see it being insufficient, especially if they opt to spend that revenue on their stronger businesses (as a CEO, I would choose this path). A quick search shows each 3 nm wafer runs around $20k, and this is set to increase next year. Engineers are very expensive as well, and software takes a lot of time and investment to develop and maintain... I'm not sure Statista is the best source, but NV seems to have spent over $8.6B on R&D this fiscal year alone.


[Attachment: chart of NVIDIA's annual R&D spending]


A new GPU architecture is drafted roughly 4-6 years in advance, which probably means the products they are releasing now were under active development from 2018 to 2020, with costs increasing steadily over time. AMD's in the same boat; here's the data I found, but it also covers all of their other businesses, CPUs and all:


[Attachment: chart of AMD's annual R&D spending]


With these figures, gaming and datacenter revenues wouldn't pay for their 2025 R&D budget at all.
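
To put that wafer figure in perspective, here's a rough back-of-the-envelope sketch in Python. Only the ~$20k per-wafer price comes from the post above; the die size, defect density, and the simple Poisson yield model are illustrative assumptions on my part, not AMD or TSMC figures.

```python
import math

# Only the per-wafer price is from the post above; everything else
# is an illustrative assumption, not an AMD/TSMC figure.
WAFER_COST_USD = 20_000   # ~3 nm wafer price cited above
WAFER_DIAMETER_MM = 300   # standard wafer size
DIE_AREA_MM2 = 350        # assumed size of a large GPU die
DEFECT_DENSITY = 0.1      # assumed defects per cm^2

def dies_per_wafer(die_area_mm2: float, diameter_mm: float) -> int:
    """Classic dies-per-wafer approximation, accounting for edge loss."""
    radius = diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_fraction(die_area_mm2: float, d0_per_cm2: float) -> float:
    """Simple Poisson yield model: exp(-die_area * defect_density)."""
    return math.exp(-(die_area_mm2 / 100) * d0_per_cm2)

candidates = dies_per_wafer(DIE_AREA_MM2, WAFER_DIAMETER_MM)
good = candidates * yield_fraction(DIE_AREA_MM2, DEFECT_DENSITY)
print(f"{candidates} die candidates, ~{good:.0f} good dies per wafer")
print(f"~${WAFER_COST_USD / good:,.0f} raw silicon cost per good die")
```

Under these assumptions you get roughly 117 good dies and about $170 of raw silicon per good die, before packaging, memory, board, and the R&D amortization discussed above; grow the die and the per-die cost rises faster than linearly, since both the candidate count and the yield fall, which is why wafer pricing dominates big-GPU economics.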
 
Joined
Oct 2, 2015
Messages
3,152 (0.94/day)
Location
Argentina
System Name Ciel / Akane
Processor AMD Ryzen R5 5600X / Intel Core i3 12100F
Motherboard Asus Tuf Gaming B550 Plus / Biostar H610MHP
Cooling ID-Cooling 224-XT Basic / Stock
Memory 2x 16GB Kingston Fury 3600MHz / 2x 8GB Patriot 3200MHz
Video Card(s) Gainward Ghost RTX 3060 Ti / Dell GTX 1660 SUPER
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB / NVMe WD Blue SN550 512GB
Display(s) AOC Q27G3XMN / Samsung S22F350
Case Cougar MX410 Mesh-G / Generic
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W / Gigabyte P450B
Mouse EVGA X15 / Logitech G203
Keyboard VSG Alnilam / Dell
Software Windows 11
RDNA 4 is the chance they have to prove they aren't 100% disconnected from reality and to provide what made Polaris such a great non-high-end product: competitive pricing for the feature set you get.
We'll then see if they can keep driver development on par with software releases on the new architecture in 2026, or if we will be back to the dark age of months between driver releases that fix some problems and cause others. So fun to fire people to "solve issues".

So far, it seems datacenter sales have turned the gaming consumer into a third-class citizen; it's up to AMD to prove the contrary.
 
Joined
Aug 12, 2022
Messages
250 (0.29/day)
I've only gotten back into computer hardware relatively recently, so I'm not familiar with AMD's GCN hardware. Could someone please give me a layman's answer on why the unification will be beneficial?
GCN "Graphics Core Next" was AMD's graphics architecture used for the Radeon HD 7000 series (GCN ~1), R 200 series (GCN ~2), R9 285 (GCN 3), RX 400/500 series (GCN 4 aka Polaris), and RX Vega/Vega VII series (GCN 5 aka Vega). All GCN graphics cards, professional or gaming, used GCN. But now all gaming cards use RDNA and all professional cards use CDNA. Going from Vega to RDNA 1, AMD gaming cards delivered a nice improvement in gaming, but regressed in some professional workloads like compute. But part of RDNA's success was that the transistor and power budget wasn't wasted on as many non-gaming resources.

Today AMD has a market-share problem. Very few customers buy CDNA GPUs, and that's partly because they're not already common. RDNA itself could attract more customers if it had stronger compute, and if all Radeon products had compute, more people would have access to AMD's compute software, which would translate into more professional GPU sales. Ultimately, AMD has concluded that its market share can grow and its GPU architectures can be made more cheaply if gaming and compute are re-unified, and that this justifies the higher transistor cost. This article has what AMD said when this was announced. https://www.techpowerup.com/326442/...ular-gpu-architecture-similar-to-nvidias-cuda
 
Joined
Jul 26, 2024
Messages
190 (1.28/day)
No competition for the 5090/5080 until mid-2026, great. AMD stopped trying, and we wonder why NVIDIA is free to do whatever they want with pricing in the upper mid-range and high end.
 
Joined
Dec 25, 2020
Messages
7,001 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
No competition for the 5090/5080 until mid-2026, great. AMD stopped trying, and we wonder why NVIDIA is free to do whatever they want with pricing in the upper mid-range and high end.

Until mid-2026, assuming:

1. UDNA 1 doesn't flop (performance or feature regression)
2. UDNA 1 doesn't suffer from the same type of chronic issues that plagued RDNA 1 (infamous 5700 XT black screen issues for example)
3. They have more than just a pretty control panel to offer. Even then, I'll concede that asking for polished, stable, and performant drivers for a new architecture on day one is a bit much - although I expect full stability after 4 months or so
 
Joined
Jan 8, 2017
Messages
9,500 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
With these figures gaming and datacenter revenues wouldn't pay for their 2025 R&D budget at all
This doesn't make sense no matter how you spin it; they were making a lot less money when they decided to split their architectures into CDNA and RDNA. I don't think R&D budgets are the issue here; if you spent $5 billion on R&D last year, you're not mandated to spend even more this year. You are blinded by the numbers: these companies always spend whatever they have, and both AMD and Nvidia have been making more money in the past years, so their R&D spending went up proportionally.

NVIDIA is free to do whatever they want with pricing in the upper mid-range and high end
They always did that; there isn't a single example I can think of where either AMD or ATI ever had much influence on their pricing schemes, no matter how competitive they were. Absurdities like the GTX 8800 Ultra Mega Super always existed.
 
Joined
Dec 25, 2020
Messages
7,001 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
This doesn't make sense no matter how you spin it; they were making a lot less money when they decided to split their architectures into CDNA and RDNA. I don't think R&D budgets are the issue here; if you spent $5 billion on R&D last year, you're not mandated to spend even more this year. You are blinded by the numbers: these companies always spend whatever they have, and both AMD and Nvidia have been making more money in the past years, so their R&D spending went up proportionally.

Sure, you're not mandated to, but then you're going to stagnate even further, since research and development won't be done for free, and slashing the R&D budget ultimately means slower progress and/or a worse resulting product. Given the reliance modern games have on compute, and that dedicated graphics cards are set to become the primary accelerator in the modern AI PC, reunifying CDNA and RDNA into UDNA, while also lowering their development and maintenance costs and reining in R&D spending, is a winning move.

I don't think their previous woes had anything to do specifically with having a unified architecture, but rather with the resulting product. At the end of the day, that is what most customers care about. The Vega series (especially the VII) was built on what was essentially an almost 10-year-old architectural base; even though those cards were refined to the extreme and used highly advanced technology for their time (such as HBM memory), the fundamental problems of GCN still affected them, all the while Turing was already DX12 Ultimate capable before the standard was even made public. RDNA 1 was an attempt to restart the gaming lineup, and while the first iteration was pretty bad, we can see that RDNA 2 was very much a respectable architecture, arguably the best AMD has made in the past few years.

Turing was clearly on a level all its own. Even though its prices were far higher than what AMD charged at the time, you usually got what you paid for. Perhaps it didn't mean much in the beginning, with rudimentary, almost demo-like support for DLSS 1 and early RT games, but long-term, I'd say people who invested in Turing ended up way better off. We see a similar situation now: Nvidia has a more refined, feature-complete, efficient product to sell, and even though they charge for the privilege, the market has clearly - and repeatedly - chosen to support that path. The overwhelming success of the Pascal architecture, and the fact that people are still fond of and using their GTX 1080 and 1080 Ti GPUs today, 8 years later, has even prolonged the RDNA 1 cards' viability, since they have incomplete DirectX 12 support just like Pascal cards, and game developers are still inclined to support these products today.
 
Joined
Nov 14, 2021
Messages
140 (0.12/day)
It will be interesting to see how rasterization moves forward with UDNA. Same with RT, even. AI stuff doesn't care about these, though HPC does care a lot about shaders. I wonder how it will impact gaming. HPC/AI is where the money is at, not so much gaming. It will be interesting to see how the business market shapes AMD's consumer video cards.

I wonder how UDNA will translate across the spectrum of power needs, from smartphones to servers. That usually doesn't go too well. I started seeing articles pop up suggesting that AMD might be interested in getting into the mobile market, but I don't really see how that will turn into a thing.
 
Joined
Jul 26, 2024
Messages
190 (1.28/day)
They always did that; there isn't a single example I can think of where either AMD or ATI ever had much influence on their pricing schemes, no matter how competitive they were. Absurdities like the GTX 8800 Ultra Mega Super always existed.
What a short memory you have.
 
Joined
Jun 2, 2017
Messages
9,360 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Sure, you're not mandated to, but then you're going to stagnate even further, since research and development won't be done for free, and slashing the R&D budget ultimately means slower progress and/or a worse resulting product. Given the reliance modern games have on compute, and that dedicated graphics cards are set to become the primary accelerator in the modern AI PC, reunifying CDNA and RDNA into UDNA, while also lowering their development and maintenance costs and reining in R&D spending, is a winning move.

I don't think their previous woes had anything to do specifically with having a unified architecture, but rather with the resulting product. At the end of the day, that is what most customers care about. The Vega series (especially the VII) was built on what was essentially an almost 10-year-old architectural base; even though those cards were refined to the extreme and used highly advanced technology for their time (such as HBM memory), the fundamental problems of GCN still affected them, all the while Turing was already DX12 Ultimate capable before the standard was even made public. RDNA 1 was an attempt to restart the gaming lineup, and while the first iteration was pretty bad, we can see that RDNA 2 was very much a respectable architecture, arguably the best AMD has made in the past few years.

Turing was clearly on a level all its own. Even though its prices were far higher than what AMD charged at the time, you usually got what you paid for. Perhaps it didn't mean much in the beginning, with rudimentary, almost demo-like support for DLSS 1 and early RT games, but long-term, I'd say people who invested in Turing ended up way better off. We see a similar situation now: Nvidia has a more refined, feature-complete, efficient product to sell, and even though they charge for the privilege, the market has clearly - and repeatedly - chosen to support that path. The overwhelming success of the Pascal architecture, and the fact that people are still fond of and using their GTX 1080 and 1080 Ti GPUs today, 8 years later, has even prolonged the RDNA 1 cards' viability, since they have incomplete DirectX 12 support just like Pascal cards, and game developers are still inclined to support these products today.
Yep, that is why the 4090 wins in every single game. That is why the MX300 is not selling well. It does not matter how many paragraphs you write; the fact remains that you are very anti-AMD.
 
Joined
Dec 25, 2020
Messages
7,001 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Yep, that is why the 4090 wins in every single game. That is why the MX300 is not selling well. It does not matter how many paragraphs you write; the fact remains that you are very anti-AMD.

You need to be reading very hard into it, and very wrong, to deduce "I hate AMD" from that post.
 
Joined
Aug 3, 2006
Messages
141 (0.02/day)
Location
Austin, TX
Processor Ryzen 6900HX
Memory 32 GB DDR4LP
Video Card(s) Radeon 6800m
Display(s) LG C3 42''
Software Windows 11 home premium
It doesn't matter if AMD (ATI) puts a better product out. The Nvidia mindshare is unreal, beyond Apple's. AMD offers extremely good products right now, like the RX 7800 XT and 7900 GRE, or even the 7700 XT, yet Nvidia is selling boats full of 4060s. They can slap their logo on just about anything and your stereotypical diabetic with Cheetos fingers and greasy balding hair is going to buy it.
 
Joined
Sep 30, 2024
Messages
110 (1.34/day)
It doesn't matter if AMD (ATI) puts a better product out. The Nvidia mindshare is unreal, beyond Apple's. AMD offers extremely good products right now, like the RX 7800 XT and 7900 GRE, or even the 7700 XT, yet Nvidia is selling boats full of 4060s. They can slap their logo on just about anything and your stereotypical diabetic with Cheetos fingers and greasy balding hair is going to buy it.
True, AMD has too many drawbacks: FSR is bad, with loads of visual artifacts, and framegen is not there yet either. If AMD can get FSR and framegen to be the same as or better than NV's, and fix its terrible RT engine, then they would be seen as a viable alternative to NV.

But as long as AMD overprices their cards and offers lower visual quality, they will carry on fading into the void. They need more than just speed.
 
Joined
Sep 26, 2022
Messages
2,147 (2.63/day)
Location
Brazil
System Name G-Station 2.0 "YGUAZU"
Processor AMD Ryzen 7 5700X3D
Motherboard Gigabyte X470 Aorus Gaming 7 WiFi
Cooling Freezemod: Pump, Reservoir, 360mm Radiator, Fittings / Bykski: Blocks / Barrow: Meters
Memory Asgard Bragi DDR4-3600CL14 2x16GB
Video Card(s) Sapphire PULSE RX 7900 XTX
Storage 240GB Samsung 840 Evo, 1TB Asgard AN2, 2TB Hiksemi FUTURE-LITE, 320GB+1TB 7200RPM HDD
Display(s) Samsung 34" Odyssey OLED G8
Case Lian Li Lancool 216
Audio Device(s) Astro A40 TR + MixAmp
Power Supply Cougar GEX X2 1000W
Mouse Razer Viper Ultimate
Keyboard Razer Huntsman Elite (Red)
Software Windows 11 Pro
True, AMD has too many drawbacks: FSR is bad, with loads of visual artifacts, and framegen is not there yet either. If AMD can get FSR and framegen to be the same as or better than NV's, and fix its terrible RT engine, then they would be seen as a viable alternative to NV.

But as long as AMD overprices their cards and offers lower visual quality, they will carry on fading into the void. They need more than just speed.
I must say I've never had a single issue with AFMF (paired with Anti-Lag), but the CS2 VAC fiasco burned AMD's image seriously.
 
Joined
Aug 3, 2006
Messages
141 (0.02/day)
Location
Austin, TX
Processor Ryzen 6900HX
Memory 32 GB DDR4LP
Video Card(s) Radeon 6800m
Display(s) LG C3 42''
Software Windows 11 home premium
True, AMD has too many drawbacks: FSR is bad, with loads of visual artifacts, and framegen is not there yet either. If AMD can get FSR and framegen to be the same as or better than NV's, and fix its terrible RT engine, then they would be seen as a viable alternative to NV.

But as long as AMD overprices their cards and offers lower visual quality, they will carry on fading into the void. They need more than just speed.
Found the 4090 owner with orange fingers.
 
Joined
Nov 14, 2021
Messages
140 (0.12/day)
Wonder if this will hurt sales at all. I have less interest in RDNA 4 now, honestly. I'd be more interested in getting one just to collect. Same with Intel's cards.

Not too hopeful there will be an RDNA 4 card that is decently more powerful than the 6600 in the same power budget. RT and AI might be heavily improved, but in these lower-end cards you still need great raster performance, as AI and RT are less valuable to utilize.
 
Joined
Jan 20, 2019
Messages
1,590 (0.74/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
I have less interest in RDNA 4 now, honestly. I'd be more interested in getting one just to collect.

Good plan. You collect, I’ll hold, and together we’ll corner the market on GPUs :D
 
Joined
Sep 30, 2024
Messages
110 (1.34/day)
Found the 4090 owner with orange fingers.
So all nVidia owners are Trump supporters now? I'm not even American, you small-minded f. But I'll tell you what: nVidia cards are better than Radeon cards, and Trump is your President; deal with it, wipe away those delicious salty lib tears, and grow up and move on. And no, I don't own a 4090, nor do I plan on owning a 5090.

There are plenty of comparison videos that show the difference between FSR and DLSS, and FSR is pretty good, but it NEVER wins against DLSS. FACT, unless you don't know what to look for, which wouldn't surprise me at this point!
 