
AMD to Unify Gaming "RDNA" and Data Center "CDNA" into "UDNA": Singular GPU Architecture Similar to NVIDIA's CUDA

Joined
Jul 24, 2024
Messages
219 (1.78/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
Despite this being a logical move, I don't think that gamers will get what they deserve or want.
AMD surely wants to unify architectures to send one of the teams to work on AI. I really doubt that UDNA will get manpower from both RDNA and CDNA teams.

As much as I hate this, it's a step taken toward making more money - the dumb, chatty AI that you'd better ask the same thing twice just to be sure.
AMD goes where the most money is. Can't blame them; it's all about the money, after all.

What does this mean for gamers? Will we get HBM memory as with Vega?
If not, they will still need to optimize for different memory technologies (GDDR vs. HBM), and that's still a bit like dealing with two different architectures ...
I'd welcome having HBM memory instead of GDDR in gaming GPUs, even though I'd need to pay extra money for it.
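To put rough numbers on the GDDR vs. HBM point: peak bandwidth is just bus width times per-pin data rate, and the two technologies get there very differently (wide-and-slow vs. narrow-and-fast). A back-of-envelope sketch in C++, using publicly listed specs for one HBM2 and one GDDR6 card (treat the figures as approximate):

```cpp
// Peak memory bandwidth, back of envelope:
//   GB/s = (bus width in bits / 8) * per-pin data rate in Gbps
#include <cstdio>

static double peak_gbps(int bus_width_bits, double gbps_per_pin) {
    return bus_width_bits / 8.0 * gbps_per_pin;
}

int main() {
    // Vega 64: 2048-bit HBM2 at ~1.89 Gbps/pin -> ~484 GB/s (wide, slow bus)
    std::printf("Vega 64 (HBM2):     %.0f GB/s\n", peak_gbps(2048, 1.89));
    // RX 7800 XT: 256-bit GDDR6 at 19.5 Gbps/pin -> ~624 GB/s (narrow, fast bus)
    std::printf("RX 7800 XT (GDDR6): %.0f GB/s\n", peak_gbps(256, 19.5));
    return 0;
}
```

The wide-and-slow versus narrow-and-fast split is why a shared architecture would still need memory-subsystem tuning per product line.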
 
Joined
Jul 5, 2013
Messages
27,730 (6.67/day)
So, they went from GCN to RDNA because they couldn’t do “best of both worlds” and had to optimize the arch separately, and now they are going back again????
It was a different world then, and the compute dynamic was different. A lot has advanced since, and the separate code bases have become cumbersome. Now they need to unify them.
 
Joined
Aug 2, 2012
Messages
1,986 (0.44/day)
Location
Netherlands
System Name TheDeeGee's PC
Processor Intel Core i7-11700
Motherboard ASRock Z590 Steel Legend
Cooling Noctua NH-D15S
Memory Crucial Ballistix 3200/C16 32GB
Video Card(s) Nvidia RTX 4070 Ti 12GB
Storage Crucial P5 Plus 2TB / Crucial P3 Plus 2TB / Crucial P3 Plus 4TB
Display(s) EIZO CX240
Case Lian-Li O11 Dynamic Evo XL / Noctua NF-A12x25 fans
Audio Device(s) Creative Sound Blaster ZXR / AKG K601 Headphones
Power Supply Seasonic PRIME Fanless TX-700
Mouse Logitech G500S
Keyboard Keychron Q6
Software Windows 10 Pro 64-Bit
Benchmark Scores None, as long as my games run smooth.
10 years too late for what? Something they were already doing 13 years ago? GCN was already a combined architecture with a single programming model.
Then why did it take them so long to return, after watching four generations of NVIDIA using what works best?

Are they sleeping?
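For reference, this is what a "single programming model" buys on Nvidia's side: one kernel source builds for both a data center part and a gaming part from the same file. A minimal CUDA sketch (the arch flags are illustrative, e.g. sm_80 for A100 and sm_86 for RTX 30):

```cpp
// Build one fat binary covering a data center and a gaming architecture:
//   nvcc -gencode arch=compute_80,code=sm_80 \
//        -gencode arch=compute_86,code=sm_86 saxpy.cu -o saxpy
#include <cstdio>

// The same kernel source runs on either class of GPU.
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    float *x, *y;
    cudaMallocManaged(&x, n * sizeof(float));  // unified memory, visible to host and device
    cudaMallocManaged(&y, n * sizeof(float));
    for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();

    std::printf("y[0] = %.1f\n", y[0]);  // expect 4.0
    cudaFree(x);
    cudaFree(y);
    return 0;
}
```

AMD's HIP aims at the same single-source story on the ROCm side; the UDNA pitch is presumably about making that one code base map onto one hardware line.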
 
Joined
Sep 1, 2020
Messages
2,343 (1.52/day)
Location
Bulgaria
I still don't understand why you are comparing UDNA to the old GCN. Does anyone actually know the future UDNA codebase and functionality well enough to make a qualified comparison with GCN?
 
Joined
Nov 2, 2016
Messages
112 (0.04/day)
Does that slide with the vague Venn diagram say March 5, 2020 on the bottom? What's that date supposed to represent? Was the "> UDNA <" part "backported" to an older slide?
 
Joined
Jun 1, 2010
Messages
380 (0.07/day)
System Name Very old, but all I've got ®
Processor So old, you don't wanna know... Really!
So, they went from GCN to RDNA because they couldn’t do “best of both worlds” and had to optimize the arch separately, and now they are going back again????
No. Because heavy computational workloads were the domain of the professional/enterprise market. AMD just followed Nvidia's suit and cut compute performance in its consumer entertainment products, since those don't really benefit from it, and also because Nvidia shills were whining from every hole that "gamur" graphics cards should game better, not compute. So there was no point in putting more raw power into client hardware. Because no one in their right mind would have thought this would turn into the unhealthy trend of using "cut-down" consumer gaming cards for heavy compute workloads such as crypto mining and AI. Right?

So, since gamer-oriented cards from all vendors are now being misused insanely for those particular purposes, and are also stuffed full of "AI" compute blocks, the vendors decided to justify the "premium" over "ordinary" gamer hardware, and its (mis)use, in some bizarre way, and at the same time significantly reduce R&D expenses. Especially if this is now a single solution.

One may take this as the "EPYC" route, but for Radeon. It seems they would rather recycle the leftovers from enterprise products (the way Ryzen originates from EPYC and Threadripper binned dies) than create something separate. A win-win for the company and its shareholders/investors, but for consumers this is even worse news.
Because AMD has now not only abandoned proper Radeon development but more or less cut it down completely, to just repurpose the failed dies that none of their enterprise clients are willing to pay for.

But that's just some thoughts from another perspective.

P.S.: One way or another, the Radeon VII (Vega II) was the first of that kind, and personally, bad execution aside, the idea was not particularly bad, TBH. It wasn't the top gaming dog in general, but in games that benefited from the extra compute blocks it was shining, just as it was great for custom software RT shaders.
 
Joined
Jun 6, 2021
Messages
684 (0.54/day)
System Name Red Devil
Processor AMD 5950x - Vermeer - B0
Motherboard Gigabyte X570 AORUS MASTER
Cooling NZXT Kraken Z73 360mm; 14 x Corsair QL 120mm RGB Case Fans
Memory G.Skill Trident Z Neo 32GB Kit DDR4-3600 CL14 (F4-3600C14Q-32GTZNB)
Video Card(s) PowerColor's Red Devil Radeon RX 6900 XT (Navi 21 XTX)
Storage 2 x Western Digital SN850 1TB; 1 x Samsung SSD 870 EVO 2TB
Display(s) 3 x Asus VG27AQL1A; 1 x Sony A1E OLED 4K
Case Corsair Obsidian 1000D
Audio Device(s) Corsair SP2500; Steel Series Arctis Nova Pro Wireless (XBox Version)
Power Supply AX1500i Digital ATX - 1500w - 80 Plus Titanium
Mouse Razer Basilisk V3
Keyboard Razer Huntsman V2 - Optical Gaming Keyboard
Software Windows 11
I have very mixed feelings about this merging of the software. I thought it was a huge mistake back when AMD split the software for the two different GPU segments. However, since then, the RDNA drivers have been pretty damn solid for the most part, receiving timely updates and all. I remember the driver nightmares vividly, dating back to the 9500 PRO... We can only hope the drivers will remain solid moving forward.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.39/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
So the rumors that RDNA4 is just a small refresh of RDNA3, and that RDNA5 is the real new architecture, seem more valid.

So in other words, RDNA5 will really be UDNA1.

Also, as mentioned, they concentrated on Ryzen first and that bet paid off; now they are concentrating on Radeon/Instinct.

Let's see how it goes, but it does look promising.

They won't concentrate on gaming/consumer GPUs at all. The white flag was raised. AMD is a CPU company first, and when it comes to the GPU business, they want AI and enterprise business over gaming any day; that is where they spend their R&D funds.

AMD is at like 10% gaming GPU market share now and losing more month after month. Simply not a good deal for 90% of gamers. Lower resale value. Higher power usage. Worse features. Worse support and optimization in most games. Even AMD-sponsored games tend to run better on Nvidia. Just look at Starfield, which was AMD-sponsored and still had several bugs on AMD GPUs (no sun, for example). Most developers prioritize Nvidia first; they use Nvidia for development and use Nvidia at home too. Time is money, and time is better spent optimizing for 90% of the player base. This is why most new games run better on Nvidia. Fewer issues. Better overall performance.

RDNA4 needs very aggressive pricing to grab back market share, but I doubt the top SKU will perform better than the 7900 GRE.
It absolutely won't perform like a 7900 XT at 200 watts like some people expect, LOL.
 
Joined
Feb 21, 2006
Messages
2,221 (0.32/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5013 AM4 AGESA V2 PI 1.2.0.Cc.
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) XFX Radeon RX 7900 XTX Magnetic Air (24.10.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 20TB
Display(s) LG 34GP83A-B 34 Inch 21: 9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c 5800X3D https://valid.x86.fr/b7d
Lower resale value.

As someone who has been using and reselling Radeons for the better part of 20 years, I don't believe this to be true. I never had a problem selling a card when I was ready to upgrade.
 
Joined
Dec 6, 2022
Messages
381 (0.53/day)
Location
NYC
System Name GameStation
Processor AMD R5 5600X
Motherboard Gigabyte B550
Cooling Arctic Freezer II 120
Memory 16 GB
Video Card(s) Sapphire Pulse 7900 XTX
Storage 2 TB SSD
Case Cooler Master Elite 120
They won't concentrate on gaming/consumer GPUs at all. The white flag was raised. AMD is a CPU company first, and when it comes to the GPU business, they want AI and enterprise business over gaming any day; that is where they spend their R&D funds.

AMD is at like 10% gaming GPU market share now and losing more month after month. Simply not a good deal for 90% of gamers. Lower resale value. Higher power usage. Worse features. Worse support and optimization in most games. Even AMD-sponsored games tend to run better on Nvidia. Just look at Starfield, which was AMD-sponsored and still had several bugs on AMD GPUs (no sun, for example). Most developers prioritize Nvidia first; they use Nvidia for development and use Nvidia at home too. Time is money, and time is better spent optimizing for 90% of the player base. This is why most new games run better on Nvidia. Fewer issues. Better overall performance.

RDNA4 needs very aggressive pricing to grab back market share, but I doubt the top SKU will perform better than the 7900 GRE.
It absolutely won't perform like a 7900 XT at 200 watts like some people expect, LOL.
You have some good points, but the rest is simply more of the same anti-AMD propaganda that influencers spew on a daily basis and their followers then regurgitate.

That misinformation, in my opinion, is the main reason why we now have Ngreedia with a 90% market share.
 
Joined
Sep 17, 2014
Messages
22,437 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
They won't concentrate on gaming/consumer GPUs at all. The white flag was raised. AMD is a CPU company first, and when it comes to the GPU business, they want AI and enterprise business over gaming any day; that is where they spend their R&D funds.

AMD is at like 10% gaming GPU market share now and losing more month after month. Simply not a good deal for 90% of gamers. Lower resale value. Higher power usage. Worse features. Worse support and optimization in most games. Even AMD-sponsored games tend to run better on Nvidia. Just look at Starfield, which was AMD-sponsored and still had several bugs on AMD GPUs (no sun, for example). Most developers prioritize Nvidia first; they use Nvidia for development and use Nvidia at home too. Time is money, and time is better spent optimizing for 90% of the player base. This is why most new games run better on Nvidia. Fewer issues. Better overall performance.

RDNA4 needs very aggressive pricing to grab back market share, but I doubt the top SKU will perform better than the 7900 GRE.
It absolutely won't perform like a 7900 XT at 200 watts like some people expect, LOL.
Sorry, but the better part of this post is nonsense.

Games run out of the box on RDNA3, as in, no game-ready drivers needed these days. There are indeed rare occasions with bugs, and they exist on the Nvidia side too. But in terms of needing a driver just to run a given game, Nvidia has had to push a lot more patches since RDNA2/3; part of that is also due to their expanded feature set. Your impression of the general state of RDNA3 is off, though. It's in a great place. It runs all games admirably. I play a very wide variety and nothing fails.
Optimization is another aspect, and there you're probably right. OTOH, AMD's console hardware already puts RDNA in a pretty strong starting position: games are already running well on a similar architecture for their primary target market, the consoles. There are almost no big games that are PC-first these days, so again, your impression that devs do more on Nvidia is off, too.

They're really in a very good place overall wrt game support and stability, easily rivalling Nvidia. It's when you tread into OpenGL territory and non-gaming applications that you will occasionally see Nvidia having slightly better support. Not surprising, since those are PC-first, a different reality.
 
Joined
Sep 5, 2024
Messages
33 (0.42/day)
Processor Ryzen 3700X
Motherboard MSI B550
Cooling DeepCool AK620. 3 x 140mm intake fans, 1 x 140mm exhaust fan
Memory 32 Gb DDR4 3000
Video Card(s) RX 6750 XT
Storage NVME, SATA SSD and NAS HDD
Display(s) Dell 24" 1440p, Samsung 24" 1080p
Case Fractal Design Define 7
Audio Device(s) Onboard
Power Supply Super Flower ATX 3.0 850w
Mouse Corsair M65
Keyboard Corsair mechanical
Software Win 10, Ubuntu 24.04 LTS
Yes, good catch. It really needs an explanation.
Wild speculation time: this has been on the cards for a while, and the rumoured ground-up RDNA5 architecture is effectively "UDNA5", regardless of what they call it. The Instinct team helps the Radeon team fix the GPU chiplet strategy.
 
Joined
Dec 29, 2020
Messages
207 (0.15/day)
It could very well be the case that a tiling/chiplet-based architecture is more in play at that point. Just because the architecture is the same doesn't mean the GPUs themselves don't have major differences. Splitting GCN and RDNA has also not been the greatest success for consumer GPUs in terms of competitiveness.
 
Joined
Aug 20, 2007
Messages
21,453 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
Fermi was Nvidia's last real "unified" arch in the sense that it was virtually unchanged from compute to gaming.

No, I don't know that this move gives me good feels at all...
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.39/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Sorry, but the better part of this post is nonsense.

Games run out of the box on RDNA3, as in, no game-ready drivers needed these days. There are indeed rare occasions with bugs, and they exist on the Nvidia side too. But in terms of needing a driver just to run a given game, Nvidia has had to push a lot more patches since RDNA2/3; part of that is also due to their expanded feature set. Your impression of the general state of RDNA3 is off, though. It's in a great place. It runs all games admirably. I play a very wide variety and nothing fails.
Optimization is another aspect, and there you're probably right. OTOH, AMD's console hardware already puts RDNA in a pretty strong starting position: games are already running well on a similar architecture for their primary target market, the consoles. There are almost no big games that are PC-first these days, so again, your impression that devs do more on Nvidia is off, too.

They're really in a very good place overall wrt game support and stability, easily rivalling Nvidia. It's when you tread into OpenGL territory and non-gaming applications that you will occasionally see Nvidia having slightly better support. Not surprising, since those are PC-first, a different reality.
Lmao, no. Tons of AMD users are constantly complaining in new games. Even in old games. Look at the Hunt: Showdown 1896 Steam discussions, where AMD users have huge issues.

Another example was Starfield. Even though it was AMD-sponsored, all AMD GPU users had no sun present in the game for weeks, if not months, post-release.

Like I said, devs will prioritize the 90% over the 10% any day. Consoles using AMD hardware doesn't reflect PC games. There's only a handful of games that run better on AMD GPUs, and looking at overall performance across many titles, including alphas, betas, early access, less popular titles and emulators, Nvidia is the clear winner with the fewest issues and the best performance.

AMD is cheaper for a reason. Worse features, worse drivers and optimization, higher power use, lower resale value.

If AMD GPUs were actually good, they would have way more market share. That is just reality.

AMD leaving the high-end GPU market is just another nail in the coffin.

You have some good points, but the rest is simply more of the same anti-AMD propaganda that influencers spew on a daily basis and their followers then regurgitate.

That misinformation, in my opinion, is the main reason why we now have Ngreedia with a 90% market share.
I had a 6800 XT before my 4090; I know exactly what's what. AMD pretty much loses in all areas except price, and when you factor in the lower resale value and higher power usage, you literally save nothing. AMD has way more issues in games as well; it is a fact. Go read the discussion forums on new game releases and you will see.

Do I want AMD to stay competitive? Yes. Are they? No. Not in the GPU space, that's for sure. Nvidia is king.

It's funny how people generally speak to me like I am an Nvidia fanboy. I have like five AMD chips in-house, and I'm even using an AMD CPU in my main gaming rig. I just KNOW that AMD GPUs are nowhere near Nvidia at this point in time, and I am not alone:


People are literally fleeing from AMD GPUs at the moment.

Hopefully RDNA4 will be a success so AMD can crawl back to 20-25% market share over the next five years. RDNA5 needs to be a home run as well for that to happen.

As someone who has been using and reselling Radeons for the better part of 20 years, I don't believe this to be true. I never had a problem selling a card when I was ready to upgrade.
It is 100% true. AMD lowers prices over time to stay competitive, meaning resale value drops, while Nvidia keeps its prices a lot more steady. Think Android vs. iPhone resale value here; it's the exact same thing. iPhones are worth far more when you sell them again. Tons of demand. More expensive, yeah, but you get more back. The same is true for Nvidia GPUs.

AMD is the small player, so price is what they adjust to compete. Remember how they sold the Radeon 6000 series dirt cheap leading up to the 7000 launch, and even deep into the 7000 series? This is why AMD resale value is very low. The 6700, 6800 and 6900 series were selling for peanuts on the used market because of this.

It's not hard to sell AMD GPUs; it is hard not to lose a lot of money compared to the price you bought them for. That is what I am saying, and it is 100% true. Low demand = low price.

Also, for this gen, AMD uses more power too. When you look at the TCO, you simply don't save much buying AMD, and you get inferior features and more issues as well; this is why AMD lost and keeps losing GPU market share. AMD is CPU first, GPU second, and they don't spend a lot of their R&D funds on GPUs, especially not gaming GPUs, because high-end gaming GPUs simply don't sell well for AMD.
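For anyone who wants to sanity-check the TCO argument themselves, the arithmetic is simple. A sketch with openly assumed inputs (the electricity price, usage hours, board-power and price figures below are placeholders, not measurements; swap in your own):

```cpp
// Rough GPU total cost of ownership: purchase price + electricity over its life.
// Every number below is an assumption for illustration, not a measurement.
#include <cstdio>

int main() {
    const double usd_per_kwh    = 0.30;    // assumed electricity price
    const double hours_per_year = 1000.0;  // assumed gaming hours per year
    const double years          = 3.0;     // assumed service life

    struct { const char* card; double price_usd, board_watts; } gpus[] = {
        {"Card A (7900 XTX-class)",  999.0, 355.0},  // assumed price and board power
        {"Card B (4080-class)",     1199.0, 320.0},  // assumed price and board power
    };

    for (const auto& g : gpus) {
        double kwh = g.board_watts / 1000.0 * hours_per_year * years;
        double tco = g.price_usd + kwh * usd_per_kwh;
        std::printf("%-24s $%.0f + $%.0f power = $%.0f\n",
                    g.card, g.price_usd, kwh * usd_per_kwh, tco);
    }
    return 0;
}
```

Whether the savings actually vanish depends entirely on the numbers you plug in, which is rather the point: the claim is checkable.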

Most people with $500+ to spend on a GPU buy Nvidia.

AMD's best sellers have all been cheap cards like the RX 480/470/580/570, 5700 XT, 6700 XT, etc.

This is what they are aiming for with RDNA4 as well: low to mid-range, hopefully grabbing back some market share.
 
Joined
Jul 24, 2024
Messages
219 (1.78/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
It's funny how people generally speak to me like I am an Nvidia fanboy.
Maybe they call you Nvidiot, because that's what you are.

You can't compare an RX 6800 and an RTX 4090. Of course an RTX 4090 or an iPhone will fetch a higher price when you sell it, because the initial investment was much higher (than an AMD GPU or Android phone). It's the same with cars: a used Dacia will cost much less than a used BMW or Mercedes, but the Dacia's original cost was nowhere near the cost of the other two.

I have never had a problem selling my old Radeon GPUs for a reasonable price.

Don't forget what practices Nvidia has been using for a long time to gain market share (shady practices similar to Intel's).
What I like about AMD is that everything they develop, they release as open source, and it can be used on any GPU (Intel, AMD, Nvidia).

If you want to use G-Sync, you need an Nvidia GPU and a G-Sync certified/capable monitor. I'm tired of this proprietary greediness ... It has been like this forever ... PissX, DLSS. Man, they even refused to allow older RTX cards to utilize newer DLSS despite those cards being totally capable of supporting it. Where is PissX now? Nowadays Nvidia fools its customers with fake frames. You don't pay $1600 for a GPU to play a game with fake frames and a distorted image. But that's what Nvidia tells you: you need the newest DLSS and fake frames! And you need the newest RTX generation to support the newest DLSS generation, of course.

This DLSS/FSR stuff does not help the case of poor game optimization. On one hand, it's insane that even an RTX 4090 cannot run some of the newest games maxed out above 60 FPS. On the other hand, if you turn on that stupid DLSS/FSR to increase FPS, you are actually turning a blind eye to poor game development. As a game developer, what would drive me to optimize my game to run smoothly when I could just tell customers to turn on DLSS/FSR to increase performance? But it's all distorted, or fake, or whatever ... definitely not native.
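To put the upscaling point in concrete pixel terms: each preset renders internally at a fraction of the output resolution and scales up. A small sketch using the commonly published per-axis ratios for DLSS 2 / FSR 2 quality modes (approximate, and assumed here purely for illustration):

```cpp
// Internal render resolution behind each upscaler preset at 4K output.
// Per-axis scale factors: Quality 1.5x, Balanced ~1.7x, Performance 2.0x,
// Ultra Performance 3.0x (published DLSS 2 / FSR 2 ratios, approximate).
#include <cstdio>

int main() {
    const double out_w = 3840.0, out_h = 2160.0;  // 4K output
    struct { const char* preset; double scale; } modes[] = {
        {"Quality",           1.5},
        {"Balanced",          1.7},
        {"Performance",       2.0},
        {"Ultra Performance", 3.0},
    };
    for (const auto& m : modes)
        std::printf("%-17s -> %4.0f x %4.0f rendered, upscaled to 3840 x 2160\n",
                    m.preset, out_w / m.scale, out_h / m.scale);
    return 0;
}
```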

We pay more and more for new hardware, and what do we get? Stupid upscaling and/or AI guessing. And Nvidia has fully supported that idea from the beginning.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.39/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Maybe they call you Nvidiot, because that's what you are.

You can't compare an RX 6800 and an RTX 4090. Of course an RTX 4090 or an iPhone will fetch a higher price when you sell it, because the initial investment was much higher (than an AMD GPU or Android phone). It's the same with cars: a used Dacia will cost much less than a used BMW or Mercedes, but the Dacia's original cost was nowhere near the cost of the other two.

I have never had a problem selling my old Radeon GPUs for a reasonable price.

Don't forget what practices Nvidia has been using for a long time to gain market share (shady practices similar to Intel's).
What I like about AMD is that everything they develop, they release as open source, and it can be used on any GPU (Intel, AMD, Nvidia).

If you want to use G-Sync, you need an Nvidia GPU and a G-Sync certified/capable monitor. I'm tired of this proprietary greediness ... It has been like this forever ... PissX, DLSS. Man, they even refused to allow older RTX cards to utilize newer DLSS despite those cards being totally capable of supporting it. Where is PissX now? Nowadays Nvidia fools its customers with fake frames. You don't pay $1600 for a GPU to play a game with fake frames and a distorted image. But that's what Nvidia tells you: you need the newest DLSS and fake frames! And you need the newest RTX generation to support the newest DLSS generation, of course.

This DLSS/FSR stuff does not help the case of poor game optimization. On one hand, it's insane that even an RTX 4090 cannot run some of the newest games maxed out above 60 FPS. On the other hand, if you turn on that stupid DLSS/FSR to increase FPS, you are actually turning a blind eye to poor game development. As a game developer, what would drive me to optimize my game to run smoothly when I could just tell customers to turn on DLSS/FSR to increase performance? But it's all distorted, or fake, or whatever ... definitely not native.

We pay more and more for new hardware, and what do we get? Stupid upscaling and/or AI guessing. And Nvidia has fully supported that idea from the beginning.
Ah, so I am an Nvidiot because I can afford high-end hardware and you can't? :laugh: Are you an AMPOOR then? Obviously I know a 6800 XT is not comparable to a 4090. I am saying the 6800 XT was a terrible experience.

Whatever makes you happy. What I am claiming is 100% true, and people are fleeing from AMD GPUs in general, which anyone can see.

 
Joined
Jul 24, 2024
Messages
219 (1.78/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
Ah, so I am an Nvidiot because I can afford high-end hardware and you can't? :laugh: Are you an AMPOOR then?
Nah, it's because you praise Nvidia and spit shit on AMD. Always. In every discussion. It's a pattern of your behavior.

You should not call people poor just because they don't buy things they don't need.
Different people have different interests. You paid a huge amount for an RTX 4090; I'd rather pay that amount for something else.
To me, an RTX 4090 for $1600 is not worth it, especially considering how little I game per month.

Anyway, how about you post a reasonable reply to my points regarding DLSS and Nvidia's shady practices?

I know many people who still have an RX 6800 XT, and they don't experience any problems. That card was a great successor to the RX 5700 XT: it doubled the previous generation's memory and added support for a lot of new features. It had only about 10% less rasterization performance than the RTX 3090 on average at 1080p and 1440p, while being priced at less than half of the RTX 3090 (comparing MSRPs). The main problem (for customers) was its price during the crypto fever; at some point it may have cost even more than today's RTX 4090. I'd rather call the RX 7800 XT a terrible experience compared to the RX 6800 XT.
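Taking that post's own numbers at face value, the price/performance gap is easy to quantify. A quick sketch (launch MSRPs of $649 and $1,499; the ~10% raster deficit is the claim above, not a measurement of mine):

```cpp
// Raster performance per dollar from the figures claimed in the post above.
#include <cstdio>

int main() {
    // Relative 1440p raster performance (RTX 3090 = 1.00), per the claim above.
    struct { const char* card; double rel_perf, msrp_usd; } cards[] = {
        {"RTX 3090",   1.00, 1499.0},
        {"RX 6800 XT", 0.90,  649.0},
    };
    for (const auto& c : cards)
        std::printf("%-10s %.2f perf / $%.0f = %.2f perf per $1000\n",
                    c.card, c.rel_perf, c.msrp_usd,
                    c.rel_perf / c.msrp_usd * 1000.0);
    return 0;
}
```

With those inputs the 6800 XT lands at roughly twice the raster performance per dollar; change the inputs and the conclusion moves with them.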
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.39/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
Nah, it's because you praise Nvidia and spit shit on AMD. Always. In every discussion. It's a pattern of your behavior.

You should not call people poor just because they don't buy things they don't need.
Different people have different interests. You paid a huge amount for an RTX 4090; I'd rather pay that amount for something else.
To me, an RTX 4090 for $1600 is not worth it, especially considering how little I game per month.

Anyway, how about you post a reasonable reply to my points regarding DLSS and Nvidia's shady practices?

I know many people who still have an RX 6800 XT, and they don't experience any problems. That card was a great successor to the RX 5700 XT: it doubled the previous generation's memory and added support for a lot of new features. It had only about 10% less rasterization performance than the RTX 3090 on average at 1080p and 1440p, while being priced at less than half of the RTX 3090 (comparing MSRPs). The main problem (for customers) was its price during the crypto fever; at some point it may have cost even more than today's RTX 4090. I'd rather call the RX 7800 XT a terrible experience compared to the RX 6800 XT.
Nah, I am a realist, and AMD is simply far behind Nvidia. There is proof all over; you are just ignoring it because you are the actual fanboy here. I could not care less whether my GPU is AMD or Nvidia, as long as it delivers.

Funny how you think a CPU-first company is going to beat Nvidia in GPUs, though. Never going to happen.

AMD has left the high-end GPU market now, for good reason. No one is really buying expensive AMD GPUs. AMD is years behind in too many areas.

Go have a look at 2:30 in this video and you will see why competitive gamers use Nvidia 99% of the time as well. The 4080 beats the 7900 XTX with ease while using 65 watts; the 7900 XTX uses 325-350% more power.


 
Joined
Jul 24, 2024
Messages
219 (1.78/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
Nah, I am a realist, and AMD is simply far behind Nvidia. There is proof all over; you are just ignoring it because you are the actual fanboy here. I could not care less whether my GPU is AMD or Nvidia, as long as it delivers.
Calling the other person poor, calling others the same thing they called you ...

Funny how you think a CPU-first company is going to beat Nvidia in GPUs, though. Never going to happen.
Making up things and arguments that nobody said ...

And delivering expected results in another thread ...

I'm done with you, kiddo.
 
Joined
Feb 21, 2006
Messages
2,221 (0.32/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5013 AM4 AGESA V2 PI 1.2.0.Cc.
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) XFX Radeon RX 7900 XTX Magnetic Air (24.10.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 20TB
Display(s) LG 34GP83A-B 34 Inch 21: 9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c 5800X3D https://valid.x86.fr/b7d
It is 100% true. AMD lowers prices over time to stay competitive, meaning resale value drops, while Nvidia keeps its prices a lot more steady. Think Android vs. iPhone resale value here; it's the exact same thing. iPhones are worth far more when you sell them again. Tons of demand. More expensive, yeah, but you get more back. The same is true for Nvidia GPUs.

AMD is the small player, so price is what they adjust to compete. Remember how they sold the Radeon 6000 series dirt cheap leading up to the 7000 launch, and even deep into the 7000 series? This is why AMD resale value is very low. The 6700, 6800 and 6900 series were selling for peanuts on the used market because of this.

It's not hard to sell AMD GPUs; it is hard not to lose a lot of money compared to the price you bought them for. That is what I am saying, and it is 100% true. Low demand = low price.

Also, for this gen, AMD uses more power too. When you look at the TCO, you simply don't save much buying AMD, and you get inferior features and more issues as well; this is why AMD lost and keeps losing GPU market share. AMD is CPU first, GPU second, and they don't spend a lot of their R&D funds on GPUs, especially not gaming GPUs, because high-end gaming GPUs simply don't sell well for AMD.

Most people with $500+ to spend on a GPU buy Nvidia.

AMD's best sellers have all been cheap cards like the RX 480/470/580/570, 5700 XT, 6700 XT, etc.

This is what they are aiming for with RDNA4 as well: low to mid-range, hopefully grabbing back some market share.
I bought a 6800 XT in 2021, then sold it in 2023 for half its value. When I looked at the time, the same drop would have applied to a 3080 10GB or the Ti model, so again I'm not sure about this selling-for-peanuts business. I haven't lost any money on my resales, and value drops naturally as cards age. And if demand were low, I wouldn't have been able to sell any of my cards; they sold literally a week after I posted my ads.

I'm someone who spends more than $500 on GPUs, and they have all been Radeons. There is no way to quantify what most people do without actual sales data.

Nah, I am a realist, and AMD is simply far behind Nvidia. There is proof all over; you are just ignoring it because you are the actual fanboy here. I could not care less whether my GPU is AMD or Nvidia, as long as it delivers.

Funny how you think a CPU-first company is going to beat Nvidia in GPUs, though. Never going to happen.

AMD has left the high-end GPU market now, for good reason. No one is really buying expensive AMD GPUs. AMD is years behind in too many areas.

Go have a look at 2:30 in this video and you will see why competitive gamers use Nvidia 99% of the time as well. The 4080 beats the 7900 XTX with ease while using 65 watts; the 7900 XTX uses 325-350% more power.


I remember this video and its flaws.

The guy is comparing an AIB 7900 XTX vs. an FE 4080 instead of a reference 7900 XTX; that alone makes the comparison moot.

If you are going to compare, it has to be AIB vs. AIB and reference vs. reference. I'm pretty sure I even left a comment on that video when I saw it years ago.
 

las

Joined
Nov 14, 2012
Messages
1,693 (0.39/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
I bought a 6800 XT in 2021, then sold it in 2023 for half its value. When I looked at the time, the same drop would have applied to a 3080 10GB or the Ti model, so again I'm not sure about this selling-for-peanuts business. I haven't lost any money on my resales, and value drops naturally as cards age. And if demand were low, I wouldn't have been able to sell any of my cards; they sold literally a week after I posted my ads.

I'm someone who spends more than $500 on GPUs, and they have all been Radeons. There is no way to quantify what most people do without actual sales data.


I remember this video and its flaws.

The guy is comparing an AIB 7900 XTX vs. an FE 4080 instead of a reference 7900 XTX; that alone makes the comparison moot.

If you are going to compare, it has to be AIB vs. AIB and reference vs. reference. I'm pretty sure I even left a comment on that video when I saw it years ago.

I bought a 3080 at release for $699 and sold it for $1,200 during the mining craze, which was the reason I picked up a dirt-cheap 6800 XT as a temp card until the 4090 replaced it.

6800 XTs were selling for like $400 brand new post-mining-craze. Even the 6900 XT and 6950 XT were below $450. 6700 XTs were selling for like $250-300. It's a fact that AMD lowers prices a lot. Lower demand = lower prices, and AMD always competes on price. Nvidia doesn't really have to, because demand is high.

AMD even delayed the 7700 and 7800 series like crazy because warehouses were filled to the brim with 6700, 6800 and 6900 series cards collecting dust. Hence the massive price cuts. OBVIOUSLY resale prices take a hit then.



You don't have to try and explain; I have been in this game for 25+ years, built thousands of custom high-end PCs, and sold millions of units B2B. It's a simple fact that Nvidia retains its value much better over time, especially today with all the RTX features and leading performance. AMD is simply years behind and has now officially left the high-end gaming GPU space.

AMD is doing worse than ever in the gaming dGPU space. I use a 4090 because AMD has nothing that even comes close. Nvidia absolutely destroys AMD when you consider it all: features, drivers and optimization, RT and path tracing performance. I would personally pick a 4080 Super, 4080, 4070 Ti Super, or even a 4070 Ti over any AMD card right now. I simply can't lose DLDSR, DLSS, DLAA, Reflex, ShadowPlay, proper RT performance, and the option of path tracing with frame gen that actually works well. DLSS/DLAA beats FSR with ease; TechPowerUp tested this in like 50 games and Nvidia won them all.

AMD has invented nothing new in the GPU space in the last many generations; all they do is undercut Nvidia and offer worse features across the board, which is why they keep losing market share. 9 out of 10 people won't even consider AMD at this point.

Let's hope AMD can make a turnaround with RDNA4 and RDNA5, because right now things look very bad:


Nvidia fanboy? Nah, if AMD were able to offer what Nvidia is offering, I would be using an AMD GPU. AMD simply has nothing I want at this point. FSR is mediocre. RT is unusable. VSR is meh compared to DLDSR. Anti-Lag+ loses to Reflex.

I don't look at raster performance only. I look at the full picture, and there are 600+ games with RTX features now, rising fast. 9 out of 10 new games simply run better on Nvidia. Native gaming is dead to me; I play all games with DLAA or DLDSR, which beat native with absolute ease. Even DLSS on the higher presets can beat native while improving performance. Proof:

 
Joined
Jun 29, 2009
Messages
125 (0.02/day)
Location
El People's Republik de Kalifornistan
System Name my friends call me Zenny, I make a bad enemy
Processor still computing
Motherboard Naw she aint, she has 2 dachshunds
Cooling I ride motorcycles rain or shine
Memory semi-eidetic, 48 years on-time
Video Card(s) don't need one anymore, but still have a few Polaris
Storage I grew up in Silicon Valley during the '80's and '90's in San Jose, CA
Display(s) 2016 Scout Sixty when I want to look pretty. The females agree.
Case I only roll in old shit. 1963 F100 Unibody
Audio Device(s) JBL Boombox 3, V-MODA Crossfade
Power Supply Ensure, Bacon Jerky, and mineral water
Mouse I hates em. I sets up glue traps for em. Guk!
Keyboard Trying to close a Captains of Crush #3. For 5 years now.
VR HMD Nahmang /Yay Area
Software Hey! My underpants is my business!
Benchmark Scores How much does Mark weigh?
I bought a 3080 at release for $699 and sold it for $1,200 during the mining craze.

6800 XTs were selling for like $400 brand new post-mining-craze. Even the 6900 XT and 6950 XT were below $450. 6700 XTs were selling for like $275-300.

AMD even delayed the 7700 and 7800 series like crazy because warehouses were filled to the brim with 6700, 6800 and 6900 series cards.



You don't have to try and explain; I have been in this game for 25+ years, built thousands of custom high-end PCs, and sold millions of units B2B. It's a simple fact that Nvidia retains its value much better over time, especially today.

You are in full denial mode in every post. Sadly, what I say is true: AMD is doing worse than ever in the gaming dGPU space. I use a 4090 because AMD has nothing that even comes close, not even in raster; Nvidia absolutely destroys AMD when you consider it all: features, drivers and optimization, RT and path tracing performance.

AMD has invented nothing new in the GPU space in the last few generations; all they do is undercut Nvidia and offer worse features across the board, which is why they keep losing market share. 9 out of 10 people won't even consider AMD at this point.

Let's hope AMD can make a turnaround with RDNA4 and RDNA5, because right now things look very bad:


Shut up. You're like an ouroboros, except at the center of a toilet.
 