Monday, September 9th 2024

AMD to Unify Gaming "RDNA" and Data Center "CDNA" into "UDNA": Singular GPU Architecture Similar to NVIDIA's CUDA

According to Tom's Hardware, AMD has announced plans to unify its consumer-focused gaming "RDNA" and data center "CDNA" graphics architectures into a single, unified design called "UDNA." The announcement was made by AMD's Jack Huynh, Senior Vice President and General Manager of the Computing and Graphics Business Group, at IFA 2024 in Berlin. The goal of the new UDNA architecture is to give developers a single target, so that an optimized application can run on a consumer-grade GPU like the Radeon RX 7900 XTX as well as a high-end data center GPU like the Instinct MI300. This would create a unification similar to NVIDIA's CUDA, which lets CUDA-focused developers run applications on everything from laptops to data centers.
Jack Huynh: "So, part of a big change at AMD is today we have a CDNA architecture for our Instinct data center GPUs and RDNA for the consumer stuff. It's forked. Going forward, we will call it UDNA. There'll be one unified architecture, both Instinct and client [consumer]. We'll unify it so that it will be so much easier for developers, versus today, where they have to choose and value is not improving."
According to Jack Huynh, AMD "made some mistakes with the RDNA side; each time we change the memory hierarchy, the subsystem, it has to reset the matrix on the optimizations. I don't want to do that. So, going forward, we're thinking about not just RDNA 5, RDNA 6, RDNA 7, but UDNA 6 and UDNA 7. We plan the next three generations because once we get the optimizations, I don't want to have to change the memory hierarchy, and then we lose a lot of optimizations. So, we're kind of forcing that issue about full forward and backward compatibility. We do that on Xbox today; it's very doable but requires advanced planning. It's a lot more work to do, but that's the direction we're going."

When AMD originally split CDNA from RDNA, the company wanted two separate designs, thinking they would be easier to manage. That turned out to be far from the truth: maintaining two separate optimization targets is a nightmare, both logistically and engineering-wise. The shift back to a single GPU architecture should therefore benefit the company in the long term and ease the development of new products, with both gaming-focused and compute-focused teams working on one design. The strategy mirrors NVIDIA's approach with CUDA, which has kept its architecture in a single lane while adding special accelerators for AI and ray tracing, something AMD also plans to do.
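As a rough illustration of what a single programming model buys developers (a hedged sketch, not AMD's or NVIDIA's actual tooling), consider how CUDA works today: the same kernel source runs unchanged on any supported device, and the toolchain/runtime compiles it for whatever hardware it finds, whether a laptop GPU or a data center accelerator. A unified UDNA would aim to give AMD's stack the same property across Radeon and Instinct.

```cuda
// Minimal CUDA sketch: one kernel source, many GPU targets.
#include <cstdio>
#include <cuda_runtime.h>

// The same SAXPY kernel runs unchanged on a laptop GPU or a data
// center accelerator; the architecture differences are handled by
// the compiler and runtime, not by the application developer.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    // Enumerate whatever devices are present; the same binary serves
    // every device class it encounters.
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int d = 0; d < count; ++d) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, d);
        printf("Device %d: %s (compute capability %d.%d)\n",
               d, prop.name, prop.major, prop.minor);
    }
    return 0;
}
```

This is exactly the kind of write-once portability Huynh describes wanting for UDNA: an optimization done once should survive across generations and across the client/data center split, instead of resetting whenever the memory hierarchy changes.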
Source: Tom's Hardware

56 Comments on AMD to Unify Gaming "RDNA" and Data Center "CDNA" into "UDNA": Singular GPU Architecture Similar to NVIDIA's CUDA

#26
lexluthermiester
Zunexxx: So, they went from GCN to RDNA because they couldn’t do “best of both worlds” and had to optimize the arch separately, and now they are going back again????
It was a different world then and the compute dynamic was different. Lots of advances and the separate code bases have become cumbersome. Now they need to unify it.
#27
TheDeeGee
Fouquin: 10 years too late for what? Something they were already doing 13 years ago? GCN was already a combined architecture with a single programming model.
Then why did it take them so long to return, after watching NVIDIA use what works best for four generations?

Are they sleeping?
#28
TumbleGeorge
I still don't understand why you are comparing UDNA to the old GCN. Does anyone already know the future UDNA codebase and functionality well enough to make a qualified comparison with GCN?
#29
close
Does that slide with the vague Venn diagram say March 5, 2020 on the bottom? What's that date supposed to represent? Was the "> UDNA <" part "backported" to an older slide?
#30
TumbleGeorge
close: Does that slide with the vague Venn diagrams say March 5, 2020 on the bottom?
Yes, good catch. It really needs explanation.
#31
Random_User
Zunexxx: So, they went from GCN to RDNA because they couldn’t do “best of both worlds” and had to optimize the arch separately, and now they are going back again????
No. It was because heavy computational workloads were the domain of the professional/enterprise market. AMD just followed Nvidia's suit and decreased compute performance in consumer entertainment products, since those don't really benefit from it. And also due to Nvidia shills whining from every hole that "gamur" graphics cards should game better, not compute. So there was no point in putting more raw power into client hardware. Because no one in their right mind would have thought this would turn into the unhealthy trend of using "cut-down" consumer gaming cards for heavy compute workloads such as crypto-mining and AI. Right?

So since gaming-oriented cards from all vendors are now being misused insanely for those particular purposes, and are also stuffed AF with "AI" compute blocks, the vendors decided to justify their "premium" over "ordinary" gamer hardware, and its (mis)use, in some bizarre way, and at the same time significantly reduce R&D expenses. Especially if this is now a single solution.

One may take this as the "EPYC" route, but for Radeon. It seems they would rather recycle the leftovers from enterprise products (like Ryzen originating from EPYC and Threadripper binned dies) than create something separate. A win-win for the company and its shareholders/investors, but for consumers this is even worse news.
Because AMD has now not only abandoned proper Radeon development, but more or less cut it down completely, to just repurpose the failed dies that none of their enterprise clients are ready to pay for.

But that's just some thoughts from another perspective.

P.S.: One way or another, the Radeon VII (Vega II) was the first of that kind, and personally, bad execution aside, the idea was not particularly bad, TBH. It wasn't the top gaming "dog" in general, but in games that benefited from the additional compute blocks, it was shining. Just as it was great for custom software RT shaders.
#32
Guwapo77
I have very mixed feelings about this merging of the software. I thought it was a huge mistake back when AMD split the software for the two different GPU segments. However, since then, the RDNA drivers have been pretty damn solid for the most part, receiving timely updates and all. I remember the nightmares vividly, dating back to the 9500 PRO... We can only hope the drivers will remain solid moving forward.
#33
las
Neo_Morpheus: So the rumors that RDNA 4 is just a small refresh of RDNA 3, and that RDNA 5 is the real new architecture, seem more valid.

So in other words, RDNA 5 will really be UDNA 1.

Also, as mentioned, they concentrated on Ryzen first and that bet paid off; now they are concentrating on Radeon/Instinct.

Let's see how it goes, but it does look promising.
They won't concentrate on gaming/consumer GPUs at all. The white flag was raised. AMD is a CPU company first, and when it comes to the GPU business, they want the AI and enterprise business over gaming any day; that is what they spend R&D funds on.

AMD is at around 10% gaming GPU market share now and losing month after month. Simply not a good deal for 90% of gamers: lower resale value, higher power usage, worse features, worse support and optimization in most games. Even AMD-sponsored games tend to run better on Nvidia. Just look at Starfield, which was AMD-sponsored and had several bugs on AMD GPUs (no sun, for example). Most developers prioritize Nvidia first, use Nvidia for development, and use Nvidia at home too. Time is money, and time is better spent optimizing for 90% of the player base. This is why most new games run better on Nvidia: fewer issues, better overall performance.

RDNA 4 needs very aggressive pricing to regain market share, but I doubt the top SKU will perform better than the 7900 GRE.
It absolutely won't perform like a 7900 XT at 200 watts like some people expect, LOL.
#34
Makaveli
las: Lower resale value.
As someone who has been using and reselling Radeons for the better part of 20 years, I don't believe this to be true. I never had a problem selling a card when I was ready to upgrade.
#35
Neo_Morpheus
las: They won't concentrate on gaming/consumer GPUs at all. The white flag was raised. AMD is a CPU company first, and when it comes to the GPU business, they want the AI and enterprise business over gaming any day; that is what they spend R&D funds on.

AMD is at around 10% gaming GPU market share now and losing month after month. Simply not a good deal for 90% of gamers: lower resale value, higher power usage, worse features, worse support and optimization in most games. Even AMD-sponsored games tend to run better on Nvidia. Just look at Starfield, which was AMD-sponsored and had several bugs on AMD GPUs (no sun, for example). Most developers prioritize Nvidia first, use Nvidia for development, and use Nvidia at home too. Time is money, and time is better spent optimizing for 90% of the player base. This is why most new games run better on Nvidia: fewer issues, better overall performance.

RDNA 4 needs very aggressive pricing to regain market share, but I doubt the top SKU will perform better than the 7900 GRE.
It absolutely won't perform like a 7900 XT at 200 watts like some people expect, LOL.
You have some good points, but the rest is simply more of the same anti-AMD propaganda that the influencers spew on a daily basis and the followers then regurgitate.

That misinformation, in my opinion, is the main reason why we now have Ngreedia with a 90% market share.
#36
Vayra86
las: They won't concentrate on gaming/consumer GPUs at all. The white flag was raised. AMD is a CPU company first, and when it comes to the GPU business, they want the AI and enterprise business over gaming any day; that is what they spend R&D funds on.

AMD is at around 10% gaming GPU market share now and losing month after month. Simply not a good deal for 90% of gamers: lower resale value, higher power usage, worse features, worse support and optimization in most games. Even AMD-sponsored games tend to run better on Nvidia. Just look at Starfield, which was AMD-sponsored and had several bugs on AMD GPUs (no sun, for example). Most developers prioritize Nvidia first, use Nvidia for development, and use Nvidia at home too. Time is money, and time is better spent optimizing for 90% of the player base. This is why most new games run better on Nvidia: fewer issues, better overall performance.

RDNA 4 needs very aggressive pricing to regain market share, but I doubt the top SKU will perform better than the 7900 GRE.
It absolutely won't perform like a 7900 XT at 200 watts like some people expect, LOL.
Sorry, but the better half of this post is nonsense.

Games run out of the box on RDNA 3; no game-ready drivers are needed these days. There are indeed rare occasions with bugs, and they exist on the Nvidia side too. But in terms of needing a driver just to run a given game, Nvidia has had to push a lot more patches since RDNA 2-3, partly due to their expanded feature set. Your impression of the general state of RDNA 3 is off: it's in a great place. It runs all games admirably. I play a very wide variety and nothing fails.
Optimization is another aspect, and there you're probably right. On the other hand, the AMD console hardware already puts RDNA in a pretty strong starting position: it's already running well on similar architecture for its primary target market, the consoles. There are almost no big games that are PC-first these days, so your impression that devs do more on Nvidia is off, too.

They're really in a very good place overall with regard to game support and stability, easily rivalling Nvidia. It's when you tread into OpenGL territory and non-gaming applications that you will occasionally see Nvidia having slightly better support. Not surprising, since those are PC-first, a different reality.
#37
Draconis
TumbleGeorge: Yes, good catch. It really needs explanation.
Wild speculation time: this has been on the cards for a while, and the rumoured RDNA 5 ground-up architecture is "UDNA 5," regardless of what they call it. The Instinct team helps the Radeon team fix the GPU chiplet strategy.
#38
qlum
It could very well be the case that a tiling/chiplet-based architecture is more in play at that point. Just because the architecture is the same doesn't mean the GPUs themselves don't have major differences. Splitting GCN and RDNA has also not been the greatest success for consumer GPUs in terms of competitiveness.
#39
R-T-B
Fermi was Nvidia's last truly "unified" arch, in the sense that it was virtually unchanged from compute to gaming.

No, I don't know that this move gives me good feels at all...
#40
lexluthermiester
R-T-B: No, I don't know that this move gives me good feels at all...
I know what you mean. Seems reasonable to think they're going more for making things simpler and easier, and for the feel-goods.
#41
las
Vayra86: Sorry, but the better half of this post is nonsense.

Games run out of the box on RDNA 3; no game-ready drivers are needed these days. There are indeed rare occasions with bugs, and they exist on the Nvidia side too. But in terms of needing a driver just to run a given game, Nvidia has had to push a lot more patches since RDNA 2-3, partly due to their expanded feature set. Your impression of the general state of RDNA 3 is off: it's in a great place. It runs all games admirably. I play a very wide variety and nothing fails.
Optimization is another aspect, and there you're probably right. On the other hand, the AMD console hardware already puts RDNA in a pretty strong starting position: it's already running well on similar architecture for its primary target market, the consoles. There are almost no big games that are PC-first these days, so your impression that devs do more on Nvidia is off, too.

They're really in a very good place overall with regard to game support and stability, easily rivalling Nvidia. It's when you tread into OpenGL territory and non-gaming applications that you will occasionally see Nvidia having slightly better support. Not surprising, since those are PC-first, a different reality.
Lmao, no. Tons of AMD users are constantly complaining in new games, even in old games. Look at the Hunt: Showdown 1896 Steam discussions, where AMD users have huge issues.

Another example was Starfield. Even though it was AMD-sponsored, all AMD GPU users had no sun present in the game for weeks/months post-release.

Like I said, devs will prioritize the 90% over the 10% any day. Just because consoles use AMD hardware doesn't say much about PC games. There's only a handful of games that run better on AMD GPUs, and looking at overall performance across many titles, including alphas, betas, early access, less popular titles, and emulators, Nvidia is the clear winner with the fewest issues and the best performance.

AMD is cheaper for a reason: worse features, worse drivers and optimization, higher power use, lower resale value.

If AMD GPUs were actually good, they would have way more market share. That is just reality.

AMD leaving the high-end GPU market is just another nail in the coffin.
Neo_Morpheus: You have some good points, but the rest is simply more of the same anti-AMD propaganda that the influencers spew on a daily basis and the followers then regurgitate.

That misinformation, in my opinion, is the main reason why we now have Ngreedia with a 90% market share.
I had a 6800 XT before my 4090; I know exactly what's up and down. AMD pretty much loses in all areas except price, and when you factor in the lower resale value and higher power usage, you literally save nothing. AMD has way more issues in games as well; it is a fact. Go read discussion forums on new game releases and you will see.

Do I want AMD to stay competitive? Yes. Are they? No, not in the GPU space, that's for sure. Nvidia is king.

It's funny how people generally speak to me like I am an Nvidia fanboy. I have like 5 AMD chips in the house, and I'm even using an AMD CPU in my main gaming rig. I just KNOW that AMD GPUs are nowhere near Nvidia at this point in time, and I am not alone:

www.tomshardware.com/pc-components/gpus/nvidias-grasp-of-desktop-gpu-market-balloons-to-88-amd-has-just-12-intel-negligible-says-jpr

People are literally fleeing from AMD GPUs at the moment.

Hopefully RDNA 4 will be a success so AMD can crawl back to 20-25% market share over the next 5 years. RDNA 5 needs to be a home run as well for that to happen.
Makaveli: As someone who has been using and reselling Radeons for the better part of 20 years, I don't believe this to be true. I never had a problem selling a card when I was ready to upgrade.
It is 100% true. AMD lowers prices over time to stay competitive, meaning resale value drops, while Nvidia keeps its prices a lot more steady. Think Android vs. iPhone resale value here; it's the exact same thing. iPhones are worth far more when you sell them again. Tons of demand. More expensive, yes, but you get more back. The same is true for Nvidia GPUs.

AMD is the small player, so price is what they adjust to compete. Remember how they sold the Radeon 6000 series dirt cheap leading up to the 7000 launch and even deep into the 7000 series? This is why AMD resale value is very low. The 6700, 6800, and 6900 series were selling for peanuts on the used market because of this.

It's not hard to sell AMD GPUs; it is hard not to lose a lot of money compared to the price you bought them for. That is what I am saying, and it is 100% true. Low demand = low price.

Also, for this gen, AMD uses more power too. When you look at the TCO, you simply don't save much buying AMD, and you get inferior features and more issues too, and this is why AMD lost and keeps losing GPU market share. AMD is CPU first, GPU second, and they don't spend a lot of their R&D funds on GPUs, especially not gaming GPUs, because high-end gaming GPUs simply don't sell well for AMD.

Most people with $500+ to spend on a GPU buy Nvidia.

AMD's best sellers have all been cheap cards like the RX 480/470/580/570, 5700 XT, 6700 XT, etc.

This is what they are aiming for with RDNA 4 as well: low to mid-range, hopefully grabbing back some market share.
#42
LittleBro
las: It's funny how people generally speak to me like I am an Nvidia fanboy.
Maybe they call you an Nvidiot, because that's what you are.

You can't compare an RX 6800 and an RTX 4090. Of course an RTX 4090 or an iPhone will have a higher value when you sell it, because the initial investment was much higher (than an AMD GPU or an Android phone). It's the same with cars: a used Dacia will cost much less than a used BMW or Mercedes, but the Dacia cost nowhere near as much as the other two.

I have never had a problem selling my old Radeon GPU for a reasonable price.

Don't forget what practices Nvidia has been using for a long time to gain their market share (shady practices similar to Intel's).
What I like about AMD is that everything they develop, they release as open source, and it can be used on any GPU (Intel, AMD, Nvidia).

If you want to use G-Sync, you need an Nvidia GPU and a G-Sync certified/capable monitor. I'm tired of this proprietary greediness... It has been like this forever... PissX, DLSS. Man, they even refused to let older RTX cards use newer DLSS despite those cards being totally capable of supporting it. Where is PissX now? Nowadays Nvidia fools its customers with fake frames. You don't pay $1,600 for a GPU to play a game with fake frames and a distorted image. But that's what Nvidia tells you: you need the newest DLSS and fake frames! And you need the newest RTX generation to support the newest DLSS generation, of course.

This DLSS/FSR stuff does not help the case of poor game optimization. On one hand, it's insane that even an RTX 4090 cannot run some of the newest games maxed out above 60 FPS. On the other hand, if you turn on that stupid DLSS/FSR to increase FPS, you are turning a blind eye to poor game development. As a game developer, what would drive me to optimize my game to run smoothly when I could just tell customers to turn on DLSS/FSR to increase performance... But it's all distorted, or fake, or anything... but definitely not native.

We pay more and more for new hardware, and what do we get? Stupid upscaling and/or AI guessing. And Nvidia has fully supported that idea from the beginning.
#43
las
LittleBro: Maybe they call you an Nvidiot, because that's what you are.

You can't compare an RX 6800 and an RTX 4090. Of course an RTX 4090 or an iPhone will have a higher value when you sell it, because the initial investment was much higher (than an AMD GPU or an Android phone). It's the same with cars: a used Dacia will cost much less than a used BMW or Mercedes, but the Dacia cost nowhere near as much as the other two.

I have never had a problem selling my old Radeon GPU for a reasonable price.

Don't forget what practices Nvidia has been using for a long time to gain their market share (shady practices similar to Intel's).
What I like about AMD is that everything they develop, they release as open source, and it can be used on any GPU (Intel, AMD, Nvidia).

If you want to use G-Sync, you need an Nvidia GPU and a G-Sync certified/capable monitor. I'm tired of this proprietary greediness... It has been like this forever... PissX, DLSS. Man, they even refused to let older RTX cards use newer DLSS despite those cards being totally capable of supporting it. Where is PissX now? Nowadays Nvidia fools its customers with fake frames. You don't pay $1,600 for a GPU to play a game with fake frames and a distorted image. But that's what Nvidia tells you: you need the newest DLSS and fake frames! And you need the newest RTX generation to support the newest DLSS generation, of course.

This DLSS/FSR stuff does not help the case of poor game optimization. On one hand, it's insane that even an RTX 4090 cannot run some of the newest games maxed out above 60 FPS. On the other hand, if you turn on that stupid DLSS/FSR to increase FPS, you are turning a blind eye to poor game development. As a game developer, what would drive me to optimize my game to run smoothly when I could just tell customers to turn on DLSS/FSR to increase performance... But it's all distorted, or fake, or anything... but definitely not native.

We pay more and more for new hardware, and what do we get? Stupid upscaling and/or AI guessing. And Nvidia has fully supported that idea from the beginning.
Ah, so I am an Nvidiot because I can afford high-end hardware and you can't? :laugh: Are you an AMPOOR then? Obviously I know a 6800 XT is not comparable to a 4090. I am saying the 6800 XT was a terrible experience.

Whatever makes you happy. What I am claiming is 100% true, and people are fleeing from AMD GPUs in general, which anyone can see.

www.tomshardware.com/pc-components/gpus/nvidias-grasp-of-desktop-gpu-market-balloons-to-88-amd-has-just-12-intel-negligible-says-jpr
#44
LittleBro
las: Ah, so I am an Nvidiot because I can afford high-end hardware and you can't? :laugh: Are you an AMPOOR then?
Nah, it's because you praise Nvidia and spit shit on AMD. Always. In every discussion. It's a pattern of your behavior.

You should not call people poor when they don't buy things they don't need.
Different people have different interests. You paid a huge amount for an RTX 4090; I'd rather pay that amount for something else.
To me, an RTX 4090 for $1,600 is not worth it, especially considering how much I game per month.

Anyway, how about you post some reasonable comment on my points regarding DLSS or Nvidia's shady practices?

I know many people who still have an RX 6800 XT, and they don't experience any problems. That card was a great successor to the RX 5700 XT: it doubled the previous generation's memory and added support for a lot of new functions. It had only about 10% less rasterization performance than the RTX 3090 on average at 1080p and 1440p, while priced at less than half of the RTX 3090's MSRP. The main problem (for customers) was its price during the crypto fever; at some point it may have cost even more than today's RTX 4090. I'd rather call the RX 7800 XT a terrible experience compared to the RX 6800 XT.
#45
las
LittleBro: Nah, it's because you praise Nvidia and spit shit on AMD. Always. In every discussion. It's a pattern of your behavior.

You should not call people poor when they don't buy things they don't need.
Different people have different interests. You paid a huge amount for an RTX 4090; I'd rather pay that amount for something else.
To me, an RTX 4090 for $1,600 is not worth it, especially considering how much I game per month.

Anyway, how about you post some reasonable comment on my points regarding DLSS or Nvidia's shady practices?

I know many people who still have an RX 6800 XT, and they don't experience any problems. That card was a great successor to the RX 5700 XT: it doubled the previous generation's memory and added support for a lot of new functions. It had only about 10% less rasterization performance than the RTX 3090 on average at 1080p and 1440p, while priced at less than half of the RTX 3090's MSRP. The main problem (for customers) was its price during the crypto fever; at some point it may have cost even more than today's RTX 4090. I'd rather call the RX 7800 XT a terrible experience compared to the RX 6800 XT.
Nah, I am a realist, and AMD is simply far behind Nvidia. There is proof all over; you are just ignoring it because you are the actual fanboy here. I could not care less whether my GPU is AMD or Nvidia, as long as it delivers.

Funny how you think a CPU-first company is going to beat Nvidia in GPUs, though. Never going to happen.

AMD left the high-end GPU market now, for good reason. No one is really buying expensive AMD GPUs. AMD is years behind in too many areas.

Go have a look at 2:30 in this video and you will know why competitive gamers use Nvidia 99% of the time as well. The 4080 beats the 7900 XTX with ease while using 65 watts; the 7900 XTX draws 325-350 watts.


#46
LittleBro
las: Nah, I am a realist, and AMD is simply far behind Nvidia. There is proof all over; you are just ignoring it because you are the actual fanboy here. I could not care less whether my GPU is AMD or Nvidia, as long as it delivers.
Calling another person poor, calling others the same as they called you...
las: Funny how you think a CPU-first company is going to beat Nvidia in GPUs, though. Never going to happen.
Making up things and arguments that nobody said...

And delivering expected results in another thread ...

I'm done with you, kiddo.
#47
Makaveli
las: It is 100% true. AMD lowers prices over time to stay competitive, meaning resale value drops, while Nvidia keeps its prices a lot more steady. Think Android vs. iPhone resale value here; it's the exact same thing. iPhones are worth far more when you sell them again. Tons of demand. More expensive, yes, but you get more back. The same is true for Nvidia GPUs.

AMD is the small player, so price is what they adjust to compete. Remember how they sold the Radeon 6000 series dirt cheap leading up to the 7000 launch and even deep into the 7000 series? This is why AMD resale value is very low. The 6700, 6800, and 6900 series were selling for peanuts on the used market because of this.

It's not hard to sell AMD GPUs; it is hard not to lose a lot of money compared to the price you bought them for. That is what I am saying, and it is 100% true. Low demand = low price.

Also, for this gen, AMD uses more power too. When you look at the TCO, you simply don't save much buying AMD, and you get inferior features and more issues too, and this is why AMD lost and keeps losing GPU market share. AMD is CPU first, GPU second, and they don't spend a lot of their R&D funds on GPUs, especially not gaming GPUs, because high-end gaming GPUs simply don't sell well for AMD.

Most people with $500+ to spend on a GPU buy Nvidia.

AMD's best sellers have all been cheap cards like the RX 480/470/580/570, 5700 XT, 6700 XT, etc.

This is what they are aiming for with RDNA 4 as well: low to mid-range, hopefully grabbing back some market share.
I bought a 6800 XT in 2021, then sold it in 2023 for half its value. When I looked at the time, the same would have applied to a 3080 10 GB or the Ti model, so again I'm not sure about this "selling for peanuts." I haven't lost any money on my resales, and value drops naturally as cards age. And if demand were low, I wouldn't have been able to sell any of my cards; they sold literally one week after I posted my ads.

I'm someone who spends more than $500 on GPUs, and they have all been Radeons. There is no way to quantify what "most people" do without actual sales data.
las: Nah, I am a realist, and AMD is simply far behind Nvidia. There is proof all over; you are just ignoring it because you are the actual fanboy here. I could not care less whether my GPU is AMD or Nvidia, as long as it delivers.

Funny how you think a CPU-first company is going to beat Nvidia in GPUs, though. Never going to happen.

AMD left the high-end GPU market now, for good reason. No one is really buying expensive AMD GPUs. AMD is years behind in too many areas.

Go have a look at 2:30 in this video and you will know why competitive gamers use Nvidia 99% of the time as well. The 4080 beats the 7900 XTX with ease while using 65 watts; the 7900 XTX draws 325-350 watts.


I remember this video and its flaws.

The guy is comparing an AIB 7900 XTX vs. an FE 4080 instead of a reference 7900 XTX; that alone makes the comparison moot.

If you are going to compare, it has to be AIB vs. AIB and reference vs. reference. I'm pretty sure I even left a comment on that video when I saw it years ago.
#48
las
Makaveli: I bought a 6800 XT in 2021, then sold it in 2023 for half its value. When I looked at the time, the same would have applied to a 3080 10 GB or the Ti model, so again I'm not sure about this "selling for peanuts." I haven't lost any money on my resales, and value drops naturally as cards age. And if demand were low, I wouldn't have been able to sell any of my cards; they sold literally one week after I posted my ads.

I'm someone who spends more than $500 on GPUs, and they have all been Radeons. There is no way to quantify what "most people" do without actual sales data.

I remember this video and its flaws.

The guy is comparing an AIB 7900 XTX vs. an FE 4080 instead of a reference 7900 XTX; that alone makes the comparison moot.

If you are going to compare, it has to be AIB vs. AIB and reference vs. reference. I'm pretty sure I even left a comment on that video when I saw it years ago.
Bought a 3080 on release for 699 and sold it for 1200 dollars during mining craze. Which was the reason I picked up a dirt cheap 6800XT as temp card, until 4090 replaced it.

6800XT were selling for like 400 dollars brand new, post mining craze. Even 6900XT and 6950XT were below 450 dollars. 6700XT were selling for like 250-300 dollars. It's a fact that AMD lowers price alot. Lower demand = lower prices and AMD always compete on price. Nvidia don't really have to, because demand is high.

AMD even delayed 7700 and 7800 series like crazy because warehouses were filled to the brink with 6700, 6800 and 6900 series collecting dust. Hence the massive pricecuts. OBVIOUSLY resale price takes a hit then.



You don't have to try and explain, I have been in this game for 25+ years. Built 1000s of custom high-end PCs. Sold millions of units B2B. It's a simple fact that Nvidia retains its value much better over time, especially today with all the RTX features and leading performance. AMD is simply years behind and now left the high-end gaming GPU space officially.

AMD is doing worse than ever in the dGPU gaming space. I use 4090 because AMD have nothing that even comes close. Nvidia absolutely destroys AMD when you consider it all; Features, drivers and optimization, RT and Path Tracing performance. I would pick 4080S, 4080, 4070 Ti SUPER and even 4070 Ti/SUPER over any AMD card right now personally. Simply can't loose DLDSR, DLSS, DLAA, Reflex, ShadowPlay, Proper RT Performance and option for Path Tracing with Frame Gen that actually works good. DLSS/DLAA beats FSR with ease, Techpowerup tested this in like 50 games and Nvidia wins all.

AMD has invented nothing new in the GPU space in the last several generations; all they do is undercut Nvidia and offer worse features across the board, which is why they keep losing market share. 9 out of 10 people won't even consider AMD at this point.

Let's hope AMD can make a turnaround with RDNA4 and RDNA5, because right now things look very bad:

www.tomshardware.com/pc-components/gpus/nvidias-grasp-of-desktop-gpu-market-balloons-to-88-amd-has-just-12-intel-negligible-says-jpr

Nvidia fanboy? Nah, if AMD were able to offer what Nvidia is offering, I would be using an AMD GPU. AMD simply has nothing I want at this point. FSR is mediocre. RT is unusable. VSR is meh compared to DLDSR. Anti-Lag+ loses to Reflex.

I don't look at raster performance only. I look at the full picture, and there are 600+ games with RTX features now, rising fast. 9 out of 10 new games simply run better on Nvidia. Native gaming is dead to me; I play all games with DLAA or DLDSR, which beats native with absolute ease. Even DLSS on the higher presets can beat native while improving performance, proof:

www.rockpapershotgun.com/outriders-dlss-performance
#49
zenlaserman
las: Bought a 3080 on release for $699 and sold it for $1,200 during the mining craze.

6800 XTs were selling for around $400 brand new post-mining craze. Even the 6900 XT and 6950 XT were below $450. 6700 XTs were selling for $275-300.

AMD even delayed the 7700 and 7800 series like crazy because warehouses were filled to the brim with 6700, 6800, and 6900 series cards.



You don't have to try and explain; I have been in this game for 25+ years, built thousands of custom high-end PCs, and sold millions of units B2B. It's a simple fact that Nvidia retains its value much better over time, especially today.

You are in full denial mode in every post. Sadly, what I say is true: AMD is doing worse than ever in the dGPU gaming space. I use a 4090 because AMD has nothing that even comes close, not even in raster. Nvidia absolutely destroys AMD when you consider it all: features, drivers and optimization, RT and Path Tracing performance.

AMD has invented nothing new in the GPU space in the last few generations; all they do is undercut Nvidia and offer worse features across the board, which is why they keep losing market share. 9 out of 10 people won't even consider AMD at this point.

Let's hope AMD can make a turnaround with RDNA4 and RDNA5, because right now things look very bad:

www.tomshardware.com/pc-components/gpus/nvidias-grasp-of-desktop-gpu-market-balloons-to-88-amd-has-just-12-intel-negligible-says-jpr
Shut up. You're like an ouroboros, except at the center of a toilet.
#50
las
Makaveli: I bought a 6800 XT in 2021, then sold it in 2023 for half its value. When I looked at the time, the same thing would have applied to a 3080 10 GB or the Ti model, so again I'm not sure about this "selling for peanuts." I haven't lost any money on my resales, and value drops naturally as cards age. And if demand was low, I wouldn't have been able to sell any of my cards; they sold literally a week after I posted my ads.

I'm someone that spends more than $500 on GPUs, and they have all been Radeons. There's no way to quantify what most people do without actual sales data.


I remember this video and its flaws.

Guy is comparing an AIB 7900 XTX vs. an FE 4080 instead of a reference 7900 XTX; that alone makes the comparison moot.

If you are going to compare, it has to be AIB vs. AIB and reference vs. reference. I'm pretty sure I even left a comment on that video when I saw it years ago.
Yeah, you saw and commented on a 1-year-old video, years ago :laugh: Sounds legit.

You are in full denial mode :laugh: Do you also refuse to believe that AMD is doing badly in the GPU space?