
AMD Retreating from Enthusiast Graphics Segment with RDNA4?

Joined
Nov 26, 2021
Messages
1,769 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Please, no HU. Their "demonstration" with the 6700 XT tortured at 55 FPS to show the value of the extra vRAM was the last straw for me. I have yet to see a game running at 50-60 fps without scenes where the framerate drops dramatically, and those drops happen precisely when you need as many frames per second as possible.

nVidia offers Overwatch 2 Invasion with the 4060, AMD goes with Starfield. The problem with these games is whether they appeal to you or not. Otherwise, the price difference between the cards is only 25 euros in Romania, and I gather from customer comments that there is no longer the enthusiasm seen with the 6000 series. Among the complaints are high noise and/or high temperatures (cheap coolers, dubious quality).
The difference in power consumption must also be taken into account (115 W versus 155 W), which for me is roughly the average draw of the i5-13500 in games when paired with the 3070 Ti.
All in all, AMD sells cheaper because it is the only weapon it has to fight nVidia with. Imagine the RTX 4090 and RX 7900 XTX at the same price. Who would still buy the RX?
The difference is barely 20 W.

1691901888321.png


With a frame limiter, the difference is within the margin of error.

1691901979840.png
 
Joined
Aug 25, 2021
Messages
1,206 (0.96/day)
Please, no HU. Their "demonstration" with the 6700 XT tortured at 55 FPS to show the value of the extra vRAM was the last straw for me. I have yet to see a game running at 50-60 fps without scenes where the framerate drops dramatically, and those drops happen precisely when you need as many frames per second as possible.
I have yet to meet a person rational enough to buy the $400 disgrace that is the 4060 Ti 8GB: a 128-bit memory bus half as wide as the last-gen 3060 Ti's, an embarrassing 5% faster at 1440p, merely a few percent faster than the 6700 XT while being 20% more expensive, with gimped silicon, a depressing product choking in a growing number of games due to low VRAM. Zero value for consumers. Even more so for the $500 abomination that is the 4060 Ti 16GB, which looks laughable when tested against the similarly priced, 4K-entry 6800 XT with the same amount of VRAM. That's all I will say about it, and Hardware Unboxed clearly shows what reality looks like, whether you like it or not.

It's pointless to imagine the 4090 and 7900 XTX at the same price unless we also imagine the same performance. If so, I'd still buy the 7900 XTX for its modern DisplayPort 2.1 at 54 Gbps, whereas the 4090 offers me seven-year-old DP 1.4 with roughly half the bandwidth. In reality, though, the 4090 is 25% faster in 4K but 60% more expensive. Not for me. You need to realize that different buyers favour different features. I am aware of what is appealing about the 4090, but it's not for me. Most people enjoy drinking milk. I don't.
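For anyone wondering what that bandwidth gap means in practice, here's a rough, illustrative Python calculation - uncompressed signal, nominal link rates, ignoring encoding overhead and DSC; the monitor mode is just an example, not a claim about any specific display:

```python
# Rough, illustrative display-bandwidth estimate (uncompressed, no DSC, no blanking overhead).
# The link rates below are the nominal figures for DP 1.4 (HBR3 x4) and DP 2.1 (UHBR13.5 x4).

def video_bandwidth_gbps(width: int, height: int, refresh_hz: int, bits_per_pixel: int) -> float:
    """Raw pixel-data rate in Gbit/s for an uncompressed video signal."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Example mode: 4K at 240 Hz with 10-bit colour (30 bits per pixel)
needed = video_bandwidth_gbps(3840, 2160, 240, 30)

dp14_link = 32.4   # Gbit/s raw link rate, DP 1.4 HBR3, four lanes
dp21_link = 54.0   # Gbit/s raw link rate, DP 2.1 UHBR13.5, four lanes

print(f"4K 240 Hz 10-bit needs ~{needed:.1f} Gbit/s uncompressed")
print(f"DP 1.4 link: {dp14_link} Gbit/s, DP 2.1 (UHBR13.5) link: {dp21_link} Gbit/s")
```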
 
Joined
Jun 6, 2022
Messages
622 (0.64/day)
The difference is barely 20 W.

View attachment 308784

With a frame limiter, the difference is within the margin of error.

View attachment 308785
Maybe not. You don't give an example of an "OC" versus non-OC specimen. The performance/consumption ratio is clearly in nVidia's favour at every level.
In gaming, the difference is 34 W; at maximum, it is 41 W.
Quite embarrassing, because the RX 7600 has the power consumption of a 4060 Ti yet only matches the non-Ti 4060, and only in rasterization.

dddd.jpg
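To make the performance/consumption point concrete, here's a tiny Python sketch of how you'd compare cards by FPS per watt - the numbers are placeholders roughly in line with figures quoted in this thread, not measurements:

```python
# Illustrative performance-per-watt comparison.
# The FPS and wattage values are placeholders for this example, not test data.

cards = {
    "RX 7600":     {"avg_fps": 100, "gaming_watts": 165},
    "RTX 4060":    {"avg_fps": 100, "gaming_watts": 131},
    "RTX 4060 Ti": {"avg_fps": 115, "gaming_watts": 160},
}

for name, d in cards.items():
    fps_per_watt = d["avg_fps"] / d["gaming_watts"]
    print(f"{name:>12}: {fps_per_watt:.2f} FPS/W")
```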
 
Joined
Jan 14, 2019
Messages
13,940 (6.31/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
Maybe not. You don't give an example of an "OC" versus non-OC specimen. The performance/consumption ratio is clearly in nVidia's favour at every level.
In gaming, the difference is 34 W; at maximum, it is 41 W.
Quite embarrassing, because the RX 7600 has the power consumption of a 4060 Ti yet only matches the non-Ti 4060, and only in rasterization.

View attachment 308803
I suspect the 7600 is way further up its efficiency curve by default than the 4060 is. That is, you could achieve similar results by power limiting / undervolting the 7600, or by overclocking the 4060 to the moon.
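If anyone wants to check where a card actually sits on that curve, a minimal sketch along these lines should do it - assuming nvidia-smi is on the PATH and you have the rights to change the power limit; the benchmark function is just a stand-in for whatever you actually run:

```python
# Minimal sketch of sweeping an Nvidia card's power limit to map its efficiency curve.
# Assumes the `nvidia-smi` CLI is available and you have admin/root rights.
import subprocess

def set_power_limit(watts: int) -> None:
    # "-pl" sets the board power limit; the driver clamps it to the card's allowed range.
    subprocess.run(["nvidia-smi", "-pl", str(watts)], check=True)

def run_benchmark(limit_watts: int) -> float:
    # Stand-in: replace with your real benchmark. The fake curve below only
    # illustrates diminishing returns as the power limit rises.
    return 90 * (1 - 2.718 ** (-limit_watts / 80))

for limit in (115, 130, 145, 160):
    set_power_limit(limit)
    fps = run_benchmark(limit)
    print(f"{limit} W -> {fps:.1f} FPS, {fps / limit:.2f} FPS/W")
```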
 
Joined
Jun 6, 2022
Messages
622 (0.64/day)
I have yet to meet a person rational enough to buy the $400 disgrace that is the 4060 Ti 8GB: a 128-bit memory bus half as wide as the last-gen 3060 Ti's, an embarrassing 5% faster at 1440p, merely a few percent faster than the 6700 XT while being 20% more expensive, with gimped silicon, a depressing product choking in a growing number of games due to low VRAM. Zero value for consumers. Even more so for the $500 abomination that is the 4060 Ti 16GB, which looks laughable when tested against the similarly priced, 4K-entry 6800 XT with the same amount of VRAM. That's all I will say about it, and Hardware Unboxed clearly shows what reality looks like, whether you like it or not.

It's pointless to imagine the 4090 and 7900 XTX at the same price unless we also imagine the same performance. If so, I'd still buy the 7900 XTX for its modern DisplayPort 2.1 at 54 Gbps, whereas the 4090 offers me seven-year-old DP 1.4 with roughly half the bandwidth. In reality, though, the 4090 is 25% faster in 4K but 60% more expensive. Not for me. You need to realize that different buyers favour different features. I am aware of what is appealing about the 4090, but it's not for me. Most people enjoy drinking milk. I don't.
Loosely translated: Miss Universe has a small wart on her shoulder, yuck. Her rival weighs 150 kilograms, doesn't cook, is hysterical and violent, but she has no wart, so I like her.
You are an AMD fan, a radical one, but try not to be embarrassing. The 20% uplift in 4K means a lot, the rest of the technologies make AMD's graphics cards look like juniors, and you're hung up on a display connector? LOL!!!!!
Even my old 3070 Ti outperforms the 7900 XTX in rendering in every program that uses OptiX.

P.S. We were talking about the 7600 versus the 4060. I don't know why you throw the 4060 Ti into the same pot. It is similar to the 7600 only in power consumption.

I suspect the 7600 is way further up its efficiency curve by default than the 4060 is. That is, you could achieve similar results by power limiting / undervolting the 7600, or by overclocking the 4060 to the moon.
And nVidia's cards are also pushed up the curve by default. I stabilized my old GTX 1080 at 80 W instead of 120 W, and the RTX 3070 Ti achieves the same performance at 230 W (default 290 W).

Another example:
undervolt.jpg
 
Last edited:
Joined
Jul 15, 2020
Messages
1,035 (0.62/day)
System Name Dirt Sheep | Silent Sheep
Processor i5-2400 | 13900K (-0.02mV offset)
Motherboard Asus P8H67-M LE | Gigabyte AERO Z690-G, bios F29e Intel baseline
Cooling Scythe Katana Type 1 | Noctua NH-U12A chromax.black
Memory G-skill 2*8GB DDR3 | Corsair Vengeance 4*32GB DDR5 5200Mhz C40 @4000MHz
Video Card(s) Gigabyte 970GTX Mini | NV 1080TI FE (cap at 50%, 800mV)
Storage 2*SN850 1TB, 230S 4TB, 840EVO 128GB, WD green 2TB HDD, IronWolf 6TB, 2*HC550 18TB in RAID1
Display(s) LG 21` FHD W2261VP | Lenovo 27` 4K Qreator 27
Case Thermaltake V3 Black|Define 7 Solid, stock 3*14 fans+ 2*12 front&buttom+ out 1*8 (on expansion slot)
Audio Device(s) Beyerdynamic DT 990 (or the screen speakers when I'm too lazy)
Power Supply Enermax Pro82+ 525W | Corsair RM650x (2021)
Mouse Logitech Master 3
Keyboard Roccat Isku FX
VR HMD Nop.
Software WIN 10 | WIN 11
Benchmark Scores CB23 SC: i5-2400=641 | i9-13900k=2325-2281 MC: i5-2400=i9 13900k SC | i9-13900k=37240-35500
AMD pops in and out of the enthusiast segment as a matter of priority, profitability and feasibility per generation/architecture.
Nothing new.
A good strategy, imo, as long as you are the weaker player (by a large distance).
Chasing the enthusiast segment head on, at the expense of the other segments, would be a poor choice.

As for this current rumour, well, rumours, leaks and other dripping matters... paint it blue and throw it in the sea.
 
Joined
May 15, 2020
Messages
697 (0.40/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
Joined
Jan 14, 2019
Messages
13,940 (6.31/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
And nVidia's cards are also pushed up the curve by default. I stabilized my old GTX 1080 at 80 W instead of 120 W, and the RTX 3070 Ti achieves the same performance at 230 W (default 290 W).

Another example:
View attachment 308807
Yeah, but how far up the curve they are is the question here. My favourite example is the 6500 XT, which consumes double the power of the 6400 while performing only maybe 30% better.
 
Joined
Aug 10, 2023
Messages
341 (0.63/day)
nVidia offers Overwatch 2 Invasion for 4060, AMD goes with Starfield.
No, AMD gives you a full game while Nvidia gives you some more or less worthless cosmetic items in a game (and a few story missions *edit), a huge difference. That being said, OW2 is a free game; if you don't play it, your gain is zero.
 
Last edited:
Joined
Jun 6, 2022
Messages
622 (0.64/day)
Gentlemen, let's conclude.
In the TPU survey, AMD is in 2nd place at every level, and the RTX 4000 series is gaining ground compared to the RX 7000 series.
tpu.jpg


Globally, the situation is dramatic for AMD, with nVidia capturing 82% of the discrete video card market at the end of 2022. It is probably even more dramatic now, because AMD cannot keep up with nVidia's releases.
marketshare q4 2022.png



There are figures, there is real data, and the discussions here are just discussions for their own sake. If a wart on nVidia's graphics cards bothers you, buy a slower, less power-efficient AMD video card. They don't have nVidia's level of polish, they don't have a good grasp of modern technologies beyond rasterization, some even have driver problems, but they are cheaper. Everyone chooses what they want.

Strictly on the 7600 versus the 4060, I would not give up the technologies nVidia offers for 25-50 euros. My unwavering opinion.
I'm not buying anything, because the choice I made in 2021 was good and I can quietly wait for the next generation. No game forces me to give up ray tracing for a satisfactory framerate, nor am I forced to use DLSS. Not yet.
 
Joined
Apr 12, 2013
Messages
7,625 (1.77/day)
Globally? The same dGPU market that's seeing an epic "meltdown" of its own, with Nvidia rumored to be restricting the supply of its cards for this segment :nutkick:

You sure that's a good example?
 
Joined
Nov 26, 2021
Messages
1,769 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Maybe not. You don't give an example of an "OC" versus non-OC specimen. The performance/consumption ratio is clearly in nVidia's favour at every level.
In gaming, the difference is 34 W; at maximum, it is 41 W.
Quite embarrassing, because the RX 7600 has the power consumption of a 4060 Ti yet only matches the non-Ti 4060, and only in rasterization.

View attachment 308803
Unlike AMD, Nvidia seems to limit Furmark. Even that card has almost the same power draw as the ASUS one.

1691943147490.png
 
Joined
Aug 10, 2023
Messages
341 (0.63/day)
Gentlemen, let's conclude.
In the TPU survey, AMD is in 2nd place at every level, and the RTX 4000 series is gaining ground compared to the RX 7000 series.
View attachment 308827

Globally, the situation is dramatic for AMD, with nVidia capturing 82% of the discrete video card market at the end of 2022. It is probably even more dramatic now, because AMD cannot keep up with nVidia's releases.
View attachment 308829


There are figures, there is real data, and the discussions here are just discussions for their own sake. If a wart on nVidia's graphics cards bothers you, buy a slower, less power-efficient AMD video card. They don't have nVidia's level of polish, they don't have a good grasp of modern technologies beyond rasterization, some even have driver problems, but they are cheaper. Everyone chooses what they want.

Strictly on the 7600 versus the 4060, I would not give up the technologies nVidia offers for 25-50 euros. My unwavering opinion.
I'm not buying anything, because the choice I made in 2021 was good and I can quietly wait for the next generation. No game forces me to give up ray tracing for a satisfactory framerate, nor am I forced to use DLSS. Not yet.
Either bait, so that we comment on this gibberish, or the worst take on technology I have ever read in my life.
 
Joined
Aug 25, 2021
Messages
1,206 (0.96/day)
You are an AMD fan, a radical one, but try not to be embarrassing. The 20% uplift in 4K means a lot, the rest of the technologies make AMD's graphics cards look like juniors, and you're hung up on a display connector? LOL!!!!!
Cheap shot. It shows your class and your superficial impulse to make immature judgements based on crumbs of information. I am such a "radical fan": a few years ago, when I posted positive things about one of my systems, a Z390 with an i7 and a Pascal card, I was branded an "Intel and Nvidia fan". It seems one is always a "fan", no matter what products one uses. And a "radical" one too, no doubt, because one dares to explain his reasoning.

You'd better deal with the tribalism in your own mind before you project it onto others. And above all, ask members questions before you post nonsense judgements that sound like uninformed, meaningless labels.
Even my old 3070 Ti outperforms the 7900 XTX in rendering in every program that uses OptiX.
Nvidia cards are great in OptiX, no doubt, but it's a niche workload, just as 95% of GPU owners will never encode a single media file. Media encoders/decoders are pretty similar between new cards from all three vendors, especially for H.265 and AV1. There is still a hiccup with H.264 on AMD, which is being worked on, but other than that it's fine across various codecs and bitrates. See Jarred Walton's extensive testing of media engines on twelve Intel, AMD and Nvidia cards, published on Tom's Hardware. There's an informative table comparing all the cards and one i9 CPU.
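For reference, driving a hardware encode from a script is trivial these days - a minimal Python sketch below, assuming ffmpeg is installed and built with the relevant hardware encoders (check `ffmpeg -encoders` for the exact names your build exposes):

```python
# Minimal sketch of driving a hardware H.265 encode through ffmpeg from Python.
# Assumes ffmpeg is on PATH and built with the chosen hardware encoder.
import subprocess

def hw_encode(src: str, dst: str, encoder: str = "hevc_nvenc", bitrate: str = "8M") -> None:
    cmd = [
        "ffmpeg", "-y",
        "-i", src,
        "-c:v", encoder,   # e.g. "hevc_nvenc" (Nvidia) or "hevc_amf" (AMD), if your build has them
        "-b:v", bitrate,
        "-c:a", "copy",    # pass the audio through untouched
        dst,
    ]
    subprocess.run(cmd, check=True)

# Example (hypothetical file names):
# hw_encode("input.mp4", "output_hevc.mp4", encoder="hevc_amf")
```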

RDNA3 cards have already improved greatly in HIP workloads (see Phoronix's extensive testing on this), and AMD engineers are working on the last bit of HIP RT for Radeon cards, which should release in a month or so, alongside or near the ROCm package. It's all open source, so it takes longer, but once finished there will be wide, free access to many professional and virtual GPU tools regardless of GPU brand, which is great for consumers, especially for not-so-well-off creators who cannot afford packages behind Nvidia's paywall.
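As a small example of that access from the open-source side, here's a quick check - assuming the ROCm build of PyTorch is installed - of whether a Radeon card is visible through HIP:

```python
# Quick check of whether a ROCm (HIP) build of PyTorch sees the Radeon GPU.
# On ROCm builds the CUDA API names are reused, so torch.cuda.* is the right call despite the name.
import torch

print("HIP/ROCm build:", torch.version.hip is not None)
print("GPU visible:   ", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:        ", torch.cuda.get_device_name(0))
```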

Above all, many performance gaps will shrink significantly, and in some workloads the cards will trade blows. At the end of the day, this area has been significantly accelerated by AMD, the effort is visible, and a growing number of creators and professionals do not want to accept CUDA-only dominance. At some point, in a few years, Nvidia is predicted to ditch many paid subscriptions once they see that people are able to use open-source applications. Ironically, they might be tempted to suppress some open-source applications on GeForce cards, which would be absurd. There is no sane person on this planet who would not support accelerated adoption of open source. The ever-growing Linux community is especially enthusiastic about it and cannot stand paid monopolies.
P.S. We were talking about the 7600 versus the 4060. I don't know why you throw the 4060 Ti into the same pot. It is similar to the 7600 only in power consumption.
It was you who mentioned the 6700 XT, so I went to the available test videos from Steve at HUB. Both the 6700 XT and 6800 XT in his tests show how almost three-year-old RDNA2 cards expose the absurd price/performance of both the 4060 Ti 8GB and 16GB models. Watch them; the videos are eye-opening. The 4060 would be good at $250 + game ($40 saved) and the 7600 is already good at $230 + game ($70 saved). Those cards are fine; the prices are wrong, with or without bundled games. New low-tier class-60 cards with 8GB cannot cost more than $230-250 in 2023, as their generational performance uplift is rather miserable, especially the 7600's. By the way, Overwatch 2 is the worst user-rated game on Steam, judging by its damning reviews, so this $40 game bundle with 40-series cards sounds like fresh juice served with a rotten egg for breakfast. Laughable.

Gentlemen, let's conclude.
In the TPU survey, AMD is in 2nd place at every level, and the RTX 4000 series is gaining ground compared to the RX 7000 series.
Why such a nonsense truism, dude? Did you re-discover America today? Are you enjoying bathing in Columbus's glory? Is your ego happy now? So many questions.
The TPU survey over-represents AMD cards compared to the global GPU market share. If you add up the numbers, AMD has 34%; in reality it's less. Brain hurts. Get over yourself and tell us whether Navi 41 is cancelled or not. Nvidia is not the main topic of this article and you are spamming the thread.
Globally, the situation is dramatic for AMD, with nVidia capturing 82% of the discrete video card market at the end of 2022. It is probably even more dramatic now, because AMD cannot keep up with nVidia's releases.
Why such drama, dude? Are you informed? That survey by Jon Peddie was officially called out for being deceptive regarding the "9%" for Intel cards; it turned out that tens of thousands of rogue cards were wrongly counted in the client segment. Plus, you need to know his methodology, and that he does not talk to and gather data from all OEMs, so he misses a significant portion of the market.

You should know more about AMD's discrete GPU business in a wider context before you post more brain-dead comments. Their client GPUs are less than 9% of total revenue and a small part of an otherwise important product portfolio. They are not even aiming to increase GPU market share to 30% or above, as they are heavily focused on other, more lucrative segments such as CPUs, AI APUs/GPUs, FPGAs and consoles; those bring over 90% of revenue. Once you understand this, it is easier to see that there is no "drama" for AMD. Grow out of that brain-dead narrative now. You only have one chance to be rational and to reason logically with adults here.

AMD has grown in the server CPU market from almost 0% in 2017 to a projected 30% by the end of this year. It has been a monumental effort and market-share growth, unheard of in server, and more difficult than any attempt to change the GPU ratio with Nvidia, as Intel had an almost exclusive foothold globally that was very difficult to tackle given its entrenched deals with enterprises. But the investment in the Zen architecture, relentless drive and R&D paid off. The entire company was invested in it.

You can't have two or three equally big pushes in different segments at the same time. AMD was a small-revenue company pre-Covid, just below $7 billion annually. They almost quadrupled this in a few years, thanks mostly to Zen, consoles and now FPGAs and embedded too. Once they bring themselves into a roughly equal position with Intel in server, in 2-3 years, they might be able to refocus more effort and R&D into GPUs and give them a more serious push, depending on how AI pans out.

The key takeaway is that AMD neither craves nor needs more market share in client GPUs in this period, as their priority is elsewhere. They will stay present with GPUs, hovering around 15-25%, but that's it. So it's completely stupid to post any narrative built on "AMD GPU drama". Nonsense. If you want to be Nvidia's press secretary, endlessly repeating how much of the market they have captured, go find a job in Nvidia's marketing department, and good luck with a good salary.

There are figures, there is real data, and the discussions here are just discussions for their own sake. If a wart on nVidia's graphics cards bothers you, buy a slower, less power-efficient AMD video card. They don't have nVidia's level of polish, they don't have a good grasp of modern technologies beyond rasterization, some even have driver problems, but they are cheaper. Everyone chooses what they want.
I explained to you the figures from Jon Peddie. I explained to you the situation with media engines and with constantly improving, open-source modern technologies. The drivers "issue" is nonsense, as the latest research suggests that both vendors' cards have roughly equal numbers of reported issues. In the last three years I've had three or four game crashes, mostly because I was pushing the card really hard. Adrenalin software has never been as stable and pleasant to interact with, whereas Nvidia's GUI on my laptop feels dated, from the Stone Age; they could finally modernise it too. Actually, when was the last time you used an AMD card and interacted with its software? Have you noticed the changes?
Strictly on the 7600 versus the 4060, I would not give up the technologies nVidia offers for 25-50 euros. My unwavering opinion.
I'm not buying anything, because the choice I made in 2021 was good and I can quietly wait for the next generation. No game forces me to give up ray tracing for a satisfactory framerate, nor am I forced to use DLSS. Not yet.
Technologies are useless if the generational performance uplift is dismal. RT is completely useless on lower-tier cards: it kills performance, and the card chokes on 8GB of VRAM in an increasing number of games. So I'd buy the 7600 for $230 with a game ($70 saved) for my cousin's ITX build.

The 3070 and 3070 Ti are increasingly choked by 8GB of VRAM, and many Nvidia owners complain about it. They have realised that having the bare minimum of VRAM does not make their cards future-proof for more demanding games. It beggars belief how Nvidia has been able to get away with the low VRAM on most of their Ampere cards. Clearly, the marketing that dazzled people with DLSS and RT worked to hide and mask the native hardware deficiencies. See the testing of specific games by Hardware Unboxed; it's clearly visible in gameplay. RT performance in several games also makes them an unplayable mess, with stuttering and textures that fail to load. So it depends on what you play; most stuff will run well with reasonable settings, though. Each card has its own limits.
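A rough, back-of-the-envelope illustration of why texture-heavy settings overflow 8 GB - all counts and sizes below are invented for the example, not taken from any real game:

```python
# Back-of-the-envelope VRAM estimate for a texture-heavy scene.
# Every count and size here is an illustrative assumption, not measured data.

def texture_mib(width: int, height: int, bytes_per_texel: float) -> float:
    """Size of one texture in MiB, including a ~33% allowance for mipmaps."""
    return width * height * bytes_per_texel * 1.33 / (1024 ** 2)

# Say a scene streams 1,500 2K material textures, block-compressed at ~1 byte/texel
textures = 1500 * texture_mib(2048, 2048, 1.0)

# Plus render targets for 4K (G-buffer, depth, post-processing chains, etc.)
render_targets = 1.5 * 1024   # ~1.5 GiB, a loose assumption

# Plus geometry, shaders, driver/OS overhead
other = 1.5 * 1024            # another loose assumption

total_gib = (textures + render_targets + other) / 1024
print(f"Estimated VRAM footprint: ~{total_gib:.1f} GiB")  # comfortably past 8 GiB
```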

But, let's leave all that and discuss Navi 41 in greater detail. Will it be released or not? What are the architectural challenges and where is the rumour rooted?
 
Last edited:
Joined
Aug 10, 2023
Messages
341 (0.63/day)
Kudos for putting the guy in his place; one should copy/paste it for when (not if) it happens next. Why are people so biased when it comes to tech? Try to remember who you were as a kid. I don't think anyone wanted to grow up to be a bitter hardcore fan of one brand who posts toxic stuff at other people to rile them up, make them angry, or get their "revenge". It's an endless cycle. Stand above it, be better.

That being said, back to topic: I'm still not 100% sold that AMD will skip the enthusiast (high-margin) market for RDNA 4. Rumors are just rumors, and rumors can also change opinions (AMD's included).
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.55/day)
Location
Ex-usa | slava the trolls
You should know more about AMD's discrete GPU business in a wider context before you post more brain-dead comments. Their client GPUs are less than 9% of total revenue and a small part of an otherwise important product portfolio. They are not even aiming to increase GPU market share to 30% or above, as they are heavily focused on other, more lucrative segments such as CPUs, AI APUs/GPUs, FPGAs and consoles; those bring over 90% of revenue. Once you understand this, it is easier to see that there is no "drama" for AMD. Grow out of that brain-dead narrative now. You only have one chance to be rational and to reason logically with adults here.

But nvidia's market cap is 1 trillion, while AMD's is much smaller - only ~173 billion.
I think this is severe mismanagement on AMD's side. They should stop developing CPUs and instead focus 100% on GPUs - there is much more money in it.
 
Joined
Mar 10, 2010
Messages
11,880 (2.18/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Gskill Trident Z 3900cas18 32Gb in four sticks./16Gb/16GB
Video Card(s) Asus tuf RX7900XT /Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores laptop Timespy 6506
But nvidia's market cap is 1 trillion, while AMD's is much smaller - only ~173 billion.
I think this is severe mismanagement on AMD's side. They should stop developing CPUs and instead focus 100% on GPUs - there is much more money in it.
Thank the heavens you're not in management at AMD.

I disagree with you and think you're exactly 100% wrong, IMHO.
 
Joined
Nov 26, 2021
Messages
1,769 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Joined
Aug 10, 2023
Messages
341 (0.63/day)
But nvidia's market cap is 1 trillion, while AMD's is much smaller - only ~173 billion.
I think this is severe mismanagement on AMD's side. They should stop developing CPUs and instead focus 100% on GPUs - there is much more money in it.
No, but they could sell ATI or RTG so that it can independently become strong again and, e.g., get more wafers and thus produce bigger/stronger GPUs. The interest in seeing GPUs succeed is simply higher when the company is solely a GPU company, very simple. That alone would help.

But they 100% won't do that, because they use RTG products for the data center (Instinct), which are high-margin and successful; they use it for consoles (not high-margin, but very successful); and they use it for APUs, which are rather successful and not bad margin-wise. And then, of course, they use it to build gaming graphics cards that people can buy and put into their PCs - or buy a laptop with a Radeon GPU integrated. They have no upside to selling it, in other words.

Oh, and that's the other problem: they could only sell it to someone like Intel, a competitor. RTG cannot suddenly become an independent company like ATI again. That ship has long sailed.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.55/day)
Location
Ex-usa | slava the trolls
No, but they could sell ATI or RTG so that it can independently become strong again and, e.g., get more wafers and thus produce bigger/stronger GPUs. The interest in seeing GPUs succeed is simply higher when the company is solely a GPU company, very simple. That alone would help.

But they 100% won't do that, because they use RTG products for the data center (Instinct), which are high-margin and successful; they use it for consoles (not high-margin, but very successful); and they use it for APUs, which are rather successful and not bad margin-wise. And then, of course, they use it to build gaming graphics cards that people can buy and put into their PCs - or buy a laptop with a Radeon GPU integrated. They have no upside to selling it, in other words.

Oh, and that's the other problem: they could only sell it to someone like Intel, a competitor. RTG cannot suddenly become an independent company like ATI again. That ship has long sailed.

RTG needs more autonomy inside AMD. AMD should be two companies under one roof - Ryzen Technology Group and Radeon Technology Group. That way, both parts would have equal opportunities and an equal start. Today, RTG is left on autopilot, and of course the only way that leads is a downward spiral...
 
Joined
Aug 10, 2023
Messages
341 (0.63/day)
RTG needs more autonomy inside AMD. AMD should be two companies under one roof - Ryzen Technology Group and Radeon Technology Group. That way, both parts would have equal opportunities and an equal start. Today, RTG is left on autopilot, and of course the only way that leads is a downward spiral...
That already happened when Radeon was split off into RTG; you literally described what happened years ago. However, the issue is that they need more resources to succeed (e.g. wafers), and those also depend on TSMC wafer capacity and pricing; it's not that easy. AMD of course wants Radeon and RTG to succeed, but with only one company being really good at building chips, that being TSMC, there is simply not enough to saturate demand. Hypothetically, if Samsung were on the same level as TSMC, they could build the GPUs there, skip MCM, and automatically get more performance, higher clocks and better latency from a monolithic design. Next, they could also scale up to a 600 mm² monolithic chip, similar to AD102. Very possibly it would then compete well against the 4090, and not only against the much weaker 4080. There are too many sacrifices because AMD doesn't get enough wafers from TSMC. They need a heck of a lot for Ryzen + EPYC + Instinct, and only the rest is left over for Radeon, as it's the lowest-margin product in this line-up.

Edit: there is nothing "left over"; it's just some minimum amount of wafers allotted to Radeon so they can barely succeed. In a perfect world, Radeon would get many more wafers, and even Ryzen / EPYC would. AMD cannot saturate any of these markets right now, maybe aside from consoles.

Intel's market share in server and consumer is also higher than AMD's because they have a lot of wafers, since they produce them themselves - not because their products are any better; they aren't, aside from some niches.
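To put the wafer argument in numbers, here's a rough Python illustration of dies per wafer and yield for different die sizes - the wafer size, defect density and die areas are assumptions for the example only:

```python
# Rough dies-per-wafer and yield illustration for the monolithic-vs-chiplet point above.
# Wafer diameter, die areas and defect density are illustrative assumptions, not foundry data.
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Common approximation: gross wafer area over die area, minus an edge-loss term."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r * r / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def poisson_yield(die_area_mm2: float, defects_per_cm2: float = 0.08) -> float:
    """Simple Poisson yield model: bigger dies catch more defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

for area in (200, 300, 600):   # mm²: small chiplet, mid-size die, AD102-class monolith
    n = dies_per_wafer(area)
    y = poisson_yield(area)
    print(f"{area:>3} mm²: ~{n} candidate dies/wafer, ~{n * y:.0f} good dies (yield {y:.0%})")
```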
 
Last edited:

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.55/day)
Location
Ex-usa | slava the trolls
That already happened when Radeon was split off into RTG; you literally described what happened years ago. However, the issue is that they need more resources to succeed (e.g. wafers), and those also depend on TSMC wafer capacity and pricing; it's not that easy. AMD of course wants Radeon and RTG to succeed, but with only one company being really good at building chips, that being TSMC, there is simply not enough to saturate demand. Hypothetically, if Samsung were on the same level as TSMC, they could build the GPUs there, skip MCM, and automatically get more performance, higher clocks and better latency from a monolithic design. Next, they could also scale up to a 600 mm² monolithic chip, similar to AD102. Very possibly it would then compete well against the 4090, and not only against the much weaker 4080. There are too many sacrifices because AMD doesn't get enough wafers from TSMC. They need a heck of a lot for Ryzen + EPYC + Instinct, and only the rest is left over for Radeon, as it's the lowest-margin product in this line-up.

Why don't they help GLOBALFOUNDRIES buy new machines from ASML and catch up with modern nodes?
It is a critical mistake to rely on only one supplier - in this case, TSMC alone. There must always be diversification.
Ideally there would actually be four (or more) manufacturers:
1. GLOBALFOUNDRIES
2. Intel
3. Samsung
4. TSMC
 
Joined
Aug 10, 2023
Messages
341 (0.63/day)
Why don't they help GLOBALFOUNDRIES buy new machines from ASML and catch up with modern nodes?
It is a critical mistake to rely on only one supplier - in this case, TSMC alone. There must always be diversification.
Ideally there would actually be four (or more) manufacturers:
1. GLOBALFOUNDRIES
2. Intel
3. Samsung
4. TSMC
1) The extreme pricing of the machines - AMD cannot invest in anything other than its own products.
2) It's not their business to invest on behalf of other companies.
3) These companies also need RESEARCH to be as good as TSMC. That means years of research and recruiting personnel who can compete with the excellent people working at TSMC.

AMD cannot finance that or do the job for others. What AMD can do is work with others on individual products, like HBM in the past, or as Nvidia did with GDDR5X and GDDR6X with Micron.

The only (fabless) company I know of that has directly subsidized nodes is Apple. They have the money to do that; AMD does not. What they get in return is first access to a node for their products (like phones), e.g. 5 nm, and now 3 nm for the upcoming Apple products.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.55/day)
Location
Ex-usa | slava the trolls
1) The extreme pricing of the machines - AMD cannot invest in anything other than its own products.

And yet they acquired Xilinx for $35 billion.

2) It's not their business to invest on behalf of other companies.

I thought GLOBALFOUNDRIES and AMD shared the same owners - sister companies under one roof.
Also, don't forget that GLOBALFOUNDRIES is actually AMD's former manufacturing division.
 