Sunday, August 6th 2023
AMD Retreating from Enthusiast Graphics Segment with RDNA4?
AMD is rumored to be withdrawing from the enthusiast graphics segment with its next RDNA4 graphics architecture. This means there won't be a successor to its "Navi 31" silicon that competes at the high end with NVIDIA, but rather one that competes in the performance segment and below. It's possible AMD can't justify the cost of developing high-end GPUs, as they don't sell in sufficient volumes over the product lifecycle. The company's "Navi 21" GPU benefited from the crypto-currency mining swell, but just like NVIDIA, the company isn't able to move enough GPUs at the high end.
With RDNA4, the company will focus on specific segments of the market that sell the most, which would be the x700-series and below. This generation will be essentially similar to the RX 5000 series powered by RDNA1, which did enough to stir things up in NVIDIA's lineup, and trigger the introduction of the RTX 20 SUPER series. The next generation could see RDNA4 square off against NVIDIA's next-generation, and hopefully, Intel's Arc "Battlemage" family.
Source: VideoCardz
363 Comments on AMD Retreating from Enthusiast Graphics Segment with RDNA4?
So saying that the 6600 won't have those issues is nuts.
The RX 6600 is a low-end, poorly performing card for today's games; it's only good for yesterday's games, which require less VRAM, maybe 4 or 6 GB.
The hardware just isn't up to snuff: you don't have access to matrix multiplication units, and the hardware video encoder, while no longer completely awful, doesn't support full chroma or 4:2:2 video, which hinders its usefulness for video production since the GPU is incompatible with the high-quality codecs used by modern cameras. You lose out on pretty much all of the niceties and eye candy that Nvidia has developed and maintained over the years, and relegate yourself to last-gen ray tracing performance... and if we go by MSRP, congrats, you got 200 bucks off your 20-24 GB GPU that can't run anything that'd make that video memory worthwhile. In the meantime, Nvidia has figured out how to do on-the-fly shader execution reordering, and even has an early implementation of motion interpolation, which, while it increases latency, can be countered somewhat with Nvidia Reflex - well, I promise I won't tell anyone about the mess that Radeon's Anti-Lag thing is. Oops, I guess I did :oops:
Then there's the other thing: you got a 19% fps uplift since launch, and that's pretty great! The problem is that Nvidia is also constantly improving its own software's performance. Reviews are best referenced when they're close to the hardware's launch, or when you can find a newer review done with newer software - for example, I use W1zz's Taichi White XTX review as my reference for that reason.
At the end of the day, what matters is that you and you alone are happy, but if you carefully evaluate my reasoning, you'll see what I get for the difference in MSRP: those $300 that separate a 7900 XT from a 4080, once you account for the performance gains, the far richer feature set, and the constant stream of certified drivers plus the Studio drivers for content creation that are made available to all RTX GPUs, add up enough for me, personally, to lean heavily towards Nvidia's offering. Strictly as a Windows user, anyway... Linux is the green team's Achilles' heel, mostly because everything that makes a GeForce deliver the experience it can is closed source.
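For what it's worth, here's a minimal cost-per-performance sketch of that MSRP argument in Python. The $899 and $1,199 launch MSRPs are the real ones; the relative-performance figure is just a placeholder to show the math, so plug in whatever numbers the review you trust reports.

```python
# Hypothetical value comparison: dollars per unit of relative performance.
# MSRPs are launch prices; relative_perf values are placeholders, NOT measured data.
cards = {
    "RX 7900 XT": {"msrp": 899,  "relative_perf": 1.00},   # baseline (placeholder)
    "RTX 4080":   {"msrp": 1199, "relative_perf": 1.15},   # assumed ~15% faster, placeholder
}

for name, c in cards.items():
    dollars_per_perf = c["msrp"] / c["relative_perf"]
    print(f"{name}: ${dollars_per_perf:.0f} per unit of relative performance")
```

Whether the feature-set premium is worth the extra dollars per frame is exactly the judgment call being argued here.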
This dude ran a 2023 AAA test battery on the vanilla 1070, which has slower 8 Gbps GDDR5 (cutting memory bandwidth from 320 to 256 GB/s) and 25% of GP104's SMs disabled (15 of 20 units present). The newest games, with their more sophisticated rendering techniques, only drop the card to a "passable" rank when you get to Cyberpunk 2077. Warzone, Days Gone, God of War, Apex... they're all highly playable even on this gimped card from 2016.
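For reference, a quick sketch of the bandwidth arithmetic behind those figures, assuming the 256-bit bus the GTX 1070 actually has; the 320 GB/s number corresponds to 10 Gbps memory on the same bus width.

```python
# Peak memory bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte
def mem_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(mem_bandwidth_gbs(8.0, 256))   # 8 Gbps GDDR5 on the vanilla 1070 -> 256.0 GB/s
print(mem_bandwidth_gbs(10.0, 256))  # 10 Gbps memory on the same bus   -> 320.0 GB/s
```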
Similarly, I haven't run into any situation where my 6750 XT can't deliver a stable 60 FPS while using only half of its power limit. Yes, I have other 8 GB (and even 4 GB) GPUs, and they're fine.
Plus, Nvidia's $400 4060 Ti is really a $200-250 card; just look at the PCB pictures. AMD is probably no better.
nVidia is even working on neural textures to reduce VRAM usage / improve texture quality.
AMD really needs to come up with something spectacular, if they want more market share.
Example: you're living in Central / Western Europe, you earn 2,500 Euros a month, and rent + food gets you close to 1,500 Euros. You still have 1,000 Euros for other spending. You can easily save up for a 4070 in 2 months... It's not the end of the world, and nVidia KNOWS that! That's why nVidia has more market share: people can still afford their products.
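To put rough numbers on that, here's a tiny sanity check of the savings math; the card price and the share of the leftover budget actually set aside are assumptions, not quotes.

```python
import math

monthly_income = 2500                 # EUR
rent_and_food  = 1500                 # EUR
discretionary  = monthly_income - rent_and_food   # 1000 EUR left each month

rtx_4070_price = 660                  # EUR, assumed street price; varies by region
savings_rate   = 0.40                 # assume only 40% of the leftover budget goes to the GPU

months_to_save = math.ceil(rtx_4070_price / (discretionary * savings_rate))
print(months_to_save)                 # 2 -> matches the "in 2 months" claim above
```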
It's Apple reloaded: "Apple is expensive!" but I see that 1 out of 4 phones is an iPhone, where I live (even older generations).
It is what it is...
People want "the best of the best" of everything: phone, car, wife / husband. :laugh: But you should be aware that you can't always have the best of the best. Can't afford the RTX 4090? Go for the RTX 4080 instead. Never understood the need to have the highest level of performance - you rarely need it. I know a guy that only plays CS: GO and LoL on a RTX 4070... :wtf: Hell, even my RTX 4060 is overkill for WoW (got it mostly for Warcraft 3 Reforged, Diablo 4 and God of War Ragnarök).Maybe it's time to read more and play less... Just saying...
Before that I had a 1070 with an 8th Gen Core i7; that machine wasn't able to play LoL at 4K without drops. I tried a 3060 with 12 GB of VRAM and it was way better, but still not perfect, so anything higher than 1080p will require a good video card.
So many so-called technology enthusiasts simply don't understand that rasterisation is dead. The fact that games, even new ones, still use it is entirely down to the fact that the console GPUs are simply not capable of acceptable RT performance. Assuming AMD manages to mostly address that shortcoming in the next console generation (2027-2028 timeline), we will then finally see the end of rasterisation as the primary graphics rendering technology.
Perhaps you could ask Nvidia to sell you a 4080 for $850, and then ask AMD to sell the 7900 XTX for $799.
You didn't need the best.
Not everyone buys the best.
Nvidia 4060 isn't the best.
Life goes on; still no surprise.
You prove people are fickle and buy favourite names.
@Assimilator RT full path tracing being THE way is years off IMHO, and even then, indie raster games will still happen. So I disagree.
A 5080 for $1,200 can pass my test only if it has: 24 GB VRAM, a 50% uplift in 4K over the 4080, and DisplayPort 2.1 ports (including one USB-C).
RX 7900 XTX should be 7900 XT
RX 7900 XT should be 7800 XT
RX 7600 should be 7400 XT
RTX 4090 should be RTX 4080 Ti
RTX 4080 should be RTX 4070
RTX 4070 Ti should be RTX 4060 Ti
RTX 4070 should be RTX 4060
RTX 4060 Ti should be RTX 4050 Ti
RTX 4060 should be RTX 4050
Is the 4060 Ti capable of "acceptable RT performance" for $500? No. Even the 4070 chokes with RT in more demanding titles and becomes a stuttering mess. So RT performance on mainstream GPUs is still in its infancy. Raster is dead - long live raster.