Cheap try. It shows your class and a superficial impulse to make immature judgements from crumbs of information. I am such a "radical fan". A few years ago, when I was posting positive things about one of my systems, a Z390 build with an i7 and a Pascal card, I was branded an "Intel and Nvidia fan". It seems one is always a "fan", no matter what products one uses. And a "radical" one too, no doubt, because one dares to explain his reasoning.
You'd better deal with the tribalism in your own mind before you project it onto others. And above all, ask members questions before you post nonsense judgements that sound like uninformed, meaningless labels.
Nvidia cards are great in OptiX, no doubt, but it's a niche workload, just as 95% of GPU owners will never encode a single media file. Media encoders/decoders are pretty similar between the new cards from all three vendors, especially for H.265 and AV1. There is still a hiccup with H.264 on AMD, which is being worked on, but other than that, they are fine across various codecs and variable bitrates. See Jarred Walton's extensive testing of media engines on twelve Intel, AMD and Nvidia cards, published on Tom's Hardware. It includes an informative table comparing all the cards and one i9 CPU.
RDNA3 cards have already improved greatly in HIP workloads (see Phoronix's extensive testing on this), and AMD engineers are working on the last bit of HIP RT for Radeon cards, which should release in a month or so, alongside or shortly after the ROCm package. It's all open source, so it takes longer, but once finished, there will be wide, free access to many professional and virtual GPU tools regardless of GPU brand, which is great for consumers, especially for not-so-well-off creators who cannot afford the packages behind Nvidia's paywall.
Above all, many performance gaps will shrink significantly, and in some workloads the cards will trade blows. At the end of the day, this area was significantly accelerated by AMD, the effort is visible, and a growing number of creators and professionals refuse to accept CUDA-only dominance. At some point, in a few years, Nvidia is predicted to ditch many paid subscriptions once they see that people can use open source applications. Ironically, they might be tempted to suppress some open source applications on GeForce cards, which would be absurd. No sane person on this planet would oppose the accelerated adoption of open source. The ever-growing Linux community is especially enthusiastic about it and cannot stand paid monopolies.
It was you who mentioned the 6700 XT, so I went to the available testing videos from Steve at HUB. Both the 6700 XT and 6800 XT in his tests show how almost-three-year-old RDNA2 cards expose the absurd price/performance of both the 8GB and 16GB 4060 Ti models. Watch them; the videos are eye-opening. The 4060 would be good at $250 plus the game ($40 saved), and the 7600 is already good at $230 plus the game ($70 saved). Those cards are fine; the prices are wrong, with or without bundled games. New low-tier class-60 cards with 8GB cannot cost more than $230-250 in 2023, as their generational performance uplift is rather miserable, especially the 7600's. By the way, Overwatch 2 is the worst-reviewed game on Steam, voted down in damning reviews, so this $40 game bundle with 40-series cards sounds like fresh juice with a rotten egg for breakfast. Laughable.
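For clarity, the bundle math above works out like this. A minimal sketch, using the sticker prices and game values from my post, and assuming the buyer actually values the bundled game at its full list price (which, given Overwatch 2's reviews, is generous):

```python
# Effective price of a card = sticker price minus the value of the bundled game.
# Assumes the buyer values the bundled game at its full list price.
def effective_price(sticker: int, bundle_value: int) -> int:
    """Return what the card effectively costs once the bundle is counted."""
    return sticker - bundle_value

# Figures from the post: RX 7600 at $230 with a $70 game,
# RTX 4060 at $250 with a $40 game (my suggested "good" prices).
print(effective_price(230, 70))  # 160
print(effective_price(250, 40))  # 210
```

If the buyer has no interest in the bundled game, `bundle_value` is effectively zero and only the sticker price matters, which is exactly why bundles don't fix a wrong price.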
Since its arrival on Steam on Thursday at 2.30pm ET, Overwatch 2 has amassed just over 95,000 reviews, 86,000 of which are negative. According to the Steam250.com...
www.techspot.com
Why such a nonsense truism, dude? Did you rediscover America today? Are you enjoying bathing in Columbus's glory? Is your ego happy now? So many questions.
The TPU survey overrepresents AMD cards compared to the global GPU market share. If you add up the numbers, AMD has 34%; in reality it's less. Brain hurts. Get over yourself and tell us whether Navi 41 is cancelled or not. Nvidia is not the main topic of this article, and you are spamming the thread.
Why such drama, dude? Are you informed? That survey by Jon Peddie was officially called out for being deceptive regarding the "9%" figure for Intel cards. It turned out that tens of thousands of rogue cards were wrongly counted in the client segment. Plus, you need to know his methodology: he does not gather data from all OEMs and misses a significant portion of the market.
Super compute products skewed dGPU market share.
www.jonpeddie.com
You should know more about AMD's discrete GPU business in a wider context before you post more brain-dead comments. Their client GPUs account for less than 9% of total revenue and are a small part of a much wider product portfolio. They are not even aiming to raise GPU market share to 30% or above, as they are now heavily focused on other, more lucrative segments, such as CPUs, AI APUs/GPUs, FPGAs and consoles. Those bring in over 90% of revenue. Once you understand this, it is easier to see that there is no "drama" for AMD. Grow out of that brain-dead narrative now. You only have one chance to be rational and reason logically with adults here.
AMD has grown in the server CPU market from almost 0% in 2017 to a projected 30% by the end of this year. It has been a monumental effort and a market share gain unheard of in servers, more difficult than any attempt to change the GPU ratio with Nvidia, as Intel had an almost exclusive foothold globally, and its entrenched deals with enterprises were very difficult to tackle. But the investment in the Zen architecture, relentless drive and R&D paid off. The entire company was invested in it.
You can't have two or three equally big pushes in different segments at the same time. AMD was a small-revenue company pre-Covid, just below $7 billion annually. They almost quadrupled that in a few years, thanks mostly to Zen, consoles, and now FPGAs and embedded too. Once they reach roughly equal footing with Intel in servers, in 2-3 years, they might be able to refocus more effort and R&D into GPUs and give them a more serious push, depending on how AI pans out.
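A quick sanity check on the "almost quadrupled" claim above. The pre-Covid figure of just below $7B is from my post; the roughly $23.6B recent full-year figure is AMD's publicly reported 2022 revenue, used here as an assumption:

```python
# Sanity check: did revenue "almost quadruple"?
# pre_covid_revenue_b: just below $7B annually (figure from the post).
# recent_revenue_b: ~$23.6B, AMD's publicly reported 2022 full-year revenue
# (an assumption here, not stated in the post itself).
pre_covid_revenue_b = 6.7
recent_revenue_b = 23.6

multiple = recent_revenue_b / pre_covid_revenue_b
print(round(multiple, 2))  # 3.52 -- roughly 3.5x, i.e. "almost quadrupled"
```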
The key takeaway is that AMD neither craves nor needs more market share in client GPUs in this period, as their priorities lie elsewhere. They will keep a GPU presence hovering around 15-25%, but that's it. So it's completely stupid to post any narrative with the words "AMD GPU drama". Nonsense. If you want to be Nvidia's press secretary, endlessly repeating how much market they have captured, go find a job in Nvidia's marketing department, and good luck with the good salary.
I explained the figures from Jon Peddie to you. I explained the situation with media engines and the constantly improving, open source modern technologies. The drivers "issue" is nonsense, as the latest research suggests both vendors' cards have roughly equal numbers of reported issues. In the last three years, I have had three or four game crashes, mostly because I was pushing the card really hard. The Adrenalin software has never been as stable and pleasant to interact with, whereas Nvidia's GUI on my laptop feels dated, from the Stone Age; they could finally modernise it too. Actually, when did you last use an AMD card and interact with its software? Have you noticed the changes?
Technologies are useless if the generational performance uplift is dismal. RT is completely useless on lower-tier cards: it kills performance, and the card chokes on 8GB of VRAM in a growing number of games. So I'd buy the 7600 at $230 with the game ($70 saved) for my cousin's ITX build.
The 3070 and 3070 Ti are increasingly choked by 8GB of VRAM, and many Nvidia owners complain about it. They have realised that having the bare minimum of VRAM does not make their cards future-proof for more demanding games. It beggars belief how Nvidia has been able to get away with the low VRAM on most of their Ampere cards. Clearly, the marketing propaganda that blinded people with DLSS and RT worked to hide and mask native hardware deficiencies. See the testing in specific games by Hardware Unboxed; it's clearly visible in gameplay. Also, RT performance in several games makes them an unplayable mess, with stuttering and unloaded surfaces too. So it depends on what you play. Most stuff will run well at reasonable settings, though. Each card has its own limits.
But let's leave all that and discuss Navi 41 in greater detail. Will it be released or not? What are the architectural challenges, and where is the rumour rooted?