Saturday, April 29th 2023
AMD Marketing Highlights Sub-$500 Pricing of 16 GB Radeon GPUs
AMD's marketing department this week continued its battle to outwit arch-rival NVIDIA in the GPU VRAM pricing wars - Sasa Marinkovic, a senior director at Team Red's gaming promotion department, tweeted out a simple and concise statement yesterday: "Our @amdradeon 16 GB gaming experience starts at $499." He included a helpful chart that lines up part of the AMD Radeon GPU range against a couple of hand-picked NVIDIA GeForce RTX cards, with emphasis on comparing pricing and respective allotments of VRAM. The infographic marks AMD's first official declaration of the (last-generation "Big Navi" architecture) RX 6800 GPU bottoming out at $499, an all-time low, as well as a hefty cut affecting the old range-topping RX 6950 XT - now available for $649 (an ASRock version is going for $599 at the moment). The RX 6800 XT sits in between at $579, but it is curious that the RX 6900 XT did not get a slot on the chart.
AMD's latest play against NVIDIA in the video memory size stakes is nothing really new - earlier this month it encouraged potential customers to select one of its pricey current-generation RX 7900 XT or XTX GPUs, the main reason being that the hefty Radeon cards pack more onboard VRAM than equivalent GeForce RTX models - namely the 4070 Ti and 4080 - and are therefore better future-proofed for increasingly memory-hungry games. The latest batch of marketing did not account for board partner variants of the (RDNA 3-based) RX 7900 XT GPU selling for as low as $762 this week. AMD's senior marketeer did not bother to include any of Intel's offerings in the comparison chart - Team Blue's Arc A770 16 GB graphics card can be purchased for $350, but this range-topper cannot trade blows performance-wise with the $499 RX 6800 GPU. AMD is currently busy working on lower-specification cards in the Radeon RX 7000 family - set for tentative release windows in the coming months. It will be interesting to find out about intended memory allocations for the cheaper models, as well as a different marketing angle - how will Team Red address the fitting of smaller pools of VRAM to upcoming low- and mid-range cards?
Source:
Sasa Marinkovic Tweet
64 Comments on AMD Marketing Highlights Sub-$500 Pricing of 16 GB Radeon GPUs
When you are in an action game, reflexes and frame rates (frametimes) count, as does how many cockroaches your video card can display through the bushes. Does that not apply to nVidia?
The Last of Us
Day 1: 13.5+ GB for 1080p ULTRA
Patch 1.0.4.1: only 9 GB for 1080p ULTRA.
HU caught my attention years ago, when it was trying to beautify the hideous thing called Bulldozer. Now, right after AMD's announcement related to VRAM, HU is the most diligent in idiotic demonstrations. Isn't that bias? You are their fans because they speak your language. Red language!
"While gaming on NVIDIA was crash-free, I've encountered several display driver crashes on AMD—a bit unexpected for a WHQL driver release"
Dude, you need to get your head examined. Requiring a minimum of 100 FPS in anything is in fact snobbery. You're turning the world upside down now to make your point, news flash, there isn't one, and there never will be.
You. Are. Wrong. But I do enjoy the hole you're digging, keep at it, I have a few more bags of popcorn here.
Patch 1.0.4.1: still more than 8 GB for a measly 1080p :roll:
Keep those examples coming! This is hilarious.
I'm glad we have members like yourself with an objectivity no one can match.
You keep trying to make this an AMD - Nvidia feud when in fact everyone is as astounded as you are that AMD is likely to push 8 GB in its midrange just the same - and wasn't that your argument, that 8 GB is enough? Even AMD agrees with you. See, you're fighting a battle nobody is fighting, in the strangest possible ways. The consensus was already reached weeks ago; there are no camps here, just wild imagination in the heads of a few members.
Of course, every buyer should check the performance of a card for their needs. It's a truism to say that. Please stop smearing independent professionals who work hard on behalf of the tech community. It's embarrassing and tells more about you than about them.
You don't like HUB? Check out Gamers Nexus, Guru3D, 3DCentre or this very website TPU. Here:
www.techpowerup.com/review/geforce-rtx-3080-vs-radeon-rx-6800-xt-megabench/
And here: even higher frame rates than on Hardware Unboxed. I hope you don't start smearing the TPU review staff now, on the very website where you post these messages. You will need to control your language, seriously.
www.techpowerup.com/review/amd-radeon-rx-6800-xt/30.html
Perhaps for you, but not for me and many other people. I couldn't care less about DLSS. Hardware counts in the first place, then software perks. One thing I do agree with you on is content creation, where Nvidia cards are still faster, but the gap is closing, gen on gen, so I hope to see an even smaller gap, if not parity, next year between Blackwell and RDNA 4. We shall see in good time.
For example, AV1 decoding and encoding are very similar already. RDNA 3 cards are better in fps, but score 5-6% lower in quality assessments. Getting closer.
With me, you will need to craft your claim very carefully, otherwise I will call it out immediately as BS or nonsense.
Sir, we are talking about the skeletons in the closet unearthed by HU at the behest of the Red Cartel. These are the 6000/3000 series, RDNA 2, not RDNA 3 or 4.
I am sending you some links for documentation and gentlemen's conclusions regarding RDNA2. In these links you will see that the HU skeleton does not even appear because the flagship RDNA2, 6900XT, overall, is surpassed by the 3070 Ti.
www.pugetsystems.com/labs/articles/nvidia-geforce-40-series-vs-amd-radeon-7000-for-content-creation/
www.pugetsystems.com/labs/articles/nvidia-geforce-rtx-3090-ti-24gb-review-roundup-2309/
www.pugetsystems.com/labs/articles/amd-radeon-rx-6900-xt-review-roundup-2034/
Conclusion: "Due to the performance we saw with the Radeon 6900 XT, we currently have no plans to offer them in our workstations."
So, they weren't even interested in the RDNA 2 flagship. Where is the 6800? Probably below the 3060 in CC. And I'm pleased to note that even the 7900 XTX can't beat the 3070 Ti in Blender, a free and widely used program.
Are we still stuck at the level of the 90s? Do we still not see that the video card has an impact in almost all applications, not only in 3D rasterization?
Why do we take only part of the skeleton out of the closet?
And one more thing: only AMD uses a purely software solution for upscaling. In nVidia hardware there is something called Tensor Cores.
You probably missed the class when the subject was taught. :D
To the original post, yes AMD has more VRAM. Yes, AMD has poorer or non-existent Ray Tracing. Yes, AMD seems to support cards for much longer.
On the other hand, Nvidia is more expensive, they're generally much better on software, and they generally age less like wine and more like milk.
Team Red and Team Green are stupid. I want the best card my money can buy... and right now it's neither of these price-gouging houses of ill repute. You're welcome to argue all you'd like, but a PS5 shares its RAM between VRAM and system RAM, which means it has under 16 GB of VRAM out of its 16 GB total. The PS5 isn't a 4K powerhouse... but modern releases running at QHD being incapable of maintaining a buttery smooth frame rate when a $500 console can manage it is just silly.
Yes, I know that we keep buying cards because even FHD requires more VRAM... but at this point, when a garbage port of Hogwarts Legacy to PC is stirring this kind of vehement nonsense, we really need to stop attacking sacred cows and simply ask what we're willing to accept... because the 8000 and 5000 series respectively should not be like the 7000 and 4000 series are today: big, moderately faster, more power hungry, and priced two tiers too high for what they offer compared to most previous generations.
Also, since when has WB not been garbage? Anyone remember buying an AAA game and having to buy loot-box orcs or grind 40+ hours? What about the inglorious end to the Arkham series? Yeah, I call this BS lazy development, likely never optimized because the people dictating how developers worked never gave them enough time for QC... That's how WB works, isn't it?
I am very familiar with Puget Systems. I've been reading it for years, including all the links you sent. I have already said that I agreed with you regarding content creation. You ignored this and bombarded me with Puget links instead. I haven't missed anything. Your language comes across as projecting arrogant superiority. It's not going to fly. If you had listened more carefully to what Steve said in several videos, you wouldn't be mentioning any "skeletons". It's irrelevant.
His comparison of the 6800 vs the 3070 was purely for gaming purposes, which is what the vast majority of buyers predominantly use those cards for. He has always recommended Nvidia cards for content creation.
Currently, 6000 series cards are competitive for pure gaming usage, price-wise and VRAM-wise. This is because Nvidia has set the most absurd price increases on its 4000 series cards, which few people want to buy, forcing it to reduce die procurement with TSMC and leaving it wondering what to do with over $5 billion of lingering inventory. Those who wanted or needed to buy the halo-model 4090 have already done so, for content creation and gaming alike. Nvidia has shifted its focus to the data centre, where the real revenue is, and has left gamers crumbs with luxurious prices and bare-bones VRAM below the 4090.
We will look into the next quarterly report for more analysis. Always good to be patient.
www.dw.com/en/china-combat-drone-circles-taiwan-us-aircraft-transits-area/a-65459077
If this China/Taiwan/USA situation goes sour, it'll make scalping and crypto look like a joke.
In the summer of 2021, when I bought the 3070 Ti (800 euros, big discount), the 6700 XT was more expensive (and weaker), the 6800 was impossible to find (and anyway at least 400-500 euros more expensive), and the 6800 XT/6900 XT were fighting with the 3080 at around 2,000 euros. It was mining hell, and AMD video cards excelled there, not in content creation and other software. This anomaly made AMD video cards sell for much more (when they were available), 2-3 times above MSRP.
The current prices for the 3000 and 6000 series (launched in 2020-2021) do not matter. Personally, I wouldn't buy anything. Either I buy the latest releases, or I stay with what I have and play even on Low if necessary. The reason: you pay a sum of money for an old model and, in a month or two, you will see that the new model costs the same or a little more, and you will slap yourself in the face.
Prices below are from today.
www.pcgarage.ro/compara/2128106,2375056,2558959,2594271,2604463,2611023
What would you buy? The 6900 XT TUF or the 7900 XT TUF? The difference between them is only 45 euros.
PS.
Finished The Last of Us. 1080p, High preset, no DLSS.
If you think this is drama, better watch the game.
There are millions of GPU users who have not been able to buy an affordable mid-range card at a good price from the 3000 and 6000 series until recent months. I don't need one, as I bought a 7900 XTX in December. If you don't need one, perfect. Enjoy gaming and wait until next gen and better prices. Local anomalies are always present. I visited dozens of different websites and monitored tech shops for a long time before I found a suitable deal. That's what I would recommend to others. I even asked a friend from Canada to buy and bring a card over, but I finally found one. It's a complicated market.
Again, this is not why I don't buy old video cards. I said why.
I bought the video card from Germany. It has enough power to satisfy me for another 2-3 years. I have my favorite games, well optimized, that eat up my time. The rest, when they appear at a minimum of 50% off.
www.mindfactory.de/search_result.php?search_query=16GB%20Sapphire%20Radeon%20RX%206800%20PULSE%20OC%20Gaming%20Retail
www.overclockers.co.uk/sapphire-radeon-rx-6800-pulse-16gb-gddr6-pci-express-graphics-card-gx-39d-sp.html
www.newegg.com/p/pl?d=6800&n=4841