The whole RDNA2 line-up failed to outperform its same-MSRP counterparts from NVIDIA, at least by any significant (15+%) margin. On top of that: no DLSS, no CUDA, practically non-existent RT performance, plus insane power spikes.
Also, the 6500 XT. Not as bad as the GT 1630, but still ridiculous.
"At least"?. TPU shows 6900XT losing only 10% to 3090 while costing 500 less. That's an even better deal than 7900XTX is today compared to 4090.
The 6950 XT was arguably an even better deal, improving on the 6900 XT by a further 7% while costing $100 more.
The 3090 Ti gained 8% over the 3090 but added another $500, extending the price gap to $900.
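To make the comparison concrete, here is a minimal perf-per-dollar sketch using the figures quoted above (TPU-style relative performance with the 3090 as the 100 baseline, launch MSRPs in USD). The exact numbers are approximations from the posts, not authoritative benchmarks.

```python
# Rough perf-per-dollar comparison. Relative performance is normalized
# to RTX 3090 = 100; prices are launch MSRPs as cited in the thread.
cards = {
    "RTX 3090":    (100, 1499),
    "RX 6900 XT":  (90,   999),   # ~10% behind the 3090, $500 less
    "RX 6950 XT":  (96,  1099),   # ~7% over the 6900 XT, $100 more
    "RTX 3090 Ti": (108, 1999),   # ~8% over the 3090, $500 more
}

for name, (perf, price) in cards.items():
    print(f"{name}: {perf / price * 1000:.1f} perf points per $1000 MSRP")
```

By this (admittedly crude) metric, both Radeons land around 87–90 points per $1000, while the 3090 Ti drops to roughly 54, which is the whole argument in one number.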
We're also talking about 2020/2021 here. DLSS had just gotten to the point where it was actually worth using, but availability was still very limited, so the fact that AMD had no answer at the time did not matter much. As for CUDA: well, if Nvidia made it open source, then AMD cards could run it no problem. You're also calling out non-existent RT performance. The same non-existent performance that applied to the 3090 Ti...
RDNA3 is trickier:
The 7900 XTX looked somewhat attractive compared with the 4080, but at this price point a gamer expects more than just raw raster performance. They want to enable everything, and you can't do that on a 7900 XTX. Thus, it had to launch significantly below $1200. $850 tops.
The 7900 GRE is just an abomination and a half. At $600, it's a meager 10 to 20% boost over the 4070, at the cost of worse power draw and worse results in anything that isn't a pure-raster gaming title.
The 7800 XT is the same story as the 7900 XTX: NOT CHEAP ENOUGH TO CONVINCE. The 4070 is more feature-rich, and the performance difference is only visible with FPS graphs enabled. A $100 premium is a low enough cost.
The 7700 XT is also an abomination.
7600... Don't even get me started, it's awful.
And can 4080 users enable everything and still enjoy high-refresh-rate gaming at $1200? Or would a sane person look at the 4080's price and conclude that if they're already prepared to spend $1200, why not jump to the 4090?
AMD, unlike Nvidia, did not increase their top card's price. The 7900 XTX launched at the same MSRP as the 6900 XT.
The 7900 GRE was and is an odd release, probably meant for dumping defective N31 dies.
The 7800 XT and 7700 XT were the most popular RDNA3 cards, I believe.
The 7600 may have been an awful 8GB card, but at least, unlike Nvidia, it was not priced at $400 with another $100 charged for the clamshell 16GB version.
Not to mention Nvidia not even releasing a successor to the 3050 (a truly awful card that does not even have 8GB).
That's why I'm not delusional. I'm just strongly pessimistic, because AMD seem to live in a fairy tale where nothing NVIDIA makes above the 2080 Ti exists.
A strongly pessimistic person expects $550, or $600 at most. Not over $750. You realize that if the 8800 XT really ended up costing $750+, AMD would not be able to sell any, because the 7900 XT and 7900 XTX would be so much better deals?
The Radeon 5000 and GeForce GTX 1000 series were priced just fine. The GeForce RTX 2000 series introduced us to ray tracing, and that's where pricing started to get out of hand. Pricing went insane with the GeForce RTX 3000, GeForce RTX 4000, Radeon 6000 and Radeon 7000 series.
You're confusing something here. Yes, the 20 series was a massive price hike for very little substance, but the 30 series was very well priced thanks to a cheaper node. The 6000 and 7000 series had roughly the same prices, with some outliers. The 40 series was again a price hike.
The GTX 480 was 250W, and it was not called "efficient". It was a disaster.
You could not find anything more recent than a 15-year-old card?
Nvidia also had the 250W 780, 780 Ti, 980 Ti and 1080 Ti. The 980 Ti was praised for its power efficiency, and the 1080 Ti is legendary.
Also, you do not account for the fact that the 480 was a single-fan blower card and its performance was underwhelming.
Cooling 270W today is a far cry from cooling 250W fifteen years ago. The coolers are much bigger and can easily handle it.
Not to mention tolerable noise levels now vs then.
Forget what is playable. This is marketing. Someone pays $2000 for an RTX 4090, someone pays $1000 for an RX 7900 XTX, and one gets 60fps and the other 15fps (I don't remember the exact framerates, but I think path tracing on those cards is about like that). You know what you have? Not a playable game, but the first "proof" for the RTX 4090 buyer that their money was well spent. It's marketing, and Nvidia is selling cards because of RT and DLSS.
A playable framerate is not marketing. It is essential. A person buying a 7900 XTX is not buying it for a 60fps tech demo.
Playing one tech demo at a barely playable framerate (these days I expect a high-refresh-rate experience at 90+) is not what I call "money well spent".
They might, and then what is AMD going to do? Lower the price to $500? Then to $450, and then to $400? Then in their financial results the gaming department will be deeper in the red than Intel's. From a gamer/consumer perspective we all love low prices. But with Nvidia having all the support in the world, with countless people out there educated to love Nvidia products and hate AMD products, with countless people willing to spend more money on a worse Nvidia product than on a better AMD product, aggressive pricing could end up a financial disaster for AMD. So they need to be careful. Now, if RDNA4 is a marvel of an architecture that they know Nvidia can't counter, and if we assume they have secured enough wafers to cover the high demand we could expect from a positive consumer reaction, then and only then will AMD price their products aggressively. Setting a $400 MSRP and failing to cover demand, or scalpers driving the price to $600, would do AMD no good, only harm.
Nvidia lowering prices while manufacturing costs go up and the new G7 memory is also more expensive? Never gonna happen. The best we can expect is the same price, and that's assuming they're feeling generous and cut into their margins.
AMD won't start a price war with Nvidia because they don't have the money coffers and capacity.
Nvidia won't start a price war with AMD because they want to increase their money coffers.
Time: yeah, potentially RT can be faster, since you don't have to manually set up lighting.
That's patently false. It's actually double the work for devs now, since they still have to do manual lighting and RT on top of that.
Only games that fully rely on RT, where it can't be disabled, can claim a workload reduction.
Absolutely. I find it amusing how people now look at cards that are nearly double the TDP and it’s apparently fine, no problem there.
But it *IS* fine, because we have much better coolers that don't sound like fighter jets on afterburner.
Even if this thing really does have 45% better RT performance or whatever, it won't make a difference to the market share situation.
And people like john_ will still complain that AMD "only" manages 4080S RT performance. Nothing new here.
Conveniently ignoring the fact that Nvidia themselves do not give 4090 RT performance for 1/4th the price.
Nvidia does not really care about RT availability or market penetration. They only care how much more they can charge for this on their top cards.
If they truly cared (like they claim) they would do everything in their power to produce cheap mainstream cards with good RT perf.
Ironically, it's AMD who has managed to bring RT to the masses, even on consoles. TBH I did not think consoles would get RT so soon, and at this level of performance.
Nasty! /s.
This happened with the 5700 XT, when AMD tried to mark up their raw and unfinished architecture and was simply forced to bring prices down when the public outrage exploded. They did that with X570/B550 motherboards. They did that with Zen 3, and with RDNA3 as well. Like pricing the RX 7700 XT (which is really an RX 7600 XT) at a whopping $449. A 192-bit card for almost half a grand. The hubris and arrogance have no limits.
X570 was justified because it had a Gen4-capable chipset in 2019, something Intel introduced a whole two years later (with fewer lanes).
Today's AM5 prices regardless of the chipset are way more arrogant.
Zen 3 also had a massive performance increase. RDNA3 had some bad examples, but the top card did not increase in price.
The 7700 XT may have been that, but at least it was 12GB. Meanwhile, Nvidia asked $400 for an 8GB card and a whopping $500 for 16GB, despite AMD proving with the 7600 XT that going from 8GB to 16GB does not add $100 to the price. Not to mention that I remember the 7700 XT being out of stock because people bought it up, compared to the 7800 XT.
I still think that unless GPU vendors start making RTRT hardware as separate AICs that scale like GPUs, there's no way GPUs will be able to push RT to any reasonable level.
I agree, but practically I don't see this happening. The overhead of moving data over PCIe is so large that for real-time rendering it would introduce a whole host of problems that were prevalent in the SLI/CF days, including the dreaded micro-stutter. Maybe future Gen6 or similar speeds can mitigate this somewhat, but that still leaves the extra-slot problem: most motherboards do not have a spare x16 electrical (not just physical) slot to plug an RT card into.
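A rough back-of-envelope illustrates the PCIe overhead problem. All numbers here are illustrative assumptions (a 4K G-buffer of roughly five render targets at 8 bytes per pixel, and theoretical peak x16 link bandwidths), not measurements:

```python
# Back-of-envelope: time to ship one frame's G-buffer to a hypothetical
# RT add-in card over PCIe. All figures are illustrative assumptions.

def transfer_ms(bytes_per_frame: float, link_gb_s: float) -> float:
    """Milliseconds to move one frame's data one way over the link."""
    return bytes_per_frame / (link_gb_s * 1e9) * 1e3

# Assumed 4K G-buffer: ~5 render targets at 8 bytes/pixel (~332 MB)
frame_bytes = 3840 * 2160 * 5 * 8

for name, gbs in [("PCIe 4.0 x16", 32.0),
                  ("PCIe 5.0 x16", 64.0),
                  ("PCIe 6.0 x16", 128.0)]:
    print(f"{name}: {transfer_ms(frame_bytes, gbs):.1f} ms one-way")
```

Under these assumptions, Gen4 eats about 10 ms of an 11 ms 90fps frame budget just on the one-way copy, and even Gen6 would still burn a few milliseconds before the results come back, which is why the micro-stutter comparison above seems apt.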