Tuesday, October 18th 2022
AMD Radeon RX 6900 XT Price Cut Further, Now Starts at $669
In the run-up to the November 3 reveal of the next-generation RDNA3 architecture, and with the 43% faster RTX 4090 eroding its appeal to the enthusiast crowd, the AMD Radeon RX 6900 XT received another round of price cuts and can now be had for as low as $669. Prices are down on both sides of the big pond, with European retailers listing it for as low as 699€. Although not technically AMD's flagship graphics card (that title belongs to the RX 6950 XT, which starts at $869), the RX 6900 XT is a formidable 4K gaming graphics card, and at its new price its performance-per-dollar is roughly 35% higher than the RTX 4090's. AMD's previous round of official price cuts happened around mid-September, as the company braced for the RTX 4090 "Ada."
Source: VideoCardz
You can search my post history on this very forum going back years to some of the Radeons I've owned.
Where can I see that Ferrari you landed in?
Here are the typical prices from one store, here in Portugal:
And before you think "but that's only one store", think again:
Only a "slight" price variance between the models on offer, and I don't mean between the stores ...
www.theregister.com/2008/03/28/nvidia_vista_drivers/
Some issues have even come down to bad hardware choices, like the capacitor guidance when the RTX 3000 series first launched. EVGA didn't leave Nvidia because Nvidia is too awesome, and the vast majority of the tech industry protested the ARM merger because they knew Nvidia would hurt everyone with its business practices.
I owned AMD/ATI and Nvidia cards, alternating back and forth, for decades. There has been nothing significantly different between the driver quality of the two companies for a while now. The feature sets differ, and performance differs for similar features.
Most internet myths are a product of cognitive dissonance.
en.wikipedia.org/wiki/Cognitive_dissonance?wprov=sfti1
It is easier to justify irrational love of something if you can create an environment where opposing forces are bad and unworthy of your love. So stop spreading the myth of AMD driver issues. It doesn’t help ANYONE.
On a more serious note, I've owned an equal number of Nvidia and ATI/AMD cards since late 1997 (starting with the Riva 128 and Rage LT Pro, aka Mach64 LT), and truthfully I only had one major issue with the red team (which was corrected in the next driver), while I had to roll back almost regularly with Nvidia ... six years with a GTX 1070 and I could never run the latest driver without issues :oops:
Enjoying my current GPU on the latest driver to date is ... refreshing (especially given it cost me 75 CHF less than the GTX 1070: 450 vs. 525 CHF :laugh: )
NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090) (2nd image at 15:17 UTC)
Even if it was with DLSS 3, 2x should be attainable in some games.
I checked all TPU results at 4K, and the best I could find was, AFAICR, 1.75x.
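For what it's worth, the "x" multiples being thrown around here are just ratios of average frame rates. A minimal sketch, using hypothetical FPS numbers rather than actual TPU benchmark results:

```python
# A speedup "multiple" is simply the ratio of two average frame rates.
# Both FPS values below are hypothetical placeholders, not TPU data.
def speedup(fps_new: float, fps_baseline: float) -> float:
    """Return how many times faster the new card is than the baseline."""
    return fps_new / fps_baseline

# e.g. a card averaging 175 FPS vs. a baseline averaging 100 FPS:
print(f"{speedup(175.0, 100.0):.2f}x")  # prints "1.75x" for these made-up numbers
```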
That being said, the Intel Arc A770 comes with 16 GB at $350, and is probably the price/performance king.
www.tomshardware.com/news/Nvidia-196.75-drivers-over-heating,9802.html
www.destructoid.com/dont-install-the-newest-nvidia-driver-its-breaking-pcs/
Wait....what's this? AMD did it at least once, too?!?!
news.softpedia.com/news/warning-new-amd-crimson-driver-is-heating-and-killing-gpus-users-report-496867.shtml
Oh man! Is it true? Neither company is perfect?
Meanwhile, at 1080p it's a mere 16% faster. At lower resolutions, Nvidia's 4090 suffers from driver overhead, as mentioned in the 4090 review.
Realtime RT is still very much in its infancy. Games have at best 1-2 RT effects, picked because they are light on performance, and those effects are limited in scope. The 4090 finally brings acceptable RT performance for what games offer now, but when game devs start adding more, it will essentially make the 4090 obsolete for those who care about realtime RT. Realtime RT is a blessing for Nvidia, as it makes it extremely easy for them to tout massive gains in performance each generation and cash in.
The chances of getting a bad mining card are the same as with purchasing any other used card. According to TPU's review, the 4090 has additional driver overhead.
About all this free brand sentiment/free pessimism: it's just freely boring!! Both NVIDIA and AMD are extremely fortunate to have free, unemployed, non-commissioned members of a free society who, out of their own free will and exhausting free determination, provide a free service to staunchly and freely support or market each brand, for free. Regardless of "fair play", no matter the cost, the free patriots always come out for free with their expensive sharpened swords and free patriotic flags. The craziest thing of all: they fund the war too (yep, no freebies there). I'm still trying to wrap my head around where all this freeotic behaviour comes from ...

I stopped praising GPU manufacturers when the cost of GPUs was no longer acceptable. I'd be more than happy for this segment of the market to hit an all-time decline, even if it challenges the very existence of these companies. Well, hopefully they will stay put and deliver more reasonable asking prices, which we can all blindly and patriotically chant about whilst surfing the freeotic patriot ship of contentment.
So far, seven months in, zero returns - nobody has even contacted me. I reckon I sold each card for £25 more than other cards listed, simply because I had the confidence to offer a warranty.
I sell my gaming cards without warranty.
ETH mining (done by careful miners who cared about the hardware, efficiency, and their resale value) looked after the cards way better than any gamer would. Regular dust cleaning, careful 24/7 thermal monitoring, open-frames for exceptionally low operating temperatures, undervolted, GDDR6 temps lower than when gaming, and stable unchanging temperatures that meant it wasn't thermal-cycled like a gaming GPU is. My gaming cards get put in a box, never cleaned until they're replaced, and their workload is bursty resulting in frequent temperature and power spikes from idle, in a hotbox, with tons of thermal-cycling on the GPU die, the VRAM, and of course all the thermal pads.
The only caveat with a well-treated mining card is that the fans have been spinning their whole life. Given that the 24/7 mining TDP was around half of a gaming load, the fans were running fairly slowly, so they're unlikely to be ruined, and you can replace GPU fans affordably and easily for most models.

Which "compute applications"? It's nice in theory, and I wish CUDA would die a horrible, proprietary death in favour of open-source APIs, but the reality is that most software developers hook into CUDA.
I've been building and maintaining both CPU and GPU render farms at work for nearly two decades (well, one decade in the case of GPU rendering) and support for OpenCL is hot garbage. The API may be okay but most software wants CUDA so it doesn't matter how much RAM your Radeon has when the application only offers CUDA or CPU compute options. I'm coming from a 3D rendering/animation/modelling side, so perhaps financial/physics simulation software does actually have decent OpenCL support. I can only comment on the stuff my company does.
Want to see your PC brought to its knees? Play a game like Star Citizen; there should be a free-fly event. Just be aware of the TOS: they enforce beyond what they claim they'll enforce, and some people on their "player safety" team are political activists.
But yeah, at $350 the A770 is almost adequately priced (if it were actually priced like that, and not the $450+ I will see :D ), given the RT performance ... heck, I even ran the CP2077 benchmark at 1440p and 1620p60 with RT on, and it strangely wasn't a slideshow ... a few drops under 30 fps, but nothing unplayable. Of course, the benchmark might not reflect (hehe, reflect ...) normal gameplay tho :oops:
Although I saw prettier reflection and lighting work in Morrowind :laugh: (joke again, but OpenMW is really awesome!). Oh, I do remember ... but the happiest thing for me is that running a monitor at 3K doesn't need AA (yay, free performance increase, because nearly all AA algorithms bring a performance drop, even if a small one for some).