Tuesday, October 18th 2022
AMD Radeon RX 6900 XT Price Cut Further, Now Starts at $669
In the run-up to the November 3 reveal of the next-generation RDNA3 architecture, and with the 43% faster RTX 4090 mauling away its appeal to the enthusiast crowd, the AMD Radeon RX 6900 XT got another round of price cuts, and can be had for as low as $669. Prices are down on both sides of the big pond, with European retailers listing it for as low as 699€. Although not technically AMD's flagship graphics card (that would be the RX 6950 XT, which starts at $869), the RX 6900 XT is a formidable 4K gaming graphics card with high performance per dollar at its new price (roughly 35% higher than that of the RTX 4090). AMD's latest round of official price cuts came around mid-September, as the company braced for the RTX 4090 "Ada."
Source: VideoCardz
131 Comments on AMD Radeon RX 6900 XT Price Cut Further, Now Starts at $669
StableDiffusion/comments/ww436j
Stable Diffusion is the popular thing these days. As is LeelaZero / KataGo (I am a go player, so I actually use this one a lot for self-analysis of my own games), etc. etc. Blender does work with AMD, though support is way worse than on Nvidia.
CUDA is popular because it's a hella good API; I don't want to ignore that fact. But at the same time, something needs to be said about the raw VRAM capabilities of GPUs. Extra hardware is worth a lot in a number of these applications.
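To put a number on the VRAM point: any CUDA app can query how much memory it actually has to work with before committing to a big model. A minimal sketch (cudaMemGetInfo is the real runtime API; everything else here is illustrative, not from any particular project):

// Sketch: check free/total device memory before loading a large model.
#include <cstdio>
#include <cuda_runtime.h>

int main() {
    size_t free_bytes = 0, total_bytes = 0;
    // cudaMemGetInfo reports free and total device memory in bytes.
    if (cudaMemGetInfo(&free_bytes, &total_bytes) != cudaSuccess) {
        fprintf(stderr, "no usable CUDA device\n");
        return 1;
    }
    printf("VRAM: %.1f GiB free of %.1f GiB total\n",
           free_bytes / 1073741824.0, total_bytes / 1073741824.0);
    return 0;
}

Workloads like Stable Diffusion tend to live or die by that free-memory number at least as much as by raw compute.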
--------
AMD support is likely better than Intel's (Intel is going all-in on SYCL, which is nominally OpenCL-based). I'd say AMD is more compatible with CUDA in my experience: its hipify script converts CUDA into HIP (and HIP itself is mostly a CUDA API layer). CUDA's newest stuff isn't implemented, of course, but if you've got three or four year old CUDA code, it ports over pretty decently.
Intel is probably a better software company than AMD, but IMO it's making a bit of a mistake with its SYCL focus. No one else is going SYCL (even if it is OpenCL-based). I'm not sure OpenCL is really "alive" anymore... even AMD's ROCm is kinda-sorta based on CUDA.
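To give a sense of how mechanical that hipify pass usually is, here is a toy snippet (illustrative only, not AMD's code); the CUDA calls are real, and the comments note what hipify-perl would rename them to:

#include <cuda_runtime.h>                 // HIP: #include <hip/hip_runtime.h>

__global__ void scale(float *x, float a, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // unchanged in HIP
    if (i < n) x[i] *= a;
}

int main() {
    const int n = 1 << 20;
    float *d = nullptr;
    cudaMalloc(&d, n * sizeof(float));               // HIP: hipMalloc
    scale<<<(n + 255) / 256, 256>>>(d, 2.0f, n);     // launch syntax is the same
    cudaDeviceSynchronize();                         // HIP: hipDeviceSynchronize
    cudaFree(d);                                     // HIP: hipFree
    return 0;
}

The renames are one-to-one for most of the runtime API, which is why older, library-light CUDA code tends to port cleanly.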
I can say with confidence that both AMD and Nvidia drivers have each had their fine and terrible periods, and that if there's any real difference between them, it's that Nvidia cards have fewer issues, but those issues tend to be more severe and remain unfixed for much longer. Quite often I can disable a workaround on AMD cards within a few months because the bug has been patched. With Nvidia you basically have to wait for a patch from the software developer instead, and if that never comes, the workaround for that application is permanent.
Realistically, I dislike the dated, redundant Nvidia control panel, with its lack of fan/clock tuning and monitoring, no controls whatsoever for the encoder, and no GPU-level screen-capture utility. I know third-party solutions exist because I have to install them on Nvidia systems, but it's irritating that noisy people on the internet always harp on about AMD drivers being crap when Nvidia's drivers barely do anything that Windows display settings can't also control. There are also minor, reproducible bugs in the Nvidia control panel that haven't been fixed in 15 years, which is just pathetic.
- Streaming platforms like Twitch charge more for higher-bitrate tiers, so streamers want the best quality they can get out of their limited bitrate.
- File sizes are smaller at any given quality, meaning easier editing, faster uploads to YouTube/Vimeo/whatever, and lower storage overheads.
I find the AMD ReLive encoder great to use, but 1080p60 recordings at 10 Mbps look worse than 6 Mbps recordings on NVENC. The last time I had both AMD and Nvidia in the same room at home, it was a 5700 XT vs a 2070 Super. I don't know how much either encoder has improved since then, but from the usual sources and reviewers it sounds like nothing has really changed on either side.

Less because I am actually a fanboy, and more because when it has come time to buy, Nvidia hasn't delivered on what I need. I moved to Linux as my primary OS last year and yeah, Nvidia is garbage there.
However, NVENC is objectively better in quality than anything AMD has come up with, though that doesn't justify the rest of Nvidia's ecosystem being cruel to users and developers.
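For anyone who wants to measure this instead of arguing about it, a rough recipe (assuming an ffmpeg build with NVENC, AMF, and libvmaf compiled in; the file names are placeholders):

# Encode the same source with each hardware encoder at a fixed bitrate.
ffmpeg -i gameplay.mkv -c:v h264_nvenc -b:v 6M  nvenc_6m.mp4
ffmpeg -i gameplay.mkv -c:v h264_amf   -b:v 10M amf_10m.mp4
# Score each encode against the source with VMAF (higher is better).
ffmpeg -i nvenc_6m.mp4 -i gameplay.mkv -lavfi libvmaf -f null -
ffmpeg -i amf_10m.mp4  -i gameplay.mkv -lavfi libvmaf -f null -

libvmaf prints a pooled score at the end of each run, which puts "10 Mbps AMF vs 6 Mbps NVENC" claims on a measurable footing.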
The most recent Nvidia cards I owned were an EVGA 1050 Ti and a 1070. I was so impressed by the 1050 Ti that I kept it in my backup computer, a 2016 rig I didn't need, until last year, when I gave that computer to a friend. The EVGA 1070 impressed me as well.
But when I got the standard AMD 5700 and undervolted it, man, the power draw was fantastic. And as stated before, I have NO problems with the drivers.
A 6800 XT at $550 is a wet dream for me... only $100 more than my 6700 XT, damn... where?
www.newegg.com/gigabyte-radeon-rx-6800-xt-gv-r68xtgaming-oc-16gd/p/N82E16814932381?Item=N82E16814932381&Description=rx%206800%20xt&cm_re=rx_6800%20xt-_-14-932-381-_-Product
sign me up for 6950XTX :laugh:
No flagship is ever good value; an RX 6700 10GB blows even the 6800 XT out of the water in terms of value for money.
Funny how often different companies reuse the same names across completely different products... :sleep:
Wikipedia seems to indicate it was a "use up old silicon inventory however you can" product, with basically any leftover die and memory chips qualifying for the label :D
"The 6800 XT varies greatly depending on manufacturer. It is produced using three cores (NV40/NV41/NV42), four memory configurations (128 MiB DDR, 256 MiB DDR, 128 MiB GDDR3, 256 MiB GDDR3, and 512 MiB GDDR2), and has clock speeds ranging from 300 to 425 MHz (core) and 600-1000 MHz (memory)."
Is it a 425 MHz NV40 with 1 GHz GDDR3, or is it a 300 MHz NV42 with 600 MHz DDR1? ¯\_(ツ)_/¯
It's just like the Radeon X1800 series that came in XT, XTX, XL, Pro, GT, GTO and maybe five more flavours. You really knew your stuff if you could make any sense of it.