
AMD Radeon RX 6900 XT Price Cut Further, Now Starts at $669

meh easy to counter all of that.

I've owned almost every Radeon since the very first one, the Radeon 64 DDR, and I've had minimal issues over the last 20 years :)

anecdotal yes indeed.
And I've jumped from the 5th floor and landed in a Ferrari F40...

Hm, 4090 is only 31% stronger than a 3090ti. Someone must be upset that they didn't get FOUR TIMES PERFORMANCE. :laugh::laugh::laugh:
Your math is....wrong
 
And I've jumped from the 5th floor and landed in a Ferrari F40...


Your math is....wrong
The difference is I've actually owned all those Radeons and I'm speaking from personal experience; you have never done that. Nice try, though.

You can search my post history on this very forum, going back years, on some of the Radeons I've owned.

Where can I see that ferrari you landed in?
 
Here in Portugal we can barely get an RX 6800 (non XT) for that price.

Screenshots taken this morning from a price comparison website, ordered from cheapest.
View attachment 265952

Cheapest 6900XTs just for reference.
View attachment 265953

Those are the cheapest.

Here are the typical prices from one store, here in Portugal:

Screenshot from 2022-10-18 14-41-09.png


And before you think "but that's only one store", think again:

Screenshot from 2022-10-18 14-47-31.png


Only a "slight" price variance between the models on offer, and I don't mean between the stores...
 
Whoa, stop the Kool-Aid party there, man.

Years ago, did AMD "fix" their frame pacing issues? The frame pacing issues that forum users complained about for years with no resolution?

Navi black screen


AMD driver causing BSODs


AMD users ditching cards over driver issues


This doesn't get into the RDNA downclocking, the R300 black screen issues, the Vegas burning out their PCBs, etc. How far do you want to go down this rabbit hole? Until very recently, AMD drivers were complete trash; this is why the Fermi series still sold so well. After RDNA came out, AMD finally buckled down and fixed their garbage after yet more negative media attention.
Nvidia has had all these types of issues as well with their drivers going back to Vista.


Some issues have even been bad hardware choices like the capacitor guidance when the 3000 series first launched. Evga didn’t leave Nvidia because they are too awesome and the vast majority of the tech industry protested the ARM merger because they knew Nvidia would hurt everyone with their business practices.

I owned AMD/ATI and Nvidia cards, alternating back and forth, for decades. There has been nothing significantly different between the driver quality of the two companies for a while now. There are different feature sets, and performance differs for similar features.

Most internet myths are a product of cognitive dissonance.

It is easier to justify irrational love of something if you can create an environment where opposing forces are bad and unworthy of your love. So stop spreading the myth of AMD driver issues. It doesn’t help ANYONE.
 
Good GPU, not a bad price.
 
AMD is supporting RT just for the sake of saying "look, we've got that functionality too". Their implementation is poor, though, at least compared to NVIDIA's. Also, very few games support RT, and many new ones don't have any form of RT at all.

And if you're planning to compare Radeon drivers to Intel ARC ones, just to prove your point, like some other folks are doing 'round here, go ahead :)
135 out of 240 DX12 games support some type of ray tracing, so how is that "very few" when it's like 56% of the games on that API?
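As a quick sanity check on that figure (taking the post's counts of 135 RT titles out of 240 DX12 titles at face value, not independently verified):

```python
# Counts as quoted in the post (assumed, not independently verified):
rt_titles = 135    # DX12 games with some form of ray tracing
dx12_titles = 240  # total DX12 games counted

share = rt_titles / dx12_titles * 100
print(f"{share:.2f}%")  # prints: 56.25%
```

So "like 56%" is accurate under those counts.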
 
As someone who is generally favourable towards AMD and hates Nvidia for anti-consumer bullshit, it pains me to make the above post, but it's the cold hard truth.
good thing i disliked Nvidia because of drivers issues then :laugh: (/joke)

on a more serious note, I've owned an equal number of Nvidia and ATI/AMD cards since late 1997 (starting with the Riva 128 and Rage LT Pro, aka Mach64 LT), and truthfully I only had one major issue with the red team (which was corrected with the next driver), while I had to roll back almost regularly with Nvidia... 6 years with a GTX 1070 and I could never get the latest driver without issues :oops:

enjoying my current GPU with the latest driver to date is... refreshing (also given it cost me 75 CHF less than the GTX 1070, 450 vs 525 CHF :laugh:)
 
Hm, 4090 is only 31% stronger than a 3090ti. Someone must be upset that they didn't get FOUR TIMES PERFORMANCE. :laugh::laugh::laugh:
Your math is....wrong
Their math is wrong, but +45%, or 1.45x average performance, is still far from the 2x-4x vs. the 3090 Ti that was promised:
NVIDIA Project Beyond GTC Keynote Address: Expect the Expected (RTX 4090) (2nd image at 15:17 UTC)
Even if it was with DLSS3, 2x should be attainable in some games.
I checked all TPU results at 4k and the best I could find was AFAICR 1.75x.
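One plausible source of the 31% figure (a hypothetical reading; the chart percentages below are assumptions for illustration, not exact TPU numbers): if the relative-performance chart puts the 4090 at 100% and the 3090 Ti at about 69%, subtracting the bars gives 31, but the actual uplift is the ratio, roughly 45%:

```python
# Assumed chart readings (hypothetical, for illustration only):
# 4090 = 100%, 3090 Ti = ~69% relative performance at 4K.
rtx_4090 = 100.0
rtx_3090_ti = 69.0

bar_gap = rtx_4090 - rtx_3090_ti             # 31.0 -> naive subtraction of the bars
uplift = (rtx_4090 / rtx_3090_ti - 1) * 100  # ~44.9 -> the real percentage uplift
print(round(bar_gap), round(uplift))         # prints: 31 45
```

Subtracting percentage points only gives the uplift when the slower card is the 100% baseline; when the faster card is the baseline, you have to take the ratio.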
 
Hm, 4090 is only 31% stronger than a 3090ti. Someone must be upset that they didn't get FOUR TIMES PERFORMANCE. :laugh::laugh::laugh:
How did you get 31% from either of the graphs?
 
Comparing performance/$ to a 4090 seems disingenuous; if AMD want to compare performance/$ against Nvidia, they should use a card at the same performance/capability level.

At best it competes with the $800 3080Ti in raster performance. In DXR titles it's barely keeping up with a $530 3070.

Additionally, it can't do DLAA, DLSS has better game support at the moment than FSR, and then there's NVENC which is vastly superior. If you do anything other than gaming you'll also appreciate CUDA support as so many applications support CUDA but not OpenCL.

The 6900XT is a good card, but it's not a 4090. Comparing it to the true competition shows that it's still awful value for money, like any flagship always is. Just buy a 3070/3070Ti/3080/3080Ti instead for better API support, equivalent performance, and much better features.

As someone who is generally favourable towards AMD and hates Nvidia for anti-consumer bullshit, it pains me to make the above post, but it's the cold hard truth.

AMD's RX 6800+ series all have 16 GB of RAM though, which is very useful for compute applications.

That being said, the Intel Arc A770 comes with 16 GB at $350, and is probably the price/performance king.
 
I don't think the vast majority of users care about CUDA and NVENC. These are marketing bullet points for the die-hard Nvidia fanbase.
 
Nvidia has had all these types of issues as well with their drivers going back to Vista.


Some issues have even been bad hardware choices like the capacitor guidance when the 3000 series first launched. Evga didn’t leave Nvidia because they are too awesome and the vast majority of the tech industry protested the ARM merger because they knew Nvidia would hurt everyone with their business practices.

I owned AMD/ATI and Nvidia cards, alternating back and forth, for decades. There has been nothing significantly different between the driver quality of the two companies for a while now. There are different feature sets, and performance differs for similar features.

Most internet myths are a product of cognitive dissonance.

It is easier to justify irrational love of something if you can create an environment where opposing forces are bad and unworthy of your love. So stop spreading the myth of AMD driver issues. It doesn’t help ANYONE.

Let's not forget these Nvidia blunders:


Wait....what's this? AMD did it at least once, too?!?!

Oh man! Is it true? Neither company is perfect?
 
Even though it's not a big change in the US, since cards were already at that price as some mentioned, it's a welcome discount in other countries where they don't get the typical US discounts. Hopefully they price cut the 6700s and 6600s too!
 
...the RX 6900 XT is a formidable 4K gaming graphics card...
Not really... Somewhat ironically (for being almost their top card), it's actually best suited for high-refresh-rate 1080p and 1440p gaming without RT:
(TPU relative performance charts at 3840x2160 and 1920x1080)



That won't happen anytime soon, now that both will be on the same process...
I'm an owner of a 6900XT and I agree wholeheartedly with your take. I don't know why you're catching so much heat... The 6900XT could be buy one at half off, get a second one free, and it still wouldn't be a formidable 4K card. The 6900XT is a beast at 1440p, and even then there are games like Cyberpunk that will bring it to its knees at 1440p. I'm someone who loves to play at max settings above 90 FPS, and I wouldn't dare pair this with a 4K monitor. However, for those who don't mind playing games at 30-60 FPS at low to medium settings, and for office work, it will do.

Whoa, stop the Kool-Aid party there, man.

Years ago, did AMD "fix" their frame pacing issues? The frame pacing issues that forum users complained about for years with no resolution?

Navi black screen


AMD driver causing BSODs


AMD users ditching cards over driver issues


This doesn't get into the RDNA downclocking, the R300 black screen issues, the Vegas burning out their PCBs, etc. How far do you want to go down this rabbit hole? Until very recently, AMD drivers were complete trash; this is why the Fermi series still sold so well. After RDNA came out, AMD finally buckled down and fixed their garbage after yet more negative media attention.
I can't speak for everyone...I haven't had any of the AMD driver issues UNLESS I used their non-WHQL drivers back in the day (those did suck). I've been using AMD cards since the 9700PRO.
 

The 4090 wins big over both last gen AMD and Nvidia products at 4K with RT enabled. It's the best case scenario for the card compared to last gen.

Meanwhile at 1080p it's a mere 16% faster. At lower resolutions Nvidia's 4090 suffers from driver overhead as mentioned in the 4090 review.

Realtime RT is still very much in its infancy. Games have at best 1-2 RT effects that were picked because they are light on performance, and those effects are limited in scope. The 4090 finally brings acceptable RT performance for what games have now, but when game devs start adding more, it will essentially make the 4090 obsolete for those who care about realtime RT. Realtime RT is a blessing for Nvidia, as it makes it extremely easy for them to tout massive gains in performance each generation and cash in.
 
The scalpers are still lurking around? How about they lower the price to 50% of a brand new card?
Who knows if that "second hand" card actually works properly and how many mining hours it has been tortured for.

I bought 4 RX 580s that were from a mining farm in 2018 for $80 / each. Every one of them is still in operation in other people's rigs as they were given as gifts and still work flawlessly.

The chances of you getting a bad mining card are the same as with any other used card.

Is this because of broken games and their code, or will we blame the "slow" CPUs, which never see more than 20% utilisation and load?

According to TPU's review, the 4090 has additional driver overhead.
 
I haven't really been following the prices, but it's nice to see the 6900 XT dropping in price (officially, at least). Looks like AMD is making space for RDNA3 at more competitive prices, or justifying potentially inferior performance compared to the 40-series.

About all this free brand sentiment/free pessimism - it's just freely boring!! Both NVIDIA and AMD are extremely fortunate to have free, unemployed, non-commissioned members of a free society who, out of their own free will and exhausting free determination, provide a free service to staunchly and freely support or market each brand for free. Regardless of "fair play", no matter the cost, the free patriots always come out for free with their expensive sharpened swords and free patriotic flags - and the craziest thing of all, they fund the war too (yep, no freebies there). I'm still trying to wrap my head around where all this freeotic behaviour comes from... I stopped praising GPU manufacturers when the cost of GPUs was no longer acceptable. I'd be more than happy for this segment of the market to hit an all-time decline even if it challenges the very existence of these companies - well, hopefully they will stay put and deliver more reasonable asking prices, which we can all blindly and patriotically chant about whilst surfing the freeotic patriot ship of contentment.
 
Sure, there aren't a lot of RT games, but those games are a big chunk of the main reason you upgrade your GPU. I mean, I only played 4-5 games with RT on my 3090, but then again the number of games that even required a 3090 to play properly was 10 at most. The rest I could have played perfectly fine with my older card (like Apex, Warzone, Stray, etc.). So 50% of the games I upgraded for do in fact have RT. And in those, the 6900XT is a 3070 competitor.
A very good point. If you're really interested in RT, you'll want to upgrade. Working exactly as Nvidia intended ;)
 
The scalpers are still lurking around? How about they lower the price to 50% of a brand new card?
Who knows if that "second hand" card actually works properly and how many mining hours it has been tortured for.
I sold 24 mining GPUs on eBay in March, and I sold them with a no-quibble 12-month warranty backed at my own risk and enforced by eBay.
So far, 7 months in, zero returns - nobody has even contacted me. I reckon I sold each card for £25 more than other cards listed simply because I had the confidence to offer a warranty.
I sell my gaming cards without warranty.

ETH mining (done by careful miners who cared about the hardware, efficiency, and their resale value) looked after the cards way better than any gamer would. Regular dust cleaning, careful 24/7 thermal monitoring, open-frames for exceptionally low operating temperatures, undervolted, GDDR6 temps lower than when gaming, and stable unchanging temperatures that meant it wasn't thermal-cycled like a gaming GPU is. My gaming cards get put in a box, never cleaned until they're replaced, and their workload is bursty resulting in frequent temperature and power spikes from idle, in a hotbox, with tons of thermal-cycling on the GPU die, the VRAM, and of course all the thermal pads.

The only caveat to a well-treated mining card is that the fans have been on their whole life. Given that their 24/7 TDP was around half of a gaming load, the fans were running fairly slowly so unlikely to be ruined, and you can replace GPU fans affordably and easily for most models.

AMD's RX 6800+ series all have 16 GB of RAM though, which is very useful for compute applications.
Which "compute applications"? It's nice in theory, and I wish CUDA would die a horrible, proprietary death in favour of open-source APIs, but the reality is that most software developers hook into CUDA.

I've been building and maintaining both CPU and GPU render farms at work for nearly two decades (well, one decade in the case of GPU rendering) and support for OpenCL is hot garbage. The API may be okay but most software wants CUDA so it doesn't matter how much RAM your Radeon has when the application only offers CUDA or CPU compute options. I'm coming from a 3D rendering/animation/modelling side, so perhaps financial/physics simulation software does actually have decent OpenCL support. I can only comment on the stuff my company does.
 
AMD's RX 6800+ series all have 16 GB of RAM though, which is very useful for compute applications.

That being said, the Intel Arc A770 comes with 16 GB at $350, and is probably the price/performance king.
People keep forgetting most games are console games ported to PC.
Want to see your PC brought to its knees? Play a game like Star Citizen; there should be a free-fly event. Just be aware of the TOS - they enforce beyond what they claim they'll enforce, and some people on their "player safety" team are political activists.
 
A very good point. If you're really interested in RT, you'll want to upgrade. Working exactly as Nvidia intended ;)
Remember back when AA and its later implementations were a problem for GPUs? Now even the lowest GPUs handle it flawlessly. We can't stay stagnant; I (we) want all the graphical eye candy. I hope to live long enough to see the day I can't tell the difference between computer-generated imagery and real life (in motion).
 
That being said, the Intel Arc A770 comes with 16 GB at $350, and is probably the price/performance king.
well, if my RX 6700 XT 12 GB price of 450 CHF/$ weren't due to a promotion... (650 CHF/$ regular price), that one would be the price/performance king... :laugh:
but yeah, at $350 the A770 is almost adequately priced (if it's actually priced like that and not $450+, we will see :D)

A very good point. If you're really interested in RT, you'll want to upgrade. Working exactly as Nvidia intended ;)
given the RT application... heck, I even ran the CP2077 benchmark at 1440p and 1620p60 with RT on, and strangely it wasn't a slideshow... a few drops under 30 FPS but nothing unplayable; of course the benchmark might not reflect (hehe, reflect...) normal gameplay though :oops:
although I saw prettier reflection and lighting work in Morrowind :laugh: (joke again, but OpenMW is really awesome!)

Remember back when AA and later implementations of it was as problem for GPUs?
oh, I do remember... but the happiest thing for me is that running a monitor at 3K doesn't need AA (yay, a free performance increase, because nearly all AA algorithms bring a performance drop, even if a small one for some)
 