
AMD Radeon RX 6800 XT

Defending their scalper-priced 3080 purchase, maybe? :)

Saying DLSS made your Nvidia purchase future-proof is, I don't know, dumb? And I'm an RTX 2080 user. If I have to rely on DLSS/lower quality to play future games, I think it's time to upgrade. DLSS future-proof my ####

The first RTX cards were early adopters' fool's gold. You basically bankrolled future development of these techs by overspending on quite mediocre tech at the time :D This is what marketing and naivety do to people.
 
And this is patently false. At the very minimum you can use texture streaming or make levels smaller. There are other methods as well.
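For anyone unfamiliar, here's a minimal sketch of the texture-streaming idea, using Python as pseudocode (all names and numbers are made up, not from any real engine): keep full-resolution textures resident only for nearby objects and stream in smaller mips for distant ones, which caps VRAM use regardless of level size.

```python
# Hypothetical sketch of distance-based texture streaming (not real engine code).
# Nearby objects keep their full-resolution mip in VRAM; distant objects get
# progressively smaller mips, capping total texture memory use.

def select_mip_level(distance: float, base_size: int = 4096, min_size: int = 128) -> int:
    """Halve texture resolution for every doubling of distance beyond 10 units."""
    size = base_size
    d = distance
    while d > 10.0 and size > min_size:
        size //= 2
        d /= 2.0
    return size

# Example: which resolution stays resident for objects at various distances
objects = {"jacket": 2.0, "canister": 15.0, "garbage_bag": 120.0}
for name, dist in objects.items():
    print(f"{name}: {select_mip_level(dist)}px texture resident")
```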

Speaking of the RT performance in Watch Dogs: Legion on AMD cards:



I'm not sure they are comparable yet. Something is definitely missing. :cool:
Not just the loss of detail on the jacket and pants, but an entire garbage bag? And the too-shiny canisters? Also, it looks like they need to adjust the color gamut on the Radeon card.
AMD.jpg
Nv.jpg
 
I really hope the Samsung node Nvidia is using isn't as crappy as people make it out to be, because otherwise, if Nvidia moves to a smaller TSMC process, AMD will get spanked again. I hope that's not the case, and that with RDNA3 they will be on par with Nvidia in both performance and features across the board. People think I'm an Nvidia fanboy, but I currently own both an AMD CPU and an AMD GPU. I'm just jaded after the 5700 gave me headaches for months. I don't approach AMD with rose-tinted glasses, and before this I had Nvidia GPUs almost exclusively, bar one HD-series GPU. I know both sides of the equation.
I don't believe Samsung's node is crappy, but from what we're seeing today it may turn out to be less power efficient. There doesn't appear to be any huge improvement/difference between 8 nm and 7 nm, other than, I'd guess, Nvidia getting a better cost per chip, especially if they end up with a lot of cut-down dies. The rumored RTX 3070 Ti (Super) might be there to fill out more bins.
 
The first RTX cards were early adopters' fool's gold. You basically bankrolled future development of these techs by overspending on quite mediocre tech at the time :D This is what marketing and naivety do to people.
You sure generation two (3080) is better?! Can't say I am.
 
You sure generation two (3080) is better?! Can't say I am.

The RTX 3000 series made quite a big improvement in RT performance despite using the same number of RT cores. Compare the RT core count of the 2080 Ti with the RTX 3080 (it's the same), then compare performance. You also get half the tensor cores with the same DLSS performance as the 2080 Ti, which likewise points to a big improvement.
The best comparison is the RTX 3070 vs. the RTX 2080 Ti: 46 vs. 68 RT cores and the same performance in RT games. Both GPUs have the same floating-point/rasterization performance without RT, so that doesn't influence the RT frame rates. That's roughly a 48% improvement in per-core RT performance (68/46 ≈ 1.48).
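Sanity-checking that figure is simple arithmetic; a quick sketch using only the core counts quoted above:

```python
# Per-core RT improvement, assuming equal RT frame rates and equal
# rasterization performance between the two cards (as stated above).
rt_cores_2080ti = 68
rt_cores_3070 = 46

# Same RT output from fewer cores implies each core does proportionally more work.
per_core_gain = rt_cores_2080ti / rt_cores_3070 - 1
print(f"Per-core RT improvement: {per_core_gain:.1%}")  # ~47.8%
```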
 
The amount of pure hate toward the 6800 is quite insane. Just let the owners of NV cards cool off. Wow, this is crazy.
 
The amount of pure hate toward the 6800 is quite insane. Just let the owners of NV cards cool off. Wow, this is crazy.


It's like they're afraid the leather jacket isn't cool enough to save their hero, and/or they fear e-peen shrinkage... But good news: they can use 100 W more to keep it warm.
 
I know now what my next GPU will be
 
This isn't quite accurate. Yes, you can enable it as you develop and try it out, but if you want to ship a game with it, you need Nvidia's explicit blessing. That's why, even though it's easy to implement, you don't see it get widespread adoption. It's the same for every other RTX/GameWorks feature you see out there.
As in stick a GameWorks logo on it, as in registering in the Nvidia Developer program? Yes. Why would that be an issue? I'd argue that most game devs are registered for both the Nvidia and AMD developer support programs anyway.


Better comparison.
That's not good for the Radeons, at least in Watch Dogs. Thanks for that link. Wow. But that is exactly why I was asking for the RT quality comparison a few days back in another thread.

@W1zzard I know you are quite busy at the moment, but is there a chance for at least a few screenshot comparisons similar to this Polish site's?

Saying DLSS made your Nvidia purchase future-proof is, I don't know, dumb? And I'm an RTX 2080 user. If I have to rely on DLSS/lower quality to play future games, I think it's time to upgrade. DLSS future-proof my ####
Tbh I upgrade almost every gen as well, like you would. But on the other hand, the people I sell or give the older cards to are usually my friends, and they tend to keep their GPUs for a much longer period of time. I bought a GTX 750 Ti for one of them as a birthday gift and he still uses it every day. That's almost 7 years now. For people that do not upgrade every gen, but maybe every other gen or even slower, features like DLSS are a big deal, believe me.
 
As in stick a GameWorks logo on it, as in registering in the Nvidia Developer program? Yes. Why would that be an issue? I'd argue that most game devs are registered for both the Nvidia and AMD developer support programs anyway.


That's not good for the Radeons, at least in Watch Dogs. Thanks for that link. Wow. But that is exactly why I was asking for the RT quality comparison a few days back in another thread.

@W1zzard I know you are quite busy at the moment, but is there a chance for at least a few screenshot comparisons similar to this Polish site's?
It is worth pointing out that these issues were mentioned to reviewers by people from AMD, so it's not like AMD isn't aware of them or is trying to ignore them. The people from PurePC were like "OK, let's see what exactly they mean," and here we are.
My initial assumption is that the game is using the RT preset for the PS5/XSX, with a much lower cutoff point for RT. Either that, or some elements are defaulting to cube maps instead.
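To illustrate what a lower cutoff point would mean (purely a hypothetical sketch, not Ubisoft's actual logic, and the distances are invented): reflections get ray traced only within some maximum distance, and everything beyond that falls back to prebaked cube maps, so a console-tuned preset with a shorter cutoff shifts much more of the scene onto the cheap path.

```python
# Hypothetical illustration of an RT reflection cutoff with cube-map fallback.
# All values are assumptions for illustration; the point is how a shorter
# cutoff pushes more of the scene onto cheap prebaked reflections.

PC_ULTRA_CUTOFF = 100.0  # metres of ray-traced reflections (assumed)
CONSOLE_CUTOFF = 30.0    # much shorter cutoff for a console-class preset (assumed)

def reflection_source(distance_to_surface: float, cutoff: float) -> str:
    return "ray traced" if distance_to_surface <= cutoff else "cube map"

for d in (5.0, 50.0, 150.0):
    print(f"{d:>6.1f} m: PC ultra -> {reflection_source(d, PC_ULTRA_CUTOFF):10s} "
          f"console preset -> {reflection_source(d, CONSOLE_CUTOFF)}")
```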
 
The 3DMark Fire Strike world record for a single graphics card has been broken on one of these AMD RX 6800 XTs at 2650/2000 MHz. Holy performance in benchmarks, Batman! It beat the RTX 3090 and RTX 3080.

https://hwbot.org/submission/4606724_lucky_n00b_3dmark___fire_strike_radeon_rx_6800_xt_47932_marks

I am really excited that AMD has gone back to giving us a chip that we can tweak and squeeze the performance out of with cooling and more volts. We may also see the return of BIOS unlocks, depending on how they are handling shader counts.
 
I don't understand all the hate. As a few people mentioned, I don't think many people expected AMD to be able to churn out a card this competitive with Nvidia's offerings at the high end. The fact that they're now able to offer a card in the same ballpark as Nvidia speaks volumes about their progress. You can nitpick all you want about DLSS and ray tracing and this and that and the other thing, but the bottom line is that Nvidia isn't miles ahead anymore, which means competition is heating up, which is good for us consumers.

Did anybody actually expect AMD's first try at ray tracing to match Nvidia's performance after Nvidia had over a year to refine and improve it? Similar story in terms of raw rasterization performance: did anybody expect AMD to suddenly leapfrog Nvidia and take the performance crown after having been behind for years? Each team seemed to focus on different things. For Nvidia, that appears to have been overall performance as well as ray tracing and DLSS. AMD seemed like they tried to make the best of their superior process node and maximize performance per watt, thermals, power draw, etc. The fact that AMD has gotten as close to Nvidia as they have is a commendable achievement.

As with any other consumer product, which one you decide upon comes down to preference and that depends a lot on what you consider to be important to you. For those that don't give a shit about ray-tracing, Big Navi might look really appealing with its comparable performance, better thermals and lower noise at a slightly lower cost. For those that really want to crank the eye candy, Nvidia's refined RTX implementation and all the other fancy graphics tricks like DLSS and whatnot probably put Ampere ahead. Each comes with a tradeoff that can only be evaluated on a personal level, so spouting left and right that "ABC is better than XYZ" when that opinion is clearly influenced by personal preference (on what is important to have in a video card) and the desire to not feel like you've made (or will make) a bad purchasing decision is just adding fuel to the fire. Let people be happy they bought what they did or are excited about what they're excited about.

For a bunch of enthusiasts that are part of the "PC Master Race", everybody really does a good job of stooping down to the classic "Xbox vs. Playstation" console war levels. Just be happy that innovation will be driven by competition, be sad that availability is currently dogshit, be kind to each other, and be safe from COVID.

PS: And I guess be jealous of @lynx29 for somehow being lucky enough to score both a new CPU and GPU this generation. Congrats dude!
 
I don't understand all the hate. [...]

PS: And I guess be jealous of @lynx29 for somehow being lucky enough to score both a new CPU and GPU this generation. Congrats dude!


Thanks for the shout-out, I still can't believe it myself. I got my Master's in December. I knew it would be somewhat hard finding a job, but I was confident; then COVID hit... I've really been down in the dumps lately, and the jobs I was applying to just vanished. This win has really helped motivate me again.

As Kevin says in The Office, "It's just nice to win one for a change" :rockout:

Also, the Ryzen 5600X wasn't really luck. Amazon/Newegg had a lot of stock, and from what I recall it took a long time to sell out. I even made a topic here on TPU saying they were still in stock; they eventually sold out like 10 minutes after that topic was made. Still beats the 30-second sellout of Nvidia GPUs, though.

I think part of the fanboy wars is that sometimes we enjoy being silly, but then someone takes it too far or too seriously, because the internet is just a bad place for discourse due to lack of context (facial expressions, tone of voice, etc.).
 
Grab a coffee or tea and watch this; very informative.

 
Grab a coffee or tea and watch this; very informative.


Yeah, in that video they mention the fabs at TSMC being overloaded because of the next-gen consoles. I think the Xbox Series X was already announced to not be in regular stock until April 2021... I expect the same for the PS5 and Big Navi cards... so yeah, a crazy storm of releases all at once.
 
Were the review embargoes lifted on the day of release? Seems pretty shady of AMD
 
Were the review embargoes lifted on the day of release? Seems pretty shady of AMD

How so? If they had released a product that was available for preorder, promised something they couldn't deliver, and held back reviews to keep people from backing out, then maybe. But it's a great product; the only issue is card availability.
 
Were the review embargoes lifted on the day of release? Seems pretty shady of AMD


All PC hardware companies do this as far as I'm aware; Nvidia did as well.
 
On the power consumption: I do not believe the average power consumption of 210 W for a 6800 XT. Sorry, I don't believe that. Especially when, under the same testing, the 3080 does ~300 W.
 
On the power consumption: I do not believe the average power consumption of 210 W for a 6800 XT. Sorry, I don't believe that. Especially when, under the same testing, the 3080 does ~300 W.
It's really easy to figure out what the average power draw of the 6800 XT is: just go look at the Gamers Nexus review, skip ahead to the power draw section, do the same for a few other YouTubers, and you get an overall idea. Of course it will vary some, but it might give you a better idea overall.
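If you want to be systematic about it, the averaging is trivial (the numbers below are placeholders, not actual review figures):

```python
# Placeholder gaming power-draw readings for a 6800 XT from several
# hypothetical reviews; substitute the real figures you collect.
readings_watts = [300, 293, 308, 296]

average = sum(readings_watts) / len(readings_watts)
spread = max(readings_watts) - min(readings_watts)
print(f"average: {average:.0f} W, spread: {spread} W across {len(readings_watts)} reviews")
```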
 