
AMD Radeon RX 6800 XT

You responded to a post about how "fanboyism is bad" with the biggest AMD fanboy page on the internet - HardwareUnboxed? Oh come on. You can't trust their results at all. Their results have been significantly tilted in favor of AMD, compared to the rest of the internet, for their entire existence.

I'd trust their results over your post.

You are in an RX 6000 review thread, you're pro-Nvidia, and you don't intend to buy the hardware.

That applies to the other NV fanboys in this thread too. Why are you here, other than thread crapping?

AMD-sponsored games are a small percentage of all the games coming out, so recommending Nvidia is actually a very safe bet for GPU longevity.

Both consoles are RDNA 2, which means you will see more games supporting it from the ground up.
 
This isn't entirely fair either. Nvidia tweaks performance over time too. They are engineers, not posthuman genetically engineered beings who can see the future. They still need time.

As for AMD - they have a smaller team, so long-term performance does need more work, for sure. This is where the whole Fine Wine thing came from.
It is a good thing, though. As long as AMD prices products on their performance at release, it is A-OK to have drivers improve them further. It means you paid a fair price once and get better performance long term. That is my thought process, and I applied it to Turing too (which really did improve nicely over time).

AMD's driver team also has a different philosophy - there are different teams working on subsequent drivers, or even on different matters. I don't know how good the communication and management are there, but I find it weird. One team can squash a bug in one driver revision while the other doesn't, and it resurfaces. Why use alternating teams for driver releases? Why not just one team with different departments - one going after these bugs, another after those?
 
Does anyone find it ironically hypocritical that a person shitposting all over AMD is calling other people fanboys, while at the same time being some stalwart defender of everything Intel and Nvidia?
 
The dilemma of the Nvidia fanboy.

It used to be that they denied the existence of "Fine Wine"; now, apparently, their drivers do improve performance over time. Man, this is so strange.

No, AMD just uncovers performance that was already there. They don't magically create additional performance. That's also why AMD usually had higher raw power on paper but failed to deliver. Imagine a situation where you have two cars, both with 120 HP, but two different drivers: one is quick to learn, so he basically uses the car's full potential from the start; the other is a slow learner and delivers the same performance only after a longer time.
 
No, AMD just uncovers performance that was already there. They don't magically create additional performance. That's also why AMD usually had higher raw power on paper but failed to deliver. Imagine a situation where you have two cars, both with 120 HP, but two different drivers: one is quick to learn, so he basically uses the car's full potential from the start; the other is a slow learner and delivers the same performance only after a longer time.
Comedy double-down. I wouldn't mention power draw; some never learn!
 
I'll tell you why: because some insist a 50% performance hit is so much better than a 60% hit.

No, they're both crap. RT is still not ready for prime time.

I don't understand people like you who misrepresent facts just to prove a point.


In the worst-case scenario, the 3080 drops 43%. Once you account for the improvements from DLSS, the penalty is nowhere near as bad as you suggest.
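For scale, here's the raw arithmetic behind those percentages; the 100 FPS baseline and the DLSS uplift figure are purely illustrative assumptions, not benchmark numbers:

```python
# Purely illustrative: baseline FPS and DLSS uplift are assumptions, not measurements.
baseline_fps = 100

for drop_pct in (43, 50, 60):
    remaining = baseline_fps * (1 - drop_pct / 100)
    print(f"{drop_pct}% RT hit: {remaining:.0f} FPS")
# 43% -> 57 FPS, 50% -> 50 FPS, 60% -> 40 FPS

# If DLSS claws back, say, 40% on top of the RT-on result (assumed figure):
rt_on = baseline_fps * (1 - 0.43)
print(f"43% hit + hypothetical 40% DLSS uplift: {rt_on * 1.40:.0f} FPS")  # ~80 FPS
```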
 
I don't understand people like you who misrepresent facts just to prove a point.


In the worst-case scenario, the 3080 drops 43%. Once you account for the improvements from DLSS, the penalty is nowhere near as bad as you suggest.

You are right, sir, excuse me.

-43%? Now that sounds absolutely amazing compared to 50 or 60. Losing almost half the performance is pretty good. What can I say, you have completely changed my mind; I'm sold.
 
That's interesting :D Then why are all the biggest titles of today implementing raytracing? Why is AMD rushing out its own raytracing implementation (even though it's inferior to Nvidia's)? Why do the consoles now support raytracing? :D

It's an ignorant take. Console launch titles are already implementing raytracing, and somehow it's not "ready for prime time"?

Most PC games will either be console ports or co-developed. I think raytracing performance is going to matter a lot in next-gen games.

There's also this little title called Cyberpunk, which will utilize raytracing...
 
I see this as RTG's "Zen" moment. Is this XT "above" the Nvidia RTX 3080? No. That said, this is much more than just "nipping" at the heels; this is stride for stride. This is "competition", and what we always looked for! AMD/RTG can make a marked play for sales numbers and begin to rival Nvidia. At this point, RTG has found the momentum and only needs to focus hard and run its race.

Looking at "supposed issues" with Nvidia and their bigger GA102 Samsung 8mn and if those are inducing their issues with yield/supply. We can't say for certain but it could be/continue it doesn't bode well for Nvidia. While sure RTG has there own struggles, I don't see this initial release consideration as a long-term problem. AMD/RTG juggling their demands at TSMC has its own challenges, but direct supplier/yields is probably not one of them. AMD/RTG has probably got "Zen 3" CPU channel loaded, and for the last couple of weeks been loading "Navi 21" parts for reference boards and to AIB's. I say RTG is set well to grab the after Christmas funds.

Either side... either card if you can find one in your cart... lucky you! :toast:
 
I really hope the Samsung node Nvidia is using isn't as crappy as people make it out to be, because otherwise, if Nvidia moves to a smaller TSMC process, AMD will get spanked again. I hope that's not the case, and that with RDNA 3 they will be on par with Nvidia in both performance and features in all fields. People think I'm an Nvidia fanboy - I currently own both an AMD CPU and an AMD GPU. I'm just jaded after the 5700 gave me headaches for months. I don't approach AMD with rose-tinted glasses, and before it I had Nvidia GPUs almost exclusively, bar one HD-series GPU. I know both sides of the equation.
 
Fixed, thanks! GPU-Z has the wrong value, too

I seriously believe something has gone wrong with the benchmarks here, which don't seem to agree with the majority of the major YouTube channels that benched the GPU...

In some games the 6800 even takes the lead over the 6800 XT, while the general difference between the two seems to be about 2 FPS, and on top of that the 3070 (a GPU that trades blows with the 2080 Ti) comes out ahead of even the RX 6800 XT...

You should seriously check whether something is wrong with your test bench, hardware-wise (e.g. is dual channel enabled? Is there more than 16 GB of RAM when testing the 6800 XT? Is the CPU a flagship one?), or with the drivers, in Windows and on the card itself...
 
I literally have 8 GB cards and this game. Your source is wrong or incompetent. 8 GB does indeed have issues in this game at 4K.
Sure, you are well competent. It scales pretty much proportionally. So your source is grossly incompetent.
[charts: Doom Eternal benchmarks at 1920x1080, 2560x1440, and 3840x2160]
 
No, AMD just uncovers performance that was already there. They don't magically create additional performance. That's also why AMD usually had higher raw power on paper but failed to deliver. Imagine a situation where you have two cars, both with 120 HP, but two different drivers: one is quick to learn, so he basically uses the car's full potential from the start; the other is a slow learner and delivers the same performance only after a longer time.


Did you literally take my "raw power" comment as meaning higher wattage? Are you smoothbrained? What I wanted to say is that AMD GPUs often had better specs on paper - more ROPs, SMs, higher core clocks, etc. - but were somehow slower than, or merely on par with, Nvidia. Now do you understand?
No, you're so wrong it's not right. In the right application, AMD showed its performance: Folding@home, mining, Doom.
Nvidia usually had higher-sounding core counts, more ROPs, and higher boost clocks. Were you on another planet this last decade?
 
RT-off performance is pretty good: with the higher OC headroom (8% vs 4%) and the speed deficit being around 5% while $50 cheaper, it's right where it should be. But RT performance is terrible, and with no DLSS 2.0 equivalent, that's a big disadvantage. However, all those games have been optimized for Nvidia, so we'll have to see how future titles and future game updates do. Personally, I'm not convinced that 8 GB is a big disadvantage, as I haven't seen any proof; VRAM utilization doesn't prove anything, I'm talking FPS proof. That said, a 6800 with 8 GB VRAM for $480, or a 6800 XT for $550, would be a great value/performance product. Unfortunately, it will never happen.
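A quick back-of-the-envelope perf-per-dollar check on that, using the $699/$649 launch MSRPs and taking the ~5% RT-off deficit above at face value:

```python
# Back-of-the-envelope only: perf numbers are the rough figures cited above.
cards = {
    "RTX 3080":   {"price_usd": 699, "rel_perf": 1.00},  # normalized baseline
    "RX 6800 XT": {"price_usd": 649, "rel_perf": 0.95},  # ~5% slower RT-off
}

for name, c in cards.items():
    value = c["rel_perf"] / c["price_usd"] * 1000
    print(f"{name}: {value:.2f} relative perf per $1000")
# RTX 3080:   1.43
# RX 6800 XT: 1.46  -> slightly better value RT-off, before RT/DLSS enter the picture
```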
 



No, you're so wrong it's not right. In the right application, AMD showed its performance: Folding@home, mining, Doom.
Nvidia usually had higher-sounding core counts, more ROPs, and higher boost clocks. Were you on another planet this last decade?

Remember GTX 1060 vs RX 480? At launch the GTX 1060 was winning despite lower TFLOPS on paper, and then, as GCN/Polaris matured, the RX 480's performance improved and drew level with the GTX 1060. That's what I'm talking about.
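For reference, the paper numbers behind that comparison, using peak FP32 = 2 ops per FMA x shader count x clock (specs from memory, so treat them as approximate):

```python
# Peak FP32 throughput: 2 FLOPs per FMA * shader count * boost clock (Hz).
def tflops(shaders: int, boost_mhz: int) -> float:
    return 2 * shaders * boost_mhz * 1e6 / 1e12

print(f"GTX 1060: {tflops(1280, 1708):.1f} TFLOPS")  # ~4.4 (1280 CUDA cores)
print(f"RX 480:   {tflops(2304, 1266):.1f} TFLOPS")  # ~5.8 (2304 stream processors)
```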
 
Sure, you are well competent. It scales pretty much proportionally. So your source is grossly incompetent.
[charts: Doom Eternal benchmarks at 1920x1080, 2560x1440, and 3840x2160]


I literally have the game. IDK where W1zzard is testing, but any and all late-game levels perform worse on 8 GB cards.

I personally test on Urdak, since it's a heavy and awesome map (perhaps the third best in the game).
My 2080 and 5700 XT both choke there. The otherwise inferior (at lower resolutions) 1080 Ti outperforms them. It loses that lead if you use lower texture settings. It's therefore VRAM.

I'd love it if people actually owned the hardware and games before talking bull :P
 
The experience is now streamlined, meaning that game engines like Unreal Engine and Unity can support it out of the box. That, in turn, means a lot more adoption for DLSS going forward.

This isn't quite accurate. Yes, you can enable it as you develop and try it out, but if you want to ship a game with it, you need Nvidia's explicit blessing. That's why, even though it's easy to implement, you don't see it get widespread adoption. It's the same for every other RTX/Gameworks feature you see out there.
 
And this is patently false. At the very minimum, you can use texture streaming or make levels smaller. There are other methods as well.
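To illustrate the texture-streaming idea, here's a minimal sketch of a VRAM-budgeted streamer; every name and number in it is hypothetical, not from any real engine:

```python
# Hypothetical sketch: distance-based mip selection with LRU eviction under a VRAM budget.
from collections import OrderedDict

class TextureStreamer:
    def __init__(self, budget_mb: float = 8192):  # e.g. an 8 GB card
        self.budget_mb = budget_mb
        self.resident: OrderedDict[str, float] = OrderedDict()  # tex id -> MB, LRU order

    def _mip_for_distance(self, distance_m: float) -> int:
        # Farther objects get higher (smaller) mip levels; thresholds are made up.
        return min(int(distance_m // 20), 10)

    def request(self, tex_id: str, base_size_mb: float, distance_m: float) -> None:
        mip = self._mip_for_distance(distance_m)
        size_mb = base_size_mb / (4 ** mip)  # each mip level quarters the footprint
        # Evict least-recently-used textures until the requested mip fits the budget.
        while self.resident and sum(self.resident.values()) + size_mb > self.budget_mb:
            self.resident.popitem(last=False)
        self.resident[tex_id] = size_mb
        self.resident.move_to_end(tex_id)

streamer = TextureStreamer()
streamer.request("rock_albedo", base_size_mb=64, distance_m=5)     # near: full-res mip
streamer.request("cliff_albedo", base_size_mb=64, distance_m=120)  # far: much smaller mip
```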

Speaking of the RT performance in Watch Dogs: Legion on AMD cards:

[screenshot: raytracing in Watch Dogs: Legion on AMD]


I'm not sure they are comparable yet. Something is definitely missing. :cool:

Better comparison.
 
Yeah, either something fishy is going on in Watch Dogs: Legion, or the difference between AMD's RT and Nvidia's RT is night and day. AMD probably decreased detail to avoid completely hammering the frame rate. I wonder if these quality differences affect other RT games. Have reviewers compared screenshots from both the AMD and Nvidia RT implementations under close scrutiny? Maybe there are differences like the ones in Watch Dogs: Legion in other games too?
 
Some are a bit too emotional, like they have something at stake.
I'm not even sure what you are all arguing about anymore. The fact of the matter is that both GPU vendors will sell everything they make for months, because demand outstrips supply.

Defending their scalper-priced 3080 purchase, maybe? :)

Saying DLSS made your Nvidia purchase future-proof is, I don't know, dumb? And I'm an RTX 2080 user. If I have to rely on DLSS/lower quality to play future games, I think it's time to upgrade. DLSS future-proofed my ####
 
Woww... nice! Great job! kudos to AMD! :rockout:

I hope that after reading this article, many people who pre-ordered an RTX 3000 card will change their minds, cancel their pre-orders, and turn to an AMD card :D
Since I'm 41st in the queue now, I hope that way I'll get my new RTX 3080 before Christmas :clap: ... LOL!!!



P.S. Just kidding! :roll:
 