# XFX Radeon RX 5700 XT THICC III Ultra



## W1zzard (Oct 31, 2019)

The XFX RX 5700 XT THICC III Ultra is a brand new triple-fan design from the company, which runs higher clocks, too. XFX listened to criticism and improved the memory cooling plate, reducing temperatures and noise levels significantly, which makes the THICC III one of the quietest RX 5700 XT cards out there.

*Show full review*


----------



## dj-electric (Oct 31, 2019)

I hate being a broken record, but it seems like more and more premium 5700 XT cards just can't match the noise-to-temp ratio of the simple $409 Sapphire Pulse.
What an awful waste of money.


----------



## Cheeseball (Oct 31, 2019)

Noise-to-temp, yeah, it fails at that. But at least they took heed of Gamers Nexus' points (especially after their rep blamed them on Reddit) and improved on this one.

I feel sorry for those who got the THICC II, though; hopefully they can return those cards, since it's only been a little over a month since the THICC II was released.


----------



## somethinggeneric (Oct 31, 2019)

Why is "no hardware-accelerated raytracing" listed as a con here? This is something none of these cards have or likely ever will, so it seems like an entirely moot point that could even be argued as a good thing, given the value of RTX.

Perhaps you explained this somewhere, but I can't find it; just curious.


----------



## Jism (Oct 31, 2019)

60 watts more on average for about 5% more FPS compared with the AMD reference model. AMD already squeezes out the best efficiency and doesn't leave much headroom before power consumption rises through the roof. Not a bad card overall, though.


----------



## W1zzard (Oct 31, 2019)

somethinggeneric said:


> Why is "no hardware-accelerated raytracing" listed as a con here? This is something none of these cards have or likely ever will, so it seems like an entirely moot point that could even be argued as a good thing, given the value of RTX.
> 
> Perhaps you explained this somewhere, but I can't find it; just curious.


There are people out there who aren't aware. I know that you guys commenting are experts. This is the same thing as "no analog VGA on DVI" back in the day, or "no integrated graphics" on Ryzen. I'm trying to educate people here; it would be easy to just hide it and not face your criticism.

Also, it's not impossible that raytracing will take off in a year or two, driven by next-gen consoles and NVIDIA. All I want is for people to think about the points I make and come to their own conclusions.


----------



## 64K (Oct 31, 2019)

somethinggeneric said:


> Why is "no hardware-accelerated raytracing" listed as a con here? This is something none of these cards have or likely ever will, so it seems like an entirely moot point that could even be argued as a good thing, given the value of RTX.
> 
> Perhaps you explained this somewhere, but I can't find it; just curious.



One thing to consider is that most people don't buy a card just for today's games. If that were the case, then hardware-accelerated ray tracing wouldn't be a big deal, but in my experience the average person buys a card about every 2 generations, or 4 years. It's my belief that ray tracing is coming no matter what, because the next-gen consoles will have hardware-accelerated ray tracing as well. It won't be full real-time ray tracing, of course; it will be limited by what the next-gen consoles can handle, but it most likely is the future.

Something to consider when buying a new GPU. I have already decided to rule out any GPUs for my next upgrade that don't have hardware-accelerated ray tracing capability.


----------



## Anymal (Oct 31, 2019)

It's mos def a con, since you can easily turn it off and voilà, no RTX.


----------



## the54thvoid (Oct 31, 2019)

I don't normally get a hard-on for hardware, but I think XFX has nailed that design. Good to see the noise controlled, though shame about the power. Pulling more than a 2080 Ti in gaming is awful.


----------



## Totally (Oct 31, 2019)

W1zzard said:


> There are people out there who aren't aware. I know that you guys commenting are experts. This is the same thing as "no analog VGA on DVI" back in the day, or "no integrated graphics" on Ryzen. I'm trying to educate people here; it would be easy to just hide it and not face your criticism.
> 
> Also, it's not impossible that raytracing will take off in a year or two, driven by next-gen consoles and NVIDIA. All I want is for people to think about the points I make and come to their own conclusions.



I never took issue except with "no CUDA support"; it's really not a feature aimed at the general public, and those interested in it know well enough.


----------



## W1zzard (Oct 31, 2019)

Totally said:


> I never took issue except with "no CUDA support"; it's really not a feature aimed at the general public, and those interested in it know well enough.


Agreed, back in the day there was no alternative and it looked like GPU-accelerated compute could become a killer feature for consumers. Now, with OpenCL around, and limited use of GPU acceleration in apps, this is a complete non-issue.


----------



## Assimilator (Oct 31, 2019)

W1zzard said:


> There are people out there who aren't aware. I know that you guys commenting are experts. This is the same thing as "no analog VGA on DVI" back in the day, or "no integrated graphics" on Ryzen. I'm trying to educate people here; it would be easy to just hide it and not face your criticism.
> 
> Also, it's not impossible that raytracing will take off in a year or two, driven by next-gen consoles and NVIDIA. All I want is for people to think about the points I make and come to their own conclusions.



Regardless of whether ray-tracing succeeds or not, and regardless of whether it delivers playable frame rates on today's cards or not, it's just responsible journalism to inform buyers about what this product has or lacks versus its competitors, in order to allow readers to make the best-informed decision.


----------



## W1zzard (Oct 31, 2019)

Assimilator said:


> to inform buyers about what this product has or lacks versus its competitors, in order to allow readers to make the best-informed decision.


that's all i'm trying to do, every single day


----------



## Dave65 (Oct 31, 2019)

dj-electric said:


> I hate being a broken record, but it seems like more and more premium 5700 XT cards just can't match the noise-to-temp ratio of the simple $409 Sapphire Pulse.
> What an awful waste of money.


Agreed, the best 5700 XT for the price/performance ratio.


----------



## nguyen (Nov 1, 2019)

Seems like The Surge 2 results have been fixed; it was odd seeing the 2060S get 2x the FPS of the 5700 XT lol


----------



## kmetek (Nov 1, 2019)

Stay away from XFX. Whoever has to deal with their RMA support within the EU after the 2-year mandatory retail warranty may get hit with customs taxes (they send replacement cards/PSUs from China).
I had a PSU RMA fiasco with them; they were not willing to take responsibility for covering the potential customs taxes on the replacement PSU. In the end they tell you to buy a new one..... 5-year warranty, and not even 3 had passed.
XFX is a NO GO for me, forever.


----------



## phill (Nov 1, 2019)

Lovely review there @W1zzard. The 5700 XT looks to me to be a decent card, and the price isn't too bad; it's just a shame that it loses efficiency when it comes to this card. I wonder why the jump?

Would be a card I'd definitely consider


----------



## W1zzard (Nov 1, 2019)

phill said:


> I wonder why the jump?


From the review: It seems in order to achieve stability at their high out-of-the-box clocks, XFX has bumped up the power limit quite a bit, and possibly voltage, too.


----------



## phill (Nov 1, 2019)

W1zzard said:


> From the review: It seems in order to achieve stability at their high out-of-the-box clocks, XFX has bumped up the power limit quite a bit, and possibly voltage, too.



Apologies there @W1zzard, I missed that... Well, you can't have more performance without increasing something, I guess.

Would it be a card (the 5700 XT in general, rather than this XFX card) that you'd consider yourself, W1zzard?


----------



## W1zzard (Nov 1, 2019)

phill said:


> Would it be a card (the 5700 XT in general, rather than this XFX card) that you'd consider yourself, W1zzard?


Of course, otherwise I wouldn't give it an award


----------



## EarthDog (Nov 1, 2019)

somethinggeneric said:


> Why is "no hardware-accelerated raytracing" listed as a con here? This is something none of these cards have or likely ever will, so it seems like an entirely moot point that could even be argued as a good thing, given the value of RTX.
> 
> Perhaps you explained this somewhere, but I can't find it; just curious.


I'm curious if it is listed as a con for all AMD cards then???

EDIT: It is, what do ya know?!!!


----------



## phill (Nov 1, 2019)

W1zzard said:


> Of course, otherwise I wouldn't give it an award



Just checking


----------



## rrrrex (Nov 1, 2019)

-Large overclock out of the box
        But +10% clocks gives only +3% performance

-7 nanometer production process
        But it's even worse than the GeForce 1000 series in terms of power efficiency


----------



## Mouth of Sauron (Nov 7, 2019)

There's no reason users who buy a card TODAY should worry about ray-tracing. Consoles will have it, but PC games? There are what, 2 games that use different approaches, and none does full ray-tracing? What about the development cycle? Blizzard "considers" ray-tracing for D4. When will it be out? I'd say at least 3 years...

Besides, the gains from hardware acceleration are somewhere between probably and maybe. Every current CPU and GPU can do the math needed. Whether a change to the CUs would bring a gain (or, indeed, whether it is possible at all) is one question; another is whether it sacrifices something else for it, e.g. slowing down something that is actually used.

NVIDIA uses it as a questionable marketing trick; 'early adopters' just get what they typically get: an option they don't need. Yet. In two years users, regardless of knowledge, will probably be looking for a new GPU anyway, and that is when raytracing (and its implementation, and its natural slowness as a property of the algorithm) can be an issue.

Other than that, good review.


----------



## Anymal (Nov 7, 2019)

We can assume that NVIDIA factored in the time they need to test it in real-life apps and to properly grease big game developers. Time, time, time... money, money, money...


----------



## EarthDog (Nov 7, 2019)

Mouth of Sauron said:


> There's no reason users who buy a card TODAY should worry about ray-tracing. Consoles will have it, but PC games? There are what, 2 games that use different approaches, and none does full ray-tracing? What about the development cycle? Blizzard "considers" ray-tracing for D4. When will it be out? I'd say at least 3 years...
> 
> Besides, the gains from hardware acceleration are somewhere between probably and maybe. Every current CPU and GPU can do the math needed. Whether a change to the CUs would bring a gain (or, indeed, whether it is possible at all) is one question; another is whether it sacrifices something else for it, e.g. slowing down something that is actually used.
> 
> ...


There are already several titles out with RT, and more (AAA titles too) are planned.


https://www.digitaltrends.com/computing/games-support-nvidia-ray-tracing/


I don't think any title does "full" RT of a scene. That was never its intent.


----------



## Mouth of Sauron (Nov 8, 2019)

"Are planned" is the key here. I mentioned Blizzard just as an example; the pipeline for releasing games takes years. Look around and the majority of titles are still DirectX 11...



EarthDog said:


> I don't think any title does "full" RT of a scene. That was never its intent.



No, that *has* to be the goal, or it isn't ray-tracing. We have supplements already... The reason they are implemented partially is most likely the inability of any GPU today to do real-time ray-tracing. People (outside the circle of those who have actually done some rendering) usually make this mistake: there is no partial ray-tracing. Each light ray is either calculated completely (with all recursions) or it is not. No middle ground. I *did* say 'none full', meaning it's enough. True, *some* stuff may look better... and it can also look better on any other CPU/GPU, depending on calculation capacity.

Don't get me wrong, it's just... that I was into this (rendering methods) for a long time. There is just one full, or 'pure', ray-tracing method. There are a few 'shortcuts' to do less math, but the difference is typically quite noticeable. Also, ray-tracing is not necessarily photo-realism. I, too, am excited to see this huge step in computer graphics, but I am also a bit skeptical about it happening that soon: more objects, vertices, light sources, shadows = exponentially more calculation. The first true ray-traced game will need to reduce something (most likely light sources; next, reflective surfaces, and so on). Yet they can still look good... We'll see, in time...


----------



## EarthDog (Nov 8, 2019)

Mouth of Sauron said:


> "Are planned" is the key here. I mentioned Blizzard just as an example; the pipeline for releasing games takes years. Look around and the majority of titles are still DirectX 11...
> 
> 
> 
> ...


Just look up the list of expected titles, bud.

As far as splitting hairs on RT and what NVIDIA is doing... I just don't care. Nobody claimed it was full-scene RT, man... so, coolio, but not really a talking point.


----------



## Mouth of Sauron (Nov 9, 2019)

Did you, by chance, read anything except the first sentence? I'll shorten it:
- Ray-tracing does a LOT of math
- To be called ray-tracing, it must trace every light ray, with recursions
- Implementing it on a whole scene with lots of light sources and reflections requires either enormous computing power or... well, a scene that is simple by design, which does not look good on "non-ray-tracing" (TM NVIDIA) GPUs
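The exponential cost that list points at can be sketched with a toy ray counter. This is purely illustrative (the function and its parameters are my own, not taken from any real renderer): each surface hit can spawn secondary rays (reflection, refraction), so the work per primary ray grows roughly as branching^depth.

```python
def rays_for_one_primary(depth=0, max_depth=4, branching=2):
    """Count the rays evaluated for a single primary ray.

    `branching` models how many secondary rays each intersection spawns;
    real scenes also add one shadow ray per light source at every hit,
    which this sketch ignores.
    """
    if depth >= max_depth:
        return 1  # recursion terminated: this ray only
    # this ray, plus every secondary ray it spawns, recursively
    return 1 + sum(rays_for_one_primary(depth + 1, max_depth, branching)
                   for _ in range(branching))

if __name__ == "__main__":
    per_pixel = rays_for_one_primary()           # 31 rays at depth 4, branching 2
    print(per_pixel)
    # One extra recursion level roughly doubles the work:
    print(rays_for_one_primary(max_depth=5))     # 63
    # A 1080p frame then multiplies that by ~2 million pixels:
    print(1920 * 1080 * per_pixel)
```

This is exactly why the partial implementations shipping today cap the recursion depth and restrict which surfaces are allowed to spawn secondary rays.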


----------



## savas (Nov 13, 2019)

W1zzard said:


> There are people out there who aren't aware. I know that you guys commenting are experts. This is the same thing as "no analog VGA on DVI" back in the day, or "no integrated graphics" on Ryzen. I'm trying to educate people here; it would be easy to just hide it and not face your criticism.
> 
> Also, it's not impossible that raytracing will take off in a year or two, driven by next-gen consoles and NVIDIA. All I want is for people to think about the points I make and come to their own conclusions.



I see where both sides are coming from. May I suggest, then, an alternative third section for exactly these types of issues? Something more like a yellow/warning-symbol section noting that it has no RT cores. A "bad" for AMD because it doesn't have NVIDIA RTX? Seems wrong/inaccurate.


----------



## namesurename (Dec 17, 2019)

Hello, a little update from somebody who bought the actual card.
In short: it is a disaster.

Tested as-is in the Time Spy benchmark: GPU temp up to 90, edge temp 100+, loud as a jet engine (3000 rpm max).
Tried to undervolt it and set a fan profile. That did help, but not much; still absolutely unhealthy temps, and the fans even at 1600 rpm can't cool it.

So I remembered what Gamers Nexus' Steve did, and repasted my card.
The memory modules are properly covered with thermal pads and everything, and the GPU die and heatspreader are in proper contact based on the thermal paste pattern... But the thermal paste itself... Well, I can't call it a paste; it was a dry, chunky, plastic substance.

After that, with a healthy undervolt of 1905 MHz at 1052 mV (maybe a bit lower, I think, but at 950 it crashes), a less aggressive fan profile, and a -5% power limit, it's at last quiet and still retains 95% of its performance. Still 75 GPU and 86 hotspot, but it's fine. At least it's a decent card now.

But still, for a really big heatsink (and it is BIG) to have issues, I think they really cheaped out on the heat pipes.
Or maybe I just got unlucky?


----------



## Firestorm1439 (Mar 6, 2020)

I bought this card about 2 weeks ago. The first thing I noticed is that the card boosts way above the boost clock target, often reaching 2080 MHz for extended periods. Temperatures in an environment with a 26-degree ambient are (GPU: 75-78, MEM: 75, VRM: 68). So far I am super impressed; perhaps I just won the silicon lottery with this example. I haven't had any issues with driver stability either.


----------



## Anymal (Mar 6, 2020)

Lol, is ambient so hot before or after gaming?


----------



## Firestorm1439 (Mar 6, 2020)

Anymal said:


> Lol, is ambient so hot before or after gaming?



LOL, before. It was 36 degrees outside today; I may as well be in the desert.


----------

