
Sapphire Radeon RX 5700 XT Nitro+

They are clearly pushing the clock speeds way beyond efficiency on the XT model, it does not mean a bigger Navi in the same clock speed domain has to be inefficient.
If you're talking about the same architecture revision as I am, then yes, it kinda has to.
 
You can only reliably and consistently reduce power consumption by reducing clocks, and if you reduce them too much on a larger chip, you get a pointless product that is only as fast or slightly faster than a smaller one while being more expensive to produce. In other words, a "big Navi", if it ever materializes (and that's a big if), will be another 300W+ gas guzzler, no doubt about it :rolleyes:
 
You can only reliably and consistently reduce power consumption by reducing clocks, and if you reduce them too much on a larger chip, you get a pointless product that is only as fast or slightly faster than a smaller one while being more expensive to produce. In other words, a "big Navi", if it ever materializes (and that's a big if), will be another 300W+ gas guzzler, no doubt about it :rolleyes:
Actually, it's kinda common to reduce the clocks slightly when adding more shaders. But then you'd need a significantly bigger die and 7nm production capacity is still at a premium.
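The trade-off being argued here (wide die at lower clocks vs. small die pushed hard) can be sketched with a toy dynamic-power model, P ∝ shaders × V² × f, where voltage has to rise with frequency. All the shader counts, clocks, and voltages below are illustrative guesses, not AMD specs:

```python
# Toy model of the wide-and-slow vs. small-and-fast trade-off.
# Numbers are assumptions for illustration, not real silicon data.

def rel_power(shaders, freq_ghz, volt):
    """Relative dynamic power in arbitrary units: shaders * V^2 * f."""
    return shaders * volt**2 * freq_ghz

def rel_perf(shaders, freq_ghz):
    """Idealized throughput: shaders * clock (assumes perfect scaling)."""
    return shaders * freq_ghz

# "5700 XT-like": 2560 shaders pushed to 1.9 GHz at a guessed 1.2 V
small_power = rel_power(2560, 1.9, 1.2)
# Hypothetical "big Navi": 4096 shaders relaxed to 1.4 GHz at ~1.0 V
big_power = rel_power(4096, 1.4, 1.0)

# In this toy model the wider chip is both faster and lower power,
# because backing off clocks lets voltage drop and power falls with V^2.
print(round(rel_perf(2560, 1.9)), round(rel_perf(4096, 1.4)))
print(round(small_power), round(big_power))
```

Of course, as noted above, the real constraint is that the bigger die costs more and 7nm wafer capacity is scarce, which this toy model ignores.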
So Navi is good, better than I expected, but in its current form it's no threat to the 2080. It's also the reason Nvidia didn't feel the need to lower prices, despite AMD claiming its pricing was a clever marketing ploy.
 
AMD already has a tendency to push the clocks on reference designs close to or beyond the efficiency sweet spot. There's not much headroom left on the clocks, which translates into about 3% more performance. It would be cooler to see a 10% uplift, though. Perhaps the memory timings could be tuned, or the memory clocks pushed even further. The base of this card is good; cooling and the VRM aren't a problem.
 
At well over 200W of consumption for mid-range performance, even the base of this card (the XT) can't be considered good, more like barely passable, while these OCed cards are simply terrible. The only relatively good card is the base 5700 (non-XT) model, but decent variants will have to come down close to $300 to become real options.
 
The extra power draw on these AIB cards is ludicrous. Seeing all these new 7nm cards drawing so much power, it's as if nothing has changed from Vega. Is anyone making a near-stock card with better cooling? Where I live we have the world's dearest electricity, and I'm not going to use 20% more power for a piddling 3% more fps. I just want a great cooler.
 
Navi cards only make sense if you buy the non-overclocked variants. They are maxed out as it is. This also shows there won't be an RX 5800 or similar to compete with higher Nvidia offerings, but an RX 5600 or so to compete in the lower tier is a good possibility and a route they should explore.
 
The extra power draw on these AIB cards is ludicrous. Seeing all these new 7nm cards drawing so much power, it's as if nothing has changed from Vega. Is anyone making a near-stock card with better cooling? Where I live we have the world's dearest electricity, and I'm not going to use 20% more power for a piddling 3% more fps. I just want a great cooler.

Switch it to the Quiet BIOS and the power consumption is barely above reference.
I've expanded the software BIOS switch section in the conclusion, as there has been some confusion about it on Reddit.

Could you guys do power consumption testing with the three BIOS configs if possible?
 
You guys and your 200W IS HIGH complaint.
That power draw rarely occurs in gaming, and mostly appears in 4K gaming. Even Gears of War at 1440p on Navi rarely goes beyond 150W.
The power draw at 1080p is even more ridiculously low. One enterprising AIB partner could simply make a card that draws power only from the PCIe slot, and it would (theoretically) work!!
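For reference, the PCIe spec caps an x16 slot at 75W, and a slot-only card has to keep even transient spikes under that, not just the average. A quick sanity check (the wattages here are made-up examples, not measurements):

```python
# PCIe x16 slot power limit per the spec (a known fact): 75 W.
PCIE_SLOT_LIMIT_W = 75.0

def fits_slot_only(avg_draw_w, peak_draw_w):
    """A card with no power connectors must keep even short
    transient peaks under the 75 W slot budget, not just the average."""
    return peak_draw_w <= PCIE_SLOT_LIMIT_W

# Illustrative guesses: even if 1080p *averages* ~70 W, power spikes
# well above that would rule out a slot-only design.
print(fits_slot_only(70.0, 120.0))  # False
print(fits_slot_only(50.0, 70.0))   # True
```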
 
It's not 200W though, is it? Especially not on the AIB cards, and that's the AVERAGE; peaks can be above 300W. And before you say anything, I trust this site far above some anonymous screeching forum user who claims otherwise.
[Attached chart: power-gaming-average.png]
 
You guys and your 200W IS HIGH complaint.
That power draw rarely occurs in gaming, and mostly appears in 4K gaming. Even Gears of War at 1440p on Navi rarely goes beyond 150W.
The power draw at 1080p is even more ridiculously low. One enterprising AIB partner could simply make a card that draws power only from the PCIe slot, and it would (theoretically) work!!
So why does the review peg the average power draw in gaming at 266W for this card?
 
But I am an actual user.
I actually use it for gaming sessions.
And I know what I have been observing for almost two months now.

4K, 60 FPS, observe the power draw at the top left
 
But I am an actual user.
I actually use it for gaming sessions.
And I know what I have been observing for almost two months now.

4K, 60 FPS, observe the power draw at the top left
Nobody's calling you a liar. But we don't know what you're playing, or which areas you're looking at. Hell, we don't even know what card you own. So...

Also, if you cap the rendering at 60fps (which is imho wise), you're not stressing the card. You're just telling it to take a break after rendering those 60fps.
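The "take a break" point can be sketched as a simple frame limiter: if the GPU finishes a frame in 4 ms but the cap allows 16.7 ms, the hardware idles for the remaining ~12 ms of every frame, which is why capped power draw is so much lower. The timings below are made-up for illustration:

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.7 ms per frame at a 60 fps cap

def render_frame():
    # Stand-in for GPU work: a fast card finishes well under budget.
    time.sleep(0.004)  # pretend the frame took 4 ms to render

def run_capped(frames):
    """Render `frames` frames under the fps cap; return total wall time."""
    start = time.perf_counter()
    for _ in range(frames):
        frame_start = time.perf_counter()
        render_frame()
        spare = FRAME_BUDGET - (time.perf_counter() - frame_start)
        if spare > 0:
            time.sleep(spare)  # the hardware idles here -> lower power draw
    return time.perf_counter() - start

# 10 frames take ~10 * 16.7 ms no matter how fast the "GPU" is;
# roughly 12 of every 16.7 ms are spent idling rather than rendering.
elapsed = run_capped(10)
print(round(elapsed, 3))
```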
 
Nobody's calling you a liar. But we don't know what you're playing, or which areas you're looking at. Hell, we don't even know what card you own. So...

Also, if you cap the rendering at 60fps (which is imho wise), you're not stressing the card. You're just telling it to take a break after rendering those 60fps.
99% load with 97C on GPU isn't stressing?
 
It might be undervolted, we don't know, but one way or another, I'm not putting my faith in a video with 168 views from a guy with 48 subscribers :rolleyes:
 
It might be undervolted, we don't know, but one way or another, I'm not putting my faith in a video with 168 views from a guy with 48 subscribers :rolleyes:

Dear lord that is a stupid mentality....
But then your previous comment, the unneeded addition of the word "screeching", already painted a picture.
 
Yes, let's all start taking random obscure Youtubers as a point of reference, screw all the established reviewers, what do they know... :rolleyes:
 
You are aware that AMD's sensors report GPU chip-only power, not card/board power?
I'm pretty sure he's not. I know I wasn't.
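To make the chip-vs-board distinction concrete: total board power adds memory, VRM conversion losses, fans, etc. on top of what the on-die sensor reports. The memory wattage, VRM efficiency, and misc figure below are rough guesses, not official numbers:

```python
# Rough illustration of chip power vs. board power.
# mem_w, vrm_eff and misc_w are illustrative assumptions, not AMD data.

def estimate_board_power(asic_w, mem_w=30.0, vrm_eff=0.88, misc_w=5.0):
    """Very rough total-board-power estimate from the sensor's
    chip-only reading: add memory, divide by VRM efficiency,
    add fans/LEDs/etc."""
    return (asic_w + mem_w) / vrm_eff + misc_w

# A reported ~180 W "GPU power" could plausibly mean ~240 W at the wall
# side of the card, which is why software readouts undershoot reviews.
print(round(estimate_board_power(180), 1))  # ~243.6
```

This gap is exactly why an overlay reading can look much lower than a review's measured board power.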
 
So why does the review peg the average power draw in gaming at 266W for this card?

Are they measuring the same game?

TPU said:
We use Metro: Last Light as a standard test for typical 3D gaming usage because it offers the following: very high power draw; high repeatability; is supported on all cards; drivers are actively tested and optimized for it; supports all multi-GPU configurations; test runs in a relatively short time and renders a non-static scene with variable complexity.
 
Heh, first-wave Navi sure is funky. Neither the reference cards nor the AIBs feel as bulky as 99% of the Vega roster, but unlike the top-of-the-line Vegas, even the coldest Navi XT is still above 70 Celsius...
Also, this ain't gonna make me many friends :D but in my pretty shoddy opinion, Vega Nitro design > Navi Nitro design. Didn't dig this one for some reason, liked that one more. Not as generic, but not overly flashy either... In comparison, the Polaris Nitro is interesting to say the least (veeery boxy), and the earlier 300 series... they actually looked real solid!
 
That's concentrated heat for you - the 5700 XT consumes only slightly less power on average than Vega, yet the die size is almost halved, meaning there is that much less surface area to transfer heat away...
Oh, and I agree that the Vega Nitro design was one of the best in recent years.
 
Is lack of hardware ray tracing really a con?
 