
PowerColor Radeon RX 7900 GRE Hellhound

That insane video playback consumption will be a real problem for AMD when people realize it's drawing 4x the power on YouTube, especially in countries with higher electricity prices.
UK & most of Europe will be counting those Watts tbh...
 
That insane video playback consumption will be a real problem for AMD

Let's not put the whole of AMD or all of its products under a common denominator. Some AMD products are great, running with exceptionally low video playback power consumption, while others are bad, but the 7900 GRE in particular is going in the right direction.
The previous Sapphire review showed disastrous power consumption, the worst of them all:

26 February this year:
[chart: video playback power consumption (previous review)]


Today:
[chart: video playback power consumption (this review)]


Radeon RX 6600 and Radeon RX 6600 XT are the best:

[charts: RX 6600 and RX 6600 XT video playback power consumption]


I do hope AMD will fix the awful Counter-Strike 2 performance, because I want my game to be running at 200+ FPS, not at 130ish FPS. :banghead:

From what I've read, by the looks of it there is a thing called DXNAVI which causes the problems.
 
I'm probably unlucky, but my memory won't go over 2500 MHz on my Sapphire Pure.

My maximum score is 74 FPS in Time Spy GT; I guess I lost the lottery :(
 
That insane video playback consumption will be a real problem for AMD when people realize it's drawing 4x the power on YouTube, especially in countries with higher electricity prices.
UK & most of Europe will be counting those Watts tbh...
Let's be real for a second here. What would be a "tolerable" power consumption? 20 W less, like the middle of the table? That's 20 W over one hour = 0.02 kWh, and 0.02 kWh × 0.2 €/kWh = 0.004 € per hour of difference between running this one and one in the middle of the table. Double the price of electricity to 0.4 €/kWh? Still less than one cent per hour of difference between running the two.

Let's run how much YouTube/whatever you'd need to watch to make a 10 € difference: 10 € / 0.004 € per hour = 2500 hours.
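If anyone wants to plug in their own tariff, here's a quick back-of-the-envelope sketch in Python; the 20 W delta and 0.20 €/kWh price are just the assumptions from above, not measured figures:

```python
# Back-of-the-envelope cost of a GPU power-draw delta during video playback.
# Assumed inputs (swap in your own): 20 W extra draw, 0.20 € per kWh.
extra_watts = 20           # extra draw vs. a card in the middle of the table, in W
price_per_kwh = 0.20       # electricity tariff, €/kWh

cost_per_hour = (extra_watts / 1000) * price_per_kwh   # kWh used per hour * price
hours_for_10_eur = 10 / cost_per_hour

print(f"Extra cost per hour: {cost_per_hour:.4f} €")            # 0.0040 €
print(f"Playback hours to reach 10 €: {hours_for_10_eur:.0f}")  # 2500
```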

The relative difference is huge, but it's not something to lose sleep over.
 
Let's not put the whole of AMD or all of its products under a common denominator. Some AMD products are great, running with exceptionally low video playback power consumption, while others are bad, but the 7900 GRE in particular is going in the right direction.
The previous Sapphire review showed disastrous power consumption, the worst of them all:

26 February this year:
[chart: video playback power consumption (previous review)]

Today:
[chart: video playback power consumption (this review)]

Radeon RX 6600 and Radeon RX 6600 XT are the best:

[charts: RX 6600 and RX 6600 XT video playback power consumption]

I do hope AMD will fix the awful Counter-Strike 2 performance, because I want my game to be running at 200+ FPS, not at 130ish FPS. :banghead:

From what I've read, by the looks of it there is a thing called DXNAVI which causes the problems.
True!
I should say RX 6800 (and up), because I own an RX 6700 XT with 12 W power consumption @ video playback... and I'm OK with that :peace:
 
Impressive. Too impressive...
[chart: noise-normalized temperature]

Nearly 10 °C better than the Nitro+, which clearly has a slightly 'beefier' cooler.

Physics is telling me there must be something wrong with the methodology (ambient temps? GPU orientation?),
or
Powercolor is the only AIB using superior TIM and/or mounting pressure.

[photos: Hellhound vs. Nitro+ cooler and PCB teardown comparison]

It almost looks like Powercolor opted for much thinner thermal pads, and the TIM remnants look a different color (different TIM).
So, maybe Powercolor actually did make a superior assembly?


NGL, I'm a little bit miffed at the fact my 'premo' Nitro+ is being beaten (thermally) by some Wet Nose*.

*Someone over at Powercolor is a BattleTech fan.
 
That insane video playback consumption will be a real problem for AMD when people realize it's drawing 4x the power on YouTube, especially in countries with higher electricity prices.
UK & most of Europe will be counting those Watts tbh...

Is this a troll post? This is not a laptop. Many RTX cards idle at 20 W; what is 37 W while watching YouTube, lmao. "Insane"? I don't even know how to respond to that nonsense. People used to use a 100 W lightbulb and think nothing of it; now if your video card uses 10 W more than the other guy's while watching YouTube, it is "insane".

10 W, max 15 W is required. This is a very low-load task; you can't overload the GPU to insane numbers like 40 or 50 watts...

"insane" another troll... just say "it uses 10W more than the competition" to watch video...

---

I see where this narrative is going. I hope these are the same people that don't buy Intel.
 
Is this a troll post? This is not a laptop. Many RTX cards idle at 20 W; what is 37 W while watching YouTube, lmao. "Insane"? I don't even know how to respond to that nonsense. People used to use a 100 W lightbulb and think nothing of it; now if your video card uses 10 W more than the other guy's while watching YouTube, it is "insane".



"insane" another troll...

---

I see where this narrative is going. I hope these are the same people that don't buy Intel.
'Green Mindshare' :laugh:

I won't go so far as to say 'paid agitator', but they do 'learn' from them:
-Pick one point, take it to an extreme, and get everyone focused on it.
(Bonus points for purposefully muddying and conflating data that runs counter to one's argument)
 
DLSS is a valid reason for people to purchase nowadays, unfortunately. FSR straight up looks worse than its competitors. I have an RX 6800 XT myself but will GO OUT OF MY WAY to mod XeSS into games using the Starfield FSR bridge mod because it looks that much better.

Balanced XeSS modded into Alan Wake 2 looked better than FSR Quality.

I wonder if there is an easy list of the 7900 GRE cards by what memory they use. A nice easy chart.
 
DLSS is a valid reason for people to purchase nowadays, unfortunately. FSR straight up looks worse than its competitors. I have an RX 6800 XT myself but will GO OUT OF MY WAY to mod XeSS into games using the Starfield FSR bridge mod because it looks that much better.

Balanced XeSS modded into Alan Wake 2 looked better than FSR Quality.
XeSS works in CP'77. I too thought it looked 'a little better' vs. FSR. However, FSR is actively being improved.
Me, though: CP'77 MTFO is the *only* title where I need anything like FSR/XeSS. @ 1080P my GRE is just about the right level of overkill.
I wonder if there is an easy list of the 7900 GRE cards by what memory they use. A nice easy chart.
Please. Yes.
The GPU (vBIOS) database facilitates this, but it seems like those of us w/ GREs are 'shy' :laugh:

Historically, Samsung is the superior option. However, here... unless there are some major latency/timing differences, it's looking like 20 Gbps GDDR6 from Samsung and Hynix are 'equal' in quality.
My Nitro+ has Hynix RAM and does the same 2600-2614 MHz as seen in the Hellhound review (which was equipped w/ Samsung VRAM).

Entirely unlike what occurred w/ HBM2 on the Vega 56/64, etc. and the RX 580's GDDR5.
Early-production V56 and V64 are all Hynix IIRC, and can barely hit 1 GHz HBM clocks. My Vega 10 16GB MI25s (all Samsung HBM2) can all pull off 1107 MHz or higher.
On Polaris, the couple of Samsung-equipped 580s I had mined more efficiently vs. the Hynix-equipped examples.


I almost grabbed a Hellhound off Amazon when my Gigabyte GRE from NE came out of the box in pieces.
Kinda regretting that after seeing the markedly better thermal performance and Samsung VRAM
-but there were no trusted reviews on the Hellhound.

*Advertising* purple fans was a bit of an aesthetic turn-off; plus, I'm more a Periphery Freebirth, not some canister-born Clanner :D
 
DLSS is a valid reason for people to purchase nowadays, unfortunately. FSR straight up looks worse than its competitors. I have an RX 6800 XT myself but will GO OUT OF MY WAY to mod XeSS into games using the Starfield FSR bridge mod because it looks that much better.

Balanced XeSS modded into Alan Wake 2 looked better than FSR Quality.

I wonder if there is an easy list of the 7900 GRE cards by what memory they use. A nice easy chart.
Last time I checked, Intel GPUs don't have DLSS, and it's not listed under their "cons".

FSR or XeSS being better or worse than each other is irrelevant, since both of them and DLSS are garbage. It's like having a contest where everyone dies and the rotten corpse of the winner is exhibited as if it were any better than the other two.
 
Last time I checked, Intel GPUs don't have DLSS, and it's not listed under their "cons".

FSR or XeSS being better or worse than each other is irrelevant, since both of them and DLSS are garbage. It's like having a contest where everyone dies and the rotten corpse of the winner is exhibited as if it were any better than the other two.
Not the exact PoV I'd take but...
Agreed

XeSS and FSR are non-proprietary and 'algorithmic'.
DLSS is 100% nVidia-proprietary, MI-based, and 'trained' by nVidia on a title-by-title basis.

I'd 'argue' that DLSS is 'better' but...
A. It's something entirely different, functionally.
B. All these 'intelligent' scaler/pixel generators are crap.
You lose so much detailed definition on faces and objects/textures, and can feel 'the fuzziness' in motion. Whatever happened to 'crispness'?


People seem to think the GRE is 'overkill' or 'excessive' for 1080P gaming; they call it 'a 1440P card'.
I disagree. It *is* the 1080P card.
Once you subtract out DLSS/FSR/XeSS, modern titles need the grunt to perform smoothly.
 
Not the exact PoV I'd take but...
Agreed

XeSS and FSR are non-proprietary and 'algorithmic'.
DLSS is 100% nVidia-proprietary, MI-based, and 'trained' by nVidia on a title-by-title basis.

I'd 'argue' that DLSS is 'better' but...
A. It's something entirely different, functionally.
B. All these 'intelligent' scaler/pixel generators are crap.
You lose so much detailed definition on faces and objects/textures, and can feel 'the fuzziness' in motion. Whatever happened to 'crispness'?


People seem to think the GRE is 'overkill' or 'excessive' for 1080P gaming; they call it 'a 1440P card'.
I disagree. It *is* the 1080P card.
Once you subtract out DLSS/FSR/XeSS, modern titles need the grunt to perform smoothly.
The problem is that everyone is so busy declaring a winner that the fact they are all garbage is lost in the discussion.

Example: https://www.techpowerup.com/review/starfield-xess-1-2-vs-dlss-3-vs-fsr-3-comparison/


How can you say any of them are ANY good when you have a massive, screen-tall artifact staring at you in the middle of the screen? Do you know which one wins? NO UPSCALING, because it doesn't introduce artifacts, blurriness and so on!

At least now the media outlets have stopped hammering on the "better than native" BS. My fucking god, it was like being in They Live and only a handful of us had the glasses.
 
The V-Sync 60 Hz test stood out to me in this one, and I was wondering if anyone had any insight on it.

[chart: V-Sync 60 Hz power consumption (power-vsync1.png)]


It's a huge difference between two cards with the same GPU on the same driver. The Nitro+ only sits at 103 W in this test, so it's not simply a case of the Hellhound drawing more power across the board.
The graph for the Hellhound shows much more variation in power draw from data point to data point than any of the other GRE cards. I almost thought it may have been a testing error because the difference is so huge, but I assume that would have been retested on being noticed?

Is this likely to be a BIOS issue, maybe unfixable because as far as I know PowerColor doesn't update the BIOS on its cards, or something drivers could sort out?

I'd also be curious to know if this sort of behavior happens in other games when the framerate is capped, because this would have a huge effect on the overall power efficiency of the card when not running at 100%, which I would argue most games don't unless you run everything with uncapped FPS regardless of screen sync. I'm still learning about power draw on cards, so any insight is appreciated.

Edit: Adding an overlaid chart of the power data-point graphs to show what I mean above about the strange behavior and how much more variation there is at V-Sync than in any other review I've seen on this site.

[chart: power-consumption1.png (overlaid power draw data points, Hellhound vs. Pure)]
 
The V-Sync 60 Hz test stood out to me in this one, and I was wondering if anyone had any insight on it.

[chart: V-Sync 60 Hz power consumption]

It's a huge difference between two cards with the same GPU on the same driver. The Nitro+ only sits at 103 W in this test, so it's not simply a case of the Hellhound drawing more power across the board.
The graph for the Hellhound shows much more variation in power draw from data point to data point than any of the other GRE cards. I almost thought it may have been a testing error because the difference is so huge

Is this likely to be a BIOS issue, maybe unfixable because as far as I know PowerColor doesn't update the BIOS on its cards, or something drivers could sort out?

I'd also be curious to know if this sort of behavior happens in other games when the framerate is capped. I'm still learning about power draw on cards, so any insight is appreciated.
My immediate guess:
Powercolor purposefully altered the frequency/voltage curve to minimize 'power state stutter'*. No gamer or enthusiast will ever want to use VSYNC; it's a huge 'latency add'. The only time I've EVER used VSYNC is when a game's lighting or physics got 'messed up' by excessive framerate.

(IMEO) Basically, the vBIOS is set to 'hold' higher clocks and voltages, rather than so aggressively downclocking for momentary 'power savings'.

In my experience across Polaris-Vega-Navi, the bigger the delta in moment-to-moment reported clocks, the more you 'feel' some kind of hitching/stuttering.
It's a well-documented cause of high Avg and good Min FPS but a poor 'experience' (due to frametime inconsistency).
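Not how the review measures it, just a rough illustration of why the average can look fine while the experience doesn't: compare the average frametime against the slowest 1% of frames in a frametime log. The numbers below are made up; a real log would come from a capture tool such as CapFrameX or OCAT.

```python
# Rough illustration: a log with occasional frametime spikes can post a higher
# average FPS than a perfectly steady one, yet feel worse. The frametime lists
# are invented for the example, not taken from any review data.
from statistics import mean

def summarize(frametimes_ms):
    """Return (average FPS, FPS implied by the slowest 1% of frames)."""
    avg_fps = 1000 / mean(frametimes_ms)
    slowest = sorted(frametimes_ms, reverse=True)
    worst_1pct = slowest[: max(1, len(slowest) // 100)]
    return avg_fps, 1000 / mean(worst_1pct)

steady = [10.0] * 1000               # 10 ms every frame: a locked 100 FPS
spiky  = [9.0] * 990 + [40.0] * 10   # mostly faster, but with 40 ms hitches

for name, log in (("steady", steady), ("spiky", spiky)):
    avg, low = summarize(log)
    print(f"{name}: avg {avg:.0f} FPS, 1% low {low:.0f} FPS")
```

The spiky log actually wins on average FPS, but its 1% lows collapse, which lines up with the "high Avg, poor experience" pattern described above.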


Notably, this is tangentially referenced in the review: 'minimum clocks' in the OCing portion.
MOST people are not so 'latency sensitive' and cannot quantify the 'difference' without extensive and damned-near pedantic testing.
With cursory and standardized testing, there is no quantifiable difference (other than power consumption).

Personally,
the difference between leaving 500 MHz minimum clocks vs. 2000 MHz minimum clocks on my GRE is very noticeable.
I feel less 'frametime jitter' @ 2 GHz min. core clocks.



*I bought a Nitro+ 7900 GRE for the multiple vBIOSes.
I don't want to be the first to try it, but I am interested in "cross-flashing".
 
My immediate guess:
Powercolor purposefully altered the frequency/voltage curve to minimize 'power state stutter'. No gamer or enthusiast will ever want to use VSYNC; it's a huge 'latency add'. The only time I've EVER used VSYNC is when a game's lighting or physics got 'messed up' by excessive framerate.

(IMEO) Basically, the vBIOS is set to 'hold' higher clocks and voltages, rather than so aggressively downclocking for momentary 'power savings'.

In my experience across Polaris-Vega-Navi, the bigger the delta in moment-to-moment reported clocks, the more you 'feel' some kind of hitching/stuttering.
It's a well-documented cause of high Avg and good Min FPS but a poor 'experience' (due to frametime inconsistency).


Notably, this is tangentially referenced in the review: 'minimum clocks' in the OCing portion.
MOST people are not so 'latency sensitive' and cannot quantify the 'difference' without extensive and damned-near pedantic testing.
With cursory and standardized testing, there is no quantifiable difference (other than power consumption).

Personally,
the difference between leaving 500 MHz minimum clocks vs. 2000 MHz minimum clocks on my GRE is very noticeable.
I feel less 'frametime jitter' @ 2 GHz min. core clocks.

I had just edited my post above, as I forgot to include the image I made overlaying the Hellhound's and the Pure's power consumption graphs, which is relevant.

In it you can see that there is actually much larger variance from data point to data point than with the others, which makes the average much higher as a result. If what you're saying about the min clock holds true, and the minimum clock speed slider not mattering much is also noted in the other GRE reviews, I'd have thought it would be the opposite, with fewer spikes moment to moment.
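Purely to illustrate that point about spikes pulling the average up (the traces below are invented, not taken from the review data): two series with the same baseline draw, one of which periodically jumps, end up with very different averages and spread.

```python
# Illustration only: two made-up power traces with the same ~95 W baseline.
# The one that frequently spikes ends up with a much higher average draw.
from statistics import mean, pstdev

steady_trace = [95.0] * 100                                         # holds ~95 W the whole run
spiky_trace  = [160.0 if i % 4 == 0 else 95.0 for i in range(100)]  # jumps to 160 W on every 4th sample

for name, trace in (("steady", steady_trace), ("spiky", spiky_trace)):
    print(f"{name}: avg {mean(trace):.1f} W, stdev {pstdev(trace):.1f} W")
```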

Side note: I use V-Sync in most games because I am annoyingly sensitive to screen tearing, and for some reason, despite my TV reporting that it is Adaptive Sync compatible, it never seems to work.
 
I had just edited my post above, as I forgot to include the image I made overlaying the Hellhound's and the Pure's power consumption graphs, which is relevant.

In it you can see that there is actually much larger variance from data point to data point than with the others, which makes the average much higher as a result. If what you're saying about the min clock holds true, and the minimum clock speed slider not mattering much is also noted in the other GRE reviews, I'd have thought it would be the opposite, with fewer spikes moment to moment.

Side note: I use V-Sync in most games because I am annoyingly sensitive to screen tearing, and for some reason, despite my TV reporting that it is Adaptive Sync compatible, it never seems to work.
Point! :D

Still, it's likely 'related'; perhaps just for a different reason.
It *does* look like it's 'aggressively and rapidly clock shifting', for some reason.

(W/o further info/testing)
I'd go so far as to theorize this is still from a 'gaming tune' curve.

I know I'm not alone in my utter disdain for VSYNC.
VRR display technologies were unironically a dream come true for me and many gamer-enthusiasts.
The card may be 'freaking out' at the improper use case.

Edit:
Speaking towards your issue w/ your Adaptive Sync...
I'm *very* tearing-sensitive too, but even more so latency-sensitive.

There's a 'trick' to getting it to work smoothly on older and poorly-implemented Adaptive Sync displays:
You *cannot* allow your framerate to exceed "max refresh - 1 Hz", nor allow your minimum framerate to drop below "bottom of the VRR range + 1 Hz".

Setting up a per-profile Radeon Chill or framerate cap of 143 FPS on my ancient 32" 144 Hz screen made a huge difference when I was still running a Vega. On my GRE, I can notice FreeSync range excursions in CP'77 and BattleTech.
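For anyone who wants to apply the same 'trick', here's a minimal sketch of the arithmetic; the 144 Hz refresh and 48 Hz VRR floor are example values (check your own display's actual VRR range), and the cap itself still has to be set in the driver or game:

```python
# Minimal sketch of the VRR "trick" above: cap the framerate just under the
# panel's max refresh and treat the bottom of the VRR range (+1) as the floor
# you don't want framerates to dip below. Example values, not queried from any API.
def vrr_targets(max_refresh_hz: int, vrr_floor_hz: int) -> tuple[int, int]:
    fps_cap = max_refresh_hz - 1    # e.g. 143 FPS on a 144 Hz panel
    fps_floor = vrr_floor_hz + 1    # stay above the bottom of the VRR window
    return fps_cap, fps_floor

cap, floor = vrr_targets(max_refresh_hz=144, vrr_floor_hz=48)
print(f"Cap the framerate at {cap} FPS and keep minimums above {floor} FPS")
```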
I'll be 'addressing' those minor issues after I get the 24.3.1 drivers back onto my system
(Doing F@H for Team EHW @TM, req. a driver w/ full OpenCL support)
 
Point! :D

Still, it's likely 'related'; perhaps just for a different reason.
It *does* look like it's 'aggressively and rapidly clock shifting', for some reason.

(W/o further info/testing)
I'd go so far as to theorize this is still from a 'gaming tune' curve.

I know I'm not alone in my utter disdain for VSYNC.
VRR display technologies were unironically a dream come true for me and many gamer-enthusiasts.
The card may be 'freaking out' at the improper use case.

Edit:
Speaking towards your issue w/ your Adaptive Sync...
I'm *very* tearing-sensitive too, but even more so latency-sensitive.

There's a 'trick' to getting it to work smoothly on older and poorly-implemented Adaptive Sync displays:
You *cannot* allow your framerate to exceed "max refresh - 1 Hz", nor allow your minimum framerate to drop below "bottom of the VRR range + 1 Hz".

Setting up a per-profile Radeon Chill or framerate cap of 143 FPS on my ancient 32" 144 Hz screen made a huge difference when I was still running a Vega. On my GRE, I can notice FreeSync range excursions in CP'77 and BattleTech.
I'll be 'addressing' those minor issues after I get the 24.3.1 drivers back onto my system
(Doing F@H for Team EHW @TM, req. a driver w/ full OpenCL support)
Re-checked the Efficiency page of the review vs. the other GRE cards, and it does have higher minimum clocks, but that still doesn't seem to fit the weird spikes in the V-Sync test, and the high minimum clocks on other Hellhound models don't seem to cause this behavior. Unless it's just an issue with minimum clocks that high on the GRE in particular.

I also presume that this test was only run on the performance BIOS, and I'd be curious whether the same behavior is seen on the quiet BIOS, or whether it could be overridden on the user end.

I'm also not a fan of V-Sync; I've just learnt to live with it through decades of gaming on consoles and cheap screens, and it helps that I play mostly third-person action games. And my TV isn't even a two-year-old model (bought less than a year ago); it's just being dodgy.
 
The higher media playback power thing is, I think, related to forcing the modules to stay turned on. I'm not sure how to put it (lol), but the whole Threadripper lineup has an insane idle power requirement as well, due to the nature of the chip, the cache and all that.

It just disappoints me that the whole 7x00 series has been locked out from the use of MPT and such. You could easily uncork the cards to an excessive 600 W power consumption if you wanted to, and break some records with that.
 
Impressive. Too impressive...
[chart: noise-normalized temperature]
Nearly 10 °C better than the Nitro+, which clearly has a slightly 'beefier' cooler.

Physics is telling me there must be something wrong with the methodology (ambient temps? GPU orientation?),
or
Powercolor is the only AIB using superior TIM and/or mounting pressure.
It stuck out to me too, but actually PowerColor's 7900 GRE Hellhound is performing similarly to the 7800 XT Hellhound, which was only modestly better than the competition. The Sapphire 7900 GRE Nitro cooler should be close too, but is substantially worse now than when it was used on the 7800 XT (the fan speed graph shows why).
[chart: noise-normalized-temperature.png]
 
Out of curiosity, why are the 4080 Super results not included in the charts?
 
Out of curiosity, why are the 4080 Super results not included in the charts?
Because it's the same as the 4080, so there is no point.
 
Out of curiosity, why are the 4080 Super results not included in the charts?
What @Pumper said; it's kinda pointless, but now I'm realizing that people might not be aware that it's pointless, so it's been added to the summary graphs, and I'll include it in the future, too.
 