
PowerColor Radeon RX 7900 GRE Hellhound

W1zzard

PowerColor's Radeon RX 7900 GRE Hellhound comes with a fantastic cooling solution that's the best of all the GRE cards we've tested so far. Not only are temperatures low, but noise levels are also outstanding, and the dual BIOS capability lets you turn the card into a whisper-quiet gaming machine.

Show full review
 
You can now go higher with 24.3.1 in the OC section. Thanks for the review, looks like a nice card.
 
You can now go higher with 24.3.1 in the OC section. Thanks for the review, looks like a nice card.
Confirmed, it goes up to 3000 MHz now. Testing right now, will update the review soon
 
Very high power consumption, though.

[Power consumption chart]


Meanwhile, its direct FPS competitor is significantly less power hungry:
[Competitor power consumption chart]


Very low Counter-Strike 2 performance, still not fixed. Waiting for a driver.
Does anyone know why RDNA 3 is broken in CS2? :banghead:

[Counter-Strike 2 performance chart]
 
Confirmed, it goes up to 3000 MHz now. Testing right now, will update the review soon
Am intrigued. Curious how it affects other models (such as the Nitro+) with higher PL as well? Also, is this just the core or did the ram limits change? Thanks for doing the extra work; we appreciate you.

------------------

I am quite bothered by both news articles and this review saying nVIDIA has no such artificial limitations. You and I both know that's not true. I don't know how educated Toast et al are on the matter.

While the SLIDERS may not be limited, nVIDIA is quick to lock down both voltage and power limit to segment their cards very specifically (a recent example being the 4070 Ti Super 16GB), thereby reducing the usefulness of those sliders and the overall practicality/longevity of those products. They also tailored the process (N5->'4N') to focus on efficiency/area to (apparently) not scale (well) with voltage past the optimal power/frequency curve (which likely saved them money in silicon costs), whereas it would appear AMD did not. If you want to applaud nVIDIA's designs for those decisions, that's a personal opinion, but so should praise be given (by enthusiasts) when a product can be pushed past the curve. So to exclaim one thing, when the actual opposite is true, is simply disingenuous. To excise those notions from the reality of the situation while pointing out the slider issue (which I'm not saying you shouldn't; I'm glad you [and HWU] did; it puts pressure on AMD to give customers the right and ability to achieve the best conceivable outcome if they choose to OC their cards) is ignorant, if not 'picking a side'.

There should be just as much pressure on nVIDIA to raise voltage levels to wherever it will literally stop scaling (~1.1v/3200mhz? on n5/'4n'?), and to raise their power limits (especially on expensive/overbuilt cards with VRM/coolers that can most certainly handle the extra heat and sustained higher frequency under load). I know I'm not the only one that feels this way, although it has become clear less and less of the customer base actually understands it (I know you do), but I might be the only one that will say it to someone (that would be reviewers/notable HW youtubers such as yourself) that could actually put public pressure on said company to change their anti-consumer philosophies so that everyone can make the most out of their cards (and cpus...but that's a fight for another day), their (planned) stacks (and early obsolescence) be damned. Cards should be able to perform the best they possibly can (just like a 7900xtx with a ridiculously high voltage [1.2v?/each card should be power-limited to the power connector spec IMO] to max it out at ~3100-3200mhz or whatever the case may be) because that is the AIB and consumer prerogative wrt the capability of their coolers/VRM, the (sometimes binning of) actual silicon, and how much that perf is worth using how much power; it should not be on the manufacturer (beyond safeguards to avoid actual damage; but you, I, and many others know the set levels are much lower than that and are not for that reason) to make that decision for us, especially under the guise of safety when it's really all about (them making more) money.

To condone, abide, or allow ANY company to limit OCing on cards otherwise very capable, only to push customers to spend more money (that most never will) and limit their conceivably 'best' experience, if they choose to spend the time to tinker, is anti-consumer, and is in fact hurting this scene. It already has. I don't think companies understand how much negativity surrounds them by these limitations, nor does AMD apparently remember how much positive consumer sentiment their 'old ways' (of bios flashing first shipments of cut-down cards to the higher-end products, lack of lockdown on higher voltage/PL, not limiting clocks within bios) gave them within the (knowledgeable/) enthusiast community; it's literally how (IMO) they built their (positive) reputation. The trickle down IS real. WRT the current issue, I'm glad AMD is being kept more honest; more true to the soul of what old AMD/ATi products used to be about. Limits should be above (conceivable) capability, especially wrt how boost works, but I think people can (realistically) work with 3000 in this case.

nVIDIA may likely never change; their goal is to hope consumers don't understand they do these things or forget they ever could be done. They should be consistently scrutinized for that; don't let them 'win' and let it be forgotten, only for the sake of their coffers and consumers having a less-than-optimal experience with lower-end hardware (that may be the best they could ever afford) in hopes to upsell them to something (overpriced) they may not actually need were things different.

If companies like AMD/nVIDIA feel they need to limit their overclocking potential (and AIBs' right to use a better cooler/higher voltage/power level), my message to them would be to focus on making better cards and a better (innate hardware) stack in which their own cards don't threaten each other. Also, while not limiting cards may allow some overlap (which is VERY MUCH why these limitations exist; to create specific markets that sometimes do not line up with consumer needs but do line up with those companies making higher margins), it's not like MOST people will ever go that far or even think about it that much when making a decision to buy a card. People like myself are a minority (and even more so now, sadly). It likely won't move the needle much in high-end sales as I think/know that most people generally look at stock performance when picking any product. The (overclocking) capability is what builds enthusiast sentiment, or loses it. Similar could be said of shorter/thinner/low-power versions of cards of old using a larger chip than generally intended but reduced voltage/power to coincide with the fewer components and/or cooler to fit in a smaller space. All of those areas have been affected by (largely nVIDIA) meddling, and not only should it not be forgotten, it should be actively scrutinized (as I believe many AIBs would love to offer those niche options across all spectrums), as those customers do/would exist and should be allowed to have options; they shouldn't have to fit into nVIDIA's (et al) pre-ordained box; completely controlled wrt (what size system to use and when to upgrade) ecosystem and rung on the current and future ladder.

As we've seen from these choices, it has done nothing but hurt the knowledge/interest in the workings of these chips by a larger audience, and both companies (but mostly nVIDIA) should be shunned and ashamed for their part in that. I'm glad when AMD (sometimes) dials it back, as they've been known to do with enough pressure. nVIDIA will make bullshit excuses; but let them, and I/we can call them on it.

That's how sentiment should work.

So yeah, please don't ever say nVIDIA does not have any artificial limitations. Love and respect you, but it pisses me off very much, not only because it's wrong, but because many very much trust your words.

Thanks for your consideration. I know these 'battles' can be tough, and many don't feel the need/desire to fight them, but it must be done for the betterment of the consumer/enthusiast space.

You know, the betterment of (and options for) the PEOPLE, not the COMPANIES (and ability to force upgrades and/or demand higher margins).

At the end of the day, all these things are very real, and we've seen options/value evaporate over the years. At some point you have to pick a side when you know something (better) can be done about it.

You have to ask yourself, if you're knowledgeable enough about the capability of such products and remember (capabilities and options of) the past...Do you know which side you're on?

Because I certainly do, and it's pretty freaking lonely sometimes.

I don't blame people that don't know any better, but that is a planned construct created by these companies to influence the thinking of what's possible wrt the current/next generation...and they're winning.

It's on us that remember, and know things can be better, to actually do/say something about it. If not for the benefit of us (that may be able to pony up the cash for higher-end or more frequent upgrades), then for that next generation of people that, given the chance/opportunity/understanding, may be more like us. Or, perhaps, how some of us used to be. Before we got old, tired, and/or sold out.

(By 'sold out', I'm talking Anand, Shrout, Kyle, etc leaving the space to work for the companies they used to keep in check. I'm not trying to imply you sold out [I'm trying to explain many people that kept that space 'honest' and enthusiast sentiment alive were bought up and/or took opportunities at Apple, Intel, AMD, Asus, etc], but the opposite; you're still here putting in the hard work with hard numbers and honest [if subjective] opinions. That's why what you say and do is more important than ever...You're one of the few [and in my mind, very important] OGs that are still around, and I truly respect/appreciate that very much! I just want things to be the best [and people best-informed] they can be from the people that they trust the most [which you've earned]. Not trying to imply there are not areas you are right and I am wrong, or we can't have different subjective opinions; simply that sometimes blanket statements can give people the wrong impression [which is not a flaw that escapes myself on an on-going basis], and being thorough in a review that affects many people's decisions is perhaps more important than one guy spouting thoughts on a forum. I get there's something to being concise [which obviously isn't a talent of mine] and not every issue can be touched on at every single moment; just throwing it out there as something to consider given your position and willingness to write a lengthy conclusion!)
 
Also, is this just the core or did the ram limits change?
This is just the ram limits

nVIDIA is quick to lock down both voltage and power limit to segment their cards very specifically (a recent example being the 4070 Ti Super 16GB
[TDP adjustment limit chart]

Nothing really locked down. Or do you mean the +0% cards that are +0% because the AICs cheaped out?
 
Wow, how is that cooler so effective at only 1200 g? Its W/°C is like 50% better than the other 7900 GRE coolers at about the same weight (rough math below). And on par with RTX 4090 coolers twice its mass.

Edit: actually this cooler is about the same as the Hellhound 7800 XT (no surprise, as W1zzard mentioned this chip is designed to fit those existing PCBs and coolers) and on par with several other 7800 XT coolers. I forget that the Nvidia chips are harder to cool and can't be directly compared to AMD.

I like that, in addition to a well-tuned fan curve, it has a low 600 RPM minimum fan speed on the quiet BIOS setting, for extra flexibility with custom fan profiles. I wish other AIB partners would follow suit.
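For anyone wondering where that W/°C figure comes from, here's a minimal sketch of the calculation: board power divided by the GPU's temperature rise over ambient. The numbers below are made-up placeholders for illustration, not TPU's measured values.

```python
# Rough cooler-efficiency comparison: watts dissipated per degree Celsius of
# temperature rise over ambient. All figures are illustrative placeholders.
AMBIENT_C = 25.0

coolers = {
    # name: (board power in W, GPU temperature in °C, heatsink mass in g)
    "Hellhound 7900 GRE": (260.0, 62.0, 1200),
    "Generic 7900 GRE":   (260.0, 75.0, 1250),
}

for name, (power_w, temp_c, mass_g) in coolers.items():
    delta_t = temp_c - AMBIENT_C
    w_per_c = power_w / delta_t  # higher = more heat removed per °C of rise
    print(f"{name}: {w_per_c:.1f} W/°C at {mass_g} g")
```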
 
It is an excellent card; the only downside for me is the lack of RGB.
 
Very high power consumption, though.

View attachment 340195

Meanwhile, its direct FPS competitor is significantly less power hungry:
View attachment 340197

Very low Counter-Strike 2 performance, still not fixed. Waiting for a driver.
Does anyone know why RDNA 3 is broken in CS2? :banghead:

View attachment 340196

[Relative performance chart]

The 4070 is slower than both the 7800 XT and GRE by a good margin. Plus, why manipulate information by showing consumption peaks instead of averages?
[Average power consumption chart]


CS2 runs at 700 FPS if you have a good processor and turn off 8x MSAA.
 
Would honestly just save up for a month longer and get the 7900XT. If they didn't gimp the 7900GRE so much it would be tempting.
 
Would honestly just save up for a month longer and get the 7900XT. If they didn't gimp the 7900GRE so much it would be tempting.
It's fine if tuned, but the 7900 XT is a great buy, that's true.

Thanks @W1zzard for retesting OC with the higher VRAM limit. It's quite clear this GPU is very bandwidth-starved, even more than the 7900 XT is at 4K (quick math below). Again it is puzzling why AMD chose to detune the VRAM to 2250 MHz - it's 2500 MHz G6 (I checked multiple reviews to be sure; all cards have it). The same happened to the 7600 XT, but that card is mostly only used for 1080p anyway, which has far fewer issues with bandwidth.
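Quick math on what that detune costs in raw bandwidth, a back-of-the-envelope sketch assuming the GRE's 256-bit bus (2250 MHz G6 is 18 Gbps effective, 2500 MHz would be 20 Gbps):

```python
# Bandwidth given up by running 20 Gbps-rated GDDR6 at 18 Gbps on a 256-bit bus.
BUS_WIDTH_BITS = 256

def gddr6_bandwidth_gbs(memory_clock_mhz: float) -> float:
    """Effective GDDR6 data rate is 8x the reported memory clock."""
    data_rate_gbps = memory_clock_mhz * 8 / 1000  # 2250 MHz -> 18 Gbps per pin
    return data_rate_gbps * BUS_WIDTH_BITS / 8    # Gbit/s per pin -> GB/s total

stock = gddr6_bandwidth_gbs(2250)  # ~576 GB/s as shipped
rated = gddr6_bandwidth_gbs(2500)  # ~640 GB/s at the chips' rated speed
print(f"{stock:.0f} GB/s stock vs {rated:.0f} GB/s rated "
      f"({(rated / stock - 1) * 100:.0f}% left on the table)")
```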
 
Why ... showing consumption peaks instead of averages?

When you choose a power supply, you need to look at this particular information, because otherwise you could make a wrong decision.
The power spikes are the most important, not idle or video playback.
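To make that concrete, here is a minimal sketch of the kind of headroom check transient spikes force on you when picking a PSU; every wattage below is a hypothetical placeholder, not a measured figure:

```python
# Crude PSU headroom check: transient GPU spikes, not average draw, are what
# trip a power supply's protection circuits. All wattages are hypothetical.
def psu_ok(psu_watts: float, gpu_spike_w: float, cpu_peak_w: float,
           rest_of_system_w: float = 75.0, margin: float = 0.9) -> bool:
    """True if worst-case simultaneous draw stays within `margin` of the rating."""
    worst_case = gpu_spike_w + cpu_peak_w + rest_of_system_w
    return worst_case <= psu_watts * margin

# Example: a 650 W unit vs. a ~380 W GPU spike plus a ~105 W CPU peak.
print(psu_ok(650, gpu_spike_w=380, cpu_peak_w=105))  # 560 W <= 585 W -> True
```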
 
Would honestly just save up for a month longer and get the 7900XT. If they didn't gimp the 7900GRE so much it would be tempting.

The 7900 XT is only cheap in the US. The GRE is a good price in many countries. It's easy to find for $550 USD outside of the US.

Frankly this Hellhound card is incredible. Too bad this specific model is currently $100 more in my country than the normal GRE cards. High demand. Selling really well.

An RTX 3090 for less. A 4070 Ti for $220 USD less. Etc. Really quite good. I'm glad they fixed the GRE; it was very disappointing at first.

When you choose a power supply, you need to look at this particular information, because otherwise you could make a wrong decision.
The power spikes are the most important, not idle or video playback.

Buy AMD for your CPU and don't worry about your power supply at all. Any good 650W gold is more than enough.

It's the Intel + GPU combo that is a mess. Those spikes from Intel are what make your PSU fail.
 
Nice to see this reviewed and getting such a good grade since this is the card I have and was wondering what TechPowerUp would say about it. :)
Especially happy about the low fan noise which is something I find important.

But definitely concerned about the high power draw when playing back media. Just starting a clip on YouTube bumps the power consumption up 40 W. Hope that can be fixed, because I see the 7900 XT and XTX aren't that bad.

It's fine if tuned, but the 7900 XT is a great buy, that's true.

Thanks @W1zzard for retesting OC with the higher VRAM limit. It's quite clear this GPU is very bandwidth-starved, even more than the 7900 XT is at 4K. Again it is puzzling why AMD chose to detune the VRAM to 2250 MHz - it's 2500 MHz G6 (I checked multiple reviews to be sure; all cards have it). The same happened to the 7600 XT, but that card is mostly only used for 1080p anyway, which has far fewer issues with bandwidth.
In what scenarios would you say the lower memory bandwidth would be a problem? I'm not planning to move to native 4K gaming anytime soon (staying at 1440p) and will likely wait for the next generation of GPUs before going 4K native.
 
Nice to see this reviewed and getting such a good grade since this is the card I have and was wondering what TechPowerUp would say about it. :)

But definitely concerned about the high power draw when playing back media. Just starting a clip on YouTube bumps the power consumption up 40 W. Hope that can be fixed, because I see the 7900 XT and XTX aren't that bad.

I wouldn't worry about 40W. I have a laptop without a discrete card. It uses almost 40W to play a 4k youtube video using the iGPU.

If your desktop PC uses 100W all the time (at the wall), even a 20W reduction from the GPU would still be 80W total for example.

Yeah less power is nice but I don't think 40W is much of a problem. Remember that idle power for this card is extremely low (even lower than RTX cards). Unless you are only watching youtube ALL DAY LONG without a break, I think it is a non issue.
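For scale, here's a rough estimate of what an extra 40 W during playback actually costs per year; hours per day and price per kWh below are assumptions, plug in your own:

```python
# Rough yearly cost of an extra 40 W while playing back video.
# Hours per day and electricity price are assumptions; adjust for your usage.
EXTRA_WATTS = 40
HOURS_PER_DAY = 2
PRICE_PER_KWH = 0.30  # currency units per kWh, e.g. a typical EU tariff

kwh_per_year = EXTRA_WATTS / 1000 * HOURS_PER_DAY * 365
print(f"{kwh_per_year:.0f} kWh/year, roughly {kwh_per_year * PRICE_PER_KWH:.2f} per year")
```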
 
In what scenarios would you say the lower memory bandwidth would be a problem?
Only 4K (gaming) is problematic (meaning they lose more performance than they would if more bandwidth were available) for the 7900 GRE and 7900 XT, as they have a lot of shaders/cores but the bandwidth has been cut down compared to the XTX, which itself barely has enough. 1440p is fine in any case as it has much lower bandwidth requirements.
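A rough way to see why the GRE is more starved than the XT is bandwidth per compute unit; the specs below are quoted from memory, so double-check them against the official spec pages:

```python
# Memory bandwidth per compute unit across the Navi 31 cards.
# Specs quoted from memory; verify against the official spec sheets.
cards = {
    # name: (compute units, memory bandwidth in GB/s)
    "RX 7900 GRE": (80, 576),
    "RX 7900 XT":  (84, 800),
    "RX 7900 XTX": (96, 960),
}

for name, (cus, bandwidth_gbs) in cards.items():
    print(f"{name}: {bandwidth_gbs / cus:.1f} GB/s per CU")
# GRE: ~7.2 GB/s per CU vs ~9.5 for the XT and 10.0 for the XTX,
# which is why the GRE runs out of bandwidth first at 4K.
```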
 
Still not a fan of listing lack of DLSS support as a negative. I'm well aware of the reason, but to me it's akin to reviewing a smartphone and then listing "not an iPhone" as a negative. Further to the point, AMD (and also Intel) is actually punished for developing an architecture-agnostic frame generation solution.
 
That "RGB OFF" switch.

*chef's kiss*

Again it is puzzling why AMD chose to detune the VRAM to 2250 MHz - it's 2500 MHz G6 (I checked multiple reviews to be sure; all cards have it).
The accepted reason is "artificial product segmentation" to avoid the GRE ruining sales of the far more profitable XT.

A hypothesis I have is that the GRE uses heavily defective N31 silicon, and so AMD are pairing it with their lowest-binned MCDs that perhaps aren't capable of driving the GDDR6 at its rated speed with 100% stability. You can take your chances with overclocking and probably get the full 2500 MHz out of it, but this way AMD aren't having to deal with RMAs for any samples that have MCDs that lost out in the silicon lottery.
 
I opted for the 7800 XT Hellhound directly at release and it was a good decision; 650-800 RPM fan speed at 250 W is a dream, even if AMD is still behind in some disciplines.

Happy to see the GRE with the Hellhound design, it's a hell of a card. Great cooling and BIOS fan curve, but still a bit too expensive in Germany ...

I have been relying on your reviews for almost 20 years now, @W1zzard; thank you very much.
 
I wouldn't worry about 40W. I have a laptop without a discrete card. It uses almost 40W to play a 4k youtube video using the iGPU.

If your desktop PC uses 100W all the time (at the wall), even a 20W reduction from the GPU would still be 80W total for example.

Yeah less power is nice but I don't think 40W is much of a problem. Remember that idle power for this card is extremely low (even lower than RTX cards). Unless you are only watching youtube ALL DAY LONG without a break, I think it is a non issue.
I guess you're right. But I still think the GRE should be able to achieve at least the same levels as the XT and XTX when it comes to media playback power consumption, which currently isn't the case (I have the 24.3.1 WHQL drivers as of this writing).
 
Only 4K (gaming) is problematic (meaning they lose more performance than they would if more bandwidth were available) for the 7900 GRE and 7900 XT, as they have a lot of shaders/cores but the bandwidth has been cut down compared to the XTX, which itself barely has enough. 1440p is fine in any case as it has much lower bandwidth requirements.
My 7900 XT plays fine with my 144 Hz 4K panel. You honestly would be hard pressed to see the difference between the two at 4K in terms of feel.
 
No support for DLSS

Why is this in the negatives section??? How can AMD use a PROPRIETARY tech like DLSS? Remove this nonsense...
Funniest thing is that he doesn't put it in Intel's GPU reviews. EDIT: And I also noticed that XeSS is listed under the "pros" while FSR doesn't deserve such a thing.


You can't make this shit up.
 