
PowerColor Radeon RX 7900 GRE Hellhound

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,778 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
PowerColor's Radeon RX 7900 GRE Hellhound comes with a fantastic cooling solution that's the best of all the GRE cards we've tested so far. Not only are temperatures low, but noise levels are outstanding, and the dual BIOS capability lets you turn the card into a whisper-quiet gaming machine.

 
Joined
Mar 1, 2021
Messages
486 (0.36/day)
Location
Germany
System Name Homebase
Processor Ryzen 5 5600
Motherboard Gigabyte Aorus X570S UD
Cooling Scythe Mugen 5 RGB
Memory 2*16 Kingston Fury DDR4-3600 double ranked
Video Card(s) AMD Radeon RX 6800 16 GB
Storage 1*512 WD Red SN700, 1*2TB Curcial P5, 1*2TB Sandisk Plus (TLC), 1*14TB Toshiba MG
Display(s) Philips E-line 275E1S
Case Fractal Design Torrent Compact
Power Supply Corsair RM850 2019
Mouse Sharkoon Sharkforce Pro
Keyboard Fujitsu KB955
You can now go higher with 24.3.1 in the OC section. Thanks for the review; looks like a nice card.
 
Joined
Oct 27, 2014
Messages
181 (0.05/day)
Great card indeed. They've been doing an excellent job with their Hellhound series. Personally, I love the clean look; don't really need the RGB.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,778 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
You can now go higher with 24.3.1 in the OC section. Thanks for the review; looks like a nice card.
Confirmed, it goes up to 3000 MHz now. Testing right now, will update the review soon
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.66/day)
Location
Ex-usa | slava the trolls
Very high power consumption, though.

[power consumption chart]


Meanwhile, its direct FPS competitor is significantly less power hungry:
[competitor's power consumption chart]


Very low Counter-Strike 2 performance, still not fixed. Waiting for a driver.
Does anyone know why RDNA 3 is broken in CS2? :banghead:

[Counter-Strike 2 performance chart]
 
Last edited:
Joined
May 13, 2008
Messages
762 (0.13/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
Confirmed, it goes up to 3000 MHz now. Testing right now, will update the review soon
Am intrigued. Curious how it affects other models (such as the Nitro+) with higher PL as well? Also, is this just the core, or did the RAM limits change? Thanks for doing the extra work; we appreciate you.

------------------

I am quite bothered by both news articles and this review saying nVIDIA has no such artificial limitations. You and I both know that's not true. I don't know how educated Toast et al are on the matter.

While the SLIDERS may not be limited, nVIDIA is quick to lock down both voltage and power limit to segment their cards very specifically (a recent example being the 4070 Ti Super 16GB), thereby reducing the usefulness of those sliders and the overall practicality/longevity of those products. They also tailored the process (N5 -> '4N') to focus on efficiency/area so that it (apparently) doesn't scale well with voltage past the optimal power/frequency curve (which likely saved them money in silicon costs), whereas it would appear AMD did not. If you want to applaud nVIDIA's designs for those decisions, that's a personal opinion, but praise should also be given (by enthusiasts) when a product can be pushed past the curve. So to claim one thing when the actual opposite is true is simply disingenuous. To excise those notions from the reality of the situation while pointing out the slider issue (which I'm not saying you shouldn't; I'm glad you [and HWU] did; it puts pressure on AMD to give customers the right and ability to achieve the best conceivable outcome if they choose to OC their cards) is ignorant, if not 'picking a side'.

There should be just as much pressure on nVIDIA to raise voltage levels to wherever scaling literally stops (~1.1 V/3200 MHz? on N5/'4N'?), and to raise their power limits (especially on expensive/overbuilt cards with VRMs/coolers that can most certainly handle the extra heat and sustained higher frequency under load). I know I'm not the only one who feels this way, although it has become clear that less and less of the customer base actually understands it (I know you do), but I might be the only one who will say it to someone (that would be reviewers/notable HW YouTubers such as yourself) who could actually put public pressure on said company to change their anti-consumer philosophies so that everyone can make the most of their cards (and CPUs...but that's a fight for another day), their (planned) stacks (and early obsolescence) be damned. Cards should be able to perform the best they possibly can (just like a 7900 XTX with a ridiculously high voltage [1.2 V? Each card should be power-limited to the power connector spec, IMO] to max it out at ~3100-3200 MHz or whatever the case may be), because that is the AIB's and consumer's prerogative wrt the capability of their coolers/VRMs, the (sometimes binned) actual silicon, and how much that performance is worth at how much power. It should not be on the manufacturer (beyond safeguards to avoid actual damage; but you, I, and many others know the set levels are much lower than that and are not there for that reason) to make that decision for us, especially under the guise of safety when it's really all about (them making more) money.

To condone, abide, or allow ANY company to limit OCing on otherwise very capable cards, only to push customers to spend more money (which most never will) and limit their conceivably 'best' experience if they choose to spend the time to tinker, is anti-consumer, and is in fact hurting this scene. It already has. I don't think companies understand how much negativity these limitations create around them, nor does AMD apparently remember how much positive consumer sentiment their 'old ways' (BIOS-flashing first shipments of cut-down cards to the higher-end products, no lockdown on higher voltage/PL, not limiting clocks within the BIOS) gave them within the (knowledgeable) enthusiast community; it's literally how (IMO) they built their (positive) reputation. The trickle-down IS real. WRT the current issue, I'm glad AMD is being kept more honest; more true to the soul of what old AMD/ATi products used to be about. Limits should be above (conceivable) capability, especially wrt how boost works, but I think people can (realistically) work with 3000 in this case.

nVIDIA may likely never change; their goal is to hope consumers don't understand they do these things, or forget they ever could be done. They should be consistently scrutinized for that; we shouldn't let them 'win' and let it be forgotten, only for the sake of their coffers, with consumers getting a less-than-optimal experience with lower-end hardware (which may be the best they could ever afford) in the hope of upselling them to something (overpriced) they might not actually need were things different.

If companies like AMD/nVIDIA feel they need to limit their overclocking potential (and AIBs' right to use a better cooler/higher voltage/power level), my message to them would be to focus on making better cards and a better (innate hardware) stack in which their own cards don't threaten each other. Also, while not limiting cards may allow some overlap (which is VERY MUCH why these limitations exist: to create specific markets that sometimes do not line up with consumer needs but do line up with those companies making higher margins), it's not like MOST people will ever go that far or even think about it that much when deciding to buy a card. People like myself are a minority (even more so now, sadly). It likely won't move the needle much in high-end sales, as I think/know most people generally look at stock performance when picking any product. The (overclocking) capability is what builds enthusiast sentiment, or loses it. Similar could be said of the shorter/thinner/low-power versions of cards of old that used a larger chip than generally intended but reduced voltage/power to match the fewer components and/or smaller cooler needed to fit in a tighter space. All of those areas have been affected by (largely nVIDIA) meddling, and not only should that not be forgotten, it should be actively scrutinized (as I believe many AIBs would love to offer those niche options across all spectrums), because those customers do/would exist and should be allowed to have options; they shouldn't have to fit into nVIDIA's (et al.) pre-ordained box, completely controlled wrt the ecosystem (what size system to use and when to upgrade) and their rung on the current and future ladder.

As we've seen, these choices have done nothing but hurt a larger audience's knowledge of and interest in the workings of these chips, and both companies (but mostly nVIDIA) should be shunned and ashamed for their part in that. I'm glad when AMD (sometimes) dials it back, as they've been known to do with enough pressure. nVIDIA will make bullshit excuses; but let them, and I/we can call them on it.

That's how sentiment should work.

So yeah, please don't ever say nVIDIA does not have any artificial limitations. Love and respect you, but it pisses me off very much, not only because it's wrong, but because many people very much trust your words.

Thanks for your consideration. I know these 'battles' can be tough, and for many they don't feel the need/desire to fight them, but it must be done for the betterment of the consumer/enthusiast space.

You know, the betterment of (and options for) the PEOPLE, not the COMPANIES (and ability to force upgrades and/or demand higher margains).

At the end of the day, all these things are very real, and we've seen options/value evaporate over the years. At some point you have to pick a side when you know something (better) can be done about it.

You have to ask yourself, if you're knowledgeable enough about the capability of such products and remember the (capabilities and options of the) past... do you know which side you're on?

Because I certainly do, and it's pretty freaking lonely sometimes.

I don't blame people that don't know any better, but that is a planned construct created by these companies to influence the thinking of what's possible wrt the current/next generation...and they're winning.

It's on us who remember, and know things can be better, to actually do/say something about it. If not for the benefit of us (those of us who may be able to pony up the cash for higher-end or more frequent upgrades), then for that next generation of people who, given the chance/opportunity/understanding, may be more like us. Or, perhaps, how some of us used to be. Before we got old, tired, and/or sold out.

(By 'sold out', I'm talking Anand, Shrout, Kyle, etc. leaving the space to work for the companies they used to keep in check. I'm not trying to imply you sold out [I'm trying to explain that many people who kept that space 'honest' and enthusiast sentiment alive were bought up and/or took opportunities at Apple, Intel, AMD, Asus, etc.], but the opposite; you're still here putting in the hard work with hard numbers and honest [if subjective] opinions. That's why what you say and do is more important than ever... You're one of the few [and in my mind, very important] OGs still around, and I truly respect/appreciate that very much! I just want things to be the best [and people best informed] they can be from the people they trust the most [which you've earned]. Not trying to imply there aren't areas where you are right and I am wrong, or that we can't have different subjective opinions; simply that sometimes blanket statements can give people the wrong impression [a flaw that doesn't escape me on an ongoing basis], and being thorough in a review that affects many people's decisions is perhaps more important than one guy spouting thoughts on a forum. I get there's something to being concise [which obviously isn't a talent of mine] and not every issue can be touched on at every single moment; just throwing it out there as something to consider given your position and willingness to write a lengthy conclusion!)
 
Last edited:

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,778 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Also, is this just the core, or did the RAM limits change?
This is just the RAM limits.

nVIDIA is quick to lock down both voltage and power limit to segment their cards very specifically (a recent example being the 4070 Ti Super 16GB

Nothing really locked down. Or do you mean the +0% cards that are +0% because the AICs cheaped out?
 
Joined
Nov 22, 2020
Messages
70 (0.05/day)
Processor Ryzen 5 3600
Motherboard ASRock X470 Taichi
Cooling Scythe Kotetsu Mark II
Memory G.SKILL 32GB DDR4 3200 CL16
Video Card(s) EVGA GeForce RTX 3070 FTW3 Ultra (1980 MHz / 0.968 V)
Display(s) Dell P2715Q; BenQ EX3501R; Panasonic TC-P55S60
Case Fractal Design Define R5
Audio Device(s) Sennheiser HD580; 64 Audio 1964-Q
Power Supply Seasonic SSR-650TR
Mouse Logitech G700s; Logitech G903
Keyboard Cooler Master QuickFire TK; Kinesis Advantage
VR HMD Quest 2
Wow, how is that cooler so effective at only 1200 g? Its W/°C is like 50% better than the other 7900 GRE coolers at about the same weight, and on par with RTX 4090 coolers twice its mass.

Edit: actually this cooler is about the same as the Hellhound 7800 XT (no surprise, as W1zzard mentioned this chip is designed to fit those existing PCBs and coolers) and on par with several other 7800 XT coolers. I forget that the Nvidia chips are harder to cool and can't be directly compared to AMD.
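For what it's worth, here's a rough sketch of how a W/°C figure like that falls out of a review's steady-state numbers (board power, GPU temperature, ambient). The wattage and temperatures below are made-up placeholders, not values from this review:

```python
# Rough cooler-effectiveness estimate in W/degC from steady-state numbers.
# All wattages and temperatures here are illustrative placeholders,
# not values from this review.

def cooler_effectiveness(board_power_w: float, gpu_temp_c: float, ambient_c: float) -> float:
    """Watts dissipated per degree Celsius of rise above ambient."""
    return board_power_w / (gpu_temp_c - ambient_c)

# Two hypothetical ~1200 g coolers handling the same ~260 W board power
cooler_a = cooler_effectiveness(260, gpu_temp_c=60, ambient_c=25)  # runs cooler
cooler_b = cooler_effectiveness(260, gpu_temp_c=78, ambient_c=25)  # runs warmer

print(f"Cooler A: {cooler_a:.1f} W/degC, cooler B: {cooler_b:.1f} W/degC")
print(f"Relative advantage: {cooler_a / cooler_b - 1:.0%}")
```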

I like that, in addition to a well-tuned fan curve, it has a low 600 RPM minimum fan speed on the quiet BIOS setting, for extra flexibility with custom fan profiles. I wish other AIB partners would follow suit.
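And a minimal sketch of the kind of custom fan profile that a 600 RPM floor makes practical: linear interpolation between temperature/RPM points, clamped to that minimum. The curve points are invented for illustration, not taken from the card's BIOS:

```python
# Minimal custom fan profile: linear interpolation between (temp, RPM) points,
# clamped to a hypothetical 600 RPM floor. Curve points are made up for illustration.
import bisect

CURVE = [(40, 600), (60, 900), (75, 1500), (85, 2200)]  # (degC, RPM)
MIN_RPM = 600

def target_rpm(temp_c: float) -> int:
    temps = [t for t, _ in CURVE]
    if temp_c <= temps[0]:
        return MIN_RPM
    if temp_c >= temps[-1]:
        return CURVE[-1][1]
    i = bisect.bisect_right(temps, temp_c)
    (t0, r0), (t1, r1) = CURVE[i - 1], CURVE[i]
    rpm = r0 + (r1 - r0) * (temp_c - t0) / (t1 - t0)
    return max(int(rpm), MIN_RPM)

for t in (35, 55, 70, 90):
    print(f"{t} degC -> {target_rpm(t)} RPM")
```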
 
Last edited:
Joined
Sep 15, 2013
Messages
54 (0.01/day)
Processor i5-4670k @ 4.2 GHz
Motherboard ASUS Z87 Pro
Cooling Corsair H105
Memory G.SKILL RipjawsX 16GB @ 2133 MHz
Video Card(s) Gigabyte GTX 780 GHz Edition
Storage Samsung 840 Evo 500GB
Case Thermaltake MK-1
Power Supply Seasonic X 750w
Mouse Razer DeathAdder
It is an excellent card; the only downside for me is the lack of RGB.
 
Joined
Oct 6, 2021
Messages
1,605 (1.41/day)
Very high power consumption, though.

[power consumption chart]

Meanwhile, its direct FPS competitor is significantly less power hungry:
[competitor's power consumption chart]

Very low Counter-Strike 2 performance, still not fixed. Waiting for a driver.
Does anyone know why RDNA 3 is broken in CS2? :banghead:

[Counter-Strike 2 performance chart]

[relative performance chart]

The 4070 is slower than both the 7800 XT and the GRE by a good margin. Plus, why manipulate information by showing consumption peaks instead of averages?
[average power consumption chart]


CS2 runs at 700 FPS if you have a good processor and turn off 8x MSAA.
 
Joined
May 3, 2018
Messages
2,881 (1.21/day)
Would honestly just save up for a month longer and get the 7900XT. If they didn't gimp the 7900GRE so much it would be tempting.
 
Joined
Aug 10, 2023
Messages
341 (0.73/day)
Would honestly just save up for a month longer and get the 7900XT. If they didn't gimp the 7900GRE so much it would be tempting.
It’s fine if tuned, but the 7900 XT is a great buy, that’s true.

Thanks @W1zzard for retesting OC with the higher VRAM limit. It’s quite clear this GPU is very bandwidth starved, even more than the 7900 XT is at 4K. Again, it is puzzling why AMD chose to detune the VRAM to 2250 MHz - it’s 2500 MHz G6, and I checked multiple reviews to be sure; all cards have it. The same happened to the 7600 XT, but that card is mostly used for 1080p anyway, which has far fewer issues with bandwidth.
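For context, a back-of-the-envelope calculation of what the detune costs in raw bandwidth, assuming the GRE's 256-bit bus and GDDR6 moving 8 bits per pin per memory clock (so 2250 MHz works out to 18 Gbps):

```python
# Back-of-the-envelope GDDR6 bandwidth, assuming a 256-bit bus and
# GDDR6's 8 data transfers per memory clock (2250 MHz -> 18 Gbps per pin).

def gddr6_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int) -> float:
    per_pin_gbps = mem_clock_mhz * 8 / 1000   # effective data rate per pin
    return per_pin_gbps * bus_width_bits / 8  # GB/s across the whole bus

shipped = gddr6_bandwidth_gbs(2250, 256)  # as AMD ships it
rated   = gddr6_bandwidth_gbs(2500, 256)  # what the chips are rated for

print(f"2250 MHz: {shipped:.0f} GB/s, 2500 MHz: {rated:.0f} GB/s "
      f"(+{rated / shipped - 1:.0%} left on the table)")
```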
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.66/day)
Location
Ex-usa | slava the trolls
Why ... showing consumption peaks instead of averages?

When you choose a power supply, you need to look at this particular information; otherwise you could make the wrong decision.
The power spikes are what matter most, not idle or video playback.
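A rough sketch of that sizing arithmetic: budget for the GPU's transient spikes plus CPU and platform power, then add headroom. The wattages are illustrative placeholders, not measurements from this review:

```python
# Rough PSU sizing from transient spikes rather than average draw.
# All wattages are illustrative placeholders, not measurements from the review.

def recommend_psu_watts(gpu_spike_w: float, cpu_peak_w: float,
                        platform_w: float = 75, headroom: float = 0.2) -> float:
    """Cover the worst-case transient, then add safety headroom."""
    worst_case = gpu_spike_w + cpu_peak_w + platform_w
    return worst_case * (1 + headroom)

sized_from_spikes   = recommend_psu_watts(gpu_spike_w=400, cpu_peak_w=150)
sized_from_averages = recommend_psu_watts(gpu_spike_w=280, cpu_peak_w=150)

print(f"Sized from spikes:   {sized_from_spikes:.0f} W")
print(f"Sized from averages: {sized_from_averages:.0f} W  (understates the requirement)")
```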
 
Joined
Apr 17, 2021
Messages
564 (0.43/day)
System Name Jedi Survivor Gaming PC
Processor AMD Ryzen 7800X3D
Motherboard Asus TUF B650M Plus Wifi
Cooling ThermalRight CPU Cooler
Memory G.Skill 32GB DDR5-5600 CL28
Video Card(s) MSI RTX 3080 10GB
Storage 2TB Samsung 990 Pro SSD
Display(s) MSI 32" 4K OLED 240hz Monitor
Case Asus Prime AP201
Power Supply FSP 1000W Platinum PSU
Mouse Logitech G403
Keyboard Asus Mechanical Keyboard
Would honestly just save up for a month longer and get the 7900XT. If they didn't gimp the 7900GRE so much it would be tempting.

The 7900 XT is only cheap in the US. The GRE is a good price in many countries. It's easy to find for $550 USD outside of the US.

Frankly this Hellhound card is incredible. Too bad this specific model is currently $100 more in my country than the normal GRE cards. High demand. Selling really well.

An RTX 3090 for less. A 4070 Ti for $220 USD less. Etc. Really quite good. I'm glad they fixed the GRE; it was very disappointing at first.

When you choose a power supply, you need to look at this particular information, because otherwise you could make a wrong decision.
The power spikes are the most important, not idle or video playback.

Buy AMD for your CPU and don't worry about your power supply at all. Any good 650W gold is more than enough.

It's the Intel + GPU combo that is a mess. Those spikes from Intel are what make your PSU fail.
 

star-affinity

New Member
Joined
Mar 1, 2024
Messages
18 (0.07/day)
Nice to see this reviewed and getting such a good grade, since this is the card I have and I was wondering what TechPowerUp would say about it. :)
Especially happy about the low fan noise, which is something I find important.

But definitely concerned about the high power draw when playing back media. Just starting a clip on YouTube bumps the power consumption up 40 W. Hope that can be fixed, because I see the 7900 XT and XTX aren't that bad.

It’s fine if tuned, but the 7900 XT is a great buy, that’s true.

Thanks @W1zzard for retesting OC with the higher VRAM limit. It’s quite clear this GPU is very bandwidth starved, even more than the 7900 XT is at 4K. Again, it is puzzling why AMD chose to detune the VRAM to 2250 MHz - it’s 2500 MHz G6, and I checked multiple reviews to be sure; all cards have it. The same happened to the 7600 XT, but that card is mostly used for 1080p anyway, which has far fewer issues with bandwidth.
In what scenarios would you say the lower memory bandwidth would be a problem? I'm not planning to move to native 4K gaming anytime soon (staying at 1440p) and will likely hold off until the next generation of GPUs before going 4K native.
 
Last edited:
Joined
Apr 17, 2021
Messages
564 (0.43/day)
System Name Jedi Survivor Gaming PC
Processor AMD Ryzen 7800X3D
Motherboard Asus TUF B650M Plus Wifi
Cooling ThermalRight CPU Cooler
Memory G.Skill 32GB DDR5-5600 CL28
Video Card(s) MSI RTX 3080 10GB
Storage 2TB Samsung 990 Pro SSD
Display(s) MSI 32" 4K OLED 240hz Monitor
Case Asus Prime AP201
Power Supply FSP 1000W Platinum PSU
Mouse Logitech G403
Keyboard Asus Mechanical Keyboard
Nice to see this reviewed and getting such a good grade, since this is the card I have and I was wondering what TechPowerUp would say about it. :)

But definitely concerned about the high power draw when playing back media. Just starting a clip on YouTube bumps the power consumption up 40 W. Hope that can be fixed, because I see the 7900 XT and XTX aren't that bad.

I wouldn't worry about 40 W. I have a laptop without a discrete card, and it uses almost 40 W to play a 4K YouTube video on the iGPU.

If your desktop PC uses 100 W all the time (at the wall), even a 20 W reduction from the GPU would still be 80 W total, for example.

Yeah, less power is nice, but I don't think 40 W is much of a problem. Remember that idle power for this card is extremely low (even lower than RTX cards). Unless you are watching YouTube ALL DAY LONG without a break, I think it's a non-issue.
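To put the 40 W in perspective, a quick back-of-the-envelope on what the extra draw adds up to over a year; the hours per day and electricity price are assumptions:

```python
# What an extra 40 W during video playback adds up to over a year.
# Hours per day and electricity price are assumptions for illustration.

extra_watts   = 40
hours_per_day = 3       # assumed daily video playback
price_per_kwh = 0.30    # assumed electricity price

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh per year, roughly {kwh_per_year * price_per_kwh:.2f} in electricity")
```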
 
Joined
Aug 10, 2023
Messages
341 (0.73/day)
In what scenarios would you say the lower memory bandwidth would be a problem?
Only 4K (gaming) is problematic for the 7900 GRE and 7900 XT (in this case meaning they lose more performance than they would with more bandwidth available), as they have a lot of shaders/cores but the bandwidth has been cut down compared to the XTX, which itself barely has enough. 1440p is fine in any case, as it has far lower bandwidth requirements.
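As a rough illustration of why 1440p asks so much less of the memory subsystem, compare the raw pixel counts per frame:

```python
# Raw pixel counts per frame: 4K pushes 2.25x the pixels of 1440p,
# which is a big part of why it leans so much harder on memory bandwidth.

pixels_4k    = 3840 * 2160
pixels_1440p = 2560 * 1440

print(f"4K:    {pixels_4k:,} pixels per frame")
print(f"1440p: {pixels_1440p:,} pixels per frame")
print(f"Ratio: {pixels_4k / pixels_1440p:.2f}x")
```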
 
Last edited:
Joined
Dec 16, 2021
Messages
332 (0.31/day)
Location
Denmark
Processor AMD Ryzen 7 3800X
Motherboard ASUS Prime X470-Pro
Cooling bequiet! Dark Rock Slim
Memory 64 GB ECC DDR4 2666 MHz (Samsung M391A2K43BB1-CTD)
Video Card(s) eVGA GTX 1080 SC Gaming, 8 GB
Storage 1 TB Samsung 970 EVO Plus, 1 TB Samsung 850 EVO, 4 TB Lexar NM790, 12 TB WD HDDs
Display(s) Acer Predator XB271HU
Case Corsair Obsidian 550D
Audio Device(s) Creative X-Fi Fatal1ty
Power Supply Seasonic X-Series 560W
Mouse Logitech G502
Keyboard Glorious GMMK
Still not a fan of listing lack of DLSS support as a negative. I'm well aware of the reason, but to me it's akin to reviewing a smart phone and then list "not an iPhone" as a negative. Further to the point, AMD (and also Intel) is actually punished for developing an architecture-agnostic frame generation solution.
 
Joined
Feb 20, 2019
Messages
8,251 (3.94/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
That "RGB OFF" switch.

*chef's kiss*

Again, it is puzzling why AMD chose to detune the VRAM to 2250 MHz - it’s 2500 MHz G6, and I checked multiple reviews to be sure; all cards have it.
The accepted reason is "artificial product segmentation" to avoid the GRE ruining sales of the far more profitable XT.

A hypothesis I have is that the GRE is very defective N31 silicon, and so AMD are pairing it with their lowest-binned MCDs, which perhaps aren't capable of driving the GDDR6 at its rated speed with 100% stability. You can take your chances with overclocking and probably get the full 2500 MHz out of it, but this way AMD aren't having to deal with RMAs for any samples whose MCDs lost out in the silicon lottery.
 

cox

Joined
Dec 30, 2019
Messages
15 (0.01/day)
Location
Germany
System Name Telekraft
Processor 5800X3D @ KomboStrike 3
Motherboard MSI MPG B550 Gaming Plus
Cooling Thermalright Phantom Spirit 120
Memory 32GB T-Force Vulcan Z CL18
Video Card(s) 7800XT Hellhound
Storage 500GB 960 Evo + 6 TB
Display(s) AOC 34" UWQHD 144Hz (CU34G2X/BK)
Case Lancool K7 (modded)
Audio Device(s) UMC204HD & S.M.S.L. SA-36A pro
Power Supply be Quiet! Pure Power 11 CM 500W
Mouse G502 Hero
Keyboard RK68 @ custom switches & caps
I opted for the 7800 XT Hellhound right at release and it was a good decision; 650-800 RPM fan speed at 250 watts is a dream, even if AMD is still behind in some disciplines.

Happy to see the GRE with the Hellhound design; it's a hell of a card. Great cooling and BIOS fan curve, but still a bit too expensive in Germany...

I have been relying on your reviews for almost 20 years now, @W1zzard, thank you very much.
 

star-affinity

New Member
Joined
Mar 1, 2024
Messages
18 (0.07/day)
I wouldn't worry about 40 W. I have a laptop without a discrete card, and it uses almost 40 W to play a 4K YouTube video on the iGPU.

If your desktop PC uses 100 W all the time (at the wall), even a 20 W reduction from the GPU would still be 80 W total, for example.

Yeah, less power is nice, but I don't think 40 W is much of a problem. Remember that idle power for this card is extremely low (even lower than RTX cards). Unless you are watching YouTube ALL DAY LONG without a break, I think it's a non-issue.
I guess you're right. But I still think the GRE should be able to achieve at least the same levels as the XT and XTX when it comes to media playback power consumption, which currently isn't the case (I'm on the 24.3.1 WHQL drivers as of this writing).
 
Joined
Jun 2, 2017
Messages
9,071 (3.33/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Only 4K (gaming) is problematic for the 7900 GRE and 7900 XT (in this case meaning they lose more performance than they would with more bandwidth available), as they have a lot of shaders/cores but the bandwidth has been cut down compared to the XTX, which itself barely has enough. 1440p is fine in any case, as it has far lower bandwidth requirements.
My 7900 XT plays fine with my 144 Hz 4K panel. You'd honestly be hard-pressed to see the difference between the two at 4K in terms of feel.
 
Joined
Aug 7, 2019
Messages
361 (0.19/day)
No support for DLSS

Why is this in the negatives section??? How can AMD use a PROPRIETARY tech like DLSS? Remove this nonsense...
The funniest thing is that he doesn't put it in Intel's GPU reviews. EDIT: And I also noticed that XeSS is listed under the "pros" while FSR doesn't deserve such a thing.


You can't make this shit up.
 
Last edited: