
Sapphire Radeon RX 7900 GRE Pulse

Meaning, DLSS Quality at 1440p, like I use, will look better than 1440p native while boosting performance by 50-75% on top. Win/win for most gamers.

DLAA is everything DLSS is, just without the upscaling: it uses native res and just improves the image quality. And this is why native-res gaming doesn't matter anymore, because native res with third-party AA on top looks worse. TAA is terrible, to name just one inferior solution.

DLAA is the best anti-aliasing method today and works in all DLSS games. DLAA is part of the DLSS presets now.

It still boggles my mind that some people think DLSS is only for improving performance while sacrificing visuals; this is not true at all.
This might not be true NOW, but DLSS 1 had lots of problems and crap visual quality. DLSS 2.0 was better but still had worse visuals than native. DLSS 2.1 (2.5 or whatever) dramatically improved on one of those, I don't remember which, and made DLSS as good as or better than native for the first time while also improving performance, but it still had bugs and problems in some games.

So with that little history lesson out of the way: a lot of people still have the impression that DLSS has crap visual quality and isn't really any better than any other solution, whether in-game AA or FSR, because up until DLSS 3 that was the case. DLSS 3 really changed this. A lot of people don't keep up with all of the new things all of the time, and only look deeply into their options shortly before making a purchase. If someone were in the market right now, DLSS 3 would be a real option to consider, but when someone bought a graphics card 2-3 years ago, it wouldn't have made much of an impact on the choice for most people.

AMD needs to vastly improve FSR going forward; maybe they should go the hardware route like Nvidia. More and more games rely on upscalers for AA now. The upcoming PS5 Pro and next-gen Xbox will also rely on upscalers. Upscaling is here to stay, game devs have embraced it, and every single new AAA game has it built in. It is even enabled by default, also in AMD-sponsored Starfield. Sadly it looked crazy bad there, and RTX users installed the DLSS/DLAA mod on day one, which beat FSR2 with ease o_O AMD users were better off enabling CAS Sharpening in the game, but sadly performance takes a hit.
AMD certainly IS now putting a lot of time, effort and money into FSR, not least because, as you say, both the PS5 Pro and the XBox 7X will use upscaling AND both of them are using semi-custom AMD silicon, with the PS5 Pro's GPU being dubbed RDNA 3.5 with something special requested by Sony, and the XBox C4 will either do something similar or use RDNA 4 (or 4.5, or 5, as they are way behind).

On that note, it has been rumoured (take it with a pinch of salt) that there is dedicated hardware for FSR 3.x (or 4, or whatever) in the RDNA 4 cards coming out later this year.
 
W1zz, what's the difference between the "erratic fan speed behavior" and the "fan overshoot" you've been writing about lately in many Radeon reviews? It's AMD's fault, not the AIBs', right?
It's pretty much the same. I'd say "overshoot" is when the fans start too high and gradually reduce in speed until they stabilize. "Erratic" is when the fan speed keeps going up and down. Also, I felt that "erratic" would make people more curious, so they look at the data.

AMD designed their fan control in a certain way; AIBs must be aware of that and properly work around it. It's not easy to point fingers, but since ASRock got it right, I'd say Sapphire could definitely do better, and they have done so in the past.
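As a purely hypothetical illustration of the difference (this is a toy model, not AMD's or any AIB's actual fan logic): a controller that switches speed on a single temperature threshold keeps yo-yoing around that threshold, while a few degrees of hysteresis lets it settle.

```python
# Toy model only: shows why a single hard temperature threshold produces
# "erratic" yo-yo fan speed, while hysteresis (step down only after the
# temperature has fallen a few degrees) calms it down.

class HysteresisFan:
    """Steps up at `up` degrees, but only steps back down below `down`."""
    def __init__(self, up=70.0, down=65.0):
        self.up, self.down, self.high = up, down, False

    def target(self, temp_c: float) -> int:
        if temp_c >= self.up:
            self.high = True
        elif temp_c <= self.down:
            self.high = False
        return 1500 if self.high else 900


def naive_target(temp_c: float) -> int:
    """Single hard threshold: the source of the yo-yo effect."""
    return 1500 if temp_c >= 70.0 else 900


def count_speed_changes(controller, temp=69.0, steps=200) -> int:
    trace = []
    for _ in range(steps):
        rpm = controller(temp)
        trace.append(rpm)
        # toy thermal model: constant heat input, cooling scales with fan speed
        temp += 2.0 - rpm / 500.0
    return sum(a != b for a, b in zip(trace, trace[1:]))


print("naive threshold:", count_speed_changes(naive_target), "speed changes")      # toggles constantly
print("with hysteresis:", count_speed_changes(HysteresisFan().target), "changes")  # toggles rarely
```

In this toy run the naive controller flips speed dozens of times, the hysteresis one only a handful, which is roughly the difference between the "erratic" and "sensible" fan curves in the reviews.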
 
It's AMD's fault, not the AIBs', right?
No, it's down to the AIB. Look at the links below: same chip, same drivers, different manufacturer, different graphics BIOS.

Yo-Yo fan.


Sensible fan curve.


I rest my case.
 
No, it's down to the AIB. Look at the links below: same chip, same drivers, different manufacturer, different graphics BIOS.

Yo-Yo fan.


Sensible fan curve.


I rest my case.
Some people think that everything bad within this fleeting human experience called life is AMD's/Nvidia's/Intel's fault. You can't help it. Well pointed out with the review links, though.
 
I remember that every time W1zz reviewed a Radeon, fan overshoot was AMD's fault, so I assumed the erratic fan behavior was the same kind of fault.
 
It's the same power draw issue that was present on the 7900 XT and 7900 XTX with day-1 drivers. I bet AMD just did not apply that fix to the GRE for whatever reason.
So you think it will be fixed?
Just today I decided to return my one-month-old PowerColor 7800 XT "Hellhound" and get the same model of the 7900 GRE, since I got it for just a little more money, but I'm quite disappointed and surprised by the media playback power consumption.

"Total Board Power" for me goes from around 40-50 W (when idle and not touching the computer) to around 85 W if I playback a 1080p clip at 60fps on YouTube. Pretty crazy I think. :(

And what's with the "erratic fan behavior" mention that the 7800 XT doesn't have?

Not that I've really noticed anything yet on the PowerColor "Hellhound" version of the 7900 GRE that I have.
Would be interesting if TechPowerUp could review the 7900 GRE Hellhound too.
 
So you think it will be fixed?
Just today I decided to return my one-month-old PowerColor 7800 XT "Hellhound" and get the same model of the 7900 GRE, since I got it for just a little more money, but I'm quite disappointed and surprised by the media playback power consumption.

"Total Board Power" for me goes from around 40-50 W (when idle and not touching the computer) to around 85 W if I play back a 1080p clip at 60 fps on YouTube. Pretty crazy, I think. :(

And what's with the "erratic fan behavior" mention that the 7800 XT doesn't have?

Not that I've really noticed anything yet on the PowerColor "Hellhound" version of the 7900 GRE that I have.
Would be interesting if TechPowerUp could review the 7900 GRE Hellhound too.
The erratic fan behaviour is simple: Sapphire messed up something. Just get the PowerColor, and you'll be fine. :)

The high media playback power consumption is what it is, I'm afraid. 45-50 W on a 7800 XT is pretty normal, unfortunately. :(
 
So you think it will be fixed?
Just today I decided to return my one-month-old PowerColor 7800 XT "Hellhound" and get the same model of the 7900 GRE, since I got it for just a little more money, but I'm quite disappointed and surprised by the media playback power consumption.

"Total Board Power" for me goes from around 40-50 W (when idle and not touching the computer) to around 85 W if I play back a 1080p clip at 60 fps on YouTube. Pretty crazy, I think. :(
That idle draw is insane, bearing in mind that AMD is usually no worse, or even better, than Nvidia at pure idle power draw. I think something is not right with your system rather than the GPU.

If it was an upgrade, maybe you did not wipe the previous driver properly or something.

My 3060 Ti uses 10-11 W at idle, 13-14 W with a 1080p YouTube video playing, and 14.5-16.5 W with 4K video. It's the 3900X that does most of the power draw in that scenario, with total system power draw from the outlet going from ~50 W idle to ~80 W with YouTube playing.
 
It would work if the 4070 had 16 GB and the 7900 GRE had 20 GB; I would say who needs the extra 4 GB when 16 GB is more than fine. But sadly, the 4070 has only 12 GB of VRAM.
12 gigs of VRAM is more than enough for 1440p and below, and that isn't going to change anytime soon. These cards are already at the bottom end for 4K, which is where you need 16 gigs. There's no future-proofing in having 16 gigs of VRAM with cards in these price ranges.
 
The erratic fan behaviour is simple: Sapphire messed up something. Just get the PowerColor, and you'll be fine. :)

The high media playback power consumption is what it is, I'm afraid. 45-50 W on a 7800 XT is pretty normal, unfortunately. :(

The 40-50 W when idle (and then over 80 W when playing back video) is on the PowerColor 7900 GRE "Hellhound" for me.
I never tested this when I had the PowerColor 7800 XT "Hellhound", so I don't know how it was then, but close to doubling the power consumption feels like a bit much.

That idle draw is insane, bearing in mind that AMD is usually no worse, or even better, than Nvidia at pure idle power draw. I think something is not right with your system rather than the GPU.

If it was an upgrade, maybe you did not wipe the previous driver properly or something.

My 3060 Ti uses 10-11 W at idle, 13-14 W with a 1080p YouTube video playing, and 14.5-16.5 W with 4K video. It's the 3900X that does most of the power draw in that scenario, with total system power draw from the outlet going from ~50 W idle to ~80 W with YouTube playing.

This is where I get the "Total Board Power" numbers from in the AMD Software:
[screenshot of the AMD Software metrics showing Total Board Power]


So by that I assume it's only the GPU drawing that much, not including the rest of the system components.
I just shut the computer down, removed the 7800 XT and put in the 7900 GRE without touching the drivers, since I figured everything should already be in place driver-wise for all supported AMD cards anyway. Or am I wrong about that?

The 3060 Ti you have seems to draw much less when idle, which of course is good.
It's also interesting how much better, for example, the 6700 XT is at video playback (15 W) compared to the 7700 XT (31 W), the 7800 XT (37 W), and the horrible (at least currently) 7900 GRE (64 W).


One would think power draw at idle and during video playback would be lower with newer GPU generations, but it seems to be the opposite. :-/
 
The 40-50 W when idle (and then over 80 W when playing back video) is on the PowerColor 7900 GRE "Hellhound" for me.
I never tested this when I had the PowerColor 7800 XT "Hellhound", so I don't know how it was then, but close to doubling the power consumption feels like a bit much.



This is where I get the "Total Board Power" numbers from in the AMD Software:
[screenshot of the AMD Software metrics showing Total Board Power]

So by that I assume it's only the GPU drawing that much, not including the rest of the system components.
I just shut the computer down, removed the 7800 XT and put in the 7900 GRE without touching the drivers, since I figured everything should already be in place driver-wise for all supported AMD cards anyway. Or am I wrong about that?

The 3060 Ti you have seems to draw much less when idle, which of course is good.
It's also interesting how much better, for example, the 6700 XT is at video playback (15 W) compared to the 7700 XT (31 W), the 7800 XT (37 W), and the horrible (at least currently) 7900 GRE (64 W).


One would think power draw at idle and during video playback would be lower with newer GPU generations, but it seems to be the opposite. :-/
The AMD driver menu uses the GPU, so don't get your numbers from there. Get them from GPU-Z. My idle on the 7800 XT is around 10 W.
 
This is where I get the "Total Board Power" numbers from in the AMD Software
As AusWolf said, get GPU-Z instead: https://www.techpowerup.com/download/techpowerup-gpu-z/

The Radeon software looks nice, but it just loads the GPU for no good reason.

It's similar to stuff like the MSI software for controlling RGB and other crap, which loads the CPU 24/7 even when it's not running, just sitting installed in the system like some malware.
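If you want to put numbers on the idle-vs-playback difference, GPU-Z can also log its sensors to a file while you reproduce each scenario, and a short script can average the readings afterwards. A minimal sketch, assuming a comma-separated log containing a "Board Power Draw [W]" column (check your own log's header and adjust the name if yours differs):

```python
# Minimal sketch: average one sensor column from a GPU-Z sensor log.
# Assumes a comma-separated log with a "Board Power Draw [W]" column;
# column names are stripped so extra padding around headers is tolerated.
import csv
import sys

def average_board_power(path: str, column: str = "Board Power Draw [W]") -> float:
    values = []
    with open(path, newline="", encoding="utf-8", errors="replace") as f:
        reader = csv.reader(f)
        header = [h.strip() for h in next(reader)]
        idx = header.index(column)          # raises ValueError if the column is missing
        for row in reader:
            if len(row) <= idx:
                continue                    # skip truncated lines
            try:
                values.append(float(row[idx].strip()))
            except ValueError:
                continue                    # skip blank or malformed readings
    if not values:
        raise ValueError(f"no usable '{column}' readings in {path}")
    return sum(values) / len(values)

if __name__ == "__main__":
    # e.g.  python avg_power.py idle_log.txt youtube_log.txt
    for log in sys.argv[1:]:
        print(f"{log}: {average_board_power(log):.1f} W average")
```

Logging a minute of idle and a minute of YouTube playback into two files makes the comparison a lot less dependent on eyeballing the sensor graph.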
 
The AMD driver menu uses the GPU, so don't get your numbers from there. Get them from GPU-Z. My idle on the 7800 XT is around 10 W.

As AusWolf said, get GPU-Z instead: https://www.techpowerup.com/download/techpowerup-gpu-z/

The Radeon software looks nice, but it just loads the GPU for no good reason.

It's similar to stuff like the MSI software for controlling RGB and other crap, which loads the CPU 24/7 even when it's not running, just sitting installed in the system like some malware.
Thanks for the tip.

On the PowerColor 7900 GRE Hellhound I get about 20-40 W "Board Power Draw" in GPU-Z when not touching the computer (I have some tabs open in Firefox currently, though), and that jumps to 84 W as soon as I start a 1080p 60 fps clip on YouTube. A 40 W increase for that sure seems like a bit much.
 
I love your initiative to add a Blender benchmark. In a similar vein, would you consider adding Unreal Engine 5 to your testing? There are some charts out there, but none cover and compare as many GPUs as you guys do.
 
would you consider adding Unreal Engine 5 to your testing?
What kind of testing are you looking for? UE5 game performance is covered in Lords of the Fallen and Remnant
 
What kind of testing are you looking for? UE5 game performance is covered in Lords of the Fallen and Remnant
I was thinking of testing how a GPU behaves in the Unreal Editor using one of those free sample environments provided by Epic Games on their store, as a more official standard for UE5.
Seeing how it behaves with a UE5 game works too, I guess; I hadn't thought about that.
 
how a GPU behaves
I don't think it's fundamentally different from a shipping UE5 title, probably more on the light side, because you can lower the settings easily. Do you feel the editor is constrained by GPU performance? For me it mostly feels like I'm waiting on the GUI/storage.
 
I don't think it's fundamentally different from a shipping UE5 title, probably more on the light side, because you can lower the settings easily. Do you feel the editor is constrained by GPU performance? For me it mostly feels like I'm waiting on the GUI/storage.
It does take a long time to completely load the scene when it's starting up. That part is not affected by the GPU?
 
It does take a long time to completely load the scene when it's starting up. That part is not affected by the GPU?
Nope. Check the bottom right: do you see shaders compiling? This should be a one-time process; if not, make sure your derived data caches are set up properly. Also check Task Manager, your SSD or CPU is probably sitting at high load.
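On the derived data cache point, a hypothetical quick check (assuming Epic's documented UE-SharedDataCachePath environment variable and the default local cache location on Windows; your paths may differ):

```python
# Hypothetical sanity check for Unreal's Derived Data Cache setup.
# Assumes the UE-SharedDataCachePath environment variable and the default
# Windows local DDC location; adjust for your own engine install.
import os
from pathlib import Path

shared = os.environ.get("UE-SharedDataCachePath")
print("Shared DDC:", shared if shared else "not set (no cache shared between machines)")

local = Path(os.environ.get("LOCALAPPDATA", "")) / "UnrealEngine" / "Common" / "DerivedDataCache"
if local.is_dir():
    size_gb = sum(f.stat().st_size for f in local.rglob("*") if f.is_file()) / 1e9
    print(f"Local DDC at {local}: about {size_gb:.1f} GB cached")
else:
    print(f"No local DDC found at the default location: {local}")
```

If the local cache is empty or keeps getting wiped, the editor will recompile shaders on every scene load, which looks a lot like a GPU bottleneck but isn't one.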
 
The RoboCop UE5 rendition is a piece of schit!
 
How thick are the thermal pads on the 7900 GRE Pulse? Is there any info?
 
The card seems relatively weak compared to other 7900 GRE cards, so why does it still get an award?
 
The card seems relatively weak compared to other 7900 GRE cards, so why does it still get an award?
If I had to guess, it's because all the GRE cards are exceptionally good compared to the 7900 XT and 7800 XT alternatives. Not only do they offer better performance/Watt and better performance/$ than those other two, but all of the GRE coolers reviewed here are also overkill.

Yes, the Pulse has the worst thermals of the four GREs in TechPowerUp's mini group test, but it's still only 57 °C in the relatively quiet 35 dBA noise-normalised test. Realistically there's a lot of headroom in the Pulse cooler if you want a silent card, or if you want to overclock your sample to the 311 W power limit. Don't forget that the Pulse is also the only one at the $550 MSRP, and all the others are physically larger and come with a cost premium. Given that the cooler on this Pulse model is already overkill, that actually makes it the best-value card in this group test.

If you want to spend another $50 you can get a card like the TUF with a higher power limit for a fraction more overclocking headroom, but Radeons don't really respond well to more power; they're better when you undervolt them. The difference between the best and worst card in this test is just 3% higher clocks, which will likely result in a 1-2% performance gain, something too small for most people to even measure, let alone notice! It's absolutely not worth paying $30 or $50 more for, in my opinion.
 