
Sapphire Radeon RX 7900 GRE Pulse

Joined
Feb 12, 2021
Messages
156 (0.13/day)
Meaning, DLSS Quality running at 1440p, like I use, will look better than native 1440p while boosting performance by 50-75% on top. Win/win for most gamers.
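For anyone who wants to sanity-check those numbers, here's a rough sketch of the internal render resolutions behind the DLSS presets at a 1440p output. The per-axis scale factors below are the commonly cited ones rather than official constants, so treat the exact pixel counts as approximate:

```python
# Rough sketch: approximate internal render resolution per DLSS preset at 2560x1440 output.
# The per-axis scale factors are the commonly cited values, not official constants.
PRESETS = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Return the approximate internal render resolution for a per-axis scale factor."""
    return round(out_w * scale), round(out_h * scale)

for name, scale in PRESETS.items():
    w, h = internal_resolution(2560, 1440, scale)
    # Pixel count falls with scale squared, which is where the large performance gains come from.
    print(f"{name:>17}: {w}x{h} (~{scale**2:.0%} of the output pixels)")
```

Quality mode, for example, renders roughly 44% of the output pixels and lets the temporal reconstruction fill in the rest, which is where the big performance gain comes from.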

DLAA is everything DLSS is, just without the upscaling: it uses native resolution and just improves image quality. And this is why native-res gaming doesn't matter anymore, because native res with third-party AA on top looks worse. TAA is terrible, to name just one inferior solution.

DLAA is the best anti-aliasing method today and will work in all DLSS games. DLAA is part of the DLSS presets now.

Still boggles my mind that some people think DLSS is only for improving performance while sacrificing visuals, this is not true at all.
This might not be true NOW, but DLSS 1 had lots of problems and poor visual quality. DLSS 2.0 was better but still looked worse than native; DLSS 2.1 (or 2.5, or whatever) dramatically improved on one of those (I don't remember which) and made DLSS as good as or better than native for the first time while improving performance, but it still had bugs and problems in some games.

So with that little history lesson out of the way, there are a lot of people who still have the impression that DLSS has crap visual quality and isn't really any better than any other solution, whether in-game AA or FSR, because up until DLSS 3 that was the case. DLSS 3 really changed this. A lot of people don't keep up with all of the new things all of the time, and tend to look deeply into their options only a while before making a purchase. If someone were in the market right now, DLSS 3 would be a real factor to consider, but for someone who bought a graphics card 2-3 years ago, it wouldn't have made much of an impact on their choice.

AMD needs to vastly improve FSR going forward; maybe they should go the hardware route like Nvidia. More and more games rely on upscalers as AA now. The upcoming PS5 Pro and next-gen Xbox will also rely on upscalers. Upscaling is here to stay, game devs have embraced it, and every single new AAA game has it built in. It is even enabled by default, including in the AMD-sponsored Starfield. Sadly it looked crazy bad there, and RTX users installed the DLSS/DLAA mod on day one, which beat FSR2 with ease o_O AMD users were better off enabling CAS Sharpening in the game; sadly, performance takes a hit.
AMD certainly IS now putting a lot of time, effort and money into FSR, not least because, as you say, both the PS5 Pro and the XBox 7X will use upscaling AND both of them are using semi-custom AMD silicon, with the PS5 Pro being dubbed Radeon 3.5 with something special requested by Sony, and the XBox C4 will either do something similar or use Radeon 4 (or 4.5, or 5, as they are way behind).

On that note, it has been rumoured (take it with a pinch of salt) that there is dedicated hardware for FSR 3.x (or 4, or whatever) in the RDNA 4 cards coming out later this year.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,119 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
W1zz, what's the difference between "erratic fan speed behavior" and "fan overshoot", which you wrote about after many Radeon reviews? It's AMD's fault, not the AIBs', right?
It's pretty much the same. I'd say "overshoot" is when the fans start too high and gradually reduce in speed until they stabilize. "Erratic" is when the fan speed keeps going up and down. Also, I felt that "erratic" would make people more curious, so they look at the data.

AMD designed their fan control in a certain way; AIBs must be aware of that and properly work around it. Not easy to point fingers, but since ASRock got it right, I'd say Sapphire could definitely do better, and they have done so in the past.
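To illustrate the difference in plain terms (this is a toy sketch, not AMD's or Sapphire's actual fan control firmware): a controller without hysteresis snaps the fans on and off around a single temperature threshold, which is exactly the yo-yo pattern seen in the reviews, while adding a hysteresis band plus a slew-rate limit is the usual AIB-side fix.

```python
# Toy sketch only: why fan speed "yo-yos" without hysteresis, and how AIBs typically tame it.
# This is NOT AMD's or Sapphire's actual firmware logic, just an illustration of the concept.

def naive_fan(temp_c: float) -> int:
    """No hysteresis: the fan snaps on/off around one threshold, so it oscillates near it."""
    return 40 if temp_c >= 60 else 0  # duty cycle in percent


class SmoothedFan:
    """Hysteresis band plus a slew-rate limit: the usual cure for erratic fan speed."""

    def __init__(self, on_c: float = 60.0, off_c: float = 52.0, max_step: int = 2):
        self.on_c, self.off_c, self.max_step = on_c, off_c, max_step
        self.duty = 0  # percent

    def update(self, temp_c: float) -> int:
        if self.duty == 0 and temp_c >= self.on_c:
            target = 40          # spin up once the "on" threshold is crossed
        elif self.duty > 0 and temp_c <= self.off_c:
            target = 0           # only stop again well below it (hysteresis)
        else:
            target = self.duty   # inside the band: hold the current speed
        step = max(-self.max_step, min(self.max_step, target - self.duty))
        self.duty += step        # ramp gradually instead of jumping
        return self.duty
```

The naive controller cools the card below its threshold, stops, heats back up, and spins up again in a loop; the hysteresis band breaks that cycle, and the slew limit hides whatever steps remain.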
 
Joined
Feb 12, 2021
Messages
156 (0.13/day)
It's AMD's fault, not the AIBs', right?
No, it's down to the AIB. Look at the links below: same chip, same drivers, different manufacturer, different graphics BIOS.

Yo-Yo fan.


Sensible fan curve.


I rest my case.
 
Joined
Jan 14, 2019
Messages
10,078 (5.15/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
No, it's down to the AIB. Look at the links below: same chip, same drivers, different manufacturer, different graphics BIOS.

Yo-Yo fan.


Sensible fan curve.


I rest my case.
Some people think that everything bad within this fleeting human experience called life is AMD/Nvidia/Intel's fault. You can't help it. Well pointed out with the review links, though.
 
Joined
Jul 10, 2015
Messages
750 (0.23/day)
Location
Sokovia
System Name Alienation from family
Processor i7 7700k
Motherboard Hero VIII
Cooling Macho revB
Memory 16gb Hyperx
Video Card(s) Asus 1080ti Strix OC
Storage 960evo 500gb
Display(s) AOC 4k
Case Define R2 XL
Power Supply Be f*ing Quiet 600W M Gold
Mouse NoName
Keyboard NoNameless HP
Software You have nothing on me
Benchmark Scores Personal record 100m sprint: 60m
I remember that every time W1zz reviewed a Radeon, fan overshoot was AMD's fault, so I assumed erratic fan behavior was the same kind of fault.
 

star-affinity

New Member
Joined
Mar 1, 2024
Messages
13 (0.15/day)
It's the same power draw that was present on the 7900 XT and 7900 XTX with day-one drivers. I bet AMD just did not apply that fix to the GRE for whatever reason.
So you think it will be fixed?
I just today decided to return my one-month-old PowerColor 7800 XT "Hellhound" and get the same model of the 7900 GRE, since I got it for just a little more money, but I'm quite disappointed and surprised by the media playback power consumption.

"Total Board Power" for me goes from around 40-50 W (when idle and not touching the computer) to around 85 W if I playback a 1080p clip at 60fps on YouTube. Pretty crazy I think. :(

And what's with the "erratic fan behavior" mention that the 7800 XT doesn't have?

Not that I've really noticed anything yet on the PowerColor "Hellhound" version of the 7900 GRE that I have.
It would be interesting if TechPowerUp could review the 7900 GRE Hellhound too.
 
Last edited:
Joined
Jan 14, 2019
Messages
10,078 (5.15/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
So you think it will be fixed?
I just today decided to return my one-month-old PowerColor 7800 XT "Hellhound" and get the same model of the 7900 GRE, since I got it for just a little more money, but I'm quite disappointed and surprised by the media playback power consumption.

"Total Board Power" for me goes from around 40-50 W (when idle and not touching the computer) to around 85 W if I playback a 1080p clip at 60fps on YouTube. Pretty crazy I think. :(

And what's with the "erratic fan behavior" mention that the 7800 XT doesn't have?

Not that I've really noticed anything yet on the PowerColor "Hellhound" version of the 7900 GRE that I have.
It would be interesting if TechPowerUp could review the 7900 GRE Hellhound too.
The erratic fan behaviour is simple: Sapphire messed up something. Just get the PowerColor, and you'll be fine. :)

The high media playback power consumption is what it is, I'm afraid. 45-50 W on a 7800 XT is pretty normal, unfortunately. :(
 
Joined
Jun 21, 2013
Messages
543 (0.14/day)
Processor Ryzen 9 3900x
Motherboard MSI B550 Gaming Plus
Cooling be quiet! Dark Rock Pro 4
Memory 32GB GSkill Ripjaws V 3600CL16
Video Card(s) 3060Ti FE 0.9v
Storage Samsung 970 EVO 1TB, 2x Samsung 840 EVO 1TB
Display(s) ASUS ProArt PA278QV
Case be quiet! Pure Base 500
Audio Device(s) Edifier R1850DB
Power Supply Super Flower Leadex III 650W
Mouse A4Tech X-748K
Keyboard Logitech K300
Software Win 10 Pro 64bit
So you think it will be fixed?
I just today decided to return my one-month-old PowerColor 7800 XT "Hellhound" and get the same model of the 7900 GRE, since I got it for just a little more money, but I'm quite disappointed and surprised by the media playback power consumption.

"Total Board Power" for me goes from around 40-50 W (when idle and not touching the computer) to around 85 W if I playback a 1080p clip at 60fps on YouTube. Pretty crazy I think. :(
That idle figure is insane, bearing in mind that AMD is usually no worse, or even better, than Nvidia at pure idle power draw. I think something is not right with your system rather than the GPU.

If it was an upgrade, maybe you did not wipe the previous driver properly or something.

My 3060 Ti uses 10-11 W at idle, 13-14 W with a 1080p YouTube video playing, and 14.5-16.5 W with 4K video. It's the 3900X that does most of the power draining in that scenario, with total system power draw from the outlet going from ~50 W idle to ~80 W with YouTube playing.
 
Last edited:
Joined
Aug 23, 2017
Messages
95 (0.04/day)
System Name DELL 3630
Processor I7 8700K
Memory 32 gig
Video Card(s) 4070
It would work if the 4070 had 16 GB and the 7900 GRE had 20 GB; then I would say who needs the extra 4 GB when 16 GB is more than fine. But sadly, the 4070 has only 12 GB of VRAM.
12 GB of VRAM is more than enough for 1440p and below, and that isn't going to change anytime soon. These cards are already at the bottom end for 4K, which is where you need 16 GB. There's no future-proofing in having 16 GB of VRAM on cards in these price ranges.
 

star-affinity

New Member
Joined
Mar 1, 2024
Messages
13 (0.15/day)
The erratic fan behaviour is simple: Sapphire messed up something. Just get the PowerColor, and you'll be fine. :)

The high media playback power consumption is what it is, I'm afraid. 45-50 W on a 7800 XT is pretty normal, unfortunately. :(

The 40-50 W when idle (and then over 80 W when playing back video) is on the PowerColor 7900 GRE "Hellhound" for me.
I never tested this when I had the PowerColor 7800 XT "Hellhound", so I don't know how it was then, but close to doubling the power consumption feels like a bit much.

That idle is insane, having in mind that AMD usually is no worse, or even better, at pure idle TDP vs. nvidia. Something was not right with your system, rather than the GPU, I think.

If it was an upgrade, maybe you did not wipe the previous driver properly or something.

My 3060Ti is using 10-11w in idle, 13-14w with 1080p YT video playing, and 14.5-16.5w with 4K video. It's the 3900X that is doing most of the power draining in that scenario, with total system power draw from the outlet going from ~50W idle to ~80W with youtube playing.

This is where I get the "Total Board Power" numbers from in the AMD Software:
(Screenshot: AMD Software metrics showing the "Total Board Power" reading.)


So by that, I assume it's only the GPU drawing that much, not including the rest of the system components.
I did just shut down the computer, remove the 7800 XT, and put in the 7900 GRE without touching the drivers, since I was thinking it should all be in place driver-wise for all supported AMD cards anyway. Or am I wrong about that?

The 3060 Ti you have seems to draw much less when idle, which of course is good.
It's also interesting how much better, for example, the 6700 XT is at video playback (15 W) compared to the 7700 XT (31 W). And the 7800 XT (37 W) and the horrible (at least currently) 7900 GRE (64 W).


One would think power draw at idle and during video playback would be lower with newer GPU generations, but it seems to be the opposite. :-/
 
Joined
Jan 14, 2019
Messages
10,078 (5.15/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
The 40-50 W when idle (and then over 80 W when playing back video) is on the PowerColor 7900 GRE "Hellhound" for me.
I never tested this when I had the PowerColor 7800 XT "Hellhound", so I don't know how it was then, but close to doubling the power consumption feels like a bit much.



This is where I get the "Total Board Power" numbers from in the AMD Software:
(Screenshot: AMD Software metrics showing the "Total Board Power" reading.)

So by that, I assume it's only the GPU drawing that much, not including the rest of the system components.
I did just shut down the computer, remove the 7800 XT, and put in the 7900 GRE without touching the drivers, since I was thinking it should all be in place driver-wise for all supported AMD cards anyway. Or am I wrong about that?

The 3060 Ti you have seems to draw much less when idle, which of course is good.
It's also interesting how much better, for example, the 6700 XT is at video playback (15 W) compared to the 7700 XT (31 W). And the 7800 XT (37 W) and the horrible (at least currently) 7900 GRE (64 W).


One would think power draw at idle and during video playback would be lower with newer GPU generations, but it seems to be the opposite. :-/
The AMD driver menu itself loads the GPU, so don't get your numbers from there. Get them from GPU-Z. My idle power on the 7800 XT is around 10 W.
 
Joined
Jun 21, 2013
Messages
543 (0.14/day)
Processor Ryzen 9 3900x
Motherboard MSI B550 Gaming Plus
Cooling be quiet! Dark Rock Pro 4
Memory 32GB GSkill Ripjaws V 3600CL16
Video Card(s) 3060Ti FE 0.9v
Storage Samsung 970 EVO 1TB, 2x Samsung 840 EVO 1TB
Display(s) ASUS ProArt PA278QV
Case be quiet! Pure Base 500
Audio Device(s) Edifier R1850DB
Power Supply Super Flower Leadex III 650W
Mouse A4Tech X-748K
Keyboard Logitech K300
Software Win 10 Pro 64bit
This is where I get the "Total Board Power" numbers from in the AMD Software
As AusWolf said, get GPU-Z instead: https://www.techpowerup.com/download/techpowerup-gpu-z/

The Radeon software looks nice, but it loads the GPU for no good reason.

It's similar to stuff like the MSI software for controlling RGB and other crap, which loads the CPU 24/7 even when it's not running, just sitting installed in the system like some malware.
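If you want something more solid than eyeballing a sensor readout, GPU-Z can also log its sensors to a CSV (Sensors tab, "Log to file"). Here's a rough sketch of averaging the power columns from such a log; the exact column headers vary by card and GPU-Z version, so this just picks up anything that looks like a wattage column:

```python
# Sketch: average the power columns from a GPU-Z sensor log (Sensors tab -> "Log to file").
# Column headers differ between cards and GPU-Z versions, so match any "... Power ... [W]" column.
import csv
from collections import defaultdict

def average_power(log_path: str) -> dict[str, float]:
    sums, counts = defaultdict(float), defaultdict(int)
    with open(log_path, newline="", encoding="utf-8", errors="replace") as f:
        for row in csv.DictReader(f):
            for name, value in row.items():
                if not name or not isinstance(value, str):
                    continue  # skip overflow/missing fields
                name = name.strip()
                if "power" in name.lower() and "[w]" in name.lower():
                    try:
                        sums[name] += float(value.strip())
                        counts[name] += 1
                    except ValueError:
                        pass  # skip blank or non-numeric samples
    return {name: total / counts[name] for name, total in sums.items() if counts[name]}

# "GPU-Z Sensor Log.txt" is the default log name; adjust the path to wherever you saved it.
for column, avg in average_power("GPU-Z Sensor Log.txt").items():
    print(f"{column}: {avg:.1f} W average")
```

Log a minute of idle and a minute of YouTube playback separately and compare the averages; that removes the single-snapshot noise you get from watching an overlay.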
 
Last edited:

star-affinity

New Member
Joined
Mar 1, 2024
Messages
13 (0.15/day)
The AMD driver menu itself loads the GPU, so don't get your numbers from there. Get them from GPU-Z. My idle power on the 7800 XT is around 10 W.

As AusWolf said, get GPU-Z instead: https://www.techpowerup.com/download/techpowerup-gpu-z/

The Radeon software looks nice, but it loads the GPU for no good reason.

It's similar to stuff like the MSI software for controlling RGB and other crap, which loads the CPU 24/7 even when it's not running, just sitting installed in the system like some malware.
Thanks for the tip.

On the PowerColor 7900 GRE Hellhound, I get about 20-40 W "Board Power Draw" in GPU-Z when not touching the computer (I do have some tabs open in Firefox, though), and that jumps to 84 W as soon as I start a 1080p 60 fps clip on YouTube. A 40 W increase for that sure seems a bit much.
 

Isaak

New Member
Joined
Jan 25, 2023
Messages
14 (0.03/day)
I love your initiative to add a Blender benchmark. In a similar vein, would you consider adding Unreal Engine 5 to your testing? There are some charts out there, but none cover and compare as many GPUs as you guys do.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,119 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
would you consider adding Unreal Engine 5 to your testing?
What kind of testing are you looking for? UE5 game performance is covered by Lords of the Fallen and Remnant.
 

Isaak

New Member
Joined
Jan 25, 2023
Messages
14 (0.03/day)
What kind of testing are you looking for? UE5 game performance is covered by Lords of the Fallen and Remnant.
I was thinking of testing how a GPU behaves in the Unreal Editor using one of the free sample environments provided by Epic Games on their store, as a more official standard for UE5.
Seeing how it behaves with a UE5 game works too, I guess; I hadn't thought about that.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,119 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
how a GPU behaves
I don't think it's fundamentally different to a shipping UE5 title, probably more on the light side, because you can lower the settings easily. Do you feel the editor is constrained by GPU performance? For me it mostly feels like I'm waiting on the GUI/storage.
 

Isaak

New Member
Joined
Jan 25, 2023
Messages
14 (0.03/day)
I don't think it's fundamentally different to a shipping UE5 title, probably more on the light side, because you can lower the settings easily. Do you feel the editor is constrained by GPU performance? For me it mostly feels like I'm waiting on the GUI/storage.
It does take a long time to completely load the scene when it's starting up. Is that part not affected by the GPU?
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,119 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
It does take a long time to completely load the scene when it's starting up. Is that part not affected by the GPU?
Nope. Check the bottom right: do you see shaders compiling? That should be a one-time process; if not, make sure your derived data caches are set up properly. Also check Task Manager, your SSD or CPU is probably sitting at high load.
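On the "derived data caches set up properly" point, here's a minimal sketch of what that can mean in practice: UE reads the UE-LocalDataCachePath and UE-SharedDataCachePath environment variables to decide where the DDC lives, so pointing the local one at a fast SSD (and the shared one at a team cache, if you have one) means shaders and other derived data only get built once. Everything below (paths, engine version, project location) is a placeholder, and you should double-check the variable names against your engine version's DDC documentation:

```python
# Sketch: launch the UE5 editor with the derived data cache redirected to a fast SSD.
# UE-LocalDataCachePath / UE-SharedDataCachePath are the documented DDC override variables;
# all paths, the engine version and the project below are placeholders for illustration.
import os
import subprocess

env = os.environ.copy()
env["UE-LocalDataCachePath"] = r"D:\UE_DDC"            # fast local SSD (hypothetical path)
env["UE-SharedDataCachePath"] = r"\\nas\share\UE_DDC"  # optional team-wide cache (hypothetical)

subprocess.run(
    [
        r"C:\Program Files\Epic Games\UE_5.3\Engine\Binaries\Win64\UnrealEditor.exe",
        r"D:\Projects\MyGame\MyGame.uproject",
    ],
    env=env,
    check=True,
)
```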
 
Joined
Jul 10, 2015
Messages
750 (0.23/day)
Location
Sokovia
System Name Alienation from family
Processor i7 7700k
Motherboard Hero VIII
Cooling Macho revB
Memory 16gb Hyperx
Video Card(s) Asus 1080ti Strix OC
Storage 960evo 500gb
Display(s) AOC 4k
Case Define R2 XL
Power Supply Be f*ing Quiet 600W M Gold
Mouse NoName
Keyboard NoNameless HP
Software You have nothing on me
Benchmark Scores Personal record 100m sprint: 60m
The RoboCop UE5 rendition is a piece of schit!
 

riOrizOr

New Member
Joined
Apr 9, 2024
Messages
3 (0.07/day)
How thick are the thermal pads on the 7900 GRE Pulse? Is there any info?
 
Joined
Feb 20, 2019
Messages
7,461 (3.89/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
The card seems relatively weak compared to other 7900 GRE cards, so why does it still get an award?
If I had to guess, it's because all the GRE cards are exceptionally good compared to the 7900 XT and 7800 XT alternatives. Not only do they offer better performance/Watt and better performance/$ than those other two, all of the GRE coolers reviewed here are overkill.

Yes, the Pulse here has the worst thermals of the four GREs in TechPowerUp's mini group test, but it's still only 57 °C in the relatively quiet 35 dBA noise-normalised test. Realistically, there's a lot of headroom in the Pulse cooler if you want a silent card or want to overclock your sample to the 311 W power limit. Don't forget that the Pulse is also the only one at the $550 MSRP, and all the others are physically larger and come with a price premium. Given that the cooler on this Pulse model is already overkill, that actually makes it the best-value card in this group test.

If you want to spend another $50, you can get a card like the TUF with a higher power limit for a fraction more overclocking headroom, but Radeons don't really respond well to more power; they're better when you undervolt them. The difference between the best and worst card in this test is just 3% higher clocks, which will likely result in a 1-2% performance gain, something too small for most people to even measure, let alone notice! It's absolutely not worth paying $30 or $50 more for, in my opinion.
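Putting rough numbers on that last point (illustrative figures only, using the 1-2% estimate above and the listed prices, not measured data):

```python
# Back-of-the-envelope: what a ~$50 premium buys in performance per dollar.
# 1.5% is the midpoint of the 1-2% uplift estimated above; all figures are illustrative.
pulse_price, premium_price = 550, 600          # USD, MSRP vs. a typical premium GRE
pulse_perf, premium_perf = 1.000, 1.015        # relative performance

perf_per_dollar_change = (premium_perf / premium_price) / (pulse_perf / pulse_price) - 1
print(f"Premium card: {perf_per_dollar_change:+.1%} performance per dollar vs. the Pulse")
# Roughly -7%: a ~9% higher price for ~1.5% more performance.
```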
 