
ASUS Radeon RX 7900 GRE TUF OC

He is correct, remove the "no DLSS" as a negative because it makes no sense to be there. How can AMD use DLSS exactly?
You're too late to the party; check the rest of the discussion.
 
I tend to accept that the price set at $550 is okay-ish, given that this is the company's largest, most difficult to make (lowest yields?) and most expensive GPU.
Yes, it has 4 GB of VRAM less than the XT and as much as 8 GB of VRAM less than the XTX, but still...

The shocking part is that Nvidia seems not to care - the XTX 24 GB is $910, while the RTX 4080 16 GB is $1200-1300, lol what...
 
I tend to accept that the price set at $550 is okay-ish, given that this is the company's largest, most difficult to make (lowest yields?) and most expensive GPU.
Yes, it has 4 GB of VRAM less than the XT and as much as 8 GB of VRAM less than the XTX, but still...

The shocking part is that Nvidia seems not to care - the XTX 24 GB is $910, while the RTX 4080 16 GB is $1200-1300, lol what...
The 4080 (vanilla) doesn't seem to have any real discount since the release of the Super.

I don't keep track of 4080-series pricing much since it's such a pointless card with horrible performance/$ and not enough VRAM to be useful for productivity over far cheaper options - even so, the 4080S is selling for less than the 4080 in the first three places I checked. I'll take that as speculative evidence that nobody in the UK bothered holding much inventory of the 4080 because it sold atrociously and practically nobody ever bought one, meaning zero need to discount current inventory to make room for new product.

The complete lack of any discounts at the 4080-series level just means utter stagnation at this price point. The XTX sits at almost the exact same price as it did in 2022 because, despite appearances, the 4080 and XTX don't really compete: the 4080 is what you buy for DLSS, RT and CUDA; the XTX is what you buy if you want higher framerates with RT disabled or dialled down to console levels.

Another $100 lower down the product stack, the 4070 Ti Super showed promise until reviews came out and showed it to be unexpectedly hamstrung by extremely strict power limits. Sure, you can pay $100 more for a factory-overclocked version that has those arbitrary power limits removed, but at that point you should just get a 4080S, which will always be faster, even in its most locked-down, barebones model. There's just no competition at this price point. Nvidia refuses to compete with itself, and AMD can't offer the RT performance needed for 4K60 Ultra settings.

IMO the GRE/7900XT is as high as you need to go with AMD, and RT really is a feature that has limited value, even as we close in on 6 years of the technology. I played with a 3090, 3080, and 3070 in 2022 and found only the 3090 to offer enough performance to meaningfully enable RT in games from 2020-2022. I've not actually used a 4090 for gaming yet, but the 4070 was so far short of the mark in those RT-heavy 2023 titles like Phantom Liberty with path-tracing or Ratchet and Clank that it proves RT at uncompromised settings will always require a flagship GPU, which has a price tag too high for 99% of buyers to justify the mandatory replacement each and every generation. Even now, that 3090 just doesn't cut it for any meaningful RT-enabled high-resolution gaming any more.
 
The RX 7900 GRE is a great product but I don't really think that the price premium that ASUS cards command is warranted. My RX 7900 XTX is an ASRock Phantom Gaming model, one of the least expensive variants, but it sure doesn't leave any performance on the table.

Paying more just to have a card that says ASUS on it isn't what I call a sound financial decision.

I'd take a less expensive model from XFX, ASRock or Powercolor over an overpriced ASUS model seven days a week and twice on Sunday.

Having said that, cards go on sale all the time and I'm sure that this one will eventually come down in price to match the others.
 
The RX 7900 GRE is a great product but I don't really think that the price premium that ASUS cards command is warranted. My RX 7900 XTX is an ASRock Phantom Gaming model, one of the least expensive variants, but it sure doesn't leave any performance on the table.

Paying more just to have a card that says ASUS on it isn't what I call a sound financial decision.

I'd take a less expensive model from XFX, ASRock or Powercolor over an overpriced ASUS model seven days a week and twice on Sunday.

Having said that, cards go on sale all the time and I'm sure that this one will eventually come down in price to match the others.
You really can't infer much from brands' individual series names.

By far the worst 30-series RTX cards I encountered were Asus Dual models - the 3060 and 3060 Ti - yet their "Dual" 3070 was fine, and their Dual 4070 is so good it makes more expensive variants like the TUF and Strix almost pointless.

Given there are about 20 completely different variants of the "Asus Dual" cooler strapped to around 85 different RTX SKUs, it's pretty clear that the model name "Dual" is meaningless. Some have been reviewed as garbage-tier coolers with a shortage of heatpipes and cheap extruded aluminium heatsinks, whilst others are high-end coolers with premium construction, overkill fin stacks and a surplus of cooling. And that's only looking at Asus' RTX product stack; they make Radeon "Dual" cards too, and their designs sometimes bear little resemblance to the GeForce cards of similar TDP, which makes no sense at all, because the TDP determines the quality of the VRM and cooler, which is 90% of what makes a premium card premium and an entry-level card entry-level.

My general opinion is that most Asus cards are overpriced but there are enough individual SKUs that buck the trend and are highly competitive to not dismiss Asus as an overpriced brand as a rule.

TL;DR
Sub-brand names within a manufacturer's range are meaningless as quality indicators, which means trying to paint a whole manufacturer with the same brush is impossible. You need to look for reviews of the exact, individual GPU in question. Even a V2 suffix is enough to make it a completely different product.

Thank <deity of choice> for independent journalism and their impartial reviews!
 
The RX 7900 GRE is a great product but I don't really think that the price premium that ASUS cards command is warranted. My RX 7900 XTX is an ASRock Phantom Gaming model, one of the least expensive variants, but it sure doesn't leave any performance on the table.

Paying more just to have a card that says ASUS on it isn't what I call a sound financial decision.

I'd take a less expensive model from XFX, ASRock or Powercolor over an overpriced ASUS model seven days a week and twice on Sunday.

Having said that, cards go on sale all the time and I'm sure that this one will eventually come down in price to match the others.
That's the case with every single Asus thing, sadly. OLED monitors? Yep, there are cheaper ones from others. Video cards? You know it. Sound cards? Motherboards? They do some great stuff, but they ain't twice as good. That's why the last Asus thing I got was in 2008. If they wanna act like Apple = skippppppp
 
That's the case with every single Asus thing, sadly. OLED monitors? Yep, there are cheaper ones from others. Video cards? You know it. Sound cards? Motherboards? They do some great stuff, but they ain't twice as good. That's why the last Asus thing I got was in 2008. If they wanna act like Apple = skippppppp
Yeah, just look at the flagship AM5 motherboards: more than 4x what I paid for my X370 board (which was ASUS). And that OLED monitor is comical - a 20% premium for fewer features than some models with the same panel. After the warranty debacle they had a little while back, they need to wind their necks in a bit.
 
That's the case with every single Asus thing, sadly. OLED monitors? Yep, there are cheaper ones from others. Video cards? You know it. Sound cards? Motherboards? They do some great stuff, but they ain't twice as good. That's why the last Asus thing I got was in 2008. If they wanna act like Apple = skippppppp
I completely agree, and you know, it's not like their products are bad - far from it. My craptop is actually an ASUS and I bought it because it was the best deal at the time. I got a backlit keyboard and a GTX 1050M for the same price as other craptops that didn't have those features. They were otherwise identical.

It's the same attitude that I have towards GeForce cards. I'll never say that they're not great products (because of course they are), it's just that, at the same price point, a Radeon is almost always faster and usually has more VRAM.

I've been building PCs since 1988 (yeah, I'm old..lol) and the most important thing that I've learnt over the years is to never be a brand-wh0re. This is because, no matter what the brand is, the product was made by a company with years of experience that employs real professionals. The only brand I refuse to buy is MSi, but that's not because I think their stuff is bad; it's because they treated me badly when one of their flagship motherboards died on me (the only motherboard that has ever died on me).

Yeah, just look at the flagship AM5 motherboards: more than 4x what I paid for my X370 board (which was ASUS). And that OLED monitor is comical - a 20% premium for fewer features than some models with the same panel. After the warranty debacle they had a little while back, they need to wind their necks in a bit.
I agree completely. That's why my two AM4 boards have been ASRock. It's funny because ASRock boards tend to offer the best value of "The Big Four" (ASRock, ASUS, Gigabyte & MSi), yet ASRock began as ASUS' OEM division.

Like you, I also tend to buy X-series boards because I like the extra PCIe and M.2 slots. My first AM4 board was an ASRock X370 Killer SLI and it was fantastic. I replaced it with an ASRock X570 Pro4 because I wanted to try Smart Access Memory. I should've stuck with the X370 because SAM wasn't worth it at the time.

If I were to choose an AM5 motherboard today, it would definitely be the ASRock X670E PG Lightning. It's a full-featured X670E board for less than the others want for normal X670 boards. Steve Walton said it's a great value and recommends it.

You really can't infer much from brands' individual series names.

By far the worst 30-series RTX cards I encountered were Asus Dual models - the 3060 and 3060 Ti - yet their "Dual" 3070 was fine, and their Dual 4070 is so good it makes more expensive variants like the TUF and Strix almost pointless.

Given there are about 20 completely different variants of the "Asus Dual" cooler strapped to around 85 different RTX SKUs, it's pretty clear that the model name "Dual" is meaningless. Some have been reviewed as garbage-tier coolers with a shortage of heatpipes and cheap extruded aluminium heatsinks, whilst others are high-end coolers with premium construction, overkill fin stacks and a surplus of cooling. And that's only looking at Asus' RTX product stack; they make Radeon "Dual" cards too, and their designs sometimes bear little resemblance to the GeForce cards of similar TDP, which makes no sense at all, because the TDP determines the quality of the VRM and cooler, which is 90% of what makes a premium card premium and an entry-level card entry-level.

My general opinion is that most Asus cards are overpriced but there are enough individual SKUs that buck the trend and are highly competitive to not dismiss Asus as an overpriced brand as a rule.

TL;DR
Sub-brand names within a manufacturer's range are meaningless as quality indicators, which means trying to paint a whole manufacturer with the same brush is impossible. You need to look for reviews of the exact, individual GPU in question. Even a V2 suffix is enough to make it a completely different product.

Thank <deity of choice> for independent journalism and their impartial reviews!
I completely agree. I don't care what brand something is. I have a saying that noobs look at brand while experts just look at spec.

Some ASUS stuff is pretty good value (my craptop is proof of that) and most ASUS products are of very good quality. However, I don't think that their motherboards or video cards are superior to ASRock, Gigabyte or MSi. I also don't think that their video cards are better-made than brands like Powercolor, Sapphire, XFX or Yeston. In fact, I would trust those four brands more than ASUS, because video cards are their core competency while ASUS makes so many other things.
 
Please change 'No support for DLSS' in the cons to 'FSR not as good as competing solutions such as DLSS and XeSS.' AMD cannot solve your 'con' because DLSS is a proprietary solution owned by Nvidia. Any attempt by AMD to implement it without permission/licensing would lead to a lawsuit.

How do you solve a con when it is illegal to do so? You might as well say 'Not an Nvidia card' in the cons column.

It's not about AMD literally adding in Nvidia's DLSS upscaler, but rather that AMD needs to deliver an updated FSR2 with an ML image reconstruction algorithm to properly compete. AMD talked up the new ML capabilities integrated into the architecture of the 7000-series, but has yet to deliver any useful software taking advantage of them.

FSR suffers from all of the problems that Nvidia found and documented while developing their game-engine-hooked upscaler - problems that were ultimately solved by that ML image reconstruction. The reconstruction algorithm has advanced to the point where it can improve overall image fidelity compared to straight native resolution, whether that's through DLAA running on the native-res image, or oftentimes even with upscaling when there's enough render resolution (particularly noticeable at 4K output resolution with DLSS Quality).
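To put rough numbers on that parenthetical, here's a minimal sketch of the internal render resolutions behind the usual temporal-upscaler quality modes. The per-axis scale factors below are the published DLSS 2 / FSR 2 presets; individual games can deviate:

```python
# Internal render resolution per upscaler quality mode.
# Per-axis scale factors as published for the DLSS 2 / FSR 2 presets.
SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.33,
}

def internal_res(width: int, height: int, mode: str) -> tuple[int, int]:
    """Resolution the GPU actually renders before reconstruction/upscaling."""
    s = SCALE[mode]
    return round(width * s), round(height * s)

for mode in SCALE:
    w, h = internal_res(3840, 2160, mode)
    print(f"{mode:>17}: {w}x{h}")
# Quality at 4K renders at ~2560x1440 - plenty of source pixels for the
# ML reconstruction to work with, which is why it can approach or beat native.
```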

Personally, I applaud reviewers who choose to bring that up explicitly as a con on products like this, because it's something that often just gets lost in the background noise, particularly with talking-head YT reviews that gush about a modest raster performance advantage over Nvidia's competing equivalents while glossing over fairly transformative features. Looks like W1zzard already implemented a rewording suggestion from a reader, changing it to "FSR not as good as DLSS", which pretty much resolves the complaints with the initial wording entirely.
 
I'm very curious to know the performance gain from the memory OC alone on the 7900 GRE vs the 7800 XT, given that most GREs come with 18 Gbps memory but this TUF comes with faster 20 Gbps memory.
 
I'm very curious to know the performance gain from the memory OC alone on the 7900 GRE vs the 7800 XT, given that most GREs come with 18 Gbps memory but this TUF comes with faster 20 Gbps memory.
Same, I wonder how fast the GRE is compared to the 7800 XT if they have the same memory speed.
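For context, the raw bandwidth arithmetic behind that question is simple enough to sketch, assuming the 256-bit GDDR6 bus both cards use and the stock data rates mentioned in this thread:

```python
# Peak GDDR6 bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8 bits-per-byte.
def peak_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int = 256) -> float:
    return data_rate_gbps * bus_width_bits / 8

for label, rate in [
    ("7900 GRE stock, 18 Gbps", 18.0),
    ("7800 XT stock, 19.5 Gbps", 19.5),
    ("GRE with 20 Gbps memory", 20.0),
]:
    print(f"{label}: {peak_bandwidth_gbs(rate):.0f} GB/s")
# 18 Gbps -> 576 GB/s, 19.5 Gbps -> 624 GB/s, 20 Gbps -> 640 GB/s:
# roughly an 11% bandwidth uplift for the GRE over its stock memory speed.
```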
 
How is this product considered a bit expensive? It's not crazy expensive like the 4080 at launch.
 
How is this product considered a bit expensive? It's not crazy expensive like the 4080 at launch.

(attached image: 600.PNG)
 
Hello, how is it possible that you got 316 fps in Battlefield V? My card pushes around 200 fps (all low settings and everything disabled in Adrenalin) on most maps... My CPU is a 5950X and it's at 20% usage most of the time, so that shouldn't be the issue.
 
Hello, how is it possible that you got 316 fps in Battlefield V? My card pushes around 200 fps (all low settings and everything disabled in Adrenalin) on most maps... My CPU is a 5950X and it's at 20% usage most of the time, so that shouldn't be the issue.
You're limited by your CPU, so there's not a lot of point in lowering settings, since they'll only ease the load on your GPU and do almost nothing to help your CPU. TechPowerUp's 14900K review shows that the 5950X is only 73% the speed of the 14900K used for this 7900 GRE review, and 73% of 316 fps in Battlefield V is only ~230 fps, which is around what you said you were getting. Your system is working as expected.
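Here's that back-of-envelope estimate as a quick sketch, using the figures quoted above (a rough scaling estimate, not a measurement):

```python
# Scale TPU's CPU-bound Battlefield V result by relative CPU gaming performance.
review_fps = 316      # TPU's 7900 GRE review result on the 14900K test bench
cpu_relative = 0.73   # 5950X vs 14900K gaming performance, per TPU's 14900K review

print(f"Expected CPU-limited result on a 5950X: ~{review_fps * cpu_relative:.0f} fps")
# ~231 fps - right in line with the ~200 fps observed, once map and
# scene-to-scene variance is taken into account.
```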

The reason you see only 20% usage is that Battlefield V is only using 3-6 cores at peak boost clocks, and all the extra cores you have are going to waste. Unless you need the 16 cores for other workloads like rendering/compiling/simulation, there's no real point in having a 5950X, because the dual-chiplet design is a (slight) hindrance for gaming performance due to the cache being split across two physical pieces of silicon connected by a much slower link. What the X3D series of processors has proved is that games LOVE huge, unified caches.

If you want more gaming performance, sell your 5950X and drop a 5800X3D in there. It's not going to be as fast as a 14900K, but you'll probably end up spending nothing at all, since the 5950X should sell for enough to pay for a 5800X3D outright. The 5800X3D is about 15% faster, averaged across all games tested, so for a net price of "probably nothing" it's worth a shot, IMO; you just have to put enough effort in to make an eBay or FB Marketplace listing for your 5950X.

 
Hello, how is it possible that you got 316 fps in Battlefield V? My card pushes around 200 fps (all low settings and everything disabled in Adrenalin) on most maps... My CPU is a 5950X and it's at 20% usage most of the time, so that shouldn't be the issue.
TPU's tests are always run at the highest settings unless stated otherwise, so unless you run at the highest settings, they're not comparable.
 
You're limited by your CPU, so there's not a lot of point in lowering settings, since they'll only ease the load on your GPU and do almost nothing to help your CPU. TechPowerUp's 14900K review shows that the 5950X is only 73% the speed of the 14900K used for this 7900 GRE review, and 73% of 316 fps in Battlefield V is only ~230 fps, which is around what you said you were getting. Your system is working as expected.

The reason you see only 20% usage is that Battlefield V is only using 3-6 cores at peak boost clocks, and all the extra cores you have are going to waste. Unless you need the 16 cores for other workloads like rendering/compiling/simulation, there's no real point in having a 5950X, because the dual-chiplet design is a (slight) hindrance for gaming performance due to the cache being split across two physical pieces of silicon connected by a much slower link. What the X3D series of processors has proved is that games LOVE huge, unified caches.

If you want more gaming performance, sell your 5950X and drop a 5800X3D in there. It's not going to be as fast as a 14900K, but you'll probably end up spending nothing at all, since the 5950X should sell for enough to pay for a 5800X3D outright. The 5800X3D is about 15% faster, averaged across all games tested, so for a net price of "probably nothing" it's worth a shot, IMO; you just have to put enough effort in to make an eBay or FB Marketplace listing for your 5950X.

Ohh okay thanks for the answer :/

TPU's tests are always run at the highest settings unless stated otherwise, so unless you run at the highest settings, they're not comparable.
At highest settings I get even less fps
 
If you want more gaming performance, sell your 5950X and drop a 5800X3D in there.

Wrong. If you want more performance, upgrade the monitor to 4K.

do almost nothing to help your CPU

At highest settings I get even less fps

You can increase the resolution, which shifts the load to the GPU, and then there is little difference between many CPUs.
The performance difference between the Ryzen 7 7800X3D and the Ryzen 5 5600X is a miserable 14%.

 
Wrong. If you want more performance, upgrade the monitor to 4K.



You can increase the resolution, which shifts the load to the GPU, and then there is little difference between many CPUs.
The performance difference between the Ryzen 7 7800X3D and the Ryzen 5 5600X is a miserable 14%.

Hmm, would the 13600K give me 300+ fps with no stutters?
 
Hmm, would the 13600K give me 300+ fps with no stutters?
Why would you buy Intel right now? That's one of the CPUs affected by the microcode issues.
Buy a 5800X3D if you want 13600K performance without a new motherboard, but you already have a very playable framerate with your 5950X.
 
Hmm, would the 13600K give me 300+ fps with no stutters?
Also, not even the 14900K can give you 300+ fps with no stutters. Watch any hardware review on YouTube with Afterburner's or FRAPS' frame-time graph showing and you'll see that framerates are reasonably stable compared to some games, yet the minimum framerates are still less than half the maximum framerates.

So an average result of 316 fps probably means that even on a 14900K with an RTX 4090, the minimum and 99th-percentile framerates are much lower. If your definition of "no stutters" means no frames dropped on a 300 Hz monitor, then no hardware on earth can achieve that at the moment. If the 13600K is capable of an average 290 fps, then it's probably down under 200 fps quite frequently.

Given that the server tickrate for Battlefield V is a paltry 60 Hz, there's really not a huge amount of point in aiming for framerates much over 120 fps, IMO. You're almost better off just using Freesync and getting tear- and judder-free gaming at whatever rate your CPU+GPU can output. Yes, Freesync adds around half a frame of input lag, but if your PC is throwing out ~200 fps then your worst-case added input lag is still less than half of the server's tick interval, which means you aren't giving anything up, competitively.
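To put numbers on that last claim, a minimal sketch assuming ~200 fps and the half-frame sync penalty mentioned above (real input-lag chains have more stages than this):

```python
# Compare the Freesync latency penalty against the server's tick interval.
tick_rate_hz = 60    # Battlefield V server tick rate
fps = 200            # what the PC is actually rendering

tick_interval_ms = 1000 / tick_rate_hz     # time between server updates
frame_time_ms = 1000 / fps                 # time per rendered frame
freesync_penalty_ms = 0.5 * frame_time_ms  # ~half a frame of added display lag

print(f"Server tick interval: {tick_interval_ms:.1f} ms")   # 16.7 ms
print(f"Frame time at {fps} fps: {frame_time_ms:.1f} ms")   # 5.0 ms
print(f"Freesync penalty: {freesync_penalty_ms:.1f} ms")    # 2.5 ms
# 2.5 ms of added lag vs 16.7 ms between server ticks: the sync penalty is
# well under half a tick interval, so nothing meaningful is lost competitively.
```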
 
Why would you buy Intel right now? That's one of the CPUs affected by the microcode issues.
Buy a 5800X3D if you want 13600K performance without a new motherboard, but you already have a very playable framerate with your 5950X.
It's just "playable"; I was doing better on my GTX 1050 Ti in this game. ;) I'm getting stuttering, weird inconsistent input lag, etc. Especially when someone is very close, I can't really do anything. From what I've read, the 13600K is not affected by the stability issues, only the 14900 and 13900. I guess I'll just save up some money and buy the 7800X3D; I assume the 5800X3D wouldn't be any different from my current CPU.

Also, not even the 14900K can give you 300+ fps with no stutters. Watch any hardware review on YouTube with Afterburner's or FRAPS' frame-time graph showing and you'll see that framerates are reasonably stable compared to some games, yet the minimum framerates are still less than half the maximum framerates.

So an average result of 316 fps probably means that even on a 14900K with an RTX 4090, the minimum and 99th-percentile framerates are much lower. If your definition of "no stutters" means no frames dropped on a 300 Hz monitor, then no hardware on earth can achieve that at the moment. If the 13600K is capable of an average 290 fps, then it's probably down under 200 fps quite frequently.

Given that the server tickrate for Battlefield V is a paltry 60 Hz, there's really not a huge amount of point in aiming for framerates much over 120 fps, IMO. You're almost better off just using Freesync and getting tear- and judder-free gaming at whatever rate your CPU+GPU can output. Yes, Freesync adds around half a frame of input lag, but if your PC is throwing out ~200 fps then your worst-case added input lag is still less than half of the server's tick interval, which means you aren't giving anything up, competitively.
The 14900K got 300+ fps average in this test though? I'm getting an avg of like 150-160.
 
It's just "playable"; I was doing better on my GTX 1050 Ti in this game. ;) I'm getting stuttering, weird inconsistent input lag, etc. Especially when someone is very close, I can't really do anything. From what I've read, the 13600K is not affected by the stability issues, only the 14900 and 13900. I guess I'll just save up some money and buy the 7800X3D; I assume the 5800X3D wouldn't be any different from my current CPU.


The 14900K got 300+ fps average in this test though? I'm getting an avg of like 150-160.
From all the videos I've watched, the microcode issue affects everything from the 13600K to the 14900KS and all their variants.

7800X3D is wonderful, just get that. :clap:
 
From all the videos I've watched, the microcode issue affects everything from the 13600K to the 14900KS and all their variants.

7800X3D is wonderful, just get that. :clap:
I know it's good. I'm pretty crafty with PCs though, so I'm still thinking about the 13600K, and the stability issues don't really scare me; I'm pretty sure I would get it stable in no time if I had issues. The 7800X3D seems like overkill and the 13600K is just cheaper. The 5950X was probably the worst purchase I ever made, lmao. I work with audio a lot and thought it'd be perfect for that; turned out it's trash for that too.
 