Thursday, September 21st 2023

Undervolted Radeon RX 7800 XT Gets Closer to GeForce RTX 4070 Efficiency Levels

Techtesters, a Dutch online publication and YouTube channel, took the time to investigate whether AMD's Radeon RX 7800 XT 16 GB GPU can compete with NVIDIA's GeForce RTX 4070 12 GB GPU in the power-efficiency stakes. Running under normal conditions, Team Red's new mid-ranger naturally loses when lined up against its main rival, drawing 252 W to 286 W versus the GeForce card's 200 W (sometimes dipping to 196 W during gaming sessions). Nada Overbeeke of Techtesters set a 90% power limit on the main test subject, Gigabyte's custom-design Navi 32-based Gaming OC model, through AMD's software.

Its "aggressive" 200 W undervolted state was compared to stock performance in a number of modern game environments (refer to the charts below). The Gigabyte RX 7800 XT Gaming OC—using stock settings—consumed around 40% more power while managing only a 9% performance increase over its 200 W undervolted guise. VideoCardz notes that AMD's reference model requires 24% more power at stock: "As mentioned, a 9% performance boost should not be underestimated, but the substantial reduction in power consumption also resulted in quieter GPU operation and lower temperatures." It would have been interesting to see Techtesters undervolt their RTX 4070 FE candidate as well, but emphasis seemed to be placed on the newer card.
The VideoCardz verdict stated: "More importantly, even with a 200 W configuration, the card managed to hold its own against the RTX 4070, which consumed roughly the same amount of power at stock settings. This essentially means that there's virtually no difference between these GPUs once the RX 7800 XT is undervolted. Naturally, the choice between the two cards will heavily depend on the specific games chosen to play, as not everyone is comfortable with undervolting or altering GPU settings."
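To put those percentages in perspective, here is a minimal back-of-the-envelope sketch; the ~280 W stock figure is an assumption derived from "around 40% more power" than the 200 W target, and Techtesters' per-game measurements vary:

```python
# Rough performance-per-watt comparison using the figures quoted above.
# Assumption: stock draws ~40% more than the 200 W undervolted target
# (~280 W) while delivering ~9% more performance.

stock_power_w = 280.0        # assumed: 200 W * 1.40
undervolted_power_w = 200.0
stock_perf = 1.09            # stock is ~9% faster; undervolted normalized to 1.0
undervolted_perf = 1.00

stock_eff = stock_perf / stock_power_w
undervolted_eff = undervolted_perf / undervolted_power_w

gain = undervolted_eff / stock_eff - 1
print(f"Performance-per-watt gain from the 200 W profile: {gain:.1%}")  # ~28.4%
```

In other words, under these assumptions, trading roughly 9% of the frame rate buys back over a quarter more performance per watt.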

Undervolting the AMD Radeon RX 7800 XT all the way down to RTX 4070 levels 😎:


The Techtesters rig:
  • CPU: Intel Core i9-13900K
  • Motherboard: ASUS ROG Maximus Z790 HERO
  • Memory: 32 GB Corsair Vengeance RGB DDR5-6000 (2x16)
  • Cooling: Corsair H150i Elite LCD
  • PSU: Seasonic Prime TX 1600 Watt
  • OS: Windows 11 PRO 22H2
  • XMP Setting: Applied
  • Resizable BAR: Enabled
  • Core Isolation: Disabled
Source: VideoCardz

33 Comments on Undervolted Radeon RX 7800 XT Gets Closer to GeForce RTX 4070 Efficiency Levels

#1
Chaitanya
Not surprising; both parties have pushed their GPUs too hard this generation, and undervolting and underclocking seem to be the first things to do with the latest GPUs.
#2
agent_x007
I like how they put an "undervolting and overclocking may damage the card..." warning in there - "just in case" the software has a reversed bug (and increases vGPU instead of lowering it) :D
Also, lowering the power target can be unpredictable from a stability standpoint when you also decrease the GPU voltage (since Power = Voltage^2 / Resistance).
Instead of playing with the power slider, the better option is to lower the GPU's (max.) boost frequency to the point where it doesn't crash with the decreased voltage setting.
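For intuition on why a modest undervolt cuts power so sharply, here is a toy sketch using the simplified CMOS dynamic-power model P ≈ C × V² × f; the voltage and clock values are made-up round numbers for illustration, not real RX 7800 XT operating points:

```python
# Toy illustration: dynamic switching power scales roughly as C * V^2 * f,
# so power falls with the *square* of voltage but only linearly with clocks.

def dynamic_power(c_eff: float, voltage_v: float, freq_ghz: float) -> float:
    """Simplified CMOS dynamic power: P = C_eff * V^2 * f (arbitrary units)."""
    return c_eff * voltage_v**2 * freq_ghz

C_EFF = 100.0  # arbitrary effective-capacitance constant (hypothetical)

stock = dynamic_power(C_EFF, voltage_v=1.10, freq_ghz=2.4)
undervolted = dynamic_power(C_EFF, voltage_v=1.00, freq_ghz=2.3)  # -9% V, -4% f

print(f"Undervolted power: {undervolted / stock:.1%} of stock")  # ~79%
```

In this toy model, a ~9% voltage reduction alone removes roughly 17% of dynamic power, which is why capping boost clocks at a stable voltage, as suggested above, tends to be the more predictable approach.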
#3
Dr. Dro
I should show them what my RTX 4080 can do then... hell, if I simply enable DLSS frame generation it's doing Starfield at 144 FPS, 1080p ultra, at below 140 W. No need to dial down clocks or undervolt.

RDNA 3 simply has no redemption if you're talking about absolute performance per watt; it's straight-up worse than Ada here.
#4
Arkz
And what about if you undervolt the 4070 too?
Dr. Dro: I should show them what my RTX 4080 can do then... hell, if I simply enable DLSS frame generation it's doing Starfield at 144 FPS, 1080p ultra, at below 140 W. No need to dial down clocks or undervolt.

RDNA 3 simply has no redemption if you're talking about absolute performance per watt; it's straight-up worse than Ada here.
Fake frames and 1080p? Oh wow, I'm sure everyone wants that.
#5
TheinsanegamerN
Once again it shows that, in their obsession with yields, manufacturers will throw efficiency to the wind. At least they haven't locked down undervolting yet.
#6
Dr. Dro
Arkz: And what about if you undervolt the 4070 too?


Fake frames and 1080p? Oh wow, I'm sure everyone wants that.
I was every bit as skeptical as you. If you experience DLSS-G (at least with DLAA preset F applied, so native resolution without upscaling), I promise you can't tell the difference, even if you're sensitive to motion - and I am.

The 1080p is just a technicality; I'm on a budget monitor until I figure out how to pay for an Odyssey OLED G9 or a similarly extravagant display to match my lovely PC.
#7
AusWolf
She put a 90% power limit on the card, then she compared the undervolted results with stock. Where is the undervolt here? :wtf:

Anyway, here's my Time Spy result at stock (246 W average board power):
www.3dmark.com/3dm/100224381?

And here's one with the -10% power limit (222 W average board power):
www.3dmark.com/3dm/100225475?
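As a quick sanity check, the measured reduction lines up almost exactly with the -10% limit:

```python
# Sanity check on the two Time Spy runs quoted above.
stock_w = 246.0    # average board power at stock
limited_w = 222.0  # average board power with the -10% power limit

print(f"Measured reduction: {1 - limited_w / stock_w:.1%}")  # ~9.8%
```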
#8
Sabotaged_Enigma
Can't change the fact that RDNA 3 has a design fault at the architecture level and is inferior to Ada Lovelace in power efficiency.
#9
evernessince
Dr. Dro: I was every bit as skeptical as you. If you experience DLSS-G (at least with DLAA preset F applied, so native resolution without upscaling), I promise you can't tell the difference, even if you're sensitive to motion - and I am.

The 1080p is just a technicality; I'm on a budget monitor until I figure out how to pay for an Odyssey OLED G9 or a similarly extravagant display to match my lovely PC.
You should be fine with AMD's FG as well, given they both use interpolation.

I personally am not; enabling FG on my 4080 makes me nauseous. Interpolated frames just feel off.

As others have said, the 4070 can be undervolted as well. At the end of the day, Nvidia is simply more efficient this generation, although I suppose it's nice to know you can undervolt either card and get a nice reduction in power consumption.
#10
ARF
Яid!culousOwO: Can't change the fact that RDNA 3 has a design fault at the architecture level and is inferior to Ada Lovelace in power efficiency.
I am not sure they can't at least try to fix the issues with another revision of the silicon, but yeah, in principle: the TSMC N4 node used by Nvidia, even if only slightly, is still better than the TSMC N5 process used by AMD, and the stupid chiplet design on the N6 process introduces even more bottlenecks on top of that.
#11
ZoneDymo
Dr. Dro: I should show them what my RTX 4080 can do then... hell, if I simply enable DLSS frame generation it's doing Starfield at 144 FPS, 1080p ultra, at below 140 W. No need to dial down clocks or undervolt.

RDNA 3 simply has no redemption if you're talking about absolute performance per watt; it's straight-up worse than Ada here.
You want to compare a $1,300 GPU to a $500 GPU and think you look sane for wanting that?
#12
ARF
ZoneDymo: You want to compare a $1,300 GPU to a $500 GPU and think you look sane for wanting that?
If we use the cherry-picking method, then the difference is not that large, lol.

Battlefield V at 1080p:



#13
AusWolf
Яid!culousOwO: Can't change the fact that RDNA 3 has a design fault at the architecture level and is inferior to Ada Lovelace in power efficiency.
Well, the opening article shows that you actually can.
#14
TheinsanegamerN
Dr. Dro: I was every bit as skeptical as you. If you experience DLSS-G (at least with DLAA preset F applied, so native resolution without upscaling), I promise you can't tell the difference, even if you're sensitive to motion - and I am.

The 1080p is just a technicality; I'm on a budget monitor until I figure out how to pay for an Odyssey OLED G9 or a similarly extravagant display to match my lovely PC.
I've seen the videos and the still images, and the DALL-E-looking fake frames that get generated look like absolute trash. How anyone likes looking at a game that resembles a rendering glitch is beyond me.

Same with RT. I've seen the still images and the videos. The only real difference is the framerate. ZOMG, A SHADOW IS 5% DIFFERENT ON THAT LEAF, THAT'S WORTH TAKING A 66% FRAMERATE HIT!

Yeah no thanks.
#15
AusWolf
ARF: If we use the cherry-picking method, then the difference is not that large, lol.

Battlefield V at 1080p:



Are you trying to prove that CPU bottlenecking exists, or that the 4080 and 4090 are a total waste of money in certain scenarios? (both of which are true, imo)
#16
Dr. Dro
TheinsanegamerN: I've seen the videos and the still images, and the DALL-E-looking fake frames that get generated look like absolute trash. How anyone likes looking at a game that resembles a rendering glitch is beyond me.
I can't speak for the earliest implementations, but with DLSS 3.5 I have no complaints about how it looks. This is a lossless capture from Special K, so direct from the pipeline as it's displayed on the screen:



Hopefully AMD's FSR 3 isn't a wreck and you'll be able to pass judgment on frame generation with some first-hand experience of your own sometime. If anything, it might let you save some power by lightening the load on your card. It's a bonus either way.
ZoneDymo: You want to compare a $1,300 GPU to a $500 GPU and think you look sane for wanting that?
Sure, but it runs on any 40-series GPU right now and could be brought to the 30 and 20 series if Nvidia really felt the pressure from AMD. This is what we want.
evernessince: You should be fine with AMD's FG as well, given they both use interpolation.

I personally am not; enabling FG on my 4080 makes me nauseous. Interpolated frames just feel off.

As others have said, the 4070 can be undervolted as well. At the end of the day, Nvidia is simply more efficient this generation, although I suppose it's nice to know you can undervolt either card and get a nice reduction in power consumption.
I'm hoping you are right. I was skeptical until I started using it in Starfield (the only game I have ever tried it in), and even for a community implementation it has blown me away. It increases power efficiency, lowers thermals considerably, and patches over occasional performance shortcomings. I liked it. I agree about the frames feeling off; the footage I had seen in videos before gave me that vibe. But in Starfield I just haven't been able to feel it.
#17
ARF
AusWolf: Are you trying to prove that CPU bottlenecking exists, or that the 4080 and 4090 are a total waste of money in certain scenarios? (both of which are true, imo)
The 4080 and 4090 are a total waste of money in all scenarios. Of course, if you ask the majority of the brainwashed masses of average Joes and paid supporters, they will tell you that those cards are cheap and that more money should be asked for them :D
#18
Dr. Dro
ARF: The 4080 and 4090 are a total waste of money in all scenarios. Of course, if you ask the majority of the brainwashed masses of average Joes and paid supporters, they will tell you that those cards are cheap and that more money should be asked for them :D
As a high-res enthusiast, Arf, you should know why they exist.

It's for people who push demanding monitors and all.
#19
ARF
Dr. Dro: As a high-res enthusiast, Arf, you should know why they exist.

It's for people who push demanding monitors and all
I am not saying anything about their existence in principle. I am questioning their market positioning against AMD's offerings, the RX 7900 XTX and RX 7900 XT, which are cheaper and superior at the same time.
But this requires a higher level of understanding that even the PC master race cannot seem to reach...
#20
THU31
Comparing an undervolted card to a different card at stock seems kind of weird. My undervolted 4070 peaks at ~150 W, and most of the time it's at 100-120 W when playing with a capped framerate.
#21
sephiroth117
If you want a fair comparison, it would be the 7800 XT undervolted vs. the 4070 undervolted too ;)

Ada Lovelace undervolts VERY well.
#22
AusWolf
ARF: I am not saying anything about their existence in principle. I am questioning their market positioning against AMD's offerings, the RX 7900 XTX and RX 7900 XT, which are cheaper and superior at the same time.
But this requires a higher level of understanding that even the PC master race cannot seem to reach...
The 4080 and 4090 are positioned as such for cultists who firmly believe that green is "the way it's meant to be played", and for people who want the best of the best for one reason or another. Also for people who gladly pay 1.5× the price for DLSS and a bit more RT performance, either because they want it or because the media said so. Technology and marketing are Nvidia's cash cows, even if said technology doesn't bring much to the table for the majority of gamers.

AMD's "fine wine" argument is a bit laughable, but RDNA 3 shows that there is some truth to it sometimes. With drivers that have less CPU overhead, RDNA 3 has the potential to come out being superior in CPU-limited scenarios, while also being cheaper.

I'm not saying that one side is right and the other is wrong, just that I don't think we've had this much differentiation in terms of technology and market positioning within the GPU industry since the 3dfx times.
#23
MarsM4N
A 9% performance drop looks quite high. I guess 200 W is too optimistic a target (especially for the 7000 series with its crappy power management) and bad frame times are dragging down the results. :wtf: We will never know, since she didn't test frame times. I'm pretty sure she missed the "sweet spot" by a tiny tad.
sephiroth117: If you want a fair comparison, it would be the 7800 XT undervolted vs. the 4070 undervolted too ;)

Ada Lovelace undervolts VERY well.
I would even throw in the 6800 XT. :laugh: They underclock way better and don't have downclocking problems at low/mid power states. They also scale better with capped frames than the 7000 series.
#24
Pumper
But it doesn't fix the absurdly high low-load power draw, which is more important unless you never do anything else with your PC other than gaming.
#25
las
Nice, but the 4070 can be undervolted as well.