Wednesday, May 17th 2023

Volt-modded RX 7900 XTX Hits 3.46 GHz, Trades Blows with RTX 4090

An AMD Radeon RX 7900 XTX graphics card is capable of trading blows with the NVIDIA GeForce RTX 4090, as overclocker jedi95 found out. With its power limits unlocked, the RX 7900 XTX was found to reach engine clocks as high as 3.46 GHz, significantly beyond the "architected for 3.00 GHz" claim AMD made at its product unveil last Fall. At these frequencies, the RX 7900 XTX trades blows with the RTX 4090, a segment above its current rival, the RTX 4080.

Squeezing 3.46 GHz out of the RX 7900 XTX is no child's play: jedi95 used an Elmor EVC2SE module to volt-mod an ASUS TUF Gaming RX 7900 XTX, essentially removing its power limit altogether. He then supplemented the card's power supply so it could draw as much as 708 W (peak) to hold its nearly 1 GHz overclock. A surprising aspect of this feat is that an exotic cooling solution, such as a liquid-nitrogen evaporator, wasn't used; a full-coverage water block and DIY liquid cooling did the job. The feat drops a major hint at how AMD could design the upcoming Radeon RX 7950 XTX despite having maxed out the "Navi 31" silicon with the RX 7900 XTX: the company could re-architect the power-supply design to significantly increase power limits, and possibly even get the GPU to boost around the 3 GHz mark.
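Dynamic power scales roughly with clock speed times core voltage squared, which puts the 708 W figure in perspective. Below is a back-of-the-envelope sketch in Python; the stock reference point is the card's 355 W TBP at its roughly 2.5 GHz boost clock, while both voltage values are illustrative guesses rather than measurements from the mod:

    def scaled_power(p_stock_w, f_stock_ghz, v_stock, f_oc_ghz, v_oc):
        # First-order CMOS dynamic-power scaling: P ~ f * V^2.
        return p_stock_w * (f_oc_ghz / f_stock_ghz) * (v_oc / v_stock) ** 2

    # Reference RX 7900 XTX: 355 W TBP at ~2.5 GHz boost (assumed ~1.00 V).
    # Volt-modded card: 3.46 GHz at an assumed ~1.20 V.
    print(f"{scaled_power(355, 2.5, 1.00, 3.46, 1.20):.0f} W")  # ~708 W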
Sources: jedi95 (Reddit), HotHardware

75 Comments on Volt-modded RX 7900 XTX Hits 3.46 GHz, Trades Blows with RTX 4090

#51
Makaveli
GodrillaSo there's hope for the 7950 xtx to compete with the 4090. @ Nvidia release the Titan.
And here I'm rocking a 300 watt power limit on the 4090 Suprim Liquid on a daily basis. Oh right, this is purely for benchmarking.
Not sure there is without blowing the power budget.
#52
Vayra86
MakaveliIt also affected single monitors, but it's worse on dual monitor setups; I believe it will be addressed by a driver update.

My old 6800XT used to idle around 7-8 watts on a single 34 inch ultrawide display.

On my 7900XTX, idle is 50 watts on the same display.

W1zzard's cards show less idle wattage, so it varies with your card and monitor combo.
This is really as old as multi-monitor and high refresh rates: the GPU needs a power state where it provides enough to drive a highly varied number of configs on the desktop. GPU idle isn't really idle; let's face it, you do get video from it all the time. Yawnfest, honestly. The fact some people single that out as a major thing - yeah, whatever. It'll get fixed eventually, most likely, like it did every time before.
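For anyone who would rather measure than argue, here is a minimal sketch of reading a Radeon card's reported board power at idle on Linux through the amdgpu hwmon interface; note the sensor file may be power1_average or power1_input depending on kernel version, and the exact path varies by system:

    import glob

    def radeon_board_power_watts(card="card0"):
        # amdgpu reports board power in microwatts via hwmon
        pattern = f"/sys/class/drm/{card}/device/hwmon/hwmon*/power1_*"
        for path in glob.glob(pattern):
            if path.endswith(("power1_average", "power1_input")):
                with open(path) as f:
                    return int(f.read()) / 1_000_000
        raise FileNotFoundError("no amdgpu power sensor found")

    print(f"{radeon_board_power_watts():.1f} W at the desktop")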
#53
Lionheart
fevgatosAre you suggesting there is not a single game in which the 4070ti beats the 7900xtx? What amdkoolaid are you drinking?
Give me a break! I can say the same thing about the RX 7900 XT beating the RTX 4080 in a few games, even though the 4080 is clearly the superior product! It's your cringeworthy wording that makes it obvious where your allegiance lies.
#54
Vayra86
LionheartGive me a break! I can say the same thing about the RX 7900 XT beating the RTX 4080 in a few games, even though the 4080 is clearly the superior product! It's your cringeworthy wording that makes it obvious where your allegiance lies.
The original comment was about a value proposition, and the reply he gave completely omits the fact that every equally performant RTX card is a whole lot more expensive - despite the fact that it was a direct response to someone talking about a 6-900 dollar price gap.

'Value', in his abstract mind, is a weird construction of highly variable RT performance that always compensates perfectly for Nvidia's price premium over comparable raster performance; good luck discussing with that. This is the same person that then continues the argument on chiplets being worse for high idle power, himself running top-end Intel xx900K CPUs. Undervolted, obviously :D

I hear they have special places for this kind of logic, and I do know how to approach said person going forward :)
#55
nguyen
Why are there people discussing the terrible "900usd" 7900XT in this thread LOL

The 7900XTX loses to the 4090 in 99.9% of games out there; people have to rely on cherry-picked games in order to defend it :D
#56
fevgatos
LionheartGive me a break! I can say the same thing about the RX 7900 XT beating the RTX 4080 in a few games, even though the 4080 is clearly the superior product! It's your cringeworthy wording that makes it obvious where your allegiance lies.
I think you completely missed the point of my post. I'm saying that the fact that a 4070ti might win in the odd game against the XTX doesn't mean anything - at all. In the same fashion, the fact that the xtx can win in the odd game against a 4090 means nothing either.
Vayra86The original comment was about a value proposition, and the reply he gave completely omits the fact that every equally performant RTX card is a whole lot more expensive - despite the fact that it was a direct response to someone talking about a 6-900 dollar price gap.

'Value', in his abstract mind, is a weird construction of highly variable RT performance that always compensates perfectly for Nvidia's price premium over comparable raster performance; good luck discussing with that. This is the same person that then continues the argument on chiplets being worse for high idle power, himself running top-end Intel xx900K CPUs. Undervolted, obviously :D

I hear they have special places for this kind of logic, and I do know how to approach said person going forward :)
I'm running 3 Zen chips and a 12900k. The 12900k is the most efficient of them all in my type of workloads. You want me to lie about it? I don't get your point, honestly. AMD cpus are better in the laptop space, that's why my laptop has an amd cpu, but on the desktop, for my kind of workloads, intel is better. If you can show me a desktop Zen pulling 6 watts while running 3 streams, a browser and a Discord call, I'll change my mind.


Funnily enough, the person you are talking to - who seems to agree with you - has a 4080. So he thinks that the RT performance is actually worth the extra money over the XTX, you know, the same thing you criticize me for, lol.
#57
Vayra86
fevgatosI think you completely missed the point of my post. I'm saying that the fact that a 4070ti might win in the odd game against the XTX doesn't mean anything - at all. In the same fashion, the fact that the xtx can win in the odd game against a 4090 means nothing either.
Right, but you omitted the crucial detail here: the immense price difference. This wasn't about top-end or cherry-picked performance to begin with; it's about the combination of a major price difference and still occasionally being close to top-end performance. So the 4070ti has a few unicorns too, but that GPU is a mere 200-250 bucks cheaper, not 600. In your mind these things are both equally irrelevant, but they're really not. They're both outliers, but one is far bigger than the other.

Other than that, I already commented at length about your use case; like I said, great read. I'm not going to dissect this further for you. If you say you're not biased, that is what it is, but it sure doesn't read that way to me.
#58
fevgatos
Vayra86Right, but you omitted the crucial detail here: the immense price difference. This wasn't about top-end or cherry-picked performance to begin with; it's about the combination of a major price difference and still occasionally being close to top-end performance. So the 4070ti has a few unicorns too, but that GPU is a mere 200-250 bucks cheaper, not 600. In your mind these things are both equally irrelevant, but they're really not.

Other than that, I already commented at length about your use case; like I said, great read. I'm not going to dissect this further for you. If you say you're not biased, that is what it is, but it sure doesn't read that way to me.
But the XTX is not close to top end performance. At least not any closer than a 4080 is, since they basically average around the same FPS.

Whether or not I'm biased is irrelevant, honestly. Either what I'm saying is true, or it isn't. Being biased or not won't make a false statement true or a true statement false. I can claim you are biased, but it won't change a thing; it's completely irrelevant.
#59
mama
Interesting side discussion, but the issue is the 7900xtx being pushed to a performance level not seen before. I'm wondering whether something Moore's Law is Dead speculated is true. He suggested the 7900xtx's performance was significantly better initially, but they had to gimp the card because of weird artifacting to get it out by the release date. Is there untapped potential here?
#60
Makaveli
nguyenWhy are there people discussing the terrible "900usd" 7900XT in this thread LOL

The 7900XTX loses to the 4090 in 99.9% of games out there; people have to rely on cherry-picked games in order to defend it :D
Because this thread isn't about that.

Take off the green tinted glasses my guy.

The topic is about an overclocked 7900XTX.
#61
nguyen
MakaveliBecause this thread isn't about that.

Take off the green tinted glasses my guy.

The topic is about an overclocked 7900XTX.
Massively Overclocked 7900XTX vs 4090? Said so in the title
#62
Godrilla
mamaInteresting side discussion, but the issue is the 7900xtx being pushed to a performance level not seen before. I'm wondering whether something Moore's Law is Dead speculated is true. He suggested the 7900xtx's performance was significantly better initially, but they had to gimp the card because of weird artifacting to get it out by the release date. Is there untapped potential here?
If there is untapped potential, the time for some Hadouken is now!
Update: unless they are waiting for next gen to compete with their own lineup, like they are doing now.
#63
Hxx
fevgatosThe difference is negligible, like less than 5% on average, while the xtx is much bigger in transistor count, bus width etc.
Averages are misleading in this case, because your average gamer prefers certain types of games - maybe a couple of genres - not 20 games across all types. Averages are more of an informative benchmark and shouldn't guide a purchasing decision on their own. If you play lots of COD you won't be buying a 4080; if you play lots of Metro-type games, then yes, probably over a 7900xtx.
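To make that concrete, a toy sketch in Python; every FPS number below is invented purely for illustration, not taken from any review:

    # Invented FPS numbers, for illustration only.
    bench = {
        "COD-style shooter":   {"7900XTX": 139, "4080": 117},
        "Cyberpunk-style RPG": {"7900XTX": 71,  "4080": 75},
        "Metro-style shooter": {"7900XTX": 78,  "4080": 84},
    }

    def avg_fps(card, games):
        return sum(fps[card] for fps in games.values()) / len(games)

    # Overall averages land within a few percent of each other...
    print(avg_fps("7900XTX", bench), avg_fps("4080", bench))  # 96.0 vs 92.0

    # ...but a player who only plays the shooter sees a ~19% gap.
    cod = {k: v for k, v in bench.items() if k.startswith("COD")}
    print(avg_fps("7900XTX", cod), avg_fps("4080", cod))      # 139.0 vs 117.0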
#64
sethmatrix7
nguyenonly 17% gain in CP2077, meanwhile 4090 is on average 25% faster than 7900XTX.

So yeah, trade blows in CP2077 and Timespy only LOL
Where on this green earth is the 4090 25% faster than a 7900xtx? www.techpowerup.com/review/amd-radeon-rx-7900-xtx/32.html

Not that it matters, but the 7900XTX does win in Battlefield 5 and Far Cry 6 per the review
#65
Godrilla
sethmatrix7Where on this green earth is the 4090 25% faster than a 7900xtx? www.techpowerup.com/review/amd-radeon-rx-7900-xtx/32.html

Not that it matters, but the 7900XTX does win in Battlefield 5 and Far Cry 6 per the review
Not trying to trigger anyone, but it seems to match performance (anything less than a 5% delta is insignificant) in 2 outliers. The take-home message is as it always was: if you want to play games in rasterization only, then team Red is great, but if you want RT on, heck, even the 4090 gets slaughtered in Cyberpunk 2077 with RT Overdrive. The 7900 xtx is already great for rasterization at default; matching the 4090 in rasterization wouldn't really change much, IMO, would it?
#66
nguyen
sethmatrix7Where on this green earth is the 4090 25% faster than a 7900xtx? www.techpowerup.com/review/amd-radeon-rx-7900-xtx/32.html

Not that it matters, but the 7900XTX does win in Battlefield 5 and Far Cry 6 per the review
When you take the average result across multiple review sites, the 4090 is 27.7% faster in rasterization and 70% faster in RT vs the 7900XTX
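As an aside on method: combining relative-performance numbers from several reviews is usually done with a geometric mean, since each number is a ratio. A quick sketch; the per-site figures below are placeholders, not the actual data behind the 27.7%:

    from math import prod

    # Placeholder per-site raster ratios (4090 fps / 7900XTX fps).
    site_ratios = [1.25, 1.31, 1.27]

    # Geometric mean: n-th root of the product of the ratios.
    geomean = prod(site_ratios) ** (1 / len(site_ratios))
    print(f"4090 ahead by {100 * (geomean - 1):.1f}% on aggregate")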
#67
TheoneandonlyMrK
nguyenWhy are there people discussing the terrible "900usd" 7900XT in this thread LOL

The 7900XTX loses to the 4090 in 99.9% of games out there; people have to rely on cherry-picked games in order to defend it :D
Because people similar to you keep going on about alternatives to the OP topic.

Likely to massage their ego that their precious 4090 is still best.
nguyenWhen you take the average result across multiple review sites, the 4090 is 27.7% faster in rasterization and 70% faster in RT vs the 7900XTX
I didn't post part A earlier, but who knew how fitting it would be.

And yet: add another circuit for power, a second for Vreg control, and a PSU for the PC, obviously.
Loads of effort and time, with only a water block, and you might break 3 GHz.

The point :)

Imagine what three times that time, a bigger PSU, and loads of Vaseline and LN2 might do. I sure couldn't be arsed, but I am intrigued.
#68
kapone32
nguyenWhy are there people discussing the terrible "900usd" 7900XT in this thread LOL

The 7900XTX loses to the 4090 in 99.9% of games out there; people have to rely on cherry-picked games in order to defend it :D
There is no one disputing the 4090. That is why it is $2500. The 7900XTX is, by comparison, "inexpensive" at $1399. I know people like to push absolutes, but for Gaming it would be, in my estimation, foolish to pay almost double for a card that is 10 to 15% faster in Gaming. I just finished a 3 hour session of KOA Remastered, and the 295 avg FPS at 4K is plenty for me to enjoy the Game. As I have stated before, owners of 7900 cards are not moaning about how much their card costs vs Nvidia; it's only users like you, who own 4000 GPUs and use day one reviews as the basis of all your arguments.

I will give you an example. The 7900XTX has the RT performance of the 3090. That doesn't matter, though, because the 4090 is faster at RT. Even today 3090s are $2200, so is it really worth that to get a 3090 over a 7900XTX? Is Nvidia RT worth the cost of a high end GPU? If you say yes, that is your opinion, but keep in mind that there are more Games than the 50-60 that all Reviewers use to establish the narrative, while seamless 4K 144Hz Gaming is true for ALL Games in your library, especially the fun ones, with a 7900 GPU.

The 7900XT is about 5-7% slower than the 7900XTX in all things, but saving $400 for 4GB less VRAM and a 349 watt power profile is plenty good to me.
#69
fevgatos
kapone32There is no one disputing the 4090. That is why it is $2500. The 7900XTX is, by comparison, "inexpensive" at $1399. I know people like to push absolutes, but for Gaming it would be, in my estimation, foolish to pay almost double for a card that is 10 to 15% faster in Gaming. I just finished a 3 hour session of KOA Remastered, and the 295 avg FPS at 4K is plenty for me to enjoy the Game. As I have stated before, owners of 7900 cards are not moaning about how much their card costs vs Nvidia; it's only users like you, who own 4000 GPUs and use day one reviews as the basis of all your arguments.

I will give you an example. The 7900XTX has the RT performance of the 3090. That doesn't matter, though, because the 4090 is faster at RT. Even today 3090s are $2200, so is it really worth that to get a 3090 over a 7900XTX? Is Nvidia RT worth the cost of a High end GPU? If you say yes, that is your opinion, but keep in mind that there are more Games than the 50-60 that all Reviewers use to establish the narrative, while seamless 4K 144Hz Gaming is true for ALL Games in your library, especially the fun ones, with a 7900 GPU.

The 7900XT is about 5-7% slower than the 7900XTX in all things, but saving $400 for 4GB less VRAM and a 349 watt power profile is plenty good to me.
In the same vein, it's also foolish to buy an xtx for $1399; the 4070ti is $800 and it's just 15% slower in raster and faster in RT.
#70
kapone32
fevgatosIn the same vein, it's also foolish to buy an xtx for $1399; the 4070ti is $800 and it's just 15% slower in raster and faster in RT.
Yep, 12GB of VRAM for $1100 Canadian - there you go. You know what else is true? The 6700XT is also less than half the price of that and is a perfectly viable 1440P card. There is no way that I would ever pay that much for 12GB of VRAM, and my last card was 16GB. In fact I would rather buy the 6800XT for $799 than a 4070TI for $1100, all day long.
#71
fevgatos
kapone32In fact I would rather buy the 6800XT for $799 than a 4070TI for $1100, all day long.
I know you would
#72
kapone32
fevgatosI know you would
Exactly, but you have unwavering support for Nvidia. I have legit reasons why I do not use Nvidia. In fact, the only thing I compare GPUs to is the previous gen. I do enjoy my laptop 3060, but I made sure it has a 1080P panel, so it's fine. For my 4K panel the only cards that work with no issues at 144Hz are 7000 GPUs and 4080/4090s; that is a fact. Once we establish that, cost must be the next factor. The prices of most things are coming back to Earth. SSDs are finally going the way of HDDs in terms of pricing, and we are starting to see 4TB NVME drives for under $300 (CAD). The Game promotions are back as well. CPUs have never been cheaper, and the newest platform is about 5% more expensive, but last gen is still plenty fast. On the Intel side there is a complete stack for you to pick from in terms of cost, cores and APUs.

The only thing that has not shifted is Nvidia's pricing. The 4070TI will not be successful. I have some anecdotal data: with Newegg you can put anything in your cart, and it will tell you how many other people have it in their cart as well. There was a 6750XT for $458 (I posted about it). I put it in my cart, but I also put in the cheapest 4070TI, and within 24 hours the 6750XT was gone (with a limit of 5), while it's been 4 days and that 4070TI is still there.

Unfortunately the narrative is so strong in North America that even when Nvidia's largest partner in the space makes a seemingly moral decision to no longer do business with them (just like XFX years ago), it does nothing more than create a ripple in the space; in fact there is no mention of EVGA and Nvidia at all in the tech media anymore, so it is no longer important. This in a world where, for an entire year, you could only buy an Nvidia GPU from Bestbuy, while social media was full of posts with farms of at least 20+ 3070/3080 cards.

The support for closed systems is also insane to me. I was a fan of PhysX. It was cool in Games at the time - so cool that Nvidia bought it and took it away. Then we shifted and started buying budget Nvidia cards to get PhysX (remember that?) until Nvidia wrote a driver that made it not work if it sensed an AMD GPU in the system.

Can we talk about SLI? People will tell you it didn't work, but they don't mention Crossfire. Most people are unaware that in the age of the RX480/580/Vega you could have Crossfire work at the driver level, to the point where, if the Game did not natively support Multi GPU, you could go into the script and enable Multi GPU (TW) and see almost double the frame rates in battles. I know people are going to talk about stuttering and whatever else, but we must remember that at that time 1440P was the high end, 1080P was the Gamer's resolution, 60Hz was undisputed, and you had people that refuted the benefits of 120Hz.

That led to Gsync/Freesync, and the truth is that changed the Game forever. What do I mean? The monitor and GPU give you butter smooth frames within a range. What does that mean? That when you are playing an ARPG and the screen gets filled with enemies, spells, arrows and explosions, there is no slowdown, stuttering or tearing. You could also take a battle in TW using the largest units in numbers, with as many characters as the Game can load, and see no slowdowns as you pan the map.

Then there's the Gaming performance of RDNA. I had a 5600XT and was satisfied. I bought a 6800XT after that, and that is a product that gives you the smile that compelling hardware brings the first time you use it. The 7900XT is much faster than the 6800XT, and you know what? It was actually cheaper too. I also had a 7900XTX, but something happened that caused me to get a 7900XT. From the time I heard about the specs of the 7900 series GPUs, I bought a Gigabyte FV43U - a 4K 144Hz panel. When using the 6800XT I had to play most Games at 1440P to enjoy 100+ FPS, but once the 7000 was installed, 4K 144 it is. It is so crazy that even Vsync works fully at 144Hz, but turn it off and enjoy the hell out of the Game. New Games especially are pushing the GPU, and playing Greedfall or Hogwarts will see a clock of 2600+ on 7000 GPUs all Game session long. If you really want to unlock your 7000, pair it with a 13th Gen; if you want to see what 4K Gaming really is, put in a 7000X3D chip and load up the VRAM buffer as you explore much more of your library than you intended. Yesterday I got the urge to play some KOA, and yes, that was 5 hours of marvelling at the beauty of that Game in terms of the use of color. Even some of the enemies are great to look at, but those FPS make this a sweet Action RPG in 3D.

You are saying to me, whether by inference or not, that none of that matters as long as the 4080/4070TI is faster in RT and supports DLSS. That is THE argument that Nvidia supporters post in every single thread about AMD cards.

#73
Dr. Dro
sethmatrix7Where on this green earth is the 4090 25% faster than a 7900xtx? www.techpowerup.com/review/amd-radeon-rx-7900-xtx/32.html

Not that it matters, but the 7900XTX does win in Battlefield 5 and Far Cry 6 per the review
I feel like this is just grasping at straws. Some games with a hilarious AMD bias, such as Call of Duty Warzone, indeed perform equal to or better than on the 4090, but those are extreme outliers. If you sample 100 games, the RTX 4090 will be faster in at least 96, and in 70 of those by high double digits. When you turn raytracing on, the 7900 XTX gets sent to the middle of the Ampere high-end pack, offering you anywhere between the 3080 Ti and the 3090 Ti, usually closer to the 3090's level of raytracing performance, all the while still not offering DLSS or the other RTX ecosystem features. In short: it's a worse card - if not in performance, definitely in experience.

However, the 7900 XTX does not need to be faster than the 4090 to be a successful product. It needs to offer desirable features, be affordable - and more importantly, the drivers can't let users down. These are three areas where AMD has consistently failed to deliver, even after the massive strides they have achieved in improving their high-level API UMDs as of late and rewriting large portions of the graphics driver stack. But they won't; they are blowing this golden opportunity they have to prove themselves. I'll be real with you, chief: if AMD sold the 7900 XTX for $699, I wouldn't have a SINGLE bone to pick with it, no matter how many features it lacked vs. the RTX 4080, or even if it had some nastier than usual driver bugs. It's right in my element, you see; I've been down that road for over 5 years now.

The RTX 4060 Ti unveiling is an unmitigated disaster. True to every Ada SKU released thus far, including the RTX 4090, it's a hollow shell of what it could have really been, upmarked and sold as a higher SKU than it ever had the right to be called. Reviewers have universally rejected and reviled it, and the RX 7600 is looking to repeat the exact same mistakes, offering the same deficient 8 GB frame buffer with even lower performance (as if it wasn't anemic enough to begin with - preliminary leaked benchmarks show it 20% behind the 4060 Ti in Time Spy), and fewer features to boot. Betcha they are gonna charge $330 for it.
#74
TheinsanegamerN
evernessinceThis is not a good take IMO. This is akin to saying Zen 1 was a terrible architecture because it wasn't the most efficient or fastest product on the market. You are ignoring the fact that the 7900 XTX only has a graphics die size of around 300mm2. The 4090, on the other hand, is a 608mm2 die. As we all know, breaking a die down into smaller pieces drastically reduces cost and increases yields, particularly when we are talking about large dies like GPUs.
OK, and what do higher yields from breaking off the memory controllers have to do with architectural superiority? The TOTAL die size of the 7900xtx, including its memory controllers, is 529mm2. The 4090 is not much larger, but it is significantly faster with comparable power use.
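As an aside, the yield half of the quoted claim is easy to sanity-check with a toy model. A rough sketch using a simple Poisson yield formula; the defect density is an assumed round number, not a real TSMC figure:

    from math import exp

    def poisson_yield(die_area_mm2, defects_per_cm2=0.1):
        # Classic Poisson model: yield = exp(-D0 * A), area converted to cm^2.
        return exp(-defects_per_cm2 * die_area_mm2 / 100)

    # ~300 mm^2 RDNA3 graphics die vs ~608 mm^2 monolithic AD102.
    for name, area in [("7900 XTX GCD", 300), ("AD102 (4090)", 608)]:
        print(f"{name}: ~{poisson_yield(area):.0%} estimated yield")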

Also, Zen 1 WAS terrible. The memory controller was atrocious, it was comparable to ye olde Sandy Bridge in games, and it couldn't OC worth a darn. The only benefit it had was that it was cheaper than Intel and offered more cores for compute tasks.

The framework laid by zen 1 is what was actually good.
evernessinceThat's aside from the other advantages of chiplets, like being able to stack cache, modularize parts of the GPU (which allowed AMD to increase the density of cores in its latest GPUs), exceed the reticle limit, improve binning, and more.
Absolutely none of which is helping AMD win on the arch front seeing how well the 4090 performs.

You should really learn what whataboutism is.
evernessinceArchitecturally speaking, Ada is Ampere on a much better node with extra cache tacked on - and not the fancy stacked cache, either. RDNA3 simply being the first dGPU based on a chiplet design makes it more impressive than adding more cache to a monolithic die as Nvidia has done.
Architecturally speaking, Ada has provided a significant jump in performance and efficiency. RDNA3 has done neither, as demonstrated by the huge disparity between core size and performance gain on the 7900 series, and the wet fart that is the 7600.
evernessinceThe only shame is that AMD didn't go for the throat with pricing like they did with Zen 1. The 4090 is an impressive card but the architecture itself is more a beneficiary of node improvements than itself being anything revolutionary.
By that standard the last time AMD produced an impressive arch was the original GCN in 2012.
#75
TheoneandonlyMrK
TheinsanegamerNOK, and what do higher yields from breaking off the memory controllers have to do with architectural superiority? The TOTAL die size of the 7900xtx, including its memory controllers, is 529mm2. The 4090 is not much larger, but it is significantly faster with comparable power use.

Also, Zen 1 WAS terrible. The memory controller was atrocious, it was comparable to ye olde Sandy Bridge in games, and it couldn't OC worth a darn. The only benefit it had was that it was cheaper than Intel and offered more cores for compute tasks.

The framework laid by zen 1 is what was actually good.

Absolutely none of which is helping AMD win on the arch front seeing how well the 4090 performs.

You should really learn what whataboutism is.

Architecturally speaking, Ada has provided a significant jump in performance and efficiency. RDNA3 has done neither, as demonstrated by the huge disparity between core size and performance gain on the 7900 series, and the wet fart that is the 7600.

By that standard the last time AMD produced an impressive arch was the original GCN in 2012.
Depends what impressed you; I think you're being a tad harsh.
It's worth remembering that AMD went with a stepping stone of evolved 2.5D designs, while Nvidia went again with a maximum monolithic die. They can indeed keep doing this, though even they won't forever, and are not on the compute side (Hopper); but again, AMD are stretching a lead in chip-aggregation designs IMHO.
Given the feat, it's clear that with ridiculous extra effort even the 7900 got close; the 4090 could also be tuned like a loon, though. RDNA3 for me brought enough performance, efficiency, ray tracing, and new features; it just cost arse.
But it was a five year update, tbf.