Wednesday, May 17th 2023
Volt-modded RX 7900 XTX Hits 3.46 GHz, Trades Blows with RTX 4090
An AMD Radeon RX 7900 XTX graphics card is capable of trading blows with the NVIDIA GeForce RTX 4090, as overclocker jedi95 found out. With its power limits unlocked, the RX 7900 XTX reached engine clocks as high as 3.46 GHz, significantly beyond the "architected for 3.00 GHz" claim AMD made at its product unveiling last fall. At these frequencies, the RX 7900 XTX trades blows with the RTX 4090, a card positioned a segment above its current rival, the RTX 4080.
Squeezing 3.46 GHz out of the RX 7900 XTX is no child's play: jedi95 used an Elmor EVC2SE module to volt-mod an ASUS TUF Gaming RX 7900 XTX, essentially removing its power limit altogether. He then supplemented the card's power supply so it could draw as much as 708 W (peak) to hold its nearly 1 GHz overclock. A surprising aspect of this feat is that no exotic cooling solution, such as a liquid-nitrogen evaporator, was used; a full-coverage water block and DIY liquid cooling did the job. The feat drops a major hint at how AMD could design the upcoming Radeon RX 7950 XTX despite having maxed out the "Navi 31" silicon with the RX 7900 XTX: the company could re-architect the power-supply design to significantly increase power limits, and possibly even get the GPU to boost to around the 3 GHz mark.
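To put the figures above in perspective, here is a rough headroom calculation. The stock numbers used below (2.5 GHz reference boost clock, 355 W reference TBP) are AMD's published reference specs for the RX 7900 XTX and are assumptions for illustration; only the 3.46 GHz clock and 708 W peak draw come from the article.

```python
# Rough headroom math for the volt-modded RX 7900 XTX described above.
# Stock figures are assumed reference specs, not numbers from the article.
STOCK_BOOST_GHZ = 2.5   # assumed RX 7900 XTX reference boost clock
STOCK_TBP_W = 355       # assumed RX 7900 XTX reference total board power

modded_clock_ghz = 3.46  # from the article
modded_peak_w = 708      # from the article

# Relative gains of the volt-modded card over the assumed reference card
clock_gain = (modded_clock_ghz - STOCK_BOOST_GHZ) / STOCK_BOOST_GHZ
power_ratio = modded_peak_w / STOCK_TBP_W

print(f"clock gain:  {clock_gain:.0%} over reference boost")
print(f"power draw:  {power_ratio:.2f}x reference TBP")
```

Under those assumptions, the mod buys roughly a 38% clock uplift for about double the board power, which illustrates why a hypothetical RX 7950 XTX with a beefed-up power-delivery design could plausibly boost well past stock clocks on the same silicon.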
Sources:
jedi95 (Reddit), HotHardware
75 Comments on Volt-modded RX 7900 XTX Hits 3.46 GHz, Trades Blows with RTX 4090
'Value', in his abstract mind, is a weird construction of highly variable RT performance that always compensates perfectly for Nvidia's price premium over comparable raster performance - good luck arguing with that. This is the same person who then continues the argument with chiplets being worse for high idle power, while himself running top-end Intel xx900K CPUs. Undervolted, obviously :D
I hear they have special places for this kind of logic, and I do know how to approach said person going forward :)
The 7900XTX loses in 99.9% of games out there against the 4090, so people have to rely on cherry-picked games in order to defend it :D
Funnily enough, the person you are talking to - who seems to agree with you - has a 4080. So he thinks the RT performance is actually worth the extra money over the XTX - you know, the same thing you criticize me for, lol.
Other than that, I already commented at length about your use case - like I said, great read. I'm not going to dissect this further for you; if you say you're not biased, that is what it is, but it sure doesn't read that way to me.
Whether or not I'm biased is irrelevant, honestly. Either what I'm saying is true or it isn't; being biased or not won't make a false statement true or a true statement false. I could claim you are biased, but it wouldn't change a thing. It's completely irrelevant.
Take off the green tinted glasses my guy.
The topic is about an overclocked 7900XTX.
update unless they are waiting for next gen to compete with their own lineup like they are doing now.
Not that it matters but the 7900XTX does win in Battlefield 5 and FarCry 6 per the review
Likely to massage their ego that their precious 4090 is still the best. I didn't post part A earlier, but who knew how fitting it would be.
And yet: add another circuit for power, a second for VReg control, and obviously a PSU for the PC - loads of effort and time - and with only a water block you might break 3 GHz.
The point:)
Imagine what three times that time, a bigger PSU, and loads of Vaseline and LN2 might do. I sure couldn't be arsed, but I am intrigued.
I will give you an example. The 7900XTX has the RT performance of the 3090. That doesn't matter, though, because the 4090 is faster at RT. Even today 3090s are $2,200, so is it really worth that to get a 3090 over a 7900XTX? Is Nvidia RT worth the cost of a high-end GPU? If you say yes, that's your opinion, but keep in mind that there are more games than the 50-60 that all reviewers use to establish the narrative, while seamless 4K 144 Hz gaming holds true for ALL the games in your library, especially the fun ones, with a 7900 GPU.
The 7900XT is about 5-7% slower than the 7900XTX in all things, but saving $400 for 4 GB less VRAM and a 349 W power profile is plenty good to me.
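The XT-vs-XTX value argument above can be sanity-checked with a quick perf-per-dollar sketch. The prices here are illustrative assumptions (the post only says the XT is "$400" cheaper, without naming exact prices), and the 6% performance gap is the midpoint of the poster's "5-7%" estimate.

```python
# Hypothetical perf-per-dollar comparison for the 7900 XT vs 7900 XTX.
# Prices are illustrative assumptions, $400 apart as the post describes.
xtx_price, xt_price = 1100.0, 700.0
xtx_perf = 1.00          # XTX as the performance baseline
xt_perf = 0.94           # "about 5-7% slower" -> ~6% midpoint

xtx_ppd = xtx_perf / xtx_price   # performance per dollar, XTX
xt_ppd = xt_perf / xt_price      # performance per dollar, XT

print(f"XT delivers {xt_ppd / xtx_ppd:.2f}x the perf-per-dollar of the XTX")
```

Under those assumed prices, giving up ~6% performance for $400 nets roughly 1.5x the performance per dollar, which is the trade-off the poster is pointing at.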
The only thing that has not shifted is Nvidia's pricing. The 4070 Ti will not be successful. I have some anecdotal data: with Newegg you can put anything in your cart, and it will tell you how many other people have it in their cart as well. There was a 6750XT for $458 (I posted about it). I put it in my cart, and I also put in the cheapest 4070 Ti. Within 24 hours the 6750XT was gone (with a limit of 5 per customer), while it's been 4 days and that 4070 Ti is still there.
Unfortunately, the narrative is so strong in North America that even when Nvidia's largest partner in the space made an outwardly moral decision to no longer do business with them (just like XFX years ago), it did nothing more than create a ripple. In fact, there is no mention of EVGA and Nvidia at all in the tech media anymore, so it is no longer important. This in a world where, for an entire year, you could only buy an Nvidia GPU from Best Buy, while social media was full of posts showing farms of at least 20+ 3070/3080 cards.
The support for closed systems is also insane to me. I was a fan of PhysX. It was cool in games at the time - so cool that Nvidia bought it and took it away. Then we shifted and started buying budget Nvidia cards just to get PhysX (remember that?), until Nvidia wrote a driver that made it not work if it sensed an AMD GPU in the system.

Can we talk about SLI? People will tell you it didn't work, but they don't mention CrossFire. Most people are unaware that in the age of the RX 480/580/Vega, CrossFire worked at the driver level, to the point where if a game did not natively support multi-GPU you could go into the script and enable it (TW) and see almost double the frame rates in battles. I know people are going to talk about stuttering and whatever else, but remember that at that time 1440p was the high end, 1080p was the gamer's resolution, 60 Hz was undisputed, and you had people who refuted the benefits of 120 Hz.

That led to G-Sync/FreeSync, and the truth is that changed the game forever. What do I mean? The monitor and GPU give you butter-smooth frames within a range. What does that mean? That when you are playing an ARPG and the screen fills with enemies, spells, arrows and explosions, there is no slowdown, stuttering or tearing. You could also take a battle in TW using the largest units in numbers, with as many characters as the game can load, and see no slowdowns as you pan the map.
The gaming performance of RDNA: I had a 5600XT and was satisfied. I bought a 6800XT after that, and that is a product that gives you the smile that compelling hardware brings the first time you use it. The 7900XT is much faster than the 6800XT, and you know what? It was actually cheaper too. I also had a 7900XTX, but something happened that caused me to get a 7900XT. From the time I heard about the specs of the 7900-series GPUs, I bought a Gigabyte FV43U - that is a 4K 144 Hz panel. With the 6800XT I had to play most games at 1440p to enjoy 100+ FPS, but once the 7000 card was installed, 4K 144 it is. It is so crazy that even Vsync works fully at 144 Hz - but turn it off and enjoy the hell out of the game. New games especially push the GPU; playing Greedfall or Hogwarts will see a clock of 2600+ MHz on 7000 GPUs all session long. If you really want to unlock your 7000 card, pair it with a 13th Gen; if you want to see what 4K gaming really is, put in a 7000X3D chip and load up the VRAM buffer as you explore much more of your library than you intended. Yesterday I got the urge to play some KOA, and yes, that was 5 hours of marvelling at that game's beautiful use of color. Even some of the enemies are great to look at, and those FPS make this a sweet action RPG in 3D.
You are saying to me, whether by implication or not, that none of that matters as long as the 4080/4070 Ti is faster in RT and supports DLSS. That is THE argument Nvidia supporters post in every single thread about AMD cards.
However, the 7900 XTX does not need to be faster than the 4090 to be a successful product. It needs to offer desirable features, be affordable, and - more importantly - the drivers can't let users down. These are three areas where AMD has consistently failed to deliver, even after the massive strides they have made lately in improving their high-level API UMDs and rewriting large portions of the graphics driver stack. But they won't; they are blowing the golden opportunity they have to prove themselves. I'll be real with you, chief: if AMD sold the 7900 XTX for $699, I wouldn't have a SINGLE bone to pick with it, no matter how many features it lacked vs. the RTX 4080, or even if it had some nastier-than-usual driver bugs - it's right in my element, you see; I've been down that road for over 5 years now.
The RTX 4060 Ti unveiling is an unmitigated disaster. True to every Ada SKU released thus far, including the RTX 4090, it's a hollow shell of what it could have been, upmarked and sold as a higher SKU than it ever had the right to be called. Reviewers have universally rejected and reviled it, and the RX 7600 looks set to repeat the exact same mistakes: the same deficient 8 GB frame buffer, with even lower performance (as if it wasn't anemic enough to begin with - preliminary leaked benchmarks show it 20% behind the 4060 Ti in Time Spy), and fewer features to boot. Betcha they are gonna charge $330 for it.
Also, Zen 1 WAS terrible. The memory controller was atrocious - it was comparable to ye olde Sandy Bridge in games - and it couldn't OC worth a darn. The only benefit it had was that it was cheaper than Intel and offered more cores for compute tasks.
The framework laid by Zen 1 is what was actually good. Absolutely none of which is helping AMD win on the arch front, seeing how well the 4090 performs.
You should really learn what whataboutism is. Architecturally speaking, Ada has provided a significant jump in both performance and efficiency. RDNA3 has done neither, as demonstrated by the huge disparity between die size and performance gain on the 7900 series, and the wet fart that is the 7600. By that standard, the last time AMD produced an impressive arch was the original GCN in 2012.
It's worth remembering that AMD took a stepping-stone approach with evolved 2.5D designs, while Nvidia went again with a maxed-out monolith. They can indeed keep doing this, though even they won't, and are not (Hopper); but again, AMD is stretching a lead in chip-aggregate designs IMHO.
Given the feat, it's clear that with ridiculous extra effort even the 7900 got close, and the 4090 could also be tuned like a loon. For me, RDNA3 brought enough performance, efficiency, ray tracing, new features etc. - it just cost arse.
But it was a five-year update, tbf.