
Volt-modded RX 7900 XTX Hits 3.46 GHz, Trades Blows with RTX 4090

Joined
Jun 14, 2020
Messages
3,423 (2.11/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
The problem with RT is that we have no sane metric for it. The performance drop you get - in both camps - depends heavily on the amount of RT used. So we can compare architectures, and sure, Nvidia HAS more perf on offer there, but it's misty as fuck. Fact is, every single GPU chokes hard on RT today, with perf loss north of 40% just for mild IQ improvements. It's not sustainable 'as is'.
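To put a number on what 'north of 40%' means, here's a throwaway sketch of about the only metric that works today - the relative FPS you give up by turning RT on. The 120/70 figures are made up for illustration, not benchmarks:

```python
# Illustrative only: "RT cost" as the fraction of FPS lost with RT enabled.
def rt_cost(fps_raster: float, fps_rt: float) -> float:
    """Fraction of performance lost when enabling ray tracing (0..1)."""
    return 1.0 - fps_rt / fps_raster

# e.g. a hypothetical scene at 120 fps raster dropping to 70 fps with RT:
print(f"{rt_cost(120, 70):.0%} perf loss")  # ~42%, i.e. "north of 40%"
```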

The reason I say this is because we're still in the early adopter days. There is no transparency here wrt which architecture will remain the best fit going forward. What we are seeing, though, is that engines deploy lighter / more adaptive technologies to cater to different performance levels. Nvidia is capable of leveraging their proprietary approach today, but that advantage is going to end soon - and it's not an architectural advantage so much as an optimization exercise with developers.

When RT gets more mainstream and there is more consensus on how it gets used, we can get a much better view on what the architectural impact on RT perf really is. Today it's simply a bonus, at best, and many gamers will be turning it off, especially lower down the stack.

The analogy to core counts on CPUs though... lol. There is a lot wrong with that - if games were primarily single-threaded, guess what, the 8-core CPU is definitely the more competitive product. And if it has, say, a fast cache to accelerate gaming workloads, tops out at 8 cores, and beats high-core-count adversaries in the other camp on a smaller footprint, then yes, it is architecturally stronger, at least for that use case. So I guess if you buy into the RT fairy tale, you can defend Ada as stronger for your use case right now.

But there is the long-term perspective here, and with architectures that one matters. Nvidia I'm sure has a plan going forward, but the fact is, ~600 mm² is the end of the road for GeForce products, and they're on it, while giving that same SKU a very high power limit on a very efficient node. All I see here is another Intel, to be fair, with the added point of using RT/DLSS to avoid the death of monolithic instead of a big.LITTLE approach :)
You are missing the point. I'm not arguing about the usefulness of RT. Let's, for the sake of argument, agree that it's completely garbage and no game should ever use it. Still, when you want to compare architectural efficiency, you need to take that into account. If half the 4090 is used for RT acceleration (it's not, just an example), then just comparing transistors to raster performance would get you completely flawed results. There are purely RT workloads / benchmarks you can use for such a comparison. 3DMark has such a test, for example.

Regarding your comment about CPUs, your argument leads to the 7700X being better architecturally than a 7950X, since they perform similarly in games. Which is just not true; it's just a flawed comparison, exactly like comparing the 4090 purely on raster is flawed.

Anyway, we have the 4080 vs the XTX comparison; AMD is achieving much less with much more. Bigger die, bigger bus width, more VRAM, more of everything, and it still just ties the 4080 in raster and loses in RT.
 
Joined
Dec 25, 2020
Messages
6,669 (4.68/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Architectural superiority?!

Did you consider this?

(attachments: die shot comparison, 296480 vs 296482)

And let's not forget the 5nm GCD is only 300 mm² as well.

Ada is not superior; it's denser, a bigger die, and a LOT more expensive to bake. Most other metrics that originate from the hardware are much the same, such as power consumption / W per frame. That ~20% gap in raster translates almost directly into the transistor count gap as well. What else has Nvidia got? Featureset. Not hardware or architecture. That's why they push DLSS3 so hard.

This is the equivalent of early-day Ryzen vs Intel monolithic, and it's clear AMD is building a competitive advantage if you look at die size / perf.
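A quick sanity check of that transistor-gap point - a minimal sketch using the public transistor counts, and assuming the ~20% raster gap discussed in this thread (not a measured benchmark):

```python
# Back-of-the-envelope perf-per-transistor comparison.
chips = {
    # name: (transistors in billions, relative raster perf - thread's ballpark)
    "AD102 / RTX 4090":   (76.3, 1.20),
    "Navi 31 / 7900 XTX": (57.7, 1.00),
}
for name, (xtors_b, perf) in chips.items():
    print(f"{name}: {perf / xtors_b:.4f} relative raster per billion transistors")
# The ~20% perf gap is smaller than the ~32% transistor gap, which is the
# "translates almost directly" point above.
```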


Oh lol, missed this one :D

With 25% of the cache (very transistor heavy) plus 12.5% of SMs/CUDA cores disabled, how much of that die area is simply dark on an RTX 4090, and how much transistor logic did AMD simply move off-die into the MCDs? There's also the consideration of dedicated tensor cores, which add die area but aren't normally used for regular shader processing.

I think comparing AD102 and N31 on die area and transistor count isn't going to be at all accurate in justifying it as a smaller processor. If anything, N31's design might actually be more complex, all things considered.

Not that either of us can actually verify our claims; I just strongly believe that for what they are, and with the intent of graphics, they are much closer than apart.
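Nobody outside Nvidia can answer the dark-area question precisely, but here's a rough sketch. Only the cut-downs are public spec (AD102: 144 SMs and 96 MB L2 on ~608 mm²; the 4090 ships with 128 SMs and 72 MB); the 45%/15% area shares are pure assumptions:

```python
# Crude estimate of fused-off ("dark") area on an RTX 4090.
DIE_MM2 = 608.0                             # AD102 die size
SM_AREA_SHARE, L2_AREA_SHARE = 0.45, 0.15   # ASSUMED fractions of die area

dark_mm2 = DIE_MM2 * (SM_AREA_SHARE * (144 - 128) / 144   # disabled SMs
                      + L2_AREA_SHARE * (96 - 72) / 96)   # disabled L2 cache
print(f"~{dark_mm2:.0f} mm^2 dark, ~{dark_mm2 / DIE_MM2:.0%} of the die")
```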
 
Joined
Apr 30, 2008
Messages
4,897 (0.81/day)
Location
Multidimensional
System Name Boomer Master Race
Processor Intel Core i5 12600H
Motherboard MinisForum NAB6 Lite Board
Cooling Mini PC Cooling
Memory Apacer 16GB 3200Mhz
Video Card(s) Intel Iris Xe Graphics
Storage Kingston 512GB SSD
Display(s) Sony 4K Bravia X85J 43Inch TV 120Hz
Case MinisForum NAB6 Lite Case
Audio Device(s) Built In Realtek Digital Audio HD
Power Supply 120w External Power Brick
Mouse Logitech G203 Lightsync
Keyboard Atrix RGB Slim Keyboard
VR HMD ( ◔ ʖ̯ ◔ )
Software Windows 11 Home 64bit
Benchmark Scores Don't do them anymore.
Uhm, that's silly. The 4070 Ti beats the 7900 XTX in some games as well. Is that a testament to how great a value it is?
Lol what Nvidia Koolaid are you drinking ....
 
Joined
Jun 14, 2020
Messages
3,423 (2.11/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Lol what Nvidia Koolaid are you drinking ....
Are you suggesting there is not a single game where the 4070 Ti beats the 7900 XTX? What AMD koolaid are you drinking?
 

Hxx

Joined
Dec 5, 2013
Messages
303 (0.08/day)
Anyway, we have the 4080 vs the XTX comparison; AMD is achieving much less with much more. Bigger die, bigger bus width, more VRAM, more of everything, and it still just ties the 4080 in raster and loses in RT.
Ignoring everything else but raster, the 7900 XTX is faster than a 4080 in most games, including minimum FPS. Depending on which games you prefer, it can be noticeably faster in a few titles. I don't recall seeing a title where the 4080 is noticeably faster (maybe Civilization).
 
Joined
Jun 14, 2020
Messages
3,423 (2.11/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Ignoring everything else but raster, the 7900 XTX is faster than a 4080 in most games, including minimum FPS. Depending on which games you prefer, it can be noticeably faster in a few titles. I don't recall seeing a title where the 4080 is noticeably faster (maybe Civilization).
The difference is negligible, like less than 5% on average, while the XTX is much bigger in transistor count, bus width, etc.
 
Joined
Jun 2, 2017
Messages
9,090 (3.33/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
How many PC games are out there? How many of those games support ray tracing? If ray tracing uptake were even 5% of the overall library, it would matter more than it does now, at less than 1% of all the games you can play on PC. The people arguing that the 4070 Ti is faster than the 7900 XTX are looking at day-one numbers and not taking into account that owners of 7000 series cards are not complaining about the issues people have blasted all over social media about console ports on cards with less than 12 GB.

I will bring up my second argument: when I see ray tracing in TWWH, XCOM or ARPGs, I will care more. I have not seen many posts about the ray tracing performance of Diablo 4, but there are plenty of people enjoying it, and the most popular game right now, Age of Wonders 4, has no posts on TPU about ray tracing performance.

There are so many culture-war issues in today's society that the truth is hard to see. I will expand: there have been posts on TPU establishing that TPU users are AMD-centric. I look at it another way. Look at the cost of the 4090 vs the cost of the 7900 XTX: for the difference you could go from an Asus Prime B650 to an MSI X670E Ace Max, from a 7600 to a 7950X, buy the most expensive Cooler Master HAF case, or go full ARGB and buy into the Corsair ecosystem instead of a Deepcool Matrexx 55. Is RT really worth that much?

The thing 7000 series owners appreciate is that you generally don't have to do anything to enjoy 4K high-refresh-rate panels with high, butter-smooth FPS and high fidelity (with mini-LED you can generally turn the contrast and colors way up). Games have come full circle, with the best Japanese titles coming to the PC master race; there are more people excited about Armored Core 6 than not. Now we get updates for games like CP2077 and Gears 5, and the amount of quality out there is so much more than just those - Greedfall, anyone? There are also plenty of older games like Red Faction Guerrilla that are sweet on modern systems.

Unreal 5 and open adoption of DX ray tracing is the last piece that the RT fans don't give context to. The PS5 and Xbox are adopting ray tracing on Zen-based CPUs and RDNA 2-class (6000 series) GPUs. Those 3 months AMD spent updating the 7000 series were also to make sure everything worked with the 6000 series and provided the benefits of more bandwidth. We will see in about a year, when RT games look no better than Unreal 5, but there are tons more Unreal 5 games with native DX ray tracing that don't need Nvidia's version - the PC has many different software options, but consoles are all about gaming.

If you really step back, you will see many similarities to the FreeSync vs G-Sync debate, and we all know how that turned out. It's like the people that bash FSR, when it's the only thing any GPU vendor has given you for free that brings performance improvements to 10-year-old cards. Of course someone is going to tell me how much better DLSS is than FSR, but DLSS is not open. Sure, Nvidia released a DLSS SDK to the wild, but will that support your 2000 series card or 1080 Ti?
 
Joined
Sep 17, 2014
Messages
22,417 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
How many PC games are out there? How many of those games support ray tracing? If ray tracing uptake were even 5% of the overall library, it would matter more than it does now, at less than 1% of all the games you can play on PC. The people arguing that the 4070 Ti is faster than the 7900 XTX are looking at day-one numbers and not taking into account that owners of 7000 series cards are not complaining about the issues people have blasted all over social media about console ports on cards with less than 12 GB.

I will bring up my second argument: when I see ray tracing in TWWH, XCOM or ARPGs, I will care more. I have not seen many posts about the ray tracing performance of Diablo 4, but there are plenty of people enjoying it, and the most popular game right now, Age of Wonders 4, has no posts on TPU about ray tracing performance.

There are so many culture-war issues in today's society that the truth is hard to see. I will expand: there have been posts on TPU establishing that TPU users are AMD-centric. I look at it another way. Look at the cost of the 4090 vs the cost of the 7900 XTX: for the difference you could go from an Asus Prime B650 to an MSI X670E Ace Max, from a 7600 to a 7950X, buy the most expensive Cooler Master HAF case, or go full ARGB and buy into the Corsair ecosystem instead of a Deepcool Matrexx 55. Is RT really worth that much?

The thing 7000 series owners appreciate is that you generally don't have to do anything to enjoy 4K high-refresh-rate panels with high, butter-smooth FPS and high fidelity (with mini-LED you can generally turn the contrast and colors way up). Games have come full circle, with the best Japanese titles coming to the PC master race; there are more people excited about Armored Core 6 than not. Now we get updates for games like CP2077 and Gears 5, and the amount of quality out there is so much more than just those - Greedfall, anyone? There are also plenty of older games like Red Faction Guerrilla that are sweet on modern systems.

Unreal 5 and open adoption of DX ray tracing is the last piece that the RT fans don't give context to. The PS5 and Xbox are adopting ray tracing on Zen-based CPUs and RDNA 2-class (6000 series) GPUs. Those 3 months AMD spent updating the 7000 series were also to make sure everything worked with the 6000 series and provided the benefits of more bandwidth. We will see in about a year, when RT games look no better than Unreal 5, but there are tons more Unreal 5 games with native DX ray tracing that don't need Nvidia's version - the PC has many different software options, but consoles are all about gaming.

If you really step back, you will see many similarities to the FreeSync vs G-Sync debate, and we all know how that turned out. It's like the people that bash FSR, when it's the only thing any GPU vendor has given you for free that brings performance improvements to 10-year-old cards. Of course someone is going to tell me how much better DLSS is than FSR, but DLSS is not open. Sure, Nvidia released a DLSS SDK to the wild, but will that support your 2000 series card or 1080 Ti?
Amen to this whole story and perspective, man. You really hit the nail on the head when you say 'just simple out-of-the-box perf'. It really is like that. Shit just runs; I don't have to even consider FSR or RT, even if you can turn it on, and I don't have to pixel peep to spot differences between settings. Just push every slider to the far right, disable the annoying post FX, 3440x1440 native, and go.

Frankly, this exact experience is what I always loved about Nvidia cards too. Pascal was definitely similar in experience. I don't need it any other way! I'm SO past comparing AA methods and whatnot. It's completely irrelevant, just like every option that isn't universally available in games on any card.

You are missing the point. I'm not arguing about the usefulness of RT. Let's, for the sake of argument, agree that it's completely garbage and no game should ever use it. Still, when you want to compare architectural efficiency, you need to take that into account. If half the 4090 is used for RT acceleration (it's not, just an example), then just comparing transistors to raster performance would get you completely flawed results. There are purely RT workloads / benchmarks you can use for such a comparison. 3DMark has such a test, for example.

Regarding your comment about CPUs, your argument leads to the 7700X being better architecturally than a 7950X, since they perform similarly in games. Which is just not true; it's just a flawed comparison, exactly like comparing the 4090 purely on raster is flawed.

Anyway, we have the 4080 vs the XTX comparison; AMD is achieving much less with much more. Bigger die, bigger bus width, more VRAM, more of everything, and it still just ties the 4080 in raster and loses in RT.
I think you and @Dr. Dro are right. In the current state of affairs, Ada is the better one. I think the point I'm pushing on is more the perspective of its long-term prospects - its potential, if you will. That's an old AMD story... so much potential, but... This time, though, the potential is already proven with Zen. Chiplets work. I think that is why I view RDNA3 as superior. The cost and yield advantages are undeniably there and will bring a much bigger benefit to gaming perf than whatever Ada is.

Some specifics that are quite remarkable:
- perf/W is stellar on RDNA3, despite the presence of the interconnect, i.e. more hardware to facilitate its design
- support for an excellent VRAM amount within its TDP bracket
- price/frame substantially better than Ada (quick sketch below)

This is a much better list than what Ryzen 1 brought to the table. Chiplets are The Way.
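On the price/frame point in that list, a minimal sketch using launch MSRPs and the rough relative raster standings from this thread (XTX roughly ties the 4080, the 4090 is ~20% ahead):

```python
# Price per unit of relative raster performance, launch MSRPs.
cards = {
    # name: (launch MSRP in USD, relative raster perf - thread's ballpark)
    "RX 7900 XTX": (999,  1.00),
    "RTX 4080":    (1199, 1.00),
    "RTX 4090":    (1599, 1.20),
}
for name, (usd, perf) in cards.items():
    print(f"{name}: ${usd / perf:.0f} per unit of raster perf")
# XTX ~$999, 4080 ~$1199, 4090 ~$1333 -> Ada costs more per frame either way.
```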
 
Joined
Jun 2, 2017
Messages
9,090 (3.33/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Amen to this whole story and perspective, man. You really hit the nail on the head when you say 'just simple out-of-the-box perf'. It really is like that. Shit just runs; I don't have to even consider FSR or RT, even if you can turn it on, and I don't have to pixel peep to spot differences between settings. Just push every slider to the far right, disable the annoying post FX, 3440x1440 native, and go.

Frankly, this exact experience is what I always loved about Nvidia cards too. Pascal was definitely similar in experience. I don't need it any other way! I'm SO past comparing AA methods and whatnot. It's completely irrelevant, just like every option that isn't universally available in games on any card.

I don't use it either and have no need. The 2080 Ti was the fastest card ever made when it was released. They did intro ray tracing with that card, but there was no doubting its gaming performance.
 
Joined
Jun 14, 2020
Messages
3,423 (2.11/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
I think you and @Dr. Dro are right. In the current state of affairs, Ada is the better one. I think the point I'm pushing on is more the perspective of its long-term prospects - its potential, if you will. That's an old AMD story... so much potential, but... This time, though, the potential is already proven with Zen. Chiplets work. I think that is why I view RDNA3 as superior. The cost and yield advantages are undeniably there and will bring a much bigger benefit to gaming perf than whatever Ada is.

Some specifics that are quite remarkable:
- perf/W is stellar on RDNA3, despite the presence of the interconnect, i.e. more hardware to facilitate its design
- support for an excellent VRAM amount within its TDP bracket
- price/frame substantially better than Ada

This is a much better list than what Ryzen 1 brought to the table. Chiplets are The Way.
I don't like and will never like chiplets. I don't know what you mean by 'they work'. Meaning, you can use them? Sure. Are they better than monolithic? Hell no. You can see that even on CPUs: AMD has the better core design right now in terms of performance per core transistor and per watt, but they need a stupendous amount of power for simple tasks (or for just... sitting there idle) compared to Intel chips, because of how much the rest of the die / dies consume. I recently bought a 6900HS laptop and was astonished to find out it's extremely efficient at idle and light loads, contrary to desktop Zen parts, which just stink. Chiplets only work best for the company making them, not for the consumer buying / using them.
 
Joined
Apr 12, 2013
Messages
7,518 (1.77/day)
Except the 6900HS is not a chiplet-based CPU; also, optimization by the OEM/ODM is a thing, which is why virtually identical laptops can behave very differently if configured (im)properly.
 
Joined
Sep 17, 2014
Messages
22,417 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I don't like and will never like chiplets. I don't know what you mean by 'they work'. Meaning, you can use them? Sure. Are they better than monolithic? Hell no. You can see that even on CPUs: AMD has the better core design right now in terms of performance per core transistor and per watt, but they need a stupendous amount of power for simple tasks (or for just... sitting there idle) compared to Intel chips, because of how much the rest of the die / dies consume. I recently bought a 6900HS laptop and was astonished to find out it's extremely efficient at idle and light loads, contrary to desktop Zen parts, which just stink. Chiplets only work best for the company making them, not for the consumer buying / using them.
You do know what I mean by 'they work'; I literally just explained it. Idle usage is a thing now? I guess it's a thing for you; it seems awkwardly convenient too, given the general stance. I can't help but smell heavy bias, much the same as when you posit an example of how a 4070 Ti can beat a 7900 XTX... You don't see me saying a 7900 XTX can beat a 4090 in certain games either, do you? We both know it's an irrelevant outlier. You're entitled to your opinion; let's leave it there.

Chiplets seem to work well for gamers using an X3D CPU... - thát is the example I was pointing at earlier when I said smaller die, better arch, which you managed to turn into a 7700X comparison ;)
But yeah... that idle usage. Damn. Can't be having that now. Guess you're better off power limiting a 13th gen Core instead, yeah... I mean, we don't really need high min. FPS in games, right, with that idle usage. It's much better to use 3x the power under gaming load!

Here's a pro tip against idle usage: turn the PC off. You use less than a single watt. And it boots in 15 seconds.
 
Joined
Jul 13, 2016
Messages
3,270 (1.07/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
But the XTX is bigger - has more transistors - and is much slower in RT? How is this not a clear indication of architectural superiority? I mean, it's obvious that if the whole die of the 4080 or the 4090 were used for raster, the gap would be massive in raster, no?


This is akin to saying the 7950X3D is architecturally inferior to the 13900K because, if you add up the size of the two CPU core dies, the IO die, and the cache die, it works out to be larger than the 13900K. You are completely ignoring the factors underpinning that total die size, namely that one is chiplet-based and the other isn't.

Not really sure how the number of transistors is relevant here. Transistor count isn't indicative of a product being inherently more expensive or better. It varies widely, as different features on the die require differing amounts of transistors.

Would the 4080 be better without the RT parts? I'm not sure this is relevant, given AMD has RT acceleration built into its cores as well and would theoretically also gain by getting rid of those and replacing them with more raster. On the architectural level, I'm not sure I'd call AMD much slower in RT. If you look at Fortnite's semi-recent UE5 update, you can see the 7900 XTX can easily perform just as well as the 4080. Clearly there is a lot of performance left on the table in many games on RDNA3 cards (which makes sense given Nvidia sponsors so many of the games that do include RT), so I'd not rush to say AMD's RT is terrible on an architectural level when you cannot ignore the software component of that equation.

but they need a stupendous amount of power for simple tasks (or for just... sitting there idle) compared to Intel chips, because of how much the rest of the die / dies consume.

I can assure you they don't:

(charts: CPU power draw under load)


At idle both Intel and AMD consume around the same amount of power:

(chart: CPU idle power draw)


Chiplets are the future for desktop parts. Nvidia said as much in a 2017 white paper, and Intel is rushing as fast as it can to get them. There are far too many benefits to ignore for any chip larger than ~120 mm².
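The yield argument in a nutshell - a minimal sketch using the textbook Poisson defect model, with an assumed defect density (real foundry numbers aren't public):

```python
import math

# Fraction of good dies under a simple Poisson defect model: Y = exp(-D * A).
def die_yield(area_mm2: float, d0_per_cm2: float = 0.1) -> float:
    """d0_per_cm2 is an ASSUMED defect density, purely illustrative."""
    return math.exp(-d0_per_cm2 * area_mm2 / 100.0)  # mm^2 -> cm^2

print(f"600 mm^2 monolithic: {die_yield(600):.0%} good dies")  # ~55%
print(f"300 mm^2 GCD:        {die_yield(300):.0%} good dies")  # ~74%
```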


You do know what I mean by 'they work'; I literally just explained it. Idle usage is a thing now? I guess it's a thing for you; it seems awkwardly convenient too, given the general stance. I can't help but smell heavy bias, much the same as when you posit an example of how a 4070 Ti can beat a 7900 XTX... You don't see me saying a 7900 XTX can beat a 4090 in certain games either, do you? We both know it's an irrelevant outlier. You're entitled to your opinion; let's leave it there.

Chiplets seem to work well for gamers using an X3D CPU... - thát is the example I was pointing at earlier when I said smaller die, better arch, which you managed to turn into a 7700X comparison ;)
But yeah... that idle usage. Damn. Can't be having that now. Guess you're better off power limiting a 13th gen Core instead, yeah... I mean, we don't really need high min. FPS in games, right, with that idle usage. It's much better to use 3x the power under gaming load!

Here's a pro tip against idle usage: turn the PC off. You use less than a single watt. And it boots in 15 seconds.

Heck, the idle power usage excuse isn't even a valid one, given AMD and Intel are within 2 watts of each other in that regard.
 
Joined
Sep 17, 2014
Messages
22,417 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Heck, the idle power usage excuse isn't even a valid one, given AMD and Intel are within 2 watts of each other in that regard.
The GPU idle power on RDNA3 is what he refers to, I think. I hope so, at least; I took it that way. Ryzen has 'higher idle' as well, so technically... I mean, that is kind of the level I feel I'm communicating at with him right now. It's too bad, really.
 
Joined
Jun 14, 2020
Messages
3,423 (2.11/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
I can assure you they don't:
Yeah, right



Intel does not idle at 20+ watts, lol. Mine is streaming 2 videos with around 15 browser tabs open while on a Discord call, at 6 watts (12900K). My 3700X needs around 40 for the same workload. Only the mobile Zen parts get close to Intel in that regard.

you posit an example of how a 4070 Ti can beat a 7900 XTX... You don't see me saying a 7900 XTX can beat a 4090 in certain games either, do you? We both know it's an irrelevant outlier. You're entitled to your opinion; let's leave it there.
You misunderstood the point. I used the 4070 Ti replying to a guy saying the XTX beats the 4090 in some games. It's completely irrelevant, exactly as irrelevant as the 70 Ti beating the XTX in some games. Doesn't matter; it's an outlier.
 
Joined
Sep 17, 2014
Messages
22,417 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Yeah, right



Intel does not idle at 20+ watts, lol. Mine is streaming 2 videos with around 15 browser tabs open while on a Discord call, at 6 watts (12900K). My 3700X needs around 40 for the same workload. Only the mobile Zen parts get close to Intel in that regard.


You misunderstood the point. I used the 4070 Ti replying to a guy saying the XTX beats the 4090 in some games. It's completely irrelevant, exactly as irrelevant as the 70 Ti beating the XTX in some games. Doesn't matter; it's an outlier.
This one is much funnier, and recent:

(chart: recent CPU power/efficiency comparison)


Yeah... so... euhm... let's try and pull a general statement from this, then.

Or this - I especially love that monolithic 13900K there. This one is great for context. See, watts per frame/point is nothing unless you factor in the actual total usage and performance. What it underlines is this: the results are all over the place, chiplet excels in about as many use cases as monolithic, and this is a wildly moving target now as stuff gets optimized for both chiplet and big.LITTLE.

(chart: efficiency results across CPU workloads)
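To spell out what 'factor in the actual total usage and performance' means: W/frame or W/point says little without time-to-finish, since energy for a fixed job is power times time. A tiny sketch with made-up numbers:

```python
# Energy consumed to finish a fixed workload, in watt-hours.
def task_energy_wh(watts: float, seconds: float) -> float:
    return watts * seconds / 3600.0

# A chip drawing 2x the power but finishing in well under half the time
# uses LESS total energy for the same job:
print(task_energy_wh(300, 600))   # 300 W for 10 min   -> 50.0 Wh
print(task_energy_wh(150, 1400))  # 150 W for ~23 min  -> ~58.3 Wh
```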
 
Joined
Jun 14, 2020
Messages
3,423 (2.11/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
You do know what I mean by 'they work'; I literally just explained it. Idle usage is a thing now? I guess it's a thing for you; it seems awkwardly convenient too, given the general stance. I can't help but smell heavy bias, much the same as when you posit an example of how a 4070 Ti can beat a 7900 XTX... You don't see me saying a 7900 XTX can beat a 4090 in certain games either, do you? We both know it's an irrelevant outlier. You're entitled to your opinion; let's leave it there.

Chiplets seem to work well for gamers using an X3D CPU... - thát is the example I was pointing at earlier when I said smaller die, better arch, which you managed to turn into a 7700X comparison ;)
But yeah... that idle usage. Damn. Can't be having that now. Guess you're better off power limiting a 13th gen Core instead, yeah... I mean, we don't really need high min. FPS in games, right, with that idle usage. It's much better to use 3x the power under gaming load!

Here's a pro tip against idle usage: turn the PC off. You use less than a single watt. And it boots in 15 seconds.
The 13900K also stinks in gaming power draw; that's why I sold mine and went back to the 12900K. But Zen CPUs for daily use are just horrible in power draw, and that's just a fact. Most of my daily work involves multiple tasks (but not multithreaded ones), with multiple browsers and stuff, and goddamn, any Zen part consumes 3-5 times more power than the Intel equivalent.

This one is much funnier, and recent:
There is nothing funny about it; of course, if you run the 13900K without limits, it's a power hog. But who does that? Who the heck runs 10-hour Blender jobs at 300+ watts? I bet no one. Just reviewers, I guess, to generate clicks from the AMD armada.

Also, in your graph the X3D still sucks in single-thread efficiency, lol. In order to get the 95 pts shown in your graph, it drops its clock speeds to the point it's as fast as the 13400F. And the 13400F at that point is much more efficient, lol (50+% more, to be exact).
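For what that '50+% more efficient' claim looks like as arithmetic: only the 95 pts comes from the graph; the package-power figures below are assumptions for illustration:

```python
# Single-thread efficiency as benchmark points per watt of package power.
def pts_per_watt(points: float, watts: float) -> float:
    return points / watts

x3d    = pts_per_watt(95, 45)  # X3D part at an ASSUMED ~45 W single-thread
i13400 = pts_per_watt(95, 30)  # 13400F, same score at an ASSUMED ~30 W
print(f"13400F lead: {i13400 / x3d - 1:.0%}")  # 50% better pts/W on these inputs
```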
 
Joined
Sep 17, 2014
Messages
22,417 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
The 13900K also stinks in gaming power draw; that's why I sold mine and went back to the 12900K. But Zen CPUs for daily use are just horrible in power draw, and that's just a fact. Most of my daily work involves multiple tasks (but not multithreaded ones), with multiple browsers and stuff, and goddamn, any Zen part consumes 3-5 times more power than the Intel equivalent.
The 12900K is still a power hog, but it's good you're living that alternate reality hardcore.

There is nothing funny about it; of course, if you run the 13900K without limits, it's a power hog. But who does that? Who the heck runs 10-hour Blender jobs at 300+ watts? I bet no one. Just reviewers, I guess, to generate clicks from the AMD armada.

Also, in your graph the X3D still sucks in single-thread efficiency, lol. In order to get the 95 pts shown in your graph, it drops its clock speeds to the point it's as fast as the 13400F. And the 13400F at that point is much more efficient, lol (50+% more, to be exact).
Yes, it is funny, because I can look beyond the single result and see context and variance; as I pointed out a post earlier, shit's moving all over the place. You're just frantically defending a point that was defeated 5 years ago. You're that unique guy with a set of use cases that perfectly fits Intel's and Nvidia's architectural choices and product inner workings. It has been an amazing read!
 
Joined
Mar 10, 2010
Messages
11,878 (2.21/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
I don't like and will never like chiplets. I don't know what you mean by 'they work'. Meaning, you can use them? Sure. Are they better than monolithic? Hell no. You can see that even on CPUs: AMD has the better core design right now in terms of performance per core transistor and per watt, but they need a stupendous amount of power for simple tasks (or for just... sitting there idle) compared to Intel chips, because of how much the rest of the die / dies consume. I recently bought a 6900HS laptop and was astonished to find out it's extremely efficient at idle and light loads, contrary to desktop Zen parts, which just stink. Chiplets only work best for the company making them, not for the consumer buying / using them.
You're going to love PC in ten years, then.
Yeah, right



Intel does not idle at 20+ watts, lol. Mine is streaming 2 videos with around 15 browser tabs open while on a Discord call, at 6 watts (12900K). My 3700X needs around 40 for the same workload. Only the mobile Zen parts get close to Intel in that regard.


You misunderstood the point. I used the 4070 Ti replying to a guy saying the XTX beats the 4090 in some games. It's completely irrelevant, exactly as irrelevant as the 70 Ti beating the XTX in some games. Doesn't matter; it's an outlier.
Your entire input in this thread's been irrelevant - you and the other guy and your precious 4090s. Just like when Linus pushed said 4090 pointlessly to near 1000 watts: it's just a look at what's left untapped, not a look into the future or a brand fight for the ages, so there is probably no need to go on and on about your precious in every thread.

Wtaf CPU idle watts have to do with a GPU OC is beyond me.

Maybe make a thread, call it 'Come get some 4090 love' or something.
 
Joined
Jun 14, 2020
Messages
3,423 (2.11/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
The 12900K is still a power hog, but it's good you're living that alternate reality hardcore.


Yes, it is funny, because I can look beyond the single result and see context and variance; as I pointed out a post earlier, shit's moving all over the place. You're just frantically defending a point that was defeated 5 years ago. You're that unique guy with a set of use cases that perfectly fits Intel's and Nvidia's architectural choices and product inner workings. It has been an amazing read!
No, I'm not; that's why I have an AMD laptop. Full AMD, both CPU and GPU.

And no, the 12900K isn't a power hog. It's actually very, very efficient for my workload. Gaming, it's below 60 W unless I drop to 1080p, in which case it goes up to 100 on the heaviest game in existence right now (TLOU), and in every other workload I do it sits below 20, usually below 10. I'm pretty impressed by its efficiency, actually.
 
Joined
Jul 13, 2016
Messages
3,270 (1.07/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
The GPU idle power on RDNA3 is what he refers to, I think. I hope so, at least; I took it that way. Ryzen has 'higher idle' as well, so technically... I mean, that is kind of the level I feel I'm communicating at with him right now. It's too bad, really.

Isn't the high idle power consumption only with two different high-refresh-rate monitors? I believe AMD has it marked as a bug, but I haven't kept up with whether they've fixed it or not. In any case, whether RDNA3 has high idle power or not, I was pointing out with the Zen CPUs that you can have a chiplet-based architecture that is efficient. You are probably right with your assumption of what his angle is, but I don't see his end goal.
 
Joined
Feb 21, 2006
Messages
2,220 (0.32/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5013 AM4 AGESA V2 PI 1.2.0.Cc.
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) XFX Radeon RX 7900 XTX Magnetic Air (24.10.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 20TB
Display(s) LG 34GP83A-B 34 Inch 21: 9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c 5800X3D https://valid.x86.fr/b7d
Isn't the high idle power consumption only with two different high-refresh-rate monitors? I believe AMD has it marked as a bug, but I haven't kept up with whether they've fixed it or not. In any case, whether RDNA3 has high idle power or not, I was pointing out with the Zen CPUs that you can have a chiplet-based architecture that is efficient. You are probably right with your assumption of what his angle is, but I don't see his end goal.
It also affected single monitors, but it was worse on dual-monitor setups; I believe it will be addressed by a driver update.

My old 6800 XT used to idle around 7-8 watts on a single 34-inch ultrawide display.

On my 7900 XTX, idle is 50 watts on the same display.

W1zzard's cards show less idle wattage, so it varies with your card and monitor combo.
 
Joined
Dec 25, 2020
Messages
6,669 (4.68/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Lol what Nvidia Koolaid are you drinking ....

The koolaid that both AMD and NVIDIA diehards drink reeks of sadness and desperation (a little more apparent on the AMD side, as a lot of their fans use red herrings, false equivalences and outlier statistics to claim their ground, but also pretty obvious on the Nvidia side when they're reminded their 4090's check is in the mail), as if this whole us-vs-them thing even mattered.

They are both megacorporations interested in earning your money. Neither is your friend, and neither is interested in cutting you a particularly good deal; they just want to dig at each other and make as much money as possible in the process. Buy what suits your needs or corresponds to your ideological bias; at the end of the day, both sides of this pitiless argument are going to be playing the exact same video games, arguably at the same level of fidelity overall. This clubism is no different from console wars, except thrice as pointless. Then again, 3 times 0 is still 0.

I think the point I'm pushing on is more the perspective of its long-term prospects - its potential, if you will. That's an old AMD story... so much potential, but... This time, though, the potential is already proven with Zen. Chiplets work. I think that is why I view RDNA3 as superior. The cost and yield advantages are undeniably there and will bring a much bigger benefit to gaming perf than whatever Ada is.

Some specifics that are quite remarkable:
- perf/W is stellar on RDNA3, despite the presence of the interconnect, i.e. more hardware to facilitate its design
- support for an excellent VRAM amount within its TDP bracket
- price/frame substantially better than Ada

This is a much better list than what Ryzen 1 brought to the table. Chiplets are The Way.

We are in resounding agreement about that - chiplets are the future, hold the most potential, and MCMs are going to be necessary to keep scaling: to improve yield, manage costs and push performance further. The 7900 XTX's "underwhelming" performance is - IMO - excusable given that it is the very first design of its kind, much like the R9 Fury X and its HBM pitch.

A lot of the anger coming from the usage of words such as "underwhelming" comes from their generally negative connotation - people are very proud - but as you perfectly stated, it obviously performs exceptionally well even having missed its intended performance targets. That's why AMD did not ask more money for it, and instead pitched it against the 4080 after all. I can still mostly run every game out there at 4K with maximum settings at great frame rates on my RTX 3090, so I just don't see why anyone with a 7900 XT or XTX can't do it either. Some of the games that use more RT effects require DLSS, but I toggle that to quality and go on my merry way. Same as you would do with FSR on an AMD card; when the option is spending between $1600 and $2000 on a 4090, I'm not complaining.
 
Joined
Feb 21, 2006
Messages
2,220 (0.32/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5013 AM4 AGESA V2 PI 1.2.0.Cc.
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) XFX Radeon RX 7900 XTX Magnetic Air (24.10.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 20TB
Display(s) LG 34GP83A-B 34 Inch 21: 9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c 5800X3D https://valid.x86.fr/b7d
when the option is spending between $1600 and $2000 on a 4090, I'm not complaining.
Good post, and that cost for me would be like $2000-$2500 CAD.
 
Joined
Dec 25, 2020
Messages
6,669 (4.68/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Good post, and that cost for me would be like $2000-$2500 CAD.

Yup. The cheapest 4090 on record here in Brazil was a Leadtek WinFast on sale for exactly 10 grand ($2000 USD) this week. I don't earn in dollars, however. :oops:
 