
Bottleneck? AMD Radeon RX 6700 and AMD Ryzen 5 3400G

Joined
Mar 29, 2023
Messages
9 (0.02/day)
Location
Argentina
Processor AMD RYZEN 5 3400G STEALTH
Motherboard ASUS TUF Gaming B450 PLUS II AM4
Memory 8GBs x2
Video Card(s) GPU AMD Radeon RX 6700
Storage SSD 480GB
Display(s) SAMSUNG S22F350H HDMI FHD TN 5MS 60HZ
Case NOX HUMMER SPARK
Power Supply Cooler Master Gold 700w G700W 80+ Gold
Keyboard Redragon K589 SHRAPNEL RGB
Software Windows 10 64bits
After getting my PC built, the technician told me I'd suffer a bottleneck between my GPU and CPU. As the title says, I have an AMD Radeon RX 6700 10GB and an AMD Ryzen 5 3400G with Radeon Vega Graphics at 3.70 GHz.
So far I've only tested the performance hit in Skyrim with ENBs and high graphics settings; I get 30 FPS in certain areas, so I'm guessing that's because of the bottleneck?
All in all, I was just wondering if I should try more demanding and modern games or not. I don't wanna set my PC on fire...
I appreciate any advice!
 

Durvelle27

Moderator
Staff member
Joined
Jul 10, 2012
Messages
6,769 (1.53/day)
Location
Memphis, TN
System Name Black Prometheus
Processor |AMD Ryzen 7 1700
Motherboard ASRock B550M Pro4|MSI X370 Gaming PLUS
Cooling Thermalright PA120 SE | AMD Stock Cooler
Memory G.Skill 64GB(2x32GB) 3200MHz | 32GB(4x8GB) DDR4
Video Card(s) ASUS DirectCU II R9 290 4GB
Storage Sandisk X300 512GB + WD Black 6TB+WD Black 6TB
Display(s) LG Nanocell85 49" 4K 120Hz + ACER AOPEN 34" 3440x1440 144Hz
Case DeepCool Matrexx 55 V3 w/ 6x120mm Intake + 3x120mm Exhaust
Audio Device(s) LG Dolby Atmos 5.1
Power Supply Corsair RMX850 Fully Modular| EVGA 750W G2
Mouse Logitech Trackman
Keyboard Logitech K350
Software Windows 10 EDU x64
It definitely won't cause a fire, but that CPU is absolutely dog water and a severe bottleneck.
Luckily, you can easily upgrade the CPU with no issues and have many options to choose from. I'd recommend a 5600 or 5600X, which would pair nicely with that GPU.
 
Joined
Oct 29, 2019
Messages
455 (0.26/day)
Yeah, that's a pretty good bottleneck. The 3400G, if I remember correctly, is slightly slower than a 9th-gen i3.

Probably the only thing saving it from being a gigantic issue is that your monitor is capped at 60 Hz, so it's not like you're doing high-FPS gaming. Bottlenecks like this can still exist at 60 Hz but are way more noticeable or annoying at higher FPS (e.g. you can't get more than 70 FPS in your favorite games even though your monitor can run 144 Hz).

For your scenario (6700 with 3400G at 60 Hz), what would probably be most noticeable is dips more so than average FPS.
 
Joined
Nov 15, 2021
Messages
2,853 (2.83/day)
Location
Knoxville, TN, USA
System Name Work Computer | Unfinished Computer
Processor Core i7-6700 | Ryzen 5 5600X
Motherboard Dell Q170 | Gigabyte Aorus Elite Wi-Fi
Cooling A fan? | Truly Custom Loop
Memory 4x4GB Crucial 2133 C17 | 4x8GB Corsair Vengeance RGB 3600 C26
Video Card(s) Dell Radeon R7 450 | RTX 2080 Ti FE
Storage Crucial BX500 2TB | TBD
Display(s) 3x LG QHD 32" GSM5B96 | TBD
Case Dell | Heavily Modified Phanteks P400
Power Supply Dell TFX Non-standard | EVGA BQ 650W
Mouse Monster No-Name $7 Gaming Mouse| TBD
Yep. The 3400G is the Zen+ uArch, pretty low on the scale in 2023. It is also a quad-core, which isn't much for modern AAA games. Your 6700 could stretch its legs a lot more if you had a Zen 3 CPU in there.

An R5 5500 at minimum; a 5600(X) or, even better, a 5700X would be good. A 5800X3D if you have the money - those things are awesome, and you could keep the system for many years to come.
 

Durvelle27

Moderator
Staff member
Joined
Jul 10, 2012
Messages
6,769 (1.53/day)
Location
Memphis, TN
System Name Black Prometheus
Processor |AMD Ryzen 7 1700
Motherboard ASRock B550M Pro4|MSI X370 Gaming PLUS
Cooling Thermalright PA120 SE | AMD Stock Cooler
Memory G.Skill 64GB(2x32GB) 3200MHz | 32GB(4x8GB) DDR4
Video Card(s) ASUS DirectCU II R9 290 4GB
Storage Sandisk X300 512GB + WD Black 6TB+WD Black 6TB
Display(s) LG Nanocell85 49" 4K 120Hz + ACER AOPEN 34" 3440x1440 144Hz
Case DeepCool Matrexx 55 V3 w/ 6x120mm Intake + 3x120mm Exhaust
Audio Device(s) LG Dolby Atmos 5.1
Power Supply Corsair RMX850 Fully Modular| EVGA 750W G2
Mouse Logitech Trackman
Keyboard Logitech K350
Software Windows 10 EDU x64
Yeah, that's a pretty good bottleneck. The 3400G, if I remember correctly, is slightly slower than a 9th-gen i3.

Probably the only thing saving it from being a gigantic issue is that your monitor is capped at 60 Hz, so it's not like you're doing high-FPS gaming. Bottlenecks like this can still exist at 60 Hz but are way more noticeable or annoying at higher FPS (e.g. you can't get more than 70 FPS in your favorite games even though your monitor can run 144 Hz).

For your scenario (6700 with 3400G at 60 Hz), what would probably be most noticeable is dips more so than average FPS.
It's a common misconception that bottlenecks are only a problem at 60 FPS or above. A lot of people seem to forget the lows, which actually make the difference between playable and unplayable. You can have 100 FPS, but if you get dips to 30 or less, your experience will be horrid and the stutter irritating. And with a CPU like the 3400G, it's definitely going to have crap lows, which IPC plays a huge part in.
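To put numbers on that, here's a rough sketch (my own illustration with made-up frame times, not captured data) of how average FPS and "1% lows" are typically computed, showing an average near 100 FPS coexisting with ~30 FPS lows:

// Sketch: average FPS vs. "1% lows" from frame times in milliseconds.
// The frame times below are invented purely for illustration.
#include <algorithm>
#include <cstdio>
#include <functional>
#include <vector>

int main() {
    // Made-up capture: 1000 frames, mostly 10 ms (100 FPS),
    // with ten 33 ms stutter frames (~30 FPS dips) mixed in.
    std::vector<double> frameMs(1000, 10.0);
    for (int i = 0; i < 10; ++i) frameMs[i * 100] = 33.0;

    double totalMs = 0.0;
    for (double t : frameMs) totalMs += t;
    double avgFps = 1000.0 * frameMs.size() / totalMs;

    // "1% lows": average of the slowest 1% of frames, expressed as FPS.
    std::sort(frameMs.begin(), frameMs.end(), std::greater<double>());
    size_t n = frameMs.size() / 100;
    double slowMs = 0.0;
    for (size_t i = 0; i < n; ++i) slowMs += frameMs[i];
    double lowFps = 1000.0 * n / slowMs;

    // Prints roughly: avg 97.8 FPS, 1% low 30.3 FPS
    std::printf("avg %.1f FPS, 1%% low %.1f FPS\n", avgFps, lowFps);
}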
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
41,051 (6.56/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
After getting my PC built, the technician told me I'd suffer a bottleneck between my GPU and CPU. As the title says, I have an AMD Radeon RX 6700 10GB and an AMD Ryzen 5 3400G with Radeon Vega Graphics at 3.70 GHz.
So far I've only tested the performance hit in Skyrim with ENBs and high graphics settings; I get 30 FPS in certain areas, so I'm guessing that's because of the bottleneck?
All in all, I was just wondering if I should try more demanding and modern games or not. I don't wanna set my PC on fire...
I appreciate any advice!
Get HWiNFO64 so we can see all the hardware you have; the power supply you'll have to take a picture of.
 
Joined
Oct 29, 2019
Messages
455 (0.26/day)
It's a common misconception that bottlenecks are only a problem at 60 FPS or above. A lot of people seem to forget the lows, which actually make the difference between playable and unplayable. You can have 100 FPS, but if you get dips to 30 or less, your experience will be horrid and the stutter irritating. And with a CPU like the 3400G, it's definitely going to have crap lows, which IPC plays a huge part in.
Depends on the type of games.

I just recently retired a 3770K system with slow 1333 MHz RAM (overall probably 15 to 20 percent slower than his setup), and in all the modern AAA titles I played (FF7 Remake, Red Dead 2, Hitman 3, Forza Horizon 5, God of War) I never noticed any irritating stutter or drops. The in-game benchmarks never showed bad 1% lows either.

Don't get me wrong, I'm not defending the 3400G, but on low-Hz monitors it's going to be a little harder to notice drops, especially in single-player-heavy games (poorly optimized games and online-heavy ones, sure, but those aren't easy to run even on expensive hardware).
 
Joined
Jul 20, 2020
Messages
990 (0.66/day)
System Name Gamey #1 / #3
Processor Ryzen 7 5800X3D / Ryzen 7 5700X3D
Motherboard Asrock B450M P4 / MSi B450 sumpin'
Cooling IDCool SE-226-XT / IDCool SE-224-XTS
Memory 32GB 3200 CL16 / 16GB 3200 CL16
Video Card(s) PColor 6800 XT / PNY RTX 4060 Ti
Storage 4TB Team MP34 / 2TB WD SN580
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / TT Versa H18
Audio Device(s) Dragonfly Black
Power Supply EVGA 650 G3 / EVGA BQ 500
Mouse JSCO JNL-101k Noiseless
Keyboard Steelseries Apex 3 TKL
Software Win 10, Throttlestop
It totally depends on the game type. I used an i7-4790 (close enough to the 3400G) with a 6600 XT for a bit, and while it was CPU-bound in many games, that doesn't mean they dropped below 60 FPS much. However, in games with real CPU demand (Horizon Zero Dawn fights with lots of bots and NPCs), it would dip into the 40s. That's not stuttering, that's a reduction in FPS. It was still smooth, though those are marginal frame rates for my playing taste.

The 6700 will be bottlenecked by the 3400G in the more recent CPU-demanding games, but not in all. Try your games out and use the AMD Adrenalin overlay to view your FPS, %GPU use and %CPU use. If your %GPU use goes below 90-95% when you see those 30 FPS numbers, then you have a bottleneck. And if those 30 FPS stretches bug you, then consider an R5 5500 or better CPU upgrade. If they don't bug you much, then just enjoy!
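If you end up logging those overlay readings yourself, the rule of thumb reduces to something like this sketch (the struct and threshold are hypothetical, just to state the heuristic precisely):

// Hypothetical logged sample of overlay readings (names made up).
struct Sample { double fps; double gpuUtilPct; };

// Rule of thumb from above: the GPU sitting partly idle while FPS is
// below target points to a CPU (or main-thread) limit, not a GPU one.
bool looksCpuBound(const Sample& s, double targetFps = 60.0) {
    return s.fps < targetFps && s.gpuUtilPct < 90.0;
}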
 
Joined
Jul 26, 2013
Messages
407 (0.10/day)
Location
Midlands, UK
System Name Electra III
Processor AMD Ryzen 5 3600 @ 4.40 GHz (1.3 V)
Motherboard ASUS PRIME X570-PRO with BIOS 5003
Cooling Cooler Master Hyper 212 EVO V1 + 4× ARCTIC P12 PWM
Memory 32 GiB Kingston FURY Renegade RGB (DDR4-3600 16-20-20-39)
Video Card(s) PowerColor Fighter RX 6700 XT with Adrenalin 24.7.1
Storage 1 TiB Samsung 970 EVO Plus + 4 TB WD Red Pro
Display(s) Dell G3223Q + Samsung U28R550Q + HP 22w
Case Fractal Design Focus G (Black)
Audio Device(s) Realtek HD Audio S1220A
Power Supply EVGA SuperNOVA G3 750 W
Mouse Logitech G502 X Lightspeed
Keyboard MSI VIGOR GK71 SONIC Blue
Software Windows 10 22H2 Pro x64
Benchmark Scores CPU-Z = 542/4,479 — R15 = 212/1,741 — R20 = 510/3,980 — PM 10 = 2,784/19,911 — GB 5 = 1,316/7,564
After getting my PC built, the technician told me I'd suffer a bottleneck between my GPU and CPU. As the title says, I have an AMD Radeon RX 6700 10GB and an AMD Ryzen 5 3400G with Radeon Vega Graphics at 3.70 GHz.
So far I've only tested the performance hit in Skyrim with ENBs and high graphics settings; I get 30 FPS in certain areas, so I'm guessing that's because of the bottleneck?
All in all, I was just wondering if I should try more demanding and modern games or not. I don't wanna set my PC on fire...
I appreciate any advice!

You won't have any fires.

What games do you play, and what board do you have?
 
Joined
Dec 26, 2006
Messages
3,675 (0.57/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
After getting my PC built, the technician told me I'd suffer a bottleneck between my GPU and CPU. As the title says, I have an AMD Radeon RX 6700 10GB and an AMD Ryzen 5 3400G with Radeon Vega Graphics at 3.70 GHz.
So far I've only tested the performance hit in Skyrim with ENBs and high graphics settings; I get 30 FPS in certain areas, so I'm guessing that's because of the bottleneck?
All in all, I was just wondering if I should try more demanding and modern games or not. I don't wanna set my PC on fire...
I appreciate any advice!
Depends on resolution and frame rate. If your monitor is 1080p and 60 Hz and you optimize your video settings, there should be no issue. Quite a few graphics settings hit diminishing returns, IMHO; they really nerf FPS for trivial increases in eye candy. Just do a search and there should be lots of recommendations for 'optimized' FPS vs. eye candy settings. The CPU usually becomes less of an issue at higher resolutions like 4K.
 
Joined
Mar 29, 2023
Messages
9 (0.02/day)
Location
Argentina
Processor AMD RYZEN 5 3400G STEALTH
Motherboard ASUS TUF Gaming B450 PLUS II AM4
Memory 8GBs x2
Video Card(s) GPU AMD Radeon RX 6700
Storage SSD 480GB
Display(s) SAMSUNG S22F350H HDMI FHD TN 5MS 60HZ
Case NOX HUMMER SPARK
Power Supply Cooler Master Gold 700w G700W 80+ Gold
Keyboard Redragon K589 SHRAPNEL RGB
Software Windows 10 64bits
You won't have any fires.

What games do you play, and what board do you have?
If board means motherboard, then I have an ASUS TUF Gaming B450 PLUS II AM4. So far I've only tried Skyrim with ENBs and high graphics settings. Also The Sims 4, but that runs smoothly.

Depends on resolution and frame rate. If your monitor is 1080p and 60 Hz and you optimize your video settings, there should be no issue. Quite a few graphics settings hit diminishing returns, IMHO; they really nerf FPS for trivial increases in eye candy. Just do a search and there should be lots of recommendations for 'optimized' FPS vs. eye candy settings. The CPU usually becomes less of an issue at higher resolutions like 4K.
I see. I do have a 60 Hz monitor, so should I change the graphics settings in the game a bit? Or do you mean another type of settings?
 
Joined
Dec 25, 2020
Messages
5,774 (4.33/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Galax Stealth STL-03
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
If board means motherboard, then I have an ASUS TUF Gaming B450 PLUS II AM4. So far I've only tried Skyrim with ENBs and high graphics settings. Also The Sims 4, but that runs smoothly.


I see. I do have a 60 Hz monitor, so should I change the graphics settings in the game a bit? Or do you mean another type of settings?

When you refer to Skyrim, are you running the old version of Skyrim (released in 2011), or the newer Special Edition? In any case, you will definitely benefit from a CPU upgrade across pretty much any and all games.

If it's the latter, the primary issue lies in the Creation Engine itself. It's basically a sworn archenemy of Radeon graphics, as it makes extensive use of driver command lists with deferred contexts, which are AMD's Achilles' heel in DirectX 11 games. You often hear of the infamous frame rate issues that Fallout 4 exhibits in the Boston outskirts and in and around the Corvega plant; this occurs due to an overload in the graphics driver caused by an excessive number of draw calls. The same phenomenon occurs in modded installations of Skyrim SE and in Fallout 76. A well-written engine wouldn't overload the graphics driver, but this is not a well-written engine.

Nvidia cards can batch and defer draw calls across multiple processors through the use of a command list, so the graphics driver is able to scale this workload across multiple CPU cores. This makes CPU usage balloon and heavily strains the processor, but as long as you have CPU resources left, the frame rate will scale accordingly (or can even be hurt if the game ends up fighting the driver for resources; this is often seen on low-end 4-thread CPUs). On AMD, the graphics driver feeds the GPU's own command processor in an immediate context, which means the driver serially submits commands to the GPU from a single CPU thread, causing a massive bottleneck (or a potential relief in the same thread-bound scenario I described earlier).
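To make the jargon concrete, here's a bare-bones sketch of the D3D11 multithreading pattern being described (illustrative only, not Creation Engine code): a worker thread records draws into a command list on a deferred context, and the immediate context replays it.

// Sketch of D3D11 deferred-context recording (not Creation Engine code).
// Compile against d3d11.lib; error handling mostly elided.
#include <d3d11.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void RecordAndSubmit(ID3D11Device* device, ID3D11DeviceContext* immediate)
{
    ComPtr<ID3D11DeviceContext> deferred;
    if (FAILED(device->CreateDeferredContext(0, &deferred)))
        return;

    // ...on a worker thread: bind state and issue draws on `deferred`
    // exactly as you would on the immediate context, e.g.:
    // deferred->DrawIndexed(indexCount, 0, 0);

    // Bake the recorded work into a command list...
    ComPtr<ID3D11CommandList> cmdList;
    if (SUCCEEDED(deferred->FinishCommandList(FALSE, &cmdList)))
        immediate->ExecuteCommandList(cmdList.Get(), TRUE);  // ...and replay it
}

Whether the driver absorbs this natively matters: when a driver reports no command list support, the D3D11 runtime emulates command lists in software on the application's threads, which is part of why the two vendors behave so differently here.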

Note that my information on this might not be fully up to date: for RDNA 2 and 3 GPUs, AMD has recently rewritten the DirectX 11 stack to run on their new PAL abstraction layer, and I do not know if they have finally introduced command list support. GPU-Z can tell you if it is supported: go to the Advanced tab and select DirectX 11. I would actually like to know if this is supported on RDNA 3, as I am a massive fan of Bethesda RPGs and this is an actual consideration for me. What I do know, however, is that RX Vega cards and GCN-based APUs do not support and do not benefit from the new PAL abstraction layer.
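Incidentally, the flag GPU-Z reads there can also be queried straight from the D3D11 runtime. A minimal sketch, assuming you already have an ID3D11Device (error handling elided):

// Query whether the installed driver natively supports DX11 command
// lists / concurrent resource creation (the same caps GPU-Z reports).
#include <d3d11.h>
#include <cstdio>

void PrintDriverThreadingCaps(ID3D11Device* device)
{
    D3D11_FEATURE_DATA_THREADING caps = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D11_FEATURE_THREADING, &caps, sizeof(caps))))
    {
        std::printf("Driver concurrent creates: %s\n",
                    caps.DriverConcurrentCreates ? "yes" : "no");
        std::printf("Driver command lists:      %s\n",
                    caps.DriverCommandLists ? "yes" : "no");
    }
}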
 
Joined
Sep 21, 2020
Messages
1,555 (1.09/day)
Processor 5800X3D -30 CO
Motherboard MSI B550 Tomahawk
Cooling DeepCool Assassin III
Memory 32GB G.SKILL Ripjaws V @ 3800 CL14
Video Card(s) ASRock MBA 7900XTX
Storage 1TB WD SN850X + 1TB ADATA SX8200 Pro
Display(s) Dell S2721QS 4K60
Case Cooler Master CM690 II Advanced USB 3.0
Audio Device(s) Audiotrak Prodigy Cube Black (JRC MUSES 8820D) + CAL (recabled)
Power Supply Seasonic Prime TX-750
Mouse Logitech Cordless Desktop Wave
Keyboard Logitech Cordless Desktop Wave
Software Windows 10 Pro
for RDNA 2 and 3 GPUs, AMD has recently rewritten the DirectX 11 stack to run on their new PAL abstraction layer, and I do not know if they have finally introduced command list support. GPU-Z can tell you if it is supported: go to the Advanced tab and select DirectX 11
According to GPU-Z, this feature isn't currently implemented on RDNA3:

[screenshot: GPU-Z Advanced tab showing DirectX 11 driver command lists not supported]


But we see a similar bottleneck in games using the Creation engine on current Nvidia cards as well. Even an RTX 4090 combined with a 13900K suffers huge FPS drops in some scenes, as evidenced in this thread.
 
Joined
Dec 25, 2020
Messages
5,774 (4.33/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Galax Stealth STL-03
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
According to GPU-Z, this feature isn't currently implemented on RDNA3:


I see, that is a shame. I've linked this same article to someone else earlier today while explaining the whole deal with scheduling, but you should read it too. It's good technical info for understanding how draw call handling works:


A very common misconception is that AMD's graphics driver is more optimized and thus places a lower burden on the CPU, but that isn't what occurs. It's that the approach Nvidia chose favors PCs with high-spec CPUs that have more room for improvement (i.e. widening their performance potential at the cost of higher CPU load) over ensuring a smoother experience at the low end. Many games that seem to perform inexplicably poorly on an Nvidia GPU with a low-end CPU are actually experiencing an unusual situation where the graphics driver is fighting for resources with the game itself, while Radeon's conservative threading keeps the frame rate more consistent in that scenario. Recently this has also affected some games with unusually high CPU usage due to bugs and/or DRM, with similar behavior where the Radeon cards punch way above their usual weight.

The last time I had a chance to play with this, the Radeon driver on the Radeon VII didn't use more than 6 CPU threads even in the most parallelized tests, while Nvidia's would load all 36 threads on the same processor.

Unfortunately it's not a setting you can choose; it's intertwined with the driver code itself. Nvidia explains the different driver instancing models in this article:


I'm not a graphics programmer, but I've spent a very long time poring over the documentation on this specific issue trying to understand it better, as I've always been very frustrated with Creation's performance on Radeon. I've come to the conclusion that the engine is just very unfriendly to the way AMD elected to write their driver, which isn't bad, just different.

Oh well, as our Godd and Savior would say, "see that mountain? you can climb it, it just works, sixteen times the detail. thank you for listening to my ted talk and remember to buy Skyrim".

BTW, checked out that thread; note that they're using an i7-9700K: it's essentially the one modernish processor you didn't want to use with a 4090 if you wanted to keep performance up in this scenario. Having only 8 threads will really have it by the balls here; you could knock it down to a 4070 and they'd be getting the exact same FPS with that CPU, I'm sure of it.

Creation also has a weird affinity for Ryzen's 3D V-Cache for some reason. The 7950X3D just demolishes even the 13900KS in this game.
 
Joined
Sep 21, 2020
Messages
1,555 (1.09/day)
Processor 5800X3D -30 CO
Motherboard MSI B550 Tomahawk
Cooling DeepCool Assassin III
Memory 32GB G.SKILL Ripjaws V @ 3800 CL14
Video Card(s) ASRock MBA 7900XTX
Storage 1TB WD SN850X + 1TB ADATA SX8200 Pro
Display(s) Dell S2721QS 4K60
Case Cooler Master CM690 II Advanced USB 3.0
Audio Device(s) Audiotrak Prodigy Cube Black (JRC MUSES 8820D) + CAL (recabled)
Power Supply Seasonic Prime TX-750
Mouse Logitech Cordless Desktop Wave
Keyboard Logitech Cordless Desktop Wave
Software Windows 10 Pro
A very common misconception is that AMD's graphics driver is more optimized and thus places a lower burden on the CPU, but that isn't what occurs. It's that the approach Nvidia chose favors PCs with high-spec CPUs that have more room for improvement (i.e. widening their performance potential at the cost of higher CPU load)
Thanks for the links, I'm currently reading them. In theory, Nvidia's approach to DX11 MT rendering should have been more future-proof, but it seems the benefits may not always translate to actual games.

I just ran the 3DMark API Overhead test, which employs the same method as the one in the Intel article you posted. The DX11 MT run uses all available CPU threads. In this particular test, RDNA 3 doesn't appear to take advantage of the multiple threads utilized in deferred context rendering:

[screenshot: 3DMark API Overhead test results]

Even though this is an unsupported test from 2015, it'd be interesting to see how contemporary Nvidia hardware does here when not limited by the CPU. Could you run this test on your system and post the results?

BTW, checked out that thread; note that they're using an i7-9700K
There is a user there with a 13900K and an RTX 4090 showing 42% GPU utilization in a problematic area in Fallout 4, and another one with a 12900K showing 40% usage with the same card.
 
Joined
Dec 26, 2006
Messages
3,675 (0.57/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
If board means motherboard, then I have an ASUS TUF Gaming B450 PLUS II AM4. So far I've only tried Skyrim with ENBs and high graphics settings. Also The Sims 4, but that runs smoothly.


I see. I do have a 60 Hz monitor, so should I change the graphics settings in the game a bit? Or do you mean another type of settings?
Well, ideally make sure your monitor is set to its native resolution and refresh rate, probably 1920x1080 at 60 Hz.

As for graphics settings, it's personal preference. I personally disable HDR/bloom, depth of field and some others that I either don't like or that really hinder performance. If it runs to your liking, just sit back and enjoy, or you will get lost down the upgrade rabbit hole.
 
Joined
Dec 25, 2020
Messages
5,774 (4.33/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Galax Stealth STL-03
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Thanks for the links, I'm currently reading them. In theory, Nvidia's approach to DX11 MT rendering should have been more future-proof, but it seems the benefits may not always translate to actual games.

I just ran the 3DMark API Overhead test, which employs the same method as the one in the Intel article you posted. The DX11 MT run uses all available CPU threads. In this particular test, RDNA 3 doesn't appear to take advantage of the multiple threads utilized in deferred context rendering:


Even though this is an unsupported test from 2015, it'd be interesting to see how contemporary Nvidia hardware does here when not limited by the CPU. Could you run this test on your system and post the results?


There is a user there with a 13900K and an RTX 4090 showing 42% GPU utilization in a problematic area in Fallout 4, and another one with a 12900K showing 40% usage with the same card.

I'll definitely run it again, but even this old score with a GTX 580(!) has your 7900 XTX beat in DX11 MT, with a slower CPU too. I have a hunch that's why the 3DMark guys discontinued this test; there's just no possible way any game will run better on a 580 than on your card.


I see about the guys using a 13900K; I was very sleepy... I'm installing F4 for another playthrough, but it takes a few days with the mod setup I run. It's some 80 GB of stuff I have to sift and sort through :oops:

@QuietBob I have the update for you, ran it on my PC

 
Joined
Sep 21, 2020
Messages
1,555 (1.09/day)
Processor 5800X3D -30 CO
Motherboard MSI B550 Tomahawk
Cooling DeepCool Assassin III
Memory 32GB G.SKILL Ripjaws V @ 3800 CL14
Video Card(s) ASRock MBA 7900XTX
Storage 1TB WD SN850X + 1TB ADATA SX8200 Pro
Display(s) Dell S2721QS 4K60
Case Cooler Master CM690 II Advanced USB 3.0
Audio Device(s) Audiotrak Prodigy Cube Black (JRC MUSES 8820D) + CAL (recabled)
Power Supply Seasonic Prime TX-750
Mouse Logitech Cordless Desktop Wave
Keyboard Logitech Cordless Desktop Wave
Software Windows 10 Pro
a GTX 580(!) has your 7900 XTX beat in DX11 MT, with a slower CPU too
According to the whitepaper, this synthetic test represents a hypothetical "best-case" scenario. Unlike typical 3DMark benchmarks, it cannot be used to compare GPU architectures, or even GPU performance in general. Its sole purpose is to highlight the difference in API overhead on a specific hardware configuration. As such, results from systems utilizing different graphics cards and processors are not comparable.

I have the update for you, ran it on my PC
Interesting results! So, in DX11 the RTX 3090 could theoretically achieve 62% higher performance in CPU-bound scenarios when paired with the fastest 32-thread desktop processor, and on condition that deferred context rendering is implemented properly in the game engine.

Creation also has a weird affinity for Ryzen's 3D V-Cache for some reason. The 7950X3D just demolishes even the 13900KS in this game.
I wish I could test the engine's dependence on the V-Cache myself, but I just realized I don't have Fallout 4. I was sure I grabbed it in some sale :oops:
 
Joined
Dec 25, 2020
Messages
5,774 (4.33/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Galax Stealth STL-03
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
That is true, but it also shows how lopsided the API overhead situation has been all along, and that doesn't quite translate into real-world performance, as you have seen. It can cause problems or huge benefits for either vendor, depending on whether a given workload benefits from it or not. I think the true reason the 3DMark devs weren't willing to call this a competitive benchmark is that it depends so much on the system configuration that even two practically identical PCs can show wildly different results from changes that would generally be called minimal, such as memory timings or interconnect bandwidth, while at the same time the drivers from each vendor do things their own way as well.

Regarding Fallout 4, it's not news that it loves cache; I found that out a very long time ago, and it's actually the reason I purchased that 18-core Xeon for my old X99 machine. Despite low clocks, its 45 MB of L3 was unheard of back in 2017, and Fallout 4 delighted in it vs. my previous 4770K build. Much better performance. That's also part of the reason these decommissioned Haswell and Broadwell Xeons became so popular, even though they have finally begun to show their age (they have a lot of cache but terrible IPC, and software eventually came along requiring a bit of both).
 