Joined: May 23, 2023 · Messages: 66 (0.13/day)
System Name | His & Hers |
---|---|
Processor | R7 5800X/ R7 7950X3D Stock |
Motherboard | X670E Aorus Pro X/ROG Crosshair VIII Hero |
Cooling | Corsair h150 elite/ Corsair h115i Platinum |
Memory | Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk |
Video Card(s) | Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090 |
Storage | lots of SSD. |
Display(s) | A whole bunch OLED, VA, IPS..... |
Case | 011 Dynamic XL/ Phanteks Evolv X |
Audio Device(s) | Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B |
Power Supply | Seasonic Ultra Prime Titanium 1000w/850w |
Mouse | Logitech G502 Lightspeed/ Logitech G Pro Hero. |
Keyboard | Logitech - G915 LIGHTSPEED / Logitech G Pro |
System Name | "Icy Resurrection" |
---|---|
Processor | 13th Gen Intel Core i9-13900KS Special Edition |
Motherboard | ASUS ROG MAXIMUS Z790 APEX ENCORE |
Cooling | Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM |
Memory | 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V |
Video Card(s) | ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition |
Storage | 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD |
Display(s) | 55-inch LG G3 OLED |
Case | Pichau Mancer CV500 White Edition |
Power Supply | EVGA 1300 G2 1.3kW 80+ Gold |
Mouse | Microsoft Classic Intellimouse |
Keyboard | Generic PS/2 |
Software | Windows 11 IoT Enterprise LTSC 24H2 |
Benchmark Scores | I pulled a Qiqi~ |
Thank you for this information, very helpful!
The situation is always evolving, but in short, yes: Radeon's design is friendlier towards machines with limited CPU power, with the tradeoff of having less potential when ample CPU power is available. Like I mentioned on the Valley thread, AMD's DirectX 11 driver operates in an immediate context. This means it issues commands through a limited number of CPU threads, primarily just one, and these commands are then processed by the GPU's hardware command processor itself. It is highly optimized, and thus quite fast, but there are situations where this approach doesn't work as well, notably with Bethesda's Creation Engine games.
Nvidia's driver, on the other hand, is capable of detecting when an immediate context is created, generating driver command lists, and deferring these commands across multiple CPU threads. This results in increased CPU load but significantly higher potential, as it can handle a much, much, *much* higher number of draw calls than an immediate context alone could. While a little older now, this is a good technical explanation by Intel; I had been talking about this with another forum member shortly before you joined:
Game Development: Access tools, tutorials, libraries, and code samples from Intel to optimize your games (software.intel.com)
Support for driver command lists is detected by GPU-Z and shown in the Advanced > DirectX 11 tab:
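The same capability GPU-Z reports can be queried through the D3D11 API itself. Below is a Windows-only fragment (it assumes you have already created an `ID3D11Device* device` elsewhere) using `CheckFeatureSupport` with `D3D11_FEATURE_THREADING`:

```cpp
// Windows-only fragment: ask the driver whether it natively supports
// command lists, i.e. the multithreaded submission path described above.
// Assumes `device` is a valid ID3D11Device* created elsewhere.
D3D11_FEATURE_DATA_THREADING threading = {};
HRESULT hr = device->CheckFeatureSupport(
    D3D11_FEATURE_THREADING, &threading, sizeof(threading));
if (SUCCEEDED(hr)) {
    // DriverConcurrentCreates: resources may be created from any thread.
    // DriverCommandLists: TRUE when the driver implements command lists
    // natively; FALSE means the D3D11 runtime emulates them in software.
    bool native = (threading.DriverCommandLists == TRUE);
}
```

This is the exact flag GPU-Z surfaces in the Advanced > DirectX 11 tab; when `DriverCommandLists` is FALSE, deferred contexts still work, but the runtime's software emulation gives little of the multithreading benefit.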
3DMark's old API overhead test is a useful tool to show the result of this behavior: you will find that NVIDIA GPUs post substantially higher scores in that benchmark, even though this doesn't reflect in real-world applications. That's why the UL guys dropped support for it; it's a very technical tool that doesn't really serve to compare different setups. The DirectX SDK samples also contain examples where this is quite apparent.
In DirectX 12 or Vulkan, this is irrelevant: the vendors should be significantly closer together, with NVIDIA's driver again using a little more CPU in exchange for slightly stronger performance where possible, but nowhere near the drastic difference between vendors seen in DirectX 11. This is because they are low-level APIs and this abstraction is handled in a completely different manner. Also, for the record: Intel does not support driver command lists in its Gen 12.2 architecture integrated graphics (such as UHD 770). Unsure if Arc does, but I would guess it does not either. Would appreciate if someone could clarify.
System Name | Pioneer |
---|---|
Processor | Ryzen R9 9950X |
Motherboard | GIGABYTE Aorus Elite X670 AX |
Cooling | Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans... |
Memory | 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30 |
Video Card(s) | XFX RX 7900 XTX Speedster Merc 310 |
Storage | Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs |
Display(s) | 55" LG 55" B9 OLED 4K Display |
Case | Thermaltake Core X31 |
Audio Device(s) | TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED |
Power Supply | FSP Hydro Ti Pro 850W |
Mouse | Logitech G305 Lightspeed Wireless |
Keyboard | WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps |
Software | Gentoo Linux x64 / Windows 11 Enterprise IoT 2024 |
Processor | 5800X3D -30 CO |
---|---|
Motherboard | MSI B550 Tomahawk |
Cooling | DeepCool Assassin III |
Memory | 32GB G.SKILL Ripjaws V @ 3800 CL14 |
Video Card(s) | ASRock MBA 7900XTX |
Storage | 1TB WD SN850X + 1TB ADATA SX8200 Pro |
Display(s) | Dell S2721QS 4K60 |
Case | Cooler Master CM690 II Advanced USB 3.0 |
Audio Device(s) | Audiotrak Prodigy Cube Black (JRC MUSES 8820D) + CAL (recabled) |
Power Supply | Seasonic Prime TX-750 |
Mouse | Logitech Cordless Desktop Wave |
Keyboard | Logitech Cordless Desktop Wave |
Software | Windows 10 Pro |
For the context, here's our original conversation:
"I had been talking about this with another forum member shortly before you joined"
System Name | The Phantom in the Black Tower |
---|---|
Processor | AMD Ryzen 7 5800X3D |
Motherboard | ASRock X570 Pro4 AM4 |
Cooling | AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm |
Memory | 64GB Team Vulcan DDR4-3600 CL18 (4×16GB) |
Video Card(s) | ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB |
Storage | WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space |
Display(s) | Haier 55E5500U 55" 2160p60Hz |
Case | Ultra U12-40670 Super Tower |
Audio Device(s) | Logitech Z200 |
Power Supply | EVGA 1000 G2 Supernova 1kW 80+Gold-Certified |
Mouse | Logitech MK320 |
Keyboard | Logitech MK320 |
VR HMD | None |
Software | Windows 10 Professional |
Benchmark Scores | Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439 |
System Name | Planet Espresso |
---|---|
Processor | 13700KF @ 5.5GHZ 1.285v - 235W cap |
Motherboard | MSI 690-I PRO |
Cooling | Thermalright Phantom Spirit EVO |
Memory | 48 GB DDR5 7600 MHZ CL36 |
Video Card(s) | RTX 4090 FE |
Storage | 2TB WD SN850, 4TB WD SN850X |
Display(s) | Alienware 32" 4k 240hz OLED |
Case | Jonsbo Z20 |
Audio Device(s) | Yes |
Power Supply | Corsair SF750 |
Mouse | Xlite V2 |
Keyboard | 65% HE Keyboard |
Software | Windows 11 |
Benchmark Scores | They're pretty good, nothing crazy. |
You seem to be quite right! I upgraded my GPU from a 4 GB Nvidia GTX 960 to a 12 GB AMD 6700 XT, on an i5-4690K at 4.3 GHz (4 cores without hyperthreading, so 4 threads total), 2400 MHz (or should I say 2400 MT/s, anyway...) CL11 2x8 GB RAM sticks and a 960 Evo NVMe drive, and the AMD driver seems to significantly reduce stuttering in Assetto Corsa Competizione (the extra GPU power and video memory probably help too, as the game usually consumes about 7 GB with mixed settings and 1440p internal resolution [133% resolution scale]; with the GTX 960 I was using similar if not identical settings at 1080p internal resolution, and video RAM usage is not much lower that way, still well above 4 GB). My target is 60 fps, as I'm aware of the CPU/system RAM limitations and my monitor (TV really) is 60 Hz. A few months back I upgraded from 2x8 GB 1600 CL10 after one of the sticks started failing; that upgrade was noticeable, and this one was too, probably more so.
Unfortunately, it seems the best solution would be a toggle in the driver settings to choose either the Nvidia approach for older, poorly threaded (and hopefully not very CPU-demanding) games, or the less CPU-hungry AMD approach, but I don't see that happening anytime soon. It may be placebo, but I feel like I lost a bit of performance in a very old DirectX 9 game, NFS: MW from 2005, probably because there the Nvidia driver is capable of squeezing more draw calls out of it.
The performance loss at the lower end of the CPU spectrum due to the Nvidia drivers can be very noticeable. This is an example of it happening in Battlefield V; note the frametime graph:
An option to disable threaded optimization in the Nvidia drivers seems to exist, but I've never tried it.
System Name | Firelance. |
---|---|
Processor | Threadripper 3960X |
Motherboard | ROG Strix TRX40-E Gaming |
Cooling | IceGem 360 + 6x Arctic Cooling P12 |
Memory | 8x 16GB Patriot Viper DDR4-3200 CL16 |
Video Card(s) | MSI GeForce RTX 4060 Ti Ventus 2X OC |
Storage | 2TB WD SN850X (boot), 4TB Crucial P3 (data) |
Display(s) | 3x AOC Q32E2N (32" 2560x1440 75Hz) |
Case | Enthoo Pro II Server Edition (Closed Panel) + 6 fans |
Power Supply | Fractal Design Ion+ 2 Platinum 760W |
Mouse | Logitech G602 |
Keyboard | Razer Pro Type Ultra |
Software | Windows 10 Professional x64 |
Why on God's green Earth would you pair an ancient, underpowered CPU like that with a far faster GPU? That 6700 XT is probably spending 50% or more of its time idle, waiting for your CPU to catch up.
Cool story bro!
My current pairing of old PC is a Phenom 1100T / Vega 56 Nano. Because I'm a hardware modder, I can extract more performance from old hardware than most users can. It's built to be extraordinarily snappy.
System Name | Good enough |
---|---|
Processor | AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge |
Motherboard | ASRock B650 Pro RS |
Cooling | 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30 |
Memory | 32GB - FURY Beast RGB 5600 Mhz |
Video Card(s) | Sapphire RX 7900 XT - Alphacool Eisblock Aurora |
Storage | 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB |
Display(s) | LG UltraGear 32GN650-B + 4K Samsung TV |
Case | Phanteks NV7 |
Power Supply | GPS-750C |
System Name | Lynni PS \ Lenowo TwinkPad L14 G2 |
---|---|
Processor | AMD Ryzen 7 7700 Raphael (Waiting on 9800X3D) \ i5-1135G7 Tiger Lake-U |
Motherboard | ASRock B650M PG Riptide Bios v. 3.08 AMD AGESA 1.2.0.2 \ Lenowo BDPLANAR Bios 1.68 |
Cooling | Noctua NH-D15 Chromax.Black (Only middle fan) \ Lenowo C-267C-2 |
Memory | G.Skill Flare X5 2x16GB DDR5 6000MHZ CL36-36-36-96 AMD EXPO \ Willk Elektronik 2x16GB 2666MHZ CL17 |
Video Card(s) | Asus GeForce RTX™ 4070 Dual OC (Waiting on RX 8800 XT) | Intel® Iris® Xe Graphics |
Storage | Gigabyte M30 1TB|Sabrent Rocket 2TB| HDD: 10TB|1TB \ WD RED SN700 1TB |
Display(s) | KTC M27T20S 1440p@165Hz | LG 48CX OLED 4K HDR | Innolux 14" 1080p |
Case | Asus Prime AP201 White Mesh | Lenowo L14 G2 chassis |
Audio Device(s) | Steelseries Arctis Pro Wireless |
Power Supply | Be Quiet! Pure Power 12 M 750W Goldie | 65W |
Mouse | Logitech G305 Lightspeedy Wireless | Lenowo TouchPad & Logitech G305 |
Keyboard | Ducky One 3 Daybreak Fullsize | L14 G2 UK Lumi |
Software | Win11 Pro 23H2 UK | Win11 LTSC UK / Arch (Fan) |
Benchmark Scores | 3DMARK: https://www.3dmark.com/3dm/89434432? GPU-Z: https://www.techpowerup.com/gpuz/details/v3zbr |
This is wild, because it used to be that AMD had way bigger driver overhead. I wonder how Nvidia screwed this up.
System Name | Still not a thread ripper but pretty good. |
---|---|
Processor | Ryzen 9 7950x, Thermal Grizzly AM5 Offset Mounting Kit, Thermal Grizzly Extreme Paste |
Motherboard | ASRock B650 LiveMixer (BIOS/UEFI version P3.08, AGESA 1.2.0.2) |
Cooling | EK-Quantum Velocity, EK-Quantum Reflection PC-O11, D5 PWM, EK-CoolStream PE 360, XSPC TX360 |
Memory | Micron DDR4-5600 ECC Unbuffered Memory (2 sticks, 64GB, MTC20C2085S1EC56BD1) + JONSBO NF-1 |
Video Card(s) | XFX Radeon RX 5700 & EK-Quantum Vector Radeon RX 5700 +XT & Backplate |
Storage | Samsung 4TB 980 PRO, 2 x Optane 905p 1.5TB (striped), AMD Radeon RAMDisk |
Display(s) | 2 x 4K LG 27UL600-W (and HUANUO Dual Monitor Mount) |
Case | Lian Li PC-O11 Dynamic Black (original model) |
Audio Device(s) | Corsair Commander Pro for Fans, RGB, & Temp Sensors (x4) |
Power Supply | Corsair RM750x |
Mouse | Logitech M575 |
Keyboard | Corsair Strafe RGB MK.2 |
Software | Windows 10 Professional (64bit) |
Benchmark Scores | RIP Ryzen 9 5950x, ASRock X570 Taichi (v1.06), 128GB Micron DDR4-3200 ECC UDIMM (18ASF4G72AZ-3G2F1) |
Perhaps it's by design, since you can throw a little more power at the Intel CPU and it won't matter?
System Name | 4K-gaming |
---|---|
Processor | AMD Ryzen 7 5800X @ PBO +200 -20CO |
Motherboard | Asus ROG Crosshair VII Hero |
Cooling | Arctic Freezer 50, EKWB Vector TUF |
Memory | 32GB Kingston HyperX Fury DDR4-3200 |
Video Card(s) | Asus GeForce RTX 3080 TUF OC 10GB |
Storage | A pack of SSDs totaling 3.2TB + 3TB HDDs |
Display(s) | 27" 4K120 IPS + 32" 4K60 IPS + 24" 1080p60 |
Case | Corsair 4000D Airflow White |
Audio Device(s) | Asus TUF H3 Wireless / Corsair HS35 |
Power Supply | EVGA Supernova G2 750W |
Mouse | Logitech MX518 + Asus TUF P1 mousepad |
Keyboard | Roccat Vulcan 121 AIMO |
VR HMD | Oculus Rift CV1 |
Software | Windows 11 Pro |
Benchmark Scores | It runs Crysis |
They didn't. AMD just really got its act together only recently, but they did so with such advancements that it left Nvidia behind.
With DX11 etc. they did; with the low-level APIs they already had it. They basically introduced that stuff with Mantle. DX11 etc. is much more driver-dependent, so it took them more time.
Nvidia is still as good as it's always been, overhead-wise. But AMD is now slightly better.