• Welcome to TechPowerUp Forums, Guest! Please check out our forum guidelines for info related to our community.

AMD Readies X870E Chipset to Launch Alongside First Ryzen 9000 "Granite Ridge" CPUs

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
4,484 (1.91/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15 GHz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans removed
Cooling Optimus AMD Raw Copper/Plexi, HWLABS Copper 240/40+240/30, D5, 4x Noctua A12x25, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MHz 26-36-36-48, 56.6 ns AIDA, 2050 FCLK, 160 ns TRFC
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front panel with pump/res combo
Audio Device(s) Audeze Maxwell Ultraviolet, Razer Nommo Pro
Power Supply SF750 Plat, transparent full custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU Redux Burgundy w/brass weight, Prismcaps White & Jellykey, lubed/modded
Software Windows 10 IoT Enterprise LTSC 19053.3803
Benchmark Scores Legendary
That also depends on your play style. For me, a slow-paced, atmospheric, single player gamer, RAM speed and latency don't matter at all. It's probably different for players of fast-paced online shooters.
OK, set your memory to 4800 MT JEDEC and report back afterwards if you still believe this.

You're on a 144 Hz monitor, so your averages probably wouldn't be affected too much, but I'd wager almost anything you'd notice your minimum FPS dipping below 60 FPS quite frequently.

Regardless, RAM speed/latency objectively matters to gaming performance, even if you have an X3D chip; this is a statement of fact. Depending on your monitor refresh rate, your GPU power and the type of gamer you are, you may notice this more or less, but the underlying technical data is irrefutable.
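One way to sanity-check how much a minimum-FPS dip "feels" is to convert FPS into frame times. A quick illustrative sketch (plain Python, example numbers only, not benchmark data):

```python
# Frame time is what you actually perceive; FPS averages hide how long a slow frame takes.
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on one frame at a given FPS."""
    return 1000.0 / fps

for fps in (144, 60, 45):
    print(f"{fps:>3} FPS -> {frame_time_ms(fps):5.2f} ms per frame")

# A dip from 144 FPS to 45 FPS stretches a frame from ~6.9 ms to ~22.2 ms,
# which reads as a visible stutter even when the average stays high.
```

That asymmetry is why minimums dipping below 60 FPS are noticeable even on a high-refresh monitor.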
 
Joined
Jun 2, 2017
Messages
8,122 (3.18/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
OK, set your memory to 4800 MT JEDEC and report back afterwards if you still believe this.

You're on a 144 Hz monitor, so your averages probably wouldn't be affected too much, but I'd wager almost anything you'd notice your minimum FPS dipping below 60 FPS quite frequently.

Regardless, RAM speed/latency objectively matters to gaming performance, even if you have an X3D chip, this is a statement of fact. Depending on your monitor refresh rate, your GPU power and the type of gamer you are, you may notice this more or less, but the underlying technical data is irrefutable.
I can't say I see that, and I run my RAM at 5200. Could be your 12 GB VRAM buffer affecting that. Maybe it's the extra CPU cores and VRAM I have that keep me from experiencing it.
 

dgianstefani

TPU Proofreader
Staff member
I can't say I see that, and I run my RAM at 5200. Could be your 12 GB VRAM buffer affecting that. Maybe it's the extra CPU cores and VRAM I have that keep me from experiencing it.
Completely missing the point as usual, and somehow conflating GPU bottlenecking with CPU bottlenecking, well done.
 
Joined
Apr 14, 2022
Messages
671 (0.86/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Asus XG35VQ
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Software Windows 11 64bit
RAM speed/latency objectively matters to gaming performance, even if you have an X3D chip; this is a statement of fact. Depending on your monitor refresh rate, your GPU power and the type of gamer you are, you may notice this more or less, but the underlying technical data is irrefutable.
+1

RAM speed and latency are one of the pillars of high-end gaming, no matter if you use Intel or AMD (X3D).
Yes, in slow-paced games you won't notice a thing, but the numbers tell the truth.

Also, the X3Ds can paper over memory latency issues, so it's more difficult to notice while gaming.
Basically, because the X3Ds improve the min FPS more than the avg FPS, the RAM issues hide behind that.
But still, even with an X3D CPU there is a difference in performance.
 
Joined
Jun 2, 2017
Messages
8,122 (3.18/day)
Completely missing the point as usual, and somehow conflating GPU bottlenecking with CPU bottlenecking, well done.
Of course you are right. Playing at 1440p 165 Hz with 12 GB is fine, but you are telling people that VRAM is not the culprit when both of those can affect gaming performance where 1% lows are concerned: running out of VRAM means SAM has to spill data to system RAM. More CPU cores can also mean more cache, so that even less data is sent to system RAM. That is when RAM latency matters. I guess I am dropping performance as well, since my RAM runs at 5200 MT/s all day, which is not too far from the 4800 you referenced.
 
Joined
Jan 14, 2019
Messages
10,132 (5.16/day)
Location
Midlands, UK
System Name Holiday Season Budget Computer (HSBC)
Processor AMD Ryzen 7 7700X
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 16 GB Corsair Vengeance EXPO DDR5-6000
Video Card(s) Sapphire Pulse Radeon RX 6500 XT 4 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
OK, set your memory to 4800 MT JEDEC and report back afterwards if you still believe this.

You're on a 144 Hz monitor, so your averages probably wouldn't be affected too much, but I'd wager almost anything you'd notice your minimum FPS dipping below 60 FPS quite frequently.

Regardless, RAM speed/latency objectively matters to gaming performance, even if you have an X3D chip; this is a statement of fact. Depending on your monitor refresh rate, your GPU power and the type of gamer you are, you may notice this more or less, but the underlying technical data is irrefutable.
I am running my RAM at 4800 MT/s JEDEC (because my CPU needs a lot less SoC voltage this way), and my gaming experience is just fine. :)

Here is the underlying technical data:
[attached benchmark chart]


My own take on it is that whether I get 218/166 avg/min FPS or 204/149 is an insignificant difference, as the amount of info reaching my eyes and brain is exactly the same.

Also, this test was done with a 4090 at 1080p. With my 7800 XT at 1440 UW, I am infinitely more GPU-limited; therefore, my RAM speed matters even less.
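To put the quoted chart numbers in relative terms (same figures as above, just expressed as percentages):

```python
# avg/min FPS figures quoted above for the two RAM configs (illustrative)
fast = {"avg": 218, "min": 166}   # DDR5-6000
slow = {"avg": 204, "min": 149}   # DDR5-4800 JEDEC

for key in ("avg", "min"):
    gain = (fast[key] - slow[key]) / slow[key] * 100
    print(f"{key} FPS: {slow[key]} -> {fast[key]} (+{gain:.1f}%)")
```

So the minimums move by roughly 11%, a bigger relative swing than the averages; whether that is perceptible is exactly the point being argued.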
 

dgianstefani

TPU Proofreader
Staff member
I am running my RAM at 4800 MT/s JEDEC (because my CPU needs a lot less SoC voltage this way), and my gaming experience is just fine. :)

Here is the underlying technical data:
View attachment 341238

My own take on it is that whether I get 218/166 avg/min FPS or 204/149 is an insignificant difference, as the amount of info reaching my eyes and brain is exactly the same.

Also, this test was done with a 4090 at 1080p. With my 7800 XT at 1440 UW, I am infinitely more GPU-limited, therefore, my RAM speed matters even less.
And here's some other testing.

In your own testing for a 7700X, the difference between 4800 and 6000 is 30 FPS.

Some games such as Tarkov, Factorio or Minecraft are intensely CPU bottlenecked almost at all times, and will have more significant differences regardless of GPU used.

Other games will be GPU bottlenecked at all times, mostly single player games.

Regardless of how significant you consider 10-30 FPS, or if you consider it to be the "same amount of info reaching your eyes and brain", it's a difference. Your earlier statement of "RAM speed and latency don't matter at all" is therefore subjective, not objective.


[attached benchmark chart]


If you have low expectations of your hardware, that's fine. If you don't notice the difference between 120 and 150 FPS, that's fine, but let's not pretend that translates into "RAM speed doesn't matter".

Because people read technical threads like these, and misinformation is then propagated.

Even once you go past 6000 MT, the "sweet spot" (more like the spot you can realistically reach with AMD, since 6400 MT+ is pretty much unattainable without going out of sync and losing performance), you still see CPU and therefore FPS improvements from faster RAM, e.g. ~10 FPS just from 400 MT in the chart I linked. Intel CPUs running at 8000+ MT operate on a whole other level compared to tests done at 6000 MT.
 
Joined
Jun 2, 2017
Messages
8,122 (3.18/day)
And here's some other testing.

In your own testing for a 7700X, the difference between 4800 and 6000 is 30 FPS.

Some games such as Tarkov, Factorio or Minecraft are intensely CPU bottlenecked almost at all times, and will have more significant differences regardless of GPU used.

Other games will be GPU bottlenecked at all times, mostly single player games.

Regardless of how significant you consider 10-30 FPS, or if you consider it to be the "same amount of info reaching your eyes and brain", it's a difference. Your earlier statement of "RAM speed and latency don't matter at all" is therefore subjective, not objective.


View attachment 341239

If you have low expectations of your hardware, that's fine. If you don't notice the difference between 120 and 150 FPS, that's fine, but let's not pretend that translates into "RAM speed doesn't matter".

Because people read technical threads like these, and misinformation is then propagated.
I did not know that a 13900K comes with V-Cache; otherwise, this chart means nothing. In case you did not know, the AMD overlay also shows 1% data, and the X3D is famous for holding its 1% numbers. The fact that you have a 7800X3D and did not use your own data speaks volumes.

Yep, you can definitely tell the difference between 130 and 100 FPS on a FreeSync panel... not.
 
Joined
Jan 14, 2019
Messages
10,132 (5.16/day)
"RAM speed and latency don't matter at all" is therefore subjective, not objective.
If you read my earlier comment again, you'll see that this is exactly what I said. RAM speed doesn't matter to me. The information that reaches my brain is exactly the same.

If the difference above is night and day to you, fair enough. All I'm saying is, it means nothing to me. My play style is slow enough not to notice any difference above a certain FPS. Of course everybody is different.
 

dgianstefani

TPU Proofreader
Staff member
I did not know that a 13900K comes with V-Cache; otherwise, this chart means nothing. In case you did not know, the AMD overlay also shows 1% data, and the X3D is famous for holding its 1% numbers. The fact that you have a 7800X3D and did not use your own data speaks volumes.

Yep, you can definitely tell the difference between 130 and 100 FPS on a FreeSync panel... not.
Noted. I now know that Kapone cannot tell the difference when looking at a 30% performance improvement. Thanks for the quote, will be useful for future reference when you start discussing performance.

I'm sure you do indeed have "the best AMD computer", but how could you tell? What if another PC was 29% faster?

The fact that you have a 7800X3D and did not use your own data speaks volumes.
It speaks to the fact that I don't need to waste time logging my own testing to disprove anything you say; benchmarks off TPU are more than sufficient.

Or the general web, as AusWolf showed with his data.
 
Joined
Nov 20, 2012
Messages
117 (0.03/day)
OK, set your memory to 4800 MT JEDEC and report back afterwards if you still believe this.

You're on a 144 Hz monitor, so your averages probably wouldn't be affected too much, but I'd wager almost anything you'd notice your minimum FPS dipping below 60 FPS quite frequently.

Regardless, RAM speed/latency objectively matters to gaming performance, even if you have an X3D chip; this is a statement of fact. Depending on your monitor refresh rate, your GPU power and the type of gamer you are, you may notice this more or less, but the underlying technical data is irrefutable.
In theory, yes. In real life, for most people it doesn't matter. All the benchmarks above using a 4090 @ 1080p speak volumes.

I'm not talking JEDEC 4800. I'm talking 5600 or 6000 CL32/CL30 with EXPO. And I'm talking 1440p @ 144 Hz or even UHD. Even with a 4080, you will run into a GPU limit most of the time, and when you don't, there won't be any tangible difference between 5600, 6000 or 6400. I definitely won't consider anything above that for AM5, because I don't have the time I would need to invest into optimizing for so little gain.

If you want to achieve the highest possible framerate @ 1080p because you think that makes a better experience than higher resolutions, then that might be something different, but even then I think you really have to have a 4080 or 4090 before any investment in fast RAM makes sense.
 

dgianstefani

TPU Proofreader
Staff member
In theory, yes. In real life, for most people it doesn't matter.

I'm not talking JEDEC 4800. I'm talking 5600 or 6000 CL32/CL30 with EXPO. And I'm talking 1440p @ 144 Hz or even UHD. Even with a 4080, you will run into a GPU limit most of the time, and when you don't, there won't be any tangible difference between 5600, 6000 or 6400. I definitely won't consider anything above that for AM5, because I don't have the time I would need to invest into optimizing for so little gain.
What you consider "tangible" or "real life" are subjective analyses; the fact is that faster/lower-latency RAM affects performance, even on an X3D platform, unless you think the entire game's data fits in ~100 MB of cache. Considering this is a hardware enthusiast forum, I would expect less resistance to this statement.

The benchmarks use a 4090 at 1080p because that's an easy way to force a CPU bottleneck, instead of playing for an hour to get 10 minutes of usable data demonstrating the differences you want to focus on. Unless you think CPU/RAM benchmarking should use GPU-limited scenarios? This is the basis of scientific testing: you control the other variables and test the one you're interested in.

1440p is considered a CPU-limited resolution these days, BTW. Especially with the (now cheap) popular 240 Hz monitors, or the emerging 360 Hz/480 Hz 1440p monitors. 4K is pretty much the only resolution where you're GPU-bound all the time, but even then you can see improved minimum FPS with better RAM. 144 Hz monitors have been around for more than a decade now; it's pretty entry-level.
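The reason a 4090 at 1080p isolates the CPU can be sketched with a toy bottleneck model (hypothetical numbers, purely to show the logic):

```python
# Toy model: delivered FPS is capped by whichever side is slower, CPU (+ RAM) or GPU.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

cpu = 150.0  # frames/s the CPU + RAM can prepare (hypothetical figure)

# Flagship GPU at 1080p: GPU is far ahead, so any RAM/CPU gain shows in the result.
print(delivered_fps(cpu, gpu_fps=400.0))  # CPU-bound
# Mid-range GPU at 4K: GPU is the cap, so the same RAM/CPU gain is invisible.
print(delivered_fps(cpu, gpu_fps=120.0))  # GPU-bound
```

This is why a RAM tune that lifts the CPU side from 150 to 165 frames/s shows up in the first scenario and not at all in the second.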
 
Joined
Nov 20, 2012
Messages
117 (0.03/day)
What you consider "tangible" or "real life" are subjective analyses; the fact is that faster/lower-latency RAM affects performance, even on an X3D platform, unless you think the entire game's data fits in ~100 MB of cache. Considering this is a hardware enthusiast forum, I would expect less resistance to this statement.
That's what I meant with "in theory". I don't doubt it does. What I doubt is that RAM speed affects performance at 1440p or UHD to such an extent that it warrants investing the time and/or money needed, especially over something like 6000 CL30 or 6400 CL32 with EXPO, which isn't much more expensive than 5600 and is usable out of the box.

If you can't afford at least a 4080, worrying about high-end RAM is nonsense. Even then, you should much rather invest in a 4090. If you can't even afford a 7800X3D and a 4080, even more so.

What I do think helps performance a lot, but I won't do myself for time reasons, is buying cheap RAM and OCing it to 6000/6400, or tightening loose timings.

So, for people using a 7800X3D/7950X3D or 13900K/14900K(S), a 4090 and a monitor supporting 240 Hz+ who want to get the framerate as high as possible, especially the lows, RAM OC might be important, but those people are few.
The benchmarks use a 4090 at 1080p because that's an easy way to force a CPU bottleneck, instead of playing for an hour to get 10 minutes of usable data demonstrating the differences you want to focus on. Unless you think CPU/RAM benchmarking should use GPU-limited scenarios? This is the basis of scientific testing: you control the other variables and test the one you're interested in.

1440p is considered a CPU-limited resolution these days, BTW. Especially with the (now cheap) popular 240 Hz monitors, or the emerging 360 Hz/480 Hz 1440p monitors. 4K is pretty much the only resolution where you're GPU-bound all the time, but even then you can see improved minimum FPS with better RAM. 144 Hz monitors have been around for more than a decade now; it's pretty entry-level.
I have yet to see such a benchmark at 1440p. Then again, I just went from 1200p @ 60 Hz to 3440x1440 @ 144 Hz, and haven't had a chance to really play on it since, for time reasons. There isn't only gaming; I, for example, have to read a lot on my monitor, so I didn't want OLED. There isn't much with IPS, 3440x1440, 10-bit above 144 Hz.

But my point was: consider how many can't afford a 4080 or even a 4070 Ti Super. How many have to make do with a 4060, 7600 or 6600 XT or something like that, and a 7600, 5700X, 12400, etc. RAM speed is something they should worry about last.
 
Joined
Jun 2, 2017
Messages
8,122 (3.18/day)
Noted. I now know that Kapone cannot tell the difference when looking at a 30% performance improvement. Thanks for the quote, will be useful for future reference when you start discussing performance.

I'm sure you do indeed have "the best AMD computer", but how could you tell? What if another PC was 29% faster?


It speaks to the fact that I don't need to waste time logging my own testing to disprove anything you say; benchmarks off TPU are more than sufficient.

Or the general web, as AusWolf showed with his data.
Yep, you're right again, a FreeSync panel that supports 45-165 Hz does make a difference when you're at 100 vs 130 FPS. I guess you don't understand how FreeSync works. Thanks for confirming that.

Does the 13900K have the same 1% performance as the X3D chips?
 

dgianstefani

TPU Proofreader
Staff member
That's what I meant with "in theory". I don't doubt it does. What I doubt is that RAM speed affects performance at 1440p or UHD to such an extent that it warrants investing the time and/or money needed, especially over something like 6000 CL30 or 6400 CL32 with EXPO, which isn't much more expensive than 5600 and is usable out of the box.

If you can't afford at least a 4080, worrying about high-end RAM is nonsense. Even then, you should much rather invest in a 4090. If you can't even afford a 7800X3D and a 4080, even more so.

What I do think helps performance a lot, but I won't do myself for time reasons, is buying cheap RAM and OCing it to 6000/6400, or tightening loose timings.

So, for people using a 7800X3D/7950X3D or 13900K/14900K(S), a 4090 and a monitor supporting 240 Hz+ who want to get the framerate as high as possible, especially the lows, RAM OC might be important, but those people are few.
While some of what you're saying is true (though exaggerated; you don't need a 4090 and a 240 Hz+ panel to notice differences in RAM and therefore CPU performance), there are issues with the "20 FPS doesn't matter" attitude, especially as this performance difference holds as a percentage across all levels of hardware. Moving your minimums from 100 to 130 is noticeable even if you're only using a 4070/7700-class GPU, which is easily capable of 120 FPS. In a CPU-limited scenario, the FPS is set by the CPU, so whether you're using a 4090 or a 4070, those minimum lows will be very similar, as long as the GPU can reach a higher average than the CPU dictates, which it can.

For instance, ~20-25 FPS is the difference between a 4070 Ti and a 4080 (which used to be a $400 difference), yet I don't see anyone saying that a faster GPU won't make a difference. Yet when that 20-25 FPS comes from a RAM tune (free, or maybe $50 more if you buy a faster stock kit), suddenly it's imperceptible or "not tangible". A 7900 XTX is even less than 20 FPS ahead of a 7900 XT; does that render AMD's flagship pointless? No.

[attached benchmark chart]
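The cost-per-FPS framing can be made explicit. A rough sketch using the figures quoted above (prices are illustrative assumptions, not current market data):

```python
# Roughly the same ~20-25 FPS uplift, very different price tags (illustrative values).
upgrades = [
    ("4070 Ti -> 4080",      22, 400),  # (name, extra FPS, extra cost in $)
    ("faster stock RAM kit", 22,  50),
    ("manual RAM tune",      22,   0),
]
for name, extra_fps, extra_cost in upgrades:
    per_fps = extra_cost / extra_fps
    print(f"{name:22s}: ${per_fps:6.2f} per extra FPS")
```

Under these assumptions the GPU step costs about $18 per extra frame, while the RAM tune costs nothing but time.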
 
Joined
Nov 20, 2012
Messages
117 (0.03/day)
[...] there are issues with the "20 FPS doesn't matter" attitude, especially as this performance difference holds as a percentage across all levels of hardware. Moving your minimums from 100 to 130 is noticeable even if you're only using a 4070/7700-class GPU, which is easily capable of 120 FPS. In a CPU-limited scenario, the FPS is set by the CPU, so whether you're using a 4090 or a 4070, those minimum lows will be very similar, as long as the GPU can reach a higher average than the CPU dictates, which it can.

For instance, ~20-25 FPS is the difference between a 4070 Ti and a 4080 (which used to be a $400 difference), yet I don't see anyone saying that a faster GPU won't make a difference. Yet when that 20-25 FPS comes from a RAM tune (free, or maybe $50 more if you buy a faster stock kit), suddenly it's imperceptible or "not tangible". A 7900 XTX is even less than 20 FPS ahead of a 7900 XT; does that render AMD's flagship pointless? No.
You are still missing the point.

First, I have yet to see benchmarks at 1440p where RAM tuning beyond off-the-shelf DDR5-6000 CL30 on a 7800X3D makes even a double-digit percentage difference, in low FPS if you like. The benchmark you posted is, again, unrealistic: 1080p with cards designed for 1440p and UHD. Then, even if there were games where that was the case, it would have to be in an FPS range where I could see the difference. I don't play fast shooters, and I haven't got around to playing on my 34" 144 Hz monitor yet, but I really doubt I would see a difference between ~120 FPS and 130 FPS.

On the other hand, if you really achieve a 10+ FPS performance gain by tuning cheap RAM to 6000+ with low latency, then of course that's good, but since up to that point the price difference is minimal in my country, my time is too precious for that. Gaining more than 10 FPS by tuning beyond DDR5-6000 CL30 with a 7800X3D at 1440p, that I really want to see.
 
Joined
Jan 14, 2019
Messages
10,132 (5.16/day)
Noted. I now know that Kapone cannot tell the difference when looking at a 30% performance improvement. Thanks for the quote, will be useful for future reference when you start discussing performance.

I'm sure you do indeed have "the best AMD computer", but how could you tell? What if another PC was 29% faster?
It's all relative. 30% over 50 FPS is noticeable. 30% over 200 FPS is not (unless the game you're playing is called FRAPS or Afterburner).

Edit: The only difference is that you don't get 30% extra with high graphical settings / resolutions, and with mid-range or lower GPUs.

Your own sensitivity matters a lot as well. I know someone who demands a constant 360 FPS on his 360 Hz monitor. As for me, anything above ~40 is smooth enough, especially with Freesync.
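The "it's all relative" point can be made concrete with frame times, since perception tracks milliseconds per frame rather than FPS. A minimal sketch, using only the FPS figures mentioned above (illustrative arithmetic, not benchmark data):

```python
def frame_time_ms(fps):
    """Frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

def gain_ms(fps, pct):
    """Milliseconds shaved off each frame by a pct% FPS uplift."""
    return frame_time_ms(fps) - frame_time_ms(fps * (1 + pct / 100))

# 30% over 50 FPS: 20.0 ms -> ~15.4 ms per frame, a ~4.6 ms improvement
low = gain_ms(50, 30)
# 30% over 200 FPS: 5.0 ms -> ~3.85 ms per frame, a ~1.15 ms improvement
high = gain_ms(200, 30)
print(round(low, 2), round(high, 2))
```

The same 30% uplift buys roughly four times the per-frame improvement at 50 FPS as at 200 FPS, which is why the low-FPS case is the one people actually feel.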
 

dgianstefani

TPU Proofreader
Staff member
First, I have yet to see benchmarks at 1440p where RAM tuning above off-the-shelf DDR5-6000 CL30 on a 7800X3D makes even a double-digit percentage difference, in low fps if you like. The benchmark you posted is, again, unrealistic 1080p with cards designed for 1440p and UHD. Then, even if there were games where that was the case, it would have to be in an fps range where I would see the difference. I don't play fast shooters, and I haven't yet gotten to play on my 34" 144 Hz monitor, but I really doubt I would see a difference between ~120 fps and 130 fps.
No one tests at 1440p, not because you wouldn't see a difference, but because it's a mixed benchmark: at 1440p you'll sometimes be CPU limited and sometimes GPU limited, which isn't helpful when you're trying to isolate the effect of RAM performance. At 4K, we know CPU and therefore RAM speed become less meaningful, because only high-end cards can push more than 120 FPS, but 1440p is significantly easier for the GPU to drive than 4K, to the point where you can reach up to 300 FPS without much trouble in your average game (600 FPS in esports titles). A 10-20% difference in FPS matters more at 100+ FPS than it does at 50 FPS, as AusWolf has just said. Again, it's not the average FPS that matters most here (although moving from 200 to 230 FPS is nice if you're running a 240 Hz panel), it's the minimums: dropping from 230 FPS to 150 is a lot less jarring than dropping from 230 to 120, just as a dip from 120 to 100 FPS is less jarring than one from 120 to 80.

The point I've been trying to make for the past 30 minutes, which seems to be meeting significant resistance (for some reason), is that you don't ever want to be CPU bottlenecked, because that is what people notice as stuttering or dips. It's irrelevant whether you're playing at 100 FPS or 200 FPS; that number suddenly halving, or dropping by an appreciable amount because your CPU is struggling to keep up, is noticeable and immersion breaking, and coincides with massively increased input lag. If you want to talk esports, "muscle memory" is tied to frame rates: you want consistency, not high averages.
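The bottleneck argument can be sketched as a toy model: at any instant, delivered FPS is roughly the lower of what the CPU and the GPU can each sustain, so raising the CPU-side floor (e.g. via a RAM tune) lifts the minimums on any GPU fast enough to sit above that floor. The numbers below are hypothetical, chosen to match the 100 → 130 example from earlier in the thread, not measurements:

```python
def delivered_fps(cpu_fps, gpu_fps):
    # The slower component sets the frame rate at any given moment.
    return min(cpu_fps, gpu_fps)

# Hypothetical minimum-FPS floors in a CPU-heavy scene:
cpu_stock, cpu_tuned = 100, 130   # CPU floor before/after a RAM tune
gpu_mid, gpu_flagship = 160, 300  # each GPU's own capability in the scene

for gpu in (gpu_mid, gpu_flagship):
    print(delivered_fps(cpu_stock, gpu), "->", delivered_fps(cpu_tuned, gpu))
# Both GPUs show the same 100 -> 130 minimum: the CPU sets the floor.
```

This is why the minimum-FPS gain from a RAM tune carries across GPU tiers, while averages diverge with GPU power.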

It's all relative. 30% over 50 FPS is noticeable. 30% over 200 FPS is not (unless the game you're playing is called FRAPS or Afterburner).

Edit: The only difference is that you don't get 30% extra with high graphical settings / resolutions, and with mid-range or lower GPUs.

Your own sensitivity matters a lot as well. I know someone who demands a constant 360 FPS on his 360 Hz monitor. As for me, anything above ~40 is smooth enough, especially with Freesync.
Like I said, subjective analysis is fine, but let's call it what it is.

Not much point buying a high refresh monitor if you barely ever hit or sustain that high FPS.
 
Joined
Nov 20, 2012
Messages
117 (0.03/day)
@dgianstefani : Even then, as I said, most people just don't have the GPU to not be GPU-bottlenecked at 1440p. I won't pay €1,000+ for a GPU even though I can afford it, because I can't spend enough time gaming to justify it.
And I still don't believe in differences this huge between DDR5-6000 CL30 out of the box and tuning on a 7800X3D. If you do, please show me. But because I don't have enough time for gaming, I wouldn't invest the time I don't have in fine-tuning my RAM. Buying 6000 CL30 or 6400 CL32 with EXPO instead of 5x00 for €20-30 more, fine by me. But only because I can afford a 7800X3D. If I couldn't, I wouldn't waste money on RAM.
 
Joined
Jan 14, 2019
Messages
10,132 (5.16/day)
No one tests at 1440p, not because you wouldn't see a difference, but because it's a mixed benchmark: at 1440p you'll sometimes be CPU limited and sometimes GPU limited, which isn't helpful when you're trying to isolate the effect of RAM performance. At 4K, we know CPU and therefore RAM speed become less meaningful, because only high-end cards can push more than 120 FPS, but 1440p is significantly easier for the GPU to drive than 4K.
That's why I'm saying that RAM speed doesn't matter to me, as I'm playing at 1440 UW with as high graphical details as my GPU allows. This way, I'm always GPU limited.

Artificially inducing a CPU-limited scenario purely for the sake of science is fine, but why should I give the results more credit than they're worth?

A 10-20% difference in FPS matters more at 100+ FPS than it does at 50 FPS, as AusWolf has just said.
I actually said the opposite. At low FPS, any small extra can help, but at high FPS, I couldn't care less if there's any difference.

The point I've been trying to make for the past 30 minutes, which seems to be meeting significant resistance (for some reason), is that you don't ever want to be CPU bottlenecked, because that is what people notice as stuttering or dips. It's irrelevant whether you're playing at 100 FPS or 200 FPS; that number suddenly halving, or dropping by an appreciable amount because your CPU is struggling to keep up, is noticeable and immersion breaking, and coincides with massively increased input lag. If you want to talk esports, "muscle memory" is tied to frame rates: you want consistency, not high averages.
I agree with this sentiment, but the situations in which your experience is limited by your CPU/RAM depend heavily on your hardware and your sensitivity. I doubt that I could ever notice when a game running at 200 FPS dips to 100 for a microsecond, and I also doubt that any mid-range GPU is CPU limited at 1440p unless you pair it with a Celeron.

Like I said, subjective analysis is fine, but lets call it for what it is.

Not much point buying a high refresh monitor if you barely ever hit or sustain that high FPS.
Exactly - it's all subjective. That's my point all along. :)

The point is not necessarily the high refresh rate, but VRR, which eliminates any screen tearing at the appropriate performance levels.
 

dgianstefani

TPU Proofreader
Staff member
That's why I'm saying that RAM speed doesn't matter to me, as I'm playing at 1440 UW with as high graphical details as my GPU allows. This way, I'm always GPU limited.

Artificially inducing a CPU-limited scenario purely for the sake of science is fine, but why should I give the results more credit than they're worth?


I actually said the opposite. At low FPS, any small extra can help, but at high FPS, I couldn't care less if there's any difference.


I agree with this sentiment, but the situations in which your experience is limited by your CPU/RAM depend heavily on your hardware and your sensitivity. I doubt that I could ever notice when a game running at 200 FPS dips to 100 for a microsecond, and I also doubt that any mid-range GPU is CPU limited at 1440p unless you pair it with a Celeron.


Exactly - it's all subjective. That's my point all along. :)

The point is not necessarily the high refresh rate, but VRR, which eliminates any screen tearing at the appropriate performance levels.
No. You're not.

I'm busy doing other things now, but good chat I guess.

Again with the exaggerations though.

Dips do not last "microseconds".

Screen tearing is an entirely different problem than stuttering or framerate dips. They may have similar causes, but they're different issues entirely.
 
Joined
Jan 14, 2019
Messages
10,132 (5.16/day)
No. You're not.

I'm busy doing other things now, but good chat I guess.
Okay, I'm not. My gaming experience is still as solid as I could want, and I still don't see much, if any, difference between 6000 and 4800 MHz RAM.

Screen tearing is an entirely different problem than stuttering or framerate dips. They may have similar causes, but they're different issues entirely.
Of course they're different things. I just demonstrated that there can be multiple reasons for buying a high refresh rate monitor. The high refresh rate doesn't necessarily have to be the no. 1 buying criterion.
 

AsRock

TPU addict
Joined
Jun 23, 2007
Messages
18,911 (3.06/day)
Location
UK\USA
Processor AMD 3900X \ AMD 7700X
Motherboard ASRock AM4 X570 Pro 4 \ ASUS X670Xe TUF
Cooling D15
Memory Patriot 2x16GB PVS432G320C6K \ G.Skill Flare X5 F5-6000J3238F 2x16GB
Video Card(s) eVga GTX1060 SSC \ XFX RX 6950XT RX-695XATBD9
Storage Sammy 860, MX500, Sabrent Rocket 4 Sammy Evo 980 \ 1xSabrent Rocket 4+, Sammy 2x990 Pro
Display(s) Samsung 1080P \ LG 43UN700
Case Fractal Design Pop Air 2x140mm fans from Torrent \ Fractal Design Torrent 2 SilverStone FHP141x2
Audio Device(s) Yamaha RX-V677 \ Yamaha CX-830+Yamaha MX-630 \Paradigm 7se MKII, Paradigm 5SE MK1 , Blue Yeti
Power Supply Seasonic Prime TX-750 \ Corsair RM1000X Shift
Mouse Steelseries Sensei wireless \ Steelseries Sensei wireless
Keyboard Logitech K120 \ Wooting Two HE
Benchmark Scores Meh benchmarks.
I mean, my current X670E mobo already has two USB4 ports, so hopefully there's also other improvements or the refresh is pretty boring.

Mandating it is good I suppose, but surely there's more interesting stuff to improve.

Thunderbolt 4? WiFi 7? Better memory traces?

Extra cost for sure, maybe another $60+ on top. Oooh, it has USB 3, wow, big deal, hahaha.
 
Joined
Jun 2, 2017
Messages
8,122 (3.18/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
While some of what you're saying is true (but exaggerated; you don't need a 4090 and a 240 Hz+ panel to notice differences in RAM and therefore CPU performance), there are issues with the "20 FPS doesn't matter" attitude. Especially as this performance % difference sticks across all levels of hardware (moving your minimums from 100 to 130 FPS is noticeable even if you're only using a 4070/7700-class GPU, which is easily capable of 120 FPS. In a CPU-limited scenario, the FPS is limited by the CPU, so whether you are using a 4090 or a 4070, those minimum lows will still be very similar, as long as the 4070 can reach a higher average than the CPU dictates, which it can).

For instance, ~20-25 FPS is the difference between a 4070 Ti and a 4080 (which used to be a $400 difference), yet I don't see anyone saying that a faster GPU won't make a difference. Yet when that 20-25 FPS comes from a RAM tune (free, or maybe $50 more if you buy a faster stock kit), suddenly it's imperceptible or "not tangible". A 7900XTX vs. a 7900XT is even less than 20 FPS; does that render AMD's flagship pointless? No.

View attachment 341249
What you do not understand is that Freesync has made that moot in the accepted range. It is almost impossible to discern between 120 and 140 FPS unless an FPS counter is used. The reason that X3D chips feel so fast is that they mitigate the RAM theory you are trying to promote. Of course, if you were talking about APUs that have no VRAM buffer, then your argument comes into focus. It would also seem that you are still using AM4 as a basis. On that, I agree that AM4 CPUs felt faster with faster RAM with tight timings, to get the last 5-8 FPS. I also have the best RAM you can buy for AM4 in the Team Extreme kit, which costs 3 times what the G.Skill 3600 CL18 costs, and do you know where it made a difference? It did not. You see, I bought that RAM with an X3D chip, and it basically defeated the notion. I cannot believe that you have an X3D chip and do not understand that.
 
Last edited:
Joined
Nov 20, 2012
Messages
117 (0.03/day)
@dgianstefani : Just out of interest, I looked up recent benchmarks on the benefits of fast RAM on Ryzen 7000 and found this. While DDR5-7400 has been possible since AGESA 1.0.0.7b, it performs even slightly worse than DDR5-6200 because of 1:2 mode. Same for the 7950X at 1440p and the 7800X3D at 1080p. So what's your argument again? There is no performance to be gained from faster RAM. You can buy RAM fast enough to max out Ryzen 7000 for just slightly more than cheap RAM. The only thing you can do is buy cheap RAM and tune it to 6400 CL30/32 yourself, if you have time but no money.
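The 1:2 result is consistent with a rough latency model: past roughly DDR5-6200, Ryzen 7000 memory controllers typically fall back from 1:1 to 1:2 UCLK:MCLK, halving the controller clock and adding latency that the extra transfer rate doesn't pay back. A back-of-the-envelope sketch (the CL36 rating for a 7400 kit is an assumed typical value, not a quoted spec):

```python
def first_word_latency_ns(mt_s, cas):
    """Absolute CAS latency in ns: CAS cycles divided by the
    memory clock (half the MT/s transfer rate), times 1000."""
    return 2000.0 * cas / mt_s

# DDR5-6000 CL30 vs. a hypothetical DDR5-7400 CL36 kit: nearly
# identical CAS latency in nanoseconds, so any real-world win for
# the faster kit has to come from bandwidth and controller clocks --
# which the 1:2 UCLK fallback takes away again.
print(round(first_word_latency_ns(6000, 30), 2))  # 10.0 ns
print(round(first_word_latency_ns(7400, 36), 2))  # 9.73 ns
```

With the nanosecond latencies this close, the halved controller clock in 1:2 mode can easily cost more than the bandwidth gain, which matches the benchmark result described above.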
 