
NVIDIA GeForce RTX 3080 with AMD Ryzen 3900XT vs. Intel Core i9-10900K

People keep repeating this like it's true. PCIe 3.0 x4 for storage is not bottlenecking system performance.

I sometimes edit videos, and I absolutely see my streaming I/O speeds max out under those circumstances. Faster storage means faster editing, especially with lossless or low-compression formats: HuffYUV, UTVideo, 4K/8K footage, etc. It depends on how concerned you are about compression losses between editing steps.
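Some back-of-the-envelope numbers show where the x4 link starts to pinch. The formats and the ~3.5 GB/s usable figure for a PCIe 3.0 x4 NVMe drive are my assumptions, not measurements:

```python
# Back-of-the-envelope: does lossless video editing saturate a PCIe 3.0
# x4 SSD? All figures are rough assumptions, not measurements.

def data_rate_gbs(width, height, fps, bits_per_pixel):
    """Uncompressed stream data rate in GB/s (1 GB = 1e9 bytes)."""
    return width * height * fps * bits_per_pixel / 8 / 1e9

uhd_4k = data_rate_gbs(3840, 2160, 60, 20)  # 10-bit 4:2:2, ~1.24 GB/s
uhd_8k = data_rate_gbs(7680, 4320, 60, 20)  # same format, ~4.98 GB/s
pcie3_x4 = 3.5                              # ~usable GB/s on a 3.0 x4 NVMe

print(f"4K60 uncompressed: {uhd_4k:.2f} GB/s")
print(f"8K60 uncompressed: {uhd_8k:.2f} GB/s")
print(f"PCIe 3.0 x4 SSD:  ~{pcie3_x4:.1f} GB/s")
# One 4K stream fits easily; a few simultaneous streams (multicam) or a
# single 8K stream already runs into the x4 link.
```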

This picture really shows an interesting problem with Zen 2: either the ALU/AGU or the FPU could be maxing out. Yet when people push to 5.0 GHz on extreme cooling, there is almost no gain. Sure, they need to lower the Infinity Fabric clock when running cold, but that shouldn't be a problem; it should still sit around 1500 MHz and deliver the same performance. I really don't think Infinity Fabric can be that big of a bottleneck.

The opposite. If ALUs / FPUs were maxing out, then overclocking would help.

What we're seeing here is that overclocked processors do NOT improve FPS. This means the bottleneck is elsewhere (probably RAM latency, if I had to guess).
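A toy model makes that reasoning concrete. The compute/stall split below is a made-up illustration, not measured data:

```python
# Toy model: per-frame CPU time = compute (scales with core clock) +
# memory stalls (doesn't). The 3/4 ms split is a made-up illustration.

def fps(compute_ms, stall_ms, clock_scale):
    return 1000 / (compute_ms / clock_scale + stall_ms)

compute_ms, stall_ms = 3.0, 4.0
for scale in (1.00, 1.05, 1.10):            # stock, +5%, +10% core clock
    print(f"{scale:.2f}x clock -> {fps(compute_ms, stall_ms, scale):.1f} FPS")
# With stalls dominating, +10% clock buys only ~4% FPS -- consistent
# with overclocks barely moving the charts when latency is the limit.
```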
 
Thank you very much for doing this, guys. As a happy owner of a 3900X, I'm glad you also did the GPU review with an AMD CPU for once. ;)
 
Age is not the relevant factor. The comparative real-world performance data is.
Yeah, but I'm not sure it would say much without also running benchmarks on a PCIe 3.0 system from the same era, since CPUs have improved a lot since then.
Running a 2700K on both a Z68 and a Z77 board (or just limiting the PCIe link) would give a better picture IMO.

Age is a relevant factor, since reviewing takes time; it's all about priorities. :) Though I'm sure we'll see an Ampere + Sandy Bridge review somewhere soon.
 
Wizzard, just a suggestion: CPU-limited games should not be included in GPU benchmarks. It kinda skews the overall percentage and does not fully reflect the GPU's potential.
 
Very interesting findings; it's a pretty close call between the two top CPUs. The findings on PCIe 4.0 are not too surprising, though; it's probably more likely to be beneficial with newer SSDs.
 
This question gets asked every single time there's a new PCIe standard, and it has never mattered, because the bus is always ahead of the GPU in terms of bandwidth. So, big shocker, it's the same thing this time around.

Hope every single idiot's mouth is now shut. I'm sick and tired of the million topics asking how badly PCIe 3.0 will bottleneck things. Intel users on PCIe 3.0 are just fine and actually still have the upper hand in gaming.
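For reference, the theoretical x16 numbers per generation back this up:

```python
# Theoretical x16 bandwidth per PCIe generation, after line coding
# (8b/10b for Gen 1/2, 128b/130b for Gen 3/4).
gens = {1: (2.5, 8 / 10), 2: (5.0, 8 / 10),
        3: (8.0, 128 / 130), 4: (16.0, 128 / 130)}
for gen, (gt_per_s, coding) in gens.items():
    gb_per_s = gt_per_s * coding * 16 / 8   # GT/s * efficiency * lanes / 8 bits
    print(f"PCIe {gen}.0 x16: ~{gb_per_s:.1f} GB/s")
# Output: ~4.0, ~8.0, ~15.8, ~31.5 GB/s. Even 3.0 x16 is far more than
# a GPU streams per frame, which is why each new generation "doesn't
# matter" for gaming at launch.
```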
 
The only difference I'm focused on is Microsoft Flight Simulator 2020. As far as I can tell, 1440p and 4K still can't hit 60 FPS on the 3080.

That game must be poorly optimized.
 
The opposite. If ALUs / FPUs were maxing out, then overclocking would help.

What we're seeing here is that overclocked processors do NOT improve FPS. This means the bottleneck is elsewhere (probably RAM latency, if I had to guess).

I don't think so, because the 4000G series, which is Zen 2 on a monolithic die with 1/4 the L3 cache, has lower latency than the 3000 series. Yet the 4000G series is still slower than the Zen 2 chiplet parts.



It could be that the load/store units can't keep the cores fed.
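If L3 size is the culprit, a crude working-set sweep illustrates the effect. This is only a qualitative probe (Python/NumPy overhead is huge), with the published Matisse/Renoir cache sizes noted in the comment:

```python
# Crude working-set sweep: random gathers get slower once the array no
# longer fits in cache. Qualitative only -- Python/NumPy overhead is big.
import time
import numpy as np

for mb in (1, 4, 16, 64, 256):
    n = mb * 1024 * 1024 // 8                 # float64 elements
    data = np.random.rand(n)
    idx = np.random.randint(0, n, 2_000_000)  # random access pattern
    start = time.perf_counter()
    data[idx].sum()                           # gather across the array
    ns = (time.perf_counter() - start) * 1e9 / len(idx)
    print(f"{mb:4d} MB working set: {ns:5.1f} ns/access")
# Expect a step once the set exceeds L3 (16 MB per CCX on Matisse,
# 4 MB on Renoir): less cache means hitting DRAM latency sooner.
```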
 
As an AMD fan myself... well, I mean, it'd probably be closer if Ryzen could also run 5 GHz all-core, but it can't.

Credit where credit is due, no matter what else you might try to ding Intel for: Intel does know how to build a good clockin' processor.
 
AMD NO!
In Far Cry 5 at Full HD, the 3900XT makes the RTX 3080 slower than an Intel CPU + RTX 2070 :respect:
That's because it's limited by the CPU, not the GPU. Look what happens when it gets bumped to 1440p: nothing, with either CPU.
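A minimal frame-time model shows why; all numbers below are hypothetical, picked only to mirror the pattern in the charts:

```python
# Minimal model: FPS is capped by the longer of CPU and GPU frame time.
# GPU time scales with pixels; CPU time is roughly resolution-independent.
# All numbers are hypothetical, picked to mirror the Far Cry 5 pattern.

def fps(cpu_ms, gpu_ms):
    return 1000 / max(cpu_ms, gpu_ms)

cpu_ms = 8.0                                # assumed per-frame CPU cost
for name, pixels in (("1080p", 1920 * 1080), ("1440p", 2560 * 1440)):
    gpu_ms = pixels / 1e6 * 3.0             # assumed ~3 ms per megapixel
    bound = "CPU" if cpu_ms > gpu_ms else "GPU"
    print(f"{name}: {fps(cpu_ms, gpu_ms):.1f} FPS ({bound}-bound)")
# At 1080p the fixed 8 ms CPU cost dominates, so swapping GPUs changes
# nothing; at 1440p the GPU term takes over and the CPUs converge.
```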
 
AMD NO!
In Far Cry 5 at Full HD, the 3900XT makes the RTX 3080 slower than an Intel CPU + RTX 2070 :respect:
Same with Divinity. I really hope the Ryzen 4000 series is amazing in gaming, or else I'm fecked. Got the 3700X thinking I was gonna be good for at least 3-4 years, oh well :rolleyes:
 
Got the 3700X thinking I was gonna be good for at least 3-4 years, oh well :rolleyes:
If you're going to run it at anything less than 4K then yeah, I see your point, but in that case you might just as well buy the 3070 instead.
 
I've always wondered: what is the performance at a non-standard resolution, such as 3440x1440?
There are a lot of people out there with 21:9 monitors and that ~5 MP resolution.
 
I've always wondered: what is the performance at a non-standard resolution, such as 3440x1440?
There are a lot of people out there with 21:9 monitors and that ~5 MP resolution.
In practice there might be some variation in a few benchmarks, but the performance scaling from one GPU to the next is going to be similar, if not identical, for a given resolution.
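As a crude first estimate, you can interpolate frame time (which scales roughly with pixel count when GPU-bound) between the two tested resolutions. The FPS inputs below are hypothetical stand-ins for real review numbers:

```python
# Crude ultrawide estimate: interpolate frame time (roughly proportional
# to pixel count when GPU-bound) between the two tested resolutions.
# The FPS inputs are hypothetical stand-ins for real review numbers.
res = {"2560x1440": 2560 * 1440,
       "3440x1440": 3440 * 1440,
       "3840x2160": 3840 * 2160}
for name, px in res.items():
    print(f"{name}: {px / 1e6:.2f} MP")

fps_qhd, fps_4k = 144.0, 86.0               # hypothetical measured FPS
ms_qhd, ms_4k = 1000 / fps_qhd, 1000 / fps_4k
t = (res["3440x1440"] - res["2560x1440"]) / (res["3840x2160"] - res["2560x1440"])
print(f"3440x1440 estimate: ~{1000 / (ms_qhd + t * (ms_4k - ms_qhd)):.0f} FPS")
```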
 
I don't think so, because the 4000G series, which is Zen 2 on a monolithic die with 1/4 the L3 cache, has lower latency than the 3000 series. Yet the 4000G series is still slower than the Zen 2 chiplet parts.



It could be that the load/store units can't keep the cores fed.
Renoir APUs have way less L3 cache than Matisse CPUs; there's your bottleneck.
 
AMD NO!
In Far Cry 5 at Full HD, the 3900XT makes the RTX 3080 slower than an Intel CPU + RTX 2070 :respect:

In those benchmarks the 2080 Ti is neck and neck with the 3080.

I would blame optimization instead of the CPU for those games...
 
"which your high refresh-rate monitor can benefit from."

If only we all did... still rocking 60 Hz here. Maybe one of these days 120 Hz+ screens will drop the 'gaming' moniker and the price premium.
 