
Can someone explain this graph of CPU/GPU?

Joined
Sep 23, 2023
Messages
623 (1.24/day)
What you said makes no sense. On the 5700 XT and 5600 XT, the FPS are the same there. Why is there no variation between the CPUs if it's "faster CPU = higher FPS"? There should be variation, and on the 3070 and 3090 there is.

I have no idea what you're saying about .5/1.5.
 
Joined
Jul 20, 2020
Messages
1,180 (0.71/day)
System Name Gamey #1 / #3
Processor Ryzen 7 5800X3D / Ryzen 7 5700X3D
Motherboard Asrock B450M P4 / MSi B450 ProVDH M
Cooling IDCool SE-226-XT / IDCool SE-224-XTS
Memory 32GB 3200 CL16 / 16GB 3200 CL16
Video Card(s) Ventus 3060 / Challenger B580
Storage 4TB Team MP34 / 2TB WD SN570
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / TT Versa H18
Power Supply EVGA 650 G3 / EVGA BQ 500
The 5600 XT and 5700 XT are completely GPU-limited, and all the CPUs are waiting for something to do because the GPUs are too slow. Frankly, the 3070 is GPU-limited as well, as the differences are very small.

The 3090 is also GPU-limited with the 3600X and 5600X, but is finally CPU-limited by the slower 1600X and 2600X. In other words, the 3090 ends up waiting for the 1600X and 2600X to finish their calculations, as they're too slow for such a fast GPU.

Remember that this behavior will be different in every game, in different sections of the same game, and at different graphics quality settings. It's likely that in Cyberpunk 2077 at Ultra, the GPU will be the limiting factor in every pairing, as CP2077 is much more demanding on the GPU.
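That "waiting" logic can be sketched as a toy model: the achieved frame rate is simply the lower of what the CPU and the GPU can each deliver. All per-part FPS numbers below are invented for illustration, not taken from the chart:

```python
# Toy bottleneck model: achieved FPS is capped by the slower part.
# Per-part FPS numbers are invented for illustration only.
cpu_fps = {"1600X": 110, "2600X": 115, "3600X": 130, "5600X": 160}
gpu_fps = {"5600 XT": 60, "5700 XT": 75, "3070": 120, "3090": 150}

for gpu, g in gpu_fps.items():
    results = {cpu: min(c, g) for cpu, c in cpu_fps.items()}
    print(gpu, results)
```

With the two slow GPUs, every CPU lands on the same number (fully GPU-limited); the 3070 shows only small differences; only the 3090 lets the CPU spread show through.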
 
Joined
Jan 14, 2019
Messages
14,164 (6.39/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case It's not about size, but how you use it
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
A faster graphics card needs a faster processor to show its true power.

The 5600 XT and 5700 XT aren't fast enough to keep the CPU busy; therefore, the graphics card is the limiting factor in game performance.

With the 3090, however, every single drop is extracted from the CPU, so the differences show.

This is why it's important to match your CPU with your GPU properly when building a gaming PC. A 4090 with a Core i3 would be just as bad as a Core i9 with a 3050.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.86/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
What you said makes no sense. On the 5700 XT and 5600 XT, the FPS are the same there. Why is there no variation between the CPUs if it's "faster CPU = higher FPS"? There should be variation, and on the 3070 and 3090 there is.

I have no idea what you're saying about .5/1.5.
Because any single component in a PC can be the limiter for the achieved frame rates.
Why are these all the same? Because the weakest CPU in each group is still faster than those GPUs.

[attached chart: FPS results for each CPU, grouped by GPU]



This is why CPU reviews always use the fastest GPU out at the time of testing, so you can see actual differences without all the results being the same.
 
Joined
Sep 23, 2023
Messages
623 (1.24/day)
This is why it's important to match your CPU with your GPU properly when building a gaming PC. A 4090 with a Core i3 would be just as bad as a Core i9 with a 3050.
Thanks for the clarification, and all the helpful replies. Is there a good pairing site to look at? I was looking at the 5600X and 5700 XT/6600 XT, but I don't plan to play any shooter games, and if I hit 50-60 FPS, I'm ecstatic, as that's more than enough for me. Hell, I've played games at 35-50 FPS; it's not bad and still enjoyable.
 
Joined
Jan 14, 2019
Messages
14,164 (6.39/day)
Thanks for the clarification, and all the helpful replies. Is there a good pairing site to look at? I was looking at the 5600X and 5700 XT/6600 XT, but I don't plan to play any shooter games, and if I hit 50-60 FPS, I'm ecstatic, as that's more than enough for me. Hell, I've played games at 35-50 FPS; it's not bad and still enjoyable.
You are on one. :) Just ask, and we'll try our best to help you. The 5600X and 6600 XT are a good pair, by the way.
 
Joined
Sep 23, 2023
Messages
623 (1.24/day)
The 5600X and 6600 XT are a good pair, by the way.
But it's shown that the 6600 XT and 5700 XT are quite similar: only a small boost in performance and fewer watts used. I saw that it was between 3-5 FPS on average. I would take the 5700 XT if it weren't for the power draw, as many people don't take the investment over the time of use into consideration, and GPUs are getting crazy high TDPs. It's a huge factor in my purchases.

Still curious why, in the graph, the 5700 XT doesn't give more FPS to the 5600X vs the 3600X. The 3600X may be a weak link, but in comes the 5600X, and it should be a bit more. Something is off there. Is it that all of those CPUs are stronger than the 5700 XT and they all max it out? It seems weird that a gen-1 Ryzen 1600X maxes out a 5700 XT, which is much newer, no?

Thanks man, much appreciated. Someone should consider making a site that does pairing with charts and stuff. There are too many reviewers as it is; it's weird there isn't one.
 
Joined
Feb 20, 2019
Messages
8,688 (3.99/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Whenever you hear someone say "GPU bottleneck" this is what they mean.

The reason the CPU doesn't matter for the 5700 XT results is that all the CPUs are idle some of the time. The 1600X might be running at 95% and idle 5% of the time, while the 5600X might be running at 50% and idle 50% of the time. Both CPUs are waiting for the GPU to finish rendering the last frame before they send it the next one to work on.

Think of it like cars on a road with traffic signals: the 1600X is a slow car, the 5600X is a fast car. Every time a light turns green, the 5600X races ahead to the next red light and waits there. The slow 1600X eventually reaches the same light, and they both wait a little longer until it goes green. So the 5600X is the faster car, but both cars cover ground at the exact same average speed, because it's not the speed or acceleration of the car that matters; it's how fast the signal lights change.
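The analogy maps directly onto frame times: each frame takes as long as the slower part, and the faster part idles for the remainder. A minimal sketch, with made-up millisecond figures chosen to match the 95%/50% utilisation numbers above:

```python
# Each frame takes as long as the slower of CPU and GPU;
# the faster part idles for the rest of the frame.
def frame_stats(cpu_ms, gpu_ms):
    frame_ms = max(cpu_ms, gpu_ms)   # the slower part sets the pace
    fps = 1000 / frame_ms
    cpu_busy = cpu_ms / frame_ms     # fraction of the frame the CPU works
    return fps, cpu_busy

# GPU-bound at 20 ms per frame: both CPUs deliver the same 50 FPS.
print(frame_stats(19, 20))  # slow CPU: 95% busy
print(frame_stats(10, 20))  # fast CPU: 50% busy, same frame rate
```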
 
Joined
Jan 14, 2019
Messages
14,164 (6.39/day)
But it's shown that the 6600 XT and 5700 XT are quite similar: only a small boost in performance and fewer watts used. I saw that it was between 3-5 FPS on average. I would take the 5700 XT if it weren't for the power draw, as many people don't take the investment over the time of use into consideration, and GPUs are getting crazy high TDPs. It's a huge factor in my purchases.
I would take the 6600 XT because of its lower power draw and more modern feature set. The 5000 series is already showing its age in games that require mesh shaders, like Alan Wake 2.

Still curious why, in the graph, the 5700 XT doesn't give more FPS to the 5600X vs the 3600X. The 3600X may be a weak link, but in comes the 5600X, and it should be a bit more. Something is off there. Is it that all of those CPUs are stronger than the 5700 XT and they all max it out? It seems weird that a gen-1 Ryzen 1600X maxes out a 5700 XT, which is much newer, no?
It depends on the game. In that particular game, the CPU isn't used enough to be maxed out. In some other games, it might be. Generally, you want a gaming PC where the GPU is the bottleneck. It usually gives smoother gameplay, not to mention, the GPU is easier to upgrade.

Thanks man, much appreciated. Someone should consider making a site that does pairing with charts and stuff. There are too many reviewers as it is; it's weird there isn't one.
No worries. :)

The problem is that there are so many variables (which game, which CPU architecture, which CPU or GPU model, etc.) that you can't just set up a chart to recommend something definite. You need a bit of experience to figure this out.
 
Joined
Sep 17, 2014
Messages
23,173 (6.10/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
But it's shown that the 6600 XT and 5700 XT are quite similar: only a small boost in performance and fewer watts used. I saw that it was between 3-5 FPS on average. I would take the 5700 XT if it weren't for the power draw, as many people don't take the investment over the time of use into consideration, and GPUs are getting crazy high TDPs. It's a huge factor in my purchases.

Still curious why, in the graph, the 5700 XT doesn't give more FPS to the 5600X vs the 3600X. The 3600X may be a weak link, but in comes the 5600X, and it should be a bit more. Something is off there. Is it that all of those CPUs are stronger than the 5700 XT and they all max it out? It seems weird that a gen-1 Ryzen 1600X maxes out a 5700 XT, which is much newer, no?

Thanks man, much appreciated. Someone should consider making a site that does pairing with charts and stuff. There are too many reviewers as it is; it's weird there isn't one.
They exist, but this isn't a cut-and-dried 'use this CPU with that GPU' affair, and as a result, those simple sites just suck.

The balance you need is time-dependent, and on top of that, it depends on your needs and wants.

Example: say you play on a 1080p monitor while I play on a 4K monitor. Our games will run fine on, say, a 5800X3D with any GPU. But for my 4K rig, I definitely want a much faster GPU: for each frame the CPU is calculating, I need to calculate 4x more pixels to show the image.
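The 4x figure is plain pixel-count arithmetic; the CPU's per-frame work (game logic, draw calls) doesn't grow with resolution, while the GPU's largely does:

```python
# 4K pushes four times the pixels of 1080p in every frame.
p1080 = 1920 * 1080  # 2,073,600 pixels
p4k = 3840 * 2160    # 8,294,400 pixels
print(p4k / p1080)   # 4.0
```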
 
Joined
Feb 20, 2019
Messages
8,688 (3.99/day)
I would take the 6600 XT because of its lower power draw and more modern feature set. The 5000 series is already showing its age in games that require mesh shaders, like Alan Wake 2.
Yes and no. You're not wrong. I'd also choose the 6600 XT over a 5700 XT, but...

Right now, 99.9% of games don't use or need mesh shaders. In 3-5 years' time, 97% of games probably still won't need or use them.

If you're the sort of person who only plays the latest AAA games, then mesh shaders are probably more important, but at the same time, both the 5700 XT and 6600 XT are already starting to struggle with last year's AAA games: 1080p60 usually requires a few compromises down to medium settings for 2022 and 2023 titles. Mesh shader support is almost irrelevant, because the new 2024 and 2025 AAA games that NEED it are also fairly likely to be beyond the reasonable scope of either card without seriously compromised image quality or framerate.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.86/day)
Thanks for the clarification, and all the helpful replies. Is there a good pairing site to look at? I was looking at the 5600X and 5700 XT/6600 XT, but I don't plan to play any shooter games, and if I hit 50-60 FPS, I'm ecstatic, as that's more than enough for me. Hell, I've played games at 35-50 FPS; it's not bad and still enjoyable.
You're on that website.

You start with the performance of your CPU, because you can't "turn down" CPU-intensive settings in a game; that would mean kicking half the players out or killing half the units, which isn't possible.
You can lower GPU settings: a 4090 on ultra is no different to a 3090 on medium, or a 5700 XT on lower settings again.

Find the latest CPU review and look for minimum FPS. The rest is up to you: pair with any GPU you want, and lower settings until you're happy with the FPS. Preferably run an FPS cap to keep things within the 1% low range of the CPU, and you'll get stutter-free happiness.
Notice how nothing can really reach 144 FPS, despite high-refresh displays being a big deal? This is why they don't matter yet.


Intel Core i9-14900K Review - Reaching for the Performance Crown - Minimum FPS / RTX 4090 | TechPowerUp
4K results are what matter the most, IMO, because they show what things will be like under a more demanding load, which you can use as a preview of how next-gen games will run in the coming years, even at lowered settings.

[attached chart: minimum FPS results from the linked review]



60Hz display? Even a Ryzen 2600 is fine.

For models not listed, compare them to the closest one that is. Look how far apart the 5600 and 5600G are (they're very different designs), while the similar Zen 3 CPUs (5600 through 5950X) are all very close together.

Higher models tend to add more cores, which doesn't matter at all for gaming. They only show up as faster because each step is also 100-200 MHz faster, to artificially make them the 'best', while real-world results are basically identical.

Spot the nearly identical CPUs, for both AMD and Intel:
[attached chart: CPU gaming performance comparison]


The 5800X3D added more cache to break away from the pack here, and is an example of why you should investigate anything different about the naming:
[attached chart: 5800X3D gaming performance comparison]
 
Joined
Jan 14, 2019
Messages
14,164 (6.39/day)
Yes and no. You're not wrong. I'd also choose the 6600 XT over a 5700 XT, but...

Right now, 99.9% of games don't use or need mesh shaders. In 3-5 years' time, 97% of games probably still won't need or use them.

If you're the sort of person who only plays the latest AAA games, then mesh shaders are probably more important, but at the same time, both the 5700 XT and 6600 XT are already starting to struggle with last year's AAA games: 1080p60 usually requires a few compromises down to medium settings for 2022 and 2023 titles. Mesh shader support is almost irrelevant, because the new 2024 and 2025 AAA games that NEED it are also fairly likely to be beyond the reasonable scope of either card without seriously compromised image quality or framerate.
You're not wrong. But I'd still much rather get the 6600 XT. A more modern architecture usually means better efficiency and longer driver support.
 
Joined
Feb 24, 2023
Messages
3,553 (4.98/day)
Location
Russian Wild West
System Name D.L.S.S. (Die Lekker Spoed Situasie)
Processor i5-12400F
Motherboard Gigabyte B760M DS3H
Cooling Laminar RM1
Memory 32 GB DDR4-3200
Video Card(s) RX 6700 XT (vandalised)
Storage Yes.
Display(s) MSi G2712
Case Matrexx 55 (slightly vandalised)
Audio Device(s) Yes.
Power Supply Thermaltake 1000 W
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Benchmark Scores My PC can run Crysis. Do I really need more than that?
making a site that does pairing with charts and stuff
TPU has the data, but the niche is still vacant, for these reasons:

0. People have different displays or even sets of displays. Greater resolution = bigger impact on GPU performance, more CPU idle time (that's why at 4K, it's not as bad to have an i3: your GPU is tortured regardless).
1. People play different games and have different graphics preferences. This leads to inconsistent data on which PC component is used the most. Average results do exist, but they should be taken with a small grain of salt, because your mileage WILL vary.
2. Different motherboards have different layouts and thus different latencies, which results in faster RAM access in one PC and quicker GPU access in another. It would take a billion eternities to make an elaborate chart on motherboards based on one chipset, let alone all non-outdated motherboards.
3. Silicon lottery means you don't have equally tuned RAM sticks on two different systems. Bandwidths and latencies differ which causes shifts in results.
...and so on.

Best case, you will get about 90 percent accurate information about your GPU before actually using it and seeing how it performs "in person."

That said, the 5700 XT is vastly inferior to the 6600 XT in terms of efficiency, support, hardware condition, warranty, and feature set. I'd rather get a 6800 non-XT, though... but I know, an additional $120...180 could be too much for you. The 6600/7600 is a no-brainer compared to the 5700 XT anyway.

If you're below 1080p (why?), you'll more often be limited by CPU+RAM.
If you're at 1080p or UW 1080p then your system is balanced.
At 1440p, 6600 XT is usually tortured a bit more than the Ryzen 5600X. But that's still playable.
At UW 1440p and beyond, 6600 XT is usually insufficient.
 
Joined
Sep 17, 2014
Messages
23,173 (6.10/day)
4K results are what matter the most, IMO, because they show what things will be like under a more demanding load, which you can use as a preview of how next-gen games will run in the coming years, even at lowered settings.
This is a pretty dangerous statement. Look at Alan Wake 2.

All bets are off with next-gen engines and games. The best approach is 'get a bit of headroom in performance' and 'get sufficient VRAM'. Beyond that, you can't really say a thing about what your performance will look like based on card performance at a different resolution. More often than not, resolution upgrades aren't that painful if you can already get good frames; it's a simple relative hit: 25% more pixels? Roughly 80% of your perf left, or more.

Engine upgrades, on the other hand, can simply place whole generations of cards out of the game just like that. RDNA1 on Alan Wake 2 is a good recent example. Required technologies and feature sets are the most influential factors for a GPU's longevity. Consider, for example, stuff like having DLSS/FSR support on your card.
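The 'simple relative hit' above can be written as a naive first-order estimate that assumes FPS scales inversely with pixel count. As the post notes, real games usually lose less than this, since not all the work is per-pixel; the function and numbers here are purely illustrative:

```python
# Naive estimate: FPS scales inversely with pixel count.
# Treat it as a worst-case floor; real scaling is usually kinder.
def estimated_fps(base_fps, base_pixels, new_pixels):
    return base_fps * base_pixels / new_pixels

# 25% more pixels leaves ~80% of the frame rate in this model.
print(estimated_fps(100, 1.00, 1.25))  # 80.0
# 1440p -> 4K is 2.25x the pixels:
print(estimated_fps(90, 2560 * 1440, 3840 * 2160))  # 40.0
```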
 
Joined
Feb 24, 2023
Messages
3,553 (4.98/day)
RDNA1 on Alan Wake 2 is a good recent example.
Pascal is even more super dead. 5700 XT is at least capable of hitting low 40s with FSR...
 
Joined
Sep 17, 2014
Messages
23,173 (6.10/day)
Pascal is even more super dead. 5700 XT is at least capable of hitting low 40s with FSR...
Yes, but RDNA1 launched in 2019 and Pascal in 2016.

That's 1-1.5 generations' worth of a gap. And to put that in perspective: if you bought RDNA1 in 2019, you got a mere 4 years of good performance out of it before it dropped dead. For Pascal, the counter is at 7 years. Almost double the lifespan.
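That lifespan arithmetic, taking Alan Wake 2's 2023 release as the point where cards without mesh shaders dropped out (the cutoff year is this thread's framing, not an official date):

```python
# Years of useful life from launch until a required feature
# (mesh shaders, via Alan Wake 2 in 2023) cut the generation off.
cutoff = 2023
launches = {"Pascal": 2016, "RDNA1": 2019}
for arch, year in launches.items():
    print(arch, cutoff - year)  # Pascal: 7, RDNA1: 4
```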
 
Joined
Feb 24, 2023
Messages
3,553 (4.98/day)
Yes, but RDNA1 launched in 2019 and Pascal in 2016.

That's 1-1.5 generations' worth of a gap. And to put that in perspective: if you bought RDNA1 in 2019, you got a mere 4 years of good performance out of it before it dropped dead. For Pascal, the counter is at 7 years. Almost double the lifespan.
True, but the 5700 XT is still 60 percent ahead, and that's significant. Not to mention the difference in positioning and MSRP: the 1080 Ti was a borderline halo GPU at $700, while the 5700 XT launched at almost half that. The 1070 launched for the same ~$400 as the 5700 XT, and... yeah, I mean, there is no way the 1070 can compete. RDNA1 has fallen faster, true, but Pascal has it much harder.

Pascal is still the only Nvidia architecture to prove genuinely future-proof so far; it's only now becoming unplayable. Turing didn't achieve much due to poor perf per dollar and the pre-alpha state of its RT cores. Ampere got a suboptimal Samsung node, questionable VRAM capacity, and instant abandonware status (DLSS Frame Generation and similar features wise), and with Ada Lovelace, it feels like the only GPU capable of ageing well is the 4090. With a grain of salt, because $1600 is way beyond the average Joe's GPU budget.

RDNA1, though, wasn't a success by any means. Just a regular case of "AMD doing their AMD thing," nothing extra. RDNA2 is much better despite its exceptionally greedy MSRPs, even greedier than those of Ampere GPUs.
 
Joined
Sep 17, 2014
Messages
23,173 (6.10/day)
True, but the 5700 XT is still 60 percent ahead, and that's significant. Not to mention the difference in positioning and MSRP: the 1080 Ti was a borderline halo GPU at $700, while the 5700 XT launched at almost half that. The 1070 launched for the same ~$400 as the 5700 XT, and... yeah, I mean, there is no way the 1070 can compete. RDNA1 has fallen faster, true, but Pascal has it much harder.

Pascal is still the only Nvidia architecture to prove genuinely future-proof so far; it's only now becoming unplayable. Turing didn't achieve much due to poor perf per dollar and the pre-alpha state of its RT cores. Ampere got a suboptimal Samsung node, questionable VRAM capacity, and instant abandonware status (DLSS Frame Generation and similar features wise), and with Ada Lovelace, it feels like the only GPU capable of ageing well is the 4090. With a grain of salt, because $1600 is way beyond the average Joe's GPU budget.

RDNA1, though, wasn't a success by any means. Just a regular case of "AMD doing their AMD thing," nothing extra. RDNA2 is much better despite its exceptionally greedy MSRPs, even greedier than those of Ampere GPUs.
We are in complete agreement here. To relate this to what Mussels said about performance over time: you have to be pretty deep into the world of GPUs and gaming to know what's what. Even reviews alone won't provide the frame of reference you really need to make sound choices.
 
Joined
Jan 14, 2019
Messages
14,164 (6.39/day)
Pascal is still the only Nvidia architecture to prove genuinely future-proof so far; it's only now becoming unplayable. Turing didn't achieve much due to poor perf per dollar and the pre-alpha state of its RT cores. Ampere got a suboptimal Samsung node, questionable VRAM capacity, and instant abandonware status (DLSS Frame Generation and similar features wise), and with Ada Lovelace, it feels like the only GPU capable of ageing well is the 4090. With a grain of salt, because $1600 is way beyond the average Joe's GPU budget.
I don't think Ada will age well. It lacks VRAM just like Ampere, and it relies way too heavily on software augmentation (e.g. DLSS FG) to run games properly. Performance in games over the next 2-4 years will drop like a rock, and all Ada prices will follow that drop when the 50-series is out. The 4090 is only held up by its halo status, which will also change. Vaporware, the whole lot.

Pascal, on the other hand, did very well by pushing through 7 years of gaming. It's probably the best series Nvidia has ever had.
 
Joined
Feb 24, 2023
Messages
3,553 (4.98/day)

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,174 (2.77/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1

So in the picture, in all his tests of games, the ones with the 5700 XT and the 5600 XT all have the same FPS, and the ones with the 3070 or 3090 have varying scores. Why is this?
That's called a bottleneck. In case you need a more detailed explanation, you can always ask ChatGPT.
Me said:
Describe a bottleneck in terms of a computer system.
ChatGPT said:
In the context of a computer system, a bottleneck refers to a point of congestion or restriction in the flow of data or operations that hinders the system's overall performance. This bottleneck prevents the system from operating at its maximum potential, and it can occur at various points within a computer system. Here are some common examples of bottlenecks in a computer system:

1. CPU Bottleneck: This occurs when the central processing unit (CPU) is unable to handle the incoming data and instructions as quickly as other components can generate them. In such cases, the CPU becomes the limiting factor, causing slowdowns in processing and overall system performance.

2. Memory Bottleneck: A memory bottleneck happens when the computer's RAM (Random Access Memory) or other types of memory cannot keep up with the data transfer demands of the CPU and other components. This can lead to slower performance as the system needs to fetch data from slower storage devices.

3. Storage Bottleneck: When the storage subsystem, which includes hard drives or solid-state drives, is unable to read or write data as fast as other components need it, a storage bottleneck occurs. Slow storage can lead to slower boot times, application loading, and file access.

4. Network Bottleneck: In networked computer systems, a bottleneck can occur in the network infrastructure, such as routers, switches, or network cables. Slow network speeds can cause delays in data transfer and communication between devices.

5. Input/Output (I/O) Bottleneck: This type of bottleneck occurs when the input/output operations, like reading from or writing to storage devices, are slower than the processing speed of the CPU. This can impact tasks that involve frequent data access, like database operations or file transfers.

6. Graphics Processing Unit (GPU) Bottleneck: In systems that rely on GPUs for tasks like gaming or graphics-intensive applications, a GPU bottleneck can occur when the GPU's processing power is not sufficient to handle the workload, leading to reduced frame rates and graphics performance.

7. Software Bottleneck: Sometimes, the bottleneck isn't hardware-related but rather a result of inefficient software or poorly optimized code. Software bottlenecks can manifest as slow program execution or resource-intensive applications that strain the hardware components.

Identifying and addressing bottlenecks is essential for optimizing a computer system's performance. This often involves upgrading the bottlenecked component or improving system configuration, such as adding more memory, upgrading the CPU, or using faster storage devices. Performance monitoring tools can help diagnose and pinpoint the source of bottlenecks in a computer system, allowing for targeted improvements to enhance overall efficiency and speed.
The two you likely care about are #1 and #6. Maybe #7 to some degree as well.
 
Joined
Jan 14, 2019
Messages
14,164 (6.39/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case It's not about size, but how you use it
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
That's called a bottleneck. In case you need a more detailed explanation, you can always ask ChatGPT.
That's as stupid as ChatGPT can get.

A bottleneck doesn't mean that one component "can't keep up" with the work of the others. It means that one component is loaded to 100% of its capacity, while the other components spend time waiting for that first component to finish. Every computer has some kind of bottleneck, which can vary task by task. It's not something to avoid, it's completely natural.

"Identifying and addressing bottlenecks" (as ChatGPT said) is only essential if the bottleneck is causing performance problems in the desired applications. That is, if your CPU is too slow for your work, or causes your GPU to wait within a game, or if your FPS is too low, or fluctuates too heavily. Other than that, bottlenecks are part of every system.
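This also answers the original question about the graph. You can sketch the idea in a few lines of Python: frame rate is set by whichever component takes longer per frame, so a slow GPU hides all CPU differences. The frame times below are made-up illustrative numbers, not real benchmark data.

```python
# Toy bottleneck model: a frame can't finish before the slower of the
# CPU and GPU is done with it, so FPS = 1000 / max(cpu_ms, gpu_ms).
# All per-frame times (in milliseconds) are hypothetical.

cpu_frame_ms = {"1600X": 8.0, "2600X": 7.0, "3600X": 5.0, "5600X": 4.0}
gpu_frame_ms = {"5600 XT": 12.0, "5700 XT": 10.0, "3070": 6.0, "3090": 4.5}

def fps(cpu_ms: float, gpu_ms: float) -> float:
    # The slower component per frame dictates the frame rate.
    return 1000.0 / max(cpu_ms, gpu_ms)

for gpu, g_ms in gpu_frame_ms.items():
    row = {cpu: round(fps(c_ms, g_ms), 1) for cpu, c_ms in cpu_frame_ms.items()}
    print(f"{gpu}: {row}")
```

With the slow GPUs (5600 XT, 5700 XT), every CPU lands on the exact same FPS because the GPU's frame time dominates the `max()`; with the fast 3090, the slower CPUs (1600X, 2600X) finally become the larger term and the FPS spreads out, which is exactly the pattern in the graph.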
 