
can someone explain this graph of cpu/gpu?

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,174 (2.77/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
That's as stupid as ChatGPT can get.

A bottleneck doesn't mean that one component "can't keep up" with the work of the others. It means that one component is loaded to 100% of its capacity, while other components spend time waiting for that first component to finish. Every computer has some kind of bottleneck, which can vary task by task. It's not something to avoid; it's completely natural.

"Identifying and addressing bottlenecks" (as ChatGPT said) is only essential if the bottleneck is causing performance problems in the desired applications. That is, if your CPU is too slow for your tasks, or causes your GPU to wait within a game, or if your FPS is too low, or fluctuates too heavily. Other than that, bottlenecks are part of every system.
Yeah, the OP doesn't seem to understand that basic fact, which is why it needs to be explained. How benchmarks respond to more CPU or GPU resources comes down entirely to bottlenecks, and understanding that relationship is important for understanding why benchmark graphs look the way they do, because they can imply that you're GPU- or CPU-bound. You're right, you're always going to be constrained by something, but sometimes one particular thing dominates all the others, and we see this very often with games.
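The "something always dominates" point can be illustrated with a toy frame-time model (a rough sketch with invented numbers, not a real pipeline):

```python
# Toy model of the point above: each frame needs CPU work and GPU work,
# and the slower stage sets the frame time. Some component is therefore
# always the bottleneck -- that's normal, not a fault.
# The millisecond figures are invented purely for illustration.

def frame_stats(cpu_ms, gpu_ms):
    """Return (fps, bottleneck) for per-frame stage times in milliseconds."""
    frame_ms = max(cpu_ms, gpu_ms)                 # slowest stage dominates
    bottleneck = "CPU" if cpu_ms >= gpu_ms else "GPU"
    return 1000.0 / frame_ms, bottleneck

# GPU-bound, yet still comfortably above 200 FPS:
fps, limit = frame_stats(cpu_ms=2.0, gpu_ms=4.0)
print(f"{fps:.0f} FPS, {limit}-bound")   # 250 FPS, GPU-bound

# A faster GPU doesn't remove the bottleneck; it just moves it to the CPU:
fps, limit = frame_stats(cpu_ms=2.0, gpu_ms=1.0)
print(f"{fps:.0f} FPS, {limit}-bound")   # 500 FPS, CPU-bound
```

Real engines overlap CPU and GPU work across frames, so this is only the intuition rather than a simulation, but it shows why "bottleneck" and "performance problem" aren't the same thing.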
 
Joined
Jan 14, 2019
Messages
14,168 (6.39/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Overclocking is overrated
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case It's not about size, but how you use it
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Mechanic
VR HMD Not yet
Software Linux gaming master race
Yeah, the OP doesn't seem to understand that basic fact, which is why it needs to be explained. How benchmarks respond to more CPU or GPU resources comes down entirely to bottlenecks, and understanding that relationship is important for understanding why benchmark graphs look the way they do, because they can imply that you're GPU- or CPU-bound. You're right, you're always going to be constrained by something, but sometimes one particular thing dominates all the others, and we see this very often with games.
Exactly. That ChatGPT answer, on the other hand, implies that bottlenecking means something's wrong. It's a common misconception, so I can't blame ChatGPT for giving that answer. All I'm saying is, it's false.
 
Joined
Jun 29, 2023
Messages
615 (1.04/day)
System Name Gungnir
Processor Ryzen 5 7600X
Motherboard ASUS TUF B650M-PLUS WIFI
Cooling Thermalright Peerless Assasin 120 SE Black
Memory 2x16GB DDR5 CL36 5600MHz
Video Card(s) XFX RX 6800XT Merc 319
Storage 1TB WD SN770 | 2TB WD Blue SATA III SSD
Display(s) 1440p 165Hz VA
Case Lian Li Lancool 215
Audio Device(s) Beyerdynamic DT 770 PRO 80Ohm
Power Supply EVGA SuperNOVA 750W 80 Plus Gold
Mouse Logitech G Pro Wireless
Keyboard Keychron V6
VR HMD The bane of my existence (Oculus Quest 2)
You're not wrong. But I'd still much rather get the 6600 XT. A more modern architecture usually means better efficiency and longer driver support.
Exactly, and we can see this happening now with Alan Wake 2 and the importance of mesh shaders, which by themselves are already a major dealbreaker, because they WILL become prominent and be used for performance gains
 

Aquinus

Exactly. That ChatGPT answer, on the other hand, implies that bottlenecking means something's wrong. It's a common misconception, so I can't blame ChatGPT for giving that answer. All I'm saying is, it's false.
I don't read anything from that response that says that it's bad. It says that understanding this is important regarding optimizing computer systems which isn't wrong. That's not a bad thing, it's just a reality.

I guess what I'm saying is that I think what you derived from the statement isn't actually what the statement says which is why I'm pushing back.

Let me put it this way, something doesn't have to be bad for you to be able to make it better.
 
Joined
Jan 14, 2019
I don't read anything from that response that says that it's bad. It says that understanding this is important regarding optimizing computer systems which isn't wrong. That's not a bad thing, it's just a reality.

I guess what I'm saying is that I think what you derived from the statement isn't actually what the statement says which is why I'm pushing back.

Let me put it this way, something doesn't have to be bad for you to be able to make it better.
From the answer you quoted:
  • "This bottleneck prevents the system from operating at its maximum potential"
  • "...the CPU becomes the limiting factor, causing slowdowns in processing and overall system performance."
  • "...the GPU's processing power is not sufficient to handle the workload, leading to reduced frame rates and graphics performance."
These sound pretty negative, and aren't necessarily true.
  • A GPU bottleneck can occur even at 100+ or 200+ FPS as long as the GPU is loaded to 100% of its capacity. What is and isn't sufficient is determined by the user, not the component.
  • Similarly, a CPU bottleneck doesn't necessarily cause slowdowns.
  • Also, a bottleneck doesn't prevent the system from operating at its maximum potential. Quite the opposite: in a bottleneck situation, a given component works at its maximum potential while the other components wait. The "maximum potential" is always determined by the slowest component needed to complete the given task. There is always one component that is the slowest, unless your computer's performance is infinite, which would defy physics.
 

Aquinus

From the answer you quoted:

These sound pretty negative, and aren't necessarily true.
  • A GPU bottleneck can occur even at 100+ or 200+ FPS as long as the GPU is loaded to 100% of its capacity. What is and isn't sufficient is determined by the user, not the component.
  • Similarly, a CPU bottleneck doesn't necessarily cause slowdowns.
  • Also, a bottleneck doesn't prevent the system from operating at its maximum potential. The "maximum potential" is always determined by the slowest component needed to complete the given task. There is always one component that is the slowest, unless your computer's performance is infinite, which would defy physics.
Man, stop taking everything out of context and read the entire thing. It even says:
a bottleneck refers to a point of congestion or restriction in the flow of data or operations that hinders the system's overall performance.
If performance is fine for the task at hand, then this statement would imply that you don't have a bottleneck.

If you're running at 200+ FPS, you don't have a performance issue, thus this statement would imply no (meaningful) bottleneck.

Also man, given your responses, I think you need to chill. Saying something is stupid out of the gate is probably not a good conversation starter. Particularly if you're picking the statement apart in a myopic way. I normally have respect for your opinions, but not right now.
 
Joined
Jan 14, 2019
Man, stop taking everything out of context and read the entire thing. It even says:

If performance is fine for the task at hand, then this statement would imply that you don't have a bottleneck.

If you're running at 200+ FPS, you don't have a performance issue, thus this statement would imply no (meaningful) bottleneck.

Also man, given your responses, I think you need to chill. Saying something is stupid out of the gate is probably not a good conversation starter. Particularly if you're picking the statement apart in a myopic way. I normally have respect for your opinions, but not right now.
You're just as much picking the statements apart as I am. You may praise ChatGPT for the correct parts of the answer, but I won't, as someone without experience (such as OP) won't be able to do the same. As long as ChatGPT gives the wrong answers that I quoted, I'll say that it's wrong, and I don't care about what else it gets right. I give no "good enough" labels to half-correct statements, especially from a computer program that people think knows everything available on the internet (which is obviously not the case).
 
Joined
Jun 29, 2023
Why did anyone even decide to use ChatGPT tho lmao, what use is there when a simple Google search will provide a way more accurate answer, or the people here, who have provided better answers
 
Joined
May 13, 2008
Messages
812 (0.13/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
You're on that website.

You start with the performance of your CPU, because you can't "turn down" CPU-intensive settings in a game: that'd mean kicking half the players out or killing half the units, which isn't possible.
You can lower GPU settings; a 4090 on ultra is no different from a 3090 on medium, or a 5700 XT on lower settings again.

Find the latest CPU review and look for minimum FPS. The rest is up to you: pair with any GPU you want and lower settings until you're happy with the FPS. Preferably run an FPS cap to keep things within the 1% low range of the CPU, and you'll get stutter-free happiness.
Notice how nothing can really reach 144FPS, despite high refresh displays being a big deal? This is why they don't matter yet.


Intel Core i9-14900K Review - Reaching for the Performance Crown - Minimum FPS / RTX 4090 | TechPowerUp
4K results are what matter the most IMO, because they show what they'll be like under a more demanding load, which you can use as an example of what next-gen games will run like in the coming years, even at lowered settings.

This is pretty much how I go about it. It's absolutely not a perfect science, as some people prefer seeing where CPU limitations exist at lower resolutions so they can run at higher framerates, and I can understand some other critiques of it, but I do think it's fairly sound. One really does start to see how certain CPUs/GPUs pair together. FWIW, I think Blackwell (and Navi 5?) will assume, or best be matched with, at least a 7600/5800X3D. Not a huge difference, but enough to let the GPU makers skimp a little on the expected generational uplift to obtain desirable framerates (60/120 Hz mins at a given resolution).
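That pairing approach can be sketched as a tiny procedure (all the preset names and FPS numbers below are hypothetical, just to show the shape of the method):

```python
# Sketch of the approach above: take the CPU's minimum FPS from a CPU
# review, pick the highest-quality GPU preset that still meets the FPS
# target, and cap the frame rate below the CPU's 1% lows.
# Every number here is made up for illustration.

CPU_MIN_FPS = 130  # hypothetical 1% lows from a CPU review

# Hypothetical GPU results for one game, highest quality first.
GPU_FPS_BY_PRESET = {
    "ultra": 90,
    "high": 120,
    "medium": 160,
    "low": 210,
}

def pick_preset(target_fps):
    """Return the highest-quality preset that meets the target FPS."""
    for preset, gpu_fps in GPU_FPS_BY_PRESET.items():
        # Effective FPS is limited by whichever of CPU/GPU is slower.
        if min(gpu_fps, CPU_MIN_FPS) >= target_fps:
            return preset
    return "low"  # nothing meets the target; take the fastest preset

preset = pick_preset(120)
fps_cap = CPU_MIN_FPS - 5  # cap slightly below the CPU's 1% lows
print(preset, fps_cap)     # high 125
```

The `min(gpu_fps, CPU_MIN_FPS)` line is the whole method in one expression: whichever side is slower decides what you actually get.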

I also think it's one of those things where many modern cards assume you will be pairing them with a CPU that is faster than a 12400/13400; a 'gaming'/'performance' CPU from the last couple of years. You'll notice the very delineated gap there that exists pretty much nowhere else in the (low-priced) stack. You start to notice they want something around 112-113 FPS for many GPUs to hit certain thresholds, or perhaps something around a 5600X3D or an overclocked 6/8-core 5000-series. I feel like that's a pretty good bet for where the PS5 Pro will land in terms of CPU performance as well. If you figure the PS5 is somewhere around a 3700X (98.5), it's likely the 3.5 GHz (max) clock will be upped to ~4 GHz, which in theory puts it right in line with what I'm talking about. If you want to really get into the weeds, I think the CPU/GPU will be on a ~2/3 divider, so if the GPU is ~2600 MHz, the CPU will be ~3900 MHz. If the GPU is ~2.67 GHz, the CPU 4 GHz. If the GPU is 2.733 GHz, the CPU 4100 MHz. If the GPU is 2800 MHz, the CPU 4.2 GHz. I also think when someone looks at many current games at 1440p (Hogwarts, Alan Wake 2, Spider-Man 2), you start to get a very good idea of where that GPU will perform (although the assumed doubling of GPU perf + arch improvements also gets you there). Those appear like games literally waiting on a patch for 1440p/60 with nice settings on a Pro, and I'd be willing to bet that if they weren't designed with that in mind, they certainly have comparable PC settings that will slot in nicely.
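The speculated ~2/3 clock divider above is easy to check arithmetically; these clock pairs are the post's own guesses, not confirmed specs:

```python
# Checking the speculated ~2/3 GPU:CPU clock divider: if the GPU clock
# is 2/3 of the CPU clock, then CPU clock = GPU clock * 3/2.
# All figures are speculation from the discussion, not real hardware specs.

def cpu_clock_mhz(gpu_clock_mhz):
    """CPU clock implied by a 2/3 GPU:CPU divider."""
    return gpu_clock_mhz * 3 / 2

for gpu in (2600, 2666.67, 2733.33, 2800):
    print(f"GPU {gpu:.0f} MHz -> CPU {cpu_clock_mhz(gpu):.0f} MHz")
# GPU 2600 MHz -> CPU 3900 MHz
# GPU 2667 MHz -> CPU 4000 MHz
# GPU 2733 MHz -> CPU 4100 MHz
# GPU 2800 MHz -> CPU 4200 MHz
```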

Again, is that a perfect science? No. Could it be *slightly* different, just as the PS5 is not on a perfect ratio? Absolutely; it's just a rule of thumb that should get you pretty close, within a pretty small margin of error. It's what I consider well-informed speculation, and it should give people an idea of where they would want to be to have an adequate CPU/GPU ratio for the console-like settings (which may actually be 'high' on the Pro) for the next few years. TL;DR: They want you to buy a 12600K/7600/5800X3D or better, and a (for now) 7900 XT/4080 (soon to be Navi 4, 4070 Ti 16GB, or perhaps Battlemage), but you'll probably be able to get by with an overclocked 5800X/7800 XT. Maybe even a 5600X, given not all 16 threads are used for gaming in a PlayStation; I don't know exactly how a PS5 Pro will line up with how those overclock (say 4.7 GHz)... it should be close. It's one of those things where the stock 7800 XT may overclock to 2721 MHz, and a stock 4070 Ti is 2730 MHz. Might the console be literally 7680/2.733 GHz so they could both claim a meaningless win and differentiate the market (even if not really)? It might be exactly that. They *want* people to upgrade, but if you're savvy there's always a cheaper way.

I'm also very curious how next-gen CPUs (say, Zen 5) pair with MANY GPUs. I pity W1zard for the work he'll have to do, but it should be fascinating. While there are, and certainly will be, cases of straight GPU/VRAM bottlenecks, I'm curious whether cards that were very clearly intended for a certain market (nVIDIA GPUs sure appear to drift from ~60 FPS at launch to 50-55 FPS in newer titles, or say a 4090 at 4K 100-<120 FPS) will be bumped up in their minimums. For instance, AMD did everything in their power to make sure the 7800 XT is not compared to the 4070 Ti, as they very clearly wanted market differentiation there between it and the 7900 XT/probably Navi 4. Will a Zen 5 + 7800 XT outperform a Zen 4 + 4070 Ti, or achieve certain threshold performance in cases it may not have when both were using the same expected (older) platform? I think it's very possible.

In that case, a CPU upgrade *could* potentially save you a GPU upgrade (or negate price differences for a certain level of perf between the companies), which I think is rather interesting. Will most people think about that? Probably not.

I *think* this is probably why AMD aims to release GPUs around the time of, but after, new CPUs. They probably expect many people to bench the GPU on the new CPU, which may make the GPU look better (comparatively) than it may have on the older systems. For instance, maybe in AW2 a Navi 4 goes from being able to run 4K FSR Balanced on Zen 4 to FSR Quality on Zen 5 with 60 FPS mins, but people will only notice that the GPU is capable of the higher-end performance because of a new bench platform for many reviewers. What many won't think to realize is that the 7800 XT may have gone from 55 FPS 1440p mins to 60 FPS 1440p mins on the new platform as well. I don't think many people think about the fact that people like W1zard use close to the best, if not the best, other side of the equation when testing current CPUs/GPUs, but both of those things evolve over time. Similarly, they may not be running *quite* as high-end of a CPU at any given time, which can/does make a variable amount of difference depending upon a certain person's focus on a particular resolution.

Now, could a lot of these differences (in gaming) be solved by overclocking the old platform to get to the new stock performance? Sure, but that's not what's on the chart, and not all people do that.

The whole thing can come across as rather precarious, but it really isn't. I generally try to assume baseline absolute (OC) performance for GPUs, say something like 2850/22000 on a 7900 XT, which can make a fairly noticeable difference in 'playable' (accepted as 60 Hz, although I know not everyone agrees) settings, and then think about the range that would perform across the most common CPUs, which generally isn't a huge gap as long as you're in that 'performance' category. It certainly can matter *just* enough, though, which is why I think it's important for each person to look at the chart you pointed to (and other resolutions) based upon their goals. You can generally get a fair idea of what you need to pair to get what you want in *most* circumstances, as the most demanding games have a fairly consistent demand for performance, and it doesn't change that often. It's mostly dictated by consoles (and then building on top of them to an extent to bridge the gap between GPU perf levels and expected resolutions), and while the console resolution goals shift slightly over time (what might have been a 4K goal for cross-gen may have become 1440p60 currently, and by the end of the gen that may become 1080p or a similar-ish FSR resolution and/or 30 FPS, especially after the introduction of the 'Pro'), it's fairly predictable, and does often match up with predictable tiers of GPUs to get performance over that baseline, as long as, again, you are in that 'performance' level of CPU.

I wish I knew how to write that in a more concise manner, I apologize for that. Hopefully it makes sense.

Perhaps I over-complicated it, but it is kind of interesting, IMHO. When you dig into it, you start to see the tricks all companies play so certain configurations will or won't be adequate (especially over time), as it has become somewhat predictable if you really crunch the numbers. The fact is, though, most people don't, or won't.
 

Aquinus

Exactly, and we can see this happening now with Alan Wake 2 and the importance of mesh shaders, which by themselves are already a major dealbreaker, because they WILL become prominent and be used for performance gains
Then you missed the conversation because the OP could have started by googling what a bottleneck is. That was my original point, I just used ChatGPT instead of Google to make that point, but if it makes you and @AusWolf feel better, I can use LMGTFY next time if you insist that I be more of a dick about it.
You're just as much picking the statements apart as I am. You may praise ChatGPT for the correct parts of the answer, but I won't, as someone without experience (such as OP) won't be able to do the same. As long as ChatGPT gives the wrong answers that I quoted, I'll say that it's wrong, and I don't care about what else it gets right. I give no "good enough" labels to half-correct statements, especially from a computer program that people think knows everything available on the internet (which is obviously not the case).
Only because you did, bub. You need to chill. Google isn't perfect either, but that's beside the point.
 
Joined
Feb 20, 2019
Messages
8,689 (3.99/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
This is a pretty dangerous statement. Look at Alan Wake 2.

All bets are off wrt next-gen engines and games. The best approach is 'get a bit of headroom in performance' and 'get sufficient VRAM'. Beyond that, you can't really say a thing about what your perf will look like based on card performance at a different res. More often than not, resolution upgrades aren't that painful on performance if you can already get good frames; it's a simple relative hit: 25% more pixels? 75% of your perf left, or more.

Engine upgrades, on the other hand... they can simply place whole generations of cards out of the game just like that. RDNA1 on Alan Wake 2 is a good recent example. Required technologies and feature sets have the most influential impact on GPUs. Consider, for example, also stuff like having DLSS/FSR support on your card.
All bets are indeed off with the next generation of game engines, but I'm willing to bet that AMD are the safest bet going forwards. Developers focus on Xbox and PlayStation first, which means that a game absolutely must run well on RDNA2.

If you have RDNA2 feature parity or better, you're in a good position until the next generation of consoles comes along. Unsurprisingly, the best way to get RDNA2 feature parity is to buy RDNA2...

You're not wrong. But I'd still much rather get the 6600 XT. A more modern architecture usually means better efficiency and longer driver support.
The trade-off is that a 5700XT is much cheaper than a 6600XT, even used vs used.
A couple of decades ago I was broke and performance/$ was the only metric that mattered. Future-proofing and long-term thinking become irrelevant when you have no money.
 
Joined
Jan 14, 2019
Only because you did, bub. You need to chill. Google isn't perfect either, but that's beside the point.
Sure, but if someone asks me a question (or just a general question here on TPU), then I'll try to answer it to the best of my knowledge, and not just copy-paste some half-correct, rehashed bollocks from ChatGPT or Google.

Edit: I'm not saying that googling or using ChatGPT is ultimately wrong, but c'mon, man... we're a forum of PC enthusiasts, we can do so much better than that! :)
 
Joined
Sep 17, 2014
Messages
23,173 (6.10/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Why did anyone even decide to use ChatGPT tho lmao, what use is there when a simple Google search will provide a way more accurate answer, or the people here, who have provided better answers
Forum being a forum :lovetpu:
 

Aquinus

Edit: I'm not saying that googling or using ChatGPT is ultimately wrong, but c'mon, man... we're a forum of PC enthusiasts, we can do so much better than that! :)
I don't know about you, but I expect a person to do an ounce of research before asking questions on a forum. This thread feels like the OP skipped that step. That's why I started with that first basic step. I'm happy to have an in-depth discussion about something, but I can't do that if people are just expecting to have answers handed to them without actually trying to understand the problem. That's basically what I'm trying to get at. Maybe I'm setting the bar a bit high, but I do expect people to try googling the question or something before asking it.

Forum being a forum :lovetpu:
Sure, but I do expect somebody to try and understand something before asking the question. A google search probably could have answered this and that was my point.
 
Joined
Jan 14, 2019
I don't know about you, but I expect a person to do an ounce of research before asking questions on a forum. This thread feels like the OP skipped that step. That's why I started with that first basic step. I'm happy to have an in-depth discussion about something, but I can't do that if people are just expecting to have answers handed to them without actually trying to understand the problem. That's basically what I'm trying to get at. Maybe I'm setting the bar a bit high, but I do expect people to try googling the question or something before asking it.
I normally agree, but...
1. "Why this graph looks like that" is kind of a complex question that is hard to research. If you know what bottlenecking is, you can google it and read about it all day and night. But without that knowledge, what do you google? "Graphs"? Also, googling requires some basic knowledge to be able to filter out useless material. Without that, it's like looking for a tree in a forest without a map.
2. Like I said, we're a PC enthusiast forum, it is exactly places like this where such questions should be asked. If we turn people away by saying "just google it", or "here's ChatGPT's answer", then what is there to talk about? It's a bit like going into a sports bar, starting a conversation about the latest results in X sport, and people turning you down by saying "just watch the damn game".

A google search probably could have answered this and that was my point.
A google search could have given lots of correct and incorrect answers, from which you need to filter using your knowledge and experience, which OP doesn't necessarily have. For that reason, OP did very well to ask the question here instead of googling it, imo. It's not about expecting people to deliver the answer on a silver platter without any effort. It's more about choosing the right place to ask the question.
 

Aquinus

A google search could have given lots of correct and incorrect answers, from which you need to filter using your knowledge and experience, which OP doesn't necessarily have. For that reason, OP did very well to ask the question here instead of googling it, imo. It's not about expecting people to deliver the answer on a silver platter without any effort. It's more about choosing the right place to ask the question.
Correct me if I'm wrong, but usually when looking at hardware reviews, there typically is a bit of an explanation of the test describing what we're seeing, if not on the page with the actual metrics, then on the summary page or at the end of that particular segment. I know that I often see this in written content, like from W1zz over here or Michael at Phoronix. I see the same kinds of descriptions coming from Roman at Der8auer and from LTT as well when talking about gaming performance in video reviews. The screenshot the OP provided is of a YouTube video. You can see the pause button and playback slider on the bottom, so it would be interesting to see what video that was and if they ever addressed that. I suspect that they did, if it's a half-decent outlet. (Or maybe they didn't? Bad assumption?)

So, all in all, I see where you're coming from. I'm just not sure if I agree with it since it's a video review and they're probably explaining all of this while going through it, at least if it's a half decent hardware review.
 
Joined
Jan 14, 2019
Correct me if I'm wrong, but usually when looking at hardware reviews, there typically is a bit of an explanation of the test describing what we're seeing, if not on the page with the actual metrics, then on the summary page or at the end of that particular segment. I know that I often see this in written content, like from W1zz over here or Michael at Phoronix. I see the same kinds of descriptions coming from Roman at Der8auer and from LTT as well when talking about gaming performance in video reviews. The screenshot the OP provided is of a YouTube video. You can see the pause button and playback slider on the bottom, so it would be interesting to see what video that was and if they ever addressed that. I suspect that they did, if it's a half-decent outlet. (Or maybe they didn't? Bad assumption?)

So, all in all, I see where you're coming from. I'm just not sure if I agree with it since it's a video review and they're probably explaining all of this while going through it, at least if it's a half decent hardware review.
Good point. The screenshot shows Hardware Unboxed's design, who talk about these things while showing such graphs.

Oh well, at least the question has been answered multiple times. :ohwell:
 
Joined
Oct 2, 2020
Messages
1,068 (0.67/day)
System Name ASUS TUF F15
Processor Intel Core i7-11800H
Motherboard ASUS FX506HC
Cooling Laptop built-in cooling lol
Memory 24 GB @ 3200
Video Card(s) Intel UHD & Nvidia RTX 3050 Mobile
Storage Adata XPG SX8200 Pro 512 GB
Display(s) Laptop built-in 144 Hz FHD screen
Audio Device(s) LOGITECH Z333 2.1-channel
Power Supply ASUS 180W PSU
Mouse Logitech G604
Keyboard Laptop built-in keyboard
Software Windows 10 Enterprise 20H2
But it's shown that the 6600 and 5700 XT are quite similar: only a small boost in performance and fewer watts used. I saw that it was between 3-5 FPS on average. I would take the 5700 XT if it wasn't for the power draw, as many people don't take into consideration the running cost over the time of use, and GPUs are getting crazy high TDPs. It's a huge factor in my purchases.

Still curious why, in the graph, the 5700 XT doesn't give more FPS to the 5600X vs the 3600X. The 3600X may be a weak link, but in comes the 5600X, and it should be a bit more. Something is off there. Is it that all of those CPUs are stronger than the 5700 XT and they all max it out? Seems weird that a gen-1 Ryzen 1600X maxes out a 5700 XT, which is much newer, no?

Thanks man, much appreciated. Someone should consider making a site that does pairings with charts and stuff. There are too many reviewers as is; weird there isn't one.
A GPU isn't an "investment", LMFAO. And yeah, 6600 ≈ 5700...
 