
Is CPU game benchmarking methodology (currently) flawed?

HTC

Joined
Apr 1, 2008
Messages
4,664 (0.76/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 5800X3D
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Pulse 6600 8 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 20.04.6 LTS
After watching this video, i'm not so sure anymore:


He claims that the "ancient" FX8350 became faster than the i5 2500K simply by testing with a much, much faster graphics card, and has shown that, with each increase in GPU performance, the gap to the 2500K narrows until it's reversed. To be fair, i personally have some doubts because there are some variables he didn't take into account, such as differences in motherboards, memory, storage speed and drivers used, which may or may not contribute to the overall result, not to mention that in the final part he used the results of the FX8370 instead (due to not having the FX8350 in those benchmarks).

This finding contradicts the current thinking that, in order to take the GPU out of the equation, one needs to test @ lower resolutions / details so that one can say X processor is better than Y processor @ gaming based on the results one gets, and that any change made to the graphics card will never change the outcome. I understand the logic ... but is this really true?
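To make the logic i'm questioning a bit more concrete, here's a minimal sketch (made-up numbers, purely for illustration) of the usual assumption: the framerate you get is capped by whichever of the CPU or GPU is slower, so lowering the resolution lifts the GPU cap and exposes the CPU's limit.

```python
# Minimal sketch of the usual "bottleneck" reasoning behind low-res CPU testing.
# All numbers are invented for illustration, not measured results.

def framerate(cpu_fps_limit, gpu_fps_limit):
    """Simplified model: delivered framerate is capped by the slower component."""
    return min(cpu_fps_limit, gpu_fps_limit)

# Hypothetical CPU limits (frames each CPU can prepare per second):
cpu_a, cpu_b = 90, 120

# At high resolution the GPU is the limit, so both CPUs look identical:
print(framerate(cpu_a, gpu_fps_limit=60), framerate(cpu_b, gpu_fps_limit=60))    # 60 60

# At low resolution the GPU limit rises far above both CPUs, exposing the gap:
print(framerate(cpu_a, gpu_fps_limit=300), framerate(cpu_b, gpu_fps_limit=300))  # 90 120
```

Under this simple model, a faster graphics card could only reveal a CPU gap or leave it unchanged, never reverse it: the video's claim is precisely that real systems don't behave like this model.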

You may think: why am i even thinking about this with "ancient" CPUs? Because you can apply the same principle to current CPUs: if the testing methodology is indeed wrong, then everyone referencing reviews that use it is being misled (even though not intentionally), and this means a new way to test a gaming CPU must be found and this one scrapped.

And so i propose that our very own @W1zzard tests this (@ his convenience) and answers this question so that there's zero doubt. Use a 2500K and an 8350 and pair them with a 1080Ti, a 980Ti, a 680Ti (2 generation gap between each card, i think): only change the boards and CPUs while trying to keep the rest the same, if @ all possible (including drivers), so that there are fewer variables to interfere with the final results.


EDIT

I think this topic can be closed now because someone has tested this and found the 2500K to be faster on all but 1 title (of 16 tested):




It seems the methodology still holds: as such, no point in keeping this topic open, i think.

EDIT #2

The plot thickens ...

I do admit that i missed the very important bit that the games being tested were different, which was the point. I'm also to blame in the sense that i took HUB's numbers as "gospel" (like Adored calls it) in order to "prove" that the methodology wasn't flawed after all.

His more recent video shows that current CPU benchmarking indeed is flawed (in this title @ the very least). Let me however say that what i'm trying to show is not the Intel VS AMD CPU performance bit, but the difference you get in a supposedly CPU-bottlenecked game when changing from an nVidia card to an AMD one on BOTH CPUs:


View attachment 85800

You can't have it both ways: either the CPU is being bottlenecked or it isn't. Adored showed that both Intel and AMD benefited from changing from nVidia to AMD in DX12 and both lost in DX11. That the gap shrunk in DX12 is not the issue i'm trying to address: that there is a gap at all is the issue i'm trying to address. This proves that the CPU wasn't being bottlenecked after all, or there wouldn't have been an increase on both CPUs: there was another variable that wasn't being accounted for.
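One candidate for that variable is the CPU cost of the graphics driver itself. Here's a rough sketch (the numbers are invented, only to illustrate the idea) of how a per-vendor, per-API driver overhead on the CPU side would shift a supposedly "CPU-bound" result when you swap cards, even though the GPU itself is never the limit:

```python
# Sketch of the "hidden variable" idea: if the graphics driver adds its own CPU-side
# cost per frame, the supposedly CPU-bound framerate shifts when you swap GPU vendor
# or API, even though the GPU never becomes the limit. Numbers are invented.

GAME_CPU_MS = 6.0  # hypothetical CPU time the game itself needs per frame

# Hypothetical per-frame CPU cost added by the driver, keyed by (vendor, API):
driver_overhead_ms = {
    ("nvidia", "dx11"): 1.0,
    ("nvidia", "dx12"): 2.0,
    ("amd",    "dx11"): 2.5,
    ("amd",    "dx12"): 0.5,
}

for (vendor, api), overhead in driver_overhead_ms.items():
    fps = 1000.0 / (GAME_CPU_MS + overhead)
    print(f"{vendor} {api}: ~{fps:.0f} fps while 'CPU-bound'")
```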

But there are variables here because of which i think more testing is definitely required: he tested crossfire VS single card and that introduces another variable that doesn't have to be present: CF scaling. It also seems RotTR isn't the only game: it happens in The Division under DX12 too, that i know of so far.

In fact, you don't even need to use 2 CPUs to test if this is true or not, but you do need an nVidia and an AMD card, as well as the RotTR game: just run RotTR using DX11 and DX12 with settings you're absolutely sure will bottleneck the CPU, with both cards @ stock and then with the highest overclock on the cards you can get.

If the CPU is bottlenecked "properly", then going from a stock nVidia card to an OCed one should yield margin-of-error differences, and the same should be true for a stock AMD card VS an OCed one, but if the comparisons between manufacturers are a lot higher than margin of error, then you'll have your proof right there.
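Something like this is the check i have in mind, sketched with made-up placeholder numbers (the 3% margin is just an assumption: use whatever repeat runs show as actual run-to-run noise):

```python
# Sketch of the comparison logic proposed above, with made-up placeholder numbers.
# A pair of results is treated as "within margin of error" if they differ by < ~3%.

MARGIN = 0.03  # assumed run-to-run noise; replace with what repeat runs actually show

def within_margin(fps_a, fps_b, margin=MARGIN):
    return abs(fps_a - fps_b) / min(fps_a, fps_b) <= margin

# Hypothetical averages for one CPU-bound scene (replace with real measurements):
results = {
    ("nvidia", "stock"): 112.0,
    ("nvidia", "oc"):    113.5,   # stock vs OC on the same card should be ~equal if CPU-bound
    ("amd",    "stock"): 124.0,
    ("amd",    "oc"):    125.0,
}

print("NV stock vs OC within margin:", within_margin(results[("nvidia", "stock")], results[("nvidia", "oc")]))
print("AMD stock vs OC within margin:", within_margin(results[("amd", "stock")], results[("amd", "oc")]))
print("NV vs AMD within margin:", within_margin(results[("nvidia", "stock")], results[("amd", "stock")]))
```

If the stock VS OCed comparisons land within margin but the nVidia VS AMD one doesn't, then the CPU wasn't the only limit after all.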

There's also this video that i found very interesting about CPU overhead in nVidia VS AMD:

 
Last edited:
Joined
Jan 8, 2017
Messages
568 (0.19/day)
System Name ACME Singularity Unit
Processor Coal-dual 9000
Motherboard Oak Plank
Cooling 4 Snow Yetis huffing and puffing in parallel
Memory Hasty Indian (I/O: 3 smoke signals per minute)
Video Card(s) Bob Ross AI module
Storage Stone Tablet 2.0
Display(s) Where are my glasses?
Case Hand sewn bull hide
Audio Device(s) On demand tribe singing
Power Supply Spin-o-Wheel-matic
Mouse Hamster original
Keyboard Chisel 1.9a (upgraded for Stone Tablet 2.0 compatibility)
Software It's all hard down here
Aaah, saw that. Yes, i actually believe/agree with it, to the extent that i'd need proper, hard data to be convinced otherwise (i.e. not the youtube/google kind).
Why? I lack the technical knowledge, granted, but for starters it's way too logical for me to dismiss, and also, empiricism. Have had many AMD chips, have sure worn them out testing things (games included). Can definitely attest to a marked improvement in later years.

Also, why are you surprised? Because Intel told us that going back in time to 2009* is best? And never mind our ignoring almost a decade of technological progress? :)

*that's when the first quad core came out. Yeap.

edit: the only thing i will not comment on is the TPU mention in this video. Because i am unsure as to whether there was copy-pasting involved in that article, or the OP actually believed what he was typing. It's hard to analyze intentions or infer meaning from just a few lines; i read, do my own thinking and move on.
 
Last edited:
Joined
Dec 16, 2012
Messages
540 (0.12/day)
Processor AMD Ryzen R7 5800x
Motherboard B550i Aorus Pro AX
Cooling Custom Cooling
Memory 32Gb Patriot Viper 3600 RGB
Video Card(s) MSI RTX 3080 Ventus Trio OC
Storage Samsung 960 EVO
Display(s) Specterpro 34uw100
Case SSUPD Meshlicious
Power Supply Cooler Master V750 Gold SFX
Mouse Glorious Model D Wireless
Keyboard Ducky One 2
VR HMD Quest 2
Software Windows 11 64bit
I've been using an OCed FX8320 for years and it still works, but I know I can get more because I can feel some hiccups. It really depends on the engine used and the testing methodology. Scenes where CPU processing is important are really hard to replicate. Games like GTA V, WD2 or any multiplayer games can't be benchmarked. No idea what's the best way to test it.
 
Joined
Feb 2, 2015
Messages
2,707 (0.75/day)
Location
On The Highway To Hell \m/
Very eye-opening. Thanks for sharing that with us. :toast:
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.86/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
He seems to be conflating two things: low res testing and performance improvement with older AMD CPUs over time to make for a bogus argument.

Now, it's 5am here and I'm too tired to watch the video again and analyze things in detail, but it seems that the improved AMD FX performance is coming from newer games which are better optimized for it (and no doubt better drivers and perhaps OS patches) and using that as an argument to claim that low res testing is wrong. That's bullshit. It's obvious that if you want to know what framerate a CPU can achieve, then the graphics card must be taken out of the picture and hence tested at low resolution.

Heck, I remember Unreal Tournament 2003 had a special benchmarking mode which really did take the graphics card out of the picture. It simulated an infinitely fast graphics card by not actually sending the draw calls to it, instead just returning immediately, so all you saw was a static picture while the game benchmark ran. AFAIK this is the only game to have this feature and more games should have it too.
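For anyone who hasn't seen it, the idea works roughly like this toy stand-in (not UT2003's actual code, just the concept): the renderer accepts draw calls but does no GPU work, so the measured framerate reflects only the CPU-side cost of preparing frames.

```python
import time

class NullRenderer:
    """Toy stand-in for the idea described above: accept draw calls but do no GPU work,
    so the loop's throughput reflects only the CPU-side cost of preparing frames."""
    def submit(self, draw_call):
        pass  # return immediately instead of sending the call to the GPU

def run_benchmark(renderer, frames=1000):
    start = time.perf_counter()
    for _ in range(frames):
        draw_calls = [("mesh", i) for i in range(200)]  # pretend CPU-side scene preparation
        for call in draw_calls:
            renderer.submit(call)
    elapsed = time.perf_counter() - start
    return frames / elapsed  # "infinitely fast GPU" framerate: CPU-limited only

print(f"CPU-limited FPS: {run_benchmark(NullRenderer()):.0f}")
```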

What he should be doing is testing those same games that were used when FX was initially benchmarked, but with modern graphics cards and drivers, run at low resolution, yet he doesn't do that. That's not comparing like with like and therefore invalid. Judging by those YouTube comments, it looks like he's fooled a lot of people and managed to gain thousands of subscribers, but not me.

He sounds like a real AMD apologist and fanboy.
 

HTC

Joined
Apr 1, 2008
Messages
4,664 (0.76/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 5800X3D
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Pulse 6600 8 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 20.04.6 LTS
He seems to be conflating two things: low res testing and performance improvement with older AMD CPUs over time to make for a bogus argument.

Now, it's 5am here and I'm too tired to watch the video again and analyze things in detail, but it seems that the improved AMD FX performance is coming from newer games which are better optimized for it (and no doubt better drivers and perhaps OS patches) and using that as an argument to claim that low res testing is wrong. That's bullshit. It's obvious that if you want to know what framerate a CPU can achieve, then the graphics card must be taken out of the equation and hence tested at low resolution.

Heck, I remember Unreal Tournament 2003 had a special benchmarking mode which really did take the graphics card out of the picture. It simulated an infinitely fast graphics card by not actually sending the draw calls to it, instead just returning immediately, so all you saw was a static picture while the game benchmark ran. AFAIK this is the only game to have this feature and more games should have it too.

What he should be doing is testing those same games that were used when FX was initially benchmarked, but with modern graphics cards and drivers, run at low resolution, yet he doesn't do that. That's not comparing like with like and therefore invalid. Judging by those YouTube comments, it looks like he's fooled a lot of people and managed to gain thousands of subscribers, but not me.

Actually, he didn't do any tests @ all: what he did was grab the results of tests done on the same website and notice how the FX went from being X slower to Y faster by trading the card for much faster ones. BUT, because this wasn't done using the same equipment (boards, storage, games, memory speeds), there's a chance that this could be totally misleading OR could actually be even more pronounced: that's what this review would be hoping to answer, definitively.

You do however have a point when you say newer games benefit more from additional cores / threads VS older games, and it's also an aspect @W1zzard should consider if / when he makes this review: use older and newer titles in an effort to show if this happens only in more multithreaded games or across the board.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.86/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
Haha, guess I put it a little ambiguously lol, my bad. :) Indeed he didn't run any of those benchmarks himself. I meant to say that if he really wants to confirm this for himself then he should run a set of benchmarks himself to properly compare. If not, then at least find comparable reviews, but that's hard to do because things change over time, such as drivers etc, therefore he should really do it himself under controlled conditions. I strongly suspect that if he did this then the 2500K would still come out on top, but that's not what he wants to tell us.

Think, just how many thousands of enthusiasts there are out there motivated to compare products. They would start to notice the FX improvements, which would get picked up by the tech press such as TPU, but it's not happened. Therefore, what he's saying is a crock and I think he's got an agenda to make it look like AMD's products are getting hard done by Intel and everyone else, when the real blame lies with the products themselves.

Heck, just look at the Ryzen launch, all full of hype and promise. AMD and their partners, including Microsoft have had years to work with them to ensure a reasonably bug-free launch, but what do we see? BIOS issues preventing memory modules from working properly and a Windows 10 bug lowering overall performance due to SMT issues. How did Microsoft and more importantly AMD, not notice something so blatantly obvious before launch and just get it fixed? Especially since Windows 7 doesn't have this problem.

I tell you, if I'd dropped a grand plus on a Ryzen upgrade (CPU, mobo, RAM, cooler) I'd be pissed at seeing problems like this and might ask for my money back. I expect the thing to work properly out of the box. This is typical AMD and why I have no confidence in them. Funny how Intel CPUs haven't dropped in price much since Ryzen's launch and its "incredible" value proposition, isn't it? Yes, I think Ryzen does have that potential, but the launch has been poorly executed, ruining it. Looks like we'll have to wait for Ryzen v2 for an improved IPC that can compete head-on with Intel in games, but it's a good start, bugs notwithstanding.

There's also another thing that makes him quite the AMD apologist. When buying hardware, one should almost always buy what performs best right now, rather than a promise of better performance in the future. It's not guaranteed, there's no timescale and by the time that the improvement comes, if it does, the hardware will be out of date anyway. So, who cares if the FX works better than a 2500K now? They're both obsolete and can't even be purchased new any more.

I have literally put this philosophy into practice today. You might remember that my Palit GTX 1080 died a couple of months ago and I got a refund on it. Well, I was waiting for the 1080 Ti to come out before buying its replacement. What do I see? A card that costs an obscene £700+, is "only" 35% faster than a 1080 and an overclocked 7700K isn't fast enough to max the card out at 1080p a lot of the time (check TPU's review). Now, I still have a 2500K (at a rather non-enthusiast stock speed, but don't tell anyone :oops:) which is a lot slower than TPU's test rig, so I passed on the 1080 Ti and bought another 1080 today, taking advantage of the cheaper prices.

I've actually bought two models, as I can't quite tell which one I'm gonna keep just from the reviews and will return one to Amazon. Noise is all-important for me, so it's a toss-up between the Palit GTX 1080 GameRock (same as I had before) at £500 or the Zotac GTX 1080 AMP! Extreme at a cool £600 which @W1zzard said is the best GTX 1080 he's ever tested, in his review. Yes, it's somewhat more than I wanted to pay for a 1080 right now, but I really like it, so I'm biased towards keeping it, despite that steep £100 premium. If the coil whine performance is better on the Zotac then keeping it is a no-brainer.
 
Last edited:
Joined
Oct 2, 2004
Messages
13,791 (1.86/day)
That's not entirely true. Some people buy systems long term and for those it does matter how well hardware ages. And for those people DX12/Vulkan performance does matter. And so does the number of cores. You can be assured R7 1800X will age way better than 7700K. Clock per clock difference is minimal, but Ryzen is rocking twice as many cores. It doesn't require a genius to predict that it'll perform better for longer.
 
Joined
Jan 8, 2017
Messages
9,577 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Many games run into engine limitations and quirks which literally no one mentions. Also, running lower resolutions is hardly a good method to show CPU performance in games, or rather it simply doesn't tell you anything. It's the number of draw calls that hammers down a processor; running a lower resolution simply puts less stress on the GPU, so you get no extra information whatsoever about the CPU than if you were running the game on 3x4K panels.

Sure, you might say that by doing this the CPU is preparing as many frames as it can. True, but then you run into those engine limitations I talked about, so you still don't know the whole story. Take a look at Doom: it's capped at 200fps, yet because it is very well written it will happily prepare those 200 frames even on slower CPUs.
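In other words (toy numbers, just to illustrate the cap problem): once the engine cap sits below what either CPU could manage, the benchmark reports the cap, not the CPU.

```python
# Toy numbers showing how an engine frame cap (like the 200 fps cap mentioned above)
# hides CPU headroom even with the GPU effectively out of the picture.

def reported_fps(cpu_limit, gpu_limit, engine_cap):
    return min(cpu_limit, gpu_limit, engine_cap)

# Two hypothetical CPUs that could prepare 230 and 310 frames per second:
for cpu_limit in (230, 310):
    print(reported_fps(cpu_limit, gpu_limit=500, engine_cap=200))  # both report 200
```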

So in the end the only thing you have to make sure of when testing is to max out the game, which oddly enough some reviewers don't do all the time.
 
Last edited:

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,156 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
I'm assuming that's the Adored person? Enough said.

Can people stop posting links to revenue generating guff from non experts with opinions they get paid for?

Summarise a point and please give a simple link. Stop embedding YT links and giving no further info, it's a pain in the ass.
 
Joined
Nov 18, 2009
Messages
86 (0.02/day)
Location
Alberta, Canada
System Name Fluffy
Processor Ryzen 7 2700X
Motherboard Asus Crosshair VII Hero Wi-Fi
Cooling Wraith Spire
Memory 32Gb's Gskill Trident Z DDR4 3200 CAS 14
Video Card(s) Asus Strix Vega 64 OC
Storage Crucial BX100 500GB SSD/Seagate External USB 1TB
Display(s) Samsung CHG70 32" 144hz HDR
Case Phanteks ENTHOO EVOLV X
Audio Device(s) SupremeFX S1220 / Tiamat 7.1
Power Supply SeaSonic PRIME Ultra Titanium 750 W
Mouse Steel Series Rival 600
Keyboard Razer Black Widow Ultimate
Software Open Office, Win 10 Pro
Joined
Jan 8, 2017
Messages
568 (0.19/day)
System Name ACME Singularity Unit
Processor Coal-dual 9000
Motherboard Oak Plank
Cooling 4 Snow Yetis huffing and puffing in parallel
Memory Hasty Indian (I/O: 3 smoke signals per minute)
Video Card(s) Bob Ross AI module
Storage Stone Tablet 2.0
Display(s) Where are my glasses?
Case Hand sewn bull hide
Audio Device(s) On demand tribe singing
Power Supply Spin-o-Wheel-matic
Mouse Hamster original
Keyboard Chisel 1.9a (upgraded for Stone Tablet 2.0 compatibility)
Software It's all hard down here
While qubit makes valid points, the point remains.
- The more the software moves towards the direction we'd been expecting it to move, the more these old 'dinosaurs' / "horrible" (read:AMD) CPUs get the juice squeezed out of them, it being the point all along.
- The more "gamerzzzz" end up defining this market, the more skewed and convoluted said direction will be. If not halted that is, in terms of mentality. Case in point being the re-re-rehashing of 4cores, ad infinitum. Not to exclude AMD from this mind, they too are planning to sell those.
(when progress is defined by the needs of 15yr olds using software made for 15yr olds, this is what you get, or can get)

I do not think he missed the point or that he lacks the mental capacity to understand that he'd need proper testing for this. I think he just knows his audience, knows his platform (youtube) and he accommodates.
Let's face it, half a page long text and already people think oh, wall of text. Attention span issues abound.

edit: also, there is just enough flavor to this to justify it all. He's using exactly what the parrots have been using; except to showcase other, contradictory points. I do enjoy seeing that happen. Is he 100% right? No. Is he closer to an empirical truth than the others are? Why yes :)
 
Last edited:

HTC

Joined
Apr 1, 2008
Messages
4,664 (0.76/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 5800X3D
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Pulse 6600 8 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 20.04.6 LTS
While qubit makes valid points, the point remains.
- The more the software moves towards the direction we'd been expecting it to move, the more these old 'dinosaurs' / "horrible" (read:AMD) CPUs get the juice squeezed out of them, it being the point all along.
- The more "gamerzzzz" end up defining this market, the more skewed and convoluted said direction will be. If not halted that is, in terms of mentality. Case in point being the re-re-rehashing of 4cores, ad infinitum. Not to exclude AMD from this mind, they too are planning to sell those.
(when progress is defined by the needs of 15yr olds using software made for 15yr olds, this is what you get, or can get)

I do not think he missed the point or that he lacks the mental capacity to understand that he'd need proper testing for this. I think he just knows his audience, knows his platform (youtube) and he accommodates.
Let's face it, half a page long text and already people think oh, wall of text. Attention span issues abound.

edit: also, there is just enough flavor to this to justify it all. He's using exactly what the parrots have been using; except to showcase other, contradictory points. I do enjoy seeing that happen. Is he 100% right? No. Is he closer to an empirical truth than the others are? Why yes :)

IMHO, it's important to know if the current way to measure a gaming CPU's performance is flawed or not. If it's not, the discussion ends here but if it is, then steps need to be taken to come up with a way to benchmark it that is not flawed.

Haha, guess I put it a little ambiguously lol, my bad. :) Indeed he didn't run any of those benchmarks himself. I meant to say that if he really wants to confirm this for himself then he should run a set of benchmarks himself to properly compare. If not, then at least find comparable reviews, but that's hard to do because things change over time, such as drivers etc, therefore he should really do it himself under controlled conditions. I strongly suspect that if he did this then the 2500K would still come out on top, but that's not what he wants to tell us.

Think, just how many thousands of enthusiasts there are out there motivated to compare products. They would start to notice the FX improvements, which would get picked up by the tech press such as TPU, but it's not happened. Therefore, what he's saying is a crock and I think he's got an agenda to make it look like AMD's products are getting hard done by Intel and everyone else, when the real blame lies with the products themselves.

He's not a reviewer and doesn't have the necessary funds to get the required material to make a review himself, AFAIK. Different sites test with different hardware so it's not really viable to "find comparable reviews" so he did the "next best thing" by just using one site.

That's why i ask if @W1zzard can do it because if a reputable site such as TPU comes to the conclusion that the methodology is flawed (if indeed it is flawed), then the rest of the reviewers will pay attention and strive to come up with a way that's NOT flawed.
 
Joined
Jan 10, 2011
Messages
1,460 (0.29/day)
Location
[Formerly] Khartoum, Sudan.
System Name 192.168.1.1~192.168.1.100
Processor AMD Ryzen5 5600G.
Motherboard Gigabyte B550m DS3H.
Cooling AMD Wraith Stealth.
Memory 16GB Crucial DDR4.
Video Card(s) Gigabyte GTX 1080 OC (Underclocked, underpowered).
Storage Samsung 980 NVME 500GB && Assortment of SSDs.
Display(s) ViewSonic VA2406-MH 75Hz
Case Bitfenix Nova Midi
Audio Device(s) On-Board.
Power Supply SeaSonic CORE GM-650.
Mouse Logitech G300s
Keyboard Kingston HyperX Alloy FPS.
VR HMD A pair of OP spectacles.
Software Ubuntu 24.04 LTS.
Benchmark Scores Me no know English. What bench mean? Bench like one sit on?
Many games run into engine limitations and quirks which literally no one mentions. Also, running lower resolutions is hardly a good method to show CPU performance in games, or rather it simply doesn't tell you anything. It's the number of draw calls that hammers down a processor; running a lower resolution simply puts less stress on the GPU, so you get no extra information whatsoever about the CPU than if you were running the game on 3x4K panels.

Resolution limiting is an attempt to shift bottlenecks: it doesn't necessarily put more theoretical load on the CPU (i.e. it won't create different data to process), though practically it does.
Draw calls for each frame remain fixed no matter what resolution you run at; however, at higher resolutions the GPU becomes slower at executing the calls it receives, so we might -and often do- reach a point where the CPU sits idle while the GPU is busy rendering the frame from the previous batch of calls. Lower the resolution and -as you said- the stress on the GPU, and it becomes a keep-em-coming scenario: the greater the CPU IPC/clock, the more calls it sends per unit of time, the higher the overall framerate you get and the less frame-time variance you end up with. The last two are the only metrics for a CPU's performance in games (aside from PCIe bandwidth, which hasn't really been an issue since PCIe 2.0 days).
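As a rough model of that (illustrative numbers only): the CPU needs a fixed time per frame to build and submit its draw calls, the GPU's time per frame grows with pixel count, and the delivered FPS follows the slower of the two.

```python
# Rough model of the pipeline described above (illustrative numbers only): the CPU
# needs a fixed time per frame to issue its draw calls, while the GPU's time per
# frame grows with pixel count. Delivered FPS follows whichever is slower.

CPU_MS_PER_FRAME = 5.0           # hypothetical: time to build/submit the frame's draw calls
GPU_MS_PER_MEGAPIXEL = 4.0       # hypothetical: GPU cost scales with resolution

for name, megapixels in [("720p", 0.9), ("1080p", 2.1), ("4K", 8.3)]:
    gpu_ms = GPU_MS_PER_MEGAPIXEL * megapixels
    fps = 1000.0 / max(CPU_MS_PER_FRAME, gpu_ms)
    limiter = "CPU" if CPU_MS_PER_FRAME >= gpu_ms else "GPU"
    print(f"{name}: ~{fps:.0f} fps, {limiter}-limited")
```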

Do these metrics have any worth for real world use (@1080p+) though? That's a different matter.
 
Joined
Jan 8, 2017
Messages
568 (0.19/day)
System Name ACME Singularity Unit
Processor Coal-dual 9000
Motherboard Oak Plank
Cooling 4 Snow Yetis huffing and puffing in parallel
Memory Hasty Indian (I/O: 3 smoke signals per minute)
Video Card(s) Bob Ross AI module
Storage Stone Tablet 2.0
Display(s) Where are my glasses?
Case Hand sewn bull hide
Audio Device(s) On demand tribe singing
Power Supply Spin-o-Wheel-matic
Mouse Hamster original
Keyboard Chisel 1.9a (upgraded for Stone Tablet 2.0 compatibility)
Software It's all hard down here
IMHO, it's important to know if the current way to measure a gaming CPU's performance is flawed or not

And? Must you undergo tests on a professional level to grasp whether it is or not?
Those very tests (which to varying degrees we all rely on, let's be honest), all by themselves, show you that a "horrid" Bulldozer cpu was actually not really that horrid, but rather that it wasn't made to perform in the way a dual core did (assuming one couldn't grasp that much on their own).
That even more to the point (you care about gaming performance right?), when games actually -started- using 5, 6 and 7 cores, said 'horrid' CPUs turned out to outperform that amazing, uber, et al CPU everybody was telling you to buy. Seen anyone admit that out in the open? Nope? Wonder why?

Put differently, we have been using a comparison process that neglected the software limitations (or took them for granted, failing to even mention them) and called for absolute truths.
As in, buy not "because with the current monkeys (sorry i meant software programmers) it will run a game faster". For now. Which it did.
But rather, "buy because it's better". Absolute.

Which leads to what started all this. Not the 480 versus 720p, no. What started all this was questioning whether we directed an entire market's technological prowess into regressing back into what the monkeys were capable/willing of. Ergo whether, in answer to your question, our standards of "better" were actually rather flawed; or not, situation depending of course.
It's why i said no, he's not 100% right (IPC, frequency, they will always have a role to play). But yes, he has a point. Thing is, no one allowed us to see/work with that point, until very recently (last what? Couple of years? In terms of games?). Certain conclusions can now be made, regarding hype, direction, mentality and how this industry of satellites (read: blogs/"press", the bigger the market, the more of them and nevermind if they're worthy of a voice) functions.

Good stuff any day of the week i'd say :)

edit: as stated, self-evident to some. Am here to remind however that as it turns out, this wasn't that obvious to many, many people out there. And now that we have enough multi-core games, it is. Assuming they're willing to listen.
 
Last edited:

HTC

Joined
Apr 1, 2008
Messages
4,664 (0.76/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 5800X3D
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Pulse 6600 8 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 20.04.6 LTS
And? Must you undergo tests on a professional level to grasp whether it is or not?
Those very tests (which to varying degrees we all rely on, let's be honest), all by themselves, show you that a "horrid" Bulldozer cpu was actually not really that horrid, but rather that it wasn't made to perform in the way a dual core did (assuming one couldn't grasp that much on their own).
That even more to the point, when games actually -started- using 5 and 6 cores, said 'horrid' CPU turned out to outperform that amazing, uber, et al CPU everybody was telling you to buy. Seen anyone admit that out in the open? Nope? Wonder why?

Put differently, we have been using a comparison process that neglected the software limitations (or took them for granted, failing to even mention them) and called for absolute truths.
As in, buy not "because with the current monkeys (sorry i meant software programmers) it will run a game faster". For now. Which it did.
But rather, "buy because it's better". Absolute.

Which leads to what started all this. Not the 480 versus 720p, no. What started all this was questioning whether we directed an entire market's technological prowess into regressing back into what the monkeys were capable/willing of. Ergo whether, in answer to your question, our standards of "better" were actually rather flawed; or not, situation depending of course.
It's why i said no, he's not 100% right (IPC, frequency, they will always have a role to play). But yes, he has a point. Thing is, no one allowed us to see/work with that point, until very recently (last what? Couple of years? In terms of games?). Certain conclusions can now be made, regarding hype, direction, mentality and how this industry of satellites (read: blogs/"press", the bigger the market, the more of them and nevermind if they're worthy of a voice) functions.

Good stuff any day of the week i'd say :)

edit: as stated, self-evident to some. Am here to remind however that as it turns out, this wasn't that obvious to many, many people out there. And now that we have enough multi-core games, it is. Assuming they're willing to listen.

To remove doubts, yes: i'd say "professional level" testing is required to make sure this is or is not an issue.

A while back, everybody was using FRAPS and other such FPS-measuring tools, but then came PCPer (i think it was) and told us these programs were not measuring correctly and that what they were showing was not what was actually being experienced, and so they came up with a way to properly show this. We became aware that something we were taking for granted was flawed all along ... sound familiar?
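Same trap here: a plain average can look fine while the experience isn't. A quick sketch (invented frame times, purely illustrative) of why the frame-time view tells a different story than average FPS:

```python
# Sketch of why average FPS alone can mislead: a run with a better average can still
# stutter far more once frame-time spikes are counted. Numbers are invented.

def avg_fps(frame_times_ms):
    return 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

def p99_frame_time(frame_times_ms):
    ordered = sorted(frame_times_ms)
    return ordered[int(0.99 * (len(ordered) - 1))]

smooth = [16.7] * 100               # steady ~60 fps, no spikes
spiky  = [12.0] * 95 + [60.0] * 5   # higher average fps, but periodic stutter

for name, run in (("smooth", smooth), ("spiky", spiky)):
    print(f"{name}: avg {avg_fps(run):.0f} fps, 99th percentile frame time {p99_frame_time(run):.1f} ms")
```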

I'm not saying the current methodology is flawed but i AM saying that steps must be taken to ensure it's not.
 
Joined
Mar 14, 2008
Messages
511 (0.08/day)
Location
DK
System Name Main setup
Processor i9 12900K
Motherboard Gigabyte z690 Gaming X
Cooling Water
Memory Kingston 32GB 5200@cl30
Video Card(s) Asus Tuf RTS 4090
Storage Adata SX8200 PRO 1 adn 2 TB, Samsung 960EVO, Crucial MX300 750GB Limited edition
Display(s) HP "cheapass" 34" 3440x1440
Case CM H500P Mesh
Audio Device(s) Logitech G933
Power Supply Corsair RX850i
Mouse G502
Keyboard SteelSeries Apex Pro
Software W11
Of course you can't bench a CPU to see what is good in the future; you can only see what is good now, and that's the way it is.
It is pretty well known that in modern games an old 8-core will be faster than an old 4-core CPU, and in older games good single-threaded performance is better ..... I don't see the issue with the way it is done with the low resolution: in GoW4 at UHD my GTX 1080 is working its ass off to deliver fps and my CPU is idling, but at 1080p my CPU is working its ass off and my GTX 1080 is chilling.... so yes, today Ryzen doesn't have the BEST gaming performance IF WE ONLY LOOK AT THE CPU's gaming performance, but in the future it will be better than the 7700K because cores will be more important, because games will maybe be developed to use more cores. NOTHING NEW IN THAT.

If you want to see it, run the GoW4 benchmark if you have GoW4. GoW = Gears of War.
 
Joined
Mar 10, 2010
Messages
11,880 (2.19/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Gskill Trident Z 3900cas18 32Gb in four sticks./16Gb/16GB
Video Card(s) Asus tuf RX7900XT /Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores laptop Timespy 6506
After watching this video, i'm not so sure anymore:


He claims that the "ancient" FX8350 became faster than the i5 2500K simply by testing with a much, much faster graphics card, and has shown that, with each increase in GPU performance, the gap to the 2500K narrows until it's reversed. To be fair, i personally have some doubts because there are some variables he didn't take into account, such as differences in motherboards, memory, storage speed and drivers used, which may or may not contribute to the overall result, not to mention that in the final part he used the results of the FX8370 instead (due to not having the FX8350 in those benchmarks).


This finding contradicts the current thinking that, in order to take the GPU out of the equation, one needs to test @ lower resolutions / details so that one can say X processor is better than Y processor @ gaming based on the results one gets, and that any change made to the graphics card will never change the outcome. I understand the logic ... but is this really true?

You may think: why am i even thinking about this with "ancient" CPUs? Because you can apply the same principle to current CPUs: if the testing methodology is indeed wrong, then everyone referencing reviews that use it is being misled (even though not intentionally), and this means a new way to test a gaming CPU must be found and this one scrapped.

And so i propose that our very own @W1zzard tests this (@ his convenience) and answers this question so that there's zero doubt. Use a 2500K and an 8350 and pair them with a 1080Ti, a 980Ti, a 680Ti (2 generation gap between each card, i think): only change the boards and CPUs while trying to keep the rest the same, if @ all possible (including drivers), so that there are fewer variables to interfere with the final results.
you would find yourself quite wrong and him quite right. my 8350 (where it's an i7 for intel) is still the recommended amd cpu in AAA games. ive had many an argument over this and at the end of the day i can still play any new game at max settings on my ancient cpu. some don't get circuitry like others, and one hard fact can be enough to focus on to derail all the good. my cpu still has chops at high-res gaming apparently, who knew.



i couldn't possibly agree with the video more.
 
Joined
Jan 8, 2017
Messages
568 (0.19/day)
System Name ACME Singularity Unit
Processor Coal-dual 9000
Motherboard Oak Plank
Cooling 4 Snow Yetis huffing and puffing in parallel
Memory Hasty Indian (I/O: 3 smoke signals per minute)
Video Card(s) Bob Ross AI module
Storage Stone Tablet 2.0
Display(s) Where are my glasses?
Case Hand sewn bull hide
Audio Device(s) On demand tribe singing
Power Supply Spin-o-Wheel-matic
Mouse Hamster original
Keyboard Chisel 1.9a (upgraded for Stone Tablet 2.0 compatibility)
Software It's all hard down here
To remove doubts, yes: i'd say "professional level" testing is required to make sure this is or is not an issue

I understand your reasoning and i too would like some concrete data.
I need to also state however that already, we have more than enough to understand how it -is- flawed. Now obviously, we are not talking about 100% flawed, because equally obviously, there is no one PC rig that is "bestest" for everything, just as there is no "one" gaming test to rule them all (different APIs, different generations, different engines, etc).

Until a time when more will be available, employ some personal judgement, employ some observational skills.
How often do you see this discussion (like never*), why is it that so few people 'bother' with this (when say, we've had fanboying megathreads about Ryzens before they were even a thing), why is it that repetition by mouth has ended up favoring certain elements (or brands) of this equation over others, etc.

Where there's smoke..

*yes, like never.. we're talking about "bloggers", "press", word of mouth, gamerz, and where it's all led us; not about threads discussing a 0.0000012% variation in speeds. Just to be clear.
 
Last edited:
  • Like
Reactions: HTC
Joined
Nov 26, 2013
Messages
816 (0.20/day)
Location
South Africa
System Name Mroofie / Mroofie
Processor Inte Cpu i5 4460 3.2GHZ Turbo Boost 3.4
Motherboard Gigabyte B85M-HD3
Cooling Stock Cooling
Memory Apacer DDR3 1333mhz (4GB) / Adata DDR3 1600Mhz(8GB) CL11
Video Card(s) Gigabyte Gtx 960 WF
Storage Seagate 1TB / Seagate 80GB / Seagate 1TB (another one)
Display(s) Philips LED 24 Inch 1080p 60Hz
Case Zalman T4
Audio Device(s) Meh
Power Supply Antec Truepower Classic 750W 80 Plus Gold
Mouse Meh
Keyboard Meh
VR HMD Meh
Software Windows 10
Benchmark Scores Meh
After watching this video, i'm not so sure anymore:


He claims that the "ancient" FX8350 became faster than the i5 2500K simply by testing with a much, much faster graphics card, and has shown that, with each increase in GPU performance, the gap to the 2500K narrows until it's reversed. To be fair, i personally have some doubts because there are some variables he didn't take into account, such as differences in motherboards, memory, storage speed and drivers used, which may or may not contribute to the overall result, not to mention that in the final part he used the results of the FX8370 instead (due to not having the FX8350 in those benchmarks).

This finding contradicts the current thinking that, in order to take the GPU out of the equation, one needs to test @ lower resolutions / details so that one can say X processor is better than Y processor @ gaming based on the results one gets, and that any change made to the graphics card will never change the outcome. I understand the logic ... but is this really true?

You may think: why am i even thinking about this with "ancient" CPUs? Because you can apply the same principle to current CPUs: if the testing methodology is indeed wrong, then everyone referencing reviews that use it is being misled (even though not intentionally), and this means a new way to test a gaming CPU must be found and this one scrapped.

And so i propose that our very own @W1zzard tests this (@ his convenience) and answers this question so that there's zero doubt. Use a 2500K and an 8350 and pair them with a 1080Ti, a 980Ti, a 680Ti (2 generation gap between each card, i think): only change the boards and CPUs while trying to keep the rest the same, if @ all possible (including drivers), so that there are fewer variables to interfere with the final results.
He forgot to mention that the 2500K will beat the fx8350 when oced
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,156 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
So? Move along.

Just to be clear, I will once i reply to you. My point is verified by your blunt reply. I was 'tricked' into a thread about a topic i am interested in only to be met with no info and a YT link. Had I known the thread required viewing a video of a guy whose work i don't value, i could have avoided the thread.

So i have moved along. Though watching people blatantly ignore Ryzen's deficit is quite depressing. It's an amazing CPU but it is weaker in gaming. I've looked at 1440p reviews where it still loses out. Adored lives to be 'contentious'. It gets more hits that way.

But it's gutter work.

Out.
 
Joined
Jan 8, 2017
Messages
568 (0.19/day)
System Name ACME Singularity Unit
Processor Coal-dual 9000
Motherboard Oak Plank
Cooling 4 Snow Yetis huffing and puffing in parallel
Memory Hasty Indian (I/O: 3 smoke signals per minute)
Video Card(s) Bob Ross AI module
Storage Stone Tablet 2.0
Display(s) Where are my glasses?
Case Hand sewn bull hide
Audio Device(s) On demand tribe singing
Power Supply Spin-o-Wheel-matic
Mouse Hamster original
Keyboard Chisel 1.9a (upgraded for Stone Tablet 2.0 compatibility)
Software It's all hard down here

No offense, but you missed the point here. You've let a couple of biased opinions and/or your own (not that i blame you, i don't use Youtube or Facebook either) color your judgement.
This is about how a product is perceived, who and how has a role to play in that and what it means for the market as a whole. Forget Adored, forget what he has to do to get or maintain his subscribers.

Lastly, 'cause momma taught me to be a fair redneck, Adored wasn't saying or implying that one is 'better' than the other. And he's not refuting the i7 7700K's current superiority in FPS; he even illustrates it by showing videos of others so we can see said FPS difference for ourselves.
Stick to the forest and forget about the tree :)
 
Joined
Mar 10, 2010
Messages
11,880 (2.19/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Gskill Trident Z 3900cas18 32Gb in four sticks./16Gb/16GB
Video Card(s) Asus tuf RX7900XT /Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores laptop Timespy 6506
He forgot to mention that the 2500K will beat the fx8350 when oced
you can OC said 8350 up to 5GHz; most modern games and GTA V love that, though the electricity bill man also loves that too.
honesty rocks..

Haha, guess I put it a little ambiguously lol, my bad. :) Indeed he didn't run any of those benchmarks himself. I meant to say that if he really wants to confirm this for himself then he should run a set of benchmarks himself to properly compare. If not, then at least find comparable reviews, but that's hard to do because things change over time, such as drivers etc, therefore he should really do it himself under controlled conditions. I strongly suspect that if he did this then the 2500K would still come out on top, but that's not what he wants to tell us.

Think, just how many thousands of enthusiasts there are out there motivated to compare products. They would start to notice the FX improvements, which would get picked up by the tech press such as TPU, but it's not happened. Therefore, what he's saying is a crock and I think he's got an agenda to make it look like AMD's products are getting hard done by Intel and everyone else, when the real blame lies with the products themselves.

Heck, just look at the Ryzen launch, all full of hype and promise. AMD and their partners, including Microsoft have had years to work with them to ensure a reasonably bug-free launch, but what do we see? BIOS issues preventing memory modules from working properly and a Windows 10 bug lowering overall performance due to SMT issues. How did Microsoft and more importantly AMD, not notice something so blatantly obvious before launch and just get it fixed? Especially since Windows 7 doesn't have this problem.

I tell you, if I'd dropped a grand plus on a Ryzen upgrade (CPU, mobo, RAM, cooler) I'd be pissed at seeing problems like this and might ask for my money back. I expect the thing to work properly out of the box. This is typical AMD and why I have no confidence in them. Funny how Intel CPUs haven't dropped in price much since Ryzen's launch and its "incredible" value proposition, isn't it? Yes, I think Ryzen does have that potential, but the launch has been poorly executed, ruining it. Looks like we'll have to wait for Ryzen v2 for an improved IPC that can compete head-on with Intel in games, but it's a good start, bugs notwithstanding.

There's also another thing that makes him quite the AMD apologist. When buying hardware, one should almost always buy what performs best right now, rather than a promise of better performance in the future. It's not guaranteed, there's no timescale and by the time that the improvement comes, if it does, the hardware will be out of date anyway. So, who cares if the FX works better than a 2500K now? They're both obsolete and can't even be purchased new any more.

I have literally put this philosophy into practice today. You might remember that my Palit GTX 1080 died a couple of months ago and I got a refund on it. Well, I was waiting for the 1080 Ti to come out before buying its replacement. What do I see? A card that costs an obscene £700+, is "only" 35% faster than a 1080 and an overclocked 7700K isn't fast enough to max the card out at 1080p a lot of the time (check TPU's review). Now, I still have a 2500K (at a rather non-enthusiast stock speed, but don't tell anyone :oops:) which is a lot slower than TPU's test rig, so I passed on the 1080 Ti and bought another 1080 today, taking advantage of the cheaper prices.

I've actually bought two models, as I can't quite tell which one I'm gonna keep just from the reviews and will return one to Amazon. Noise is all-important for me, so it's a toss-up between the Palit GTX 1080 GameRock (same as I had before) at £500 or the Zotac GTX 1080 AMP! Extreme at a cool £600 which @W1zzard said is the best GTX 1080 he's ever tested, in his review. Yes, it's somewhat more than I wanted to pay for a 1080 right now, but I really like it, so I'm biased towards keeping it, despite that steep £100 premium. If the coil whine performance is better on the Zotac then keeping it is a no-brainer.
your perspective is skewed by your fat wallet mate. if i'd picked for now 5 years ago my pc wouldn't still be doing all that i am now, and right now i don't know what i'll be doing with my pc in 3 years, vr maybe. point is, don't shove your perspective so hard down my throat.
some of us are more subjective about what's a fault.

can i game at 550+ fps at low settings 1080p, don't give a damn
can i game at 4k ultra settings, do care



but that's letting my perspective cloud the issue.
 
Last edited by a moderator:

HTC

Joined
Apr 1, 2008
Messages
4,664 (0.76/day)
Location
Portugal
System Name HTC's System
Processor Ryzen 5 5800X3D
Motherboard Asrock Taichi X370
Cooling NH-C14, with the AM4 mounting kit
Memory G.Skill Kit 16GB DDR4 F4 - 3200 C16D - 16 GTZB
Video Card(s) Sapphire Pulse 6600 8 GB
Storage 1 Samsung NVMe 960 EVO 250 GB + 1 3.5" Seagate IronWolf Pro 6TB 7200RPM 256MB SATA III
Display(s) LG 27UD58
Case Fractal Design Define R6 USB-C
Audio Device(s) Onboard
Power Supply Corsair TX 850M 80+ Gold
Mouse Razer Deathadder Elite
Software Ubuntu 20.04.6 LTS
No offense, but you missed the point here. You've let a couple of biased opinions and/or your own (not that i blame you, i don't use Youtube or Facebook either) color your judgement.
This is about how a product is perceived, who and how has a role to play in that and what it means for the market as a whole. Forget Adored, forget what he has to do to get or maintain his subscribers.

Lastly, 'cause momma taught me to be a fair redneck, Adored wasn't saying or implying that one is 'better' than the other. And he's not refuting the i7 7700K's current superiority in FPS; he even illustrates it by showing videos of others so we can see said FPS difference for ourselves.
Stick to the forest and forget about the tree :)

Currently, whenever a new CPU is launched and reviewed, it's tested by multiple sites using low resolution / details in an effort to remove the GPU as a factor in whatever results they get. So, in theory, if you were to swap the card in the reviews for a much faster one, and then an even faster one, the results of the CPU gaming benchmarks would not change and whatever placing X, Y and Z processors had in the reviews would be kept.

But in the video, Adored shows that not only is this incorrect but, in this specific case, it actually gets reversed, and i'm not talking about 1% or 2% here. That raises the question of whether or not the methodology of gaming CPU benchmarking is correct, which is why i created the topic to ask @W1zzard if he could answer this question.

There are however several variables and i address them in the OP because i have no idea if any of them has an impact on the testing, which is why i'm not sure myself if this method is indeed flawed or not.
 
Last edited:

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.86/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
your perspective is skewed by your fat wallet mate. if i'd picked for now 5 years ago my pc wouldn't still be doing all that i am now, and right now i don't know what i'll be doing with my pc in 3 years, vr maybe. point is, don't shove your perspective so hard down my throat.
some of us are more subjective about what's a fault.

can i game at 550+ fps at low settings 1080p, don't give a damn
can i game at 4k ultra settings, do care



but that's letting my perspective cloud the issue.
1 I don't have a fat wallet (I wish I did). I'm just a bit better off than I used to be so can buy some nice things once in a while
2 Not shoving my perspective down anyone's throat. Sounds like you're butthurt because you've got an FX8350 and I've given AMD some well deserved criticism. Remember, it's them I'm having a go at, not you

It's not actually clear what you're getting at, anyway.
 