
New PT Data: i9-9900K is 66% Pricier While Being Just 12% Faster than 2700X at Gaming

Joined
Sep 7, 2017
Messages
3,244 (1.23/day)
System Name Grunt
Processor Ryzen 5800x
Motherboard Gigabyte x570 Gaming X
Cooling Noctua NH-U12A
Memory Corsair LPX 3600 4x8GB
Video Card(s) Gigabyte 6800 XT (reference)
Storage Samsung 980 Pro 2TB
Display(s) Samsung CFG70, Samsung NU8000 TV
Case Corsair C70
Power Supply Corsair HX750
Software Win 10 Pro
No thanks, the cheap ones were horrible, even PC-Chips and Biostar were better.

I think they've come a long way, software wise at least. I don't know anything about the quality of NUC boards, but a lot of people seem to like them. It couldn't be too hard to expand the quality to ATX. I just wouldn't pay a premium for it.
 
Last edited:
Joined
Jul 17, 2011
Messages
85 (0.02/day)
System Name Custom build, AMD/ATi powered.
Processor AMD FX™ 8350 [8x4.6 GHz]
Motherboard AsRock 970 Extreme3 R2.0
Cooling be quiet! Dark Rock Advanced C1
Memory Crucial, Ballistix Tactical, 16 GByte, 1866, CL9
Video Card(s) AMD Radeon HD 7850 Black Edition, 2 GByte GDDR5
Storage 250/500/1500/2000 GByte, SSD: 60 GByte
Display(s) Samsung SyncMaster 950p
Case CoolerMaster HAF 912 Pro
Audio Device(s) 7.1 Digital High Definition Surround
Power Supply be quiet! Straight Power E9 CM 580W
Software Windows 7 Ultimate x64, SP 1
and please do a verification @baseclock whether Intel holds its 95W TDP.
Highly questionable, if you ask me.
Though, even if it won't exceed those 95W at base clocks, it will pull pretty much exactly 230–250W, excluding the rest of the system – at stock clocks under full load. Never mind any overclocking!

Anyway, the overall power consumption will jump quite a bit! It's still physics, isn't it?
The power draw can be extrapolated quite easily with an ordinary rule of three …
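
To make the arithmetic explicit, here is that rule of three as a minimal Python sketch – the linear scaling of package power with core count is, of course, the simplifying assumption, and the two input wattages are the measured 8700K figures used below:

Code:
def scale_by_cores(p_six_cores, cores=8, base_cores=6):
    # Rule of three: assume package power scales linearly with core count.
    return p_six_cores * cores / base_cores

print(scale_by_cores(66.8))   # average gaming load:  ~89.07 W
print(scale_by_cores(159.5))  # Prime torture load:  ~212.67 W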

Gaming Load

If an 8700K needs about 66.8 watts with its 6 cores under an average gaming load, then a 9900K will draw about 89.07 watts. Still, that's not at its stock clock of 5 GHz but 'only' at an 8700K's 4.7 GHz – so you have to add the extra consumption of the missing clock speed, which comes on top of that.

Calculation:

Averaged power-draw at stock-clocks (@4.7 GHz)
[image: calculation 8700_4.7.png]


Average gaming-load OC'd (@4.9 GHz)
[image: calculation 8700_4.9.png]


… which makes it ~90W on average gaming load @4.7 GHz on 8 cores, by the numbers alone.
So a 9900K will consume at least 90W (in the best theoretical case) – though this won't be the actual figure, since it has a 33% larger cache than the 8700K (12 vs. 16 MByte). Due to that increased cache alone it will draw significantly more power than those 90W and will probably exceed the 95W TDP.

Or in other words, it is very likely that the 9900K will exceed its 95W TDP already at stock clocks (as you can see from the 95.74 watts already overstepping it at 4.9 GHz), especially if it runs any warmer.

[image: 8700K_gaming.png]


Full load

If we then take another look at the 8700K under a heavy torture load like Prime, we see it doesn't get any better. Under Prime an 8700K already pulls 159.5 watts at stock – and as such, a 9900K will pull at least 212.67 watts. Mind you, that's still not at its stock clock of 5 GHz but again 'only' at an 8700K's stock clock of 4.7 GHz … of course without the additional power draw of the remaining +300 MHz, and without the increased consumption of its larger cache.

Calculation:

Full-load power-consumption at stock-clocks (@4.7 GHz)
[image: calculation 8700_4.7_torture.png]


Full-load power-consumption OC'd (@4.9 GHz)
[image: calculation 8700_4.9_torture.png]


As a result, in that (still best theoretical) case a 9900K will consume at least circa 212.67W under full load – and even that won't be the actual figure, as it still has a 33% larger cache than the 8700K (12 vs. 16 MByte). Hence, due to that increased cache, it will consume significantly more. On average the 9900K might easily draw 230–250 watts. In any case, the official fantasy TDP of just 95W is pure maculation here – printer's waste, by all means. As usual with Intel's extremely misleading official TDP specifications.

[image: 8700K_torture.png]


Final conclusion
Note!
All numbers here always represent the best case (sic!) and are in fact the best possible, most favourable numbers, since we are still at 4.9 GHz in this scenario. Particular attention should be paid to the fact that in every single case the numbers reflect the actual package power consumption and as such represent solely the processor's consumption in and of itself. These numbers are not the power consumption of the whole system! They are the CPU's values alone.

☞ Please also note that the cache, now 33% larger in size, will make a significant contribution to the actual wattage numbers. Furthermore, all numbers were obtained with the assistance of a chiller whose cooling loop was held permanently at 20°C (which, as a side note, still couldn't keep the 8700K from running into its thermal limit).

Résumé or bottom line
  • All of these are best-case values.
  • All wattages and (possible) clock frequencies were obtained under load and made possible through the use of a chiller (compressor cooling).
  • All calculations omit the remaining +100 MHz towards the 9900K's nominal clock (naturally including the corresponding extra consumption).
  • The actual wattage will very likely level off at 230–250W nominal consumption, at stock clocks under full load.

Smartcom

PS: Forgive the potentially significant simplification of the given circumstances for the purpose of illustration. Errors excepted.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.53/day)
As a result, in that (still best theoretical) case a 9900K will consume at least circa 212.67W under full load – and even that won't be the actual figure, as it still has a 33% larger cache than the 8700K (12 vs. 16 MByte). Hence, due to that increased cache, it will consume significantly more. On average the 9900K might easily draw 230–250 watts. In any case, the official fantasy TDP of just 95W is pure maculation here – printer's waste, by all means. As usual with Intel's extremely misleading official TDP specifications.

Nope, sorry. Grab a clamp-on current meter, put it over the 8-pin power connector of your board, and you'll see that Intel's TDP numbers are actually quite accurate, and that's including with Turbo clocks.

Problem is, nobody except for me does this in reviews. Go look at any of the board reviews here and you'll see it. I provided the new board reviewer with the hardware to test this as well, so you'll continue to see actual CPU power draw in motherboard reviews here @ TPU.

I'm also happy to report that AMD's current platform TDP numbers are pretty accurate as to actual power draw as well.

You'll also have to note that Turbo clocks on both platforms, and CPU throttling, are controlled by this number. You can even find it in board BIOSes, where you can manipulate the maximum power drawn (and some board makers have previously used this to cheat on review benchmark numbers). By default on all current platforms, this number matches the CPU's advertised TDP.


So rather than blame AMD or Intel on this one, you've got to blame the reviewers who are reporting inaccurate information to you. It's especially revolting to me that nobody tests this way, especially considering that Zalman made meters for this that you could buy for just $40, so you don't even need to spend a lot to measure this accurately. A decent and reliable clamp-on meter these days can be had for around $100.

Power numbers derived from meters connected to the PSU's power cord measure the entire system, as well as PSU inefficiency. Of course those numbers seem inflated – they include the board, memory, drives, mouse and keyboard, as well as the CPU, if not also the idle power draw of a video card.
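
To illustrate the difference between the two methods, a back-of-the-envelope sketch in Python – every number in it is a made-up placeholder for illustration, not a measurement:

Code:
# Clamp meter over the 8-pin EPS connector: CPU power is just V * I.
EPS_VOLTAGE = 12.0      # nominal EPS12V rail voltage
clamp_amps = 7.9        # hypothetical clamp reading, in amps
cpu_watts = EPS_VOLTAGE * clamp_amps               # ~94.8 W, the CPU alone

# Wall meter: whole system plus PSU losses, so the CPU share must be backed out.
wall_watts = 280.0      # hypothetical wall reading
psu_efficiency = 0.90   # assumed PSU efficiency at this load
rest_of_system = 120.0  # assumed draw of board, memory, drives, GPU, etc.
cpu_estimate = wall_watts * psu_efficiency - rest_of_system   # crude and error-prone
print(cpu_watts, cpu_estimate)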
 
Last edited:
Joined
Sep 17, 2014
Messages
22,452 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
That's not my point really, it's just that Intel can't or shouldn't sell chips based on questionable benchmarks & that practice should never be defended, be it Apple/Intel/Nvidia or AMD.

Agreed, but is anyone in disagreement on that? I haven't seen anyone here 'defending Intel' for these misleading results.

The defense gets erected whenever AMD fans tell Intel users that the difference is negligible, when there are countless benches (even a dozen of the corrected ones in this article underline it!) showing that Intel CPUs still excel at higher framerates and in single-thread-limited scenarios. And it's not just benches either, but practice – experience – from using the hardware. We're on an enthusiast forum, so there is going to be a larger than normal group interested in top-end performance. And when it comes to price: the performance gap in some scenarios is easily 30% – look at GPUs and the additional cost of a 30% faster 2080 Ti over its little brother below it. Remarkably similar. In both cases you could say 'it must be cheaper', and in both cases we as consumers have influence on that, by simply not buying it.

Using performance metrics from a 14 yr old game engine isn't proving anything. More baloney results. It doesn't even have anything to do with single/multithreading. Source straight up runs like doo doo on ryzen.

Don't agree with the dumb dumb. He's saying that intel leads by 40% in ST, but is knocked down to 12% in MT with 15% ish higher clocks. Tell me, where is all that intel IPC at? It doesn't exist. You can conclude that intel currently has a few percent IPC lead lol. And that doesn't include optimized memory for ryzen.

Dummy is flat out wrong or AMD makes the most superior CPU to ever exist for the next 20 yrs b/c of its SMT. Intel's only tangible lead is in freq and/or applications optimized only for intel (which is most everything).

Ever see game benchmarks with all CPUs locked to 4ghz? It's not rosy for intel's ipc "superiority".

- Don't forget your (expensive) B-die sticks
- Don't forget to clock your Intel CPU at 4 Ghz
- Don't use Source as an example game engine
- Don't use a ST limited scenario

And all of a sudden, Ryzen looks almost (still missing some % though) as good as an Intel CPU! You should go work for PT! I heard they're doing a Ryzen piece next month.

Surely you can see the irony. You have literally just summed up everything that is wrong with the AMD-fan perspective on performance, and apparently you cannot even see it. You should take this perfect example to reflect upon. Dummy... :laugh:
 
Last edited:
Joined
Jul 17, 2011
Messages
85 (0.02/day)
System Name Custom build, AMD/ATi powered.
Processor AMD FX™ 8350 [8x4.6 GHz]
Motherboard AsRock 970 Extreme3 R2.0
Cooling be quiet! Dark Rock Advanced C1
Memory Crucial, Ballistix Tactical, 16 GByte, 1866, CL9
Video Card(s) AMD Radeon HD 7850 Black Edition, 2 GByte GDDR5
Storage 250/500/1500/2000 GByte, SSD: 60 GByte
Display(s) Samsung SyncMaster 950p
Case CoolerMaster HAF 912 Pro
Audio Device(s) 7.1 Digital High Definition Surround
Power Supply be quiet! Straight Power E9 CM 580W
Software Windows 7 Ultimate x64, SP 1
Nope, sorry. Grab a clamp-on current meter, put it over the 8-pin power connector of your board, and you'll see that Intel's TDP numbers are actually quite accurate, and that's including with Turbo clocks.

Problem is, nobody except for me does this in reviews.
You are actually aware that such measurements were made by Igor from Tom's Hardware – and always have been, in exactly such ways – aren't you?
He's one of the few who does that and always has, as he's famous for doing exactly that.

In addition, you seem to have overlooked my note down there, where I tried to state expressly what those numbers represent and where they come from, no?

Note!
All numbers here always represent the best case (sic!) and are in fact the best possible, most favourable numbers, since we are still at 4.9 GHz in this scenario. Particular attention should be paid to the fact that in every single case the numbers reflect the actual package power consumption and as such represent solely the processor's consumption in and of itself. These numbers are not the power consumption of the whole system! They are the CPU's values alone.

I provided the new board reviewer with the hardware to test this as well, so you'll continue to see actual CPU power draw in motherboard reviews here @ TPU.
Reading that makes me actually genuinely happy …

Now shut up and take my money! Oh, and if you don't mind, let me **** **** ****!


You'll also have to note that Turbo clocks on both platforms, and CPU throttle are controlled by this number. You can even find it in board BIOSes, where you can manipulate the maximum power drawn (and some board makers have previously used this to cheat on review benchmark numbers). By default on all current platforms, this number matches a CPU's advertised TDP.
I'm actually pret·ty aware of the extensively and widely used abusive methods of sweeping far higher actual numbers under the rug – the all too common practice of benchmarking with open roofs, completely unbridled, for higher numbers (cough MCE! Unlimited power targets!), while 'determining' the »actual power-consumption« afterwards with the product muzzled by the given BIOS/UEFI options and/or lowered power targets, pre-cooled cards, etcetera – thank you.


So rather than blame AMD or Intel on this one, you gotta blame the reviewers who are reporting inaccurate information to you.
I always criticise such wrongdoings as extremely and excessively misleading and deceiving. Always have, always will.
Especially since such devices and apparatuses are not only pretty affordable for today's technical editorial departments, but every reviewer who considers themselves reputable – or at least has the personal aspiration to be taken seriously and found trustworthy – is nothing less than ob·li·ga·ted to take such measurements and determine the actual and nominal power consumption.

When a review uses the overall system's wattage, I straight-out consider that a direct intent to mislead or deceive. All the more if a) the product is known to draw higher numbers in reality and especially b) the reviewer has already been made aware of and pointed towards such facts (namely, that using the overall system's wattage presents the product in a massively flattering light).



Smartcom
 

Durvelle27

Moderator
Staff member
Joined
Jul 10, 2012
Messages
6,788 (1.50/day)
Location
Memphis, TN
System Name Black Prometheus
Processor |AMD Ryzen 7 1700
Motherboard ASRock B550M Pro4|MSI X370 Gaming PLUS
Cooling Thermalright PA120 SE | AMD Stock Cooler
Memory G.Skill 64GB(2x32GB) 3200MHz | 32GB(4x8GB) DDR4
Video Card(s) ASUS DirectCU II R9 290 4GB
Storage Sandisk X300 512GB + WD Black 6TB+WD Black 6TB
Display(s) LG Nanocell85 49" 4K 120Hz + ACER AOPEN 34" 3440x1440 144Hz
Case DeepCool Matrexx 55 V3 w/ 6x120mm Intake + 3x120mm Exhaust
Audio Device(s) LG Dolby Atmos 5.1
Power Supply Corsair RMX850 Fully Modular| EVGA 750W G2
Mouse Logitech Trackman
Keyboard Logitech K350
Software Windows 10 EDU x64
Where are the reviews by now?
 
Joined
Feb 3, 2017
Messages
3,756 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos

Keicho2

New Member
Joined
Sep 29, 2018
Messages
2 (0.00/day)
All I see is that the old 4-core i7 6700K is technically the same as the 2700X in FPS. In games. Why should an 8-core at 5 GHz from Intel lead by only 12%? :D
 
Last edited:

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.53/day)
In addition, you seem to have overlooked my note down there, where I tried to state expressly what those numbers represent and where they come from, no?

No, I've not overlooked anything. You see, if a motherboard is working properly, and its BIOS is configured properly, such IS NOT POSSIBLE. Anyone telling you that a CPU exceeds its TDP doesn't fully understand how these things work, and how power draw is controlled via the motherboard for a CPU; as such, if the BIOS is programmed right, and the board works right, there is ZERO CHANCE for a CPU to exceed the listed TDP.

Now, many things can go wrong that can cause TDP to be exceeded, but it is NEVER supposed to happen, no matter the type of CPU loading. So anyone telling you that this happens, at stock, is misinforming you, and isn't smart enough (IMHO) to investigate why such is taking place. Please also note that the fallacy I see right away is that they are using software to measure this, rather than physical hardware, as we here @ TPU do.

in every single case the numbers reflect the actual package power consumption and as such represent solely the processor's consumption in and of itself.

Package Power Consumption is a SOFTWARE reading. So, no, this guy is NOT doing as we do. He's reading software and assuming that things are reported accurately, when clearly they aren't. He has clearly identified a problem in his configuration for sure, but what and where that problem is, is NOT being reported properly.
 
D

Deleted member 178884

Guest
an 8700K is currently £460 from Scan UK... let's be at least a little bit accurate...
Yeah, thanks to "shortages" – how convenient?

How could we not defend a company that is one of the pillars of IT in our civilization?

You don't have to admire Intel, and you might not even respect their contribution to computing (which would be weird for a wannabe enthusiast), but you should understand their importance for the stability of this business and the general reality around us.
Do you like pizza? Imagine there was a single company selling 90% of pizzas globally. I'm sure you wouldn't want that company to have any problems. :)

I work in insurance – an industry that's constantly plagued by price wars. People don't like paying for insurance, but they have to. And the business is very scale-dependent, i.e. a large market share greatly improves your margins. Hence, smaller companies sell policies at dumping prices just to build a large client base. It's easier to renew a client than to convince a new one to join, so it makes sense to sell them a product at a loss. If they stay for another 1-2 years, we'll make a profit in the end.

I look at the CPU business and I see some analogies. For example: you have a huge technological cost for R&D and product release. Clients are rather loyal to brands. And most importantly: people have to buy CPUs – it's just a matter of whom to buy from.
I'm not saying AMD's margins are too low to keep their business stable. But business-wise it wouldn't necessarily be a bad idea for them to sell even at a loss now, to get up to 20-30% market share and gain some momentum.
On the other hand, it would be totally sensible for Intel to realize that there's a particular group of people that's naturally pulled towards AMD's characteristics, and fighting for them is very expensive, so sustaining 90% market share simply costs way too much. Maybe someone had the balls to stand up during a meeting and say: let's give up – it's better to sell 7 CPUs for $500 than 9 for $300.
"defend a company" is going too far here - this is just a discussion thread - we're not taking our toilet onto intel and pouring it over them, If you haven't realized already AMD has been on the "edge" for years now and they've made a comeback - Intel RELYS on AMD existing since they both licence technologies to each other that are essential to the production of processors, Also that comment in regards to maintaining the market - Intel will *Never* hand over the market to amd, they are fighting for that market share, they need it as high as possible, if they decided to sell 7 for 500$ than 9 for 300$ it would also affect investors and stock - I'm not going into it that far however. Intel own the bulk of the server market where the real money is to be made, they've already lost in the war for the fastest supercomputers to IBM which crushed them - If AMD take over the server market intel would probably call out for help from other companies and be in big financial trouble - the mainstream is the way AMD could start this - they only lack R&D funding and that's the only thing stopping them and it can be made in the mainstream - you seriously think intel would threaten their entire companies existence? They will continue to try and push for that market share.
 
Joined
Feb 3, 2017
Messages
3,756 (1.32/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Now, many things can go wrong that can cause TDP to be exceeded, but it is NEVER supposed to happen, no matter the type of CPU loading. So anyone telling you that this happens, at stock, is misinforming you, and isn't smart enough (IMHO) to investigate why such is taking place. Please also note that the fallacy I see right away is that they are using software to measure this, rather than physical hardware, as we here @ TPU do.
That actually is not completely true, at least for current-generation (as well as a generation or two back) Intel processors. You can definitely see a CPU temporarily exceeding TDP even without MCE or some other manufacturer's stupid option enabled. From examples I have personally seen, with all BIOS/UEFI settings as stock as possible, an i7 8700K will run at 120-130W for a little while before settling down at 95W. Similarly, an i5 8400 runs at 95W for a little while before settling down at 65W. The "little while" in there seems to depend on the motherboard.

Whether this is the default configuration or not is up for debate. From what I can see in the whitepaper, technically it should not be (PL2 is 1.25× TDP and applies for up to 10 ms, with PL1 Tau at 1 second). Are motherboard manufacturers playing around with the settings more than they should (in addition to MCE)?

https://www.intel.com/content/dam/w...heets/8th-gen-core-family-datasheet-vol-1.pdf
Chapter 5: Thermal Management (Page 88)
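
As a toy illustration of the mechanism described there, here is a simplified Python simulation – the 95 W / 1.25× / 1 s figures are the ones quoted above, and the plain moving average is a stand-in for the exponentially weighted one the hardware actually uses:

Code:
from collections import deque

PL1 = 95.0                 # sustained limit = TDP, watts
PL2 = 1.25 * PL1           # short-term limit: 118.75 W
TAU, STEP = 1.0, 0.1       # averaging window and sample interval, seconds

window = deque(maxlen=int(TAU / STEP))

def allowed(requested_watts):
    # Budget is PL2 until the average over the last ~Tau seconds
    # reaches PL1; unfilled history counts as idle time.
    avg = sum(window) / window.maxlen
    budget = PL2 if avg < PL1 else PL1
    granted = min(requested_watts, budget)
    window.append(granted)
    return granted

# A sustained 130 W request is granted ~118.75 W for ~0.8 s, then 95 W:
print([round(allowed(130.0), 2) for _ in range(15)])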

The other thing with Intel's power management is that AVX throws most of it straight out the window. If overclockers disable the default AVX offset (-2/-3) along with the power limits, that will increase power consumption and heat by a lot. Back to stock behaviour: in general Intel has set the limits pretty well, and Turbo frequencies will work reasonably fine for anything that isn't AVX. A heavy AVX load, however, will drop the frequencies down to base quickly.

Please also note that the fallacy I see right away is that they are using software to measure this, rather than physical hardware, as we here @ TPU do.
Package Power Consumption is a SOFTWARE reading. So, no this guy is NOT doing as we do. He's reading software, and is assuming that things are reported accurately, when clearly they aren't. He's clearly identified a problem in his configuration for sure, but what and where that problem is, is NOT being reported properly.
You are measuring power with a clamp, right? Have you tested on at least a couple of different motherboards whether software readings lie, and by how much (both motherboard and CPU readings)?
I have been looking for a cheap clamp to test the power myself, but a $40 meter or reasonably cheap clamps do not have very good accuracy, and so far I have not been interested enough to go for something costing a couple hundred moneys. Software might not be in a different ballpark from a cheap meter, given that software readings are somewhat verified in hardware.
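
For what it's worth, a minimal sketch of where such a software reading comes from – on Linux the package energy counter behind those tools is exposed by the intel-rapl powercap driver (the path below is the usual package-0 domain; reading it typically needs root, and counter wrap-around is ignored here):

Code:
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package energy, microjoules

def read_uj():
    with open(RAPL) as f:
        return int(f.read())

e0, t0 = read_uj(), time.time()
time.sleep(1.0)
e1, t1 = read_uj(), time.time()
# Average package power over the interval, as reported by the CPU itself.
print((e1 - e0) / 1e6 / (t1 - t0), "W")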
 
Last edited:

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.53/day)
You are measuring power with a clamp, right?

Correct, with a FLUKE clamp meter.

Have you tested on at least a couple of different motherboards whether software readings lie, and by how much (both motherboard and CPU readings)?

Yeah, I have. Over time I have found that AIDA64 can be fairly reliable once it's been updated properly (and if you have a yearly-renewed licence, they'll gladly update it for you if it doesn't work right), but for some boards it is way off, and when you begin overclocking, on a lot of boards it reads less than 1W of power consumed (when clearly it is a whole lot more). It really varies from board to board and from version to version of the software in use, whereas the clamp meter simply works, every time.

I have been looking for a cheap clamp to test the power myself, but a $40 meter or reasonably cheap clamps do not have very good accuracy, and so far I have not been interested enough to go for something costing a couple hundred moneys. Software might not be in a different ballpark from a cheap meter, given that software readings are somewhat verified in hardware.
You can get the FLUKE meter I have for what I'd call a relatively minor cost, especially for me, since I use it so damn often. As someone who liked to overclock under LN2 and such, it is a tool that you cannot be without, as so much information can be had by watching power increase as you push up the clocks, especially with different CPUs. Now, for the average user, it might not be that worthy an investment, I suppose; it just depends how into overclocking you really are.

Are motherboard manufacturers playing around with settings more than they should (in addition to MCE)?

Yeah, they are. Sad but true, and yeah, you can see some spikes from time to time, especially when AVX loading for sure. But as you've surmised, there is a time limit to this, and yeah, that time limit is also in BIOS and can be adjusted.
 
Last edited:
Joined
Jan 2, 2014
Messages
248 (0.06/day)
Location
Edmonton
System Name Coffeelake the Zen Destroyer
Processor 8700K @5.1GHz
Motherboard ASUS ROG MAXIMUS X FORMULA
Cooling Cooled by EK
Memory RGB DDR4 4133MHz CL17-17-17-37
Video Card(s) GTX 780 Ti to future GTX 1180Ti
Storage SAMSUNG 960 PRO 512GB
Display(s) ASUS ROG SWIFT PG27VQ to ROG SWIFT PG35VQ
Case Cooler Master HAF X Nvidia Edition
Audio Device(s) Logitech
Power Supply COOLER MASTER 1KW Gold
Mouse LOGITECH Gaming
Keyboard Logitech Gaming
Software MICROSOFT Redstone 4
Benchmark Scores Cine Bench 15 single performance 222
Intel's 9-year-old architecture still provides the best performance! Even AMD's brand-new Ryzen architecture can't outperform Intel's 9-year-old architecture.

When you have the best and no competition, you can name your prices.... Both the Intel 9900K and Nvidia 2080 Ti are 2018's best CPU & GPU, with no competition from AMD.

If you want a cheaper 3rd-place 2700X, now is the time to do that. Secondly, the 9700K outperforms the 2700X in most games and OCs up to 5.5GHz with EK.

If you're a PC gamer, the 9700K is your best choice. If you just want bragging rights with benchmarking, get the top dog 9900K.

2700X maxes out at 4.4GHz

9900K/9700K both max out at 5.5GHz

8700K/8086K both max out at 5.3GHz

That's your headroom.
 
Joined
Oct 2, 2015
Messages
3,144 (0.94/day)
Location
Argentina
System Name Ciel / Akane
Processor AMD Ryzen R5 5600X / Intel Core i3 12100F
Motherboard Asus Tuf Gaming B550 Plus / Biostar H610MHP
Cooling ID-Cooling 224-XT Basic / Stock
Memory 2x 16GB Kingston Fury 3600MHz / 2x 8GB Patriot 3200MHz
Video Card(s) Gainward Ghost RTX 3060 Ti / Dell GTX 1660 SUPER
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB / NVMe WD Blue SN550 512GB
Display(s) AOC Q27G3XMN / Samsung S22F350
Case Cougar MX410 Mesh-G / Generic
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W / Gigabyte P450B
Mouse EVGA X15 / Logitech G203
Keyboard VSG Alnilam / Dell
Software Windows 11
Nice troll.
Enjoy your blue tax for an extra 5 FPS.
 
Joined
Mar 18, 2008
Messages
5,441 (0.89/day)
Location
Australia
System Name Night Rider | Mini LAN PC | Workhorse
Processor AMD R7 5800X3D | Ryzen 1600X | i7 970
Motherboard MSi AM4 Pro Carbon | GA- | Gigabyte EX58-UD5
Cooling Noctua U9S Twin Fan| Stock Cooler, Copper Core)| Big shairkan B
Memory 2x8GB DDR4 G.Skill Ripjaws 3600MHz| 2x8GB Corsair 3000 | 6x2GB DDR3 1300 Corsair
Video Card(s) MSI AMD 6750XT | 6500XT | MSI RX 580 8GB
Storage 1TB WD Black NVME / 250GB SSD /2TB WD Black | 500GB SSD WD, 2x1TB, 1x750 | WD 500 SSD/Seagate 320
Display(s) LG 27" 1440P| Samsung 20" S20C300L/DELL 15" | 22" DELL/19"DELL
Case LIAN LI PC-18 | Mini ATX Case (custom) | Atrix C4 9001
Audio Device(s) Onboard | Onbaord | Onboard
Power Supply Silverstone 850 | Silverstone Mini 450W | Corsair CX-750
Mouse Coolermaster Pro | Rapoo V900 | Gigabyte 6850X
Keyboard MAX Keyboard Nighthawk X8 | Creative Fatal1ty eluminx | Some POS Logitech
Software Windows 10 Pro 64 | Windows 10 Pro 64 | Windows 7 Pro 64/Windows 10 Home
How the heck did this thread get to 11 pages long? PT retested – not a perfect retest, but retested with an actual 8-core CPU this time – and we got results closer to what we were all expecting, and at the same time proved Intel wrong! Now it's Intel's turn to redo their in-house testing, since they said they got the same results as PT with the first run of benchmarks :shadedshu: Anyone defending Intel in this thread is either getting paid by/working for Intel or is just plain dumb!

So a 9900K is basically double the price of a 2700X for less than a 12% gaming performance increase; we know who the true winner is here.
 
Joined
Oct 2, 2015
Messages
3,144 (0.94/day)
Location
Argentina
System Name Ciel / Akane
Processor AMD Ryzen R5 5600X / Intel Core i3 12100F
Motherboard Asus Tuf Gaming B550 Plus / Biostar H610MHP
Cooling ID-Cooling 224-XT Basic / Stock
Memory 2x 16GB Kingston Fury 3600MHz / 2x 8GB Patriot 3200MHz
Video Card(s) Gainward Ghost RTX 3060 Ti / Dell GTX 1660 SUPER
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB / NVMe WD Blue SN550 512GB
Display(s) AOC Q27G3XMN / Samsung S22F350
Case Cougar MX410 Mesh-G / Generic
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W / Gigabyte P450B
Mouse EVGA X15 / Logitech G203
Keyboard VSG Alnilam / Dell
Software Windows 11
/g/ sums it up quite well:

CPU
>Athlon 200GE - Minimal desktop
>R3 2200G - Bare minimum gaming (dGPU optional)
>R5 2400G/i5-8400 - Consider IF on sale
>R5 2600/X - Good gaming & multithreaded work use CPUs
>i7-9700k - If pairing w/ a 2080Ti and the extra $200+ is worth ~135 FPS instead of ~120 FPS to you, despite better CPUs coming next year and requiring new boards
>R7 2700/X - Best value high-end CPU on a non-HEDT platform
>Wait for R7 3700X - Surely the best overall and not a massive disappointment like the 9900k
>Threadripper/Used Xeon - HEDT
 
D

Deleted member 178884

Guest
Honestly I think the i5-9600K is the best value of the series: https://www.tweaktown.com/news/63512/intel-core-i5-9600k-6c-6t-overclocks-up-5-2ghz-air/index.html – this thing will cost around $262 or £200-ish, and in the 5GHz+ range it's not too far behind the 8700K at all while being massively ahead in single thread. It's not going to be easy for AMD at this price point; that's most likely why they dropped the 2700X down in price a bit. The 9900K is just a pure bragging-rights processor, or one for the rich.
 
Joined
Sep 15, 2007
Messages
3,946 (0.63/day)
Location
Police/Nanny State of America
Processor OCed 5800X3D
Motherboard Asucks C6H
Cooling Air
Memory 32GB
Video Card(s) OCed 6800XT
Storage NVMees
Display(s) 32" Dull curved 1440
Case Freebie glass idk
Audio Device(s) Sennheiser
Power Supply Don't even remember

Intel's 9-year-old architecture still provides the best performance! Even AMD's brand-new Ryzen architecture can't outperform Intel's 9-year-old architecture.

When you have the best and no competition, you can name your prices.... Both the Intel 9900K and Nvidia 2080 Ti are 2018's best CPU & GPU, with no competition from AMD.

If you want a cheaper 3rd-place 2700X, now is the time to do that. Secondly, the 9700K outperforms the 2700X in most games and OCs up to 5.5GHz with EK.

If you're a PC gamer, the 9700K is your best choice. If you just want bragging rights with benchmarking, get the top dog 9900K.

2700X maxes out at 4.4GHz

9900K/9700K both max out at 5.5GHz

8700K/8086K both max out at 5.3GHz

That's your headroom.
You're gonna have to put /s in your posts. Someone is going to think you're not trolling.
 
D

Deleted member 178884

Guest
Nvidia Jetson systems and other ARM powered embedded computers are pretty capable ;)
Not capable – they will not deliver similar performance, or anywhere near the performance, of AMD and Intel processors; they can stick to the phone market.
 
Joined
Oct 1, 2018
Messages
134 (0.06/day)
Oh really? Where is your proof? We're using our router with a 10GbE switch powered by an Nvidia Jetson; it is fast and it is extremely secure. Just because you can't game (yet) on those platforms doesn't give you the right to knock their speed. The video transcoding capabilities of those tiny Jetson modules are at the level of Quadros, and you know any Quadro can beat the electrons out of an Intel or AMD processor when it comes to parallel encoding or decoding of multiple streams simultaneously.
 
D

Deleted member 178884

Guest
Oh really? Where is your proof? We're using our router with a 10GbE switch powered by an Nvidia Jetson; it is fast and it is extremely secure. Just because you can't game (yet) on those platforms doesn't give you the right to knock their speed. The video transcoding capabilities of those tiny Jetson modules are at the level of Quadros, and you know any Quadro can beat the electrons out of an Intel or AMD processor when it comes to parallel encoding or decoding of multiple streams simultaneously.
Congratulations, what you're doing is basically using an embedded system designed for one task, and you're comparing it with processors used for a hell of a lot more tasks than just that.
 
Joined
Oct 1, 2018
Messages
134 (0.06/day)
Try using any Intel or AMD processor at the level of embedded systems; you'll cry in the end.
 
D

Deleted member 178884

Guest
Try using any Intel or AMD processor at the level of embedded systems; you'll cry in the end.
This is a discussion about DESKTOP processors, not embedded crap. It's like saying my Android tablet beats a Windows desktop – they're aimed at different users.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,995 (2.34/day)
Location
Louisiana
Processor Core i9-9900k
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax ETS-T50 Black CPU cooler
Memory 32GB (2x16) Mushkin Redline DDR-4 3200
Video Card(s) ASUS RTX 4070 Ti Super OC 16GB
Storage 1x 1TB MX500 (OS); 2x 6TB WD Black; 1x 2TB MX500; 1x 1TB BX500 SSD; 1x 6TB WD Blue storage (eSATA)
Display(s) Infievo 27" 165Hz @ 2560 x 1440
Case Fractal Design Define R4 Black -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic Focus GX-1000 Gold
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
I like how this greedy blue-green couple tries to rip you off for extra cash, just pretending to be "exclusive", higher quality, or better performance.
Hell, even so, they are so unconfident in all that advertising, torn between everything they're pushing, that it results in a complete mess about "what their product really is".
And you know what's the prettiest part of all this? THEY ARE NOT PREMIUM. They are not made from better materials than other stuff in the semiconductor market, they won't last longer, they don't have premium options. Hell, do Intel and ngreedia even know what PREMIUM means??? If they sold their CPUs in some Ferrari kind of shop, with a cup of coffee and a manager kissing your arse just to get you to buy their stuff – that I would call premium over AMD. Not just a silly ~10% performance gain.

Rich people pay more for better service or goods – and NEVER for the same stuff that's available to everyone, just to show they have more money.
All in all, PC gaming is a leisure activity – it shouldn't be considered a major part of your expenses. This price-hike tactic is simply false from the very beginning.
Lmao! Nice rant.

Now, stop acting shocked, like this is new. Intel has priced their premium processor at premium prices for nearly two decades. Factored for inflation, those $1,000 chips cost a lot more than this one does.
 