
Intel Core i9-10900K 10-core Processor and Z490 Chipset Arrive April 2020

Joined
Sep 15, 2011
Messages
6,722 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
No really, what would the difference be then between the K and the X series?!
Please don't say just the chipset and the number of PCIe lanes...
 

viandyka

New Member
Joined
Dec 11, 2019
Messages
2 (0.00/day)
AMD only needed three generations of Ryzen,

and Intel is still scrambling with four "NEW" products xD, with a fifth Skylake refresh coming soon.
 
Joined
Oct 2, 2015
Messages
3,135 (0.94/day)
Location
Argentina
System Name Ciel / Akane
Processor AMD Ryzen R5 5600X / Intel Core i3 12100F
Motherboard Asus Tuf Gaming B550 Plus / Biostar H610MHP
Cooling ID-Cooling 224-XT Basic / Stock
Memory 2x 16GB Kingston Fury 3600MHz / 2x 8GB Patriot 3200MHz
Video Card(s) Gainward Ghost RTX 3060 Ti / Dell GTX 1660 SUPER
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB / NVMe WD Blue SN550 512GB
Display(s) AOC Q27G3XMN / Samsung S22F350
Case Cougar MX410 Mesh-G / Generic
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W / Gigabyte P450B
Mouse EVGA X15 / Logitech G203
Keyboard VSG Alnilam / Dell
Software Windows 11
No really, what would the difference be then between the K and the X series?!
Please don't say just the chipset and the number of PCIe lanes...
Price.
 
Joined
Mar 16, 2017
Messages
2,101 (0.75/day)
Location
Tanagra
System Name Budget Box
Processor Xeon E5-2667v2
Motherboard ASUS P9X79 Pro
Cooling Some cheap tower cooler, I dunno
Memory 32GB 1866-DDR3 ECC
Video Card(s) XFX RX 5600XT
Storage WD NVME 1GB
Display(s) ASUS Pro Art 27"
Case Antec P7 Neo
No really, what would the difference be then between the K and the X series?!
Please don't say just the chipset and the number of PCIe lanes...
Doesn't the X series have quad-channel memory? They're based on the Xeon line, not the desktop line.
 
Joined
Jul 19, 2016
Messages
482 (0.16/day)
This is sad from Intel.

April 2020, and it'll be a couple of months away from going up against Ryzen 4000, which is two generations ahead in terms of process node and, well, even performance given the rumours.

- Fewer cores
- Lower IPC
- Much less efficient
- Slower multithreading
- Slower single core
- Loses in 720p gaming with 2080 Ti tests (likely, given the +10% IPC and +5% frequency rumours for the 4000 series).

But 5GHz all-core drawing 450W. Yay Intel!
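Quick napkin math on those rumours (the +10% IPC and +5% clock figures below are just the rumoured numbers quoted above, not measured results): the two gains compound multiplicatively, not additively.

# rough sketch of how the rumoured Ryzen 4000 gains would stack (illustrative only)
ipc_gain = 0.10        # rumoured +10% IPC
clock_gain = 0.05      # rumoured +5% frequency
combined = (1 + ipc_gain) * (1 + clock_gain) - 1
print(f"combined single-thread uplift: {combined:.1%}")   # ~15.5%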
 
Joined
Feb 20, 2019
Messages
8,280 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
This is sad from Intel.

April 2020, and it'll be a couple of months away from going up against Ryzen 4000, which is two generations ahead in terms of process node and, well, even performance given the rumours.

- Fewer cores
- Lower IPC
- Much less efficient
- Slower multithreading
- Slower single core
- Loses in 720p gaming with 2080 Ti tests (likely, given the +10% IPC and +5% frequency rumours for the 4000 series).

But 5GHz all-core drawing 450W. Yay Intel!
Don't forget that Skylake, rebranded for the 5th time, is still a proven high-risk architecture with a fundamentally exploitable design and new exploits popping up to haunt it faster than they're patched.

On the other hand, AMD's frequent architectural jumps mean that by the time hackers glean enough info about Zen 2, 3, or 4 to start exploiting it, AMD will have moved on to a newer architecture anyway.
 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
I dunno, once they drop so far, the Atom-based architecture is able to step in. I think a 2C/2T Core-based architecture is finally being eclipsed by Gemini Lake in most tasks. I had an Apollo Lake quad core that could handle a 4K stream. It’s a low bar, but we’re talking about really small dies and super cheap prices.
Actually it's the opposite. Consumer Atoms aren't developed anymore. Intel replaced them with the Celeron lineup (e.g. that's what you get in ~$300 laptops right now).

Server Atoms (the C-series) are still in production, but haven't been updated since 2018.

Ultimately the Atom lineup will be replaced by ARM.
 
Joined
Mar 16, 2017
Messages
2,101 (0.75/day)
Location
Tanagra
System Name Budget Box
Processor Xeon E5-2667v2
Motherboard ASUS P9X79 Pro
Cooling Some cheap tower cooler, I dunno
Memory 32GB 1866-DDR3 ECC
Video Card(s) XFX RX 5600XT
Storage WD NVME 1GB
Display(s) ASUS Pro Art 27"
Case Antec P7 Neo
Actually it's the opposite. Consumer Atoms aren't developed anymore. Intel replaced them with the Celeron lineup (e.g. that's what you get in ~$300 laptops right now).

Server Atoms (the C-series) are still in production, but haven't been updated since 2018.

Ultimately the Atom lineup will be replaced by ARM.
I'm not talking about Atom the brand, but the architecture. Today it is called Gemini Lake. Intel has been gradually adding IPC to the original architecture. I just can't recall the code name. Something that ends in "mont."
 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
I'm not talking about Atom the brand, but the architecture. Today it is called Gemini Lake. Intel has been gradually adding IPC to the original architecture. I just can't recall the code name. Something that ends in "mont."
Atom and the Celeron/Pentium J- and N-series were all based on the same architecture (Goldmont). Super small cores - up to 16C in a package smaller than LGA1151.
The consumer Atom lineup was dropped.

I'm not sure what will happen when Tremont arrives.
I've seen rumors that server chips (and everything with ECC) will be unified under the Xeon brand...
 
Joined
Nov 4, 2005
Messages
11,982 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
And still faster than AMD in most things (when comparing same-core-count CPUs), but especially in gaming :cool:
Wrong.

Intel is faster at out-of-order operations, due to AMD using a chiplet design.

When gaming at 1080p or above, AMD is neck and neck. At 720p or lower Intel wins, but who games at that resolution?

AMD is significantly faster at 80% of other actual work due to more cores and more cache.


Do you even read reviews?
 
Joined
Jul 19, 2016
Messages
482 (0.16/day)
And still faster than AMD in most things (when comparing same-core-count CPUs), but especially in gaming :cool:

3.8% faster (using TPU's own numbers after the chipset update) at 1080p gaming using a 2080 Ti, whilst consuming more power, being far less efficient, and losing in multithreaded tasks by way more than 3.8%, right? I'll take the Ryzen, thanks ;)
 
Joined
Mar 18, 2015
Messages
2,963 (0.84/day)
Location
Long Island
This is sad from Intel.

April 2020, and it'll be a couple of months away from going up against Ryzen 4000, which is two generations ahead in terms of process node and, well, even performance given the rumours.

- Fewer cores
- Lower IPC
- Much less efficient
- Slower multithreading
- Slower single core
- Loses in 720p gaming with 2080 Ti tests (likely, given the +10% IPC and +5% frequency rumours for the 4000 series).

Just picked a random post that centered on specs, so not addressing you directly, but the mindset that puts specs ahead of performance.

Rumors - not relevant
Cores - don't care
IPC - don't care
Efficiency - don't care
Multithreading - don't care
Single core - don't care
720p gaming - who plays @ 720p... and "likely" has no place in real-world discussions.

All that is relevant is how fast a CPU runs the apps they use and the games they play. And I won't pay any attention to any new release till I see it tested here... and then only in the apps I actually use and games I actually play... chest beating and benchmark scores are meaningless. Fanbois beating their chests about their favorite chip being faster at tasks they never or rarely do isn't meaningful either. When I looked at the 9900KF versus 3900X test results here on TPU, here's what I see...

3900X kicks tail in rendering, which would be relevant if my user did rendering
3900X kicks tail in game and software development, which would be relevant if my user did those things
3900X shares wins in browser performance, but the differences are too small to observe anyway
3900X kicks tail in scientific applications, which would be relevant if my user ran those
3900X shares wins in office apps, but the differences (0.05 seconds) are too small to affect user experience
3900X loses in Photoshop, and by 10 seconds... finally something that matters to my user
Skipping a few more things 99% of us don't ever do
File compression / media encoding... also not on the list
Encoding... my user does an occasional MP3 encode, and the 3900X trailing by 12 seconds might be significant if it were more than an occasional thing
3900X loses in overall 720p game performance by 7%... as he plays at 1440p, it's entirely irrelevant
3900X loses in overall 1440p game performance by 2.5%... not a big deal, but 2.5% is a bigger gap than most of what we have seen so far
3900X loses all the power-consumption comparisons... by 29 watts in gaming
3900X runs 22°C hotter
3900X doesn't OC as well
3900X is more expensive

AMD did good w/ the 3900X... but despite the differences in cores, IPC, die size, whatever... the only thing that matters is performance. There are many things that the 3900X does better than the 9900KF, but most users aren't doing those things. You can look at a football player and how much he can bench or how fast he can run the 40... but none of those things determine his value to the team, his contribution to the score, or how much he gets paid.
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.55/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x256GB SSD 2x2TB|256GB SATA|512GB NVMe
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
Those chiplets are what kicked Intel's ass.
In the reversed situation, AMD offered the Phenom I, the TRUE quad core, vs. the Q6600, two C2Ds glued together.
Guess what: we don't care what's true and what's not, it's all about performance, and this time around AMD are the ones who made those chiplets work.
And for Intel it's not just about going to 10/7nm, because I bet that they won't be able to hit those 5 GHz on those nodes for years to come.
So for them it will be a step back at first, but I'm sure they will bounce back eventually, like they did with the C2Duo, by getting "inspired" by AMD's architecture.
And guys, don't forget ARM is coming, the 8CX....
 
Joined
Aug 21, 2013
Messages
1,898 (0.46/day)
Just picked a random post that centered on specs, so not addressing you directly, but the mindset that puts specs ahead of performance.

Rumors - not relevant
Cores - don't care
IPC - don't care
Efficiency - don't care
Multithreading - don't care
Single core - don't care
720p gaming - who plays @ 720p... and "likely" has no place in real-world discussions.

All that is relevant is how fast a CPU runs the apps they use and the games they play. And I won't pay any attention to any new release till I see it tested here... and then only in the apps I actually use and games I actually play... chest beating and benchmark scores are meaningless. Fanbois beating their chests about their favorite chip being faster at tasks they never or rarely do isn't meaningful either. When I looked at the 9900KF versus 3900X test results here on TPU, here's what I see...

3900X kicks tail in rendering, which would be relevant if my user did rendering
3900X kicks tail in game and software development, which would be relevant if my user did those things
3900X shares wins in browser performance, but the differences are too small to observe anyway
3900X kicks tail in scientific applications, which would be relevant if my user ran those
3900X shares wins in office apps, but the differences (0.05 seconds) are too small to affect user experience
3900X loses in Photoshop, and by 10 seconds... finally something that matters to my user
Skipping a few more things 99% of us don't ever do
File compression / media encoding... also not on the list
Encoding... my user does an occasional MP3 encode, and the 3900X trailing by 12 seconds might be significant if it were more than an occasional thing
3900X loses in overall 720p game performance by 7%... as he plays at 1440p, it's entirely irrelevant
3900X loses in overall 1440p game performance by 2.5%... not a big deal, but 2.5% is a bigger gap than most of what we have seen so far
3900X loses all the power-consumption comparisons... by 29 watts in gaming
3900X runs 22°C hotter
3900X doesn't OC as well
3900X is more expensive

AMD did good w/ the 3900X... but despite the differences in cores, IPC, die size, whatever... the only thing that matters is performance. There are many things that the 3900X does better than the 9900KF, but most users aren't doing those things. You can look at a football player and how much he can bench or how fast he can run the 40... but none of those things determine his value to the team, his contribution to the score, or how much he gets paid.
Let's put it this way: would you buy a CPU that does one thing really well (gaming at lower resolutions), or would you buy a CPU that does gaming <10% worse but everything else 10-50% better than the other CPU?

The 3900X is more well-rounded for all tasks, not just one. And more secure. The difference in price is rather small; the 9900 variants should cost no more than the 3700X, not what they are now.
Heat, I would say, is more of an issue for Intel due to higher power consumption. The 3900X does not need to OC well because out of the box it already boosts to its highest speed, and with higher IPC it can afford to run at lower clocks. People need to let go of the 5GHz-or-bust mentality. Remember, Bulldozer also hit 5GHz. I hope no one is missing that.
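Rough sketch of the "higher IPC can afford lower clocks" point (the 8% IPC figure below is a made-up example, not a measured number):

# clock needed to match a 5.0 GHz part when you have an IPC advantage (illustrative)
rival_clock_ghz = 5.0
ipc_ratio = 1.08            # hypothetical: this chip does 8% more work per clock
required_clock_ghz = rival_clock_ghz / ipc_ratio
print(f"clock needed to match: {required_clock_ghz:.2f} GHz")   # ~4.63 GHz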
 
Joined
Jun 28, 2016
Messages
3,595 (1.17/day)
Let's put it this way: would you buy a CPU that does one thing really well (gaming at lower resolutions), or would you buy a CPU that does gaming <10% worse but everything else 10-50% better than the other CPU?
If I were buying a CPU for gaming, i.e. if gaming was the only task where I really wanted my PC to shine?
Of course I would buy the CPU that wins in games - even if it gives just a few fps more. It doesn't matter if the other CPU can render 3D 50% faster. Why would it? Am I really building a PC for gaming, or am I more into screenshots from Cinebench?
This is exactly the problem with many people here. They buy the wrong CPU.

Let's say you're buying a GPU for gaming and you play just 3 games that TPU - luckily - tests in their reviews.
Would you buy the GPU that wins in those 3 games, or the one with the better average over 20 titles?

I think this is also why some people didn't understand the low popularity of the first EPYC CPUs. They had good value and looked very competitive in many benchmarks.
But they fell significantly behind in databases. So the typical comment on gaming forums was: but it's just one task. Yeah, but it's the task that 95% of real-life systems are bought for (or at least limited by).
The 3900X is more well-rounded for all tasks, not just one.
Whenever I see an argument like this one, I ask the same question.
I assume you game, right?
What are the 3 other tasks that you frequently do on your PC? But only those where you're really limited by performance.

Because having 56 vs 60 fps in games may not seem a lot, but it's a real effect that could affect your comfort.

Most people can't name a single thing.
Some say crap like: they encode videos. A lot? Nah, a few times a year, just a few GB.
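For the 56 vs 60 fps point above, a quick way to see what that gap means in frame time (just illustrative arithmetic):

# frame-time difference between 56 and 60 fps (illustrative)
fps_slow, fps_fast = 56, 60
ft_slow = 1000 / fps_slow    # ~17.9 ms per frame
ft_fast = 1000 / fps_fast    # ~16.7 ms per frame
print(f"delta: {ft_slow - ft_fast:.2f} ms per frame ({fps_fast / fps_slow - 1:.1%} faster)")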
 
Joined
Oct 2, 2015
Messages
3,135 (0.94/day)
Location
Argentina
System Name Ciel / Akane
Processor AMD Ryzen R5 5600X / Intel Core i3 12100F
Motherboard Asus Tuf Gaming B550 Plus / Biostar H610MHP
Cooling ID-Cooling 224-XT Basic / Stock
Memory 2x 16GB Kingston Fury 3600MHz / 2x 8GB Patriot 3200MHz
Video Card(s) Gainward Ghost RTX 3060 Ti / Dell GTX 1660 SUPER
Storage NVMe Kingston KC3000 2TB + NVMe Toshiba KBG40ZNT256G + HDD WD 4TB / NVMe WD Blue SN550 512GB
Display(s) AOC Q27G3XMN / Samsung S22F350
Case Cougar MX410 Mesh-G / Generic
Audio Device(s) Kingston HyperX Cloud Stinger Core 7.1 Wireless PC
Power Supply Aerocool KCAS-500W / Gigabyte P450B
Mouse EVGA X15 / Logitech G203
Keyboard VSG Alnilam / Dell
Software Windows 11
Long live the i3 9350 then; why bother with anything else?
I game, in emulators too, and Skylake is starting to become useless even there.
 
Joined
Aug 21, 2013
Messages
1,898 (0.46/day)
If I were buying a CPU for gaming, i.e. if gaming was the only task where I really wanted my PC to shine?
Of course I would buy the CPU that wins in games - even if it gives just a few fps more. It doesn't matter if the other CPU can render 3D 50% faster. Why would it? Am I really building a PC for gaming, or am I more into screenshots from Cinebench?
"PC for gaming" statement is funny to me every time. Like really. You never open a browser? you never unpack anything game related (like mods), you never run any other programs on this PC? Obviously you do unless you run cracked games that do not require launchers.

And this gap between Intel and AMD is greatly exaggerated in reviews due to low resolutions and using the fastest GPU around. I bet most of these people buying i7 or i9 for "only gaming" also do bunch of other stuff (even if lightly threaded) and do no ever notice the miniscule performance difference with a naked eye vs AMD. This is not a FX vs Sandy Bridge or Ryzen 1xxx vs 7700K situation any more where you can easily tell the difference. Since AMD is so close in performance and much better in nearly everything else people are buying AMD 9 to 1 compared to Intel.

Plus there is the matter of priorities. For a gaming the GPU is always #1. A person with a cheaper R7 3700X and a RTX 2080S will always achieve better performance than the next guy with i9 9900KS with a RTX 2070S. The only case where getting the i9 for gaming makes any sense is when money is not a problem and the person already owns a 2080 Ti.

Let's say you're buying a GPU for gaming and you play just 3 games that TPU - luckily - tests in their reviews.
Would you buy the GPU that wins in those 3 games, or the one with the better average over 20 titles?
The one that gets the better average. Obviously. I won't be playing those 3 games forever. Case in point: AMD cards that are really fast in some games like Dirt Rally 4 or Forza Horizon. These are outlier results. I can't and won't base my purchase on one-off results that are not representative of overall performance.

Whenever I see an argument like this one, I ask the same question.
I assume you game, right?
What are the 3 other tasks that you frequently do on your PC? But only those where you're really limited by performance.
Because having 56 vs 60 fps in games may not seem a lot, but it's a real effect that could affect your comfort.

Most people can't name a single thing.
Some say crap like: they encode videos. A lot? Nah, a few times a year, just a few GB.
Isn't everything performance-limited?
I would say the web browser is very much performance-limited. I noticed a massive speed boost when launching Firefox after upgrading from a 2500K to a 3800X. Going to a 3900X or 3950X, I would be able to give Firefox even more threads to work with.
Also, I feel like I'm I/O-limited and need a faster PCIe 4.0 NVMe SSD to replace my SATA SSD. Just waiting on Samsung to announce their client drives next year. Current Phison-controller-based drives are just a stopgap and not very compelling.
Also, network speed is becoming a major bottleneck for me. What can I say - VDSL 20/5 just does not cut it for me. Ideally I would upgrade to symmetrical 300/300 or 500/500 speeds. 1G/1G sounds nice, but then the web itself would become the bottleneck.
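To put the VDSL 20/5 complaint in perspective, here's a back-of-the-envelope on download times (the 50 GB game size is just an example figure, not from the post):

# time to pull a 50 GB game at various downlink speeds (illustrative)
size_gb = 50
for mbps in (20, 300, 500, 1000):
    seconds = size_gb * 8 * 1000 / mbps    # GB -> megabits -> seconds
    print(f"{mbps:>4} Mbit/s: {seconds / 3600:.1f} h")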
 
Joined
Oct 27, 2009
Messages
1,182 (0.21/day)
Location
Republic of Texas
System Name [H]arbringer
Processor 4x 61XX ES @3.5Ghz (48cores)
Motherboard SM GL
Cooling 3x xspc rx360, rx240, 4x DT G34 snipers, D5 pump.
Memory 16x gskill DDR3 1600 cas6 2gb
Video Card(s) blah bigadv folder no gfx needed
Storage 32GB Sammy SSD
Display(s) headless
Case Xigmatek Elysium (whats left of it)
Audio Device(s) yawn
Power Supply Antec 1200w HCP
Software Ubuntu 10.10
Benchmark Scores http://valid.canardpc.com/show_oc.php?id=1780855 http://www.hwbot.org/submission/2158678 http://ww
Doesn't the X series have quad-channel memory? They're based on the Xeon line, not the desktop line.
lol, Xeons are not some special silicon... and they span the whole lineup.
There are Xeon-Ws that parallel the HEDT LGA2066 platform, yes.
There are also Xeon E's on socket 1151.
So on the desktop platform Intel has whatever-lake with 2-channel memory,
HEDT refresh-lake with 4-channel,
LGA3647 stillnotfrozen-lake with 6-channel,
and the BGA abomination Glued-lake with 12-channel.
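Rough sketch of why those channel counts matter, at least for theoretical peak bandwidth (DDR4-2933 and a 64-bit channel assumed purely as example figures):

# theoretical peak bandwidth = channels * transfer rate * 8 bytes per transfer (illustrative)
mt_per_s = 2933            # example: DDR4-2933
bytes_per_transfer = 8     # 64-bit wide channel
for channels in (2, 4, 6, 12):
    gb_per_s = channels * mt_per_s * bytes_per_transfer / 1000
    print(f"{channels:>2} channels: ~{gb_per_s:.0f} GB/s peak")
# real-world scaling is much smaller than these peak numbers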
 
Joined
Oct 14, 2017
Messages
210 (0.08/day)
System Name Lightning
Processor 4790K
Motherboard asrock z87 extreme 3
Cooling hwlabs black ice 20 fpi radiator, cpu mosfet blocks, MCW60 cpu block, full cover on 780Ti's
Memory Corsair Dominator Platinum 2400C10, 32GB, DDR3
Video Card(s) 2x780Ti
Storage Intel S3700 400GB, Samsung 850 Pro 120GB, a cheap Intel MLC 120GB, and another even cheaper 120GB
Display(s) eizo foris fg2421
Case 700D
Audio Device(s) ESI Juli@
Power Supply seasonic platinum 1000
Mouse mx518
Software Lightning v2.0a
That is a lot of channels, but do you really need so many?
I've seen many tests done on memory channels, and from 2 to 4 there is a difference in synthetics, but real applications don't really increase in speed.
I think if Intel had unganged mode, more channels could be an improvement.
 
Joined
Feb 18, 2009
Messages
387 (0.07/day)
Processor i7 8700K
Motherboard MSI Z370 Gaming Plus
Cooling Noctua NH-D15S + NF-A12x25 PWM + 4xNF-A14 PWM
Memory 16 GB Adata XPG Dazzle DDR4 3000 MHz CL16
Video Card(s) Gigabyte GTX 1070 Ti Gaming 8G
Storage Samsung 970 EVO Plus, Samsung 850 Evo
Display(s) Samsung C24FG73 144Hz 1080p
Case Fractal Design Meshify C
Audio Device(s) Steelseries Arctis 3
Power Supply Superflower Leadex II Gold 650W
Mouse Steelseries Rival 600
Keyboard Steelseries Apex 7
Software Windows 11 Pro
"Modern Standby" is not new, it was already available in (some?) Z390 boards.

Other than that, big yawn. More 14nm. I assume more 200-300W power consumption with AVX, and probably impossible to fully stresstest your OCs unless you hit the super golden sample stuff. It's kind of hilarious to see so many people on the web complaining about how their 9900K throttles at 5GHz MCE/OC if they dare to try Prime95 or LinpackX.

The sad part is that I saved money even for what I assumed to be a 8/16 9700K. When I saw it's now an i9, I kept the cash. Looks like I'll keep the cash even longer, which is fine, until we get a CPU that you can play with and tweak without the caveats of the Ryzen platform (like lower clocks to go with undervolting, no OC space) or those of Intel's (super high power consumption to the point where it's impossible to cool the beast and stresstest OCs properly).
 

Give.me.lanes

New Member
Joined
Dec 16, 2019
Messages
1 (0.00/day)
If there are fewer lanes on the 10900K, I'm just going to stick with the 10900X and overclock that. I don't understand how everyone is jumping on AMD's dick at the loss of PCIe lanes; I'm pretty sure you aren't getting a chip this big unless you are running NVLink @ 4K or using it as a WORKSTATION. Or do people not know how to build PCs anymore?

And c'mon, my Sandy at 5GHz still rocks. Don't bash Sandy :D
 