
Intel Launches the sub-$100 Core i3-10100F Quad-Core Processor

Joined
Jun 1, 2011
Messages
4,559 (0.93/day)
Location
in a van down by the river
Processor faster at instructions than yours
Motherboard more nurturing than yours
Cooling frostier than yours
Memory superior scheduling & haphazardly entry than yours
Video Card(s) better rasterization than yours
Storage more ample than yours
Display(s) increased pixels than yours
Case fancier than yours
Audio Device(s) further audible than yours
Power Supply additional amps x volts than yours
Mouse without as much gnawing as yours
Keyboard less clicky than yours
VR HMD not as odd looking as yours
Software extra mushier than yours
Benchmark Scores up yours
I'd agree that you can still get away with a quad core for gaming, if you already have one (hopefully it's at least 4c/8t)... but I don't think it's a great idea to build a new desktop today for gaming with only a quad core.

It's too bad it's still a locked chip, though. It might be a better performer at a higher clockspeed. I don't think you would need a super expensive cooler to cool a modern quad core at high speeds.
Are there any options at 4c/4t anymore? Intel's Pentium is still 2c/4t and Celeron is 2c. I don't recall any Zen 2 desktop CPUs being just 4c. I think you would need to either get a laptop or find an old/used CPU to build a 4c gaming system.
 

hat

Enthusiast
Joined
Nov 20, 2006
Messages
21,745 (3.31/day)
Location
Ohio
System Name Starlifter :: Dragonfly
Processor i7 2600k 4.4GHz :: i5 10400
Motherboard ASUS P8P67 Pro :: ASUS Prime H570-Plus
Cooling Cryorig M9 :: Stock
Memory 4x4GB DDR3 2133 :: 2x8GB DDR4 2400
Video Card(s) PNY GTX1070 :: Integrated UHD 630
Storage Crucial MX500 1TB, 2x1TB Seagate RAID 0 :: Mushkin Enhanced 60GB SSD, 3x4TB Seagate HDD RAID5
Display(s) Onn 165hz 1080p :: Acer 1080p
Case Antec SOHO 1030B :: Old White Full Tower
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro - Bose Companion 2 Series III :: None
Power Supply FSP Hydro GE 550w :: EVGA Supernova 550
Software Windows 10 Pro - Plex Server on Dragonfly
Benchmark Scores >9000
Are there any options at 4c/4t anymore? Intel's Pentium is still 2c/4t and Celeron is 2c. I don't recall any Zen 2 desktop CPUs being just 4c. I think you would need to either get a laptop or find an old/used CPU to build a 4c gaming system.
I'm not sure if there are any 4c/4t chips in the current generation, but my comment was about using an older quad-core CPU that you already have for gaming today, like my 2600k or Fourstaff's 3570k. There have been some discussions recently suggesting that those with 4c/8t CPUs now see better performance than those with 4c/4t CPUs in modern titles.

So, if you already have a quad core chip that works for you for modern gaming, cool, but I wouldn't recommend anybody build a new desktop for gaming in 2020 with a mere quad core, even if it does have 8 threads.
 
Joined
Oct 30, 2013
Messages
45 (0.01/day)
Plenty of budget gamers will buy this CPU without an iGPU; that's the point. It is a budget gaming CPU that does really well at gaming. This CPU, 16GB of RAM, and a 3060 or 3050 would make a very good budget gaming rig, and the 10100F isn't going to hold either GPU back.

OK, this makes sense if the CPU is really good at gaming, but is it really?
 
Joined
Oct 10, 2018
Messages
943 (0.42/day)
Both the current i3 and R3 options are great 1080p gaming solutions. If you have more budget, of course go for higher core counts, but this won't disappoint.
 
Joined
Jan 4, 2016
Messages
47 (0.01/day)
If I were building a gaming PC for my kids on a strict budget, this might be good. Pair it with a budget dedicated graphics card, like the 2060 or the (presumably upcoming) 3060, for sub-$200 and you'd have a good 1080p gaming desktop for esports and similarly light games.

I would also use it for an HTPC to stream games from my main PC to my living room TV. But even then, I would honestly either go with the chip that has integrated graphics and skip the dedicated graphics card, or (better yet, IMO) go with the newer AMD APUs, which have better integrated graphics and would presumably work better for that purpose.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.11/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard ASRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
I'm saying: how many people will buy the 10100F with the same Z-series board as TPU used & DDR4-3200 memory? Is that clear now? Without the high-speed memory & the higher-specced motherboard, which Intel gimps intentionally, the delta is going to be higher. Now tell me, is the 10100F also a drop-in replacement on, say, an H110 board? So you're telling me someone, hopefully informed enough, picks a new Intel board, new RAM, new CPU & has a sub-$300 budget for gaming (not counting the dGPU) & still goes Intel ~ because of reasons?

Other than overclocking, the motherboard isn't going to make any real difference in performance. And since we're talking about chips that can't be overclocked anyway, there isn't any real difference.

As for the 3200MHz RAM: RAM speed doesn't make as much difference on Intel as it does on AMD, but 3200MHz RAM is cheap these days. It's usually only a few bucks more than 2666.

Why wouldn't they go Intel? As we've pointed out, the 10100F is a very capable gaming CPU. The closest AMD alternative is the R3 3100, which is slower than the 10100F and costs $120. Why would any informed person go AMD in that situation? Their budget boards aren't exactly leaps and bounds better than Intel's. The R3 needs faster RAM to perform well; the Intel doesn't. Why would anyone spend $30 more on the slower AMD processor, other than fanboyism?
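For what it's worth, the price gap alone is easy to put numbers on. A quick sketch using only the list prices quoted in this thread (assumed figures; street prices vary):

```python
# Back-of-envelope price math using the list prices quoted in this thread
# (assumed figures, not live retail data).
intel_price = 95   # i3-10100F launch price cited later in the thread
amd_price = 120    # R3 3100 street price cited above

delta = amd_price - intel_price
print(f"Absolute difference: ${delta}")                # $25
print(f"Relative premium: {delta / intel_price:.0%}")  # ~26% more for the slower chip
```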

OK, this makes sense if the CPU is really good at gaming, but is it really?

Yes, we've gone over this.
 
Joined
Feb 20, 2019
Messages
8,192 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
I'm not sure if there are any 4c/4t chips in the current generation, but my comment was about using an older quad-core CPU that you already have for gaming today, like my 2600k or Fourstaff's 3570k. There have been some discussions recently suggesting that those with 4c/8t CPUs now see better performance than those with 4c/4t CPUs in modern titles.

So, if you already have a quad core chip that works for you for modern gaming, cool, but I wouldn't recommend anybody build a new desktop for gaming in 2020 with a mere quad core, even if it does have 8 threads.
This is why I said quad cores are obsolete for gaming. Whether it's 4C/4T or 4C/8T doesn't make a huge difference; they both struggle to hit consistent 60fps minimum framerates in plenty of modern games, regardless of what graphics card or graphics settings you are using.

There are two scenarios for the 10100F, as I see it:
  1. If you are buying an all-new platform, buying a quad-core is almost pointless because the old platform you're replacing is probably already a quad core.
  2. If you're not buying an all-new platform, then the 10100F is irrelevant because it won't fit in any old motherboard with its new socket requirement.
By the time you've forked out $250 on a gaming GPU and $200 on a motherboard and RAM, you're $450 into a gaming investment, so why cripple it with a puny quad core that has only half the potency of the consoles that are going to dominate the next decade of games development?
 
Joined
Nov 15, 2005
Messages
1,011 (0.15/day)
Processor 2500K @ 4.5GHz 1.28V
Motherboard ASUS P8P67 Deluxe
Cooling Corsair A70
Memory 8GB (2x4GB) Corsair Vengeance 1600 9-9-9-24 1T
Video Card(s) eVGA GTX 470
Storage Crucial m4 128GB + Seagate RAID 1 (1TB x 2)
Display(s) Dell 22" 1680x1050 nothing special
Case Antec 300
Audio Device(s) Onboard
Power Supply PC Power & Cooling 750W
Software Windows 7 64bit Pro
The i3-10100 has been on sale at Microcenter for $99.99 for quite some time now. The i3-9100F has been on sale for $69.99. Maybe this will drive the 9100F down even more, and this will slot in somewhere between. And I know Microcenter is not representative of prices as a whole, but one can hope.
 
Joined
Jul 16, 2014
Messages
8,195 (2.18/day)
Location
SE Michigan
System Name Dumbass
Processor AMD Ryzen 7800X3D
Motherboard ASUS TUF gaming B650
Cooling Arctic Liquid Freezer 2 - 420mm
Memory G.Skill Sniper 32gb DDR5 6000
Video Card(s) GreenTeam 4070 ti super 16gb
Storage Samsung EVO 500gb & 1Tb, 2tb HDD, 500gb WD Black
Display(s) 1x Nixeus NX_EDG27, 2x Dell S2440L (16:9)
Case Phanteks Enthoo Primo w/8 140mm SP Fans
Audio Device(s) onboard (realtek?) - SPKRS:Logitech Z623 200w 2.1
Power Supply Corsair HX1000i
Mouse SteelSeries Esports Wireless
Keyboard Corsair K100
Software windows 10 H
Benchmark Scores https://i.imgur.com/aoz3vWY.jpg?2
Lowspecgamer would love this CPU.
That's what, a step below entry level? :rolleyes:

We've been hearing this for what, 5+ years now? Since at least the FX days.
Have you done any checking? I know most of the new games I play run on more than one core, simulated threads or not. A small number of older games were adapted to multi-core (Rift, which I once played, is one of them). Even a game designed specifically for Intel CPUs won't be single-core these days; I doubt any AAA or bigger game will commit to only one core anymore, even for Intel CPUs.

But to get back on topic: as I said already, this is sub-entry level. It will be coupled with cards like the 1030 or even (what's the worst AMD card?). I can only wait for a review before speculating on how badly this will do in gaming performance.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.11/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard ASRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Have you done any checking? I know most of the new games I play run on more than one core, simulated threads or not. A small number of older games were adapted to multi-core (Rift, which I once played, is one of them). Even a game designed specifically for Intel CPUs won't be single-core these days; I doubt any AAA or bigger game will commit to only one core anymore, even for Intel CPUs.

Sure, single-core performance isn't as important anymore, but games still don't scale across many cores that well. That's why more than 4 cores doesn't help much and clock speed/IPC is still king.
 
Joined
Aug 4, 2020
Messages
1,608 (1.03/day)
Location
::1
[ ... ]

As for the 3200MHz RAM: RAM speed doesn't make as much difference on Intel as it does on AMD, but 3200MHz RAM is cheap these days. It's usually only a few bucks more than 2666.

[ ... ]


Come again?
 
Joined
Sep 20, 2018
Messages
1,451 (0.65/day)
Whether it's 4C/4T or 4C/8T doesn't make a huge difference; they both struggle to hit consistent 60fps minimum framerates in plenty of modern games, regardless of what graphics card or graphics settings you are using
What are you even talking about? lol. The 10100F is very capable; it can drive a GTX 1660 Super and game above 60fps at 1080p easily.

Some folks are just: if it's not a top-end model, then why bother?

:shadedshu:



Notice how the i3 can drive a GTX 1660 Super fully with about half the CPU utilization, except in AC Odyssey, which is just a garbage port
 
Joined
Feb 20, 2019
Messages
8,192 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
What are you even talking about? lol. The 10100F is very capable; it can drive a GTX 1660 Super and game above 60fps at 1080p easily.

Some folks are just: if it's not a top-end model, then why bother?

:shadedshu:



Notice how the i3 can drive a GTX 1660 Super fully with about half the CPU utilization, except in AC Odyssey, which is just a garbage port

  • Warzone - runs fine; it stays above 60fps most of the time, but those minimums are stutters that a better CPU wouldn't have.
  • RDRII - if you want to run at 35fps because of crippling GPU limitations, then yes - any potato CPU will do the job.
  • Running Fortnite with lows of 35fps during any action is an abysmal result that affects your ability to aim properly. Even when not busy, it looks stuttery.
  • Forza's fine. Most racing games are exceptionally easy on CPUs.
  • BFV single player is easy on the CPU. Multiplayer is where you'll really find problems with quad cores. I haven't done much BFV MP, but BF1 MP was terrible on a quad core.
  • AC:Odyssey is an abysmal port, agreed - but like HZD, 8 actual cores is the answer here to get around the original engine's focus on 8 equal threads.
  • Metro is GPU-bound; like RDRII, it's pointless to say "the i3 is fine" when it's stuttering along at 3-14 fps due to background streaming issues.
  • SW:FO is dropping frames quite significantly at the (non-cutscene) start of that clip. Hard to say what's at fault here.
Given that the default for a non-gaming monitor is actually 75Hz these days, and 144Hz panels are cheap, it's not really a "gaming CPU" unless it can run at >75fps. Even my TV is 120Hz, and I'm a filthy casual now. My old, retired 3770K could likely have done an equally mediocre, 'okay, I guess' job of running the games above, and I can buy one of those for $25 on Craigslist or eBay.

Proving that the i3 can mostly handle 30-80fps when there's a GPU bottleneck doesn't really cut it; that's why CPU reviews test at 720p.
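To make the 720p point concrete: delivered framerate is roughly the minimum of what the CPU and GPU can each sustain, and dropping the resolution raises the GPU ceiling until the CPU limit shows. A toy model of that reasoning (the fps numbers are made-up placeholders, not benchmarks):

```python
# Toy bottleneck model: delivered fps ~= min(CPU ceiling, GPU ceiling).
# All numbers below are illustrative placeholders, not measurements.
cpu_fps_ceiling = 90  # hypothetical max fps the CPU can prepare frames at

# Hypothetical GPU ceilings at two resolutions, same settings
gpu_fps_ceiling = {"1080p": 75, "720p": 160}

for res, gpu_fps in gpu_fps_ceiling.items():
    delivered = min(cpu_fps_ceiling, gpu_fps)
    limiter = "GPU" if gpu_fps < cpu_fps_ceiling else "CPU"
    print(f"{res}: {delivered} fps ({limiter}-bound)")
# 1080p: the GPU ceiling hides the CPU's limit.
# 720p: the GPU ceiling lifts, exposing the CPU's true ceiling.
```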
This article is 18 months out of date, but it's the first hit on Google, and it's relevant because it keeps the architecture consistent between tested models and focuses purely on how core and thread counts affect gaming:
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.11/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Come again?

A few things. First of all, 6.5% is not a big difference; I'd argue it's not even noticeable. Second, go look at the test setup: not only was the RAM running at 2666, its timings were gimped too. It went from 3200 14-14-14-34 to 2666 16-16-16-36. How often do timings get worse with slower speeds?
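To put rough numbers on that combination: first-word latency is CAS cycles times cycle time, and DDR transfers twice per clock, so latency in ns is about 2000 × CAS / MT/s. A quick sketch of the two kits (standard spec-sheet arithmetic, not data from the review):

```python
# First-word latency (ns) ~= 2000 * CAS / transfer_rate_in_MTs,
# since DDR cycle time is 2000/MT/s ns (two transfers per clock).
def first_word_latency_ns(cas: int, mts: int) -> float:
    return 2000 * cas / mts

fast = first_word_latency_ns(14, 3200)  # 8.75 ns
slow = first_word_latency_ns(16, 2666)  # ~12.00 ns
print(f"3200 CL14: {fast:.2f} ns")
print(f"2666 CL16: {slow:.2f} ns (~{slow / fast - 1:.0%} higher latency)")
```

So the downgraded kit carries roughly a third higher first-word latency on top of the lower bandwidth.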

And on top of all that, you missed the entire point: even with the slower RAM, the significantly cheaper Intel chip is still faster than the R3 3100, and the R3 3100 would take a bigger hit from slower RAM than the Intel chip. I didn't say faster RAM made no difference with Intel; I said it doesn't make a lot of difference.
 
Joined
Sep 20, 2018
Messages
1,451 (0.65/day)
  • Warzone - runs fine; it stays above 60fps most of the time, but those minimums are stutters that a better CPU wouldn't have.
  • RDRII - if you want to run at 35fps because of crippling GPU limitations, then yes - any potato CPU will do the job.
  • Running Fortnite with lows of 35fps during any action is an abysmal result that affects your ability to aim properly. Even when not busy, it looks stuttery.
  • Forza's fine. Most racing games are exceptionally easy on CPUs.
  • BFV single player is easy on the CPU. Multiplayer is where you'll really find problems with quad cores. I haven't done much BFV MP, but BF1 MP was terrible on a quad core.
  • AC:Odyssey is an abysmal port, agreed - but like HZD, 8 actual cores is the answer here to get around the original engine's focus on 8 equal threads.
  • Metro is GPU-bound; like RDRII, it's pointless to say "the i3 is fine" when it's stuttering along at 3-14 fps due to background streaming issues.
  • SW:FO is dropping frames quite significantly at the (non-cutscene) start of that clip. Hard to say what's at fault here.
Given that the default for a non-gaming monitor is actually 75Hz these days, and 144Hz panels are cheap, it's not really a "gaming CPU" unless it can run at >75fps. Even my TV is 120Hz, and I'm a filthy casual now. My old, retired 3770K could likely have done an equally mediocre, 'okay, I guess' job of running the games above, and I can buy one of those for $25 on Craigslist or eBay.

Proving that the i3 can mostly handle 30-80fps when there's a GPU bottleneck doesn't really cut it; that's why CPU reviews test at 720p.
This article is 18 months out of date, but it's the first hit on Google, and it's relevant because it keeps the architecture consistent between tested models and focuses purely on how core and thread counts affect gaming:
It all boils down to this: if someone is planning to buy a $100 CPU, they're probably going to pair it with a $150-250 GPU (GTX 1650 Super to 1660 Super), and the i3-10100 can power those GPUs with plenty of juice to spare. In the video, the games were tested at max graphics and most of them already run at 70-120fps. Stop pretending minimum fps means everything; we all know those can be misleading values captured while a game is loading or transitioning between areas. It's the real-time fps of the gameplay we should focus on, and it's mostly satisfactory for 1080p/60fps gamers; some games even reach above 100fps here and there. Drop the graphical fidelity by one notch and you easily get performance beyond the 75Hz mark.
 
Joined
Jul 16, 2014
Messages
8,195 (2.18/day)
Location
SE Michigan
System Name Dumbass
Processor AMD Ryzen 7800X3D
Motherboard ASUS TUF gaming B650
Cooling Arctic Liquid Freezer 2 - 420mm
Memory G.Skill Sniper 32gb DDR5 6000
Video Card(s) GreenTeam 4070 ti super 16gb
Storage Samsung EVO 500gb & 1Tb, 2tb HDD, 500gb WD Black
Display(s) 1x Nixeus NX_EDG27, 2x Dell S2440L (16:9)
Case Phanteks Enthoo Primo w/8 140mm SP Fans
Audio Device(s) onboard (realtek?) - SPKRS:Logitech Z623 200w 2.1
Power Supply Corsair HX1000i
Mouse SteelSeries Esports Wireless
Keyboard Corsair K100
Software windows 10 H
Benchmark Scores https://i.imgur.com/aoz3vWY.jpg?2
I didn't say faster RAM made no difference with Intel; I said it doesn't make a lot of difference.
Not many will understand the distinction, but just a guess: you're talking single-digit percentage differences with Intel?
 
Joined
Feb 20, 2019
Messages
8,192 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
It all boils down to this: if someone is planning to buy a $100 CPU.
You keep talking about a $100 CPU. This isn't just a $100 CPU, since there's no old motherboard you can upgrade with this chip. It's a $250 investment done with the cheap and nasty stuff, more like $300+ to do it properly.

If you're so broke that you can't afford a better CPU but somehow had the cash for a whole new S1200 platform, then you're doing it wrong. The 10400F is 400MHz faster and has 50% more cores and threads; that raises the total platform cost by maybe $35, which is 10-15%.
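Sanity-checking that percentage against the ballpark platform costs mentioned above (same assumed figures, not current retail):

```python
# Back-of-envelope platform math with the ballpark figures from this post
# (assumed prices, not current retail).
upgrade_premium = 35                 # rough 10100F -> 10400F price delta
for platform_cost in (250, 300):     # "cheap and nasty" vs "done properly"
    pct = upgrade_premium / platform_cost
    print(f"${platform_cost} platform: +${upgrade_premium} is {pct:.0%} more")
# -> 14% and 12%, i.e. within the quoted 10-15% range
```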

Realistically, if people are super short of cash then the 9100F makes WAAAAAAAY more sense than this dead-end S1200 platform. DDR5 is going to require a motherboard change for Intel (again), so investing in S1200 for the sake of an i3 is foolish unless someone plans to abandon their newly-purchased 10th-gen i3 very soon and drop an i7 or i9 in there.
 
Joined
Aug 4, 2020
Messages
1,608 (1.03/day)
Location
::1
A few things. First of all, 6.5% is not a big difference; I'd argue it's not even noticeable. Second, go look at the test setup: not only was the RAM running at 2666, its timings were gimped too. It went from 3200 14-14-14-34 to 2666 16-16-16-36. How often do timings get worse with slower speeds?

And on top of all that, you missed the entire point: even with the slower RAM, the significantly cheaper Intel chip is still faster than the R3 3100, and the R3 3100 would take a bigger hit from slower RAM than the Intel chip. I didn't say faster RAM made no difference with Intel; I said it doesn't make a lot of difference.
3200-C16 is already the baseline (it costs maybe $5 more than the cheapest, slower modules), so any Ryzen will run 3200-C16 as a baseline and should be compared against that (thanks Intel!).
2666-C16 is awful, yeah; blame W1zzard for not doing more realistic tests :rolleyes:
The 3100 simply is awful for gaming and shouldn't really be considered, because the 2+2 CCX arrangement totally hamstrings it; the 3300X would be the more apt comparison here, if only it were available at all (thanks AMD!)

I'm not saying 6.5% is a big difference, but isn't the delta between 2666 and 3200 more or less the same on Intel and AMD? (Like, around 7% at 1080p? It's just that, given AMD's different policy on memory speeds, no one sane will ever run their Ryzen at 2666, so there aren't really benches for that.)

I mean, it also depends on what you're doing with your computer. 144Hz+ gaming is on the rise, and if you plan on doing that, definitely spring a few bucks extra for the Z490. For a bottom-of-the-barrel budget gaming potato you'd obviously be both stuck at 1080p/60Hz and GPU-bottlenecked, so at that point the B460/H470 or whatever cheapest board will suffice, yes.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.11/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard ASRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Not many will understand the distinction, but just a guess: you're talking single-digit percentage differences with Intel?

Correct, and I'm talking about realistic gaming scenarios with a mid-tier graphics card that is being maxed out, not the highest-end graphics cards running at 720p (which still didn't show even a 10% difference).

3200-C16 is already the baseline (it costs maybe $5 more than the cheapest, slower modules)

I 100% said that; you even quoted it. Arguing about the difference between 3200 and 2666 is pointless, as I said originally. I don't know why you want to keep doing it.

The 3100 simply is awful for gaming and shouldn't really be considered, because the 2+2 CCX arrangement totally hamstrings it; the 3300X would be the more apt comparison here, if only it were available at all (thanks AMD!)

That was entirely my point. Did you not bother to read the other posts in the thread? The entire RAM issue came up because someone said AMD was the cheaper option compared to the 10100. The closest competition from AMD in price is the 3100, which is worse than the 10100 at gaming and more expensive. But they claimed, for some reason, that if the Intel system didn't have the high-speed RAM, the gap between AMD and Intel would be smaller. That's completely untrue. Yes, the AMD system used the same 3200MHz RAM, but the gap would actually be bigger if the AMD system used the same 2666 config as the Intel system, because AMD relies more on RAM speed than Intel does. This is even more true for the 3100 because of its 2+2 CCX config, but even with the single CCX of the 3300X, RAM speed still has a greater effect on AMD than on Intel. It isn't that Intel is unaffected by RAM speed; it's just that it affects AMD more.
 
Joined
Aug 4, 2020
Messages
1,608 (1.03/day)
Location
::1
Actually, the closest alternative would be the (nonexistent) 3300X (MSRP of $120, and you get to use a cheaper B550 instead of a Z490 for 3200 memory at the same performance), but I believe we can agree this is still a moot debate, as the 3300X is essentially extinct at this point.

Obviously going for the 3100 over the 10100 is asinine, 2666 memory or not. However, I wouldn't necessarily assume the 3300X is impacted more than the 10100 by 2666 memory (I don't believe anyone has benched the 3300X on 2666 memory and published results).
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.11/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard ASRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Actually, the closest alternative would be the (nonexistent) 3300X (MSRP of $120, and you get to use a cheaper B550 instead of a Z490 for 3200 memory at the same performance), but I believe we can agree this is still a moot debate, as the 3300X is essentially extinct at this point.

The 3100 would be the closest alternative price-wise, as I said. It has an MSRP of $100; the 10100F is $95. But, as you said, with the 3300X completely absent from the market, the 3100 has gone up in price to the $115 point.

And, yeah, the lower Intel chipsets don't support 3200, but they do support 2933, and the performance difference from dropping 3200 to 2933 is next to nothing.
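The spec-sheet math backs that up: peak DDR4 bandwidth is just the transfer rate times 8 bytes per channel, so the drop is small on paper and smaller still in games, which rarely saturate peak bandwidth. A quick sketch:

```python
# Peak theoretical DDR4 bandwidth: transfer rate (MT/s) * 8 bytes per channel.
def dual_channel_gb_s(mts: int) -> float:
    return mts * 8 * 2 / 1000  # GB/s for a dual-channel setup

bw_3200 = dual_channel_gb_s(3200)  # 51.2 GB/s
bw_2933 = dual_channel_gb_s(2933)  # ~46.9 GB/s
print(f"3200: {bw_3200:.1f} GB/s vs 2933: {bw_2933:.1f} GB/s")
print(f"Peak bandwidth drop: {1 - bw_2933 / bw_3200:.1%}")  # ~8.3%
```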
 

rgrooms

New Member
Joined
Feb 15, 2020
Messages
7 (0.00/day)
We've been hearing this for what, 5+ years now? Since at least the FX days. The fact is these quad-core chips barely hurt gaming compared to parts with more cores, and the performance difference is probably due to lower boost clocks, not missing cores.

The 10100 matches a 3800X and it's only ~4% behind the best chips on the market:
The 3300X, when overclocked with tuned memory, will perform the same or better as well, especially once you get up to 1440p settings. In fact, I would say the 3300X will beat it every time at 1080p.

Actually, the closest alternative would be the (nonexistent) 3300X (MSRP of $120, and you get to use a cheaper B550 instead of a Z490 for 3200 memory at the same performance), but I believe we can agree this is still a moot debate, as the 3300X is essentially extinct at this point.

Obviously going for the 3100 over the 10100 is asinine, 2666 memory or not. However, I wouldn't necessarily assume the 3300X is impacted more than the 10100 by 2666 memory (I don't believe anyone has benched the 3300X on 2666 memory and published results).
I have my 3300X running 3733 CL16 memory, and I would bet performance is much better than with 2666. YouTuber Hardwa8re numbers has a video on tuning Ryzen memory for gaming gains.
 
Joined
Aug 21, 2015
Messages
1,719 (0.51/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marthon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
I'm going to go ahead and perform a bit of thread necromancy here (hey, it's Halloween!) because there's a bunch of sentiment here that consistently drives me bonkers.

The biggest thing that certain folks seem to forget is that not everybody's hardcore. Who cares if a 10100(F) can't properly drive a 2070+, or push the newest-gen titles at 60+ FPS at 1440p/4K? If you're starting from scratch, or replacing a DDR3-era system, it's hard to argue with a platform cost of less than $300 (CPU/MB/RAM). Pair a 10100 with low-to-mid-tier graphics and you're off to the 1080p/60 races. Maybe your details aren't maxed. Maybe certain poorly optimized or poorly ported titles have greater-than-ideal frame drops. It. Will. Be. Fine. Hell, there are plenty of folks like myself perfectly happy rocking three-generation-old (or older!) hardware and playing the plethora of quality games from three, five, or more years ago with no performance issues whatsoever.

Also, to whoever proposed buying more graphics card than processor: why? What's the point of leaving specialized processing capability on the table just so the CPU can be pegged? Better to fully utilize the graphics capability you paid for and have central-processing headroom left over.

So no: the 10100(F) is not a gaming processor. There's no such thing as a gaming processor, unless you're talking about custom SOCs. There are only slower and faster processors.
 