Sunday, October 11th 2020
Intel Launches the sub-$100 Core i3-10100F Quad-Core Processor
Intel launched the Core i3-10100F, an interesting option for entry-level gaming PC builds. This 4-core/8-thread processor lacks an iGPU, unlike the $120 Core i3-10100, but that shaves nearly a quarter off its price, with the Intel ARK page for the chip reporting a price band of $79-$97 (per chip in 1,000-unit quantities). The lack of an iGPU means that the chip is targeted at gaming PC builds with discrete graphics cards. It otherwise has the same specs as the i3-10100, with four cores based on the 10th Generation "Comet Lake-S" microarchitecture, nominal clock speeds of 3.60 GHz with 4.30 GHz Turbo Boost, 6 MB of shared L3 cache, a dual-channel DDR4 memory controller that natively supports DDR4-2666 memory, and a 65 W TDP. Its retail package includes a cooling solution. The i3-10100F should be drop-in compatible with any Socket LGA1200 motherboard. Do catch our review of the i3-10100, which should give you an idea of how the i3-10100F should perform.
47 Comments on Intel Launches the sub-$100 Core i3-10100F Quad-Core Processor
So, if you already have a quad core chip that works for you for modern gaming, cool, but I wouldn't recommend anybody build a new desktop for gaming in 2020 with a mere quad core, even if it does have 8 threads.
I would also use it for an HTPC to stream games from my main PC down to my living room TV. But even then I would honestly either go with the chip that has integrated graphics and skip the dedicated graphics card or (better yet IMO) go with the newer AMD APUs so that it has better integrated graphics and presumably would work better for me for that purpose.
As for the 3200MHz RAM, the RAM speed doesn't make a lot of difference on Intel like it does on AMD, but 3200MHz RAM is cheap these days. It's usually only a few bucks more than 2666.
Why wouldn't they go Intel? As we've pointed out, the 10100F is a very capable gaming CPU. The closest AMD alternative is a R3 3100, which is slower than the 10100F and costs $120. Why would any informed person go AMD in that situation? Their budget boards aren't exactly leaps and bounds better than Intel's. The R3 needs faster RAM to perform well, the Intel doesn't. Why would anyone spend $30 more on the slower AMD processor other than fanboyism? Yes, we've gone over this.
There are two scenarios for the 10100F, as I see it:
- If you are buying an all-new platform, buying a quad-core is almost pointless because the old platform you're replacing is probably already a quad core.
- If you're not buying an all-new platform, then the 10100F is irrelevant because it won't fit in any old motherboards with its new socket requirements.
By the time you've forked out $250 on a gaming GPU and $200 on a motherboard and RAM, you're $450 into a gaming investment, so why cripple it with a puny quad core that is only half the potency of the consoles that are going to dominate the next decade of games development?

But to get back on topic, as I said already this is sub-entry level; it will be coupled with cards like the GT 1030 or even a (what's the worst AMD card?). I can only wait for a review to speculate how bad this will do in gaming performance.
Come again?
Some folks are just: if it's not a top-end model, then why bother?
:shadedshu:
Notice how the i3 can drive a GTX 1660 Super fully with about half the CPU utilization, except in AC Odyssey, which is just a garbage port.
- Warzone - runs fine, stays above 60fps most of the time but those minimums are stutters that a better CPU wouldn't have.
- RDRII - If you want to run at 35fps because of crippling GPU limitations, then yes - any potato CPU will do the job.
- Running fortnite with lows of 35fps during any action is an abysmal result that affects your ability to aim properly. Even when not busy, it looks stuttery.
- Forza's fine. Most racing games are exceptionally easy on CPUs.
- BFV single player is easy on the CPU. Multiplayer is where you'll really find problems with quad cores. I haven't done much BFV MP, but BF1 MP was terrible on a quad core.
- AC:Odyssey is an abysmal port, agreed - but like HZD, 8 actual cores is the answer here to get around the original engine's focus on 8 equal threads.
- Metro is GPU bound, like RDRII it's pointless to say "the i3 is fine" when it's stuttering along at 3-14 fps due to background streaming issues.
- SW:FO is dropping frames quite significantly at the (non-cutscene) start of that clip. Hard to say what's at fault here.
Given that the default for a non-gaming monitor is actually 75Hz these days, and 144Hz panels are cheap - it's not really a "gaming CPU" unless you can run at >75fps. Even my TV is 120Hz and I'm a filthy casual now. My old, retired, 3770K could likely have done an equally mediocre job in running those games above in an 'okay, I guess' way. I can buy one of those for $25 on Craigslist or eBay.

Proving that the i3 can mostly handle 30-80fps when there's a GPU bottleneck doesn't really cut it, that's why CPU reviews test at 720p.
This article is 18 months out of date, but it's the first match on Google and it's relevant because it keeps the architecture pretty consistent between tested models and simply focuses on the impact of how many cores and threads affect gaming:
www.techspot.com/article/1803-are-quad-cores-dead/
And out of all of that, you missed the entire point that even with the slower RAM, the significantly cheaper Intel chip is still faster than the R3 3100. And the R3 3100 will take a bigger hit with the slower RAM than the Intel chip will. I didn't say faster RAM made no difference with Intel, I said it doesn't make a lot of difference.
If you're so broke that you can't afford a better CPU but you somehow had the cash to buy a whole new S1200 platform, then you're doing it wrong. The 10400F is 400MHz faster and has 50% more cores and threads, and that raises the total platform cost by maybe $35, which is 10-15%.
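A quick back-of-envelope check of that 10-15% figure. The $35 CPU-price gap and the $250-$350 total platform costs below are illustrative assumptions, not quoted prices:

```python
# Rough platform-cost math for stepping up from a 10100F to a 10400F.
# All dollar figures are illustrative assumptions, not quoted prices.
cpu_delta = 35  # assumed street-price gap between the two CPUs

for platform_total in (250, 350):  # assumed CPU + board + RAM totals
    pct = 100 * cpu_delta / platform_total
    print(f"${cpu_delta} more on a ${platform_total} build = {pct:.0f}% increase")
```

With those assumed totals the delta works out to roughly 10% on the pricier build and 14% on the cheaper one, which is where the 10-15% range comes from.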
Realistically, if people are super short of cash then the 9100F makes WAAAAAAAY more sense than this dead-end S1200 platform. DDR5 is going to require a motherboard change for Intel (again) so investing in S1200 for the sake of an i3 is foolish unless someone plans to abandon their newly-purchased 10th gen i3 very shortly and drop an i7 or i9 in there.
2666-C16 is awful yeah, blame W1zzard for not doing more realistic tests :rolleyes:
The 3100 simply is awful for gaming and shouldn't really be considered, because the 2+2 CCX arrangement totally hamstrings it for gaming - the 3300X would probably be the more apt comparison here, if only it were available at all (thanks, AMD!)
I'm not saying 6.5% is a big difference, but isn't the delta between 2666 and 3200 more or less the same between Intel and AMD? (Like, around 7% at 1080p? It's just that, given AMD's different policy on memory speeds, no one sane will ever stick their Ryzen at 2666, so there aren't really benches for that.)
I mean, it also depends on what you're doing with your computer - 144Hz+ gaming is on the rise and if you plan on doing that, definitely spring a few bucks extra for the Z490 - for a bottom-of-the-barrel budget gaming potato you'd obviously be both stuck at 1080p-60Hz and GPU bottlenecked, so yeah, at that point the B460/H470 or whatever cheapest board will suffice, yes.
Obviously going for the 3100 over the 10100 is asinine, 2666 memory or not. However, I wouldn't necessarily believe that the 3300X would be impacted more than the 10100 by 2666 memory (I don't believe anyone has benched the 3300X on 2666 memory and published results?).
And, yeah, the lower Intel chipsets don't support 3200, but they do support 2933. The performance difference is next to nothing dropping from 3200 to 2933.
The biggest thing that certain folks seem to forget is that not everybody's hardcore. Who cares if a 10100(F) can't properly drive a 2070+, or push the newest gen titles at 60+ FPS at 1440p/4K? If you're starting from scratch, or replacing a DDR3-era system, it's hard to argue with a platform cost of less than $300 (CPU/MB/RAM). Pair a 10100 with low-mid tier graphics and you're off to the 1080p/60 races. Maybe your details aren't maxed. Maybe certain poorly-optimized or -ported titles have greater-than-ideal frame drops. It. Will. Be. Fine. Hell, there are plenty of folks like myself perfectly happily rocking three-generation (or more!) old hardware and playing the plethora of quality games from three, five or more years ago with no performance issues whatsoever.
Also, to whoever proposed buying more graphics than processor: Why? What's the point of leaving specialized processing capability on the table just so the CPU can be pegged? Better to fully utilize the graphics capability you paid for and have central-processing headroom left over.
So no: the 10100(F) is not a gaming processor. There's no such thing as a gaming processor, unless you're talking about custom SOCs. There are only slower and faster processors.