
Intel Working on Fixing "Arrow Lake" Gaming Performance with Upcoming Patches

Admitting that you had no idea how your own product was going to perform says things about your competence in your role, Rob - none of them good.
He's trying to be honest* here, and as bad as his quote sounds, what else could a consumer ask for? Lies and fantasies wouldn't do any good here. Be better.

It's entirely possible that he isn't.
 
"I will say that the performance we saw in reviews is not what we expected and not what we intended"

I find this an interesting statement to make, as Intel's own marketing slides showed the same performance issues as reviewers did...
 
Nah, this is a new architecture. They eventually have to release the product, knowing full well that they could get more out of it. If they wait too long, they miss their launch window and the sales season. They very likely got Arrow Lake to a mostly stable place and put it on the market. I don't doubt they have things to work on, and I would not be surprised if they get more performance out of the product. That said, I wouldn't buy this product on such hopes.
By the time they get the first engineering samples, it's already too late to do a major redesign, as that would add years to the development cycle. And by the time they chose the stepping to be sent out as qualification samples (several months ago), they had already set the clock speeds and knew the performance characteristics exactly. Adding an extra stepping or two before release wouldn't change anything fundamental, just very minor tweaks or bugfixes.
And to reiterate my main point: software isn't going to change much. What we see in Linux tests (e.g. from Phoronix) is probably the best case we can get. It's a small step forward, but mostly this architecture is laying the groundwork for future generations, which buyers of Arrow Lake will see no benefit from.

I imagine a world where Intel only released processors with P-cores. Those had real gains.
There is Xeon W, which isn't that expensive considering the pricing of "high-end" mainstream motherboards these days.
But Intel (and AMD) is missing out on a huge opportunity by not having a proper HEDT platform any more, a platform like:
- CPUs starting at 8 p-cores, decent (but consistent) clock speeds, 250 W TDP (without crazy boost), smaller socket with "standard" cooler support.
- 4-channel unregistered ECC/non-ECC memory
- ~64 CPU PCIe lanes
- Great motherboards at ~$400.
Which would sell great compared to the increasingly useless and unbalanced mainstream platforms from either vendor.

Eight threads is not enough for modern games ported from CPUs with potentially 16 threads, which means games will inevitably drop back to the E-cores and suddenly you'll have screwy 1% lows.
<snip>
Whoever at Intel thought a high-performance enthusiast chip could run off only eight P-core threads should have been fired yesterday.
In many cases, SMT actually makes gaming worse, as having multiple threads competing for a core creates more latency and inconsistency in multithreading than having p-cores and e-cores, and a hybrid architecture would in fact perform better if Windows wasn't using an antiquated kernel.

Even those games which may use more than 8 threads wouldn't have all of them synchronized, as that scales terribly. Usually you might see 2-3 threads completely pegged and synchronized, and the rest as async worker threads. One thing of note is how much more consistent games could be with better software; some years ago I was working on a rendering engine and testing it on Linux with both the standard kernel and a "semi-realtime" kernel, and the difference was astounding. While the realtime kernel probably lost ~0.5% average FPS, it was silky smooth and nearly all signs of microstutter disappeared. Applications were smoother too. But the disadvantage: much higher idle load, which is probably why we don't see this shipped as standard. But at the very least, it goes to show that things could be so much better than the jerky, stuttery mess we know as Windows today.
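To make that thread layout concrete, here's a rough sketch (purely illustrative; the thread roles, counts and timings below are made up, not taken from any real engine): a couple of frame-synchronized threads held together by a barrier, plus an async worker pool that simply runs whenever a core is free.

```python
import threading
import time
from concurrent.futures import ThreadPoolExecutor

# Two tightly synchronized "frame" threads (think simulation + render):
# neither can start frame N+1 until both have finished frame N, which is
# why these are the threads that end up pegged and latency-sensitive.
FRAME_THREADS = 2
frame_barrier = threading.Barrier(FRAME_THREADS)

def frame_thread(frames: int) -> None:
    for _ in range(frames):
        time.sleep(0.004)       # stand-in for ~4 ms of per-frame work
        frame_barrier.wait()    # hard sync point at the end of every frame

# Async worker jobs (asset streaming, audio decode, ...) have no per-frame
# deadline, so they tolerate slower cores and scheduling jitter far better.
def async_job(job_id: int) -> int:
    time.sleep(0.02)            # stand-in for background work
    return job_id

if __name__ == "__main__":
    sim = threading.Thread(target=frame_thread, args=(100,))
    render = threading.Thread(target=frame_thread, args=(100,))
    sim.start(); render.start()

    with ThreadPoolExecutor(max_workers=6) as pool:
        list(pool.map(async_job, range(24)))

    sim.join(); render.join()
```

The point of the barrier is that any jitter on either pegged thread delays the whole frame, which is exactly where scheduler and kernel quality show up as microstutter.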

Agreed with everything up until this bit. Why is this still a point of contention? Low res CPU benchmarks are an easily controlled testing environment for CPU performance. Nothing more, nothing less.

And while they may not be a big deal for single player games at higher resolutions, those performance deltas absolutely can matter for online play, particularly MMOs.
We've had this silly argument for probably two decades. :rolleyes:
When you run benchmarks with unrealistic hardware/software configurations, you are not eliminating the GPU as a factor; you are in fact introducing artificial bottlenecks which real-world users will never run into. Only people who don't know how software scales would think this is a good idea. How a CPU performs in workloads far removed from anything you will ever actually use it for shouldn't determine your purchasing decisions. And it's not going to tell you anything useful about which CPU is going to offer better longevity in gaming performance either. On the contrary, when future games increase their computational load, the CPU with more computational power is going to pull ahead of the weaker CPU with lots of L3 cache.
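To put some (entirely made-up) numbers behind that argument: per-frame time is roughly max(CPU time, GPU time), so a low-resolution run shrinks the GPU term and exposes the CPU difference, a GPU-bound setting hides it, and a hypothetical future game with a much heavier CPU load is where the higher-compute part pulls ahead. A toy model, with every figure invented purely for illustration:

```python
# Toy bottleneck model: frame time ~= max(cpu_ms, gpu_ms).
# Every number below is invented purely to illustrate the scaling argument.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

today = {"big-compute CPU": 5.0, "big-cache CPU": 4.5}     # ms of CPU work per frame
for res, gpu_ms in (("720p", 2.0), ("4K", 12.0)):
    for name, cpu_ms in today.items():
        print(f"{res:>6} | {name:15} | {fps(cpu_ms, gpu_ms):5.1f} fps")
# At 720p the two CPUs differ; at 4K both read 83.3 fps (GPU-bound).

# Hypothetical future game: CPU load roughly triples and the cache advantage
# shrinks, so even at 4K the CPU becomes the bottleneck.
future = {"big-compute CPU": 14.0, "big-cache CPU": 16.5}
for name, cpu_ms in future.items():
    print(f"future | {name:15} | {fps(cpu_ms, 12.0):5.1f} fps")
```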
 
He's trying to be honest* here, and as bad as his quote sounds, what else could a consumer ask for? Lies and fantasies wouldn't do any good here. Be better.

It's entirely possible that he isn't.
Bro.
Bro.
Bro.

He went on a podcast to talk about Arrow Lake.
That should have been a hint to him to have all the answers to all the possible questions about Arrow Lake.
And one of the most obvious possible questions is "why is Arrow Lake performance shit?"
The fact that he could not convincingly answer that question except with "not what we expected" suggests only one of three possibilities:

1. Hallock is incompetent
2. Intel is incompetent
3. Both 1 and 2

Now when you are talking publicly regarding fears that your CPUs are shit, what's the best way to calm that audience down and restore confidence? Is it to:

a. Obfuscate the truth and hint that fixes are coming
b. Claim you have no fucking idea what is going on, but fixes are coming (how can you fix something when you don't know what the actual problem is?)

I'll give you a hint: it's not b.

Hallock is in marketing. His job is not to be honest, it's to reassure customers that Intel is making great products, yet everything he said in that interview did the exact opposite. Does that sound like someone who is doing a good job to you?

There is Xeon W, which isn't that expensive considering the pricing of "high-end" mainstream motherboards these days.
But Intel (and AMD) is missing out on a huge opportunity by not having a proper HEDT platform any more, a platform like:
- CPUs starting at 8 p-cores, decent (but consistent) clock speeds, 250 W TDP (without crazy boost), smaller socket with "standard" cooler support.
- 4-channel unregistered ECC/non-ECC memory
- ~64 CPU PCIe lanes
- Great motherboards at ~$400.
Which would sell great compared to the increasingly useless and unbalanced mainstream platforms from either vendor.
EPYC 8024P would cover all above needs, what's missing are consumer-oriented socket SP6 motherboards. Take the ASRock Rack SIENAD8-2L2T and remove MCIO, 1GbE, 10GbE and BMC (saves 16 PCIe 5.0 lanes); add 2.5GbE, WiFi and 8x SATA ports (-8 lanes) and two ASM4124 chipsets for four USB4 type-C ports (-8 lanes); and bob's your auntie.
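For what it's worth, the lane arithmetic in that swap nets out to zero (my tally, using only the lane counts quoted above, none of them independently verified):

```python
# Lane bookkeeping for the board mod described above. The per-device
# figures are the ones quoted in the post, not independently verified.
freed = {"MCIO + 1GbE + 10GbE + BMC removed": 16}
spent = {"2.5GbE + WiFi + 8x SATA": 8,
         "2x USB4 controllers (four Type-C ports)": 8}

balance = sum(freed.values()) - sum(spent.values())
print(f"Net PCIe lane change vs. the stock board: {balance:+d}")   # +0: it fits
```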
 
EPYC 8024P would cover all above needs, what's missing are consumer-oriented socket SP6 motherboards. Take the ASRock Rack SIENAD8-2L2T and remove MCIO, 1GbE, 10GbE and BMC (saves 16 PCIe 5.0 lanes); add 2.5GbE, WiFi and 8x SATA ports (-8 lanes) and two ASM4124 chipsets for four USB4 type-C ports (-8 lanes); and bob's your auntie.
Nah, Threadripper is a much better contender, as EPYC's clock speeds are far too low for workstation users, power users and gamers alike, and it has mostly the same shortcomings as Xeon W.
 
Uh no, market researchers are who do that job. Research is not marketing.
The marketing I was taught in school included research. A "market study" is part of marketing, and so is benchmarking the competition.
But that might be a national difference; it seems most people in the US see marketing as communication, while in France communication is something done after marketing has gathered all the data needed to communicate effectively. I have a bachelor's in communication, and we were taught the basic principles of marketing, but we weren't meant to become marketers; there's a specific bachelor's for that which goes deeper into it.

Translated from French. As you can see, there's a separate specialisation course for students choosing to specialise in marketing or in communication.
[screenshot: course catalogue showing separate marketing and communication specialisations]
 
The 14900K has seen no regression; it's just that AMD CPUs have finally gotten some love from Microsoft's scheduler in 24H2, and because of that they have seen an improvement. There are now three X3D CPUs that are faster than it in gaming (7800X3D, 7950X3D & 9800X3D), and the standard 9xxx series occasionally gets some wins as well. There's nothing 'going on' with the thread scheduling of the 14900K.
The 14900K has seen performance regression with both BIOS updates and recent Windows patches -- if you don't believe me, install Windows 10 and test for yourself. Since the initial release of the chip you're looking at a 5-15% (edit: up to 25%) performance regression, depending on the game.

I would guess these new fixes will help any setup with E-cores, not just ARL -- at least I hope. The performance over the two years I owned the 13700KF started out amazing and then just slowly eroded with every Windows patch.
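If anyone wants to sanity-check the scheduling side for themselves, one quick way (just a sketch using the third-party psutil package, nothing official) is to watch per-core load while the game runs and see whether the busiest cores map to your P-cores or your E-cores:

```python
# Snapshot per-core load while a game is running, to see which cores the
# scheduler is actually keeping busy. Needs the third-party psutil package
# (pip install psutil). The core-index-to-P/E-core mapping depends on your
# CPU and OS, so check it against your own topology.
import psutil

SAMPLES = 10
for _ in range(SAMPLES):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)   # 1-second average
    busiest = sorted(range(len(per_core)), key=per_core.__getitem__, reverse=True)[:4]
    print("busiest cores:", [(i, per_core[i]) for i in busiest])
```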
 
The 14900K has seen performance regression with both BIOS updates and recent Windows scheduler changes -- if you don't believe me, install Windows 10 and test for yourself. Since the initial release of the chip you're looking at a 5-10% performance regression, depending on the game.

I think these new fixes will help any setup with E-cores, not just ARL -- at least I hope.
Here ya go, the first 5 seconds are enough to see an obvious difference.

Windows 10 clean install


Windows 11 24h2 clean install

 
Here ya go, the first 5 seconds are enough to see an obvious difference.

Windows 10 clean install


Windows 11 24h2 clean install

This is exactly it...
 
This is exactly it...
I have more footage with similar differences in more games - just haven't uploaded them. 24H2 is absolute crap, at least for Intel, and I have no idea what OS I'll use on my 9800X3D. I guess I'll have to test and find out for myself.

Take a look at one I uploaded - the 2:39 minute mark, it's 22H2 vs 24H2. Massive difference in 1% lows (up to 50 fps, lol).

 
I hope that with these upcoming patches they also fix these issues. A bit worried they won't focus on it at all, to make the new architecture look better vs Alder/Rocket.
 
At the very least they officially recognize it as a disaster launch; I guess that's progress over Raptor Gate.
C'mon Intel, fix this so AMD has at least a minimum of pressure to stop reselling the same old I/O dies and chipsets.
 
Staggering level of incompetence! I felt bad for Intel for a while, but now it amazes me how badly this company is managed. It is too early to say they will be out of business this gen because
... AMD struggles too...
Recent CPU releases are a good scenario for a sitcom!
 
Releasing slow hardware, hoping it's fast enough to be popular,
and promising to fix it... with software :kookoo:


Let's say, I am not optimistic regarding this generation of Intel CPUs :rolleyes:

PS: my lightbulb just went bust, maybe they could write some code to fix that too :D
 
The fact that he could not convincingly answer that question except with "not what we expected" suggests only one of three possibilities:

1. Hallock is incompetent
2. Intel is incompetent
3. Both 1 and 2
Sorry, it's 4. That's just the narrative they're spinning while they continue to work the dials they have, trying to extract more from it.

As others have pointed out, Intel & partners have had intermediate silicon/software and known the expected performance for a while. The reviewer guidance Intel provided was consistent with independent benchmark results.
 
I'll believe it when I see it.

Even if this "fix" corrects some of the gaming performance issues with Arrow Lake, the damage has already been done. Those of us who are clued into what's going on have more than likely already switched to AMD. And unfortunately for Intel, those who have already switched to AMD are probably going to stay with AMD seeing as how they plan on keeping with the AM5 platform for years to come. Suffice it to say, Intel shouldn't have released Arrow Lake until it was ready because damn did they shoot themselves in the foot with a 12-gauge shotgun.
 
Software will never make up for hardware design...
Bulldozer vibes with this comment :roll:
TBF I am glad I skipped that gen; I went from a Phenom to a Core i7 920 and spent a few years with Intel, sticking to their HEDT lineup, before going back to AMD when Ryzen landed and I could get a 6c/12t CPU for less than $200, which was unheard of before Ryzen :eek:
 
There is zero possibility of fixing Arrow Lake with any sort of BIOS patch, outside of fixing bugs with the E-core/P-core assignment.


No microcode update is going to make the IMC suck less.
No microcode update is going to make the IPC better (short of disabling security mitigations).

In chasing power consumption, Intel flubbed it and cut too much out of the cores. Disabling HT comes at the cost of not having those extra thread-handling execution resources, which help all-around performance.
 
I think they would do better concentrating on what the chip does well and less on where it fails, as what he's doing now is only going to make people more disillusioned when these promises don't materialise.

Also, get a new chip ready ASAP.
 
Don't purchase based on a promise but on actual benchmarks and stability feedback.

Given the price of their CPUs, I really don't understand why someone wouldn't pick a 7800X3D or a 9800X3D for a gaming rig.
8 cores cost a lot...
And if you're gaming at 1440p or 4K, there's no need for the fastest CPU anyway.
 
failintel, where's the outrage?
 
Hey hey, I've asked the same question when it launched.
I'll ask again:

["Is It Safe" GIF]
 
I just hope that in their efforts to push Arrow Lake, they don't forget (or don't want) to support Raptor Lake with better BIOSes.
 
Why bother? Everyone and their mother has already bought the shiny new voodoo-magic 9800X3D. And those who didn't will buy it before Christmas.
It's actually funny to see Intel's redemption after the seven-year quad-core stagnation back in the 2010s. If team blue doesn't manage to produce something similar to 3D cache, Intel is indeed destined to repeat Bulldozer's fate.
 