
Intel Core Ultra Arrow Lake Preview

Agreed, which is why I am holding off until the 9000X3D chips come out before forming my final opinion on this gen. If those fall short as well, it just means this generation is mediocre in general.

Arrow Lake is Intel's first attempt at a tile-based desktop CPU, similar to what AMD did first with Ryzen 3000, so to be honest I was expecting it to be rocky or even show performance regressions. If my memory is correct, Ryzen 3000 suffered from high latency due to the then-new chiplet design, which was rectified with Ryzen 5000 and its massive performance gains over 3000. If anything, I think Arrow Lake is Intel's Zen 2 moment, in the sense that they are ditching the traditional monolithic design they have used for their consumer desktop CPUs for the last decade in favour of a tile-based one. I guess they need another generation to properly optimise the new tile design, just like AMD did with the 5000-series chips.

Intel's only possible saving grace for this gen, as far as I can predict, is Bartlett Lake, if they actually go through with it. The question is whether they would backport their Lion Cove cores to LGA 1700 or just optimise and reuse Raptor Cove. In any case, I just hope they recognise that they need to start releasing their own SKU or SKUs with stacked cache, just like AMD's X3D chips, because at this point it has to be apparent to them that most games nowadays benefit more from extra cache than from higher clock speeds.

Well said. Add 3dcache to Intel, make a monolithic/backported product for gaming, or release Lunar Lake for the desktop. Those are their options. The actual products they are releasing are not great.

Not really, and even then you act like I don't know that. If memory makers start pushing frequencies, then latency will be the next thing to come down too. It always does.

What's your response when AMD says their sweet spot is some amount higher than their default? Just "marketing fluff"?... because you'd be wrong. Their platforms always operate better at that sweet spot, and it can be even better if you can push above it while maintaining a 1:1 UCLK:MCLK ratio. And there is a ton you can do with DDR5 to mitigate latency on these DIMMs beyond just CAS.

I like tweaking RAM. I work on the layout of memory PHYs in VLSI as my job. It's fun for me.
No, they've been pushing frequencies for 3 years now, and 5600C28 or 6000C28/30 is still the best.
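For what it's worth, the claim is easy to check with plain arithmetic: first-word latency in nanoseconds is just CL cycles divided by the memory clock, which is half the DDR transfer rate. A quick sketch (kit numbers taken from the posts above, nothing vendor-specific):

```python
# First-word latency: CL cycles at the memory clock, which for DDR
# is half the transfer rate in MT/s.
def first_word_latency_ns(transfer_rate_mts: int, cas_latency: int) -> float:
    memory_clock_mhz = transfer_rate_mts / 2       # DDR: two transfers per clock
    return cas_latency / memory_clock_mhz * 1000   # cycles / MHz -> ns

for rate, cl in [(5600, 28), (6000, 28), (6000, 30), (7200, 34)]:
    print(f"DDR5-{rate} CL{cl}: {first_word_latency_ns(rate, cl):.1f} ns")
```

Note how DDR5-5600 CL28 and DDR5-6000 CL30 both land at exactly 10 ns, and even 7200 CL34 only narrows it to about 9.4 ns: frequency gains without proportionally tighter CL leave absolute latency roughly flat, which is the point being argued here.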
 
Would you be so kind as to share how you realised this undervolt, and with which application? Do you also have a power limit applied to your GPU, and if not, why? Thanks
XTU
You can change the profile in real time using the assigned key combination.
Clipboard01.jpg


Well said. Add 3dcache to Intel, make a monolithic/backported product for gaming, or release Lunar Lake for the desktop. Those are their options. The actual products they are releasing are not great.

With an RTX 4090, I still wasn't convinced by this X3D. It obviously dominates in gaming, but at a much more attractive price you can find processor variants that won't cause problems in games while humiliating it in many other applications, especially multi-threaded ones.
For example, a 14700KF limited to 95 W effectively destroys the 7800X3D in single- and multi-threaded tests. And it costs $80 less, at least in my country.
I'm not saying they're not good, just that they're way too expensive for what they offer. The 5800X3D, launched only two years ago as the king of gaming, is now at the 7600/13600K level. The depreciation is enormous!

Clipboard01.jpg
 
Did you see the Intel slide with 447 Watts? That was not total system draw. An MSI live stream showed that a 14900K with a 3090 or 4090 could pull over 1100 Watts from the wall.
447 is system power. In their video presentation, AC Mirage was pulling 360 W at the socket, including 120 W for the CPU, with a 65 W delta compared to a 14900K + 4090 combo. Apparently their numbers are all over the place. Those "14900K +150 W" numbers hint at 300 W, and that's not quite possible unless heavily overclocked. Normally a 14900K never goes above 200 W in CP2077.
 


No, they've been pushing frequencies for 3 years now, and 5600C28 or 6000C28/30 is still the best.
That's still pretty early in terms of a new generation of DDR. Most of the tightest timings achieved on DDR4 didn't come until the very end.

Also, didn't TPU just release some RAM scaling benchmarks with 7200 being an improvement over 6000/6400? (Note that it was on AMD, which may not tell the same story on Intel. Ryzen has always had a lot more inter-core latency than Intel, and low-latency RAM helps mitigate it, whereas that may not be the case with Arrow Lake.)
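The bandwidth side of that trade-off is also plain arithmetic: theoretical peak for a standard 64-bit-per-channel dual-channel setup, ignoring real-world efficiency. A rough sketch showing DDR5-7200 carries 20% more peak bandwidth than DDR5-6000, which may be what those scaling results reflect:

```python
# Theoretical peak bandwidth: transfer rate (MT/s) x 8 bytes per
# transfer per 64-bit channel x number of channels.
def peak_bandwidth_gbs(transfer_rate_mts: int, channels: int = 2) -> float:
    return transfer_rate_mts * 8 * channels / 1000  # MB/s -> GB/s

print(peak_bandwidth_gbs(6000))  # 96.0
print(peak_bandwidth_gbs(7200))  # 115.2
```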
 

1728854368758.png

1728854026590.png


Arrow and Lunar side by side. The Lunar one is a fake representation. The Arrow one may be the real thing, but the P-core/E-core placement is the old style: the low-power island is isolated, not between the P-cores as in the slides.
The memory controller belongs on the compute tile; instead it is placed in the SOC. The SOC contains the media engine, display engine and NPU, which should have been on the GPU tile.
 
This is so funny. How many gamers are CPU-limited enough to notice slightly worse gaming performance?
Even an i9-9900K is still a good gaming CPU.

Two years later it's pretty meh... Also, it sounds like multiple AM4 X3D chips will be faster in gaming...



There are multiple games I'm already CPU-limited in with 7000-series X3D processors. I agree, though, that most people probably are not.
That's with an RTX 4090; with a slower GPU the difference can be just 0%.

If you're CPU-limited? You get 170 fps, but with a faster CPU you can get 180 fps (RTX 4090 at 1080p gaming, right?).

You know, RAM can also raise your FPS more than a faster CPU alone.
 
About the temp template... who wants a 360 AIO in the first place, if not kids playing Fortnite with their headsets on?
I mean, the pump + fans make a hell of a noise; this is dumb shit to put in any room.
And I won't even talk about needing the space to fit a 360 AIO in an old case.
Can we assume they won't bundle some shitty noisy cooler with these so-called "efficient CPUs"? Will those things be delivered without a cooler?
OK, I'll bite here, but those marketing scums are shitty. These seem like good CPUs for people on 11th/12th gen, but when you look closer, it stinks a lot.
I guess I will just have to let my CPU die before I move to something "medium-good". What a shitshow...
 
I doubt people who just had their Intel CPU ruined would rush to buy another Intel. And Intel keeps downplaying the issue, saying it affected only a small number of CPUs.
I am one of those people, and I surely won't touch Intel for the next two generations even if it is the better bang for the buck (which is how I chose this time around, a 13700/13900K over Zen 4 at release, since Intel was ironically the better bang for the buck, and it actually turned out worse for my use cases).
 
Just checked a comparison video on YouTube with a 265K and a 14700K, where the 265K doesn't look good at all: fairly high temps and lower fps.
Must be a fake vid.
 