Raptor Lake Refresh is coming!

What product has Intel released recently that wasn't finished? Arc? I mean, it's expected that their drivers have some catching up to do with their first consumer dGPU.
Rocket Lake, Alder Lake, Arc.
Rocket Lake was a complete joke and completely unnecessary. Alder Lake was really broken in the first few months: "press Print to park E-cores", legacy modes that don't even work, pulled BIOSes, massive clock stretching when overclocked (or not being able to overclock at all), DDR4-3200 unstable at 1DPC, Windows 11 mandatory.
AMD has been releasing LOTR Gollum-level platforms since Ryzen launched.
Ada from Nvidia is riddled with bugs: massive DPC latency issues that are still not fixed, driver crashes, black screens, monitors not waking up, melting 12VHPWR plugs, etc.

The current state of PC hardware is atrocious.
Other things that come to mind right now:
PSUs blowing up (the CWT platform sold by Gigabyte), SSDs deteriorating within weeks (Samsung), €1,300 OLEDs with no burn-in warranty, motherboards that push half a kilowatt into shut-down Zen 4 chips, rotated SMDs blowing up, AIOs across many different vendors constantly clogging up and dying, quality products getting even more expensive (Noctua raised prices by 10-20% across the board a couple of months ago).
It's an absolute shitshow.


At least they had Pimp My Ride-style spinners for case fans and fancy displays inside a side panel at Computex. :confused:

What's not proper about the 13400?
Maybe because it's not Raptor Lake.
 
Rocket Lake was a complete joke and completely unnecessary.
I disagree. It was the first product on a new architecture since Skylake (or rather, Sandy Bridge?), it had DDR4-3200 support out of the box, and Xe graphics with HDMI 2.0 and DP 1.4. Sure, it didn't bring much improvement over Comet Lake in raw performance, but that wasn't the point, as the two generations basically coexisted.
 
Well, they think they do. It's funny that these companies think they must release something, lol - they already have an existing product to sell.
They do it because it increases sales. Same reason AMD did the Rebrandeon thing. It worked. Consumers see the number go up and pry open their wallets.

I disagree. It was the first product on a new architecture since Skylake (or rather, Sandy Bridge?), it had DDR4-3200 support out of the box, and Xe graphics with HDMI 2.0 and DP 1.4. Sure, it didn't bring much improvement over Comet Lake in raw performance, but that wasn't the point, as the two generations basically coexisted.
While Rocket Lake didn't beat top-end Comet Lake, its per-core performance was notably superior to the old Skylake designs. In software that used AVX, like StarCraft or Sins of a Solar Empire, this was very noticeable.

That got lost in the "ZOMG it only has 8 cores lame LOLZ" flinging.
 
Rocket Lake, Alder Lake, Arc.
Rocket Lake was a complete joke and completely unnecessary. Alder Lake was really broken in the first few months: "press Print to park E-cores", legacy modes that don't even work, pulled BIOSes, massive clock stretching when overclocked (or not being able to overclock at all), DDR4-3200 unstable at 1DPC, Windows 11 mandatory.
AMD has been releasing LOTR Gollum-level platforms since Ryzen launched.
Ada from Nvidia is riddled with bugs: massive DPC latency issues that are still not fixed, driver crashes, black screens, monitors not waking up, melting 12VHPWR plugs, etc.

The current state of PC hardware is atrocious.
Other things that come to mind right now:
PSUs blowing up (the CWT platform sold by Gigabyte), SSDs deteriorating within weeks (Samsung), €1,300 OLEDs with no burn-in warranty, motherboards that push half a kilowatt into shut-down Zen 4 chips, rotated SMDs blowing up, AIOs across many different vendors constantly clogging up and dying, quality products getting even more expensive (Noctua raised prices by 10-20% across the board a couple of months ago).
It's an absolute shitshow.


At least they had Pimp My Ride-style spinners for case fans and fancy displays inside a side panel at Computex. :confused:


Maybe because it's not Raptor Lake.

Underwhelming performance aside, RKL was not buggy; it just wasn't substantially better than CML. For each improvement there was a regression, and that was expected of the first generation of post-Skylake modern-design CPUs. CML had to be great - after all, it was the same processor Intel had been re-releasing for four full generations. ADL and Arc were new tech, and Windows 11 isn't mandatory - I'm running 10 on my 13900KS system only because, ironically, 11 doesn't function correctly on my machine.

AMD's "crimes" were far worse - socket AM4 with AGESA issues throughout the socket's entire lifetime, fires on socket AM5, manufacturing defects causing overheating on the RX 7900 XTX, etc.

The DPC latency bug in Nvidia's driver affects more than just Ada, but yes, the entire product stack is terrible and so are the prices. 4090s are selling like hotcakes, though, so expect it to worsen over time until AMD can collect itself and release a product worthy of being called a competitor. With the 7900 XTX being a failed product that has been repositioned a tier below, you can reasonably expect this to remain true at least for the current generation.
 
Underwhelming performance aside, RKL was not buggy; it just wasn't substantially better than CML. For each improvement there was a regression, and that was expected of the first generation of post-Skylake modern-design CPUs. CML had to be great - after all, it was the same processor Intel had been re-releasing for four full generations. ADL and Arc were new tech, and Windows 11 isn't mandatory - I'm running 10 on my 13900KS system only because, ironically, 11 doesn't function correctly on my machine.

AMD's "crimes" were far worse - socket AM4 with AGESA issues throughout the socket's entire lifetime, fires on socket AM5, manufacturing defects causing overheating on the RX 7900 XTX, etc.

The DPC latency bug in Nvidia's driver affects more than just Ada, but yes, the entire product stack is terrible and so are the prices. 4090s are selling like hotcakes, though, so expect it to worsen over time until AMD can collect itself and release a product worthy of being called a competitor. With the 7900 XTX being a failed product that has been repositioned a tier below, you can reasonably expect this to remain true at least for the current generation.
I really don't understand how AMD got away with platform support that genuinely made Intel look good by comparison. You shouldn't need community-wide outrage campaigns to get firmware updates that were promised by the corporation that made them.

The 4090 is selling, but it's apparently the only thing selling well, if Nvidia's gaming revenue numbers are to be believed. The rest are very meh. All AMD had to do was make a larger version of what they had, and the 7900 XTX launched with the force of a wet fart. I wouldn't call it a failure - it's cheaper and performs better than a 4080 - but at the same time I expected more from a GPU that size.
 
With the 7900 XTX being a failed product that has been repositioned a tier below, you can reasonably expect this to remain true at least for the current generation.
When using the same node, AMD has been behind Nvidia for a long time now. The last AMD GPU that used the same node and gave Nvidia pause was the 290X, nearly ten years ago. The Fury X came close to replicating that at stock, but the 980 Ti had too much headroom when overclocked. Still, in hindsight, even the Fury X was more competitive than most of its successors.
 
It would be kind of neat to see them release a "gaming-focused" chip: 8 P-cores only, clocked higher than if it were paired with its E-core cousins.

That could possibly match or exceed a 7800X3D, depending on how much higher they could clock it without the extra heat/watts from the E-cores.

Even the 6 GHz boost flagship 13900K has to share its heat load with E-cores; I bet that boost could be a little higher without the extra heat/spacing on the die.
 
It would be kind of neat to see them release a "gaming-focused" chip: 8 P-cores only, clocked higher than if it were paired with its E-core cousins.

That could possibly match or exceed a 7800X3D, depending on how much higher they could clock it without the extra heat/watts from the E-cores.

Even the 6 GHz boost flagship 13900K has to share its heat load with E-cores; I bet that boost could be a little higher without the extra heat/spacing on the die.
Nothing stopping you from disabling the E-cores and getting a 100-200 MHz higher P-core OC if you were limited by thermals, although this would be offset by all background tasks now having to run on the P-cores.
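If you're on Linux, you don't even need a BIOS trip to test this. Here's a minimal sketch, assuming a hybrid Intel chip where the kernel exposes the E-core list at /sys/devices/cpu_atom/cpus (recent kernels do for hybrid parts, but treat that path as an assumption; needs root):

```python
# Park the E-cores at runtime on Linux instead of rebooting into the BIOS.
# Assumes a hybrid Intel CPU whose kernel exposes the "atom" core list at
# /sys/devices/cpu_atom/cpus as a range string such as "16-31"; needs root.
from pathlib import Path

def parse_cpu_list(text: str) -> list[int]:
    """Expand a kernel CPU list like '16-31' or '16-23,28' into ints."""
    cpus: list[int] = []
    for part in text.strip().split(","):
        lo, _, hi = part.partition("-")
        cpus.extend(range(int(lo), int(hi or lo) + 1))
    return cpus

def set_online(cpus: list[int], online: bool) -> None:
    """Hot-(un)plug each logical CPU via sysfs (cpu0 cannot be offlined)."""
    for cpu in cpus:
        Path(f"/sys/devices/system/cpu/cpu{cpu}/online").write_text(
            "1" if online else "0"
        )

if __name__ == "__main__":
    e_cores = parse_cpu_list(Path("/sys/devices/cpu_atom/cpus").read_text())
    set_online(e_cores, online=False)  # pass online=True to bring them back
    print(f"Parked E-cores: {e_cores}")
```

Writing "1" back to the same sysfs files re-enables them, so it's easy to A/B the boost clocks with and without the E-cores active.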
 
E-cores don't run hot anyways...
 
Well, they think they do. It's funny that these companies think they must release something, lol - they already have an existing product to sell.

Maybe we get a 7 GHz 500 W CPU that's super duper high up the V/F curve. :)


Of course, no doubt.

So...we get new products because there's always a segment of the market that needs to have the best. There's a segment that needs "value" options. There's a segment that wants to spend almost nothing and get something.

The bleeding edge exists in the 5-10% performance gains. That's what these refreshes deliver: improvements that are not really substantive for the average user...but someone doing real processing might see value in cutting things like renders by 3-7%, which in the real world might mean literally hundreds of hours less time invested (quick arithmetic below).
The "value" consumer looks at "old" tech and sees value. I don't really need a 13900K when a 5600X gives me nearly the same performance in almost everything at a fraction of the cost.
The something-for-nothing consumer looks at the 11000 series as an upgrade and is willing to buy something out of date for pennies on the dollar...because after using it for 3 years, they'll be upgrading to the 13000 series.
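To put rough numbers on the render example above (a quick sketch; the farm size and hours are made up, purely for scale):

```python
# Back-of-the-envelope check on the "cut renders by 3-7%" claim.
machines = 10                  # a hypothetical small render farm
hours_per_machine = 40 * 52    # ~2080 render-hours per machine per year
total_hours = machines * hours_per_machine

for speedup in (0.03, 0.05, 0.07):
    saved = total_hours * speedup
    print(f"{speedup:.0%} faster renders -> ~{saved:,.0f} hours saved per year")

# 3% -> ~624 h, 5% -> ~1,040 h, 7% -> ~1,456 h across the farm:
# literally hundreds of hours, exactly the tier a refresh is sold to.
```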

The thing is, if any of these tiers stagnates, the flow of upgrade money dies. No new product means static old-product pricing, which means nobody spends anything. Paraphrasing: the spice must flow. Rational people only upgrade every few generations, once connectivity and performance finally matter enough. Thing is, the rational middle is not the spender that companies care about...(says the person who went from a 3930K to a 5700X after a decade)




Put very briefly, it's about racking up generations. Sales peak and ebb, but as long as the 1x000 series becomes the 1y000 series, there's never a trough of diminished sales due to complete market penetration. Surely nobody would put out the 1y000 unless it was better than the 1x000...in some fashion, right?
 
Well, what happens when they don't keep releasing new stuff is that prices on old stock gradually go down. Pushing new stuff out keeps spending high, both through fresh interest (which is what you're saying) and by keeping prices up for the latest on the market.

I think we're both saying the same thing, though, as I do agree with your last point: they're trying to avoid max market penetration, as that's the moment it becomes a buyer's market. Plus, a new gen gets a fresh load of media interest from TPU etc.

As a consumer, I feel better if what I buy stays the latest for a number of years, and then the next product has a big wow effect, which is more likely with a longer gap from the last generation.

Feels like only yesterday I got my 5600G, and it's about to become two gens old. With the rush of new gens from Intel, my 9900K was starting to feel old, but when I ordered my Noctua bracket and had to fetch the receipts, I realised it was barely three years ago that I got the thing.
 
It would be kind of neat to see them release a "gaming-focused" chip: 8 P-cores only, clocked higher than if it were paired with its E-core cousins.

That could possibly match or exceed a 7800X3D, depending on how much higher they could clock it without the extra heat/watts from the E-cores.

Even the 6 GHz boost flagship 13900K has to share its heat load with E-cores; I bet that boost could be a little higher without the extra heat/spacing on the die.

Ha ha ha ha...

(wipes tear from eye, due to laughing)

You can't be serious, can you? I ask because this was exactly the same logic provided years ago when Intel decided to start bolting iGPUs onto everything. It'd be much better if you could just disable that part of the silicon, and we'd be back to Sandy Bridge levels of performance, right? Then Intel did exactly that...and what we got was silicon with defective iGPUs, presumably physically disabled iGPUs, and performance that was only negligibly better. That is to say, all the hot stuff was still forced close together to minimize trace lengths...so that dark silicon was basically just a huge heat buffer that didn't give us much of anything.

Holy crap, I'm old. I've now seen the same argument I made a decade or more ago, and now I have to inform someone optimistically looking for the same copium that it will not come. It's funny how age gives you perspective, and how we keep making the same mistakes without learning anything from them. Sigh.
 
I would hate to lose iGPUs; they're handy. For server-type usage there's no need to waste a valuable PCIe slot on display output, for testing a new build there's no need for a dGPU, and they're also useful for encoding tasks.

Hopefully AMD will get its act together on this and include them as standard, removing the need for G chips.

Also, consider that the days of £30 discrete GPUs are gone, so for people wanting just low-end use, an iGPU is quite valuable.

Testing a 13700K right now on its iGPU.
 
I would hate to lose iGPUs; they're handy. For server-type usage there's no need to waste a valuable PCIe slot on display output, for testing a new build there's no need for a dGPU, and they're also useful for encoding tasks.

Hopefully AMD will get its act together on this and include them as standard, removing the need for G chips.

Also, consider that the days of £30 discrete GPUs are gone, so for people wanting just low-end use, an iGPU is quite valuable.

Testing a 13700K right now on its iGPU.
I need an iGPU, mostly because every single dGPU comes with only one HDMI port these days, and I need two.
 
I need an iGPU, mostly because every single dGPU comes with only one HDMI port these days, and I need two.
Yeah, I always find it odd that GPUs come with more DisplayPorts than HDMI ports.
 
What's not proper about the 13400?
They're not true Raptor Lakes, even when they aren't ADL.

Looking at Meteor Lake, it looks like Intel is pulling a Broadwell (accompanied by a Devil's Canyon) again.
 
Rocket Lake was a complete joke and completely unnecessary. Alder Lake was really broken in the first few months: "press Print to park E-cores", legacy modes that don't even work, pulled BIOSes, massive clock stretching when overclocked (or not being able to overclock at all), DDR4-3200 unstable at 1DPC, Windows 11 mandatory.
1) Not sure what you're on about here. Raptor Lake is better than Alder Lake: higher clocks for the same power draw, more cache, better performance, and the price came down. The i5-13600K is nearly the same performance as a 12900KS.
2) The E-core issues aren't the calamity people make them out to be.
3) I run Windows 10 LTSC, no issues.
 
I would hate to lose iGPUs; they're handy. For server-type usage there's no need to waste a valuable PCIe slot on display output, for testing a new build there's no need for a dGPU, and they're also useful for encoding tasks.

Hopefully AMD will get its act together on this and include them as standard, removing the need for G chips.

Also, consider that the days of £30 discrete GPUs are gone, so for people wanting just low-end use, an iGPU is quite valuable.

Testing a 13700K right now on its iGPU.

All current-gen Ryzen 7000 AMD CPUs have an iGPU. Their act has been together since last year, then?
 
They're not true Raptor Lakes, even when they aren't ADL.

Looking at Meteor Lake, it looks like Intel is pulling a Broadwell (accompanied by a Devil's Canyon) again.

And that's assuming they actually get any improvement on this node vs. the 13900KS, which I find exceptionally difficult to believe. Intel hasn't been able to keep up with the demand for this halo chip, and it's one that literally no one buys vs. the regular 13900K.

I'm willing to bet that the refresh will primarily target the i3, i5 and i7 segments.
 
And that's assuming they actually get any improvement on this node vs. the 13900KS, which I find exceptionally difficult to believe. Intel hasn't been able to keep up with the demand for this halo chip, and it's one that literally no one buys vs. the regular 13900K.

I'm willing to bet that the refresh will primarily target the i3, i5 and i7 segments.

I'm more fascinated by what they'll actually do. It's asking a lot to push those chips even harder on non-exotic cooling. I guess they could shift everything down a tier and call it a day - a 13900K for $400 as a 14700K wouldn't be terrible.
 
I need an iGPU, mostly because every single dGPU comes with only one HDMI port these days, and I need two.
How do you use them at the same time?
 
And that's assuming they actually get any improvement on this node vs. the 13900KS, which I find exceptionally difficult to believe. Intel hasn't been able to keep up with the demand for this halo chip, and it's one that literally no one buys vs. the regular 13900K.

I'm willing to bet that the refresh will primarily target the i3, i5 and i7 segments.
^This. It's easy to forget on a forum like this, but the vast majority of people opt for the i5 and lower segments. As has been pointed out by @GerKNG and @wNotyarD in this thread, the 13400F is still based on Golden Cove and doesn't get any of the performance benefits of the Raptor Cove P-cores and updated efficiency cores. A 14400F or 13490F with 15% more performance than the 13400F will benefit many more people than a 14900KS with 100-200 MHz higher turbo clocks.
 
All current-gen Ryzen 7000 AMD CPUs have an iGPU. Their act has been together since last year, then?
Glad to hear it. I remember they said they were changing their plans on that, but I didn't know if they did it with the 7000 series.
 
How do you use them at the same time?
Easily - just enable dual graphics in the BIOS. On my HTPC, I can even use the Xe iGPU with its HDMI 2.0 output to do 4K 60 Hz, while the 1050 Ti does 3D if needed.

Glad to hear it. I remember they said they were changing their plans on that, but I didn't know if they did it with the 7000 series.
They did, which I'm quite happy about. :)
 