
Raptor Lake Refresh is coming!

I would hate to lose iGPUs; they're handy. On server-type usage there's no need to waste a valuable PCIe slot for display output, for testing a new build there's no need for a dGPU, and they're also useful for encoding tasks.

Hopefully AMD will get its act together on this and include them as standard, removing the need for G chips.

Also consider that the days of £30 discrete GPUs are gone, so for people wanting just low-end use, an iGPU is quite valuable.

Testing 13700k right now on its iGPU.
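
To illustrate the encoding point, here's a minimal sketch (assuming an ffmpeg build with Quick Sync/QSV support; the file names are placeholders) of handing a transcode to the iGPU without touching a dGPU or a PCIe slot:

```python
# Rough sketch: hand an H.264 transcode to the Intel iGPU's Quick Sync block via ffmpeg.
# Assumes an ffmpeg build with QSV support; "input.mkv" and "output.mp4" are placeholder names.
import subprocess

def qsv_transcode(src: str, dst: str, quality: int = 23) -> None:
    cmd = [
        "ffmpeg",
        "-hwaccel", "qsv",           # decode on the iGPU where possible
        "-i", src,
        "-c:v", "h264_qsv",          # encode on the iGPU's fixed-function hardware
        "-global_quality", str(quality),
        "-c:a", "copy",              # pass audio through untouched
        dst,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    qsv_transcode("input.mkv", "output.mp4")
```

The same idea works with hevc_qsv if you want H.265 instead.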

I...stop for just a moment. You seem to be missing what I'm trying to say, so let me take another swing at it.

iGPUs suck...in certain cases. Likewise, they're a godsend in others. Let me be clear that for the purposes of a consumer grade chunk of silicon, with a primary target audience of gamers, the inclusion of an iGPU is not ideal. Modern conveniences allow that silicon to functionally go dark in operation, but when they came out they were a useless silicon drain for people who had a dGPU.


Having said that, servers make great use of iGPUs. Troubleshooting is much easier with a built-in GPU. Media boxes are great with iGPUs. Modern conveniences make them a non-drain on the system. That does not change the fact that for a long time it was...not ideal for people to have a gaming CPU with a vestigial growth bolted onto it...especially when that growth meant dead space and higher temperatures. We're now seeing the same vestigial situation with P and E cores, where most consumers are unlikely to use both heavily enough that we wouldn't see better performance by removing the less useful cores and bumping the clocks higher. I'm laughing because I see my failures repeated, and it's just funny to think about how little we've changed despite all of the change in the world and decades passing.


So we're clear: I've set up a few AMD systems without a dGPU and one Intel system in the last five years. I...hate the Intel system, and the AMD systems work well enough. That said, with GPUs requiring you to sell a kidney to buy into the mid-range, it's getting more and more likely that iGPUs are the future of gaming until prices crash back to something reasonable. For that, they deserve praise and respect.
 
How is an iGPU that you can turn off in the BIOS a "drain on your system" and a heat source?
 
Yeah, I always find it odd that there are more DisplayPorts than HDMI ports on GPUs.
HDMI has licensing. DP does not.

Also, DP can drive HDMI without issue. HDMI, OTOH, has severe issues trying to drive DP, and between monitors and adapters it's hard to find a combo that actually works.

Pretty sensible. HDMI is a terrible standard that should have been put out to pasture a decade ago.
All current-gen Ryzen 7000 AMD CPUs have an iGPU. Their act has been together since last year then?
Those are a pittance: 2 CUs vs. 12 for the APUs. The 2 CU version is barely useful as a display adapter, while the 12 CU APU model is surprisingly capable. I assume, when he says "eliminating the need for G series", he means having the APU-sized GPU on all models.

Which I disagree with anyway; those who want a big dGPU and those who want a 16-core CPU tend to be different buyers. I'd rather see the G series combine an even larger iGPU, say 16 or 20 CUs, with the 3D cache that we've seen evidence of providing huge bumps to iGPU performance. THAT would be a genuinely interesting product.
Nothing's stopping you from disabling the E cores and having a 100-200 MHz higher P-core OC if you were limited by thermals, although this would be offset by all background tasks now having to run on the P cores.
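
If you'd rather keep the E cores and just keep background jobs off the P cores, process affinity gets you most of the way there without a BIOS trip. A rough sketch with psutil; the core numbering is an assumption (on many 8P+8E parts, logical CPUs 0-15 are the P-core threads and 16-23 the E cores), so check your own topology first:

```python
# Sketch: restrict a background process to the E cores so the P cores stay free.
# The E-core indices below are an assumption for an 8P+8E chip with Hyper-Threading;
# verify with lscpu / Task Manager before using them on your own system.
import psutil

E_CORES = list(range(16, 24))  # assumed E-core logical CPUs

def pin_to_e_cores(pid: int) -> None:
    proc = psutil.Process(pid)
    proc.cpu_affinity(E_CORES)  # limit scheduling of this process to the listed CPUs
    print(f"{proc.name()} (pid {pid}) pinned to CPUs {E_CORES}")

if __name__ == "__main__":
    # Example: pin every instance of a hypothetical background task by name.
    for p in psutil.process_iter(["name"]):
        if p.info["name"] == "backup_job.exe":
            pin_to_e_cores(p.pid)
```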
He wants it as a separate product though, just like ye olden times when people wanted a 2500K without the iGPU for "faster performance", even though the iGPU didn't slow the CPU down when not in use and there already existed silicon without one: the HEDT series. But that cost more money and didn't perform better, so.......
 
HDMI has licensing. DP does not.

Also, DP can drive HDMI without issue. HDMI, OTOH, has severe issues trying to drive DP, and between monitors and adapters it's hard to find a combo that actually works.

Pretty sensible. HDMI is a terrible standard that should have been put out to pasture a decade ago.
Except that it is still way more popular in basically anything other than gaming monitors for some reason.

He wants it as a separate product though, just like ye olden times when people wanted a 2500K without the iGPU for "faster performance", even though the iGPU didn't slow the CPU down when not in use and there already existed silicon without one: the HEDT series. But that cost more money and didn't perform better, so.......
I'll never get that when you can disable your iGPU (or your E-cores) with one click in the BIOS.
 
Those are a pittance: 2 CUs vs. 12 for the APUs. The 2 CU version is barely useful as a display adapter, while the 12 CU APU model is surprisingly capable. I assume, when he says "eliminating the need for G series", he means having the APU-sized GPU on all models.

That's not what he was referring to; quotes from his post:

"no need to waste a valuable PCIe slot for display output, for testing a new build there's no need for a dGPU, and they're also useful for encoding tasks.

....

Also consider that the days of £30 discrete GPUs are gone, so for people wanting just low-end use, an iGPU is quite valuable."

This is about diagnostics and a basic display adapter, not wasting a PCIe slot on those uses. A £30 discrete card is a display adapter like a GT 710.
 
Except that it is still way more popular in basically anything other than gaming monitors for some reason.
Betamax vs. VHS.
LaserDisc vs. VHS.
FireWire vs. USB.
Zune vs. iPod.

The list goes on. The superior interface almost always gets done in by the inferior, but more consooomer-oriented, competitor. I blame consoles; things like the Xbox 360 pushed HDMI HARD in the consooomer space, where only nerdy business PCs used DisplayPort.
 
Blame or no blame, we need HDMI more than we need DP on our graphics cards.

In fact, I do not own a single DP-capable display at the moment.
 
I prefer DP, but I understand that HDMI is the mass market's choice.
 
My motherboard has a DP output, which I'm not using at the moment; just DP on the GPU.
 
Blame or no blame, we need HDMI more than we need DP on our graphics cards.

In fact, I do not own a single DP-capable display at the moment.
Going from DP to HDMI is trivial. Going from HDMI to DP is a nightmare. And HDMI has licensing attached per port. DP does not.

That's why GPUs usually do 3 DP and 1 HDMI. There is zero advantage for GPU makers to include more HDMI, unless it is a specialized product for digital signage or such.

Every single display I own uses DP. Most high refresh rate monitors use DP as a primary. The only thing that uses HDMI is my TV.
 
I prefer DP, but I understand that HDMI is the mass market's choice.
I don't mind either as long as it works. Connector standards are a secondary choice to picture quality to me when buying a monitor (which I haven't done since 2017).
 
I wasn't aware of licensing per port; that explains the lopsidedness.
 
Besides having an iGPU in a laptop and for troubleshooting, I wouldn't miss it. Some motherboards don't even have the ability to power it (usually overclocking ones), so it's a waste of silicon. The KF CPUs have it disabled, but those are damaged CPUs, rebranded.

If Intel sold a future CPU without the iGPU, I wouldn't be sad.
 
I always buy a CPU with an iGPU; I don't have working spare GPUs lying around...
 
I always buy a CPU with an iGPU; I don't have working spare GPUs lying around...
I have lots of spare (even gaming-grade) GPUs in the house, but I still want an iGPU in my CPU. :D
 
I don't mind either as long as it works. Connector standards are a secondary choice to picture quality to me when buying a monitor (which I haven't done since 2017).
I used to use DVI-D on my old monitor with the 1080 Ti (it had DP, but it didn't function properly; I assume buggy monitor firmware). With the new monitor I'm now using DP for the main PC.

My 2nd machine (Ryzen) is using a Dell 2209WA screen which natively only has DVI-D and VGA. I bought a DVI-to-HDMI cable last year, so it now connects via HDMI; I'm currently using that screen connected to the HDMI iGPU port on the new board to test it.

So HDMI is nice to have, but luckily I have only ever needed one port.
 
I always buy a CPU with an iGPU; I don't have working spare GPUs lying around...
I'm iGPU-less because there's no other option for both my CPU and my MB, and there's always the fear my GPU will fail and I'll have no video. When I do rebuild, it'll be with an iGPU, however basic it may be.
 
^This. It's easy to forget on a forum like this, but the vast majority of people opt for the i5 and lower segments. As has been pointed out by @GerKNG and @wNotyarD in this thread, the 13400F is still based on Golden Cove and doesn't get any of the performance benefits of the Raptor Cove P cores and updated efficiency cores. A 14400F or 13490F with 15% more performance than the 13400F will benefit many more people than a 14900KS with 100 to 200 MHz higher turbo clocks.

Those extra 200 MHz would cost literally 150 watts of power, too. I'm quite sure they will be deploying Raptor Cove at the low end, which will benefit the most, and bumping clocks on the i7 and lower segments; I was thinking i9-13900K clocks on an i7-13700K configuration (8P+8E) for the i7 segment refresh/i7-13750K. The i9 segment would be unchanged, or they'd retire the i9-13900K in favor of a 13950K with 100 MHz higher clocks than the 13900K but 100 MHz lower than the 13900KS. That's my personal take anyway; I could be surprised.

These KS chips really are pushed to the limit out of the box; it gets very unreasonable very fast past that, and I fail to see how they could extract more out of the Intel 7 Ultra/10ESF+ process.
 
I wasn't aware of licensing per port; that explains the lopsidedness.
I think it's mainly the compatibility.

If you sell a card with 4 outputs and a user wants to drive triple monitors, then a card with 3 DP and 1 HDMI can drive any combination of DP and HDMI monitors you want, since adapters are dirt cheap and just work. Hell, you can do a combo of DP, HDMI, and VGA. Or DVI, HDMI, and DP.

But if you have a card with 3 HDMI and 1 DP, then your triple-monitor config must have either 3 HDMI monitors or 2 HDMI and 1 DP monitor (or DVI, which can be converted from either). No other config will function, and you will get calls from customers who can't get their video card to output on their three DP monitors.

It's just easier to do as many DP as possible.
I always buy a CPU with an iGPU; I don't have working spare GPUs lying around...
Every desktop build I've done has been with a dGPU, dating back to my high school days. The only iGPU I've used has been in laptops.

Those extra 200 MHz would cost literally 150 watts of power, too. I'm quite sure they will be deploying Raptor Cove at the low end, which will benefit the most, and bumping clocks on the i7 and lower segments; I was thinking i9-13900K clocks on an i7-13700K configuration (8P+8E) for the i7 segment refresh/i7-13750K. The i9 segment would be unchanged, or they'd retire the i9-13900K in favor of a 13950K with 100 MHz higher clocks than the 13900K but 100 MHz lower than the 13900KS. That's my personal take anyway; I could be surprised.

These KS chips really are pushed to the limit out of the box; it gets very unreasonable very fast past that, and I fail to see how they could extract more out of the Intel 7 Ultra/10ESF+ process.
We thought the same thing of 14 nm, yet Intel kept finding new ways to push another 100-200 MHz out of those things. Hell, we all thought the 5 GHz Coffee Lake was the final limit, then came Comet Lake with 5.3 GHz Thermal Velocity Boost.
 
We thought the same thing of 14 nm, yet Intel kept finding new ways to push another 100-200 MHz out of those things. Hell, we all thought the 5 GHz Coffee Lake was the final limit, then came Comet Lake with 5.3 GHz Thermal Velocity Boost.

Agreed, but at the same time, Raptor Lake is already their 10 nm process perfected (10 nm/CNL, 10SF/ICL, 10ESF/ADL "Intel 7", 10ESF+/RPL "Intel 7 Ultra"). It'd just be very, very hard at this point, IMO. It's at the same step in lithography as 14 nm was when CML and RKL came around.

LOL, close, and it made me laugh. Such an Intel thing. 5.5 GHz is ~320 W for me, 5.7 is 380 W, and 5.7/6.0 (2-core) is 410 W. Can't cool that without going direct die.

There's the V/F efficiency curve to account for, and I suspect that's why the KS binning is so tight (read: low availability). The 13900KS ships at the very edge of it; that became clear to me as I was trying to undervolt my chip.
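
For a sense of why that edge of the curve is so expensive, here's a back-of-the-envelope with the standard dynamic-power relation (P ≈ C·V²·f). The voltage/frequency pairs below are made-up illustrations, not measurements of any particular chip:

```python
# Sketch: relative dynamic power for a clock/voltage bump, using P_dyn ∝ C * V^2 * f.
# The V/f pairs are illustrative assumptions, not measured values for any real CPU.

def relative_power(f0_ghz: float, v0: float, f1_ghz: float, v1: float) -> float:
    """Return P1/P0, assuming dynamic power scales with V^2 * f (same switched capacitance)."""
    return (v1 ** 2 * f1_ghz) / (v0 ** 2 * f0_ghz)

if __name__ == "__main__":
    # Hypothetical: 5.5 GHz at 1.30 V vs. 5.8 GHz at 1.42 V
    ratio = relative_power(5.5, 1.30, 5.8, 1.42)
    print(f"~{(ratio - 1) * 100:.0f}% more dynamic power for ~5% more clock")
```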
 
Blame or no blame, we need HDMI more than we need DP on our graphics cards.

In fact, I do not own a single DP-capable display at the moment.
Seems like a you issue TBH.

Maybe HDMI is more common for TVs, but any semi-modern computer monitor will have a DisplayPort.

HDMI is a worse standard, as previously stated by others. DisplayPort has higher bandwidth, and you can use a DP-to-HDMI adapter if you insist on using HDMI-only monitors.

Agreed, but at the same time, Raptor Lake is already their 10 nm process perfected (10 nm/CNL, 10SF/ICL, 10ESF/ADL "Intel 7", 10ESF+/RPL "Intel 7 Ultra"). It'd just be very, very hard at this point, IMO. It's at the same step in lithography as 14 nm was when CML and RKL came around.



There's the V/F efficiency curve to account for, and I suspect that's why the KS binning is so tight (read: low availability). The 13900KS ships at the very edge of it; that became clear to me as I was trying to undervolt my chip.
We'll see.

Aside from the sidegrade that was Rocket Lake, Intel has been a solid choice for three generations now. I don't expect 14th gen to be different.
 
We thought the same thing of 14 nm, yet Intel kept finding new ways to push another 100-200 MHz out of those things. Hell, we all thought the 5 GHz Coffee Lake was the final limit, then came Comet Lake with 5.3 GHz Thermal Velocity Boost.
You might be right, but I think you're forgetting that, besides the 7700K, Intel didn't just increase clock speed. They also increased the number of cores, but that's unlikely with any 14900K. The 13900K is already a big die for a mass-market CPU: 257 mm^2.
 
Seems like a you issue TBH.

Maybe HDMI is more common for TVs, but any semi-modern computer monitor will have a DisplayPort.

HDMI is a worse standard, as previously stated by others. DisplayPort has higher bandwidth, and you can use a DP-to-HDMI adapter if you insist on using HDMI-only monitors.
Maybe worse on a technical level, but way more common from TVs to non-gaming, non-professional monitors. This isn't a me issue.
 
There's the V/F efficiency curve to account for, and I suspect that's why the KS binning is so tight (read: low availability). The 13900KS ships at the very edge of it; that became clear to me as I was trying to undervolt my chip.
I don't think I'll have the same success undervolting my 13700K as my 9900K; if there were headroom on the 13700K, I expect it would have been a 13900K instead.

At the moment I'll just be testing it with a lowered PL1 to make sure basic functions work, but I'll maybe try some kind of undervolt later. I should at least be able to get rid of any overvolting ASRock is doing.
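
For anyone doing the same PL1 sanity check from Linux instead of the BIOS, the kernel's intel-rapl powercap interface exposes what the board actually applied. A read-only sketch; the sysfs path assumes the usual intel-rapl:0 package domain:

```python
# Sketch: read the package power limits (PL1 long_term / PL2 short_term) that the
# board/BIOS applied, via Linux's intel-rapl powercap sysfs interface. Read-only here;
# writing the *_power_limit_uw files as root is how you'd lower PL1 without a reboot.
from pathlib import Path

RAPL_PKG = Path("/sys/class/powercap/intel-rapl:0")  # assumed package-domain path

def read_limits() -> dict:
    limits = {}
    for constraint in ("constraint_0", "constraint_1"):  # typically long_term / short_term
        name = (RAPL_PKG / f"{constraint}_name").read_text().strip()
        microwatts = int((RAPL_PKG / f"{constraint}_power_limit_uw").read_text())
        limits[name] = microwatts / 1_000_000  # microwatts -> watts
    return limits

if __name__ == "__main__":
    for name, watts in read_limits().items():
        print(f"{name}: {watts:.0f} W")
```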
 