
Raptor Lake Refresh is coming!

Joined
Apr 2, 2011
Messages
2,857 (0.57/day)
I would hate to lose iGPUs; they're handy. On server-type builds there's no need to waste a valuable PCIe slot on a display, for testing a new build there's no need for a dGPU, and they're also useful for encoding tasks.

Hopefully AMD will get its act together on this and include them as standard, removing the need for G chips.

Also, consider that the days of £30 discrete GPUs are gone, so for people wanting just low-end use, an iGPU is quite valuable.

Testing a 13700K right now on its iGPU.

I...stop for just a moment. You seem to be missing what I'm trying to say, so let me take another swing at it.

iGPUs suck...in certain cases. Likewise, they're a godsend in others. Let me be clear that for a consumer-grade chunk of silicon with a primary target audience of gamers, the inclusion of an iGPU is not ideal. Modern conveniences allow that silicon to functionally go dark in operation, but when iGPUs first came out they were a useless silicon drain for anyone who had a dGPU.


Having said that, servers make great use of iGPUs. Troubleshooting is much easier with a built-in GPU. Media boxes are great with iGPUs. Modern conveniences make them a non-drain on the system. That does not change the fact that for a long time it was...not ideal for people to have a gaming CPU with a vestigial growth bolted onto it...especially when that growth meant dead space and higher temperatures. We're now seeing the same vestigial usage of P and E cores, where consumers are unlikely to use both so rabidly that we wouldn't see better performance by removing the less useful cores and bumping the clocks higher. I'm laughing because I see my failures repeated, and it's just funny to think about how little we've changed despite all of the change in the world and decades passing.


So we're clear: I've set up a few AMD systems without a GPU, and one Intel system, in the last five years. I...hate the Intel system, and the AMD systems work well enough. That said, with GPUs requiring you to sell a kidney to buy into the mid-range, it's getting more and more likely that iGPUs are the future of gaming until prices crash back to reasonable. For that, they deserve praise and respect.
 
Joined
Jan 14, 2019
Messages
13,267 (6.06/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
I...stop for just a moment. You seem to be missing what I'm trying to say, so let me take another swing at it.

iGPUs suck...in certain cases. Likewise, they're a godsend in others. Let me be clear that for a consumer-grade chunk of silicon with a primary target audience of gamers, the inclusion of an iGPU is not ideal. Modern conveniences allow that silicon to functionally go dark in operation, but when iGPUs first came out they were a useless silicon drain for anyone who had a dGPU.


Having said that, servers make great use of iGPUs. Troubleshooting is much easier with a built-in GPU. Media boxes are great with iGPUs. Modern conveniences make them a non-drain on the system. That does not change the fact that for a long time it was...not ideal for people to have a gaming CPU with a vestigial growth bolted onto it...especially when that growth meant dead space and higher temperatures. We're now seeing the same vestigial usage of P and E cores, where consumers are unlikely to use both so rabidly that we wouldn't see better performance by removing the less useful cores and bumping the clocks higher. I'm laughing because I see my failures repeated, and it's just funny to think about how little we've changed despite all of the change in the world and decades passing.


So we're clear: I've set up a few AMD systems without a GPU, and one Intel system, in the last five years. I...hate the Intel system, and the AMD systems work well enough. That said, with GPUs requiring you to sell a kidney to buy into the mid-range, it's getting more and more likely that iGPUs are the future of gaming until prices crash back to reasonable. For that, they deserve praise and respect.
How is an iGPU that you can turn off in the BIOS a "drain on your system" and a heat source?
 
Joined
Dec 28, 2012
Messages
4,032 (0.92/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
Yeah, I always find it odd that there are more DisplayPorts than HDMI ports on GPUs.
HDMI has licensing. DP does not.

Also, DP can drive HDMI without issue. HDMI, OTOH, has severe issues trying to drive DP, and between monitors and adapters it's hard to find a combo that actually works.

Pretty sensible. HDMI is a terrible standard that should have been put out to pasture a decade ago.
All current-gen Ryzen 7000 AMD CPUs have an iGPU. Their act has been together since last year then?
Those are a pittance: 2 CUs vs 12 for the APUs. The 2 CU version is barely useful as a display adapter; the 12 CU APU model is surprisingly capable. I assume, when he says "eliminating the need for G series", he means having the APU-sized iGPU on all models.

Which I disagree with anyway; those who want a big dGPU and those who want a 16-core CPU tend to be different buyers. I'd rather see the G series combine an even larger iGPU, say 16 or 20 CUs, with the 3D cache that we've seen evidence of providing huge bumps to iGPU performance. THAT would be a genuinely interesting product.
Nothing stopping you from disabling the E cores and having a 100-200 MHz higher P-core OC if you were limited by thermals, although this would be offset by all background tasks now having to run on the P cores.
He wants it as a separate product though, just like ye olden times when people wanted a 2500K without the iGPU for "faster performance", even though the iGPU didn't slow the CPU down when not in use, and there existed silicon to do that: the HEDT series. But that cost more money and didn't perform better, so...
 
HDMI has licensing. DP does not.

Also, DP can drive HDMI without issue. HDMI, OTOH, has severe issues trying to drive DP, and between monitors and adapters it's hard to find a combo that actually works.

Pretty sensible. HDMI is a terrible standard that should have been put out to pasture a decade ago.
Except that it is still way more popular in basically anything other than gaming monitors for some reason.

He wants it as a separate product though, just like ye olden times when people wanted a 2500K without the iGPU for "faster performance", even though the iGPU didn't slow the CPU down when not in use, and there existed silicon to do that: the HEDT series. But that cost more money and didn't perform better, so...
I'll never get that when you can disable your iGPU (or your E-cores) with one click in the BIOS.
 
Joined
Jul 20, 2020
Messages
1,166 (0.71/day)
System Name Gamey #1 / #3
Processor Ryzen 7 5800X3D / Ryzen 7 5700X3D
Motherboard Asrock B450M P4 / MSi B450 ProVDH M
Cooling IDCool SE-226-XT / IDCool SE-224-XTS
Memory 32GB 3200 CL16 / 16GB 3200 CL16
Video Card(s) PColor 6800 XT / GByte RTX 3070
Storage 4TB Team MP34 / 2TB WD SN570
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / TT Versa H18
Power Supply EVGA 650 G3 / EVGA BQ 500
Those are a pittance: 2 CUs vs 12 for the APUs. The 2 CU version is barely useful as a display adapter; the 12 CU APU model is surprisingly capable. I assume, when he says "eliminating the need for G series", he means having the APU-sized iGPU on all models.

That's not what he was referring to. Quoting from his post:

"no need to waste a valuable PCIe slot on a display, for testing a new build there's no need for a dGPU, and they're also useful for encoding tasks.

....

Also, consider that the days of £30 discrete GPUs are gone, so for people wanting just low-end use, an iGPU is quite valuable."

This is about diagnostics and a basic display adapter, without wasting a PCIe slot for those uses. A £30 discrete card is a display adapter like a GT 710.
 
Except that it is still way more popular in basically anything other than gaming monitors for some reason.
Betamax vs VHS
LaserDisc vs VHS
FireWire vs USB
Zune vs iPod

The list goes on. The superior interface almost always gets done in by the inferior, but more consooomer-oriented, competitor. I blame consoles; things like the Xbox 360 pushed HDMI HARD in the consooomer space while only nerdy business PCs used DisplayPort.
 
Betamax vs VHS
LaserDisc vs VHS
FireWire vs USB
Zune vs iPod

The list goes on. The superior interface almost always gets done in by the inferior, but more consooomer-oriented, competitor. I blame consoles; things like the Xbox 360 pushed HDMI HARD in the consooomer space while only nerdy business PCs used DisplayPort.
Blame or no blame, we need HDMI more than we need DP on our graphics cards.

In fact, I do not own a single DP-capable display at the moment.
 
Joined
Nov 26, 2021
Messages
1,718 (1.51/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Blame or no blame, we need HDMI more than we need DP on our graphics cards.

In fact, I do not own a single DP-capable display at the moment.
I prefer DP, but I understand that HDMI is the mass market's choice.
 
Joined
Jan 5, 2006
Messages
18,584 (2.68/day)
System Name AlderLake
Processor Intel i7 12700K P-Cores @ 5Ghz
Motherboard Gigabyte Z690 Aorus Master
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut Extreme + 5 case fans
Memory 32GB DDR5 Corsair Dominator Platinum RGB 6000MT/s CL36
Video Card(s) MSI RTX 2070 Super Gaming X Trio
Storage Samsung 980 Pro 1TB + 970 Evo 500GB + 850 Pro 512GB + 860 Evo 1TB x2
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p
Case Be quiet! Silent Base 600 - Window
Audio Device(s) Panasonic SA-PMX94 / Realtek onboard + B&O speaker system / Harman Kardon Go + Play / Logitech G533
Power Supply Seasonic Focus Plus Gold 750W
Mouse Logitech MX Anywhere 2 Laser wireless
Keyboard RAPOO E9270P Black 5GHz wireless
Software Windows 11
Benchmark Scores Cinebench R23 (Single Core) 1936 @ stock Cinebench R23 (Multi Core) 23006 @ stock
My motherboard has a DP output, which I'm not using at the moment; I only use the DP on the GPU.
 
Blame or no blame, we need HDMI more than we need DP on our graphics cards.

In fact, I do not own a single DP-capable display at the moment.
Going from DP to HDMI is trivial. Going from HDMI to DP is a nightmare. And HDMI has licensing attached per port. DP does not.

That's why GPUs usually do 3 DP and 1 HDMI. There is zero advantage for GPU makers in including more HDMI, unless it's a specialized product for digital signage or such.

Every single display I own uses DP. Most high-refresh-rate monitors use DP as the primary input. The only thing that uses HDMI is my TV.
 
I prefer DP, but I understand that HDMI is the mass market's choice.
I don't mind either as long as it works. Connector standards are a secondary choice to picture quality to me when buying a monitor (which I haven't done since 2017).
 
Joined
Feb 1, 2019
Messages
3,692 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
I wasn't aware of the per-port licensing; that explains the lopsidedness.
 

ir_cow

Staff member
Joined
Sep 4, 2008
Messages
4,614 (0.77/day)
Location
USA
Besides having an iGPU in a laptop and for troubleshooting, I wouldn't miss it. Some motherboards don't even have the ability to power it (usually overclocking ones), so it's a waste of silicon. The KF CPUs have it disabled, but those are rebranded CPUs with damaged iGPUs.

If Intel sold a future CPU without the igpu, I wouldn't be sad.
 
I always buy a CPU with an iGPU; I don't have working spare GPUs lying around...
 
I always buy a CPU with an iGPU; I don't have working spare GPUs lying around...
I have lots of spare (even gaming-grade) GPUs in the house, but I still want an iGPU in my CPU. :D
 
I don't mind either as long as it works. Connector standards are a secondary choice to picture quality to me when buying a monitor (which I haven't done since 2017).
I used to use DVI-D on my old monitor with my 1080 Ti (it had DP, but it didn't function properly; I assume buggy monitor firmware). With the new monitor I'm now using DP for the main PC.

My 2nd machine (Ryzen) uses a Dell 2209WA screen, which natively has only DVI-D and VGA. I bought a DVI-to-HDMI cable last year, so it now connects via HDMI; I'm currently using that screen connected to the HDMI iGPU port on the new board to test it.

So HDMI is nice to have, but luckily I have only ever needed one port.
 
Joined
Sep 26, 2022
Messages
2,194 (2.62/day)
Location
Brazil
System Name G-Station 2.0 "YGUAZU"
Processor AMD Ryzen 7 5700X3D
Motherboard Gigabyte X470 Aorus Gaming 7 WiFi
Cooling Freezemod: Pump, Reservoir, 360mm Radiator, Fittings / Bykski: Blocks / Barrow: Meters
Memory Asgard Bragi DDR4-3600CL14 2x16GB
Video Card(s) Sapphire PULSE RX 7900 XTX
Storage 240GB Samsung 840 Evo, 1TB Asgard AN2, 2TB Hiksemi FUTURE-LITE, 320GB+1TB 7200RPM HDD
Display(s) Samsung 34" Odyssey OLED G8
Case Lian Li Lancool 216
Audio Device(s) Astro A40 TR + MixAmp
Power Supply Cougar GEX X2 1000W
Mouse Razer Viper Ultimate
Keyboard Razer Huntsman Elite (Red)
Software Windows 11 Pro, Garuda Linux
I always buy a CPU with an iGPU; I don't have working spare GPUs lying around...
I'm iGPU-less because there's no other option for both my CPU and my MB, and there's always the fear that my GPU will fail and leave me with no video. When I do rebuild, it'll be with an iGPU, however basic.
 
Joined
Dec 25, 2020
Messages
7,226 (4.89/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
^This. It's easy to forget on a forum like this, but the vast majority of people opt for the i5 and lower segments. As has been pointed out by @GerKNG and @wNotyarD in this thread, the 13400F is still based on Golden Cove and doesn't get any of the performance benefits of the Raptor Cove P-cores and updated efficiency cores. A 14400F or 13490F with 15% more performance than the 13400F will benefit many more people than a 14900KS with 100 to 200 MHz higher turbo clocks.

Those extra 200 MHz would cost literally 150 watts of power, too. I'm quite sure that they will be deploying Raptor Cove at the low end, which will benefit the most, and bumping clocks on the i7 and lower segments. I was thinking i9-13900K clocks on an i7-13700K configuration (8P+8E) for the i7 segment refresh/i7-13750K. The i9 segment would be unchanged, or they'd retire the i9-13900K in favor of a 13950K with 100 MHz higher clocks than the 13900K but 100 MHz lower than the 13900KS. That's my personal take anyway; I could be surprised.

These KS chips really are pushed to the limit out of the box; it gets very unreasonable very fast past that, and I fail to see how they could extract more out of the Intel 7 Ultra/10ESF+ process.
 
I wasn't aware of the per-port licensing; that explains the lopsidedness.
I think it's mainly the compatibility.

If you sell a card with 4 outputs and a user wants to drive triple monitors, then a card with 3 DP and 1 HDMI can drive any combination of DP and HDMI monitors you want, since adapters are dirt cheap and just work. Hell, you can do a combo of DP, HDMI, and VGA. Or DVI, HDMI, and DP.

But if you have a card with 3 HDMI and 1 DP, then your triple-monitor config must have either 3 HDMI monitors or 2 HDMI and 1 DP monitor (or DVI, which can be converted from either). No other config will function, and you will get calls from customers who can't get their video card to output to their three DP monitors.

It's just easier to do as many DP as possible.
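The combination logic above can be sketched as a toy model. Everything here is illustrative: the adapter table (DP adapting down to HDMI/DVI/VGA, HDMI realistically only to DVI) and the greedy matcher are assumptions made for the example, not how any driver actually negotiates outputs.

```python
# Toy model of the port-compatibility argument above. The adapter table is
# an assumption for illustration: DP readily adapts down to HDMI/DVI/VGA,
# while HDMI-to-DP conversion is unreliable, so we treat HDMI outputs as
# only able to drive HDMI or DVI monitors.
CAN_DRIVE = {
    "DP": {"DP", "HDMI", "DVI", "VGA"},
    "HDMI": {"HDMI", "DVI"},
}

def can_drive_all(card_outputs, monitors):
    """Can this set of card outputs drive these monitor inputs, using adapters?"""
    outputs = list(card_outputs)
    # Serve the pickiest monitors first (fewest output types can drive them).
    for mon in sorted(monitors,
                      key=lambda m: sum(m in caps for caps in CAN_DRIVE.values())):
        # Spend the least flexible compatible output on this monitor.
        usable = sorted((o for o in outputs if mon in CAN_DRIVE[o]),
                        key=lambda o: len(CAN_DRIVE[o]))
        if not usable:
            return False
        outputs.remove(usable[0])
    return True

# 3x DP + 1x HDMI handles any mix of three monitors...
print(can_drive_all(["DP", "DP", "DP", "HDMI"], ["DP", "HDMI", "VGA"]))   # True
# ...but 3x HDMI + 1x DP cannot feed three DP-only monitors.
print(can_drive_all(["HDMI", "HDMI", "HDMI", "DP"], ["DP", "DP", "DP"]))  # False
```

The last line is exactly the support call described above: with the port mix flipped, a three-DP-monitor setup simply cannot be wired up.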
I always buy a CPU with an iGPU; I don't have working spare GPUs lying around...
Every desktop build I've done has been with a dGPU, dating back to my high school days. The only iGPU I've used has been in laptops.

Those extra 200 MHz would cost literally 150 watts of power, too. I'm quite sure that they will be deploying Raptor Cove at the low end, which will benefit the most, and bumping clocks on the i7 and lower segments. I was thinking i9-13900K clocks on an i7-13700K configuration (8P+8E) for the i7 segment refresh/i7-13750K. The i9 segment would be unchanged, or they'd retire the i9-13900K in favor of a 13950K with 100 MHz higher clocks than the 13900K but 100 MHz lower than the 13900KS. That's my personal take anyway; I could be surprised.

These KS chips really are pushed to the limit out of the box; it gets very unreasonable very fast past that, and I fail to see how they could extract more out of the Intel 7 Ultra/10ESF+ process.
We thought the same thing of 14 nm, yet Intel kept finding new ways to push another 100-200 MHz out of those things. Hell, we all thought the 5 GHz Coffee Lake was the final limit; then came Comet Lake with 5.3 GHz Thermal Velocity Boost.
 
We thought the same thing of 14 nm, yet Intel kept finding new ways to push another 100-200 MHz out of those things. Hell, we all thought the 5 GHz Coffee Lake was the final limit; then came Comet Lake with 5.3 GHz Thermal Velocity Boost.

Agreed, but at the same time, Raptor Lake is already their 10 nm process perfected (10 nm/CNL, 10SF/ICL, 10ESF/ADL "Intel 7", 10ESF+/RPL "Intel 7 Ultra"). It'd be just very, very hard at this point, IMO. It's at the same step in lithography as 14 nm was when CML and RKL came around.

LOL, close, and it made me laugh. Such an Intel thing. 5.5 GHz is ~320 W for me, 5.7 is 380, and 5.7/6.0 GHz (2-core) is 410 W. Can't cool that without going direct-die.

There's the v/f efficiency curve to account for, and I suspect that's why the KS binning is so tight (read: low availability). The 13900KS ships at the very edge of it; that became clear to me as I was trying to undervolt my chip.
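The "very unreasonable very fast" behavior falls out of the first-order CMOS dynamic power relation P ≈ C·f·V²: each extra frequency bin also needs more voltage, and power scales with the square of the voltage. A quick sketch with made-up voltage points (purely illustrative, not measured 13900KS data):

```python
# First-order CMOS dynamic power: P ~ C * f * V^2. The scale constant and
# the v/f points below are invented for illustration only.
def dynamic_power(freq_ghz, volts, c=10.0):
    """Relative dynamic power at a frequency/voltage point (c is arbitrary)."""
    return c * freq_ghz * volts ** 2

# Hypothetical v/f curve near the top of the range: small frequency steps
# demand disproportionately larger voltage steps.
vf_curve = [(5.5, 1.25), (5.7, 1.35), (5.9, 1.45)]
for f, v in vf_curve:
    print(f"{f:.1f} GHz @ {v:.2f} V -> {dynamic_power(f, v):.1f} (relative power)")

# In this toy model, a ~7% frequency bump (5.5 -> 5.9 GHz) costs ~44% more
# dynamic power, which is why the last bin is so expensive to cool.
```

The exact numbers are assumptions; the point is only that power grows far faster than frequency once voltage has to climb with it.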
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,160 (2.01/day)
Location
Swansea, Wales
System Name Silent/X1 Yoga
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader/1185 G7
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper V3 Pro 8 KHz Mercury White & Pulsar Supergrip tape, Razer Atlas, Razer Strider Chroma
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
Blame or no blame, we need HDMI more than we need DP on our graphics cards.

In fact, I do not own a single DP-capable display at the moment.
Seems like a you issue TBH.

Maybe HDMI is more common for TVs, but any semi-modern computer monitor will have a DisplayPort.

HDMI is a worse standard, as others have stated. DisplayPort has higher bandwidth, and you can use a DP-to-HDMI adapter if you insist on using HDMI-only monitors.

Agreed, but at the same time, Raptor Lake is already their 10 nm process perfected (10 nm/CNL, 10SF/ICL, 10ESF/ADL "Intel 7", 10ESF+/RPL "Intel 7 Ultra"). It'd be just very, very hard at this point, IMO. It's at the same step in lithography as 14 nm was when CML and RKL came around.



There's the v/f efficiency curve to account for, and I suspect that's why the KS binning is so tight (read: low availability). The 13900KS ships at the very edge of it; that became clear to me as I was trying to undervolt my chip.
We'll see.

Aside from the Rocket Lake sidegrade, Intel has been a solid choice for three generations now. I don't expect 14th gen to be different.
 
We thought the same thing of 14 nm, yet Intel kept finding new ways to push another 100-200 MHz out of those things. Hell, we all thought the 5 GHz Coffee Lake was the final limit; then came Comet Lake with 5.3 GHz Thermal Velocity Boost.
You might be right, but I think you're forgetting that, besides the 7700K, Intel didn't just increase clock speed; they also increased the number of cores. That's unlikely with any 14900K: the 13900K is already a big die for a mass-market CPU at 257 mm².
 
Seems like a you issue TBH.

Maybe HDMI is more common for TVs, but any semi-modern computer monitor will have a DisplayPort.

HDMI is a worse standard, as others have stated. DisplayPort has higher bandwidth, and you can use a DP-to-HDMI adapter if you insist on using HDMI-only monitors.
Maybe worse on a technical level, but way more common, from TVs to non-gaming, non-professional monitors. This isn't a me issue.
 
There's the v/f efficiency curve to account for, and I suspect that's why the KS binning is so tight (read: low availability). The 13900KS ships at the very edge of it; that became clear to me as I was trying to undervolt my chip.
I don't think I'll have the same success undervolting my 13700K as my 9900K; if there were headroom on the 13700K, I expect it would be a 13900K instead.

At the moment I'll just be testing it with a lowered PL1 to make sure basic functions work, but may later try some kind of undervolt. I should at least be able to get rid of any overvolting ASRock is doing.
 