
Intel 10nm "Ice Lake" to Combine "Sunny Cove" CPU Cores with Gen11 iGPU

btarunr

Intel's upcoming "Ice Lake" die could be the company's biggest processor innovation in a decade, combining the clean-slate "Sunny Cove" CPU core design with a new integrated graphics solution based on the company's Gen11 architecture. "Sunny Cove" introduces significant IPC (single-thread performance) gains over "Coffee Lake," adds new ISA instruction sets including AVX-512, and brings a brand-new uncore component, while the Gen11 graphics core is Intel's first iGPU to cross the 1 TFLOP/s mark. Intel demonstrated the ultra-low-power "Ice Lake-U" SoC platform at its 2018 Architecture Day briefing.

This "Ice Lake-U" chip, with its TDP in the ballpark of 15 W, was shown ripping through 7-zip and "Tekken 7." With 7-zip, Intel was trying to demonstrate vector-AES and SHA-NI improving archive encryption performance by 75 percent over "Skylake." The Gen11 iGPU was shown providing a smoother gameplay than Skylake with Gen9, although the company neither mentioned resolution, nor frame-rates. Anandtech wagers it's above 30 fps.



View at TechPowerUp Main Site
 
Did I miss the Gen10 IGPs?
That's some ugly 3D-printed heatsink.
 
'Rips through Tekken 7'... yeah the screen is ripping as well (check photo)

This looks so promising I might just fall asleep
 
'Rips through Tekken 7'... yeah the screen is ripping as well (check photo)

This looks so promising I might just fall asleep
In their defense, it is a 15 W part.
 
'Rips through Tekken 7'... yeah the screen is ripping as well (check photo)

This looks so promising I might just fall asleep

I agree... couldn't care less about a 15 W part, really. But what does the screen tear have to do with anything? The camera took the picture in the middle of a frame refresh. That doesn't tell you anything about performance whatsoever.
 
I agree... couldn't care less about a 15 W part, really. But what does the screen tear have to do with anything? The camera took the picture in the middle of a frame refresh. That doesn't tell you anything about performance whatsoever.

Nothing much, but it does show they have no performance budget there to enable Vsync without abysmal FPS.

Was mostly a funny observation given the article text :)
 
That cooler really looks like a PUBG crate *OCD kicks in*
 
It's sad that, in order to put a positive spin on a new CPU release, they have to resort to using 7-Zip. Yeah, I remember last using that three years ago, after downloading updated MoBo drivers.
 
Nothing much, but it does show they have no performance budget there to enable Vsync without abysmal FPS.

Was mostly a funny observation given the article text :)

I could be wrong, but don't LCD screens scan line by line? If I'm right about that, even turning Vsync on would not prevent the camera from catching a frame mid-refresh, unless the camera shutter were actually synced to the refresh rate of the monitor. I would assume the image would be darker (since the backlight would technically be off at that point if Vsync were on), but unless it were a particularly high-speed camera, you probably wouldn't see that in the picture.
 
It's sad that, in order to put a positive spin on a new CPU release, they have to resort to using 7-Zip. Yeah, I remember last using that three years ago, after downloading updated MoBo drivers.
Que?

Why isn't it valid, John?
 
I could be wrong, but don't LCD screens scan line by line? If I'm right about that, even turning Vsync on would not prevent the camera from catching a frame mid-refresh, unless the camera shutter were actually synced to the refresh rate of the monitor. I would assume the image would be darker (since the backlight would technically be off at that point if Vsync were on), but unless it were a particularly high-speed camera, you probably wouldn't see that in the picture.

No. With Vsync, a frame is held until the next frame can be shown fully; that is how it eliminates screen tear. Tearing happens because part of the old frame is still on the monitor when the new frame is sent to it. The camera shot its photo while such a tear happened.

What you are describing is how an old CRT worked. LCD backlights are never 'off' (unless they use pulse-width modulation); they hold frames. Some monitors do have black-frame insertion or strobing, however, which makes them a bit more similar to the old CRT experience and improves motion clarity. And if they use that, then yes, the image would be darker.
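To make that mechanism concrete, here's a toy simulation (my own sketch, not anyone's actual driver code) of a top-to-bottom scanout: with Vsync off, the buffer swap lands mid-scanout and the readout mixes two frames (a tear); with Vsync on, the swap is deferred to the next refresh and the old frame is shown whole:

```c
/* Toy model of screen tearing vs. Vsync on an 8-line "display". */
#include <stdio.h>
#include <string.h>

#define LINES 8

static char front[LINES];   /* what the "monitor" reads out, top to bottom */

static void scanout(int vsync, char old_frame, char new_frame, int swap_line)
{
    memset(front, old_frame, LINES);
    for (int line = 0; line < LINES; line++) {
        /* The GPU finishes the new frame while line `swap_line` is drawn. */
        if (line == swap_line) {
            if (!vsync) {
                /* Immediate swap: remaining lines come from the new frame. */
                memset(front + line, new_frame, LINES - line);
            }
            /* With Vsync the swap waits for the next vblank, so the rest of
               this refresh keeps showing the old frame intact. */
        }
        putchar(front[line]);
    }
    putchar('\n');
}

int main(void)
{
    printf("vsync off: ");
    scanout(0, 'A', 'B', 3);   /* prints AAABBBBB -> tear at line 3 */
    printf("vsync on : ");
    scanout(1, 'A', 'B', 3);   /* prints AAAAAAAA -> old frame shown whole */
    return 0;
}
```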
 
No. With Vsync, a frame is held until the next frame can be shown fully; that is how it eliminates screen tear. Tearing happens because part of the old frame is still on the monitor when the new frame is sent to it. The camera shot its photo while such a tear happened.

What you are describing is how an old CRT worked. LCD backlights are never 'off' (unless they use pulse-width modulation); they hold frames. Some monitors do have black-frame insertion or strobing, however, which makes them a bit more similar to the old CRT experience and improves motion clarity. And if they use that, then yes, the image would be darker.

Right, but Vsync only operates on image creation and the timing of when to send it to the screen. The torn image with Vsync off happens because the final image in the frame buffer that gets sent to the monitor is a composite of two separate frames. But the actual monitor still draws line by line, does it not? So it doesn't matter if the frame is fully drawn by the GPU; if the screen hasn't actually put the whole thing up yet when the camera takes the picture, it would still look torn. You're right that monitors without PWM (most of them, now that I think about it) wouldn't show a darker image. Of course, this is all pointless if I'm wrong about LCDs still drawing line by line. I can't seem to find a straight answer on that, but I've always been under the impression that they do.
 
On one slide Sunny Cove is listed as coming in 2019. Do you suppose Cannon Lake is cancelled and Intel will go straight to Ice Lake on the desktop, or will we get Skylake-based Cannon Lake on 10 nm first, with Sunny Cove used in other applications - select notebooks and NUCs at the end of 2019, and desktop a year later?
 
Right, but Vsync only operates on image creation and the timing of when to send it to the screen. The torn image with Vsync off happens because the final image in the frame buffer that gets sent to the monitor is a composite of two separate frames. But the actual monitor still draws line by line, does it not? So it doesn't matter if the frame is fully drawn by the GPU; if the screen hasn't actually put the whole thing up yet when the camera takes the picture, it would still look torn. You're right that monitors without PWM (most of them, now that I think about it) wouldn't show a darker image. Of course, this is all pointless if I'm wrong about LCDs still drawing line by line. I can't seem to find a straight answer on that, but I've always been under the impression that they do.

No, when you have a progressive-scan resolution (such as 1080p), it's a full image drawn at every frame refresh.

1080i (interlaced) alternately refreshes the even/odd horizontal rows, so you get 'half' the image drawn at each refresh and the other half in the next. But even here, all rows are drawn at the same time, not line by line going from top to bottom. That is CRT.

https://www.digitaltrends.com/home-theater/1080p-vs-1080i-whats-the-difference/
 
No, when you have a progressive-scan resolution (such as 1080p), it's a full image drawn at every frame refresh.

1080i (interlaced) alternately refreshes the even/odd horizontal rows, so you get 'half' the image drawn at each refresh and the other half in the next. But even here, all rows are drawn at the same time, not line by line going from top to bottom. That is CRT.

https://www.digitaltrends.com/home-theater/1080p-vs-1080i-whats-the-difference/

That link doesn't address what I'm saying, though. Yes, the full image is drawn every frame refresh in 1080p. I get that. But break that single refresh down, and what do you get? Does every single pixel light up at the same moment, or does it scan like a CRT does?

I wish I could find a definitive answer in a paper or something, but I haven't seen one yet. This high-speed video of an LCD refresh, however, implies that it does indeed scan.



EDIT:
And here's a high-speed video of an OLED, a fairly recent technology (for an old/new comparison, since the LCD in the video above is a 2007 model and I didn't want to include only old LCD tech).
Seems scanning does indeed happen.



So, point being this: A sufficiently fast camera could catch an LCD screen with a tear in it, regardless of whether vsync were on or not. Especially if that's an OLED screen.
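For what it's worth, some back-of-envelope arithmetic (my own numbers, purely illustrative) backs this up: one 60 Hz scanout takes roughly 16.7 ms, so any exposure shorter than that samples only part of a refresh and can freeze a tear mid-draw:

```c
/* Back-of-envelope sketch: how much of one top-to-bottom scanout a
   typical camera exposure actually sees at a 60 Hz refresh rate. */
#include <stdio.h>

int main(void)
{
    double refresh_hz  = 60.0;
    double frame_ms    = 1000.0 / refresh_hz;   /* ~16.7 ms per refresh */
    double shutter_ms  = 1000.0 / 250.0;        /* a common 1/250 s exposure */

    printf("one refresh     : %.1f ms\n", frame_ms);
    printf("1/250 s shutter : %.1f ms -> captures only %.0f%% of a scanout\n",
           shutter_ms, 100.0 * shutter_ms / frame_ms);
    return 0;
}
```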
 