Wednesday, December 12th 2018

Intel 10nm "Ice Lake" to Combine "Sunny Cove" CPU Cores with Gen11 iGPU

Intel's upcoming "Ice Lake" die could be the company's biggest processor innovation in a decade, combining new clean-slate design "Sunny Cove" CPU cores with a new integrated graphics solution based on the company's Gen11 architecture. "Sunny Cove" brings significant IPC (single-thread performance) gains over "Coffee Lake," new instruction-set extensions including AVX-512, and a brand-new uncore, while the Gen11 graphics core is Intel's first iGPU to reach the 1 TFLOP/s mark. Intel demonstrated the ultra-low-power "Ice Lake-U" SoC platform at its 2018 Architecture Day briefing.
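
As a quick sanity check on that 1 TFLOP/s figure: assuming the 64-EU Gen11 GT2 configuration Intel disclosed, and 16 FP32 FLOPs per EU per clock (two SIMD-4 FMA pipes), 64 × 16 × ~1 GHz works out to roughly 1,024 GFLOP/s (back-of-the-envelope math on our part, not an Intel-quoted calculation).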

This "Ice Lake-U" chip, with its TDP in the ballpark of 15 W, was shown ripping through 7-zip and "Tekken 7." With 7-zip, Intel was trying to demonstrate vector-AES and SHA-NI improving archive encryption performance by 75 percent over "Skylake." The Gen11 iGPU was shown providing a smoother gameplay than Skylake with Gen9, although the company neither mentioned resolution, nor frame-rates. Anandtech wagers it's above 30 fps.
Source: AnandTech

19 Comments on Intel 10nm "Ice Lake" to Combine "Sunny Cove" CPU Cores with Gen11 iGPU

#1
GoldenX
Did I miss the Gen10 IGPs?
That's some ugly 3D printed heatsink.
Posted on Reply
#2
ShurikN
GoldenX: Did I miss the Gen10 IGPs?
I believe they didn't have it. Had 9, 9.5 and then this.
Posted on Reply
#3
Vayra86
'Rips through Tekken 7'... yeah the screen is ripping as well (check photo)

This looks so promising I might just fall asleep
Posted on Reply
#4
ShurikN
Vayra86: 'Rips through Tekken 7'... yeah the screen is ripping as well (check photo)

This looks so promising I might just fall asleep
In their defense it is a 15W part.
Posted on Reply
#5
Papahyooie
Vayra86: 'Rips through Tekken 7'... yeah the screen is ripping as well (check photo)

This looks so promising I might just fall asleep
I agree... couldn't care less about a 15w part really. But what does the screen tear have to do with anything? The camera took the picture in the middle of a frame refresh. That doesn't tell you anything about performance whatsoever.
Posted on Reply
#6
Vayra86
Papahyooie: I agree... couldn't care less about a 15w part really. But what does the screen tear have to do with anything? The camera took the picture in the middle of a frame refresh. That doesn't tell you anything about performance whatsoever.
Nothing much, but it does show they have no budget there to deploy Vsync without abysmal FPS.

Was mostly a funny observation given the article text :)
Posted on Reply
#7
Berfs1
That cooler really looks like a PUBG crate *OCD kicks in*
Posted on Reply
#8
John Naylor
It's sad that, in order to put a positive spin on a new CPU release, they have to resort to using 7-Zip. Yeah, I remember last using that 3 years ago after downloading updated MoBo drivers.
Posted on Reply
#9
Papahyooie
Vayra86: Nothing much, but it does show they have no budget there to deploy Vsync without abysmal FPS.

Was mostly a funny observation given the article text :)
I could be wrong, but don't LCD screens scan line by line? If I'm right about that, even turning Vsync on would not prevent the camera from catching a frame mid-refresh, unless the camera shutter was actually synced to the refresh rate of the monitor. I would assume the image would be darker (since the backlight would technically be off at that point if vsync were on) but unless it were a particularly high speed camera, you probably wouldn't see that in the picture.
Posted on Reply
#10
R0H1T
GoldenX: Did I miss the Gen10 IGPs?
That's some ugly 3D printed heatsink.
Gen10 would probably be CNL U & they all have defective IGP atm.
Posted on Reply
#11
EarthDog
John Naylor: It's sad that, in order to put a positive spin on a new CPU release, they have to resort to using 7-Zip. Yeah, I remember last using that 3 years ago after downloading updated MoBo drivers.
Que?

Why isn't it valid, John?
Posted on Reply
#12
Patriot
GoldenX: Did I miss the Gen10 IGPs?
That's some ugly 3D printed heatsink.
Fan cover, but yeah, heh kinda cool.
Posted on Reply
#13
Vayra86
Papahyooie: I could be wrong, but don't LCD screens scan line by line? If I'm right about that, even turning Vsync on would not prevent the camera from catching a frame mid-refresh, unless the camera shutter was actually synced to the refresh rate of the monitor. I would assume the image would be darker (since the backlight would technically be off at that point if vsync were on) but unless it were a particularly high speed camera, you probably wouldn't see that in the picture.
No. With Vsync a frame is held until the next frame can be shown fully, that is how it eliminates screen tear. Tearing on a monitor happens because part of the old frame is still there when the new frame is sent to the monitor. The camera shot its photo while such a tear happened.

What you are describing is how an old CRT worked. LCD backlights are never 'off' (unless they use pulse width modulation), they hold frames. Some monitors do have black frame insertion or strobe however, which makes them a bit more similar to the old CRT experience and improves motion clarity. And if they use that, then yes, the image would be darker.
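
For reference, "Vsync on" in code is just the application asking the driver to tie buffer swaps to the display refresh; with GLFW/OpenGL it's a one-line swap interval. Rough sketch only (GLFW is my assumption here, obviously not what the Intel demo ran):

```c
/* Rough sketch of what "Vsync on" means in code: the app asks the driver to
 * sync buffer swaps to the display refresh. Assumes GLFW + OpenGL (my pick,
 * not what the demo used). Build: gcc vsync.c -lglfw -lGL */
#include <GLFW/glfw3.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    GLFWwindow *win = glfwCreateWindow(1280, 720, "vsync demo", NULL, NULL);
    if (!win) {
        glfwTerminate();
        return 1;
    }
    glfwMakeContextCurrent(win);
    glfwSwapInterval(1);             /* 1 = wait for vblank (Vsync on), 0 = off */

    while (!glfwWindowShouldClose(win)) {
        /* ...render the frame here... */
        glfwSwapBuffers(win);        /* with interval 1 this waits for the next
                                        refresh, so a half-finished frame is
                                        never handed to the panel */
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}
```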
Posted on Reply
#14
Papahyooie
Vayra86: No. With Vsync a frame is held until the next frame can be shown fully, that is how it eliminates screen tear. Tearing on a monitor happens because part of the old frame is still there when the new frame is sent to the monitor. The camera shot its photo while such a tear happened.

What you are describing is how an old CRT worked. LCD backlights are never 'off' (unless they use pulse width modulation), they hold frames. Some monitors do have black frame insertion or strobe however, which makes them a bit more similar to the old CRT experience and improves motion clarity. And if they use that, then yes, the image would be darker.
Right, but Vsync only operates on the image creation and the timing of when to draw it to the screen. The torn image with Vsync off is because the final image in the frame buffer that gets sent to the monitor is a composite of two separate frames. But the actual monitor still draws line by line, does it not? So it doesn't matter if the frame is fully drawn by the GPU; if the panel hasn't actually put the whole thing on the screen yet when the camera takes the picture, it would still be torn. That's true about monitors without PWM (most of them, now that I think about it) not having darker screens, so you're right there. Of course this is all pointless if I'm wrong about LCDs still drawing line by line. I can't seem to find a straight answer on that, but I've always been under the impression that they do.
Posted on Reply
#15
Bwaze
On one slide Sunny Cove is listed as coming in 2019. Do you suppose Cannon Lake is cancelled and Intel will go straight to Ice Lake on desktop, or will we get Skylake-based Cannon Lake on 10 nm first, with Sunny Cove used in other applications - select notebooks and NUCs at the end of 2019, and desktop a year later?
Posted on Reply
#16
Vayra86
Papahyooie: Right, but Vsync only operates on the image creation and the timing of when to draw it to the screen. The torn image with Vsync off is because the final image in the frame buffer that gets sent to the monitor is a composite of two separate frames. But the actual monitor still draws line by line, does it not? So it doesn't matter if the frame is fully drawn by the GPU; if the panel hasn't actually put the whole thing on the screen yet when the camera takes the picture, it would still be torn. That's true about monitors without PWM (most of them, now that I think about it) not having darker screens, so you're right there. Of course this is all pointless if I'm wrong about LCDs still drawing line by line. I can't seem to find a straight answer on that, but I've always been under the impression that they do.
No, when you have a progressive scan resolution (such as 1080p), it's a full image drawn at every frame refresh.

1080i (interlaced) would alternately illuminate even/odd horizontal rows, so you get 'half' the image drawn at each refresh and the other half in the next. But even here, all rows are drawn at the same time, not line by line going from top to bottom. That is CRT.

www.digitaltrends.com/home-theater/1080p-vs-1080i-whats-the-difference/
Posted on Reply
#17
cyneater
So the last time I used AVX and AES is ....

hmm
Posted on Reply
#18
Papahyooie
Vayra86: No, when you have a progressive scan resolution (such as 1080p), it's a full image drawn at every frame refresh.

1080i (interlaced) would alternately illuminate even/odd horizontal rows, so you get 'half' the image drawn at each refresh and the other half in the next. But even here, all rows are drawn at the same time, not line by line going from top to bottom. That is CRT.

www.digitaltrends.com/home-theater/1080p-vs-1080i-whats-the-difference/
That link doesn't say anything about what I'm saying, though. Yes, the full image is drawn every frame refresh in 1080p. I get that. But break that single refresh down, and what do you get? Does every single pixel light up at the same moment? Or does it scan like a CRT does?

I wish I could find some definitive answer in a paper or something, but I haven't seen one yet. But this high speed video of an LCD refresh implies that it does indeed scan.

EDIT:
And here's a high-speed video of an OLED, a fairly recent technology (to differentiate between old and new tech, since the one in the video above is a 2007 model and I didn't want to only include old LCD tech).
Seems scanning does indeed happen.

So, point being this: A sufficiently fast camera could catch an LCD screen with a tear in it, regardless of whether vsync were on or not. Especially if that's an OLED screen.
Posted on Reply
#19
EarthDog
cyneater: So the last time I used AVX and AES is ....

hmm
More often than you know? Set an offset and keep an eye on CPUz when browsing and gaming, etc... you will be surprised what trips that.
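
Anything compiled with an AVX code path (browser media or crypto routines are a common culprit) ends up running 256-bit ops like these, and that alone is enough to trip the offset. Toy sketch, just plain intrinsics on my end:

```c
/* Toy example of the kind of 256-bit AVX work that trips an AVX offset.
 * Nothing Intel-demo-specific here, just plain intrinsics.
 * Build: gcc -mavx -O2 avx_add.c */
#include <immintrin.h>
#include <stdio.h>

int main(void)
{
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    float c[8];

    __m256 va = _mm256_loadu_ps(a);     /* load 8 floats into one 256-bit register */
    __m256 vb = _mm256_loadu_ps(b);
    __m256 vc = _mm256_add_ps(va, vb);  /* one instruction adds all 8 lanes */
    _mm256_storeu_ps(c, vc);

    printf("%.1f\n", c[0]);             /* prints 9.0 */
    return 0;
}
```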
Posted on Reply