Wednesday, December 12th 2018
Intel 10nm "Ice Lake" to Combine "Sunny Cove" CPU Cores with Gen11 iGPU
Intel's upcoming "Ice Lake" die could be the company's biggest processor innovation in a decade, combining the new clean-slate "Sunny Cove" CPU cores with a new integrated graphics solution based on the company's Gen11 architecture. "Sunny Cove" brings significant IPC (single-thread performance) gains over "Coffee Lake," new ISA extensions including AVX-512, and a brand-new uncore, while the Gen11 graphics core is Intel's first iGPU to reach the 1 TFLOP/s mark. Intel demonstrated the ultra-low-power "Ice Lake-U" SoC platform at its 2018 Architecture Day briefing.
This "Ice Lake-U" chip, with its TDP in the ballpark of 15 W, was shown ripping through 7-zip and "Tekken 7." With 7-zip, Intel was trying to demonstrate vector-AES and SHA-NI improving archive encryption performance by 75 percent over "Skylake." The Gen11 iGPU was shown providing a smoother gameplay than Skylake with Gen9, although the company neither mentioned resolution, nor frame-rates. Anandtech wagers it's above 30 fps.
Source: AnandTech
This "Ice Lake-U" chip, with its TDP in the ballpark of 15 W, was shown ripping through 7-zip and "Tekken 7." With 7-zip, Intel was trying to demonstrate vector-AES and SHA-NI improving archive encryption performance by 75 percent over "Skylake." The Gen11 iGPU was shown providing a smoother gameplay than Skylake with Gen9, although the company neither mentioned resolution, nor frame-rates. Anandtech wagers it's above 30 fps.
19 Comments on Intel 10nm "Ice Lake" to Combine "Sunny Cove" CPU Cores with Gen11 iGPU
That's some ugly 3D-printed heatsink.
This looks so promising I might just fall asleep
Was mostly a funny observation given the article text :)
Why isn't it valid, John?
What you are describing is how an old CRT worked. LCD backlights are never 'off' (unless they use pulse-width modulation); they hold frames. Some monitors do have black frame insertion or strobing, however, which makes them a bit more similar to the old CRT experience and improves motion clarity. And if they use that, then yes, the image would be darker.
1080i (interlaced) alternately illuminates the even/odd horizontal rows, so you get 'half' the image drawn on each refresh and the other half on the next. But even here, all rows of a field are drawn at the same time, not line by line from top to bottom. That is CRT. (The toy sketch after the link illustrates the field order.)
www.digitaltrends.com/home-theater/1080p-vs-1080i-whats-the-difference/
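Here's a minimal C sketch of that even/odd field order (my own toy model, not from the article or the link; the six-row "frame" is an arbitrary choice). Refresh 0 fills the even rows, refresh 1 the odd ones, so a complete image only exists after both fields:

```c
/* Toy interlacing model: two refreshes, each drawing half the rows. */
#include <stdio.h>

#define ROWS 6

int main(void)
{
    char screen[ROWS] = {'.', '.', '.', '.', '.', '.'};

    for (int refresh = 0; refresh < 2; ++refresh) {
        /* Field 0 starts at row 0 (even rows), field 1 at row 1 (odd rows). */
        for (int row = refresh; row < ROWS; row += 2)
            screen[row] = '0' + refresh;   /* mark which field drew the row */

        printf("after refresh %d: ", refresh);
        for (int row = 0; row < ROWS; ++row)
            putchar(screen[row]);
        putchar('\n');   /* prints "0.0.0." then "010101" */
    }
    return 0;
}
```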
hmm
I wish I could find a definitive answer in a paper or something, but I haven't seen one yet. This high-speed video of an LCD refresh, though, implies that it does indeed scan.
EDIT:
And here's a high-speed video of OLED, a fairly recent technology (to differentiate between old and new tech, since the panel in the video above is a 2007 model and I didn't want to include only old LCD tech).
Seems scanning does indeed happen.
So, the point is this: a sufficiently fast camera could catch an LCD screen with a tear in it, regardless of whether vsync is on. Especially if it's an OLED screen.
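A quick toy model of that claim (my own sketch; the row count and swap point are made up) shows how an unsynchronized buffer swap mid-scanout splits the captured image between two frames:

```c
/* Toy tearing model: scanout latches rows top to bottom, and a buffer
 * swap arriving mid-scan leaves the top rows showing one frame and the
 * bottom rows the next -- the tear a fast camera would catch. */
#include <stdio.h>

#define ROWS 8

int main(void)
{
    char frame_a = 'A', frame_b = 'B';   /* two rendered frames */
    char panel[ROWS];
    const char *source = &frame_a;

    for (int row = 0; row < ROWS; ++row) {
        if (row == 3)            /* unsynchronized swap arrives mid-scanout */
            source = &frame_b;
        panel[row] = *source;    /* each row latches the current frame */
    }

    for (int row = 0; row < ROWS; ++row)
        printf("row %d: frame %c\n", row, panel[row]);  /* tear at row 3 */
    return 0;
}
```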