Monday, April 3rd 2017

Rumored Intel Kaby Lake-G Series: Modular, Multi-Die, HBM 2, AMD Graphics IP?

Rumors have been making the rounds about an as-yet unannounced product from Intel: a Kaby Lake-G series that would mark Intel's return to a multi-chip-module design in a single package. The company has played with such a design before with its Clarkdale family of processors, which married a 32 nm CPU with a 45 nm GPU and memory controller in a single package. Kaby Lake-G would reportedly do away with Clarkdale's simple, low-data-rate interconnect between the two parts, instead riding on the shoulders of Intel's EMIB (Embedded Multi-die Interconnect Bridge), which the company bills as a "more elegant interconnect for a more civilized age."

Instead of using the large silicon interposer typically found in other 2.5D approaches (as AMD did when marrying its Fiji dies with HBM memory), EMIB uses a very small bridge die with multiple routing layers, which provides high-density data paths between the interconnected, heterogeneous dies at a lower cost. This avoids the costly TSVs (Through-Silicon Vias) that dot the interposer approach.
For now, rumors peg the Kaby Lake-G parts as special BGA processors based on Kaby Lake, with an additional discrete GPU on the package. The TDPs of these processors (65 W and 100 W) are well above the Kaby Lake-H's known 45 W, which begs the question: what exactly is under the hood? A discrete-level AMD graphics chip integrated on the package - with EMIB's routing layers handling the data exchange between GPU and processor, in line with the modular approach to chip design for which Intel developed EMIB - could account for the extra TDP. This is also where HBM 2 memory integration would naturally come in: a way to keep a considerable amount of high-speed memory inside the package, accessible by the silicon slices that need it. Nothing in the leaked information seems to point towards this HBM 2 integration, however.
Also lending credence to the "AMD Radeon IP integration" story (besides TDP) is that the two chips in the Kaby Lake-G series will reportedly feature a package size of 58.5 x 31 mm - bigger than a desktop Kaby Lake-S (37.5 x 37.5 mm) and the Kaby Lake-H series chips (42 x 28 mm). The extra space would accommodate the increased footprint of the GPU die - though for now, leaked information points, again, only to Intel's own GT2 graphics solution, even as BenchLife puts much stock in the AMD side of the equation.
The heterogeneous, modular approach to CPU development would benefit Intel in several ways: it could integrate external graphics solutions produced in other factories entirely and then fitted onto the package; it could reserve space on its 10 nm dies for actual cores, increasing yields from the 10 nm process; and it could recycle older processes with new logic inside the CPU package, allowing the company to better distribute production load across different nodes and extract more value from its not-so-state-of-the-art processes.

If Intel advances with this modular approach, we stand to see some really interesting designs, with multiple manufacturing processes working in tandem inside a single package, giving Intel more flexibility in developing and implementing its fabrication processes. What do you think about this take on CPU development?
Sources: BenchLife, Computerbase.de

32 Comments on Rumored Intel Kaby Lake-G Series: Modular, Multi-Die, HBM 2, AMD Graphics IP?

#26
renz496
Imsochobo"deal", more as a lawsuit that resulted in forced license of IP which Intel didn't want.
Intel is not forced into it; in fact, it is Intel that needs it. Intel signed a cross-licensing deal with NVIDIA back in 2004 so they could make their own GPUs without infringing NVIDIA patents. The deal also allowed NVIDIA to make chipsets for Intel CPUs, and it was supposed to expire in 2011. But in 2008/2009 things started heating up between Intel and NVIDIA when Intel did not allow NVIDIA to make chipsets for Intel's new CPUs. That was the actual problem back then, but some people made the mistake of thinking it was only about NVIDIA graphics IP.
#28
mtcn77
qubitI can just see this technology making for killer consoles and that's a good thing.
I have deep admiration for the Intel engineers who put out Broadwell's L4 cache. Currently, I don't see another such disruptive technology. Expensive, true, but such hit rates are unheard of. If they stratify it into Vega's HBCC complex, which we know Vega is capable of heterogeneously controlling, it would be one quick Larrabee reboot.
#29
john_
r9I think Intel is still married to Nvidia GPUs.
I don't think Intel is gonna want to feed AMD, with Ryzen attacking Intel on all fronts by the end of the year.
$264 million annually is what Intel is paying NVIDIA for GPU licensing.
Putting that extra money into AMD's pocket would definitely hurt Intel in the long run.
A strong AMD can take 10-20-30% of the x86 market from Intel, while Intel can cover those losses in the autonomous-automobile business. Both AMD and Intel can keep x86 going strong for many, many years, keeping the processor the most important factor in a system.

A strong NVIDIA, on the other hand, could have a devastating effect on Intel's future, promoting the ARM platform everywhere alongside the rest of the companies creating ARM processors and eventually - in a future where AMD is irrelevant in the GPU business - locking its GPUs into the ARM platform, making that the de facto gaming platform. By also promoting GPUs as the most important factor in a system, NVIDIA would leave Intel's future far from guaranteed. Not to mention that NVIDIA is a strong competitor in the autonomous market.
#30
qubit
Overclocked quantum bit
mtcn77I have deep admiration for the Intel engineers who put out Broadwell's L4 cache. Currently, I don't see another such disruptive technology. Expensive, true, but such hit rates are unheard of. If they stratify it into Vega's HBCC complex, which we know Vega is capable of heterogeneously controlling, it would be one quick Larrabee reboot.
Indeed, I've never doubted that Intel could put out killer graphics cards if they really wanted to compete with AMD or even NVIDIA. They're big enough to buy the best talent if they need to.
#31
Frick
Fishfaced Nincompoop
NokironWell, they have a thing going with Mobileye already.

www-ssl.intel.com/content/www/us/en/automotive/autonomous-vehicles.html
Aye, in the long run Intel will definitely battle with NVIDIA. Or rather, they already do - also in IoT and deep learning. NVIDIA has several legs to stand on and most of them compete with Intel.

Anyway, Intel should be pretty capable of making a decent iGPU. Iris Pro has shown us that.
#32
r9
qubitIndeed, I've never doubted that Intel could put out killer graphics cards if they really wanted to compete with AMD or even NVIDIA. They're big enough to buy the best talent if they need to.
Yeah, Intel can build a powerful GPU, but not without a zillion patent infringements.
It's hard to build a building when somebody else possesses all the land to build on.
And they actually tried building a GPU based on reduced-instruction Pentium cores.
They stuck like 70-80 cores together, did their benchmarks, were waaay behind both AMD and NVIDIA, and decided to call it a day.