Wednesday, May 26th 2010
Intel Shelves Larrabee as a GPU Architecture
Intel has once again shelved plans to return to the discrete graphics market with its much talked-about GPU, codenamed Larrabee. In a recent post on the company blog, Bill Kircos, Director of Product and Technology Media Relations, detailed that the company's priorities at the moment lie with releasing industry-leading processors that have the graphics-processing horsepower for everyday computing. Intel HD graphics will only get better with the 2011 series of Core processors based on the Sandy Bridge architecture, where the iGPU core will be completely integrated with the processing complex.
An unexpected yield of the Larrabee program seems to be that Intel has now perfected many-core processors. Since Larrabee is essentially a multi-core processor with over 32 IA x86 cores handling graphics workloads, it could well give Intel a CPU that is a godsend for heavy-duty HPC applications. Intel has already demonstrated a derivative of this architecture, and is looking to induct it into its Xeon series of enterprise processors.
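The appeal of such a many-core x86 part for HPC is that ordinary threaded C code can drive it, with no separate GPU programming model. Below is a minimal sketch of the kind of data-parallel kernel such a chip targets; it assumes nothing Larrabee-specific, just standard C and OpenMP.

#include <stdio.h>
#include <stdlib.h>
#include <omp.h>

/* saxpy: y = a*x + y, the canonical data-parallel HPC kernel.
   OpenMP splits the outer loop across cores; the compiler is free
   to vectorize each core's share onto its SIMD unit. */
static void saxpy(float a, const float *x, float *y, size_t n)
{
    #pragma omp parallel for
    for (size_t i = 0; i < n; i++)
        y[i] = a * x[i] + y[i];
}

int main(void)
{
    size_t n = 1 << 24;                 /* 16M elements */
    float *x = malloc(n * sizeof *x);
    float *y = malloc(n * sizeof *y);
    for (size_t i = 0; i < n; i++) { x[i] = 1.0f; y[i] = 2.0f; }

    saxpy(3.0f, x, y, n);
    printf("y[0] = %f (expect 5.0)\n", y[0]);

    free(x);
    free(y);
    return 0;
}

On a hypothetical 32-core part, the loop spreads across all cores while each core's wide vector unit chews through the arithmetic, and the same source recompiles unchanged on any x86 CPU.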
Source: Intel Blogs
16 Comments on Intel Shelves Larrabee as a GPU Architecture
Disappointing, but the project was always a major leap of the imagination, and that leap appears to be too great for Intel to want to brave.
EDIT: In fact I recall a news thread about it here.
but yeah, epic fail this time around i have to say.
Let's be perfectly honest, Larrabee as a viable GPU product was always a long shot. GPUs are fast because they are fundamentally simple. CPUs (the basis Larrabee comes from) are fundamentally complex. The only way for complex to compete with simple is through much higher transistor counts. If Fermi was hot, a competitive Larrabee product would be positively melting. It's a shame Intel couldn't work their magic to make it happen but, ya can't beat physics at physics' own game. Intel hoped they could simplify x86 enough to make a viable GPU and, as it turns out, it just can't happen.
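To put rough numbers on that trade-off, here's a back-of-envelope peak-throughput comparison. The Larrabee figures are assumptions based on rumored specs (32 cores, 16-lane 512-bit vector units, fused multiply-add, ~2 GHz); the GTX 480 figures are NVIDIA's published specs.

#include <stdio.h>

/* Back-of-envelope peak single-precision throughput:
   peak = cores * SIMD lanes per core * FLOPs per lane per cycle * clock (GHz) */
static double peak_gflops(int cores, int lanes, int flops_per_lane, double ghz)
{
    return cores * lanes * flops_per_lane * ghz;
}

int main(void)
{
    /* Larrabee (assumed): 32 x86 cores, 16-lane vector unit,
       FMA = 2 FLOPs/lane/cycle, assumed ~2 GHz clock. */
    printf("Larrabee (assumed): %.0f GFLOPS\n", peak_gflops(32, 16, 2, 2.0));

    /* GeForce GTX 480 (Fermi): 480 CUDA cores, scalar lanes,
       FMA = 2 FLOPs/cycle at the 1.401 GHz shader clock. */
    printf("GTX 480:            %.0f GFLOPS\n", peak_gflops(480, 1, 2, 1.401));

    return 0;
}

Raw peaks come out in the same ballpark (~2048 vs ~1345 GFLOPS), but the GPU gets there with far simpler cores, which is exactly the complexity gap described above.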
but i hope that IGPU is a typo.
nuff with the I srsly.
I.e. the latest CPUs are Nehalem+, derived from Core, which came from the Pentium M, which was a modified PIII, and that was an improved Pentium Pro. The chips perform excellently, so there's no reason not to do that, but possibly a whole new design could be even better. Of course they did try a new architecture with P4 NetBurst... which ended up losing ground and got canned, whilst AMD came up with x86-64 and an IMC on the CPU die.
64-bit Itanium was also a mild disaster, so Intel now pushes x86 Xeons at business.
Two goes at powerful 3D graphics have failed (Larrabee, and the i740 back in 1998).
Use of RDRAM ...needs no elaboration!
All I can think of for success and innovation are the SSDs and the sales/marketing teams! I was looking forward to Larrabee as well.
There's simply no need to WASTE billions on a doomed project when you can simply buy a company that has the known expertise (you have plenty of cash, plus it's a great investment) and, more importantly, a real working and ready-to-integrate graphics core, so you can implement it already and be done with this expensive Larrabee joke. :)
Course the other part of it was that they tried to reinvent the wheel on how the GPU functions. Intel has been trying to push CPUs > GPUs for years but it just isn't happening. So they came out with Larrabee instead, a GPU made out of CPUs. Intel may be the monster of the industry, but they have no standing power to change things when their current graphics tech just sucks.
It's like Intel as Stuart on MADtv suddenly sporadically flailing limbs saying "hey look what I can do!" Then ATI/NV go back to duking it out as if Intel never existed, seeing him as obviously not a threat.
They've really caught up, and it's not like 80% of the people in this world run games on their computers. If they do, they're just simple Flash games or casual games. Most mainstream users play on their Xboxes, Wiis, or PS3s.