Monday, August 25th 2008
NVIDIA Quashes Larrabee and Fusion Hype
NVIDIA took potshots at Intel's upcoming Larrabee graphics processor and AMD's "GPU on CPU" device, the Fusion. Speaking to the media ahead of the opening of the annual NVISION expo on Monday, Andy Keane, general manager of NVIDIA's GPU computing group, said that there is an "incredible amount about Larrabee that's undefined," commenting on what little is publicly known about Intel's GPU.
"You can't just say 'it's x86 so it's going to solve the massively parallel computing problem.'" said Keane. "Look at the PC," he continued. "With an OS they don't control, and applications coming from everywhere... to say arbitrarily that everything's going to scale to 32 cores seems to me to be a bit of a stretch. " he added.
John Mottram, the chief architect of NVIDIA's G200 graphics processor, raised further doubts about Larrabee's real-world performance, brushing aside Intel's announcements as a PR stunt. He is quoted as saying:
"They've put out a certain amount of technical disclosure in the past five weeks," he noted, "but although they make Larrabee sound like it's a fundamentally better approach, it isn't. They don't tell you the assumptions they made. They talk about scaling, but they disregard memory bandwidth. They make it sound good, but we say, you neglected half a dozen things."
"Every GPU we make, we always consider this type of design, we do a reasoned analysis, and we always conclude no. That's why we haven't built that type of machine."
Essentially, NVIDIA feels Intel is creating too much hype over something that doesn't look capable of taking on the established players, since it would need some mysterious extra machinery beyond its 32 x86 cores. Peter Glaskowsky, a CPU architect and blogger, says that Larrabee in 2010 will only offer the level of performance a GPU from NVIDIA or ATI had back in 2006.
Mottram didn't spare AMD either. In the line of fire was AMD's upcoming Fusion processor, a CPU with an embedded graphics processor.
"Joining both components on the same die doesn't buy you that much," he commented. "It's not like there's a real bottleneck there. And every square millimeter you add to the die is a very expensive millimeter. It's an incremental expense, not a linear function. It's cheaper to separate them."
Andy Keane doubted whether buyers would even care about Fusion. "The class of buyer who buys that type of graphics isn't really buying graphics," he argued. "They don't care about it."
"You're working out what you can really afford to put on a CPU, and you're selling it to a customer who doesn't care. The economics don't make sense." he added.
Source: PC Pro
"You can't just say 'it's x86 so it's going to solve the massively parallel computing problem.'" said Keane. "Look at the PC," he continued. "With an OS they don't control, and applications coming from everywhere... to say arbitrarily that everything's going to scale to 32 cores seems to me to be a bit of a stretch. " he added.
John Mottram, the chief architect for the NVIDIA G200 graphics processor raised further doubts about Larrabee's real-world performance, brushing aside Intel's announcements as a PR stunt. He is quoted saying:
"They've put out a certain amount of technical disclosure in the past five weeks," he noted, "but although they make Larrabee sound like it's a fundamentally better approach, it isn't. They don't tell you the assumptions they made. They talk about scaling, but they disregard memory bandwidth. They make it sound good, but we say, you neglected half a dozen things."
"Every GPU we make, we always consider this type of design, we do a reasoned analysis, and we always conclude no. That's why we haven't built that type of machine."
Essentially NVIDIA feels Intel is creating too much of a hype over something that doesn't look like it can take on established players, as it would need some mysterious powerful machinery apart from 32 x86 cores. Peter Glaskowsky, a CPU Architect and blogger says that Larrabee in 2010 will have the same level of performance as a GPU from NVIDIA or ATI had back in 2006.
Mottram didn't spare AMD either. In the line of fire was AMD's upcoming Fusion processor, a CPU with a graphics processor embedded.
"Joining both components on the same die doesn't buy you that much," he commented. "It's not like there's a real bottleneck there. And every square millimeter you add to the die is a very expensive millimeter. It's an incremental expense, not a linear function. It's cheaper to separate them."
Andy Keane doubted whether buyers would even care about Fusion. "The class of buyer who buys that type of graphics isn't really buying graphics," he argued. "They don't care about it."
"You're working out what you can really afford to put on a CPU, and you're selling it to a customer who doesn't care. The economics don't make sense." he added.
45 Comments on NVIDIA Quashes Larrabee and Fusion Hype
It will sell like water!!
For laptops, UMPCs, low-cost machines, etc...
@Dude who mentioned the expensive-millimeter thing from the NVIDIA guy, I was so gonna post that but I didn't because you already put it there. Thanks man.
OT - I wouldn't be surprised if Intel has some kind of smart-ass comment in return for these statements from nVidia...
...although I'm fairly certain AMD will keep their comments to themselves and allow Fusion sales (once released) to speak for themselves.
CPU = slow all-terrain vehicle
GPU = fast car designed for drag-racing
And using CUDA to translate x86 into something that a GPU can understand is called emulation. Emulation would make it perform worse than just using a CPU to begin with.
Besides that, modern CPUs are more complicated to design than modern GPUs. GPUs don't have branch predictors, ways to compensate for cache misses, out-of-order execution, etc., etc.
The short version: GPUs make bad general CPUs.
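The CPU-versus-GPU contrast in that comment can be illustrated with two tiny C loops (a sketch with made-up data, not benchmark code): the first is uniform, independent work that maps naturally onto wide parallel hardware, while the second is a data-dependent, branchy chain that leans on exactly the latency-hiding machinery (caches, branch prediction, out-of-order execution) that throughput-oriented GPUs of that era largely did without.

```c
/* Two toy loops (illustrative only) contrasting workload shapes. */
#include <stdio.h>

#define N 1024

int main(void)
{
    float a[N], b[N], c[N];
    int next[N];
    for (int i = 0; i < N; i++) {
        a[i] = (float)i;
        b[i] = (float)(N - i);
        next[i] = (i * 7 + 3) % N;   /* pseudo-random hop pattern */
    }

    /* GPU-friendly: every iteration is independent and does the same work,
     * so it can be spread across thousands of identical lanes. */
    for (int i = 0; i < N; i++)
        c[i] = a[i] * b[i];

    /* CPU-friendly: each step depends on the previous one and branches on
     * data, so performance hinges on branch prediction, caching, and
     * out-of-order execution rather than raw lane count. */
    int idx = 0;
    float acc = 0.0f;
    for (int step = 0; step < N; step++) {
        if (a[idx] > b[idx])
            acc += a[idx];
        else
            acc -= b[idx];
        idx = next[idx];             /* serial dependency: pointer-chase */
    }

    printf("c[10] = %.1f, acc = %.1f\n", c[10], acc);
    return 0;
}
```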