Monday, August 25th 2008
NVIDIA Quashes Larrabee and Fusion Hype
NVIDIA took potshots at Intel's upcoming Larrabee graphics processor and AMD's "GPU on CPU" device, the Fusion. Speaking to the media ahead of the opening of the annual NVISION expo on Monday, Andy Keane, general manager of NVIDIA's GPU computing group, said there is an "incredible amount about Larrabee that's undefined," commenting on how little is actually known about Intel's GPU.
"You can't just say 'it's x86 so it's going to solve the massively parallel computing problem.'" said Keane. "Look at the PC," he continued. "With an OS they don't control, and applications coming from everywhere... to say arbitrarily that everything's going to scale to 32 cores seems to me to be a bit of a stretch. " he added.
John Mottram, chief architect of NVIDIA's G200 graphics processor, raised further doubts about Larrabee's real-world performance, brushing aside Intel's announcements as a PR stunt. He is quoted as saying:
"They've put out a certain amount of technical disclosure in the past five weeks," he noted, "but although they make Larrabee sound like it's a fundamentally better approach, it isn't. They don't tell you the assumptions they made. They talk about scaling, but they disregard memory bandwidth. They make it sound good, but we say, you neglected half a dozen things."
"Every GPU we make, we always consider this type of design, we do a reasoned analysis, and we always conclude no. That's why we haven't built that type of machine."
Essentially, NVIDIA feels Intel is creating too much hype over something that doesn't look like it can take on established players, since doing so would require something more than just 32 x86 cores. Peter Glaskowsky, a CPU architect and blogger, says that Larrabee in 2010 will only offer the level of performance that NVIDIA and ATI GPUs had back in 2006.
Mottram didn't spare AMD either. In the line of fire was AMD's upcoming Fusion processor, a CPU with an embedded graphics processor.
"Joining both components on the same die doesn't buy you that much," he commented. "It's not like there's a real bottleneck there. And every square millimeter you add to the die is a very expensive millimeter. It's an incremental expense, not a linear function. It's cheaper to separate them."
Andy Keane doubted whether buyers would even care about Fusion. "The class of buyer who buys that type of graphics isn't really buying graphics," he argued. "They don't care about it."
"You're working out what you can really afford to put on a CPU, and you're selling it to a customer who doesn't care. The economics don't make sense." he added.
Source: PC Pro
"You can't just say 'it's x86 so it's going to solve the massively parallel computing problem.'" said Keane. "Look at the PC," he continued. "With an OS they don't control, and applications coming from everywhere... to say arbitrarily that everything's going to scale to 32 cores seems to me to be a bit of a stretch. " he added.
John Mottram, the chief architect for the NVIDIA G200 graphics processor raised further doubts about Larrabee's real-world performance, brushing aside Intel's announcements as a PR stunt. He is quoted saying:
"They've put out a certain amount of technical disclosure in the past five weeks," he noted, "but although they make Larrabee sound like it's a fundamentally better approach, it isn't. They don't tell you the assumptions they made. They talk about scaling, but they disregard memory bandwidth. They make it sound good, but we say, you neglected half a dozen things."
"Every GPU we make, we always consider this type of design, we do a reasoned analysis, and we always conclude no. That's why we haven't built that type of machine."
Essentially NVIDIA feels Intel is creating too much of a hype over something that doesn't look like it can take on established players, as it would need some mysterious powerful machinery apart from 32 x86 cores. Peter Glaskowsky, a CPU Architect and blogger says that Larrabee in 2010 will have the same level of performance as a GPU from NVIDIA or ATI had back in 2006.
Mottram didn't spare AMD either. In the line of fire was AMD's upcoming Fusion processor, a CPU with a graphics processor embedded.
"Joining both components on the same die doesn't buy you that much," he commented. "It's not like there's a real bottleneck there. And every square millimeter you add to the die is a very expensive millimeter. It's an incremental expense, not a linear function. It's cheaper to separate them."
Andy Keane doubted whether buyers would even care about Fusion. "The class of buyer who buys that type of graphics isn't really buying graphics," he argued. "They don't care about it."
"You're working out what you can really afford to put on a CPU, and you're selling it to a customer who doesn't care. The economics don't make sense." he added.
45 Comments on NVIDIA Quashes Larrabee and Fusion Hype
I think Larrabee, even if it flops, is still worth the time spent, as it might open avenues for developers; but Fusion, I have to agree, seems more of a novelty than anything. Though if it brings some light to AMD's darkened corner of existence, then by all means!
Cheaper to separate them? No. That's why the CPU contains the FPU today, whereas before it sat on a separate x87 chip. It's FAR cheaper to combine them, up to some critical point that is a function of heat and bad dies: roughly fn(power consumption, total die size, process node, defect rate per mm² of die). A rough sketch of that trade-off follows below.
Oh look! The NVIDIA equation isn't good: too much power consumption, too big a die, too coarse a process node, and too high a defect rate per mm².
So NVIDIA wouldn't be able to pull it off. But I'm sure Intel or AMD could! :roll:
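Taking that cost function at face value, here is a toy comparison of one fused CPU+GPU die against two separate dies plus an extra package, using a simple Poisson yield model. All numbers (wafer cost, defect density, per-package cost, die sizes) are invented for illustration; the point is only to show the "critical point" where the yield penalty of a big combined die outweighs the saved package.

```python
# Toy comparison: one combined CPU+GPU die vs. two separate dies plus an extra
# package. Yield model and all numbers are invented for illustration only.
import math

def cost_per_good_die(area_mm2: float, wafer_cost: float = 5000.0,
                      wafer_area_mm2: float = 70000.0, defects_per_mm2: float = 0.002) -> float:
    """Wafer cost spread over the dies that survive a Poisson defect model."""
    dies = wafer_area_mm2 / area_mm2
    good_fraction = math.exp(-defects_per_mm2 * area_mm2)
    return wafer_cost / (dies * good_fraction)

def combined_vs_separate(cpu_mm2: float, gpu_mm2: float, package_cost: float = 10.0):
    fused = cost_per_good_die(cpu_mm2 + gpu_mm2) + package_cost               # one die, one package
    split = cost_per_good_die(cpu_mm2) + cost_per_good_die(gpu_mm2) + 2 * package_cost
    return fused, split

if __name__ == "__main__":
    for cpu, gpu in ((100, 50), (200, 100), (300, 200)):
        fused, split = combined_vs_separate(cpu, gpu)
        winner = "fusion wins" if fused < split else "separate wins"
        print(f"CPU {cpu} + GPU {gpu} mm^2: fused ${fused:.0f} vs separate ${split:.0f} ({winner})")
    # Small dies: the saved package makes integration cheaper. As the combined
    # die grows, falling yield flips the result -- the "critical point" driven
    # by die size and defect rate (and, in practice, by the power budget too).
```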
People buying the can are buying graphics :P
NVIDIA will probably be on its own.
NVIDIA trashing AMD is not good, as AMD lets NVIDIA chipsets/GPUs/motherboards etc. be used in its servers/desktops/laptops etc.
To an earlier comment about NVIDIA on Intel chipsets (sorry, didn't see your post again when I scanned through): that is the MFGs putting NVIDIA's SLI chip on the boards, not Intel. As long as NVIDIA has a large fan base and a good product, I think the MFGs will still work with NVIDIA. Right now NVIDIA is fighting head to head with ATI, so trying to cut them out would be a big mistake. If ATI/AMD and Intel can build something that clearly outperforms NVIDIA, then I think NVIDIA will be out.
I still think the first chance NVIDIA gets, they will buy an x86 company/license.