Monday, August 25th 2008

NVIDIA Quashes Larrabee and Fusion Hype

NVIDIA took potshots at Intel's upcoming Larrabee graphics processor and AMD's "GPU on CPU" device, the Fusion. Speaking to the media ahead of the opening of the annual NVISION expo on Monday, Andy Keane, general manager of NVIDIA's GPU computing group, said there is an "incredible amount about Larrabee that's undefined," commenting on how little is actually known about Intel's GPU.

"You can't just say 'it's x86 so it's going to solve the massively parallel computing problem.'" said Keane. "Look at the PC," he continued. "With an OS they don't control, and applications coming from everywhere... to say arbitrarily that everything's going to scale to 32 cores seems to me to be a bit of a stretch. " he added.

John Mottram, chief architect of the NVIDIA G200 graphics processor, raised further doubts about Larrabee's real-world performance, brushing aside Intel's announcements as a PR stunt. He is quoted as saying:

"They've put out a certain amount of technical disclosure in the past five weeks," he noted, "but although they make Larrabee sound like it's a fundamentally better approach, it isn't. They don't tell you the assumptions they made. They talk about scaling, but they disregard memory bandwidth. They make it sound good, but we say, you neglected half a dozen things."

"Every GPU we make, we always consider this type of design, we do a reasoned analysis, and we always conclude no. That's why we haven't built that type of machine."

Essentially, NVIDIA feels Intel is creating too much hype over something that doesn't look capable of taking on the established players, since it would need some mysterious, powerful machinery beyond its 32 x86 cores. Peter Glaskowsky, a CPU architect and blogger, says that Larrabee in 2010 will only offer the level of performance a GPU from NVIDIA or ATI had back in 2006.
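Mottram's memory-bandwidth objection is easy to make concrete with back-of-the-envelope numbers. The sketch below is purely illustrative; the core count aside, the clock speed, FLOPs per cycle, and bandwidth figures are assumptions chosen for round numbers, not Intel or NVIDIA specifications.

```
// Rough roofline-style estimate. All figures below are illustrative
// assumptions, not published specifications.
#include <cstdio>

int main() {
    // Hypothetical many-core chip: 32 cores at 2.0 GHz, 16 single-precision
    // FLOPs per core per cycle (e.g. a wide vector FMA unit). Assumed numbers.
    const double cores = 32.0, ghz = 2.0, flops_per_cycle = 16.0;
    const double peak_gflops = cores * ghz * flops_per_cycle;   // 1024 GFLOP/s

    // A streaming kernel such as y[i] = a*x[i] + y[i] performs 2 FLOPs per
    // 12 bytes moved (read x, read y, write y).
    const double intensity = 2.0 / 12.0;                        // FLOP per byte

    // DRAM bandwidth needed to sustain peak on that kernel:
    const double needed_gbs = peak_gflops / intensity;          // ~6144 GB/s

    // Typical desktop memory subsystem of the era (assumed): ~25 GB/s.
    const double available_gbs = 25.0;

    std::printf("Peak compute:        %.0f GFLOP/s\n", peak_gflops);
    std::printf("Bandwidth needed:    %.0f GB/s\n", needed_gbs);
    std::printf("Bandwidth available: %.0f GB/s (%.1f%% of what the kernel wants)\n",
                available_gbs, 100.0 * available_gbs / needed_gbs);
    return 0;
}
```

Under these assumed numbers, the memory system can feed less than one percent of the chip's peak arithmetic rate on a streaming workload, which is the sort of gap Mottram is alluding to when he says Intel talks about scaling but disregards memory bandwidth.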

Mottram didn't spare AMD either. In the line of fire was AMD's upcoming Fusion processor, a CPU with a graphics processor embedded.

"Joining both components on the same die doesn't buy you that much," he commented. "It's not like there's a real bottleneck there. And every square millimeter you add to the die is a very expensive millimeter. It's an incremental expense, not a linear function. It's cheaper to separate them."

Andy Keane doubted whether buyers would even care about Fusion. "The class of buyer who buys that type of graphics isn't really buying graphics," he argued. "They don't care about it."

"You're working out what you can really afford to put on a CPU, and you're selling it to a customer who doesn't care. The economics don't make sense." he added.
Source: PC Pro

45 Comments on NVIDIA Quashes Larrabee and Fusion Hype

#26
djisas
Lemme just say this about fusion:
It will sell like water!!
For laptops, umpc, low cost machines, etc...
Posted on Reply
#27
WarEagleAU
Bird of Prey
Actually candle, the way AMD is putting out the specs for Fusion, it will do hi-def very well, not to mention the CPU is gonna be a multi-core CPU, so it will have plenty of bandwidth to do what's needed. This is a boon not only for laptops, HTPCs and netbooks, but also for entertainment devices, cell phones, etc. It's just the beginning. I for one would love to get my hands on a kick-ass Fusion chip with a decent GPU (hell, an HD 3200 would work) and work out some sort of CrossFire power-saving feature on my desktop.

@Dude who mentioned the expensive millimeter thingy from the NVIDIA guy: I was so gonna post that, but I didn't 'cause you put it there. Thanks man.
Posted on Reply
#28
Hayder_Master
No, I don't think so. NVIDIA could make good CPUs; maybe they want to tell AMD "we can make CPUs too," but that hasn't happened.
Posted on Reply
#29
PP Mguire
I think NVIDIA needs to rework a GPU to be used as a CPU, since a GPU is so much faster. Think of the possibilities if ATI started making AMD CPUs. AMD would be king until Intel found a way to steal what they were doing.
Posted on Reply
#30
Morgoth
Fueled by Sapphire
Won't work.
Posted on Reply
#31
PP Mguire
Oh yeah? I remember when people used to say dual video cards wouldn't work, back in the Voodoo 2 days. GPUs are already processing CPU workloads via CUDA. If there's a will, there's a way.
Posted on Reply
#33
candle_86
A GPU can do 95% of the work a CPU can; add a few hundred more transistors to it and you'll have a GPU that can double as a CPU. You can also use CUDA to translate x86 into something a GPU understands.
Posted on Reply
#34
WhiteLotus
The Fusion is going to be a godsend to every laptop owner out there.
Posted on Reply
#35
candle_86
Depends on power, really. How many low-power Atom-class chips can AMD make at 45 nm?
Posted on Reply
#36
PP Mguire
candle_86 said: A GPU can do 95% of the work a CPU can; add a few hundred more transistors to it and you'll have a GPU that can double as a CPU. You can also use CUDA to translate x86 into something a GPU understands.
Exactly. People say it's different only because people think GPU = graphics. Well, just program it to do something else.
Posted on Reply
#37
imperialreign
W1zzard said: that's the whole point of Fusion. Dirt-cheap OEM systems that need to be able to run Vista Aero, play back some basic video, and be cheap, cheap, cheap. This is by far the biggest market in the PC industry, about 5,000 times bigger (educated guess) than all this GTX 260/280, 4870 X2 stuff.
Completely agree, and unless I'm mistaken, this OEM market was intended to be the primary market for Fusion since AMD's initial acknowledgment of its existence.

OT - I wouldn't be surprised if Intel has some kind of smart-ass comment in return for these statements from nVidia . . .

. . . although I'm fairly certain AMD will keep their comments to themselves and let Fusion sales (once released) speak for themselves.
Posted on Reply
#38
JRMBelgium
So what? NVIDIA created GeForce 9 hype...
Posted on Reply
#39
Lillebror
candle_86 said: A GPU can do 95% of the work a CPU can; add a few hundred more transistors to it and you'll have a GPU that can double as a CPU. You can also use CUDA to translate x86 into something a GPU understands.
It's not quite that easy! You have to redo the whole architecture and change a lot of stuff - that's gonna take years.
Posted on Reply
#40
JRMBelgium
Lillebror said: It's not quite that easy! You have to redo the whole architecture and change a lot of stuff - that's gonna take years.
Indeed.

CPU = slow all-terrain vehicle
GPU = fast car designed for drag racing
Posted on Reply
#41
Wile E
Power User
candle_86 said: A GPU can do 95% of the work a CPU can; add a few hundred more transistors to it and you'll have a GPU that can double as a CPU. You can also use CUDA to translate x86 into something a GPU understands.
A GPU is terrible at general-purpose computing. The architecture is not efficient at anything but floating-point calculations; it does horribly at everything else, which is where most desktop apps live.

And using CUDA to translate x86 into something a GPU can understand is called emulation. Emulation would make it perform worse than just using a CPU to begin with.

Besides that, modern CPUs are more complicated to design than modern GPUs. GPUs don't have branch predictors, ways to compensate for cache misses, out-of-order execution, etc.

The short version: GPUs make bad general-purpose CPUs.
Posted on Reply
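For readers curious what GPU code actually looks like, below is a minimal CUDA sketch of the kind of work a GPU excels at: a million independent multiply-adds (SAXPY) with no control flow between elements. The array size and launch configuration are arbitrary, illustrative assumptions. Note that the kernel is ordinary C compiled for the GPU, not x86 translated on the fly, which is relevant to the emulation point above.

```
// Minimal CUDA sketch of the workload GPUs are built for: a large batch of
// independent, branch-free floating-point operations (SAXPY). Sizes and
// launch parameters are arbitrary, illustrative assumptions.
#include <cstdio>
#include <cuda_runtime.h>

// Each thread handles one element; no element depends on its neighbours,
// so thousands of threads can run the same instruction in parallel.
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;                    // 1M elements (arbitrary)
    const size_t bytes = n * sizeof(float);

    float* hx = new float[n];
    float* hy = new float[n];
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);  // 4096 blocks x 256 threads
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost); // copy back (implicitly syncs)

    std::printf("y[0] = %.1f (expected 5.0)\n", hy[0]);

    cudaFree(dx); cudaFree(dy);
    delete[] hx; delete[] hy;
    return 0;
}
```

Contrast that with a linked-list traversal or a branchy parser: there is only one "element" at a time, so nearly all of those threads would sit idle, which is why a GPU makes a poor general-purpose CPU.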
#42
xfire
Imagine the big fans and the power consumption just for the CPU. Everyone would have to buy high-capacity PSUs, and you can forget about HTPCs.
Posted on Reply
#43
PP Mguire
Everybody uses big fans on CPUs anyways, either that or watercooling.
Posted on Reply
#44
xfire
Where did you get your statistics from?
Posted on Reply
#45
Darkrealms
Jelle Mees said: Indeed.

CPU = slow all-terrain vehicle
GPU = fast car designed for drag racing
LoL, I like that!
Posted on Reply