Monday, August 25th 2008

NVIDIA Quashes Larrabee and Fusion Hype

NVIDIA took potshots at Intel's upcoming Larrabee graphics processor and at Fusion, AMD's "GPU on CPU" device. Speaking to the media ahead of the opening of the annual NVISION expo on Monday, Andy Keane, general manager of NVIDIA's GPU computing group, said there is an "incredible amount about Larrabee that's undefined," commenting on the little that is publicly known about Intel's GPU.

"You can't just say 'it's x86 so it's going to solve the massively parallel computing problem.'" said Keane. "Look at the PC," he continued. "With an OS they don't control, and applications coming from everywhere... to say arbitrarily that everything's going to scale to 32 cores seems to me to be a bit of a stretch. " he added.

John Mottram, the chief architect of the NVIDIA G200 graphics processor, raised further doubts about Larrabee's real-world performance, brushing aside Intel's announcements as a PR stunt. He is quoted as saying:

"They've put out a certain amount of technical disclosure in the past five weeks," he noted, "but although they make Larrabee sound like it's a fundamentally better approach, it isn't. They don't tell you the assumptions they made. They talk about scaling, but they disregard memory bandwidth. They make it sound good, but we say, you neglected half a dozen things."

"Every GPU we make, we always consider this type of design, we do a reasoned analysis, and we always conclude no. That's why we haven't built that type of machine."

Essentially, NVIDIA feels Intel is creating too much hype over something that doesn't look capable of taking on established players, as it would need some mysteriously powerful machinery beyond its 32 x86 cores. Peter Glaskowsky, a CPU architect and blogger, says that Larrabee in 2010 will only offer the level of performance NVIDIA and ATI GPUs had back in 2006.

Mottram didn't spare AMD either. In the line of fire was AMD's upcoming Fusion processor, a CPU with a graphics processor embedded.

"Joining both components on the same die doesn't buy you that much," he commented. "It's not like there's a real bottleneck there. And every square millimeter you add to the die is a very expensive millimeter. It's an incremental expense, not a linear function. It's cheaper to separate them."

Andy Keane doubted whether buyers would even care about Fusion. "The class of buyer who buys that type of graphics isn't really buying graphics," he argued. "They don't care about it."

"You're working out what you can really afford to put on a CPU, and you're selling it to a customer who doesn't care. The economics don't make sense." he added.
Source: PC Pro

45 Comments on NVIDIA Quashes Larrabee and Fusion Hype

#1
newconroer
Thanks BT, good to hear some responses to Larrabee and Fusion I suppose.

I think Larrabee, even if it flops, is still worth the time spent, as it might open avenues for developers; but Fusion, I have to agree, seems more like a novelty than anything. Though if it brings some light to AMD's darkened corner of existence, then by all means!
Posted on Reply
#2
Lillebror
Mottram didn't spare AMD either. In the line of fire was AMD's upcoming Fusion processor, a CPU with a graphics processor embedded.

"Joining both components on the same die doesn't buy you that much," he commented. "It's not like there's a real bottleneck there. And every square millimeter you add to the die is a very expensive millimeter. It's an incremental expense, not a linear function. It's cheaper to separate them."

Andy Keane doubted whether buyers would even care about Fusion. "The class of buyer who buys that type of graphics isn't really buying graphics," he argued. "They don't care about it."

"You're working out what you can really afford to put on a CPU, and you're selling it to a customer who doesn't care. The economics don't make sense." he added.
I think most of those who are gonna buy a thing like that will use it with programs that use a GPU - cause then you have a GPU and a CPU in one and the same chip!
Posted on Reply
#3
W1zzard
btarunr"The class of buyer who buys that type of graphics isn't really buying graphics,"
that's the whole point of fusion. dirt cheap OEM systems that need to be able to run Vista Aero, play back some basic video and be cheap, cheap, cheap. this is by far the biggest market in the PC industry, about 5,000 times bigger (educated guess) than all this GTX 260/280, 4870 X2 stuff
Posted on Reply
#4
[I.R.A]_FBi
W1zzard: that's the whole point of fusion. dirt cheap OEM systems that need to be able to run Vista Aero, play back some basic video and be cheap, cheap, cheap. this is by far the biggest market in the PC industry, about 5,000 times bigger (educated guess) than all this GTX 260/280, 4870 X2 stuff
if it's cheaper and does the same stuff they don't care, chances are they won't even notice
Posted on Reply
#5
Siman0
Nvidia needs to watch what they are saying, mostly to Intel. What if they say "oh well, we don't want that old multi-GPU thing on our board, it just takes up space"?
Posted on Reply
#6
Swansen
I think that is something Nvidia is missing here: the fact that "those people won't care" is exactly why it will work. Like someone else said, it will be able to run a 3D desktop as well as some minor 3D applications, all in a smaller space. Seems like a good idea to me, especially for laptops and UMPCs.
Posted on Reply
#7
Jansku07
[I.R.A]_FBi: if it's cheaper and does the same stuff they don't care, chances are they won't even notice
The users won't notice this, but OEMs will. Every cent that cheapens the PC is a big win for them. Fusion, if carried out successfully, will be a great success for both OEMs and DAAMIT.
Posted on Reply
#8
[I.R.A]_FBi
Jansku07: The users won't notice this, but OEMs will. Every cent that cheapens the PC is a big win for them. Fusion, if carried out successfully, will be a great success for both OEMs and DAAMIT.
And that is what is going to rake in the sales.
Posted on Reply
#9
Jansku07
.. every square millimeter you add to the die is a very expensive millimeter. It's an incremental expense, not a linear function.
Says the head architect of G200 core... :roll:
Posted on Reply
#10
HTC
.. every square millimeter you add to the die is a very expensive millimeter. It's an incremental expense, not a linear function.
Jansku07: Says the head architect of G200 core... :roll:
Quoted for truth!
Posted on Reply
#11
lemonadesoda
Mottram: It's an incremental expense, not a linear function. It's cheaper to separate them.
I think he means exponentially increasing, not incremental.

Cheaper to separate them? No. That's why the CPU contains FPU math today, whereas before it was on a separate x87 chip. It's FAR cheaper to combine them, up to some critical point that is a combination of heat and bad dies: cost = fn(heat, bad dies), which we can also write as cost = fn(power consumption, total die size, process node, defect rate per mm² of die).

Oh look! The nVidia equation isn't good. Too much power consumption. Too big a die. Too coarse a process node. And too many fails per mm².

So nVidia wouldn't be able to pull it off. But I'm sure Intel or AMD could! :roll:
Posted on Reply
#13
btarunr
Editor & Senior Moderator
[I.R.A]_FBi: nvidia + gpu + cpu = fail
It won't happen, they've yet to get a proper x86 license. So maybe this is true:
"Every GPU we make, we always consider this type of design, we do a reasoned analysis, and we always conclude no. That's why we haven't built that type of machine."
...because they can't. They have yet to even get into out-of-order execution processors.
Posted on Reply
#14
xfire
Who's still waiting for the can to open?
People buying the can are buying graphics :P
Posted on Reply
#15
Morgoth
Fueled by Sapphire
nvidia should stfu, they're still in the game because of intel and amd
Posted on Reply
#17
Morgoth
Fueled by Sapphire
i think as soon as intel gets a good gpu and nvidia still doesn't have an x86 license, nvidia will probably be on its own
Posted on Reply
#18
X1REME
nvidia is building its own x86 cpu and is about to announce it any day now, but they don't have an x86 licence, so i don't understand what they're gonna do about that (maybe use VIA = Atom)

nvidia trashing amd is not good, as amd lets nvidia use its chipsets/gpus/motherboards etc. in servers/desktops/laptops etc.
Posted on Reply
#19
Nkd
I am an nvidia owner but they seem to be getting desperate. They know they have no answer for Fusion, and AMD will be taking a lot of OEM systems and eliminating the need for dedicated graphics. Nvidia is just downplaying Fusion because they have no answer for it, or they will be late with that answer.
Posted on Reply
#20
Darkrealms
That is a lot of talking for Nvidia right now. They better have something good with their GTX300 series or they could be in trouble.

To an earlier comment about Nvidia on the Intel chipsets (sorry, didn't see your post again when I scanned through): that is the MFGs putting Nvidia's SLI chip on the boards, not Intel. As long as Nvidia has a large fan base and a good product, I think the MFGs will still work with Nvidia. Right now Nvidia is fighting head to head with ATI, so trying to cut them out would be a big mistake. If ATI/AMD and Intel can build something that clearly outperforms Nvidia, then I think Nvidia will be out.

I still think the first chance Nvidia gets, they will buy an x86 company/license.
Posted on Reply
#21
PP Mguire
Morgoth: nvidia should stfu, they're still in the game because of intel and amd
No, because their chipset market is only a small percent of where their money comes from.
Posted on Reply
#22
Unregistered
Nkd: I am an nvidia owner but they seem to be getting desperate. They know they have no answer for Fusion, and AMD will be taking a lot of OEM systems and eliminating the need for dedicated graphics. Nvidia is just downplaying Fusion because they have no answer for it, or they will be late with that answer.
agreed! it's a smart company though, it will come up with something.
Posted on Reply
#23
Swansen
truehighroller1: I want to see what Intel says about this outlashing.
Intel is huge, I think they couldn't care less.
Posted on Reply
#24
btarunr
Editor & Senior Moderator
truehighroller1: I want to see what Intel says about this outlashing.
Actually this is a result of Intel rubbishing the growing popularity of NVIDIA CUDA. It's a counteroffensive. Dig thru our news archives, you'll find the story.
Posted on Reply
#25
candle_86
Nkd: I am an nvidia owner but they seem to be getting desperate. They know they have no answer for Fusion, and AMD will be taking a lot of OEM systems and eliminating the need for dedicated graphics. Nvidia is just downplaying Fusion because they have no answer for it, or they will be late with that answer.
not true, for media enthusiasts to accelerate hi-def, a dedicated GPU with high memory bandwidth is a must. And as for gaming, well, nuff said. And intel hasn't built a quality GPU in 15 years. The last decent thing they built was the i740, to compete with the Riva128.
Posted on Reply