Sunday, July 6th 2008

Intel Larrabee Capable of 2 TFLOPs

German tech journal Heise caught up with Intel's Pat Gelsinger for an article discussing the company's past and future, as the silicon giant marks 40 years in business this 18th of July.

Among the several topics discussed, the most interesting was visual computing and Intel's plans for it. 'Larrabee' is the buzzword here: it is the codename of Intel's upcoming graphics processor (GPU) architecture, with which the company plans to take on established players such as NVIDIA and AMD, among others.

What's unique (so far) about Larrabee is that it's made up entirely of x86 processing cores, likely 32 of them. Here's a surprise: these cores are based on the design of the Pentium P54C, a 13+ year old x86 processor. Shrunk to the 45 nm fabrication process, each core will be assisted by a 512-bit SIMD unit and will support 64-bit addressing. Gelsinger says that 32 of these cores clocked at 2.00 GHz could belt out 2 TFLOPs of raw computational power, close to that of the upcoming AMD R700. Heise also reports that this GPU could have a TDP of as much as 300 W (peak).

With inputs from Heise
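The 2 TFLOPs figure is consistent with a simple peak-throughput calculation, assuming each 512-bit SIMD unit handles 16 single-precision values per cycle and a fused multiply-add counts as two FLOPs per lane; Intel has not spelled out these assumptions, so this is only a plausibility check:

```python
# Back-of-envelope check of the 2 TFLOPs claim for Larrabee.
# Assumptions (not confirmed by Intel): 16 fp32 lanes per 512-bit SIMD
# unit, one fused multiply-add (2 FLOPs) per lane per cycle.
cores = 32
clock_hz = 2.0e9                        # 2.00 GHz
lanes = 512 // 32                       # 16 single-precision lanes
flops_per_lane_per_cycle = 2            # multiply + add

peak_flops = cores * clock_hz * lanes * flops_per_lane_per_cycle
print(peak_flops / 1e12, "TFLOPs")      # ~2.05 TFLOPs
```

Under these assumptions the math works out to 2.048 TFLOPs, which matches the quoted figure.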

77 Comments on Intel Larrabee Capable of 2 TFLOPs

#1
tkpenalty
I want actual pics of the card...
Posted on Reply
#2
magibeg
wow 300 watts eh? Could heat the lower level of my house. I get the strange feeling this is either going to horribly flop or do incredibly well. Very little middle ground :P
Posted on Reply
#3
Morgoth
Fueled by Sapphire
tkpenalty: I want actual pics of the card...
i'm not sure but i thought it was an integrated GPU on Nehalem with 2 cores + HT :S
now i have seen this
i'm starting to get confused lol



btarunr: It's spelled Larrabee
Posted on Reply
#4
btarunr
Editor & Senior Moderator
So, that 150W connector is 10-pin? (I'll eliminate 12-pin since two 6-pin connectors have a blank pin each that could be shared with pin #3)
Posted on Reply
#5
1c3d0g
300 watts. Can you say nuclear reactor? What happened to efficiency, Intel? :(

I don't like where GPU's are heading. There's too much power draw for so little performance increase. This goes for all GPU makers. Something needs to be done to bring power demands back in line with the rest of computer components. They have enough trouble as it is squeezing last-generation high-end GPU's in notebooks, but this is just ridiculous.
Posted on Reply
#6
MilkyWay
most whack gpu ive ever seen

whoever thought of taking 32 old cpus and making a gpu out of them is either incredibly stupid or amazingly crafty

im not even going to bother saying much because we all know nothing matters until we get results

STILL i know for a fact 300w is a lot for a gpu, i mean you could nearly run a full pc on that
Posted on Reply
#7
Voyager
x86 cores :twitch:
Now we can run existing software on GPU :toast:
Posted on Reply
#8
DonInKansas
I'd never have to run the heater in the winter; I'll just play more games!:p
Posted on Reply
#9
MilkyWay
wonder if you could run programs on the gpu instead of the cpu LOL this card boggles me completely
Posted on Reply
#10
btarunr
Editor & Senior Moderator
Why so much fuss about its TDP? Wasn't the HD2900 XT like 200W (peak)?
Posted on Reply
#11
Mussels
Freshwater Moderator
based on the design of Pentuim P54C
Pentuim? is that some CPU i never heard of?

*cough spellcheck*

300W TDP... gack.
Posted on Reply
#12
jyoung75
2 TFLOPS by Larrabee a year from now is nice, but I can get 2.4 TFLOPS from the Radeon 4870x2 a month from now. And the Radeon cards are already rumored to be ray tracing monsters (used for ray tracing HD scenes in Transformers) www.tgdaily.com/content/view/38145/135/.
Posted on Reply
#13
TheGuruStud
Since when is a general purpose cpu going to be able to process graphics at a respectable rate?

If that was the case, everyone with a quad core would be getting 50 FPS in 3dmark with the cpu test (I don't care if it has high speed ram and cache attached or not). I'm calling intel retarded, again.

edit: Or it's more fud. Like that 10 GHz pentium 4 they just had laying around :laugh:
Posted on Reply
#14
btarunr
Editor & Senior Moderator
TheGuruStud: Since when is a general purpose cpu going to be able to process graphics at a respectable rate?

If that was the case, everyone with a quad core would be getting 50 FPS in 3dmark with the cpu test (I don't care if it has high speed ram and cache attached or not). I'm calling intel retarded, again.
If a ~70 GFLOPs Core 2 Extreme can do ~6 fps, guess what 2000 GFLOPs can.
Posted on Reply
#15
dracoonpit
1c3d0g: I don't like where GPU's are heading. There's too much power draw for so little performance increase. This goes for all GPU makers. Something needs to be done to bring power demands back in line with the rest of computer components.
I disagree. Compared to the performance increase, GPUs have become more efficient with every new model. The performance-per-watt ratio of new GPUs is better, no matter if they need 200 watts.
Posted on Reply
#16
TheGuruStud
btarunr: If a ~70 GFLOPs Core 2 Extreme can do ~6 fps, guess what 2000 GFLOPs can.
That's 100% theoretical max. I have a lot more faith in a c2e than some untested, whack design that's supposed to be from old architecture. If it's new architecture or at least mostly from the ground up, then I'll be quiet. What they're claiming is just ridiculous.

To me, this is like M$ saying the xbox 360 is fast b/c it has tri-core and runs at 3.2 GHz. But in reality there's not many transistors and it just can't push much data.
Posted on Reply
#17
eidairaman1
The Exiled Airman
intel can claim this and claim that, by the time they release the card it will already be obsolete by Nvidia and AMD, hell even Via.
Posted on Reply
#18
Mussels
Freshwater Moderator
the only thing this has over video cards, is the x86 architecture. that means you can effectively add 32 CPU cores to any machine. ANY app should be able to use it (games, encoding/decoding apps, etc)
Posted on Reply
#19
TheGuruStud
eidairaman1: intel can claim this and claim that, by the time they release the card it will already be obsolete by Nvidia and AMD, hell even Via.
VIA!!! Fastest. Stuff. Ever. :laugh:

Seriously, though, VIA was cool back in the day, but they pissed me off when the athlon 64s came out. Those boards were slow and buggy.
Posted on Reply
#20
btarunr
Editor & Senior Moderator
TheGuruStud: That's 100% theoretical max. I have a lot more faith in a c2e than some untested, whack design that's supposed to be from old architecture. If it's new architecture or at least mostly from the ground up, then I'll be quiet. What they're claiming is just ridiculous.

To me, this is like M$ saying the xbox 360 is fast b/c it has tri-core and runs at 3.2 GHz. But in reality there's not many transistors and it just can't push much data.
Just as you say you'd be silent if it was something built from scratch, you can't be loud about this either. As for scratch, these are 'old' processors, but shrunk, clocked to 2 GHz, .....etc. When something of this sort comes from Gelsinger, it's better we not jump to assumptions that it's a 'bad' architecture, since we've seen nothing to prove it's bad just as yet.
Posted on Reply
#21
Mussels
Freshwater Moderator
btarunr: Just as you say you'd be silent if it was something built from scratch, you can't be loud about this either. As for scratch, these are 'old' processors, but shrunk, clocked to 2 GHz, .....etc. When something of this sort comes from Gelsinger, it's better we not jump to assumptions that it's a 'bad' architecture, since we've seen nothing to prove it's bad just as yet.
since the core architecture (core solo/duo, and then onto core 2 duo/quad) designs came from a pentium 3 tualatin, his argument really falls down anyway. old cores that hit a tech limit can really be revitalised with new tech and die shrinks.
Posted on Reply
#22
panchoman
Sold my stars!
WTF you call that a gpu? thats not a gpu! thats 32 p4 cores stuck together on a card with a 300w tdp after a huge ass die shrink! WTF intel.. i expected much much much more from you, come on.. 32 p4 cores stuck together to make a gpu...
Posted on Reply
#23
btarunr
Editor & Senior Moderator
A quad-core QX9770 draws 130W, isn't 32 cores @ 300W an improvement?
Posted on Reply
#24
Mussels
Freshwater Moderator
well 300W TDP isnt so bad... oh wait yes it is. TDP means it's not max, so it could even go up to 400W real draw.

That said, this is intel. they could easily throw in some power saving features and have its power usage scale really well (modified speedstep, for example)
Posted on Reply
#25
Morgoth
Fueled by Sapphire
panchoman: WTF you call that a gpu? thats not a gpu! thats 32 p4 cores stuck together on a card with a 300w tdp after a huge ass die shrink! WTF intel.. i expected much much much more from you, come on.. 32 p4 cores stuck together to make a gpu...
Those are not Netburst !
Posted on Reply