Wednesday, July 2nd 2008

Intel Downplays the Growing Popularity of NVIDIA CUDA

Pat Gelsinger, co-general manager of Intel's Digital Enterprise Group, told Custom PC that NVIDIA's CUDA programming model will be nothing more than an interesting footnote in the annals of computing history.

Gelsinger says that programmers simply don't have enough time to learn how to program for new architectures like CUDA. In his words: "The problem that we've seen over and over and over again in the computing industry is that there's a cool new idea, and it promises a 10x or 20x performance improvement, but you've just got to go through this little orifice called a new programming model. Those orifices have always been insurmountable as long as the general purpose computing models evolve into the future." He points to the Sony Cell processor, which never lived up to its hype as something superior to existing computing architectures, as proof of his point.
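To illustrate the kind of shift Gelsinger is describing, here is a minimal, hypothetical sketch (not code from Intel or NVIDIA) of how even a simple array operation looks under CUDA's programming model, where the familiar sequential loop is replaced by a kernel executed across thousands of GPU threads:

    // Hypothetical CUDA kernel: scale an array on the GPU.
    // Each thread handles exactly one element; the loop disappears into the grid.
    __global__ void scale(float *data, float factor, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's element index
        if (i < n)              // threads launched past the end of the array do nothing
            data[i] *= factor;
    }

    // Host code must first copy the data into GPU memory, then launch
    // enough 256-thread blocks to cover all n elements:
    // scale<<<(n + 255) / 256, 256>>>(d_data, 2.0f, n);

Thinking in terms of grids, blocks, and threads, and managing separate host and device memory, is exactly the "orifice" the quote describes.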

Gelsinger adds that Intel's Larrabee graphics chip will be based entirely on Intel Architecture x86 cores, precisely so that developers can program the graphics processor without having to learn a new language. Larrabee will also have full support for APIs like DirectX and OpenGL.
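For comparison, the same operation on a conventional x86 core is an ordinary C loop that existing compilers, debuggers, and libraries already handle, which is presumably the convenience Gelsinger is claiming for Larrabee (again, only an illustrative sketch):

    /* The same operation written as plain C for an x86 core: no new
       programming model, just the loop developers already know. */
    void scale(float *data, float factor, int n)
    {
        for (int i = 0; i < n; i++)
            data[i] *= factor;
    }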
Source: Custom PC

20 Comments on Intel Downplays the Growing Popularity of NVIDIA CUDA

#1
W1zzard
The man is right, but Larrabee will have the same issues because its "programming model" is still different: massive parallelism. "Programmers simply don't have enough time to learn how to program for new architectures."
#2
btarunr
Editor & Senior Moderator
There's very little furniture in that GPU schematic; even the S3 Chrome S27 made it look pretty :p

jk
#3
lemonadesoda
W1zzard: "The man is right, but Larrabee will have the same issues because its 'programming model' is still different: massive parallelism. 'Programmers simply don't have enough time to learn how to program for new architectures.'"
True to a point... there are ready-to-go libraries for x86, and compilers that can be extended for Larrabee. While the average programmer will not get optimal performance out of Larrabee, they can work within their regular IDE and the compiler will do most of the work.

With CUDA (or PhysX, for that matter) you need a huge shift in how you think and develop, plus you need to learn the new programming model.
#4
Morgoth
Fueled by Sapphire
I agree with lemonadesoda. It takes a large amount of time and money to learn new code and update the game engines, mapping tools, and everything else that's part of this stuff.
#5
mdm-adph
W1zzard: "The man is right, but Larrabee will have the same issues because its 'programming model' is still different: massive parallelism. 'Programmers simply don't have enough time to learn how to program for new architectures.'"
Well, you know how it is -- they have to say that. With all this talk about CUDA and ATI's GPGPU and how disgustingly fast they're able to crunch numbers -- even if they're limited in what they can do -- Intel's starting to look left out of the loop to the public, at least where buzzwords like "folding" and "number crunching" are involved.
#6
Megasty
There is no acceptable learning curve for new game engines. That's why most games today only use one of the three proven engines. NVIDIA's best bet is to keep adapting CUDA to fit those engines. We all know about the storm that came about when physics was introduced into the Crytek engine. CryENGINE 2 is a monster, but CUDA can still only help it. Developing new CUDA drivers for every crackpot engine that comes around would be a complete waste.

That Larrabee chip looks really plain; maybe we'll be able to play Doom on it :p
#7
eidairaman1
The Exiled Airman
So if there is an orifice that these companies have to go through, why are ATI and NVIDIA the most popular among gamers where Intel isn't? I say Intel is feeling threatened by both ATI and NVIDIA.
#8
X-TeNDeR
^Exactly. Intel is afraid of all these new techs, which could some day compete with its best parts.
Number crunching is number crunching, and if some day GPUs can crunch numbers as well as a processor, that might be a problem for Intel.
#9
Morgoth
Fueled by Sapphire
No, because they don't have x86.
#10
WarEagleAU
Bird of Prey
Sony's Cell does live up to its hype, and I'm sure there is more.

I for one am interested in seeing what Intel does with their Larrabee cards and this ray tracing they are so fond of.
#11
HTC
Morgoth: "No, because they don't have x86."
Doesn't ATI have it, through AMD?

Dunno if it's possible, legally, though!
#12
eidairaman1
The Exiled Airman
Technically ATI doesn't exist as a company, just as a product line now, so yes, ATI does have x86 licensing through AMD.
#13
HTC
eidairaman1: "Technically ATI doesn't exist as a company, just as a product line now, so yes, ATI does have x86 licensing through AMD."
You sure?

*Begins to drool* ... :rockout:
#14
eidairaman1
The Exiled Airman
Well, most of the people in head positions at ATI left after the buyout, aka forced retirement (a good sum of money for them to quit). And it was a buyout, so AMD could implement an x86 graphics card, but the coding will be tough regardless, so I believe the route will be "if it isn't broke, don't fix it." Beyond that, isn't there a link here stating that recent ATI cards have some ray tracing capability anyway?
#16
tkpenalty
WarEagleAU: "Sony's Cell does live up to its hype, and I'm sure there is more. I for one am interested in seeing what Intel does with their Larrabee cards and this ray tracing they are so fond of."
AMD already beat Intel to it...
HTC: "You sure? *Begins to drool* ... :rockout:"
Indeed. However, "ATi" still hasn't been dropped. AMD, unlike NVIDIA, actually keeps what it buys alive.
#17
Darkrealms
Intel is making a weak argument because its attempt at the "gaming" graphics industry is so far behind.

They may well be right, but the current systems came about because people were willing to make them work. Both ATI and NVIDIA are willing to try to make theirs work.
#18
eidairaman1
The Exiled Airman
tkpenalty: "AMD already beat Intel to it... Indeed. However, 'ATi' still hasn't been dropped. AMD, unlike NVIDIA, actually keeps what it buys alive."
The only thing NVIDIA kept was SLI, when it was 3dfx that created that solution.
#19
btarunr
Editor & Senior Moderator
eidairaman1: "The only thing NVIDIA kept was SLI, when it was 3dfx that created that solution."
The "SLI" from 3dfx wasn't an abbreviation of Scalable Link Interface, so no, the name isn't a carry-forward of an old brand name.
#20
eidairaman1
The Exiled Airman
The initials stood for Scan-Line Interleave, which was the same concept; the only thing NVIDIA did was try to improve upon it.