Wednesday, July 2nd 2008
Intel Downplays the Growing Popularity of NVIDIA CUDA
Pat Gelsinger, co-general manager of Intel's Digital Enterprise Group, told Custom PC that NVIDIA's CUDA programming model will be nothing more than an interesting footnote in the annals of computing history.
Gelsinger says that programmers simply don't have enough time to learn how to program for new architectures like CUDA. In his words: "The problem that we've seen over and over and over again in the computing industry is that there's a cool new idea, and it promises a 10x or 20x performance improvement, but you've just got to go through this little orifice called a new programming model. Those orifices have always been insurmountable as long as the general purpose computing models evolve into the future." He points to the Sony Cell processor, which failed to live up to its hype as something superior to existing computing architectures, as proof of his point.
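To make the quote concrete, here is a minimal, illustrative sketch of what that "new programming model" looks like in practice: a CUDA vector add. The thread hierarchy, the __global__ qualifier, and the explicit host-to-device copies are exactly the kinds of concepts a C programmer has to learn first; the code below is a hypothetical example, not anything from the article.

// Minimal CUDA sketch (illustrative; names like vecAdd are hypothetical).
#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread computes one element of the result.
__global__ void vecAdd(const float* a, const float* b, float* c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main()
{
    const int n = 1 << 20;
    size_t bytes = n * sizeof(float);

    // Host buffers.
    float* ha = (float*)malloc(bytes);
    float* hb = (float*)malloc(bytes);
    float* hc = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device buffers and explicit copies -- extra steps an ordinary
    // x86 program never needed.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch: grid of 256-thread blocks covering all n elements.
    vecAdd<<<(n + 255) / 256, 256>>>(da, db, dc, n);
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);

    printf("c[0] = %f\n", hc[0]);  // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}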
Gelsinger adds that Intel's Larrabee graphics chip will be based entirely on Intel Architecture x86 cores, precisely so that developers can program the graphics processor without having to learn a new language. Larrabee will have full support for APIs like DirectX and OpenGL.
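For contrast, the same vector add on an x86 core is just an ordinary loop. This is a sketch of the kind of familiar code Intel's argument leans on, not actual Larrabee code:

// The same operation as plain host-side C++ -- no new programming
// model required (illustrative sketch only).
void vecAddHost(const float* a, const float* b, float* c, int n)
{
    for (int i = 0; i < n; ++i)
        c[i] = a[i] + b[i];
}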
Source:
Custom PC
20 Comments on Intel Downplays the Growing Popularity of NVIDIA CUDA
jk
With CUDA (or Physx for that matter) you need a huge shift in how you think/develop, plus you need to learn the new programming model.
That Larrabee chip looks really plain, maybe we'll be able to play doom on it :p
Number crunching is number crunching, and if someday GPUs can crunch numbers as well as a processor, that might be a problem for them.
I for one am interested in seeing what Intel does with their Larrabee cards and this ray tracing they are so fond of.
Dunno if it's possible, law wise, though!
*Begins to drool* ... :rockout:
They may well be right, but the current systems came about because people were willing to make them work. Both ATI and NVIDIA are willing to try to make theirs work.