Pat Gelsinger, co-general manager of Intel's Digital Enterprise Group, told Custom PC that NVIDIA's CUDA programming model would be nothing more than an interesting footnote in the annals of computing history.
Gelsinger says that programmers simply don't have enough time to learn how to program for new architectures like CUDA. In his words: "The problem that we've seen over and over and over again in the computing industry is that there's a cool new idea, and it promises a 10x or 20x performance improvement, but you've just got to go through this little orifice called a new programming model. Those orifices have always been insurmountable as long as the general purpose computing models evolve into the future." The Sony CELL, which failed to live up to its hype as something superior to existing computing architectures, proves his point.
Gelsinger says that Intel's Larrabee graphics chip will be based entirely on Intel Architecture x86 cores, precisely so that developers can program the graphics processor without having to learn a new language. Larrabee will have full support for APIs like DirectX and OpenGL.
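For readers wondering what the "new programming model" Gelsinger refers to actually looks like, here is a minimal sketch (not from the article) comparing an ordinary C loop with the equivalent CUDA version. The function names, sizes, and launch parameters are illustrative assumptions; the device memory management and the <<<blocks, threads>>> launch syntax are the extra steps CUDA adds on top of familiar x86-style code.

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Plain C/x86 version: the loop a developer already knows how to write.
void vector_add_cpu(const float* a, const float* b, float* c, int n) {
    for (int i = 0; i < n; ++i)
        c[i] = a[i] + b[i];
}

// CUDA version: the same arithmetic, expressed as a kernel that runs once
// per thread -- the different programming model Gelsinger is talking about.
__global__ void vector_add_gpu(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;               // illustrative problem size
    const size_t bytes = n * sizeof(float);

    // Host buffers.
    float* a = (float*)malloc(bytes);
    float* b = (float*)malloc(bytes);
    float* c = (float*)malloc(bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Device buffers and explicit copies -- steps the CPU path never needs.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes); cudaMalloc(&db, bytes); cudaMalloc(&dc, bytes);
    cudaMemcpy(da, a, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, b, bytes, cudaMemcpyHostToDevice);

    // Launch: 256 threads per block, enough blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    vector_add_gpu<<<blocks, threads>>>(da, db, dc, n);
    cudaMemcpy(c, dc, bytes, cudaMemcpyDeviceToHost);

    printf("c[0] = %f\n", c[0]);          // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(a); free(b); free(c);
    return 0;
}
```

Whether that extra ceremony is an insurmountable "little orifice" or a manageable cost for a 10x-20x speedup is exactly the point of disagreement between Intel and NVIDIA here.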
View at TechPowerUp Main Site