Tuesday, December 29th 2009

NVIDIA Fermi-based GeForce GPU Further Delayed?
NVIDIA's next-generation GeForce GPU based on the Fermi architecture is reportedly further delayed to March 2010, pushed back from its originally expected time-frame of January. NVIDIA, for its part, has maintained that Fermi-based GeForce GPUs will be released sometime in Q1 2010, and with a March launch that would still hold true.
Fermi's development history is marked by late arrivals. The DirectX 11 compliant architecture was announced in October 2009 to counter the already market-available DirectX 11 compliant ATI Radeon HD 5800 GPUs. Then in mid-November, the company released the first products based on the architecture - GPGPU accelerators under the NVIDIA Tesla HPC banner. An alleged working prototype GeForce accelerator was spotted around the same time, with word going around that NVIDIA would be ready with the new GeForce GPU in early Q1, probably coinciding with CES. Faced with further delays, NVIDIA has reportedly notified its partners that the new GPUs will be released to the market only in March.
NVIDIA plans to launch the 40 nm, DirectX 11 compliant Fermi-GF100 GPU with GDDR5 memory support in March, followed by a GF104 version. Until then, the mainstream-through-performance segments will be defended by the GeForce GTS 250, GT 240, GT 220, 210, and 9800 GT against a fortified mainstream lineup from AMD consisting of the ATI Radeon HD 5670/5650 (codenamed "Redwood") and ATI Radeon HD 5450 (codenamed "Cedar"). These DirectX 11 compliant GPUs from AMD will be released in January.
Source: DigiTimes
136 Comments on NVIDIA Fermi-based GeForce GPU Further Delayed?
2010 doesn't sound that far away anymore, since it'll be in a few hours ;)
I'll just say it: NVIDIA made a bad choice focusing on Tesla. They are expecting to find a huge market, but they are neglecting an existing huge market in order to break into another. Because only corporations/governments looking to buy supercomputers would even look at Tesla, I think they overestimate how much money there is to be had in that segment.
That said, they know that pushing further into the server market will be the only way to "stay alive".
Cloud computing is more focused on small multithreaded jobs that can be reassembled out of order, with little impact on the rest of the job or on the continuation of its other parts. Think Folding@home, SETI@home, climate modeling, etc. (see the sketch after the next paragraph).
Cloud computing will not have the bandwidth, thread handling, or tasking available in the next few years to be a viable force for driving a real-time 3D game. Look at DX11: it is an exercise in seeing how far we can take a single chip/AIB to compute multithreaded applications natively, and even with its huge bandwidth and power we still fail to fully use the core.
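To illustrate what I mean by "small jobs reassembled out of order": here is a minimal Python sketch (purely illustrative, not from any real project) of independent work units dispatched to a pool, collected in whatever order they finish, and reassembled by ID afterward. This is the embarrassingly parallel pattern that distributed projects like Folding@home exploit, and exactly what a real-time frame pipeline cannot tolerate.

# A minimal sketch of out-of-order work units (names are hypothetical).
from concurrent.futures import ProcessPoolExecutor, as_completed

def simulate_work_unit(unit_id: int) -> tuple[int, float]:
    # Stand-in for an expensive, independent computation, e.g. one
    # folding trajectory segment or one radio-telescope data block.
    result = sum(i * i for i in range(unit_id * 1000, (unit_id + 1) * 1000))
    return unit_id, float(result)

def main() -> None:
    results = {}
    with ProcessPoolExecutor() as pool:
        futures = [pool.submit(simulate_work_unit, uid) for uid in range(16)]
        # Units complete in whatever order the workers get to them...
        for future in as_completed(futures):
            unit_id, value = future.result()
            results[unit_id] = value
    # ...and are reassembled by unit ID at the end; arrival order never mattered.
    ordered = [results[uid] for uid in sorted(results)]
    print(ordered[:4])

if __name__ == "__main__":
    main()

A game frame, by contrast, needs every "unit" back within milliseconds and in strict order, which is why the same model doesn't transfer.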
Fermi will be dead in 10 years (long replaced by something else). Cloud computing won't see much use for 10 years. Looks like a bad decision to me.
It is mostly Google and Intel pushing the idea of cloud computing. Google wants your information and Intel wants corporations to buy the hardware because they have deeper pockets (8-way, 8-core Xeon platforms, anyone?). There is little benefit here for the consumer.