Tuesday, June 17th 2008
ATI Believes GeForce GTX 200 Will be NVIDIA's Last Monolithic GPU.
The head of ATI Technologies claims that the recently introduced NVIDIA GeForce GTX 200 GPU will be the last monolithic "megachip," because such chips are simply too expensive to manufacture. The statement was made after NVIDIA executives vowed to keep producing large single-chip GPUs. The G200 GPU measures about 600 mm², which means only about 97 dies fit on a 300 mm wafer that costs thousands of dollars. Earlier this year NVIDIA's chief scientist said that AMD is unable to develop a large monolithic graphics processor due to a lack of resources. Mr. Bergman, however, said that smaller chips are also easier to adapt for mobile computers.
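For context, a common back-of-the-envelope die-per-wafer estimate (added here for illustration, not a figure from the X-bit Labs report) is

    dies ≈ π · (d/2)² / S − π · d / √(2 · S)

where d is the wafer diameter and S the die area. With d = 300 mm and S ≈ 600 mm², the first term gives about 118 gross candidates and the second subtracts roughly 27 lost at the wafer edge, landing around 90; a slightly smaller die or a tighter edge exclusion pushes that toward the ~97 quoted above. Either way, fewer than a hundred salvageable dies per multi-thousand-dollar wafer is the heart of ATI's cost argument.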
Source:
X-bit Labs
116 Comments on ATI Believes GeForce GTX 200 Will be NVIDIA's Last Monolithic GPU.
But as for the discussion at hand, the GTX280 is like my 2900XT in that it puts out a lot of heat, uses a lot of energy, is expensive to produce, and has to have a big cooler on it.
But as for the specs, I said it before: the GTX280 is exactly what we all hoped it would be spec-wise.
It's powerful, that's for sure. But AMD are saying that NVIDIA are being suicidal by keeping everything in one core, and I have to agree with that logic. According to Tweaktown, two HD4850s spank a GTX280, and those are the mid-range HD4850s, not the higher-end HD4870. The HD4850 is already faster than a 9800GTX.
Now if you consider AMD putting the performance of two HD4850s/HD4870s into ONE card, what AMD is saying suddenly makes sense.
If physics and lighting were moved from the CPU to the GPU, that bottleneck is gone from the CPU, and the GPU can handle the work at least 200x faster than the fastest quad core, even while running the game at the same time. This in turn allows for better, more realistic effects. Remember the Alan Wake demo at IDF with those great physics? Here's the thing: it stuttered. If CUDA were used instead it would get a lot more FPS. The reason we don't see heavier, more realistic physics is the lack of raw horsepower. If CUDA is used the way NVIDIA hopes, games may not run any faster, but the level of realism can increase greatly, which would sway more than one consumer.
If it gets 100 FPS and uses large transparent textures for dust, that's great.
If it gets 100 FPS but draws each grain of dirt as its own pixel, that's even better.
Which would you get? Even with the price difference, I'd go for the real pixel dirt.
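To make the grain-of-dirt idea concrete, here is a minimal sketch of the kind of work CUDA offload means, assuming the particles already sit in GPU memory (the struct and function names are my own illustration, not from any game or NVIDIA sample): one thread integrates one particle per frame, so hundreds of thousands of dirt grains can be simulated without the CPU touching them.

// Hypothetical illustration of per-particle physics on the GPU.
#include <cuda_runtime.h>

struct Particle {
    float3 pos;   // world-space position
    float3 vel;   // velocity
};

// One thread advances one particle by a single timestep dt.
__global__ void stepParticles(Particle* p, int n, float dt)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;

    const float g = -9.81f;          // gravity along y

    p[i].vel.y += g * dt;            // simple Euler integration
    p[i].pos.x += p[i].vel.x * dt;
    p[i].pos.y += p[i].vel.y * dt;
    p[i].pos.z += p[i].vel.z * dt;

    if (p[i].pos.y < 0.0f) {         // crude ground plane so the dirt settles
        p[i].pos.y = 0.0f;
        p[i].vel.y *= -0.3f;         // lose most energy on the bounce
    }
}

// Host side: launched once per frame; d_particles points to device memory.
void stepOnGpu(Particle* d_particles, int n, float dt)
{
    int threads = 256;
    int blocks  = (n + threads - 1) / threads;
    stepParticles<<<blocks, threads>>>(d_particles, n, dt);
}

A real engine would add inter-particle collisions and drag, but even this toy version shows why the argument is about realism rather than frame rate: the extra work spreads across the GPU's hundreds of stream processors instead of eating into the CPU's game loop.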
Because otherwise, just because temperatures are not higher doesn't mean the chip is not putting out more heat and consuming more power. Temperature depends on how heat is transferred away, and for a CPU that means transfer across surfaces: more cache = more die surface area = more heat transfer = lower temperatures at the same heat output.
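A rough way to put numbers on that (a simplified steady-state model added for illustration, not anything from the thread): if the die dissipates power P through an effective heat-transfer coefficient h over die area A, then roughly

    ΔT ≈ P / (h · A)

so at the same power P, a larger die area A (extra cache, for example) gives a smaller temperature rise above ambient, even though the total heat dumped into the cooler is unchanged.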
That was one reason. The other, a lot simpler, is: wasn't the 5000+ a 5200+ with half the cache "disabled"? In quotes because most of the time they can't cut all the power to the disabled part.