DarkMatter
New Member
- Joined: Oct 5, 2007
- Messages: 1,714 (0.27/day)
Component | Details
---|---
Processor | Intel C2Q Q6600 @ stock (for now)
Motherboard | Asus P5Q-E
Cooling | CPU: Scythe Mine; Graphics: Zalman VF900 Cu
Memory | 4 GB (2x2GB) DDR2 Corsair Dominator 1066MHz 5-5-5-15
Video Card(s) | Gigabyte 8800GT, stock clocks: 700MHz core, 1700 shader, 1940 memory
Storage | 74 GB WD Raptor 10,000rpm; 2x250 GB Seagate RAID 0
Display(s) | HP p1130, 21" Trinitron
Case | Antec P180
Audio Device(s) | Creative X-Fi Platinum
Power Supply | 700W FSP Group, 85% efficiency
Software | Windows XP
Explain the logic behind that. You're only driving manufacturing prices up by doing that; ever heard of multi-layered PCBs, man?...
HD4850 > 9800GTX by 25% according to AMD; this is fairly believable.
A dual GTX280 is technically impossible within two slots. Why? Because going from 65nm to 55nm doesn't bring much of a change in TDP! Nvidia's CEO even admitted it; do I have to repeat this? A GX2 would be viable with, say, a GT200 variant similar to the G92 in die size. It was mentioned that a die shrink would only drop the GTX280's heat output to around 200W, which is still ridiculously high (400W+ for a GX2). Who cares about idle when the card is ridiculously hot at load?
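Do the math yourself, assuming the standard PCIe power budget (75W from the slot, 75W from a 6-pin and 150W from an 8-pin connector, per the PCIe spec):

$$2 \times 236\,\text{W} = 472\,\text{W}, \qquad 2 \times 200\,\text{W} = 400\,\text{W} \;>\; 75 + 75 + 150 = 300\,\text{W}$$

Either way you're far past the 300W a single board is allowed to draw.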
Nvidia really shot themselves in the foot. Powerful as the GTX280 is, the HD4870X2 will be the more successful product.
Could you post a link to where Nvidia's CEO said that, please? And to where those power numbers were mentioned, though I suppose it's the same source. I find it hard to believe that going to 55nm wouldn't bring the card under 200W.
Also, as I mentioned, Nvidia doesn't need two 280s to crush Ati's X2, not even two 260s. Merely shrinking the chip to 55nm would bring it to around 400mm²; take some ROPs out and you get a die size close to G92's. No one has said a GT200 GX2 is possible, but a GT200b one IS, and you will see it soon if Ati's X2 happens to be quite a bit faster than the GTX280.
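A rough sketch of that die-size math, assuming GT200's commonly cited ~576mm² at 65nm and an ideal shrink (area scales with the square of the process node):

$$576\,\text{mm}^2 \times \left(\tfrac{55}{65}\right)^2 \approx 576 \times 0.716 \approx 412\,\text{mm}^2$$

Still bigger than G92's ~324mm², which is where cutting ROPs would close the remaining gap.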
Also, the real power consumption of the GTX280 is nowhere near those 236W, while the older cards sit close to their claimed TDPs. Its temperatures are far better than the G92's and RV670's too, despite the chip being a lot bigger, so there's some headroom left there. If GT200b can't push performance beyond that of the X2, a GX2 of GT200b WILL come, but its nature is not so defined yet. In fact, a card with 2x the performance of the GTX280 doesn't make sense AT ALL. If it did, because games in the near future could take advantage of it, then Ati would be LOST.
In the end it will all depend on the real performance of the RV770. AFAIK HD4870 > 9800 GTX by 25% and HD4850 > 8800 GT by 25%. That also means HD4850 > 9800 GTX, but only by 5-10%. ANYWAY, forget about all that if the performance boost from the newer drivers happens to be true.
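To spell out where that 5-10% comes from, assuming the 9800 GTX is roughly 15-20% faster than the 8800 GT (my assumption; reviews vary):

$$\frac{1.25}{1.20} \approx 1.04 \quad\text{to}\quad \frac{1.25}{1.15} \approx 1.09$$

so the HD4850 would land roughly 4-9% ahead of the 9800 GTX, in line with the 5-10% above.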