HellasVagabond
People over at Hardspell today got hold of an internal NVIDIA PowerPoint slide comparing the CPU usage of high-end 8800- and 2900-series graphics cards while decoding H.264 HD video. The entire test was run on a Core 2 Duo processor. The blue bars show CPU usage with decoding done by the CPU alone, the red bars show usage with a 2900 installed, and the green bars with an 8800GT installed. According to the slide, the 8800GT supports H.264 hardware decoding while the 2900XT and 2900Pro do not. A good move from NVIDIA to present a cheap 8800-series based solution and name it the 8800GT.
View at TechPowerUp Main Site