Saturday, September 27th 2008
Zotac 9800 GTX+ Accelero Twin Turbo Model Spotted
Here's the third non-reference GeForce 9800 GTX+ model in the works. Zotac has prepared a graphics card based on the GeForce 9800 GTX+ GPU with its own PCB design. For cooling, it has called on the expertise of Arctic Cooling, fitting an Accelero Twin Turbo cooler to the GPU. Zotac may have reworked aspects of the card's power design and memory, keeping in mind that the 9800 GTX+ uses a 55 nm GPU, which should draw less power than its 65 nm counterpart at the same clock speeds. The card runs at clock speeds of 740/1836/2200 MHz (core/shader/memory); a rough sense of what those clocks translate to is sketched after the list below. Notable PCB features include:
- 0.8 ns GDDR3 memory chips made by Samsung
- A 4+2 phase power design
- Power input reduced to a single 6+2 pin PCI-E power connector

Source: Expreview
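As a back-of-the-envelope check (not from the source article; it assumes the standard G92 configuration of 128 shader processors and a 256-bit memory bus), the listed clocks work out to roughly 470 GFLOPS of MADD-only shader throughput, 705 GFLOPS counting the extra MUL, and about 70 GB/s of memory bandwidth:

```python
# Rough throughput figures for the listed 740/1836/2200 MHz clocks.
# Assumes standard G92 specs (128 SPs, 256-bit bus); not taken from the article.

shader_clock_mhz = 1836      # shader domain clock
num_sps = 128                # shader processors on a 9800 GTX+ (G92)
mem_clock_mhz = 2200         # effective GDDR3 data rate
bus_width_bits = 256         # memory bus width

# MADD = 2 FLOPs per SP per clock; MADD + MUL = 3 FLOPs per SP per clock
gflops_madd = num_sps * shader_clock_mhz * 2 / 1000
gflops_madd_mul = num_sps * shader_clock_mhz * 3 / 1000

# Bandwidth = data rate * bus width in bytes
bandwidth_gbs = mem_clock_mhz * 1e6 * bus_width_bits / 8 / 1e9

print(f"{gflops_madd:.0f} GFLOPS (MADD only)")    # ~470 GFLOPS
print(f"{gflops_madd_mul:.0f} GFLOPS (MADD+MUL)") # ~705 GFLOPS
print(f"{bandwidth_gbs:.1f} GB/s")                # ~70.4 GB/s
```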
34 Comments on Zotac 9800 GTX+ Accelero Twin Turbo Model Spotted
Both cards are similarly priced in the majority of the world, and performance is a toss-up depending on the game. Additionally, Nvidia has CUDA and PhysX right now, something that MANY people (myself included) would choose over any small performance advantage. Add to that that when overclocked, the performance edge goes to Nvidia. The truth is that, as things stand, the 9800 GTX+ is as appealing as, if not much more appealing than, the HD 4850, so there is a place for both in the market.
Everything else is pure fanboyism.
PS: Any price argument falls apart when you consider my point about PhysX and look at Shadowfold's specs. An HD 4850 + 8400 GS costs quite a bit more than a 9800 GTX+ everywhere.
EDIT: After a quick search through the many Australian e-tailers that came up on Google, I must say that what you said is simply FALSE. The cheapest HD 4850 I could find was AU$210, with the average being around AU$240. The cheapest GTX+ was AU$220, with the average around AU$245, BUT the cheapest HD 4870 I could find was AU$315. So no, no, no, mate.
If a game used the full shader power of an 8400 GS for PhysX (roughly 29 GFLOPS, against the 9800 GTX+'s 470 GFLOPS counting MADD only):
470 - 29 = 441
441 / 470 × 100 ≈ 94%
Basically, if you were getting 30 fps you would get 28 fps (60 fps vs. 56 fps) IN THE WORST CASE SCENARIO, which would probably never happen, because as I said, games are not bottlenecked by the shaders nowadays.
Additionally, if the game used twice the shader power of the GS for physics, the HD 4850 setup would lag badly, whereas the GTX+ would still get 27 fps. Funny, eh?
If some of the extra SP throughput can effectively be used, up to the full 705 GFLOPS, things get even better for the GTX+.
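For anyone who wants to check the arithmetic, here's a minimal sketch of that worst-case estimate, assuming (as the post above does) that frame rate scales linearly with the shader throughput left over after PhysX takes its share; real games are rarely shader-bound like this:

```python
# Worst-case estimate: subtract the PhysX workload's shader cost from the
# 9800 GTX+ and assume frame rate scales linearly with what remains.
# The linear-scaling simplification is the same one used in the post above.

GTX_PLUS_GFLOPS = 470   # 9800 GTX+ MADD-only shader throughput
GS_8400_GFLOPS = 29     # 8400 GS MADD-only shader throughput

def worst_case_fps(base_fps, physx_load_gflops):
    """Frame rate if `physx_load_gflops` of shader power is diverted to PhysX."""
    remaining = GTX_PLUS_GFLOPS - physx_load_gflops
    return base_fps * remaining / GTX_PLUS_GFLOPS

print(worst_case_fps(30, GS_8400_GFLOPS))      # ~28.1 fps
print(worst_case_fps(60, GS_8400_GFLOPS))      # ~56.3 fps
print(worst_case_fps(30, 2 * GS_8400_GFLOPS))  # ~26.3 fps, roughly the "27 fps" figure
```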
Basically, the "you would lose performance" claim is a myth, mostly spread by ATI fans.
Additionally, a GT has the same number of ROPs but fewer SPs, so the GTX+ has much more "spare" processing power (compared to what an 8400 GS offers) than a GT does.
PS: Maybe it was just me, but the GRAW 2 demo ran like crap. It was an unoptimized POS that I personally wouldn't use as proof of anything but mediocrity. LOL, sorry, I had a really bad experience with it. I didn't try it with PhysX enabled, though.
And the frame rate drop from PhysX acceleration is negligible if you're already getting decent fps. If your fps is so low that you need those extra 5 frames back from PhysX, then you need more power anyway.
edit (for the guy below :)): I want two of these, specifically, if they don't charge an arm and a leg for the cooler.
The 9800 GTX+ wouldn't have problems running PhysX; I can run it on my 8800 GT without noticeable frame rate drops.