# NVIDIA GeForce 8800GTX and GT Pictures



## malware (Sep 29, 2006)

Here's a link to the PC Club forums thread, which shows the first pictures of NVIDIA's latest GeForce 8800GTX and GeForce 8800GT video cards, based on the G80 GPU. You can clearly see how massive the new cards are going to be. The one with the water cooling is the GeForce 8800GTX; the other is the 8800GT version.

*View at TechPowerUp Main Site*


----------



## technicks (Sep 29, 2006)

Crappy looking.


----------



## Canuto (Sep 29, 2006)

Yeah but does that really matter?


----------



## pt (Sep 29, 2006)

If it doesn't have water cooling, I can't have an 8800GTX?


----------



## newtekie1 (Sep 29, 2006)

I assume/hope those are not the final heatsink designs.


----------



## Alec§taar (Sep 29, 2006)

I don't understand Kanji, so it is nothing but photographs to me...

I would like a technical readout of features & such (transistor count, etc.) on this, IF possible, in English... don't mean to be a nag, but this is an area I am VERY curious about in hardware!



* BIG Nvidia fanboy here, admittedly...

APK

P.S.=> Here's some of that, in case you guys cannot read/translate Kana, etc.:

http://www.nvnews.net/vbulletin/showthread.php?t=65217

&

http://forums.anandtech.com/messageview.cfm?catid=31&threadid=1930352

apk


----------



## technicks (Sep 29, 2006)

Canuto said:


> Yeah but does that really matter?



Sure it does.


----------



## giorgos th. (Sep 29, 2006)

technicks said:


> Crappy looking.



It's just the reference design. When the companies that will produce it make their final adjustments, I believe it will be a lot better.


----------



## zekrahminator (Sep 29, 2006)

malware said:


>


Dude, that thing has TWO PCI-E connectors and a liquid cooling line inside the cooler... doesn't bode well for thermal output. And it certainly doesn't bode well for my power bill, lol. So much for DX10 being efficient, lol.


----------



## SunShine (Sep 29, 2006)

This card is too long.


----------



## EastCoasthandle (Sep 29, 2006)

Wait, you guys see nothing wrong with two 6-pin PCI Express connectors, the "buzzer" at the end of the cards (Vista requirement, I'm sure), and the option to have both an HSF and what looks like a radiator water cooling solution using alu plating with 90-degree bends??? Come on, this thing looks like pure crap to me. I mean, come on, at least offer a brick power supply to offset the dual-PSU requirements. Sheesh, talk about inefficient and backwards.

Here is the possible AGP, SLI version


----------



## technicks (Sep 29, 2006)

Yeah, man. The card will stick out 2 or 3 cm past the length of an ATX motherboard.
I see some fitting problems.
And those double connectors are absolutely ridiculous.
Buy another PSU again :shadedshu


----------



## jocksteeluk (Sep 29, 2006)

All I can say is it's about time they moved on from the 7 series, but it is a shame they haven't bothered to do anything about the power consumption issues.


----------



## HaZe303 (Sep 29, 2006)

I'm a bit afraid that both NV and ATI will have humongous power requirements, and to me it seems they are taking the easy way out, instead of trying to invent something new: something that is powerful graphics-wise but doesn't triple your electricity bill. This is bad news for the world's environment. We are already murdering our "mother" Earth, so we really don't need even more stuff sucking down power...


----------



## pt (Sep 29, 2006)

I'm going to keep my X1800GTO for 2 or more years, so I'm safe.
And when I buy a DX10 card, I will buy a mid-budget one.


----------



## tofu (Sep 29, 2006)

All I can say is:

HOLY SH!T LOOK AT THE CARD LENGTH!!!

I'm waiting for the DX10 midrangers, probably the 8600GT. 

EDIT: I missed the *2x PCI-E CONNECTOR*


----------



## Azn Tr14dZ (Sep 29, 2006)

I'm probably going to go mid-range DX10...maybe SLI too. No need to go high-end for me.


----------



## Deleted member 24505 (Sep 29, 2006)

That'd fit in my case (Akasa Eclipse 62), noooooo problemo; I have 3-5" of tray past the end of my board. I'll squeeze that baby in.


----------



## pt (Sep 29, 2006)

tigger69 said:


> That'd fit in my case (Akasa Eclipse 62), noooooo problemo; I have 3-5" of tray past the end of my board. I'll squeeze that baby in.



Mine will probably fit; if not, it's modding time when I get it.


----------



## Deleted member 24505 (Sep 29, 2006)

* Unified Shader Architecture 
* Support FP16 HDR+MSAA 
* Support GDDR4 memory 
* Close to 700M transistors (G71 - 278M / G70 - 302M) 
* New AA mode : VCAA 
* Core clock scalable up to 1.5GHz 
* Shader Performance : 2x Pixel / 12x Vertex over G71 
* 8 TCPs & 128 stream processors 
* Much more efficient than traditional architecture 
* 384-bit memory interface (256-bit+128-bit) 
* 768MB memory size (512MB+256MB) 
* Two models at launch : GeForce 8800GTX and GeForce 8800GT 
* GeForce 8800GTX : 7 TCPs chip, 384-bit memory interface, hybrid water/fan cooler, water cooling for overclocking. US$649 
* GeForce 8800GT : 6 TCPs chip, 320-bit memory interface, fan cooler. US$449-499
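One thing worth noting about the list above: the split bus and memory figures add up consistently. A minimal sketch in Python, purely illustrative; the leak gives totals only, with no per-chip details:

```python
# Sanity check of the leaked memory figures: the "split" interface and
# memory sizes in the spec list are just the sums of two banks.

bus_bits = 256 + 128      # 256-bit + 128-bit interface
mem_mb   = 512 + 256      # 512MB + 256MB memory

print(bus_bits)  # 384 -- the "384-bit memory interface" in the list
print(mem_mb)    # 768 -- the "768MB memory size" in the list

# The sizes also scale together: 768MB / 384-bit == 512MB / 256-bit
# == 256MB / 128-bit == 2MB per bus bit, so both banks would use the
# same chip density.
assert mem_mb / bus_bits == 512 / 256 == 256 / 128
```

Nothing deep, but it shows the "256-bit+128-bit" and "512MB+256MB" notations are two views of the same 384-bit/768MB configuration rather than contradictory figures.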


----------



## pt (Sep 29, 2006)

Azn Tr14dZ said:


> I'm probably going to go mid-range DX10...maybe SLI too. No need to go high-end for me.



A couple of 8600GS cards would be great.


----------



## thedivinehairband (Sep 29, 2006)

The size of that thing is ridiculous, and the heatsink it looks to require, equally so. And a water cooler by default? That's just wrong. None of that for me, please. I think I'll chug along quite happily with my little X800XL for a while. 

Wonder what sort of state the R600 will be in when we finally get some pics of it. Reckon I'll go midrange like Azn and tofu suggested. Gonna have a look out when the RV6** series comes around. (Bum Shaka Laka, LOL)

Anyway, I have plans for my mobo, CPU and RAM before I go changing my graphics card.


----------



## DaJMasta (Sep 29, 2006)

zekrahminator said:


> Dude, that thing has TWO PCI-E connectors and a liquid cooling line inside the cooler... doesn't bode well for thermal output. And it certainly doesn't bode well for my power bill, lol. So much for DX10 being efficient, lol.



That's disgusting.

I really wanted a DX10 card, but if they're going to require 500-watt PSUs or better, I won't patronize them. If they follow the performance/power trend in the CPU industry, then I'll buy one.


----------



## overcast (Sep 30, 2006)

technicks said:


> Crappy looking.



Are you attending a fashion show?


----------



## Ketxxx (Sep 30, 2006)

It's big, it looks ugly, the cooling doesn't even look like it would be THAT effective without modding, and the power requirements are ridiculous. I'll have none of that, thanks. Let's see what ATI can offer.


----------



## Judas (Sep 30, 2006)

I think most cards we will see in the near future will all be power hungry. It's a bit long too; it might have issues with the Crosshair, since the mobo's power sockets are sideways. As for looks, it's upside down, so you won't see it... unless you have mirrors in the bottom of your case. Have to wait and see what ATI is coming up with.


----------



## POGE (Sep 30, 2006)

Biggar is bettar right?


----------



## Ketxxx (Sep 30, 2006)

Lol, no. I say ATI and Nvidia take a seat and work on proper solutions; in the meantime, programmers will just have to *shock, horror* actually program code and optimise it half decently for a change.


----------



## Ketxxx (Sep 30, 2006)

I know various programmers. It's not the programmers of game engines I'm criticising, but the developers of games utilising that particular game engine. The game engine itself is already highly optimised; it's the code that's based around it that isn't.


----------



## Canuto (Sep 30, 2006)

Top-end cards are just not worth it! 

People, look: an X1650 Pro and an X1950XTX have the same image quality. The XTX gives you 25%~75% more performance for +100% of the money and demands tons of juice/power.

We're much better off buying a mid-range card every year...


----------



## Ketxxx (Sep 30, 2006)

Logically, that's inaccurate. I agree that buying the very top-end cards is not worth it. However, buying a card that's one or two steps down the ladder is worth it: for example, the 6800GT over the 6800 Ultra, or a 7900GT over a 7900GTX.


----------



## Canuto (Sep 30, 2006)

Well, yeah, but I don't consider a 7900GT to be top-end; it's more like mid/top to me. Its price now is about €300 (not sure), and... you get my point: €500 cards are just not worth it.


----------



## Pheonix_789 (Oct 1, 2006)

I want to see the price and specs of an 8600GT. I hope it will be at least 256-bit.


----------



## SPHERE (Oct 1, 2006)

http://img80.imageshack.us/img80/6281/1159518316603in2.jpg

Ooo!! I like the look of that memory interface. Damn, look at that thing, lol; the memory chips encircle the core. It looks like they are 32-bit per channel too, like ATI's X1000 line.

Hmm, yeah, the card is a bit long, lol, but I've got long cases, so if I end up getting one it shouldn't be a problem.

An interesting thing I noticed is the soldering point for a bigger PCIe power connector... hmm, maybe that's what they will have on the card in the final version? (A PCIe 2.0 power connector, possibly?)



EastCoasthandle said:


> Wait, you guys see nothing wrong with two 6-pin PCI Express connectors, the "buzzer" at the end of the cards (Vista requirement, I'm sure), and the option to have both an HSF and what looks like a radiator water cooling solution using alu plating with 90-degree bends??? Come on, this thing looks like pure crap to me. I mean, come on, at least offer a brick power supply to offset the dual-PSU requirements. Sheesh, talk about inefficient and backwards.
> 
> Here is the possible AGP, SLI version


Rofl, nice Photoshop, lol.


----------



## KennyT772 (Oct 2, 2006)

No, there's simply a 128-bit RAM buffer and then 256-bit VRAM. That way games can load the next level's textures into the secondary RAM and cut loading times.


----------



## SPHERE (Oct 2, 2006)

KennyT772 said:


> No, there's simply a 128-bit RAM buffer and then 256-bit VRAM. That way games can load the next level's textures into the secondary RAM and cut loading times.


Where did you read this?


----------



## KennyT772 (Oct 2, 2006)

Between the lines.

It's the most logical answer, as the next step for a RAM bus would be 512-bit. Being 256+128 means master and secondary RAM banks. I bet 8 chips are on the 256-bit bus and 4 on the 128-bit bus. It's also most logical to use the 256-bit bus for active textures, as it has higher performance. It's also another logical idea to use the secondary RAM bank for prefetching textures and geometry files, as that is one major hindrance of today's cards. No game for well over 2 years will be able to fully use 768MB of RAM on the card, so Nvidia will make some creative use for it.


----------



## SPHERE (Oct 2, 2006)

KennyT772 said:


> Between the lines.
> 
> It's the most logical answer, as the next step for a RAM bus would be 512-bit. Being 256+128 means master and secondary RAM banks. I bet 8 chips are on the 256-bit bus and 4 on the 128-bit bus. It's also most logical to use the 256-bit bus for active textures, as it has higher performance. It's also another logical idea to use the secondary RAM bank for prefetching textures and geometry files, as that is one major hindrance of today's cards. No game for well over 2 years will be able to fully use 768MB of RAM on the card, so Nvidia will make some creative use for it.


Nah, that wouldn't make sense. See, here is how I see it: they now have 768MB of 384-bit memory that they dynamically partition off however they want to arrange it.

It's like having 12 HDDs in RAID 0: don't make 2 arrays and limit your options; make one and partition it however you want.

There are applications now that can put that much memory to use, like 3ds Max, etc.

The reason for that much memory, and it being 384-bit, is because there isn't enough room for more than that bus width, and they are using 512Mbit chips. I'm sure if they could find the room they would make it 512-bit with 256Mbit chips, but it would require a PCB with a hell of a lot of layers.

BTW, if you look at the back, it seems they are now using 32-bit-per-channel memory interfaces instead of 64-bit per channel like on previous generations, which takes more PCB room but also performs better.

??
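SPHERE's chip arithmetic can be worked through directly. A quick sketch under the post's own assumptions (512Mbit chips and a 32-bit channel per chip are figures from the thread, not confirmed specs):

```python
# Derive chip count and total capacity from a total bus width,
# assuming one 32-bit channel per memory chip.

CHANNEL_BITS = 32  # 32-bit interface per chip, per SPHERE's observation

def layout(total_bus_bits, chip_mbit=512):
    """Return (chip_count, capacity_mb) for a given total bus width."""
    chips = total_bus_bits // CHANNEL_BITS
    capacity_mb = chips * chip_mbit // 8   # Mbit per chip -> MB total
    return chips, capacity_mb

# The rumored 384-bit card with 512Mbit chips:
print(layout(384))       # (12, 768) -- 12 chips ringing the core, 768MB

# The hypothetical 512-bit / 256Mbit alternative SPHERE mentions:
print(layout(512, 256))  # (16, 512) -- more chips, more PCB layers
```

This also covers KennyT772's guess earlier in the thread: 8 chips on a 256-bit bank plus 4 on a 128-bit bank is the same 12 x 32-bit arrangement, just described as two banks instead of one.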


----------



## KennyT772 (Oct 2, 2006)

Dude... it says right on the Nvidia-released spec sheet that there are 2 separate buses.

What in the hell are you talking about with the RAM bit size?


----------



## SPHERE (Oct 2, 2006)

KennyT772 said:


> Dude... it says right on the Nvidia-released spec sheet that there are 2 separate buses.
> 
> What in the hell are you talking about with the RAM bit size?


Yeah, I saw that "128+256". It makes no sense to have 2 banks on a card; you don't see systems with one bank for Windows and one bank for programs.

Even if your interpretation is correct, that is dumb, because it would be slower than just using a 384-bit bus for both. BTW, in your previous post you talked about how it loads textures and stuff into this RAM buffer you are theorizing; well, that's what a texture buffer is, and that is what is on your video card right now.

It's worded as 128-bit + 256-bit because they are added. It's just a weird way of describing it that is easily misinterpreted, kind of like you could describe the new Xeon chipsets as being 128 + 128-bit, when in reality it's a 256-bit bus.

When referring to bit size, I was talking about the bus width, aka 128-bit, 256-bit, etc.

This will be my last post on this topic; I just don't feel like debating, lol. Nothing personal, I hate internet debates.


----------



## Solaris17 (Oct 2, 2006)

Ketxxx said:


> lol, no. i say ati and nvidia take a seat and work on proper solutions, in the meantime programmers whill just have to *shock, horror* actually program code and optimise it half decently for a change.




Or **OMG THRILLER** release a piece of hardware that has driver support!!!! OMG, I know, imagine that?! Wow.


----------



## vexd (Oct 15, 2006)

technicks said:


> Crappy looking.



Rofl, my video card could literally look like a piece of s**t and I wouldn't care; it's what it does that counts ^^


----------

