Wednesday, October 3rd 2007
![NVIDIA](https://tpucdn.com/images/news/nvidia-v1738672025795.png)
NVIDIA GeForce 8800GT Pictured
NVIDIA's latest GeForce 8800GT (G92) video card will be clocked at 600MHz core and 1.8GHz memory, and will feature single slot cooling, at least according to the leaked picture. The reference 3DMark06 score provided by NVIDIA is 10769 marks. The launch date is said to be brought forward to October 29th.
Sources:
VR-Zone, mobile1
50 Comments on NVIDIA GeForce 8800GT Pictured
It's good to see the big boys are fighting again, rather than the 8800s dominating
HD2600xt or 8800GT ? Hmmmm ......
Looks very promising, but like already stated, we'll have to see what it really is when they're released. I love the Mid-range competition season, it seems this one was off to a lazy start, but now it's kicking into gear. I know some consider the 8600's and 2600's mid range, and they are to an extent, but not the mid-range I wanted to have.
I can see it now, TPU will be getting flogged with mass 8800GT vs 2950PRO/2900Pro threads...I will pick the one that gives me the most for my money in the end.
:toast:
Nvidia has decided to change the marketing name of G92 (D8P) from GeForce 8700 GTS to GeForce 8800 GT instead. The 8800 GT card is 9 inches long, has an 8-10 layer PCB and is equipped with 1ns GDDR3 memory. There will be 2 SKUs, GeForce 8800 GT 256MB and GeForce 8800 GT 512MB, priced at US$199 and US$249 respectively. Performance lies between the GeForce 8600 GTS and the 8800 GTS 640MB. However, knowing that RV670 is doing better than expected, Nvidia is now trying to bump up the clock speed of G92 and may push its launch schedule forward to early November to be ahead of RV670.
www.vr-zone.com/articles/GeForce_8800_GT_Overclocking_Is_Good/5312.html
We heard the overclocking headroom for the GeForce 8800 GT is pretty impressive, mainly due to the finer 65nm process technology. However, Nvidia has purposely limited the core clock to 600MHz so its performance won't get too close to the GeForce 8800 GTS and hurt its sales. Users can still choose to overclock the core, and we should be seeing more than 700MHz easily. If Radeon HD 2950PRO performance comes too close for comfort, Nvidia might increase the core clock further.
New GTS Sku?
www.vr-zone.com/articles/Nvidia_Plans_New_GeForce_8800_GTS_SKU/5311.html
ROPs (x clock) set the maximum pixel fillrate, while the SPs determine the maximum transformed, textured, and shaded pixel fillrate. Somewhere in between are the TMUs, which are by far more important than ROPs. Just take a look at these examples: 6800 Ultra vs. 7800GT (6400 MP/s both, texel fillrate 6400 vs. 8000), or 7900GS (7200 MP/s, 9000 MT/s) vs. 7900GT (7200 MP/s, 10800 MT/s). Even the 7600GT (4480 MP/s, 6720 MT/s) beats the 6800 Ultra in newer or more complex games.
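Those fillrate figures fall straight out of unit counts times core clock; here's a quick sketch that reproduces them (the ROP/TMU counts and clocks below are taken from public spec sheets and are assumptions of this sketch, not from the post itself):

```python
# Pixel fillrate = ROPs x core clock; texel fillrate = TMUs x core clock.
# Unit counts and core clocks (MHz) assumed from published spec sheets.
cards = {
    # name: (ROPs, TMUs, core clock in MHz)
    "6800 Ultra": (16, 16, 400),
    "7800 GT":    (16, 20, 400),
    "7900 GS":    (16, 20, 450),
    "7900 GT":    (16, 24, 450),
    "7600 GT":    (8, 12, 560),
}

for name, (rops, tmus, clock) in cards.items():
    pixel_fill = rops * clock  # MP/s
    texel_fill = tmus * clock  # MT/s
    print(f"{name}: {pixel_fill} MP/s, {texel_fill} MT/s")
```

Run it and you get exactly the numbers quoted above: the 6800 Ultra and 7800GT tie at 6400 MP/s but differ at 6400 vs. 8000 MT/s, which is why the TMU column matters more than the ROP column.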
SPs are also important, as I'll try to explain with this example. Let's say you want all the objects in the game to use a shader which uses the following:
-a color texture (diffuse)
-bump mapping
-specular highlights (so some spots of the surface shine more than others)
-some kind of transparency
To achieve these 4 effects you need to perform at least 4 operations per pixel, since rendering works by altering the pixel color itself. This is done by the SPs.
You also need the TMUs to map the required textures, but usually (until now) the same texture is used for more than one effect. So a hypothetical SP/TMU/ROP balance for this case could be 4/2/1.
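The 4/2/1 balance can be tallied mechanically. Here's a sketch of the counting; the specific texture-sharing choices (specular reusing the bump map, transparency reusing the diffuse texture's alpha) are my assumption, picked only to match the poster's example:

```python
# Per-pixel work for the example shader: each effect needs a shader (SP) op,
# but several effects can sample the same texture, so TMU work is lower.
effects = {
    # effect: texture it samples (assumed sharing, for illustration)
    "diffuse color": "color_map",
    "bump mapping":  "normal_map",
    "specular":      "normal_map",  # reuses the bump map's texture
    "transparency":  "color_map",   # reuses the diffuse texture's alpha
}

sp_ops = len(effects)                     # one shader op per effect
tmu_fetches = len(set(effects.values()))  # unique textures sampled
rop_writes = 1                            # one final pixel write

print(f"SP/TMU/ROP balance: {sp_ops}/{tmu_fetches}/{rop_writes}")
```

Four shader ops, two unique texture fetches, one pixel write: 4/2/1, which is why shader-heavy games lean on SPs long before ROPs become the limit.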
Basically you can only take pixel fillrate as a power measurement when you play at really high resolutions, above 1920x1200, and at the same time don't use AA/AF, because nowadays that is also done at the shader level (to some extent) and the SPs would become the bottleneck again.
AddSub, I think I get your point though. If you mean you don't want this card to have 8 or 12 ROPs, I must agree with you. But if it has 16 (4x4, which is my bet) at 600MHz, that makes 9600 MP/s, which is more than enough for a 64 SP mid-range card. Compare to the GTS (10,000) and Ultra (14,688). In order to get 1920x1200 4xAA/16xAF at 60Hz, this is what you need:
1920x1200x60x4x16 = 8,847,360,000 => 8,847 MP/s (I was never 100% sure of this math, tho)
Something that pretty much every high-end card has had since the GF 7 series. But you don't get 60 frames in games with that config. Why? (Rhetorical)
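The back-of-the-envelope formula above is easy to sanity-check in code. Note this keeps the poster's own simplifying assumption, namely that every AA sample and AF tap costs a full pixel operation, which real hardware doesn't strictly obey:

```python
# Required pixel fillrate for W x H at R Hz with AA samples and AF taps,
# under the rough assumption that each AA sample and AF tap is one pixel op.
def required_fillrate_mps(width, height, refresh_hz, aa_samples, af_taps):
    pixels_per_second = width * height * refresh_hz * aa_samples * af_taps
    return pixels_per_second / 1_000_000  # convert to MP/s

print(required_fillrate_mps(1920, 1200, 60, 4, 16))  # 8847.36 MP/s
```

That lands at about 8,847 MP/s, under the 9600 MP/s a hypothetical 16-ROP part at 600MHz would offer, which is the point of the comparison, even though in practice shader load, not fillrate, is what keeps you below 60 fps.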
Sorry for the rant. I'm not trying to lecture anyone, just to help out those who aren't familiar with this stuff. I may be wrong in my statements too. Feel free to reply.
Bye, Dark
ROP counts do matter and can be seen as a measure of the raw power a card has. I'm pretty familiar with the architectural changes that happened over the last few years, hardware- and software-wise. But I never cared for shiny shader-intensive games. In fact, the vast majority of my games are pre-DX9 era, so the amount of shader ops a card can push doesn't matter as much. I do know that shader processors are multi-purpose, though that has more to do with design simplicity (aka manufacturing cost) than with performance. Now you are probably thinking, why do I need a powerful card for my old games? Well, there are plenty of old titles that, either due to poor design or just being ahead of their time, require massive amounts of pixel-pushing power — games that take no advantage of new technologies like HDR or similar. Many of these old games are not capable of high resolutions, so memory bandwidth is not that important, and using (forcing) high AA/AF often results in unwanted side effects.
Finally, if my shiny brand new $400 card from nvidia or ATI scores lower in DX8-era benchmarks (say, 3DMark2001, GL Excess, or similar) than my old GF3 Ti500 (about $5 on ebay), then something is not right.
Regarding Ketxxx's green PCB complaint, there are some pictures of one with a black PCB at <a href="http://www.vr-zone.com/articles/More_8800_GT_Card_Pics_Surfaced%3B_Black_PCB/5323.html#Scene_1" target="_blank">VR-Zone</a>
EDIT: That little black regulator — I hope it's not the same one that was in that location on my 7900GT. It heated up the southbridge by like 8-15°C and was a total pain in the ass, as mine got as hot as 50°C and then some.
Take it as you wish
www.theinquirer.net/gb/inquirer/news/2007/10/04/g92-heat-problems
BUT if I were getting a new video card, I'd wait for the newer 8800GTSs to come out.
However, it looks like it's a MUCH better cooler than was on those cards, but it's also clocked higher too.
Living in a house where we can't afford an air conditioner, they get even harder to keep cool. It just seems to me they're competing so much that they take too many shortcuts.
Edit: Ah well, I've already decided on my mobo :) lol. I'll stay single-card for a long time... dunno if my 500W PSU would take on two of these cards anyway.
forums.guru3d.com/showthread.php?t=200420
G71 based 90nm core hits 700mhz on 1.5vmod
G70 based 110nm core hits 570mhz on 1.5vmod
my G71 on stock vcore (1.1v) hits 565mhz core STABLE on STOCK cooling