Tuesday, March 6th 2012
GeForce GTX 680 Final Clocks Exposed, Allegedly
Waiting on Kepler before making a new GPU purchase? Well, you'll have to wait a little longer. Thankfully, the wait can be eased with the latest leaks about NVIDIA's 28 nm chip and the GeForce GTX 680 it powers.
According to VR-Zone, the GTX 680 does indeed feature 1536 CUDA cores and a 256-bit memory interface, but it also has hot clocks, meaning the GPU is set to 705 MHz while the shaders operate at 1411 MHz. The memory (2 GB most likely) is supposedly clocked at 6000 MHz effective, giving a total memory bandwidth of 192 GB/s.
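For anyone who wants to sanity-check the leaked figures, here is a minimal sketch (Python, illustrative only; the 256-bit bus, 6000 MHz effective memory clock, and 705/1411 MHz clocks are taken straight from the leak above, not from NVIDIA) showing how the 192 GB/s bandwidth number and the usual 2x shader hot clock ratio fall out of the quoted specs.

```python
# Sanity check of the leaked GTX 680 numbers (illustrative only; all
# figures come from the VR-Zone leak, none are confirmed by NVIDIA).

def memory_bandwidth_gbps(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    bytes_per_transfer = bus_width_bits / 8            # 256-bit bus -> 32 bytes
    transfers_per_second = effective_clock_mhz * 1e6   # 6000 MHz effective (GDDR5 data rate)
    return bytes_per_transfer * transfers_per_second / 1e9

if __name__ == "__main__":
    print(memory_bandwidth_gbps(256, 6000))  # 192.0 GB/s, matching the leak
    print(1411 / 705)                        # ~2.0, the typical shader hot clock ratio
```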
NVIDIA's incoming card is 10 inches long, has 3-way SLI support, and offers four display outputs: two DVI, one HDMI, and one DisplayPort. The GeForce GTX 680 is expected to be revealed on March 12th and should become available on March 23rd.
Source:
VR-Zone
63 Comments on GeForce GTX 680 Final Clocks Exposed, Allegedly
And before you say something, no, it does not work the same way on NVIDIA
FX 5 series, failed? Yes.
G71, failed? Yes (and more than yes, a genuinely defective GPU :) )
Renaming the 8800/9800 into the GTS 250? Failed...
So does NVIDIA do a good job? About as good as AMD; they both have problems, they both have good cards. For now, we don't know anything about NVIDIA's new GPU yet. Maybe it will deliver, maybe it won't...
Anywho, speculation upon speculation. Give us facts, give us benchmarks; otherwise I won't care.
Can't your fanboy eyes see the world's biggest fail in human history? The HD 2900 XT :banghead: LOL ;)
After the G80 series, NVIDIA always wins in single-GPU cards:
GTX 280 beat the HD 4870
GTX 480 beat the HD 5870
GTX 580 beat the HD 6970
NVIDIA is the absolute winner every time. AMD fanboys have never noticed the FPS stuttering on their shitty cards... if they even cared about it (but they don't know anything about that).
We believe this, bro:
Never underestimate the power of ......... people in large groups.
(......... = something like "stupid")
Many, many AMD fanboys can do that together.
(GTX 480) (I'm not even mentioning price tags)
A lot of posts here are so full of BS and stupid equations that don't even equate.
Sure, Honda, your Civic is good, but our Veyron Super Sport is better.
Can't believe I'm even putting in the effort to post about these subjects.
And Mr. Pioneer, learn how to express yourself better; these kinds of posts don't do any good for your reputation, or what's left of it.
Also,
Edit: Your link is broken.
And don't forget
The 7970 and the GTX 680 will be DX11 and will fill in the gaps till Windows 8 is released, which is DX12-based. So there's no need to guess: AMD will release a new lineup with a following 8970 codenamed "Tenerife", and NVIDIA will release their GTX 780, which everyone knows is coming. That would be stupid, because you forgot to use the image button.
Kepler with a 256-bit bus faster than the HD 7970's 384-bit? Fact?
Let me correct this:
GTX 280 beat the HD 4870 ---> the HD 4870 cost less and consumed less than the GTX 280; you need to compare it to the GTX 260
GTX 480 beat the HD 5870 ---> the HD 5870 cost less and consumed less than the GTX 480; you need to compare it to the GTX 470
GTX 580 beat the HD 6970 ---> the HD 6970 cost less and consumed less than the GTX 580; you need to compare it to the GTX 570
I just don't give a shit who's the top performer; I just want good performance for the price. You are comparing two GPUs that are not in the same battle.
Hey wait, compare a Porsche 911 Turbo to an Acura TL... they are not in the same battle. You are a real clown :D. You are the "fanboy" here.
:slap:
As for the HD 2900 XT, yes, it was a fail, but at least the card worked. Not the 7900 GT/GTX, which failed right out of the box. That is a pure fail!
Please stop calling people fanboys; you should take a look at yourself.
Err... this thread is heading nowhere :wtf:
I know it's freedom of speech, but if you respect the articles on TPU, at least wait until w1zz releases a benchmark. Otherwise, stop flaming or, for goodness' sake, keep your mouth shut.
Don't give in to the hate, people. ;)