Tuesday, March 6th 2012

GeForce GTX 680 Final Clocks Exposed, Allegedly

Waiting on Kepler before making a new GPU purchase? Well, you'll have to wait a little longer. Thankfully, the wait can be eased with the latest leaks about NVIDIA's 28 nm chip and the GeForce GTX 680 it powers.

According to VR-Zone, the GTX 680 does indeed feature 1536 CUDA Cores and a 256-bit memory interface, but it also has hotclocks, meaning the GPU is set to 705 MHz but the shaders operate at 1411 MHz. The memory (2 GB most likely) is supposed to be clocked at 6000 MHz giving a total memory bandwidth of 192 GB/s.
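For what it's worth, the leaked numbers hang together. A quick back-of-the-envelope check (a Python sketch; the constants are simply the figures quoted above, and the 2x shader "hotclock" ratio is the convention NVIDIA has used since G80):

```python
# Leaked GTX 680 figures from the VR-Zone report, taken at face value.
CORE_CLOCK_MHZ = 705      # reported GPU clock
MEM_CLOCK_MHZ = 6000      # reported effective (data-rate) memory clock
BUS_WIDTH_BITS = 256      # reported memory interface width

# Hotclocks: shaders run at roughly twice the core clock.
shader_clock_mhz = CORE_CLOCK_MHZ * 2  # 1410, vs. the reported 1411

# Memory bandwidth = transfers per second x bus width in bytes.
bandwidth_gb_s = MEM_CLOCK_MHZ * 1e6 * (BUS_WIDTH_BITS / 8) / 1e9

print(shader_clock_mhz, bandwidth_gb_s)  # 1410 192.0
```

The 192 GB/s figure matches the leak exactly, and the 1 MHz gap between 2 x 705 and the reported 1411 MHz suggests the actual base clock is closer to 705.5 MHz.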

NVIDIA's incoming card is 10 inches long, supports 3-way SLI, and has four display outputs: two DVI, one HDMI and one DisplayPort. The GeForce GTX 680 is expected to be revealed on March 12th and should become available on March 23rd.
Source: VR-Zone

63 Comments on GeForce GTX 680 Final Clocks Exposed, Allegedly

#26
LiveOrDie
Dj-ElectriCI dont pay a dime for the GPUs i have here, just the pleasures of being a reviewer.
And to conclude u have nothing smart to say and probably agreed with what i said, its ok son.

I still don't know what the GTX680 has to offer, so i can't really say anything about it. I just hope it will include out-of-the-box triple monitor support.
Well i know the 7970s have 2 mini DP ports but you can use a DisplayPort Hub right?
Posted on Reply
#27
Crap Daddy
Live OR DieI am a fanboy i glad to be AMD can go jump in front of a bus, and how do you even know if the 680 cant support tri monitors it has DisplayPort as well :rolleyes:.

And glad to see you have 2x 7970 i also had crossfire and sold both cards and glad there gone 120Hz or nothing!
Don't know if you know, but Kyle over at HardOCP (another AMD biased site) said that he has info from China (so not from around the webz) that in canned benchmarks - artificial - this is 45-50% over the GTX 580, and it will support 3 monitors, NV Surround, out of the box. Take it as you wish.
#28
dj-electric
Live OR DieWell i know the 7970s have 2 mini DP ports but you can use a DisplayPort Hub right?
I can use a hub to chain-link two MiniDPs and get 6 displays up and running. 5 on one GPU, not a secret.

And before you say something, no, it does not work the same way on NVIDIA
#29
ChristTheGreat
pioneerlike AMD Faildozer??i hope this not happend- nvidia is not like AMD fail producer- nvidia every when do its job well - in 5870 dx11 is like joke - gtx480 does com to fix it - now GK104 come to defeat Tahiti and GK100 come to exterminate any thing;)
Please, correct me if I am wrong.

FX 5, failed? yes
G71, failed? yes (and more than yes, super defective GPU :) )
Renaming 8800/9800/GTs 250? failed...

So does nVidia do a good job? As good a job as AMD; they both have problems, they both have good cards..
dj-electricI can use a hub to chain-link two MiniDPs and get 6 displays up and running. 5 on one GPU, not a secret.

And before you say something, no, it does not work the same way on NVIDIA
As for now, we don't know yet about nVidia's new GPU.. Maybe they will, maybe they won't...
#30
ZoneDymo
Is it just me or is Pioneer's and LiveofDie's English skill lacking?

Anywho, speculation speculation, give us facts, give us benchmarks otherwise I will not care.
#31
THE_EGG
Hang on a second, I thought Nvidia was getting rid of their separate clocks for the shaders and it was only going to behave the same as AMD cards. Another rumour falsified, I suppose. Anyway, the card looks well set out, but I'm still curious about the spare solder points for another 6-pin power connector. Benchies soon, I hope.
#32
Aquinus
Resident Wat-man
THE_EGGHang on a second, I thought Nvidia was getting rid of their separate clocks for the shaders and it was only going to behave the same as AMD cards. Another rumour falsified, I suppose. Anyway, the card looks well set out, but I'm still curious about the spare solder points for another 6-pin power connector. Benchies soon, I hope.
We will see when nVidia actually releases some legit information on the GPU. You know, some older chips had spots for a second power connector, and if you soldered a connector on, you could in some cases get better over-clocks out of them. My only concern would be: how much power will this puppy eat up, and how much more can it take before temperatures and power consumption become unrealistic?
#33
THE_EGG
AquinusWe will see when nVidia actually releases some legit information on the GPU. You know, some older chips had spots for a second power connector, and if you soldered a connector on, you could in some cases get better over-clocks out of them. My only concern would be: how much power will this puppy eat up, and how much more can it take before temperatures and power consumption become unrealistic?
Yeh, that's why I'm curious. I remember back in the 8xxx and 9xxx days of the GeForce cards how it gave some extra power for better OCs.
#34
Benetanegia
Crap Daddycanned benchmarks - artificial -
On that, for Kyle canned benchmark == any benchmark not done by HardOCP, so for example all of W1zz's reviews == canned bench for him.
#35
Aquinus
Resident Wat-man
A lot of these rumors are showing up on Chinese websites with no cited sources. Whether this information is legit or not is very up in the air. Only nVidia can set the record straight, unfortunately.
#36
Crap Daddy
AquinusA lot of these rumors are showing up on Chinese websites with no cited sources. Whether this information is legit or not is very up in the air. Only nVidia can set the record straight, unfortunately.
This will happen very soon. Our very own Wizz is on his way to SF for the NV briefing.
#37
rpsgc
Live OR DieTri monitors using a 7950 :wtf: that 1gb per screen Enjoy your bottle neck. sorry you wont get one if your playing games from the 80s.
Not_sure_if_trolling_or_just_stupid.jpg
#38
pioneer
ChristTheGreatPlease, correct me if I am wrong.

FX 5, failed? yes
G71, failed? yes (and more than yes, super defective GPU :) )
Renaming 8800/9800/GTs 250? failed...

So nVidia do a good job? As good as AMD, they both have problems, they both have good cards..
w8 w8 ..

your fan boysim eyes cant be seen world biggest fail in human history?? HD2900 XT :banghead: LOL ;)

after G80 series nvidia is always wins in single gpu cards

gtx280 defeat hd4870

gtx 480 defeat hd5870

gtx 580 defeat hd6970

all time nvidia is absolute winner -amd lol's fan never seen fps stuttering in their shity cards....... if they care about that (but they dont know anything about that)

we believe this bro : :

never underestimate the power of ......... people in large group

.... ... = something like stupid

many many fan amd fan boys can do that together
#39
dj-electric
Putting out so much sh*t when your card draws twice the power and is only 20% faster isn't a win to brag about (GTX480). (I'm not even mentioning price tags.)

A lot of posts here are so full of BS and stupid Un-equationable equations

Sure Honda, your civic is good but our veyron supersport is better
Can't believe i'm even putting an effort to post about these subjects

And mr. pioneer, learn how to express yourself better; these kinds of posts don't do your reputation, or what's left of it, any good.
#40
symmetrical
Here's hoping they price it at $549 and offer better performance than a 7970. That way 7970 prices will come down to earth and the planets will re-align.
#41
razaron
When is this supposed to be coming out?

Also,
#42
Aquinus
Resident Wat-man
razaronWhen is this supposed to be coming out?

Also,
www.gifflix.com/files/4326f39cc6e2.gif
No one knows for sure, but there are rumors that nVidia is going to release information mid-March and that something is going to be released closer to April, but take this with a grain of salt.

Edit: Your link is broken.
#43
punani
pioneerw8 w8 ..

your fan boysim eyes cant be seen world biggest fail in human history?? HD2900 XT :banghead: LOL ;)

after G80 series nvidia is always wins in single gpu cards

gtx280 defeat hd4870

gtx 480 defeat hd5870

gtx 580 defeat hd6970

all time nvidia is absolute winner -amd lol's fan never seen fps stuttering in their shity cards....... if they care about that (but they dont know anything about that)

we believe this bro : :

never underestimate the power of ......... people in large group

.... ... = something like stupid

many many fan amd fan boys can do that together
I should put this in my signature :D !
#44
LiveOrDie
ZoneDymoIs it just me or is Pioneer´s and LiveofDie´s English skill lacking?

Anywho, speculation speculation, give us facts, give us benchmarks otherwise I will not care.
If you must know I'm dyslexic i do the best i cant, and its liveordie not liveofdie :p

And dont for get

The 7970 and the GTX 680 will be DX11 and will fill the gap till Windows 8 is released, which is DX12 based, so there's no need to guess: AMD will release a new line-up with a following 8970 codenamed "Tenerife", and Nvidia will release their GTX 780, which everyone knows is coming.
rpsgcNot_sure_if_trolling_or_just_stupid.jpg
That would be stupid because you forgot to use the image button.
#45
dir_d
The quality of this site is going down; you all sound very immature. For goodness' sake, let's hope this thing slaughters the 7970 and comes in at $350 so I can pick up a 7970 on the cheap. Both sides perform relatively well, we just need more competition.
#47
ChristTheGreat
pioneerw8 w8 ..

your fan boysim eyes cant be seen world biggest fail in human history?? HD2900 XT :banghead: LOL ;)

after G80 series nvidia is always wins in single gpu cards

gtx280 defeat hd4870

gtx 480 defeat hd5870

gtx 580 defeat hd6970

all time nvidia is absolute winner -amd lol's fan never seen fps stuttering in their shity cards....... if they care about that (but they dont know anything about that)

we believe this bro : :

never underestimate the power of ......... people in large group

.... ... = something like stupid

many many fan amd fan boys can do that together
yeah sure I am a fanboy ...

let me correct this:

gtx280 defeat hd4870 ---> the HD4870 cost less than and consumed less than the GTX 280; you need to compare it to the GTX 260

gtx 480 defeat hd5870 ---> the HD5870 cost less than and consumed less than the GTX 480; you need to compare it to the GTX 470

gtx 580 defeat hd6970 ---> the HD6970 cost less than and consumed less than the GTX 580; you need to compare it to the GTX 570

I just don't give a shit who's the top performer.. I just want good performance/price. You are comparing 2 GPUs that are not in the same battle.

hey wait, compare a Porsche 911 Turbo to an Acura TL.... They are not in the same battle. You are a real clown :D . You are the "fanboy" here.

:slap:

as for the HD2900XT, yes it was a fail, but at least the card was working. Not the 7900GT/GTX, as they failed out of the box. That is a pure fail!


Please stop calling people fanboys; you should take a look at yourself.
#48
Oberon
Crap DaddyDon't know if you know, but Kyle over at HardOCP (another AMD biased site) said that he has info from China (so not from around the webz) that in canned benchmarks - artificial - this is 45-50% over the GTX 580, and it will support 3 monitors, NV Surround, out of the box. Take it as you wish.
"Another AMD biased site" that "reviews" a GTX 580 after the release of the 7000 series, compares its benchmark results against... well, nothing, and gives it a gold award because it's a single card that can drive three monitors. There's definitely a whole lot of bias at HardOCP, but it's not toward AMD.
#49
Jonap_1st
absolute winner, shitty card, fanboy, biggest fail, always win, etc..
err....., this thread is going nowhere :wtf:

i know it's freedom of speech, but if you have any respect for the articles on TPU, at least wait until w1zz releases a benchmark. otherwise, stop flaming or, for goodness' sake, keep your mouth shut..
#50
erocker
*
I think I should just start infracting people that use the words "fanboy" and "fail". That would clean things up nicely. I have no opinion on this card until it's seen in action and benchmarks are running.

Don't give in to the hate, people. ;)