
NVIDIA GeForce GTX 680 Kepler 2 GB

Wasn't sure if this had been brought up yet, but it says FXAA has been added as an option in the control panel?

Does this mean it can now be forced in all DX9/10/11 games? I know there was an option in Nvidia's control panel before, but it was only for OpenGL.
 
Also, which of the outlined features will be supported by older-generation cards (such as Fermi, GT200, G80)?
 
That's true. I threw away a bunch of HD 2000-class cards last year, and yes, it hurt.
HD 3000 will be next this year.

Don't throw them away; better to donate them for making free PCs for poor people's kids...
 
I feel somewhat sorry for those who already got the 7970

I don't, if they've used and enjoyed it for, say, most of January, February, and now three weeks of March. That's 10 weeks, or $5 a week.

While today you'd supposedly trail by approximately 5% in a few of the most demanding titles at 1920x resolutions, where it really matters (those at roughly 60 FPS or less). So if you don't play Batman, Civilization, or Dragon Age, it's really moot. BF3 is the game that mattered most... a 7970 buyer has played for 10 weeks, and today they feel like they lost 10% in average FPS; that's always the nature of this business. Though in some tests the GTX 680's minimums, while good, aren't a spectacular 10% better, and it's those low points that matter more in actual gameplay. That's where GPU Boost took the advantage, especially in BF3, finding the untapped oomph in GK104; but they needed Boost to find it. Nvidia knew BF3 was the one game they really had to make look superlative, and on that front they spent the time and money and really worked the profiles of the "clock speed nanny" to shine. Will they be able to cost-reduce the PCB and the components needed for that on a $400 GK104?
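
To make the averages-versus-minimums point concrete, here is a minimal sketch with made-up frame times (not actual review data) showing how two cards with the same average FPS can feel very different in play:

```python
# Illustrative only: hypothetical frame-time traces in milliseconds,
# not measured GTX 680 / HD 7970 numbers.
steady = [17, 17, 17, 17, 17, 17, 17, 17]  # even pacing
spiky = [12, 12, 12, 12, 12, 12, 12, 52]   # same total time, one long stall

def avg_fps(frame_times_ms):
    # Average FPS over the run: frames rendered / seconds elapsed.
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

def min_fps(frame_times_ms):
    # Worst instantaneous FPS, i.e. the single slowest frame.
    return 1000.0 / max(frame_times_ms)

for name, trace in (("steady", steady), ("spiky", spiky)):
    print(f"{name}: avg {avg_fps(trace):.1f} FPS, min {min_fps(trace):.1f} FPS")
# steady: avg 58.8 FPS, min 58.8 FPS
# spiky: avg 58.8 FPS, min 19.2 FPS -> same average, very different feel
```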

There's a lot of talk about efficiency, and on pure watts-per-FPS it's very good, but in actual gaming consumption it's maybe 3% more efficient across the spectrum of actual gaming titles, which yields, say, 5% more FPS (versus a 7970), while exceptionally remarkable against the previous GTX 580. That's again all the "clock speed nanny".
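
As a quick sanity check on those figures (a sketch taking the post's rough 5%-faster and 3%-more-efficient claims at face value; both ratios are assumptions, not measurements):

```python
# If efficiency = FPS per watt, the implied power-draw ratio is the
# FPS ratio divided by the efficiency ratio.
fps_ratio = 1.05         # GTX 680 vs HD 7970 FPS, assumed from the post
efficiency_ratio = 1.03  # FPS-per-watt ratio, assumed from the post

power_ratio = fps_ratio / efficiency_ratio
print(f"implied power-draw ratio: {power_ratio:.3f}")  # ~1.019, about 2% more power
```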

When the GTX 580 plummeted in price a few weeks back, that was a clear sign the 680 was going to overwhelm it. I feel sorry for the GTX 580 buyer who thought it was a deal at $420 while having to chase some huge rebate... they may hurt the worst. :twitch:
 
-----Original Message-----
From: Jensen H Huang
Sent: Thursday, March 22, 2012 9:48 AM
To: Employees
Subject: Kepler Rising

Today, the first Kepler - GTX 680 - is on shelves around the world!

Three years in the making. The endeavor of a thousand of the world's best engineers. One vision - build a revolutionary GPU and make a giant leap in efficient-performance.

Achieving efficient-performance, great performance while consuming the least possible energy, required us to change our entire design approach. Close collaboration between architecture-design-VLSI-software-devtech-systems, intense scrutiny on where energy is spent, and inventions at every level were necessary. The results are fantastic as you will see in the reviews.

Kepler also cultivated a passion for craftsmanship - nothing wasted, everything put together with care - with a goal of creating an exquisite product that works wonderfully. Let's continue to raise the bar and establish extraordinary craftsmanship as a hallmark of our company.

Today is just the beginning of Kepler. Because of its super energy-efficient architecture, we will extend GPUs into datacenters, to super thin notebooks, to superphones. Not to mention bring joy and delight to millions of gamers around the world.

I want to thank all that gave your heart and soul to create Kepler. You've created something wonderful.

Congratulations everyone!

Jensen

(via AnandTech)
 
Because of its super energy-efficient architecture

How is it super efficient? For that, it should consume less power than it does.
It has a smaller bus, fewer shaders, and less memory, and it only wins in peak, maximum, and Blu-ray power draw. But I guess it is still faster, and it does win maximum power by a bit. I would say it's efficient, but not super.

[attached chart: power consumption comparison]


But that's just me being picky; I still want this card!! My 2x 4850 comes in at almost 50% :cry: of this GTX 680 in relative performance.
 
W1z, is there any chance you can do a test with WoW at 1920x1200 with all the bells and whistles? (There is a DX11 option.)
 
I would say it's efficient, but not super.

Though this is Nvidia, and the word "efficient" was never even in their vocabulary before! :D
 
Definitely nice clocks! bta already posted a link to that yesterday, IIRC. Also, you won't see any end users getting that high; the best we/they can hope for is whatever the DOC parameters allow :o

Plus, that was done with the add-on VRM power board ;)

Yeah, well, the EVGA SC+ Signature variant of the 680 will have an 8+6-pin power design but a 5-phase PWM, the EVGA FTW will have an 8-phase PWM, and the Classified will have 13 phases, I think. So those cards should be able to clock pretty damn well, assuming the cooling is adequate.
 
Woohoo, Radeon HD 7970 out the door and GTX 680 SLI in, lol. Pretty pumped, and I hope the drivers are better, as this 7970 setup in CrossFire was giving me a headache, one problem after another.


:nutkick:
 
New card looks great. Might be time for me to go Nvidia. As of now, on price/performance, this card kills the 7970.
 
Remember the brilliant RV770? This is history repeating, with Nvidia as the one with the clever architecture.

GTX 680 = Tahiti. So either card is worth buying in terms of performance. The reasons to buy Kepler would be the better price, lower power consumption (i.e., the GPU produces less heat), CUDA, and the fancy AA. As for features that used to be unique to ATI, they're now available on NV too: Eyefinity-style multi-monitor and a lean, low-power architecture. I hope NV never makes another barbarian joule-guzzler GPU again.

This is really a perfect achievement from Nvidia! It's definitely a worthy buy; let's see if this ignites a price war. I'll definitely get one when the prices are a bit better than they are now.

What I seriously love about this card is that it is able to keep up despite being lean and consuming less power. Moreover, for once, this chip does not have a heat spreader like previous NV cards, which makes adding a water block a lot easier. When I modded my 560 Ti for water cooling, it was quite annoying that the spreader was in the way. I ended up drilling holes and doing a whole lot of custom modding to get it to work.

The dynamic clocking system is a good feature. I actually tried my own custom dynamic clocking on my 4890. It worked, but every time the clock changed there was a noticeable halt. This is similar to the problem Kepler faces now; see the last paragraph:
http://techreport.com/articles.x/22653/9
Hopefully future drivers will fix this, because I hate inconsistent framerates.
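
As a rough illustration of why naive dynamic clocking causes those halts, here is a toy governor sketch (hypothetical load thresholds; this is not NVIDIA's actual GPU Boost algorithm) comparing plain threshold switching with hysteresis:

```python
# Toy DVFS governor, illustration only: not NVIDIA's actual GPU Boost logic.
BASE_MHZ, BOOST_MHZ = 1006, 1058  # GTX 680 base/boost clocks at launch

def naive_governor(load):
    # Switches the instant load crosses 50%, so a load hovering near the
    # threshold forces a clock change (and a visible hitch) every sample.
    return BOOST_MHZ if load > 0.5 else BASE_MHZ

def hysteresis_governor(load, current_mhz, up=0.6, down=0.4):
    # Boosts only above 60% load, drops only below 40%, so small wobbles
    # around 50% no longer trigger a transition at all.
    if current_mhz == BASE_MHZ and load > up:
        return BOOST_MHZ
    if current_mhz == BOOST_MHZ and load < down:
        return BASE_MHZ
    return current_mhz

clk = BASE_MHZ
for load in (0.48, 0.52, 0.49, 0.53, 0.47):  # load wobbling around 50%
    clk = hysteresis_governor(load, clk)
    print(f"load {load:.2f}: naive {naive_governor(load)} MHz, hysteresis {clk} MHz")
# The naive column flips every sample; the hysteresis column never moves.
```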

- Tahiti = Fermi, which was out way sooner. Correction: Tahiti = improved Fermi, and Kepler = an improved Fermi/Tahiti hybrid... more or less.
- I don't know why people keep holding on to the four-screen thing, since you could run four independent displays ever since the first-gen i-series H55 chipset (IGD + PEG at the same time: two on board, two on the card), and you could do three-way on Nvidia with the lesser-known MDT series or the 2Win on a single card. Yes, Kepler is a real improvement on this, but it's not like four monitors were impossible on the Nvidia side.
 

http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/16
In this review the HD 7970 did better than Kepler and was well into the 80 FPS range, which is weird if you ask me.

Also look at this:
http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review/17
The HD 7970 is dominating in compute, so while Kepler is a bit more efficient in games, it comes at the price of compute (though I would take that with a grain of salt, as this compute review isn't extensive enough).
But dang, this round it looks like AMD and Nvidia are kind of switching places! Or more or less becoming very similar.

EDIT: Also, I just realized this test uses an i7 920 while the other uses an Intel Sandy Bridge Extreme; the difference between the two is interesting. And here the driver used is 11.12. Any reason why you didn't use Catalyst 12.2, W1zz? I believe that's the one that adds support for the HD 7970 (I'm assuming it's because they don't make a great difference and the launch HD 7970 results are being compared with Kepler).
 
NVIDIA should release a version of this card that doesn't have GPU Boost and on which we can change the base clocks. GPU Boost is useless.
 

On the exact page you linked, the GTX 680 beats the HD 7970 in two of three resolutions. The HD 7970 only wins at 1920x1200, and at that res it's only ahead by 1 FPS, which is well within the margin of error. By your logic I could point to the bottom chart (1680x1050) and say "oh look, the GTX 570 is faster than the HD 7970!" As for compute, it's 50/50: half the time it runs about as well as the 580, half the time it runs like a 560 Ti. As long as the GTX 680 can perform in games, it's a winner in my book (especially being smaller, less power-hungry, quieter, and cooler than the HD 7970).

As for the test bench, this has come up with every review. W1z has done several benchmarks proving the overclocked i7-920 is not holding back the setup. While SB is technically faster, when testing GPUs it's largely irrelevant. The difference even in the most CPU-bound game he runs (SC2) was only a few percentage points.
 
Talking about head-to-head OC versus OC:
http://www.tomshardware.com/reviews/geforce-gtx-680-sli-overclock-surround,3162-11.html
"Overall, it’s pretty easy to see that AMD’s Radeon HD 7970 has more to gain from an aggressive overclock than Nvidia’s GeForce GTX 680. It’s a little disappointing, then, that the Catalyst Control Center driver tops out at 1125/1575 MHz."
It would be interesting to see two MSI Lightning versions of these puppies (GTX 680, HD 7970), voltage-tweaked and tested.
 
The GTX 680 is currently selling here at 686 USD. WTF.
 
i-im sorry to hear that >.<
 

Lol, I love these so-called "fair" assessments... they'll overclock the 7970 to make it fair against the GTX 680, yet fail to mention the 1 GB memory advantage of the 7970 cards, lol. If they wanted to make it completely "fair", just to compare clock for clock, they'd find a way to nerf the extra 1 GB of RAM the 7970 has, lol... I know that probably can't be done, but it's still not exactly "fair".

PS: combining posts so there's no double-posting.

Wow. I don't know if it's been said yet, but prices are already dropping on GTX 560 cards... CompUSA shows all 560 cards from at least six manufacturers just went to about $164.99-$169.99 when they were previously $179.00-$199.00. GTX 560 Ti cards seem to be dropping a small $5.00 per card as well. Hope it keeps dropping :)
 

Yes, I'm aware, and I agree. What I meant to say is that Kepler in this setup ties with the HD 7970 in Civilization, while in W1zz's review the HD 7970 was way behind; that's the point I was trying to make. I assumed the CPU had something to do with it, though the interesting bit is that Kepler performed pretty much the same in both benchmarks (except at 1680x1050), while the HD 7970 caught up in the AnandTech review and was on par with Kepler. One thing I just noticed is that W1zz used 4xAA while Anand was using 4xMSAA.

As for compute, what I was trying to say is that the HD 7970 is much more consistent in its compute performance, while the GTX 680 definitely seems to have its weak points: it's on top in the fluid simulation, half the benches have it all the way down, and in the one where it's near the top it's pretty close to Tahiti, so it's hard to even consider that a total win (in that specific benchmark).
 

So am I. The only major part I kept from my old PC was the graphics card, while waiting for this generation of cards to surface. But the ridiculous prices have pretty much put my "final upgrade" on hold; I hope competition forces prices down... :shadedshu
 

Lol, sorry, I thought you were someone replying to my post above originally... I'll let this post stand, though, because it's relevant to yours. Of course we're going to see some fallout spin from the AMD side. All the single-card benches and most of the two-way SLI benches I have seen have the GTX 680 beating the 7970 in BF3, some by a good margin and others just barely. I haven't really looked at the other game benches because, wow, there's so much to read through at the moment and I'm swamped. As for 4xAA versus 4xMSAA, yes, MSAA is more taxing on FPS, I believe. There could be other small factors contributing to the inconsistencies in the GTX 680 reviews; one, of course, could be that each review uses a different test bench setup. Not sure what could be going on there. BUT I did see a review comparing 3-way and 4-way SLI against 3- and 4-way CrossFire, and it seems like something happens to the GTX 680 in BF3 with 3- and 4-way SLI setups... CrossFire seemed to beat it miserably, with 4-way SLI scaling being almost nonexistent. It seems the GTX 680 loses its scalability after three cards. I'm wondering if this is a driver problem... I haven't had a chance to read more about it yet. Any thoughts?
 

Yeah man, exactly; that's what was confusing me, and I tried spotting the differences but still can't put my finger on what's going on. Oh well, in general the 680 has an edge in single-GPU gaming, but it's pretty close in my opinion.
As for scaling, yeah, I saw something like that, and it said that the inconsistent clocks across four different cards make scaling very difficult to achieve (remember, SLI and CrossFire setups usually clock to the lowest card in the bunch; in this case each card clocks differently). If that's the case, I'm assuming future drivers might disable dynamic clocking (that's the easy solution) or find some clever way to sort it out. I know Nvidia's driver team is pretty talented and can eventually do it, but I could be wrong, as I know very little about this issue other than what I read in the article I saw.
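
A minimal sketch of that "lowest card" mechanism (hypothetical clock ranges, illustration only, not measured SLI data): if the group effectively syncs to its slowest member each frame, the usable boost clock shrinks as cards are added.

```python
import random

random.seed(42)

BASE, MAX_BOOST = 1006, 1110  # base MHz plus an assumed typical max boost

def effective_clock(n_gpus, samples=10_000):
    # Each GPU independently lands somewhere between base and max boost;
    # if the group syncs to its slowest member, the usable clock is the
    # minimum across all cards for that frame.
    total = 0.0
    for _ in range(samples):
        total += min(random.uniform(BASE, MAX_BOOST) for _ in range(n_gpus))
    return total / samples

for n in (1, 2, 3, 4):
    print(f"{n} GPU(s): effective boost ~{effective_clock(n):.0f} MHz")
# Roughly 1058, 1041, 1032, 1027 MHz: each extra card drags the group's
# usable clock toward base, eating into multi-GPU scaling.
```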
 