
GeForce GTX 680 Features Speed Boost, Arrives This Month, etc., etc.

Well, we still don't know the whole story, especially the price of the GTX 680.
So there is still a chance, because NV has to fill the lineup all the way down to the GTX 560 Ti/HD 7850 level, which are $200-250 cards, and they have only one chip ready. The GK106 is nowhere to be seen (in fact, it's more mysterious than the GK110).
 
Tahiti is a mid-tier "Sea Islands" chip.

Tenerife could be the 8900 series too. Right now Tahiti is top tier.....unless AMD goes something like 7995 or some crap.

That's right.



No, it's not shady. They don't have anything better for now, either because they can't make it (TSMC 28nm issues, or their own fault), or because when they saw the competition they decided to play it safe and ensure a better situation than another GF100.

No matter what, they don't have anything faster for now. So if what they do have is faster than the competition, they name it accordingly, and I guess they price it accordingly too, although we don't know the price yet, so making assumptions is stupid. Like Crap Daddy said, we are probably screwed, but that's something I've been saying since the minute I saw the HD 7970's performance, and of course I was flamed for it.
I never flamed you for the price. 7970 is way over priced.
 
I see the performance of the HD 7870 so close to the HD 7950... maybe the HD 7900 series is limited by drivers until NVIDIA releases the GTX 600???

Maybe I'm paranoid.... @.@
 
Obviously you launch the card that's ready and "adapt".
Exactly, and if this works out and blindsides AMD... kudos.

But here's me thinking… what happened, or is happening, with GK110? Why so late? If GK104 came out this great, why not follow up with a GK110 "death blow" at any price? Or is it not working out right? How can a bigger die not be working; can't they correct it? ...

Are they providing AMD time to engineer and release a re-spin? Something doesn't make sense here; I mean, is it that revolutionary in size, performance, efficiency, and price… and they just aren't compelled to stand the market on its ear?
 
Tenerife could be the 8900 series too. Right now Tahiti is top tier.....unless AMD goes something like 7995 or some crap.

It's all semantics and naming. The 8900 series will basically just be a beefed-up Tahiti. Same architecture.
 
Tenerife could be the 8900 series too. Right now Tahiti is top tier.....unless AMD goes something like 7995 or some crap.

And GK104 is top tier now.

I never flamed you for the price. 7970 is way over priced.

I never said you did, but oh, I was flamed by many, because 15% over the GTX 580 was supposedly miraculous, and Nvidia would never come up with something much faster, and if they did it would cost $1000 and draw 500W and whatnot.
 
But here's me thinking… what happened, or is happening, with GK110? Why so late? If GK104 came out this great, why not follow up with a GK110 "death blow" at any price? Or is it not working out right? How can a bigger die not be working; can't they correct it? ...
Could be any number of reasons:
1. The larger GPU is obviously going to need a wider memory bus, and Nvidia is lagging in memory controller implementation at the present time (hardly surprising, since the GDDR5 controller was basically pioneered by AMD). Witness the relatively slow memory clocks on Fermi.
A 384-bit (or wider) bus is likely a necessity for workstation, and particularly HPC, use, and whatever else GK110 is, it will primarily earn back its ROI in the pro market.
2. Likewise cache.
3. Double-precision optimization?
4. Maybe the sheer size of the die is problematic for yield, heat dissipation, etc. Not an unknown factor with large GPUs in general, and Nvidia's large monolithic dies in particular.
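To put the bus-width point in perspective: peak memory bandwidth is just bus width times the effective data rate. A minimal sketch (the function name is mine; the clocks are the published reference specs for each card):

```python
# Peak memory bandwidth: bytes per transfer * effective transfers per second.
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Bus width in bits, GDDR5 effective data rate in MHz -> GB/s."""
    return (bus_width_bits / 8) * effective_clock_mhz / 1000

# GTX 580: 384-bit bus, 4008 MHz effective -> ~192 GB/s
print(bandwidth_gb_s(384, 4008))
# HD 7970: same 384-bit bus, 5500 MHz effective -> ~264 GB/s
print(bandwidth_gb_s(384, 5500))
```

The gap between those two figures on identical 384-bit buses is exactly the "relatively slow memory clocks" point above.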
 
Opening another beer before going to bed. Wake me up when we can order this. :)
 
Nvidia is no longer the magical, super-powerful, mysterious company that worshipers once believed in; performance levels are well within what I expected. It's going to be tricky picking a GPU for a build I have coming up in late April.
 
And GK104 is top tier now.
Then what will they call the next one? A 780 in the same year? Sorry, I'm not buying it.

I never said you did, but oh, I was flamed by many, because 15% over the GTX 580 was supposedly miraculous, and Nvidia would never come up with something much faster, and if they did it would cost $1000 and draw 500W and whatnot.
Well..... as it's been said, we haven't seen the price or power draw yet. Could be 1000 bucks with a 500W power draw for 10% faster than the 7970, lol. I doubt it..... but NVIDIOTS would pay for it. I wouldn't put it past NVIDIA to charge that, knowing this.
 
Sounds like they're stressing a mid-range chip to be top dog.


Something is telling me they're adding voltage to get the clocks up to compete.

"Speed Boost"? Come on!! They already have 3 clock profiles now. Why add some other kind of voltage control unless you're worried the damn thing is going to overheat in 3D situations? I can just hear the fan going up and down, up and down :rolleyes:

I hope I'm wrong, but...... we shall see :shadedshu
 
I have to wonder what effect the "clock speed-boost feature" could have on overclocking and if it could be turned off.

Hopefully GK104 clocks well, because if it is only a relatively small percentage ahead of a stock 7970, then surely 7970s with high clocks (1.1 GHz+) would come very close or in theory even beat it.
Whatever happens, it looks like things could get interesting, but in a kind of unexpected way.

As far as the name goes: obviously, after seeing all the dual mid-range-GPU cards, Nvidia chose to make the 680 just a 660 SLI on a chip, but the yields failed them, so now the 660 is the 680 and GK100 will be the 780 when AMD brings out the 89xx cards :p
 
Then what will they call the next one? 780 in the same year? Sorry I'm not buying it.

So when's launch day for the GTX 780? I'd like to get my pre-order in

BTW:
HD 2900 series ....May 2007
HD 3870 series.....Nov 2007

So, not exactly unheard of, even if you use the "same year" terminology rather than a calendar year. If we're talking the same architecture, you might want to check on the GF100/GF104 launch timetable.

Well..... as it's been said, we haven't seen the price or power draw yet. Could be 1000 bucks with a 500W power draw for 10% faster than the 7970, lol. I doubt it..... but NVIDIOTS would pay for it. I wouldn't put it past NVIDIA to charge that, knowing this.
Something to be said for building a brand. Maybe if ATi/AMD had shown more than a passing interest in dev support (GiTG) and pro graphics we wouldn't be looking at this situation.

Still, no pleasing some people....as your avatar proclaims.
 
I have to wonder what effect the "clock speed-boost feature" could have on overclocking and if it could be turned off.

The question I ask is how well it will work. It would be a really good thing if it can remove dips in the fps. Those happen very suddenly, so I think it would be hard to boost clock speed instantaneously, and if it doesn't boost speed instantaneously it would have to predict the future.
 
So when's launch day for the GTX 780? I'd like to get my pre-order in

BTW:
HD 2900 series ....May 2007
HD 3870 series.....Nov 2007

So, not exactly unheard of, even if you use the "same year" terminology rather than a calendar year. If we're talking the same architecture, you might want to check on the GF100/GF104 launch timetable.

And if you had owned a 2900 series card, you would also know what a bitter taste that left in your mouth. Why do you think it's not commonplace anymore? Hmmmmm.

Also I love all the "But, but AMD does it too" crap. Some of it isn't even remotely the same. Yet people use it as an excuse for what NVIDIA is doing. Guess what? This thread is about NVIDIA not AMD.

There I bit. Ya happy? Do you really wanna troll me?
 
There I bit. Ya happy? Do you really wanna troll me?

I heard there's a GTX780 special edition handmade and signed by Jen Hsun Huang waiting for you in the lobby at NV headquarters in Santa Clara. I heard it beats the heck out of Tenerife.
 
I heard there's a GTX780 special edition handmade and signed by Jen Hsun Huang waiting for you in the lobby at NV headquarters in Santa Clara. I heard it beats the heck out of Tenerife.

Buying plane ticket nowz!
 
I still don't really know why folk say the 7970 is overpriced. It's a consumer product made by a private company for profit. The stark reality is that it is better than the 580 by a reasonable margin and can overclock hugely without any hassle to make it vastly superior (to me that means 40-50% faster).

The 3GB AMD card is on par with (or cheaper than) the 3GB GTX 580 versions. Likewise, the 6970 was priced reasonably high at launch (although the premium to move to the 580 was not proportional to its superiority). The 7970 has to be priced higher than the previous best-performing single-GPU card; that is just reality.

As for the 680, if it has a lower production cost (than the 580 had), then it is not unreasonable to assume it will sell at a competitive price. Many reports mention it is an efficient chip, unlike Fermi. If that is the case, it does not need an exorbitant price tag. NV marketing knows how to sell (for better or worse, ethically). It is not unreasonable to suggest they release a superior card and use AMD's high pricing to make consumers double-take at AMD's prices. A "hey, look at those AMD douchebags ripping you off" scenario.

As for people harping on about how AMD will just release higher-clocked cards to 'hump' the 680, that's an invalid point. IF GK104 is efficient and conservatively clocked, then it may also be an overclocking dream; we don't know yet. My 580 can run at 950 MHz (a 23% overclock). A 7970 at stock is 925 MHz, and a lot of reviewers topped out at 1125 (TPU's review hit 1075). That's a 21% overclock. Okay, so my 580 is a Lightning, but the point is the same: overclocking can be done on both sides.
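For what it's worth, those headroom percentages check out against the cards' reference clocks (a quick sketch; the helper name is mine, and the 772 MHz figure is the GTX 580 reference clock rather than the Lightning's factory overclock):

```python
# Overclock headroom as a percentage over the stock clock.
def headroom_pct(stock_mhz, oc_mhz):
    return round((oc_mhz / stock_mhz - 1) * 100, 1)

print(headroom_pct(772, 950))   # GTX 580, reference 772 MHz -> 23.1
print(headroom_pct(925, 1125))  # HD 7970, stock 925 MHz -> 21.6
```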

The 680 will also be the contemporary top-tier NV card. It doesn't matter if it is not the uber-performing card of myth. It is NV's top card, and possibly the world's top-performing single-GPU card. If all the reasonable rumours are true, GK110 (112, whatever), the daddy Kepler card, IS the be-all and end-all, and NV is in no rush with it. They've seen Tahiti and thought, "oh, is that it?", and focussed on the GK104 launch because they know they can beat it. It's a strong possibility that whatever AMD comes up with, Big GK will win. Reasoning?
GCN is AMD's new design. They'll evolve their compute design for better or worse to compete with GK. NV have CUDA well under control. They can shrink it onto the current fab process and make it a monster.

I really think this round of gfx cards are little 'offerings'. AMD saying, "oh looky at our new compute stuff" and NV saying, "oh looky at our new efficient card". I think Q4 2012 will be when the real shit hits the fan and both camps make tweaks and redesigns that establish their proper power play.

Oh, Charlie at S/A says TSMC has halted ALL 28nm processes for now due to an issue.
http://semiaccurate.com/2012/03/07/tsmc-suddenly-halts-28nm-production/

Anyway, all of this is just logical personal opinion. I'm just as eager as all to see the real benchmarks from reviews.
 
The question I ask is how well it will work. It would be a really good thing if it can remove dips in the fps. Those happen very suddenly, so I think it would be hard to boost clock speed instantaneously, and if it doesn't boost speed instantaneously it would have to predict the future.

That is a very good point. If it could respond fast enough and with enough of a boost, it could in theory improve the gameplay experience across the board by at least dampening the fps dips.

I would expect it to act kind of like AMD's PowerTune, but in reverse.
 
I was flamed by many, because 15% over GTX580 was miraculous and Nvidia would never come up with something much faster and if they did it would cost $1000 and draw 500w and whatnot.
The reference HD 7950 3GB is showing 16% better performance @2560x with a $550 MSRP, while the GTX 580 1.5GB originally had an MSRP of $500! So AMD gave another 1.5GB of memory, 16% more performance, and better efficiency, and at that time didn't see Nvidia challenging with a GK104, so that price was not totally out of line.
 
That is a very good point. If it could respond fast enough and with enough of a boost, it could in theory improve the gameplay experience across the board by at least dampening the fps dips.

I would expect it to act kind of like AMD's PowerTune, but in reverse.

I wondered the same thing; there is definitely no obvious way of doing it... Turbo Boost makes sense because it can detect when an application is bound by clock speed because it is single-threaded, and then boost the core running that thread....

Unless it dynamically overclocks the bottlenecking parts of the GPU, I don't see how it could benefit. I mean, it is clear that it will save power by doing this, but power saving always = more latency and reduced performance. Maybe it detects a safe overclock and applies it during games? The only other option is if the card boosts to a clock that is unstable long-term... but stable for short bursts.
 
Am I the only one who sees this as a featureless feature? It's the same thing as Cool'n'Quiet and Intel SpeedStep, only the clock goes up and down.

IMO it's basically saying: here's a 750 hp engine that's listed as 650 hp, but has this awesome feature where you press the red button and it has 750 hp.

I don't see it as such. I think what they are talking about is more of a short speed boost that, if run constantly, would overheat the card. When really high loads are detected, the card overclocks itself for a short period of time, something that would overload the cooler if sustained.

The cards already do power saving when not under load, but this detects extremely heavy load and cranks up the speeds to overcome. For example:

Imagine you are playing an FPS and someone throws a grenade and there is an explosion. This is an instance of high load, where a normal card would experience a framerate drop (or lag spike). But GK104 detects this high load and momentarily boosts the clock speed to help mitigate the lag.

Using your example, it would be a 750HP engine that has to use a 650HP engine's cooling system due to space constraints, but you can push a button and for a few seconds get 750HP.

The question I ask is how well it will work. It would be a really good thing if it can remove dips in the fps. Those happen very suddenly, so I think it would be hard to boost clock speed instantaneously, and if it doesn't boost speed instantaneously it would have to predict the future.

They already have the "Render 3 Frames in Advance" option, so....

But I think it could be a matter of only taking a frame or two to boost the speed.

Frame 1: This frame is really hard to render.
Frame 2: Speed boost kicks in.

We know the cards are already measuring load, so it probably isn't hard to detect hard-to-render frames and give a momentary speed boost.
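That two-frame idea can be sketched in a few lines (everything here, including the 16.7 ms target for 60 fps, is my own illustration, not NVIDIA's actual algorithm): flag a boost on the frame after any frame that missed its target time.

```python
# Boost one frame after any frame that took longer than the target time.
def boost_schedule(frame_times_ms, target_ms=16.7):
    """Return one bool per frame: True where the boost clock would engage."""
    boosted = [False] * len(frame_times_ms)
    for i, t in enumerate(frame_times_ms[:-1]):
        if t > target_ms:          # frame i was hard to render...
            boosted[i + 1] = True  # ...so the boost kicks in for frame i+1
    return boosted

# A slow second frame triggers a boost on the third.
print(boost_schedule([15, 30, 18, 14]))  # [False, False, True, True]
```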
 
^^ Can't wait to see the reviews and what this does to aftermarket overclocking.

There is probably a time limit too... what if you're playing a game that gives the card an all-round general hard time...
 
^^ Can't wait to see the reviews and what this does to aftermarket overclocking.

Actually, now that I think about it, it only really has to detect framerate. The drivers are already monitoring framerate in real time; that is how OSD programs like FRAPS work. So when it detects a drop in framerate, speed boost kicks in for 30 seconds (or whatever) to help through it. Then there's some kind of cool-off period between boosts to keep the card from overheating, as well as a maximum temperature beyond which there will be no speed boosts until the card cools off.
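That whole scheme (boost on a framerate drop, hold for a fixed window, enforce a cool-off, respect a temperature ceiling) amounts to a tiny state machine. A toy sketch, with every name and threshold invented for illustration rather than taken from NVIDIA:

```python
# Toy boost controller: one step() call per second of gameplay.
class BoostController:
    def __init__(self, fps_floor=40, boost_s=30, cooloff_s=15, max_temp_c=90):
        self.fps_floor = fps_floor      # framerate that triggers a boost
        self.boost_s = boost_s          # how long a boost lasts
        self.cooloff_s = cooloff_s      # mandatory rest between boosts
        self.max_temp_c = max_temp_c    # no boosting above this temperature
        self.boost_left = 0
        self.cooloff_left = 0

    def step(self, fps, temp_c):
        """Advance one second; return True if the boost clock is active."""
        if self.boost_left > 0:         # already boosting: run out the window
            self.boost_left -= 1
            if self.boost_left == 0:
                self.cooloff_left = self.cooloff_s
            return True
        if self.cooloff_left > 0:       # resting: no boost allowed
            self.cooloff_left -= 1
            return False
        if fps < self.fps_floor and temp_c < self.max_temp_c:
            self.boost_left = self.boost_s - 1  # this second counts as boosted
            return True
        return False
```

With a short window for demonstration, a framerate dip triggers the boost, the cool-off blocks an immediate re-trigger, and a hot card never boosts at all.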
 