Thursday, March 22nd 2012

Did NVIDIA Originally Intend to Call the GTX 680 "GTX 670 Ti"?

Although it no longer matters, several bits of evidence support the theory that NVIDIA originally intended to name its GK104-based performance graphics card "GeForce GTX 670 Ti", before settling on "GeForce GTX 680" toward the end. Since the start of 2012, our industry sources have referred to the part as "GTX 670 Ti". The very first picture of the GeForce GTX 680 disclosed to the public, early this month, showed a slightly older qualification sample that differed from the card we have with us today in one respect: the model name "GTX 670 Ti" was etched onto the cooler shroud. Our industry sources also disclosed pictures of early samples with 6+8 pin power connectors.

Next up, while NVIDIA did re-christen the GTX 670 Ti to GTX 680, it was rather sloppy about it. The first picture below shows the contents of the "Boardshots" (stylized) folder in NVIDIA's "special place" for the media. It contains all the assets NVIDIA allows the press, retailers, and other partners to use. Assets are distributed in various formats; TIFF is a standard image format used by print media for its high dot pitch. Apart from a heavy payload, a TIFF file can carry metadata tags that are readable by Windows Explorer, which help archivists catalog images. The TIFF images of the GTX 680 that NVIDIA distributed to its partners in the media and industry carry the tag "GTX 670 Ti".
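As an aside, the tag mechanism itself is simple: a TIFF file is a small header plus a directory (IFD) of numbered fields, one of which, ImageDescription (tag 270), holds free-form text of the kind that leaked the old name. A minimal stdlib-only Python sketch, purely for illustration (a hypothetical example, not NVIDIA's actual files), builds a tag-only TIFF fragment and reads the description back:

```python
import struct

def build_minimal_tiff(description: bytes) -> bytes:
    """Build a bare TIFF byte stream carrying only an ImageDescription tag."""
    # Header: little-endian byte order ("II"), magic number 42, first IFD at byte 8
    header = struct.pack("<2sHI", b"II", 42, 8)
    # ASCII fields are NUL-terminated in TIFF
    desc = description + b"\x00"
    # IFD size: entry count (2) + one 12-byte entry + next-IFD offset (4) = 18 bytes,
    # so the tag's text data lands right after it, at byte 8 + 18 = 26
    data_offset = 8 + 18
    ifd = struct.pack("<H", 1)                              # one directory entry
    ifd += struct.pack("<HHII", 270, 2, len(desc), data_offset)  # tag 270, type 2 (ASCII)
    ifd += struct.pack("<I", 0)                             # no next IFD
    return header + ifd + desc

def read_image_description(blob: bytes) -> str:
    """Walk the first IFD and return the ImageDescription text, if present."""
    order, magic, ifd_off = struct.unpack_from("<2sHI", blob, 0)
    assert order == b"II" and magic == 42, "not a little-endian TIFF"
    (n_entries,) = struct.unpack_from("<H", blob, ifd_off)
    pos = ifd_off + 2
    for _ in range(n_entries):
        tag, typ, count, value = struct.unpack_from("<HHII", blob, pos)
        if tag == 270 and typ == 2:  # ImageDescription, ASCII: value is a file offset
            return blob[value:value + count].rstrip(b"\x00").decode("ascii")
        pos += 12
    return ""

blob = build_minimal_tiff(b"GTX 670 Ti")
print(read_image_description(blob))  # -> GTX 670 Ti
```

Tools like Windows Explorer read exactly this directory, which is why a stale description string survives a last-minute rename of the product itself.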
It doesn't end there. Keen-eyed users browsing through NVIDIA Control Panel with their GTX 680 installed found the 3D Vision Surround display configuration page referring to their GPU as "GTX 670 Ti". This particular image was also used by NVIDIA in its 3D Vision Surround guide.

We began this article by saying that, frankly, at this point it doesn't matter. Or does it? Could it be that GK104 rocked the boardroom at the NVIDIA Plex to the point where they decided that, since it is competitive with (in fact, faster than) AMD's Radeon HD 7970, it makes more business sense to sell it as "GTX 680"?

What's in a name? For one, naming it "GTX 680" instead of "GTX 670 Ti" takes the pressure off NVIDIA to introduce a part based on the "big chip" of the GeForce Kepler architecture (GK1x0). It could also save NVIDIA tons of R&D cost on its GTX 700 series, because it can brand GK1x0 in the GTX 700 series, invest relatively little in a dual-GK104 graphics card to ward off the threat of the Radeon HD 7990 "New Zealand", and save (read: sandbag) GK1x0 for AMD's next-generation Sea Islands family based on the "Enhanced Graphics CoreNext" architecture, slated for later this year if all goes well. Is it a case of mistaken identity? Overanalysis on our part? Or is there something they don't want you to know?

55 Comments on Did NVIDIA Originally Intend to Call the GTX 680 "GTX 670 Ti"?

#26
GC_PaNzerFIN
erocker: Considering how many beta/preview drivers they release to the public, they very much care. They seem to work hard at bringing out new drivers to use... I suppose it's the competence of the drivers that I'm concerned with.
They just made the biggest architecture change since DX10 cards; they are experimenting with new things in the drivers. Obviously the current drivers are far from ready. Don't expect bargain sales of the HD 7970 if it is only under 10% behind the GTX 680 in performance, though.

What I am afraid of is that both manufacturers are sort of happy with the current situation. They make a lot of profit as long as prices remain this high.
Posted on Reply
#27
Yellow&Nerdy?
Nvidia might release GK110 as 7xx series sometime early next year, since they already named GK104 as the top-end GPU for the 6xx series. If the rumored specs are true at all, GK110 will be very fast, but also very hot. But at least they now have several months to tweak the card.
Posted on Reply
#28
qubit
Overclocked quantum bit
GC_PaNzerFIN: What I am afraid of is that both manufacturers are sort of happy with the current situation. They make a lot of profit as long as prices remain this high.
You've hit the nail on the head. It's all adjusted just so to squeeze the most revenue rather than give the best competition. This is where 5 or 6 graphics card companies in genuine hot competition would bring us the best products, not those just tweaked and priced to milk the most from the punters. :rolleyes:
Posted on Reply
#29
Benetanegia
erocker: Indeed, but that's the way things work. Don't make things too fast, milk more money = profit! Now Nvidia needs to come out with the mid to upper-midstream cards. A slightly nerfed GTX 680 that can beat a 7950 or go slightly head to head with a 7970 at a good price would be the real winner.
The problem I see is: what about the rest of the lineup? I'm pretty sure they are going to do the same as they did last generation, so GK106 will be half of what GK104 is and GK107 is 1/4 of GK104 (this has been confirmed). The problem with that is, unless scaling is borked on GK104*, much more so than on Tahiti vs Pitcairn, GK106 will have a very tough time competing at all. I'd imagine that it will be as fast as the GTX560 Ti, which is fine for a GTX550 replacement (which is what it was supposed to be), but it's a very bad replacement for the GTX560 Ti itself, much less competition for Pitcairn. What will they sell in the $250-350 range? Maybe Nvidia just lost the upper-mainstream on their own.

* And considering what Nvidia achieved with the thing, that's hard to believe.
Yellow&Nerdy?: Nvidia might release GK110 as 7xx series sometime early next year, since they already named GK104 as the top-end GPU for the 6xx series. If the rumored specs are true at all, GK110 will be very fast, but also very hot. But at least they now have several months to tweak the card.
GK110 should be ready for August this year, since it taped out in January/February. That's why the situation with it is so complex. ~5 months for a new gen?
Posted on Reply
#30
Crap Daddy
Benetanegia: What will they sell in the $250-350 range? Maybe Nvidia just lost the upper-mainstream on their own.
I will try and guess: GK104. NV apparently launched some Kepler mobile GPUs today, all based on GK107, ranging from the 660M (enthusiast) to the 640M (performance). I expect at least one GTX670Ti and one GTX670 based on GK104, the first against the 7950 ($350-400) and the second against the 7870 ($300-350).
Posted on Reply
#31
TheoneandonlyMrK
Benetanegia: GK110 should be ready for August this year, since it taped out in January/February. That's why the situation with it is so complex. ~5 months for a new gen?
That's a short timeframe, 5 months, but then that really would be 5 months late. No dig, it is what it is, but that would allow them to appear to the majority as being on their game, so it might be wise.

And IMHO they are only showing part of their hand. I think they have done an amazing job with the silicon they have been handed, but I honestly believe this chip still isn't quite how they wanted it to be, and the enforced power gating and clock control is a very, very clever way of increasing yields. In my head, they had a chip with more shaders per SMX and then have per-shader redundancy, i.e. they can cut out individual shaders, plus the ability to bin on SMX units and forcibly control the power. So I see a lot of high-shader-count card varieties, all called roughly the same thing but clocked very differently: 680 Eco, R, GT, GTX, GS, Ti. Then, with reduced SMX counts, take ten off "680" per SMX and add a prefix per binned clock speed. Less free OC all round, but a good market spread.
Posted on Reply
#32
Yellow&Nerdy?
Benetanegia: GK110 should be ready for August this year, since it taped out in January/February. That's why the situation with it is so complex. ~5 months for a new gen?
Well, since they claim to be disappointed by SI, and were able to claim the top single-GPU position with a more mid-range GK104, why not just delay GK110 until next gen and call it the GTX 780? There's absolutely no reason to release it so soon, and I'm sure it should do pretty well against what will be AMD's next gen, since from what the rumors have said until now, it will just be a tweaked version of SI with no architectural changes or a shrunken manufacturing process. But I guess it's all speculation right now; we'll see what's the case in 5-6 months.
Posted on Reply
#33
Benetanegia
Crap Daddy: I will try and guess: GK104. NV apparently launched some Kepler mobile GPUs today, all based on GK107, ranging from the 660M (enthusiast) to the 640M (performance). I expect at least one GTX670Ti and one GTX670 based on GK104, the first against the 7950 ($350-400) and the second against the 7870 ($300-350).
And GK106 for $200? It could be. But forcing 3 cards out of a chip so small... I don't know if that's a good practice.
Posted on Reply
#34
Inceptor
Hmm...
If GK110 is possibly ready for August, either the life of the 6xx series is extended and lasts well into AMD's 8xxx series lifetime, or the 6xx series is truncated in its card rollout and the 7xx series quickly takes its place at the high end, before the end of the year.
Posted on Reply
#35
Benetanegia
theoneandonlymrk: And IMHO they are only showing part of their hand. I think they have done an amazing job with the silicon they have been handed, but I honestly believe this chip still isn't quite how they wanted it to be, and the enforced power gating and clock control is a very, very clever way of increasing yields. In my head, they had a chip with more shaders per SMX and then have per-shader redundancy, i.e. they can cut out individual shaders, plus the ability to bin on SMX units and forcibly control the power. So I see a lot of high-shader-count card varieties, all called roughly the same thing but clocked very differently: 680 Eco, R, GT, GTX, GS, Ti. Then, with reduced SMX counts, take ten off "680" per SMX and add a prefix per binned clock speed. Less free OC all round, but a good market spread.
The chip is not what they wanted it to be? Wow, AMD can thank God then, because if it had been what Nvidia wanted...

Power control is there from the very beginning, as are the SMXs, their number, the way they are arranged, etc. It's not something you can change on the fly. It is what it was meant to be from the beginning. Except that it launched at 1000 MHz instead of at 950 MHz, which was probably the plan. But then AMD released the "GHz Edition" campaign and probably Nvidia's marketing team started :banghead: "this should have been our idea!!"
Posted on Reply
#36
GC_PaNzerFIN
Yellow&Nerdy?: Well, since they claim to be disappointed by SI, and were able to claim the top single-GPU position with a more mid-range GK104, why not just delay GK110 until next gen and call it the GTX 780? There's absolutely no reason to release it so soon, and I'm sure it should do pretty well against what will be AMD's next gen, since from what the rumors have said until now, it will just be a tweaked version of SI with no architectural changes or a shrunken manufacturing process. But I guess it's all speculation right now; we'll see what's the case in 5-6 months.
Exactly. The only thing NVIDIA needs to top now is the HD 7990, and considering they now have the better perf/W per GPU, it shouldn't be too hard to beat the HD 7990 with 2x GK104.

That leaves no need for GK110 now; it would definitely be logical to sell them as Quadros till there is need for the new 7xx series.
Posted on Reply
#37
Crap Daddy
Benetanegia: And GK106 for $200? It could be. But forcing 3 cards out of a chip so small... I don't know if that's a good practice.
Never mind the good practice. That's what NV has. Strangely enough, I haven't read one leak concerning GK106 and its status. Next in line will most certainly be GK107.

I don't think we'll see any of GK110 until AMD launches the 8000 series. The game is over at the top. Based on what the GTX680 looks like, a 100% gaming card, the big boy is reserved for Teslas, Quadros, and HPC. It might be that NV is making a clearer distinction between professional and gaming.
Posted on Reply
#38
erocker
*
Benetanegia: The problem I see is: what about the rest of the lineup? I'm pretty sure they are going to do the same as they did last generation, so GK106 will be half of what GK104 is and GK107 is 1/4 of GK104 (this has been confirmed). The problem with that is, unless scaling is borked on GK104*, much more so than on Tahiti vs Pitcairn, GK106 will have a very tough time competing at all. I'd imagine that it will be as fast as the GTX560 Ti, which is fine for a GTX550 replacement (which is what it was supposed to be), but it's a very bad replacement for the GTX560 Ti itself, much less competition for Pitcairn. What will they sell in the $250-350 range? Maybe Nvidia just lost the upper-mainstream on their own.

* And considering what Nvidia achieved with the thing, that's hard to believe.
This is why Nvidia will do (hopefully) as they have done in the past and just snip and disable a little here and there on the GK104, and we have a GTX670. :)
Posted on Reply
#39
Crap Daddy
erocker: This is why Nvidia will do (hopefully) as they have done in the past and just snip and disable a little here and there on the GK104, and we have a GTX670.
Make that two GTX670s.
Posted on Reply
#40
TheoneandonlyMrK
Benetanegia: The chip is not what they wanted it to be? Wow, AMD can thank God then, because if it had been what Nvidia wanted...

Power control is there from the very beginning, as are the SMXs, their number, the way they are arranged, etc. It's not something you can change on the fly. It is what it was meant to be from the beginning. Except that it launched at 1000 MHz instead of at 950 MHz, which was probably the plan. But then AMD released the "GHz Edition" campaign and probably Nvidia's marketing team started "this should have been our idea!!"
Though I agree on most of your points, and definitely that AMD got off lightly here, in a way.

But you're definitely wrong about it having to be designed in from the start. The shader-domain-level clocks would have to be designed in, but the GPU power-gated clocks, though most definitely designed in to some degree, can easily be added to any card with the right power control circuitry. This I KNOW.

My only point regarding this was not that they added it, but that they have had to run tighter voltage and temperature control than they expected/would have liked, IMHO.

You could easily have designed in 2056 shaders with some expected to fail in every SMX; given the high speed Nvidia is running the shaders at, this would be intelligent and reasonable design. Redundancy is what engineering does. If you pre-aligned the shaders, you could eliminate particulate damage to a shader array by disabling it along with the fore or aft row of shaders: pre-designed-in redundancy, think 480 but intelligently done, to guarantee/optimize yields. That seems to be their main goal:

create a chip that's capable and very dynamic/flexible.
Posted on Reply
#41
KainXS
Or the card was just so fast they could get away with bumping it up, as it already edges out AMD's current top single-GPU card. This is Nvidia; nothing new here, lol.

Dynamic, yes; flexible, probably not. It doesn't seem like it's another G92.

The truth of the naming should be in the prerelease drivers, though...
Posted on Reply
#42
devguy
My guess is that they'll just add that suffix on for GK110. "GTX 680 TI"
Posted on Reply
#43
TheoneandonlyMrK
devguy: My guess is that they'll just add that suffix on for GK110. "GTX 680 TI"
Probably bang on, as that would be the stupidest name to give a different GPU. Not a dig at you, devguy, I actually agree. And all the lower ones are going to be 680 S/SE/T/G/P versions, right down to 1-SMX cards :roll:
Posted on Reply
#44
alienstorexxx
Could everyone stop talking bullshit? :confused:

What kind of planet do we live on?? Do you really think that nvidia would launch a "gtx 680" (the supposed real one) at 40-60% more performance than last gen? Even this "gtx 670 ti" is almost as powerful as a gtx 590. So, are they some kind of company which totally destroys the last gen of graphics cards on performance, making them totally obsolete?

We live on the kind of planet where money rules, so all you're trying to understand and discuss is a total waste of time.

I think they obviously couldn't compete with amd at the time amd released its cards, so they invented all this fantasy (in case of failure), because they didn't know how fast the architecture could be. Does anybody remember that chart with the 660 Ti being 10% more powerful than the 7970? That's some big kind of shit.

Do you really believe this crappy marketing strategy?

I really love this gtx 680, but all this around it makes it look unprofessional. Too much fanboyism.
Posted on Reply
#45
Jurassic1024
qubit: Yup, I'm sure it was gonna be called the 670 Ti. It's a fantastic card, but no matter how good it is, I want the top-end GPU to make the upgrade itch go away.

Also, you can look at it this way: we've actually got less for our money. This card should sell for significantly less, and the top-end card should sell for this price, which would give much better performance. Instead, nvidia are simply milking it by comparing it to the performance of the competition and putting out a product that just beats it by a reasonable margin, instead of the killer product that would decimate it. Ho-hum, nothing to get excited about.
Except waiting for the GK110 would have meant waiting much longer, and giving AMD free rein on the market. They did release the 7700, 7800, and 7900 series 3 months apart, remember? nVIDIA couldn't afford to wait, and if they did, it would be all over the rumor mill that nVIDIA has problems with 28nm, will it beat the 7970, Fermi Part II, etc, etc. The card is priced as it is for the performance you get, and it's well worth it considering it's $50 cheaper than the card it beats. Get a clue, huh?
Posted on Reply
#46
RejZoR
And AMD originally made HD7320 at first, but it turned out to be so powerful they renamed it to HD7970 instead. Do you guys seriously believe these bedtime stories?
Posted on Reply
#47
LiveOrDie
Did NVIDIA Originally Intend to Call the GTX 680 "GTX 670 Ti"?
Yes, because they saw no threat in AMD's 7970, so they rebranded it as a high-end part.
Posted on Reply
#48
thematrix606
erocker: Considering how many beta/preview drivers they release to the public, they very much care. They seem to work hard at bringing out new drivers to use... I suppose it's the competence of the drivers that I'm concerned with.
Sorry to say, I'm still having issues with 120hz on my 7970... so no matter how many beta drivers they release, if they don't fix anything important, it does not count ;) :banghead:
Posted on Reply
#49
THE_EGG
So when the full-shebang Kepler comes out in August, will it be similar to the GTX 200 vs HD 4000 days, when you paid a f**kload to obliterate the competition?
Posted on Reply
#50
RejZoR
People are exaggerating the whole GTX 680 vs HD7970 deal. The cards are virtually identical, each with its better and worse sides. As for the pricing, AMD set it so high because the competition had nothing to offer. They might decrease prices if sales decline below what they want.
Posted on Reply