
NVIDIA Moving Around the Fabled GeForce GTX TITAN II

The GTX 460's (Fermi) counterpart in Kepler is the GTX 680.

The GTX 480's counterpart in Kepler: N/A, it never existed.

The Fermi refresh GTX 580's counterpart in the Kepler refresh is the GTX 780 Ti.

Not confuzzled enough? Then I'll throw this in too: the GTX 880 is NOT a high-end Maxwell. It's a half-assed attempt at an upper-midrange GPU, hampered by a throwback to the 28 nm process, most likely for the price of a high-end GPU. It might perform better than GK110, but it's not the flagship.
 

What are you talking about?

Wouldn't you think the GTX 480's Kepler replacement is the GTX 780 non-Ti?
 

No, because the GTX 780 was gimped. Nvidia had to dump the defective GK110 chips somehow. They were already paid for.
 

It's kind of intentional, what GPU vendors do. Shall I mention the 7970? No wait, it was gimped too, so they could release a 7970 GHz Edition.

Or is my i7-3930K a gimped i7-3960X? No, it's designed that way to create product segmentation.

Whether we like it or not, silicon built on these processes will always let vendors sell 'castrated' parts to fill out a product line. The 7950/7970 or R9 290/R9 290X is the perfect example of this.

This isn't about Nvidia dumping bad inventory; it's about a company making money by fusing off cores and recouping costs from imperfect wafers. They both do it.
 
Err, big Maxwell? Perhaps it is, but is this any different from the device leaked last week, an A1-stepping part many assumed was GM204?
These PR leaks are a tad vague these days, IMHO. A few sites are also throwing around a 2015 release date; surely that can't be right, especially on this node.
 

Yes this is exactly about Nvidia dumping inferior GPUs. Have you been asleep for 2 years?
 

Dude, have you been asleep for the last 10 years? This stuff has been going on forever.

Honestly, I think what Nvidia does with their GPUs was smart from a business standpoint, at least IMO. For instance, look at the 780 versus the 780 Ti. They released the 780 with a few SMX clusters disabled, waited for AMD to counter, then released a full-fledged GK110 GPU to compete with whatever AMD released (290/290X) and maintain the performance crown.

It's kind of been this way for a long time, and both companies do it.
 
My response to you was rude, the54thvoid. I get carried away with enthusiasm sometimes.
 
Credit for the apology :)
 
Yes this is exactly about Nvidia dumping inferior GPUs. Have you been asleep for 2 years?


GF100 = hot and gimped GTX 480. It was still the best-performing card.
GF110 = GTX 480 perfected, the GTX 580. It was the best-performing card.
GK100 = unproducible at the time, so GK104 becomes the GTX 680 (beats the 7970).
---- 7970 GHz appears, can be seen to beat the GTX 680 ----
GK110 = Titan appears, rules them all at a silly price. Then comes the GTX 780.
---- R9 290X appears, can be seen to best the 780/Titan in many cases ----
GK110 revision B = 780 Ti and Titan Black.

I don't see Nvidia dumping any inferior GPUs. Seriously, if you call GF100, GK110 and GK110(B) inferior GPUs, then wtf have AMD got? 'Inferior' requires a relative product to compare against. NV have been on top for years now (unfortunately for pricing).

When wafers are made there are always poorer-yielding parts at first, until the process is refined or the architecture is tweaked to fit the process. It's not about dumping inferior anything; it's a well-known part of development, and this is a product of it. I know that if I make X wafers, I will not get X fully working parts. Those that don't make the grade become the lower models. AMD and NV both do it. If anything, these are intentionally scaled-down GPUs to fit a known production yield.
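The yield arithmetic above can be sketched with the classic Poisson defect model. All the numbers below are assumptions for illustration only, not actual GK110 figures:

```python
import random

# Hypothetical figures, chosen only to make the point.
DIE_AREA_CM2 = 5.5        # a big ~550 mm^2 GPU die
DEFECT_DENSITY = 0.25     # killer defects per cm^2 (assumed)
DIES_PER_WAFER = 100      # rough dies per 300 mm wafer at this size

def bin_wafer(seed=0):
    """Poisson defect model: defects per die ~ Poisson(area * density).
    0 defects -> full-fat SKU; 1-2 defects -> salvaged by fusing off the
    damaged SMX/cache slice (the cut-down SKU); more -> scrap."""
    rng = random.Random(seed)
    lam = DIE_AREA_CM2 * DEFECT_DENSITY
    bins = {"full": 0, "cut_down": 0, "scrap": 0}
    for _ in range(DIES_PER_WAFER):
        # Draw a Poisson count by counting unit-rate arrivals before lam.
        defects, t = 0, rng.expovariate(1.0)
        while t < lam:
            defects += 1
            t += rng.expovariate(1.0)
        if defects == 0:
            bins["full"] += 1
        elif defects <= 2:
            bins["cut_down"] += 1
        else:
            bins["scrap"] += 1
    return bins

# Expected full-die yield is exp(-lam), about 25% with these numbers: most
# good dies come off the wafer already needing a unit fused off, which is
# why the cut-down card exists from day one rather than as a "dump".
print(bin_wafer())
```

With numbers anywhere near these, the cut-down bin outnumbers the full-fat bin, so a 780-style salvage part is planned into the product stack from the start.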

The only true dumping that gets done is when both companies rebrand last-gen cards as lower-tier current-gen parts. That's intentional dumping.

But hey, after having had (over the past two years) 2 x 7970s, a Titan and a 780 Ti, I'm pissed to know I've had inferior GPUs.

But tbh, you're probably making the same point I am in different words. I'm viewing it as a production-process/marketing necessity and seeing it from a neutral stance (both camps do it). You seem to imply NV alone is doing it, which would be very wrong.

My response to you was rude, the54thvoid. I get carried away with enthusiasm sometimes.
EDIT: No need to apologise :toast:
 
You're correct; it's been the same thing for a long time now, and I doubt that will change anytime soon. They develop a new architecture and push it as far as they can; the next year they improve and refine said architecture and push it to its limits. Rinse and repeat.

Although the 7970 and 7970 GHz Edition are the same card, AMD just bumped the core clocks on the reference models.

I don't think either company comes out smelling like roses in either instance.
 
This binning malarkey is a lot more complex and nuanced than you'd think. It's no coincidence that chip makers release what's essentially a slightly damaged ASIC (yes, with units fused off) as their top SKU; they sometimes skim the top of the yield for a fair while before they have anywhere near enough chips for a SKU launch.
It's the intelligent way of hauling success out of assured failure, and every chip company does it (all have a failure rate, too), but they don't always release in the same way.
 
Well, I learned my lesson last year. If there is a Maxwell Titan part, I'll steer well clear of it, knowing there's a bloody good chance that 2-3 months later a non-compute part will be released for hundreds of pounds/dollars less.

There is a slim chance all the very early rumours are wrong, though, and GM204 is not the 880. After all, GK104 became the GTX 680 because Nvidia knew they could make it the top-tier part; GK100 was canned. Compare with GF100/110 and GF104. Kepler was 'out of order' (in both manufacturing and price!).

Perhaps GM204 is mainstream and GM200 is the 880. And they can make more money off a compute GM200 part with more memory. After all, GK110 is the Titan, 780, 780 Ti and Titan Black architecture; GM200 doesn't mean Titan alone.

I think the next archs from Nvidia and AMD should make 4K gaming on a single card a reality, though, albeit only just.

Doesn't Kepler support 4K over its HDMI output?
 

You mean the color degradation trick they use to save on bandwidth?

I find it pointless to spend X amount of money on a GPU and a 4K monitor only to gimp it. People do stranger things, though.
 
The only thing that makes me mad about the 680 fiasco is the fact that it was supposed to be the 670ti but AMD screwed up the 7970 bad enough to let them use it as the high end. The true 680/680ti would've smashed the 7970 into the ground. Imagine if AMD had been successful and had the 290x as the 7970 at $500 and the 780ti as 680ti at $700 back in 2012. Grr.
(And I'm an AMD guy)
 
Doesn't Kepler support 4K over its HDMI output?

Yes, but what I meant was that no single GPU is powerful enough to run 4K at max settings with 'high' frame rates.

You mean the color degradation trick they use to save on bandwidth?

I find it pointless to spend X amount of money on a GPU and a 4K monitor only to gimp it. People do stranger things, though.

http://www.anandtech.com/show/8191/nvidia-kepler-cards-get-hdmi-4k60hz-support-kind-of

Both AMD's and Nvidia's cards support full-quality 4K at 30 Hz over HDMI. Nvidia have simply found a way to stream it at 60 Hz. Of course it's a reduced-quality image; that's an HDMI bandwidth limitation. You make it sound as if Nvidia are making up for a gimped card.

Both AMD and Nvidia use DisplayPort for 60 Hz.
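The bandwidth limitation checks out with back-of-envelope arithmetic. The figures below are approximate, and the ~10% blanking overhead is an assumption:

```python
# HDMI 1.4 TMDS: 340 MHz x 3 lanes x 10 bits = 10.2 Gbit/s on the wire,
# of which 80% is payload after 8b/10b encoding -> ~8.16 Gbit/s of video.
HDMI14_VIDEO_GBPS = 10.2 * 0.8

def video_rate_gbps(width, height, fps, bits_per_pixel, blanking=1.1):
    """Video data rate in Gbit/s; `blanking` is an assumed ~10% overhead
    for the non-visible portion of the video timing."""
    return width * height * fps * bits_per_pixel * blanking / 1e9

rgb_4k60 = video_rate_gbps(3840, 2160, 60, 24)     # 8-bit RGB / YCbCr 4:4:4
yuv420_4k60 = video_rate_gbps(3840, 2160, 60, 12)  # chroma-subsampled 4:2:0

# ~13 Gbit/s of full RGB doesn't fit in ~8 Gbit/s, but ~6.6 Gbit/s of
# 4:2:0 does: hence the reduced-chroma trick at 60 Hz, full quality at 30 Hz.
print(rgb_4k60, yuv420_4k60, HDMI14_VIDEO_GBPS)
```

The same arithmetic shows why DisplayPort 1.2, with roughly 17 Gbit/s of payload, carries uncompromised 4K60.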
 
The only thing that makes me mad about the 680 fiasco is the fact that it was supposed to be the 670ti but AMD screwed up the 7970 bad enough to let them use it as the high end. The true 680/680ti would've smashed the 7970 into the ground. Imagine if AMD had been successful and had the 290x as the 7970 at $500 and the 780ti as 680ti at $700 back in 2012. Grr.
(And I'm an AMD guy)

Because NV had projected they were behind, but it turned out that the 7970 vs the "660 Ti" was a fair fight. What difference does it make? Neither the 290X nor the Titan was ready to go from either side.
 

It's a shared fault between Nvidia and AMD: the underperforming Tahiti-based "7900" series cards should have been downgraded to the 7800 series, but I guess marketing greed doesn't stop at anything... :rolleyes:
 
The only thing that makes me mad about the 680 fiasco is the fact that it was supposed to be the 670ti but AMD screwed up the 7970 bad enough to let them use it as the high end. The true 680/680ti would've smashed the 7970 into the ground. Imagine if AMD had been successful and had the 290x as the 7970 at $500 and the 780ti as 680ti at $700 back in 2012. Grr.
(And I'm an AMD guy)
7970 was a pretty good card though, 6970 to 7970 was a big jump.
 

Not as big as the gigantic performance jump between the Fermi and Kepler (GK104) mid-range cards. ;)

It was a marketing fail no matter how you try to justify it. :)
 
The only thing that makes me mad about the 680 fiasco is the fact that it was supposed to be the 670ti but AMD screwed up the 7970 bad enough to let them use it as the high end. The true 680/680ti would've smashed the 7970 into the ground. Imagine if AMD had been successful and had the 290x as the 7970 at $500 and the 780ti as 680ti at $700 back in 2012. Grr.
(And I'm an AMD guy)
That doesn't make any sense. What you're saying is that the 7970, which was a big jump over the 6970, was a failure, and that Nvidia intentionally held back a more powerful card because of this, instead releasing a less powerful card that actually lost to the 7970 at the end of the day, in favour of waiting? If either team could have released their top-tier GPU, I guarantee they would have, and just charged a fortune over the rival. I also don't see how this is much different from how releases have been done in the past.

Also, I'm getting confused, because this thread is supposed to be about a rumoured Titan II. Why are we stuck debating the 680 and 7970 at this point...
 
NVIDIA will see in 2015 what it means to aim too high with prices.
A lot of people no longer see the value at those prices. A lot of people have GTX 780 Ti SLI, Titan SLI or Titan Black SLI, very strong setups, and if NVIDIA shows up with something at $1000 while AMD offers something 5% weaker for $600, I think that would be one bad year for NVIDIA.
GF110 cost $500-600, GK110 $700-1000, and now people expect GM210 at more than $1000.
I think NVIDIA shouldn't ask more than $800 for a premium card; depending on when they launch and how they price it, more or fewer people will buy.
That chip would be my next choice, but it depends on the price. I will not look at locked cards like the GTX 780 any more.
I think this time the Titan II will not be a reference-only card. We can maybe expect something like an EVGA Classified with an extreme factory OC and the ACX cooler.
Then I would only have to OC a Haswell-E, because the GPU would already be factory-overclocked.
 
All right, so the current iteration of 4K over HDMI is limited by HDMI bandwidth. If the quality is lower, how much lower? Couldn't the same be said of a lot of texture files because of compression? E.g. the whole id Tech 5 megatexture "consolitis" debate.
 
That doesn't make any sense. What you're saying is that the 7970, which was a big jump over the 6970, was a failure, and that Nvidia intentionally held back a more powerful card because of this, instead releasing a less powerful card that actually lost to the 7970 at the end of the day, in favour of waiting? If either team could have released their top-tier GPU, I guarantee they would have, and just charged a fortune over the rival. I also don't see how this is much different from how releases have been done in the past.

Also, I'm getting confused, because this thread is supposed to be about a rumoured Titan II. Why are we stuck debating the 680 and 7970 at this point...
Obviously 'cos AMD; if only AMD got their shit sorted, then world hunger would be a thing of the past and Nvidia would make a GTX 880 that's 300% better in every way yet cheaper than toast.
 