# NVIDIA Moving Around the Fabled GeForce GTX TITAN II



## btarunr (Jul 11, 2014)

NVIDIA is moving around engineering samples of what its shipping manifests describe as a "GM200 A1 graphics processor." The sample was making its way from Taiwan to Bangalore, India, from where it is likely headed to the company's facilities in Bangalore and Hyderabad. A1 steppings of NVIDIA chips are usually pre-production parts, bound for just a few more rounds of testing before being upgraded to "A2" and mass-produced. German tech site 3DCenter.org also pulled some likely specifications from its sources.

To begin with, the GM200, like the GM204, will be built on the existing 28 nm silicon fabrication process, as both NVIDIA and AMD appear to have suffered design setbacks due to their common foundry partner, TSMC, not being able to bring its next-gen 20 nm node up to speed in time. The GM200 is expected to feature over 4,000 CUDA cores, although the exact number is unknown. It is also expected to widen the memory bus to 512-bit. Given the existing process, the GPU will be huge. Over 600 mm² huge. NVIDIA will probably bank on the energy efficiency of its "Maxwell" architecture to cope with the thermal load put out by a chip that big. The GM200-based "GeForce GTX TITAN II" could launch in the first half of 2015.





*View at TechPowerUp Main Site*


----------



## VulkanBros (Jul 11, 2014)

I don't even want to know the price of this thing - twice the price of the original Titan.


----------



## ZoneDymo (Jul 11, 2014)

And it will be about 1500 dollars


----------



## Sony Xperia S (Jul 11, 2014)

ZoneDymo said:


> And it will be about 1500 dollars





VulkanBros said:


> I don't even want to know the price of this thing - twice the price of the original Titan.



There is no sane reason or argument for it to be $1,500.

The original TitanIC was built on an earlier version of the lithography process, so yields should have improved by now, which means production costs are much, much lower, even before the insane margins they put on top.

If they really set it at that price tag, then I do not want to know who will give them what they ask.

Be very careful, because this policy leads to disaster - ever-growing prices, so at some point virtually no one will be able to buy, and they will only be selling to themselves.  ahahaha


----------



## ZoneDymo (Jul 11, 2014)

Sony Xperia S said:


> There is no sane reason or argument for it to be $1,500.
> 
> The original TitanIC was built on an earlier version of the lithography process, so yields should have improved by now, which means production costs are much, much lower, even before the insane margins they put on top.
> 
> ...



It's a company that wants to make as much profit as it can; there's your reason.
It's not really about production cost at all; if that were what product prices were based on, we would be paying SOOOOO much less for everything.
The "at some point virtually no one will be able to buy" bit does not apply here, as Nvidia (and AMD) have a wide range of cards, and most of those will be much more affordable.

Titan is like Intel's 1000+ dollar Extreme series.


----------



## Sempron Guy (Jul 11, 2014)

You can always try Nvidia, just use the right marketing for it this time


----------



## the54thvoid (Jul 11, 2014)

Well, I learned my lesson last year.  If there is a Maxwell Titan part I'll steer well clear of it, knowing there's a bloody good chance that 2-3 months later a non-compute part will be released for hundreds of pounds/dollars less.

There is a slim chance all the very early rumours are wrong though and GM204 is not the 880.  After all, GK104 became the GTX 680 because Nvidia knew they could make it the top-tier part.  GK100 was canned.  Compare that with GF100/110 and GF104.  Kepler was 'out of order' (in both manufacturing and price!)

Perhaps GM204 is mainstream and GM200 is the 880.  And they can make more money off a compute GM200 part with more memory.  After all, GK110 underpins the Titan, 780, 780 Ti and Titan Black.  GM200 doesn't mean Titan alone.

I think the next archs from Nvidia and AMD should make 4K gaming on a single card a reality though, albeit just and no more.


----------



## Kissamies (Jul 11, 2014)

The die size would be large as hell; the 65 nm GT200 is the largest that comes to mind, and that's already 576 mm².


----------



## HumanSmoke (Jul 11, 2014)

Sony Xperia S said:


> There is no sane reason or arguments for it to be $1500.


It probably won't be... because the first GM200 products will likely follow the GK110 model: Pro (Tesla) boards first. With Intel giving away KNL dev kits to anyone who can spell c-o-d-i-n-g, let alone actually do it, staying on top of the HPC market is more of an imperative, I would have thought.


Sony Xperia S said:


> The original TitanIC was built on an earlier version of the lithography process, so yields should have improved by now, which means production costs are much, much lower, even before the insane margins they put on top.


The GTX Titan Z also uses fairly new (GK110-350) B1 silicon on an established process. Pricing will be whatever Nvidia deem they can get away with charging for the number of boards they hope to sell/have in inventory. If the company is serious about moving quantity, then they will be priced accordingly. If they intend to just drip-feed them into the channel with barely enough quantity to stay relevant, then the price is largely irrelevant.

BTW: even if the GM200 scales just 1 mm larger on a side than GK110 (600 mm² vs. 551 mm²), that still equates to a loss of 20% in the number of die candidates per 300 mm wafer compared to GK110.


Sony Xperia S said:


> Be very careful because this policy leads to a disaster - ever growing prices, so at some point virtually no one will be able to buy, only they will buy it from themselves.  ahahaha


If the big GPU sales to desktop buyers impacted the overall revenue stream to any large degree then I'd agree. While the volume mid/small GPU cards tend to get regular price adjustments, the same isn't usually the case for the high end low volume market (for either vendor I might add). That in itself should be an indicator.


9700 Pro said:


> The die size would be large as hell; the 65 nm GT200 is the largest that comes to mind, and that's already 576 mm².


576 mm² equates to 24 × 24 mm; 600 mm² equates to 24.49 × 24.49 mm. More of an issue when you map them out on a wafer: those half-millimetres add up. BTW: GT200 isn't the largest GPU by a long shot. The original Larrabee was estimated at ~650-700 mm², and even the in-production Xeon Phi GPU is estimated to be not much smaller. Intel aren't shy about specifications in general: there isn't an Intel CPU that hasn't had its die size publicized. I'd challenge anyone to find the die size and transistor count specification of a Xeon Phi GPU.
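The die-size comparisons being thrown around here can be sanity-checked with the standard first-order die-per-wafer approximation, dies ≈ π(d/2)²/A − πd/√(2A). This is only a rough sketch: it ignores scribe lines, edge exclusion and (crucially) defect yield, so real foundry numbers will differ.

```python
import math

def die_candidates(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """First-order estimate of square die candidates on a round wafer.

    dies ~= pi*(d/2)^2 / A  -  pi*d / sqrt(2*A)
    The first term is how many dice fit by area alone; the second
    approximates partial dice lost around the wafer edge.
    """
    r = wafer_diameter_mm / 2.0
    area_term = math.pi * r * r / die_area_mm2
    edge_term = math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2)
    return int(area_term - edge_term)

# GK110 (~551 mm^2, ~23.5 mm per side) vs. a hypothetical ~600 mm^2 GM200
gk110 = die_candidates(300, 551)
gm200 = die_candidates(300, 600)
print(gk110, gm200, f"{100 * (gk110 - gm200) / gk110:.1f}% fewer candidates")
```

With these toy numbers the first-order loss comes out below the 20% quoted above; defect yield hits a bigger die disproportionately hard, which is presumably where the larger figure comes from.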


----------



## NC37 (Jul 11, 2014)

the54thvoid said:


> Well, I learned my lesson last year.  If there is a Maxwell Titan part I'll steer well clear of it, knowing there will be a bloody good chance 2-3 months later a non compute part would be released for hundreds of pounds/dollars less.
> 
> There is a slim chance all the very early rumours are wrong though and GM204 is not 880.  After all, GK104 was GTX680 because Nvidia knew they could make it the top tier part.  GK100 was canned.  Compare with GF100/110 and GF104.  Kepler was 'out of order' (in both manufacturing and price!)
> 
> ...



Nah, I doubt it. The 880 will likely be the 204.

If only everyone could learn that lesson... we'd have another GTX 460 era on us. Powerful and affordable GPUs which scale wonderfully with SLI.

I may buy NV GPUs more often than AMD, but at times like this I root for AMD. NV is getting away with it and they know they can. Course AMD isn't helping things. They pull the same tricks.


----------



## Roel (Jul 11, 2014)

This is surely interesting. I was aiming for the 880, but the GM200 seems significantly faster if these rumors are true: at least 25% more CUDA cores than GM204 (the 880 was expected to have 3,200 cores), and a massive 512-bit memory bus vs. 256-bit, so it will probably carry 8 GB vs. 4 GB. I am going to wait until more is known before making any decisions, and see if the price difference will be worth it compared to how long the card will last me.


----------



## techy1 (Jul 11, 2014)

ZoneDymo said:


> Titan is like Intel's 1000+ dollar Extreme series.


NO. The Titan in the CPU world would be like an AMD FX "gold" (or something) edition that is barely better than a regular FX, worse than the competition, but costs $1,000+.


----------



## ZoneDymo (Jul 11, 2014)

techy1 said:


> NO. The Titan in the CPU world would be like an AMD FX "gold" (or something) edition that is barely better than a regular FX, worse than the competition, but costs $1,000+.



Which is pretty much the same as Intel's Extreme editions


----------



## Xzibit (Jul 11, 2014)

HumanSmoke said:


> I'd challenge anyone to find the die size and transistor count specification of a Xeon Phi GPU.



*TechPowerUp - Intel Xeon Phi SE10X*


----------



## HumanSmoke (Jul 11, 2014)

Xzibit said:


> *TechPowerUp - Intel Xeon Phi SE10X*


That number, 350 mm², is a guess I'm picking on because it crops up as anecdotal "evidence" without documentation, and as a guess it seems a bad one at that. Dave Kanter's take is:


> The total size of the LLC should be around 144MB or 2MB/core, which corresponds to roughly 250mm2 on a 14nm process. While this is fairly large, *Knights Landing is probably a 700mm2 device*, and spending about a third of the die area on cache is a very reasonable design choice as discussed in an earlier article on microservers. Assuming 256KB L2 cache per core, the inclusive LLC is unlikely to be any smaller than 2MB/core to maintain an 8:1 capacity ratio.


Even Charlie D, the Intel shill, pegs the 350 mm² figure as a fairy story, so I wouldn't place too much stock in it.

BTW: the Xeon Phi review in the db leads to a CPU review!?!


----------



## techy1 (Jul 11, 2014)

ZoneDymo said:


> Which is pretty much the same as Intel's Extreme editions



but Intel's Extreme editions at least are NOT worse than the competition, whereas all of the Titans are


----------



## GhostRyder (Jul 11, 2014)

Sony Xperia S said:


> There is no sane reason or arguments for it to be $1500.
> 
> The original TitanIC was built on an earlier version of the lithography process, so yields should have improved by now, which means production costs are much, much lower, even before the insane margins they put on top.
> 
> ...


If someone buys it, there is a sane reason from their perspective; it will probably be $1,000 minimum if we are to take anything from the recent Z release.

It's still so far off that it does not matter at this point beyond a first glimpse, if this turns out to be it.  I mean, in all honesty, 2015 is a good while off, and this has a vague first-half estimate which can always change.  I just hope this time around they release the 880 first, instead of trying to push the Titan off on people and then releasing the X80 model.


----------



## the54thvoid (Jul 11, 2014)

techy1 said:


> but Intel's Extreme editions at least are NOT worse than the competition, whereas all of the Titans are



No, the Titan Black is the fastest single-GPU consumer graphics card (stock comparison to stock 780 Ti).  The i7-3960X CPU from Intel is not a huge amount faster than the i7-3930K, but it is/was twice the price (way back when I bought mine).  It's actually a very good analogy: a premium product at an inflated premium price.  Doesn't make it a rational scaling with performance, but that doesn't seem to matter at that end of the market.


----------



## Patriot (Jul 11, 2014)

Sony Xperia S said:


> There is no sane reason or arguments for it to be $1500.
> 
> The original TitanIC was built on an earlier version of the lithography process, so yields should have improved by now, which means production costs are much, much lower, even before the insane margins they put on top.
> 
> ...



There is no sane reason the Titan-Z was $3K... Don't try to limit Nvidia's actions by reason... it just won't work.


----------



## the54thvoid (Jul 11, 2014)

Patriot said:


> There is no sane reason the Titan-Z was 3k



Correct.  It's an embarrassment - which is a shame as it's actually a decent card. http://www.hardwarecanucks.com/foru.../66869-nvidia-titan-z-performance-review.html



Patriot said:


> Don't try to limit nvidia's actions by reason... it just won't work.



It will for most products.  Why the GTX 680 was released at the price point it was, why the Titan was released, etc.  It all makes perfect _business_ sense.


----------



## GhostRyder (Jul 11, 2014)

the54thvoid said:


> Correct.  It's an embarrassment - which is a shame as it's actually a decent card. http://www.hardwarecanucks.com/foru.../66869-nvidia-titan-z-performance-review.html


Agreed; for all the situations where you would want the card, they left an opening for a better alternative, which I believe was the overall problem with the Titan-Z.



the54thvoid said:


> It will for most products.  Why GTX680 was released and at the price point it was, why Titan was released etc.  All makes perfect _business_ sense.



I think everything is just going up in price; it's actually getting harder and harder to be a PC gamer in this day and age.  I still remember when spending 100 bucks on a GPU got me a great, high-performance experience in BF2 and 2142, something that has all but disappeared these days.  Hopefully we will see some improvements to the pricing structures down the line at some point, but either way it will take time.


----------



## MxPhenom 216 (Jul 11, 2014)

Sempron Guy said:


> You can always try Nvidia, just use the right marketing for it this time


 
lol, I think their marketing for the first Titan was fine. That card sold like hot cakes.


----------



## Ahhzz (Jul 11, 2014)

Sony Xperia S said:


> There is no sane reason or arguments for it to be $1500.
> 
> .....


Nvidia. 'Nuff Said.


----------



## MxPhenom 216 (Jul 11, 2014)

It amazes me how quickly threads that involve the "Titan" escalate into a pissing match about price. IMO that horse has been beaten enough, as has the pricing of GPUs these days in general.


----------



## xorbe (Jul 11, 2014)

MxPhenom 216 said:


> lol, I think their marketing for the first Titan was fine. That card sold like hot cakes.



The price point was okay for the high rollers for the performance offered 18 months ago.  It's still a great card; in $/year it looks like it's going to last quite a while, and it was an acceptable deal, in fact, for having had 6 GB.  NVidia is going to be competing with themselves, attempting to convince existing Titan owners that they need a new card.  Gonna be a pretty tough sell!  And we all know the non-Titan version will be along soon after.


----------



## 64K (Jul 11, 2014)

The GTX 460's Kepler counterpart is the GTX 680.

The GTX 480's Kepler counterpart? N/A. Never existed.

The Fermi refresh GTX 580's counterpart is the Kepler refresh GTX 780 Ti.

Not confuzzled enough? Then I will throw this in too: the GTX 880 is NOT HIGH-END MAXWELL. It's a half-assed attempt at an upper-midrange GPU, hampered by a throwback to the 28 nm process, for the price of a high-end GPU most likely. It might perform better than the GK110, but it's not the flagship.


----------



## MxPhenom 216 (Jul 11, 2014)

64K said:


> GTX 460 Fermi counterpart goes to Kepler GTX 680
> 
> GTX 480 Fermi counterpart goes to Kepler N/A Never existed.
> 
> ...



What are you talking about?

Wouldn't you think the GTX480 Kepler replacement is the GTX780 non Ti?


----------



## 64K (Jul 11, 2014)

MxPhenom 216 said:


> What are you talking about?
> 
> Wouldn't you think the GTX480 Kepler replacement is the GTX780 non Ti?



No, because the GTX 780 was gimped. Nvidia had to dump the defective GK110 chips somehow; they were already paid for.


----------



## the54thvoid (Jul 11, 2014)

64K said:


> No, because the GTX 780 was gimped. Nvidia had to dump the defective GK110 chips somehow. They were already paid for.



It's kind of intentional, what GPU vendors do.  Shall I mention the 7970?  No wait, it was gimped too, so they could release a 7970 GHz Edition.

Or is my i7-3930K a gimped i7-3960X?  No, it's designed that way to create product segmentation. 

Whether we like it or not, chip manufacturing will always allow vendors to sell 'castrated' parts to create a product series.  The 7950/7970 or R9 290/290X is the _perfect example_ of this.

This isn't about Nvidia dumping bad inventory; it's about a company making money by fusing off cores and reducing costs on imperfect wafers.  They both do it.


----------



## TheoneandonlyMrK (Jul 11, 2014)

Err, big Maxwell? Perhaps it is. Is this different from the device leaked last week, an A1-stepped part many assumed was GM204?
These PR leaks are a tad vague these days, IMHO. Also, a few sites are throwing around a 2015 release date; now surely that can't be right, especially on this node.


----------



## 64K (Jul 11, 2014)

the54thvoid said:


> It's kind of intentional what GPU vendors do.  Shall I mention the 7970?  No wait, it was gimped too so they could release a 7970 Ghz edition.
> 
> Or my i7 3930k is a gimped i7 3960?  No, it's designed that way to create a product division.
> 
> ...



Yes, this is exactly about Nvidia dumping inferior GPUs. Have you been asleep for 2 years?


----------



## MxPhenom 216 (Jul 11, 2014)

64K said:


> Yes this is exactly about Nvidia dumping inferior GPUs. Have you been asleep for 2 years?



Dude, have you been asleep for the last 10 years? This stuff has been going on forever.

Honestly, I think what Nvidia does with their GPUs, at least IMO, is smart from a business standpoint. For instance, look at the 780 and 780 Ti. They release the 780 with a few SMX clusters disabled, wait for AMD to counter, then release a full-fledged GK110 GPU to compete with whatever AMD releases (290/290X) to maintain the performance crown. 

It's kind of been this way for a long time, and both companies do it.


----------



## 64K (Jul 11, 2014)

My response to you was rude, the54thvoid. I get carried away with enthusiasm sometimes.


----------



## Ahhzz (Jul 11, 2014)

64K said:


> My response to you was rude the54thvoid. I get carried away with enthusiasm sometimes.


credit for the apology


----------



## the54thvoid (Jul 11, 2014)

64K said:


> Yes this is exactly about Nvidia dumping inferior GPUs. Have you been asleep for 2 years?




GF100 = hot and gimped GTX 480.  It was still the best performing card.
GF110 = GTX 480 perfected, the GTX 580.  It was the best performing card.
GK100 = unproducible at the time, so GK104 becomes the GTX 680 (beats the 7970).
---- 7970 GHz appears, can be seen to beat the GTX 680 ----
GK110 = Titan appears, rules them all at a silly price.  Then comes the GTX 780.
---- R9 290X appears, can be seen to best the 780/Titan in many cases. ----
GK110 revision B = 780 Ti and Titan Black.

I don't see Nvidia dumping any inferior GPUs.  Seriously, if you call GF100, GK110 and GK110(B) inferior GPUs, then wtf have AMD got? 'Inferior' requires a relative product to be compared against.  NV have been on top for years now (unfortunately for pricing).

When wafers are made there are always poorer-yielding parts at first, until the process is refined or the arch is tweaked to fit the process.  It's not about dumping inferior anything.  It's a known development process, and this is a product of it.  I know that if I make 'X' wafers, I will not get 'X' 100% parts.  Those that don't make the grade become the lower models.  AMD and NV both do it.  If anything, they are intentionally scaled-down GPUs to fit a known production yield.

The only true dumping that gets done is when both companies rebrand last-gen cards as lower-tier current gen.  That's intentional dumping.

But hey, after having had (over the past two years) 2 x 7970s, a Titan and a 780 Ti, I feel pissed to know I've had inferior GPUs.

But tbh, you're probably making the same point I am, just with different semantics.  I'm viewing it as a production process/marketing necessity and seeing it from a neutral stance (both camps do it).  You seem to imply NV alone is doing it - which would be very wrong.



64K said:


> My response to you was rude the54thvoid. I get carried away with enthusiasm sometimes.


EDIT: No need to apologise


----------



## GhostRyder (Jul 11, 2014)

the54thvoid said:


> GF100 = hot and gimped GTX480.  It still was the best performing card.
> GF110 = GTX480 perfected, the GTX580.  It was the best performing card.
> GK100 = Unproducable at the time.  so GK104 becomes GTX680 (beats 7970)
> ---- 7970 Ghz appears, can be seen to beat GTX680----
> ...


You're correct; it's been the same thing for a long time now, and I doubt that will change anytime soon.  They develop a new architecture and push it as far as they can; the next year they improve and refine said architecture and push it to its limits. Rinse and repeat.

Although the 7970 and 7970 GHz Edition are the same card; AMD just bumped the core clocks on the reference models.

I don't think either company comes out smelling like roses in either instance.


----------



## TheoneandonlyMrK (Jul 11, 2014)

This binning malarkey is a lot more complex and nuanced than you'd think; it's no coincidence that chip makers release what's essentially a slightly damaged ASIC (yes, fused off) as their top SKU. They sometimes skim the top for a fair while before they have anywhere near enough chips for a SKU release. 
It is the intelligent way of hauling success out of assured failure, and every chip company does it (all have a failure rate too), but they don't always release in the same way.


----------



## newconroer (Jul 11, 2014)

the54thvoid said:


> Well, I learned my lesson last year.  If there is a Maxwell Titan part I'll steer well clear of it, knowing there will be a bloody good chance 2-3 months later a non compute part would be released for hundreds of pounds/dollars less.
> 
> There is a slim chance all the very early rumours are wrong though and GM204 is not 880.  After all, GK104 was GTX680 because Nvidia knew they could make it the top tier part.  GK100 was canned.  Compare with GF100/110 and GF104.  Kepler was 'out of order' (in both manufacturing and price!)
> 
> ...



Doesn't Kepler support 4K over its HDMI output?


----------



## Xzibit (Jul 12, 2014)

newconroer said:


> Doesn't Kepler support 4K over its HDMI output?



You're referring to the color-degradation trick they use to save on bandwidth?

I find it pointless to spend X amount of money on a GPU and a 4K monitor only to gimp it.  People do stranger things though.


----------



## bubbleawsome (Jul 12, 2014)

The only thing that makes me mad about the 680 fiasco is the fact that it was supposed to be the 670 Ti, but AMD screwed up the 7970 badly enough to let them use it as the high end. The true 680/680 Ti would've smashed the 7970 into the ground. Imagine if AMD had been successful, and we'd had the 290X as the 7970 at $500 and the 780 Ti as the 680 Ti at $700 back in 2012. Grr.
(And I'm an AMD guy)


----------



## the54thvoid (Jul 12, 2014)

newconroer said:


> Doesn't Kepler support 4K over its HDMI output?



Yes, but what I meant was that no single GPU is powerful enough to run 4K at max settings with 'high' frame rates.



Xzibit said:


> You referring to the color degradation trick they use to save on bandwidth.
> 
> Find it pointless to spend X amount of money on a GPU and a 4K monitor only to gimp it.  People do stranger things though.



http://www.anandtech.com/show/8191/nvidia-kepler-cards-get-hdmi-4k60hz-support-kind-of

Both AMD and Nvidia's cards support 4K at 30 Hz over HDMI.  Nvidia has simply found a way to stream it at 60 Hz.  Of course it's a reduced-quality image; it's an HDMI bandwidth limitation.  You make it sound as if Nvidia is making up for a gimped card.  

Both AMD and Nvidia use DisplayPort for 60 Hz.
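The bandwidth limitation here is simple arithmetic. A rough back-of-the-envelope sketch (using the commonly cited HDMI 1.4 figure of ~8.16 Gbit/s of effective video data after 8b/10b coding, and ignoring blanking-interval overhead):

```python
def video_gbps(width: int, height: int, hz: int, bits_per_pixel: int) -> float:
    """Raw pixel data rate in Gbit/s (ignores blanking overhead)."""
    return width * height * hz * bits_per_pixel / 1e9

# ~10.2 Gbit/s TMDS minus 8b/10b coding overhead
HDMI_14_DATA_GBPS = 8.16

rgb = video_gbps(3840, 2160, 60, 24)     # full 8-bit RGB 4:4:4
yuv420 = video_gbps(3840, 2160, 60, 12)  # 8-bit YCbCr 4:2:0, chroma subsampled

print(f"4K60 RGB:   {rgb:.1f} Gbit/s")     # well above the HDMI 1.4 budget
print(f"4K60 4:2:0: {yuv420:.1f} Gbit/s")  # fits, at reduced chroma quality
```

Full 4K60 RGB doesn't fit in an HDMI 1.4 link, while halving the per-pixel data via 4:2:0 chroma subsampling does; that's the "color degradation trick" in a nutshell, and why DisplayPort (with far more bandwidth) does 4K60 without it.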


----------



## roki977 (Jul 12, 2014)

That thing will cost so much that it will be a joke..


----------



## xorbe (Jul 12, 2014)

bubbleawsome said:


> The only thing that makes me mad about the 680 fiasco is the fact that it was supposed to be the 670ti but AMD screwed up the 7970 bad enough to let them use it as the high end. The true 680/680ti would've smashed the 7970 into the ground. Imagine if AMD had been successful and had the 290x as the 7970 at $500 and the 780ti as 680ti at $700 back in 2012. Grr.
> (And I'm an AMD guy)



Because NV had projected they were behind, but it turns out 7970 vs. "660 Ti" was a fair fight.  What difference does it make?  Neither the 290X nor the Titan was ready to go, on either side.


----------



## Sony Xperia S (Jul 12, 2014)

xorbe said:


> Because NV had projected they were behind, but turns out 7970 vs "660Ti" was a fair fight.  What difference does it make?  Neither 290x or Titan were ready to go from either side.



It's a shared fault between Nvidia and AMD; the underperforming Tahiti-based "7900" series cards should have been downgraded to the 7800 series, but I guess marketing greed doesn't stop at anything...


----------



## Vario (Jul 12, 2014)

bubbleawsome said:


> The only thing that makes me mad about the 680 fiasco is the fact that it was supposed to be the 670ti but AMD screwed up the 7970 bad enough to let them use it as the high end. The true 680/680ti would've smashed the 7970 into the ground. Imagine if AMD had been successful and had the 290x as the 7970 at $500 and the 780ti as 680ti at $700 back in 2012. Grr.
> (And I'm an AMD guy)


The 7970 was a pretty good card though; 6970 to 7970 was a big jump.


----------



## Sony Xperia S (Jul 12, 2014)

Vario said:


> 7970 was a pretty good card though, 6970 to 7970 was a big jump.



Not as big as the gigantic performance jump between the Fermi and Kepler (GK104) mid-range cards. 

It was a marketing fail no matter how you try to justify it.


----------



## GhostRyder (Jul 12, 2014)

bubbleawsome said:


> The only thing that makes me mad about the 680 fiasco is the fact that it was supposed to be the 670ti but AMD screwed up the 7970 bad enough to let them use it as the high end. The true 680/680ti would've smashed the 7970 into the ground. Imagine if AMD had been successful and had the 290x as the 7970 at $500 and the 780ti as 680ti at $700 back in 2012. Grr.
> (And I'm an AMD guy)


That does not make any sense. What you're saying is that the 7970, which was a big jump from the 6970, was a failure, and that NVidia intentionally held back a more powerful card because of it, instead releasing a less powerful card that actually lost to the 7970 at the end of the day?  If either team could have released their top-tier GPU, I guarantee they would have, and just charged a fortune over the rival.  I also do not see how this is much different from what has been done in past releases.

Also, I'm getting confused, because the thread is supposed to be about a rumored Titan 2. Why are we stuck debating the 680 and 7970 at this point...


----------



## Vlada011 (Jul 12, 2014)

NVIDIA will see in 2015 what it means to aim too high with prices. 
A lot of people no longer have any understanding for their pricing. A lot of people have GTX 780 Ti SLI, Titan SLI or 
Titan Black SLI, very strong setups, and if NVIDIA shows up with something at $1,000 and AMD offers something 5% weaker for $600, I think that would be one bad year for NVIDIA.  
GF110 cost $500-600, GK110 $700-1,000, and now people expect GM210 at more than $1,000. 
I think NVIDIA shouldn't ask more than $800 for a premium card; depending on when they decide to launch and on the price, more or fewer people will buy. 
That chip would be my next choice, but it depends on the price. I will not look at locked cards like the GTX 780 any more.
I think this time the Titan II will not be a reference-only card. We can maybe expect some EVGA Classified with an extreme factory OC and an ACX cooler.
Then I'd only have to OC the Haswell-E, because the GPU will be factory overclocked.


----------



## newconroer (Jul 13, 2014)

All right, so the current iteration of 4K GPUs is limited by the HDMI bandwidth. If the quality is lower, how much lower? Couldn't the same be said for a lot of texture files because of compression? E.g. the whole id Tech 5 megatexture-getting-consolitis debate.


----------



## TheoneandonlyMrK (Jul 13, 2014)

GhostRyder said:


> That does not make any sense, what your saying is that the 7970 which was a big jump from the 6970 was a failure and NVidia intentionally did not release a more powerful card because of this and instead released a less powerful card that actually lost at the end of the day to the 7970 in favor of waiting?  If either team could have released their top tier GPU, I guarantee they would have and just charged a fortune over the rival.  I also do not see how this is much different in releases than what has been done in the past.
> 
> Also im getting confused because the thread is supposed to be about a rumored Titan 2?  Why are we stuck debating the 680 and 7970 at this point...


Obviously 'cos AMD... if only AMD got their shit sorted, then world hunger would be a thing of the past and Nvidia would make a GTX 880 that's 300% better in every way, yet cheaper than toast.


----------



## Xzibit (Jul 13, 2014)

newconroer said:


> All right so the current iteration of 4k GPUs is limited by the HDMI bandwidth. If it's lower quality, how lower? Couldn't the same be said for a lot of texture files because of compression? e.g. the whole id 5 megatexture getting consolotis debate.



No. It just wouldn't make sense to use HDMI on current 4K panels when DisplayPort can provide the bandwidth. That's why I said people do stranger things.  DP is right there.  It also depends on the panel and your GPU's connectors.

As far as textures go, that will depend on whether game devs want to up the quality of models and textures to coincide with 4K resolution.  Don't expect a rapid transition, since consoles are at 1080p (PS4) and below (Xbox One?).  They are doubtless using textures and models optimized for their respective resolutions, and adapting those to a higher resolution hasn't been the PC gaming industry's strong suit, with all the poor ports PC users end up settling for.


----------



## Sony Xperia S (Jul 13, 2014)

Vlada011 said:


> if NVIDIA shows up with something at $1,000 and AMD offers something 5% weaker for $600, I think that would be one bad year for NVIDIA.
> GF110 cost $500-600, GK110 $700-1,000, and now people expect GM210 at more than $1,000.



That's what I'm mostly afraid of: this trend leads to ever-growing prices, and if the GM200 is $1,000-1,500, then the GM210 will be $1,200-1,700, then GP100 $1,500-2,000, GP200 $1,700-2,200, etc...

Ridiculous, isn't it?

Sadly, this is exactly what is happening; we already "have" that piece of shit for $3,000!!!


----------



## Sony Xperia S (Jul 13, 2014)

nvidia,

if you are so stupid as to be unable to produce high-end chips on normally priced manufacturing processes, 

THEN, FOR GOD'S SAKE, DON'T DO IT. DON'T WASTE YOUR TIME, AND MORE IMPORTANTLY OURS, WITH YOUR SHIT!


----------



## Fluffmeister (Jul 13, 2014)

Wow, what a bunch of cry babies.


----------



## atticus14 (Jul 13, 2014)

That's quite a bit different; 7970 owners could easily OC their cards to GHz Edition speeds, and I'd say many enthusiasts already enjoyed that well before the GHz Edition; therefore, no real reason for buyer's remorse.  The equivalent of that would be 780 owners unlocking the extra cores to "make" a 780 Ti.



the54thvoid said:


> It's kind of intentional what GPU vendors do.  Shall I mention the 7970?  No wait, it was gimped too so they could release a 7970 Ghz edition.
> 
> Or my i7 3930k is a gimped i7 3960?  No, it's designed that way to create a product division.
> 
> ...


----------



## bogami (Jul 14, 2014)

Lots of discussion. Why not think about Payday (the game name says it all)? Go, go, go, the cards are waiting.


----------



## badtaylorx (Jul 15, 2014)

One way to look at why they have product lines would be this:

They design the chip, then have it fabbed. Not all chips come out rosy, so to cut losses they disable the poorly performing cores, and whatever is left is sold in whatever performance category it falls under. The flagship (780 Ti, 580, 7970 GHz) that comes at the end is just process maturation.


----------



## xenocide (Jul 15, 2014)

It's interesting seeing people complain about GPU and CPU prices every generation. It's usually directed at Intel and Nvidia because, for the past five or so years, they have had the highest-end products--cherry-picking at its finest. Funny seeing people forget about the $1000+ single-core FX line on Socket 939 from AMD, or the fact that the HD7970 launched at about $600. The highest-performing product always sets the price, and everyone else adjusts. AMD has done it, Nvidia has done it, Intel has done it, ATi and 3dfx did it; it's the way hardware has always worked.

As for shipping "broken" parts, AMD was the company that shipped Phenom with the well-documented TLB bug, so they hardly get a pass. They also currently have a flagship product that runs "as planned" at 90°C+. The only reason AMD is so cost-effective is because Intel and Nvidia have been regularly keeping them in check, btw.


----------



## Sony Xperia S (Jul 15, 2014)

xenocide said:


> The highest performing product always sets the price, and everyone else adjusts.



The R9 295X2 is, in the worst case, on the same level of performance, yet you see a 50% discount.

AMD, historically, has always been offering better products at their respective price points.

With their arrogant greed, nvidia even lost the Sony PlayStation 4 and Microsoft console orders.


----------



## xenocide (Jul 16, 2014)

Sony Xperia S said:


> The R9 295X2 is, in the worst case, on the same level of performance, yet you see a 50% discount.
> 
> AMD, historically, has always been offering better products at their respective price points.


 
It also runs hotter than the sun, uses more power, and is only priced that way because Nvidia beat them to that performance tier. They intentionally undercut Nvidia heavily to try and grab as many consumers as possible. Nvidia definitely priced the Titan Z and Titan Black egregiously, but at launch the Titan was undeniably a top-tier product, and it should have been marketed as a cut-down Tesla product. AMD had no problem selling the HD7970 (non-GHz) for $600 when Nvidia hadn't gotten Kepler out yet. Once the GTX680 dropped for $500, the price of the HD7970 plummeted (to about $550, then $480, then I think about $400).



Sony Xperia S said:


> With their arrogant greed, nvidia even lost the Sony PlayStation 4 and Microsoft console orders.


 
Jesus, this argument never ends. Nvidia *didn't want* that market. The profit margins were almost non-existent, and they would have been paying more to fabs than they were making. AMD is barely making money off those consoles, despite them selling at record paces. It's easy to see AMD moving more parts and assume it bodes well for business, but that's not always the case when you're selling your products at razor-thin profit margins.


----------



## Sempron Guy (Jul 16, 2014)

MxPhenom 216 said:


> lol I think their marketting for the first Titan was fine. That card sold like hot cakes.



I'm referring to the Titan Z, but I've always thought the entire GeForce Titan line was falsely marketed, and it came back to bite nvidia pretty hard with the Titan Z. The first Titans sold fine until the similarly performing 780s came out for less money. I had my fair share of friends who jumped on the Titan bandwagon without doing a hint of compute work, and who cried foul when the 780 came out.


----------



## MxPhenom 216 (Jul 16, 2014)

Sempron Guy said:


> I'm referring to the Titan Z, but I've always thought the entire GeForce Titan line was falsely marketed, and it came back to bite nvidia pretty hard with the Titan Z. The first Titans sold fine until the similarly performing 780s came out for less money. I had my fair share of friends who jumped on the Titan bandwagon without doing a hint of compute work, and who cried foul when the 780 came out.


 
Sure, but that's what happens in the tech industry.

The Titan brand is like Intel's Extreme line, imo. Though I do think Nvidia should not market the Titan line as gaming cards like they have been. Really, they are do-everything cards: they offer some of the best gaming performance while also being a cheaper option for developers and professionals who need the compute performance.


----------

