# NVIDIA GeForce GTX 780 Ti Launch Date and Pricing Revealed



## btarunr (Oct 29, 2013)

NVIDIA's response to the AMD Radeon R9 290X, the GeForce GTX 780 Ti, is likely designed to be faster than the company's GeForce GTX TITAN graphics card at a lower price, although it turns out that it won't end up anywhere close to AMD's pricing. The GeForce GTX 780 Ti will be formally launched on November 7, 2013, priced at $699.99. The card is shaping up to be an overclocked GTX TITAN, based on the same GK110 silicon, and with 6 GB of memory. The company will also launch a game bundle along the lines of AMD's Never Settle, which will include Steam/Uplay keys for Batman: Arkham Origins, Assassin's Creed IV: Black Flag, and Splinter Cell: Blacklist.





*View at TechPowerUp Main Site*


----------



## Wastedslayer (Oct 29, 2013)

So if it's basically a Titan, and they let companies make their own versions (i.e. Lightning, Classified, HOF), this could be quite impressive!


----------



## SIGSEGV (Oct 29, 2013)

Oh please nvidia. FFS 
For me, a $699 price point for a single GPU is a joke, except for those who have tons of money. 
However, I must congratulate nvidia on their effort to deliver competition, and also on successfully delivering products that milk the cash cow.


----------



## Totally (Oct 29, 2013)

Aha! To all those wondering how much nvidia was fleecing its customers with the Titan, well there you go. 

p.s. guess there isn't a Titan Ultra if the 780 Ti is an OC'd Titan


----------



## ensabrenoir (Oct 29, 2013)

......come on, seriously?  Who didn't see this coming.... Just waiting to see what this thing can do....  You wear the crown, you can command the price!!!!  Some won't get it though....until you've worn a $100 dress shirt you'll never comprehend its existence....

....no, I don't wear $100 dress shirts ....but I did try one on.


----------



## SIGSEGV (Oct 29, 2013)

ensabrenoir said:


> ......come on seriously?  Who didn't see this coming.... Just waiting to see what this thing can do.... * You wear the crown you can command the price!!!! * Some won't get it though....until you've worn a $100 dress shirt you'll never comprehend its existence....
> 
> ....no i dont wear $100. dress shirts ....but i did try one on.



yes, absolutely, as nvidia always does..


----------



## Animalpak (Oct 29, 2013)

We will not see a 790 with 2 Titans on it, too much heat to dissipate ...


----------



## Xzibit (Oct 29, 2013)

ensabrenoir said:


> ....no i dont wear $100. dress shirts ....but i did try one on.



Duck-face selfies with the shirt or IT NEVER HAPPENED!!!


----------



## hardcore_gamer (Oct 29, 2013)

Still overpriced. Nvidia was raping customers with the $1000 price. Now they reduced the raping to one hole.


----------



## sweet (Oct 29, 2013)

There is no confirmation of the 780 Ti's specs; therefore your claim that "The card is shaping up to be an overclocked GTX TITAN, based on the same GK110 silicon, and with 6 GB of memory" is completely invalid.


----------



## Xzibit (Oct 29, 2013)

hardcore_gamer said:


> Still overpriced. Nvidia was raping customers with the $1000 price. Now they reduced the raping to one hole.



You can't rape the willing.


----------



## Fourstaff (Oct 29, 2013)

Judging by the price, this will be better than the 280X by quite a noticeable margin? I have noticed that Nvidia doesn't perform that well as resolution goes up (AMD scales better with resolution).


----------



## 1d10t (Oct 29, 2013)

To be honest I'm absolutely confused by nVidia. They released a $700 GTX 780 Ti out of nowhere that beats their own $1000 card?



Animalpak said:


> We will not see a 790 with 2 Titans on it, too much heat to dissipate ...



Suppose nVidia did, would you pay 2 grand for it?


----------



## SIGSEGV (Oct 29, 2013)

1d10t said:


> To be honest i'm absolutely confused with nVidia.They released $700 GTX 780Ti out of nowhere and beating his own $1000 card?
> 
> 
> 
> Suppose nVidia did,will you pay 2 grand for it?



i guess they just don't want the "Titan" name to look bad, especially after Zeus and his friends have been striking down the Titans


----------



## repman244 (Oct 29, 2013)

Totally said:


> Aha! To all those wondering how much nvidia was fleecing its customers with the titan, well there you go.
> 
> p.s. guess there isn't a Titian ultra if the 780ti is an oc'd titan



The 780Ti isn't just an OC'ed Titan, it probably has the fully enabled chip this time...overclocking would just drive the power consumption too high.




1d10t said:


> To be honest i'm absolutely confused with nVidia.They released $700 GTX 780Ti out of nowhere and beating his own $1000 card?



Drop the price of the 780, EOL the Titan, get the 780 Ti in...the price won't stay at $1000 for the Titan - it either drops or goes EOL.


----------



## Maous (Oct 29, 2013)

*Well played Nvidia*

Whether the MSRP is $649 or $699, both price points have been speculated. You're looking at a customizable Titan with 6 GB of Samsung memory, better coolers, and better power phases. Count the value of the codes at $60 per game plus $100 off SHIELD and that's a savings of $280 per card (if you want those games and the SHIELD discount). Or, option 2: you get the codes, don't want any of them, and sell them at 25% below face value on eBay. Boom, still roughly $215 off the MSRP, making this card come in at an astonishing $435 or $485 respectively!

Yes, at either MSRP you can SLI these cards for less than the current price of a Titan or a CrossFire R9 290X pair, or you can give your friends the same games you play. Either way it's a win/win. Even with AMD's wider selection of games, you cannot equal a $280 value. Way to go Nvidia /clap
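The bundle arithmetic above can be sketched as a quick back-of-the-envelope calculation. The $60-per-game and $100 SHIELD figures are the poster's assumptions, not confirmed pricing, and the resale case here values only the game codes (a discount coupon has no cash value), so it won't reproduce the post's $215 figure exactly:

```python
# Hypothetical "effective price" math for the rumored game bundle.
GAME_VALUE = 60        # assumed retail value per bundled game code
NUM_GAMES = 3          # Batman: AO, AC IV: Black Flag, SC: Blacklist
SHIELD_DISCOUNT = 100  # rumored SHIELD coupon value
RESALE_RATE = 0.75     # sell unwanted game codes at ~25% below face value

def effective_price(msrp, keep_codes=True):
    """Card cost after netting out the bundle's assumed value."""
    if keep_codes:
        savings = GAME_VALUE * NUM_GAMES + SHIELD_DISCOUNT
    else:
        # Only the game codes can be resold; the coupon is discarded.
        savings = GAME_VALUE * NUM_GAMES * RESALE_RATE
    return msrp - savings

for msrp in (649, 699):
    print(msrp, effective_price(msrp), effective_price(msrp, keep_codes=False))
```

At a $699 MSRP this gives $419 if you value every code at face value, which only holds if you actually wanted all three games and a SHIELD in the first place.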


----------



## buggalugs (Oct 29, 2013)

Titan will have to disappear, its the only option Nvidia have without looking bad, (or worse).

 They would have to drop the price 40% and that's not going to happen.


----------



## TheoneandonlyMrK (Oct 29, 2013)

I was going to flame the price as usual, and it is high, but the 6 GB of memory helps sweeten it a bit... ish.


----------



## qubit (Oct 29, 2013)

What I'd really like to see is a card with the full GK110 chip, the same Titan cooler, 3GB RAM at a good price like £300-400. I can dream.


----------



## DeadSkull (Oct 29, 2013)

Totally said:


> Aha! To all those wondering how much nvidia was fleecing its customers with the titan, well there you go.
> 
> p.s. guess there isn't a Titian ultra if the 780ti is an oc'd titan



GTX 780 Ultra ?





Maous said:


> Weather the Msrp is $649 or $699 both price points have been speculated. Your looking at a customizable titan with 6gb of Samsung memory, better coolers and power phases. Count the value of the codes at $60 each game and $100 off shield that's a savings of $280 per card (if you want those games and the shield discount) or option 2 you get the codes don't want any of them and sell them for - 25% value on eBay boom still $215 off the Msrp making this card come in at an astonishing $435 or $485 respectively!
> 
> Yes at either msrp you can sli these cards for less then the current price of a titan or crossfire R9 290X or you can give your friends the same games you play either way it's a win/win even with amd's wider selection of games you can not = a $280 dollar value, way to go Nvidia /clap



I remember the days when the GTX 295 barely went over $500, and it was a dual-GPU high-end card. $600+ for a single-die high-end card is insane, especially $1000 for either the Titan or the GTX 690.


----------



## DeadSkull (Oct 29, 2013)

Oh and the Titan will probably be replaced by a fully unlocked Titan aka gk180 "Atlas"

http://www.techpowerup.com/news_tags.php?tag=GK180


----------



## Maous (Oct 29, 2013)

*Lack of competition....*

I'm very much excited to see AMD finally catch up after all this time. Now hold on to your butts (excuse the Jurassic Park reference), but let me take you on a little bit of Nvidia chip history, starting with Fermi for simplicity's sake since it was recent. 

Why were the GTX 480/580 called GF100/GF110? Because it stands for GeForce Fermi chip number 100/110. These pieces of silicon had the wider memory buses, more CUDA cores, more RAM (3 GB), etc.; they were the top of the line. Now look at the GTX 470/570: these chips were also GF100/GF110, but had smaller bus widths, 320-bit instead of 384-bit. Go all the way down to the GTX 460/560 (GF104/GF114) and finally we get the 256-bit bus width we are looking for (remember this). Now we look at the GTX 590: also GF110, 384-bit bus.... 

So here's where things get interesting. 

AMD releases the 7000 series; Nvidia says in a press release that they expected more from AMD. 

Now we get a GTX 680, but it's GK104 with a 256-bit bus... um, what? (We knew there was a GK110 from a leak, which means a possible GK100 was in the works as well.) Then we get a GTX 690: wait, two of the exact same chips used in the 680 on one card. That had never been done previously in the history of video cards, except for the Asus Mars series. The GTX 590 used chips much closer to a 570's, as did all the previous dual cards excluding the Mars series. 

So because AMD failed to be competitive, we got a GTX 660 rebranded as a GTX 680, and it was competitive with, even beat out, the original 7970, AMD's top card. Fast forward a little later and we see the 7970 GHz Edition, which is actually different silicon. Nvidia, wasting little time, drops Titan, a cut-down GK110 with a 384-bit bus (hey, there's the chip that was supposed to be the 770, the 570's true successor). Some time passes and they release the 780 (okay, finally their top card has the correct bus width and RAM configuration, and the chip number is correct... or is it? This, folks, was the GK100 that was never released, the true 680, and I'll prove that in just a moment). Fast forward again: AMD launches its third iteration of Graphics Core Next, the R9 290/290X. How does Nvidia respond? A 770 Ti (GK110, 384-bit bus, a little cut down from the 780: hey, it's the real GTX 670!!) and the GTX 780 Ti (GK110, 384-bit bus, check; um, it's still a cut-down GK110... oh hello, you're the real GTX 770, which is on its third name change). 

We still have yet to see a fully enabled GK110, folks, which was supposed to be the real GTX 780, kind of like how the GTX 580 was the fully functional/enabled GF110 and the 570 and 560 Ti were cut-down versions of the GF110 chip. Hell, we still haven't really seen a real Kepler refresh; there's no GK114 or GK116 or GK118. Why? Because AMD is not pushing Nvidia to reply with them yet. Why do you think that no matter what AMD releases, Nvidia can release weeks later and catch even their manufacturers off guard? Because they have the chips to do so, because AMD is not being competitive.

If the GTX 800 series is released in Q1 2014, I guarantee it will be 28 nm. Why? Because they have unused GK chips lying around that they can rebrand and launch as 28 nm "Maxwell". 

I for one will be waiting to see what the future holds, but I hope AMD can increase the pressure on Nvidia to make them release properly in the future


----------



## repman244 (Oct 29, 2013)

DeadSkull said:


> Oh and the Titan will probably be replaced by a fully unlocked Titan aka gk180 "Atlas"
> 
> http://www.techpowerup.com/news_tags.php?tag=GK180



It's essentially just a fully enabled GK110, so the 780 Ti should have 2880 cores.
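For reference, that core count falls out of simple SMX arithmetic: GK110 carries 15 SMX units of 192 CUDA cores each, with Titan and the GTX 780 shipping with one and three units fused off respectively. A quick sketch:

```python
# Kepler GK110 core counts derived from SMX units (192 CUDA cores per SMX).
CORES_PER_SMX = 192
SMX_FULL = 15

full_gk110 = SMX_FULL * CORES_PER_SMX        # fully enabled die
titan      = (SMX_FULL - 1) * CORES_PER_SMX  # GTX TITAN: 14 SMX enabled
gtx_780    = (SMX_FULL - 3) * CORES_PER_SMX  # GTX 780: 12 SMX enabled

print(full_gk110, titan, gtx_780)  # 2880 2688 2304
```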


----------



## Prima.Vera (Oct 29, 2013)

I'm starting to hate this piece of crap company more than EA. Seriously. The prices they are charging for a stupid card are beyond ridiculous. I will just buy a PS4 and be done with it....


----------



## qubit (Oct 29, 2013)

Maous said:


> I'm very much excited to see amd finally catch up after all this time. Now hold on to your butts (excuse the jurassic Park reference) but let me take you on a little bit of Nvidia chip history starting with Fermi for simplicity's sake since it was recent.
> 
> Why was the gtx 480/580 called GF100/GF110 because it stands for Geforce Fermi chip number 100/110 these pieces of Silicon had wider memory buses, more Cuda cores and more ram (3gb) etc they were the top of the line. Now look at the gtx 470/570 these chips were also GF100/GF110 But had smaller bus widths at 320 instead of 384 now go all the way down to the gtx460/560 GF104/GF114 finally we get the 256 bus width we are looking for (remember this) Now we look at the gtx 590 also GF110 384 bit bus....
> 
> ...



I wholly agree about the sly product-line repositioning by nvidia due to the lack of competition from amd. Have you noticed how the top-end cards have all conveniently doubled in price at the same time? :shadedshu

Each generation should leapfrog the other in performance, like we used to see a few years ago. Therefore, to properly compete with Titan, the 290x should have been something like 30-40% or more faster in everything. On top of that, it should have had an effective and classy cooler like the Titan has. What we see instead is a decent amd GPU and board that only matches Titan in performance, with a crap cooler slapped on it. That isn't progress.

Sorry to sound like a conspiracy theorist, but I swear it feels like amd and nvidia are operating as a cartel to keep their respective positions at all times, with nvidia as performance king and amd as underdog. This latest generation is a perfect example, because it's blatantly obvious that amd could have put a decent cooler on there without affecting the price much, more like $10 at the most, and have had way better reviews. That would have really opened up the performance of the 290x. Instead they've hamstrung it and gone for the effing "value" market again.   Once again, this isn't progress.

You're not quite right about the GTX 590 though. This uses the full GF110 GPUs and is literally two GTX 580s in one card, like the GTX 690 is two GTX 680s. However, the clocks are significantly reduced on the 590 due to heat and power constraints. That card could run at full speed if it had an appropriate cooler, likely water cooling.

Also, I doubt nvidia could just rebrand the current chips and call them Maxwell. They may update the architecture a bit, like amd did with the 290x and call that Maxwell. I have a feeling it's going to be more than that, though.


----------



## buggalugs (Oct 29, 2013)

Maous said:


> We still have yet to see a fully enabled GK110 folks which was supposed to be the real gtx 780 kind of like how the gtx 580 was the fully functional/enabled GF110 and the 570 and 560ti were cut down versions of the GF110 chip hell we still haven't really seen a real Kepler refresh there's no GK114 or Gk116 or 118 why because amd is not pushing Nvidia to reply with them yet. Why do you think no matter what amd releases Nvidia can release weeks later and catch even their manufacturers off guard. Because they have the chips to do so because amd is not being competitive.
> 
> If gtx 800 series is released in Q1-2014 I guarantee it will be 28nm why because they have unused GK chips lying around that they can re-brand and launch as 28nm "maxwell"
> 
> I for one will be waiting to see what the future holds but I hope amd can increase pressure on Nvidia to make them release properly in the future



It sounds like you're looking at it from one side....with blinkers on. It was 4 months after the 7970 was released before we saw the 680. Nvidia missed the important Christmas season sales that year. The 7970 was the biggest performance jump from the previous generation in at least the last decade. That goes for both AMD and Nvidia cards.

The 680 only barely matched the 7970, so instead of blaming AMD for that (when they had just released a card with the biggest performance jump in a decade), you should be calling out Nvidia for not releasing the proper GK110 at the time. You should be calling them out for releasing a half-baked overclocked product that barely matched the 7970, and still charging top dollar for it.

The 7970 GHz Edition forced Nvidia back to the factory to rush out the 780, and even the 780 isn't a huge amount faster than the 7970 GHz/680. 

It's just ridiculous to say AMD has not been competitive in the last couple of years. Nvidia's flagship $1,000 GPU has been made obsolete within a few weeks and dropped in value by almost 50%, and it hasn't been out that long; Nvidia has been forced to release new cards (the 780 Ti) and drop the value of existing cards by $150+.

Trust me, without AMD, we would be paying $1,000 for high-end Nvidia GPUs and not seeing new cards for 18 months or more.

I don't think you understand the reasoning behind the memory bus variations between generations. It has nothing to do with competition and everything to do with memory type: with GDDR3 they needed wider memory buses to reach a certain bandwidth, while with GDDR4 and GDDR5 those requirements changed, because the same bandwidth could be achieved on narrower buses.

Because we've been on GDDR5 for a while now and they are pushing the frequency limits of current memory ICs, the only option is to increase the memory bus width again.
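The trade-off described above is simple arithmetic: peak bandwidth is bus width in bytes times the effective transfer rate, so a faster memory type lets a narrower bus hit the same number. A minimal sketch:

```python
def bandwidth_gbs(bus_width_bits, transfer_rate_gtps):
    """Peak memory bandwidth in GB/s = bytes per transfer x transfers per second."""
    return (bus_width_bits / 8) * transfer_rate_gtps

# A 256-bit bus with GDDR5 at 6 GT/s (GTX 680 class) matches what a
# 384-bit bus delivers at 4 GT/s; once the memory clock tops out,
# widening the bus again (GTX 780 / Titan class) is the remaining lever.
print(bandwidth_gbs(256, 6))  # 192.0 GB/s
print(bandwidth_gbs(384, 4))  # 192.0 GB/s
print(bandwidth_gbs(384, 6))  # 288.0 GB/s
```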

So yeah, it sounds like you're looking at this from an NVidia fan's point of view and not looking at the big picture.


Seriously, you guys: Nvidia would be rubbing their hands together, laughing that they could release a $1,000 GPU that gets made obsolete in a few weeks, and guys are still kissing their butt and treating them like gods.  If you suckers would speak up for what is best for us consumers, instead of kissing butt, maybe we wouldn't have to pay so much for graphics cards.

I get the feeling that Nvidia could release a card and charge $1,999 for it and you guys would still be kissing their butts. People like you are why we pay so much for these damn things...... And this is coming from someone who always buys a high-end GPU every generation, and every chipset/CPU generation, so I'm not afraid to spend money.....AND I currently have a 680, had a 7970, and plan to get a 290X. IMO no single GPU is worth more than about $599. Even that is pushing it.

AMD has done a huge amount for competition and for pricing; if you can't see that after the 290X then there is something wrong with your thinking, or you're just an NVidia fan.


----------



## Fourstaff (Oct 29, 2013)

buggalugs said:


> Trust me, without Nvidia, we would be paying $1,000 for highend AMD GPUs and not seeing new cards for 18 months or more,



Fixed that for you. At the end of the day, both companies exist to bring as much bacon home as possible.


----------



## EarthDog (Oct 29, 2013)

> The 7970Ghz edition forced Nvidia back to the factory to rush out the 780,


It did what? No... no it didn't. The 7970 GHz Edition with the new drivers tends to trade punches with a 680 depending on what site, what game, and what settings are used. At TPU a stock 680 beats the 7970 by 1% at 1080p. 

A stock 780 beats the tar out of the 7970 performance-wise... to the tune of almost 20% here at TPU, too.

I like the message, but it seems like you got lucky with the result, as the data being presented to prove your point is a bit, ehh... off.


----------



## qubit (Oct 29, 2013)

Fourstaff said:


> Fixed that for you. *At the end of the day, both companies exist to bring as much bacon home as possible.*



What?!!!


----------



## Fourstaff (Oct 29, 2013)

qubit said:


> What?!!!



We all know both companies are stuffed full of geeks coding and designing the next best chip. Well, geeks (aka internet people) love bacon (and cats), so might as well cut to the chase.


----------



## Frick (Oct 29, 2013)

qubit said:


> What?!!!



Meaning that in the end they are about making money. Both of them.



Fourstaff said:


> We all know both companies are stuffed full with geeks coding and designing the next best chip. Well, geeks (aka internet people) loves bacons (and cats), so might as well cut the chase.



I love neither! And I hate cats. Seriously, they're assholes ALL OF THEM.


----------



## EarthDog (Oct 29, 2013)

Perhaps he would recognize 'cheddar' as money instead of bacon? 

I guess "bring home the bacon" didn't make it across the pond.


----------



## Aithos (Oct 29, 2013)

I find it hilarious that people are complaining about the price.  Let's go over a couple facts:

1) Titan is a $1000 beast.  Is it overpriced?  Absolutely, but so is every "top" card.  There isn't a single best-in-category part that ISN'T overpriced.  If you want the "best" case it's going to be 2-3x as expensive; if you want the "best" SSD setup (in RAID) it is going to be super expensive.  Best in slot (as we call gear upgrades in WoW) will always be expensive, and how expensive will continue to go up with inflation over time, because as a general rule everything else has gotten considerably cheaper.  

2) The current overclocked GTX 780 cards are $660-$700, so we aren't really looking at any significant price difference at all, assuming the 780 Ti is at least as good as an overclocked 780.  We can pretty safely assume this even without official specs, because nVidia wouldn't be able to increase the price at all if it didn't beat out aftermarket cards like the EVGA Superclocked or Classified.

Now for some speculation:

If this does turn out to be a fully enabled GK110 and/or an overclocked Titan with 6 GB of RAM, then at $700 it's a STEAL.  That would mean that for $1400 (1.4x the price of a Titan) you could run an SLI setup with two of the fastest cards on the market AND get two game bundles (1 to keep and 1 to sell or gift).  That's a lot less than the $2k an SLI Titan setup would have run, and who knows what will happen with the aftermarket branded 780 Ti.  I'll take a couple of EVGA 780 Ti Superclocked or Classified cards in a heartbeat.

Not to mention if the regular aftermarket overclocked cards drop to $500ish (which perform on par with the 290x and still come with the game bundle) then you have a card that runs cooler and quieter, still has room to overclock (a little) and saves a little money.

I'm super pumped about this, I won't make my decision on what to buy until the final specs are out and I can run some math but nVidia is making some good decisions...


----------



## wolf (Oct 29, 2013)

buggalugs said:


> The 7970 was the biggest performance jump from previous generation in *at least* the last decade. That goes for both AMD and Nvidia cards.





How could anyone forget the 8800 GTX (2006) so quickly? That one stands out really far for me as obliterating the generation before it.

The GTX 280 (2008) did a damn good job of that too, if memory serves.

The proof is in the pudding if you don't believe me, the pudding being TPU's delicious, chocolatey reviews.

I'm not taking away from anything AMD did or has just done, but that was a bold call that simply isn't true.


----------



## Prima.Vera (Oct 29, 2013)

Frick said:


> I love neither! And I hate cats. Seriously, they're assholes ALL OF THEM.



LOL


----------



## rtwjunkie (Oct 29, 2013)

Frick said:


> Meaning that in the end they are about making money. Both of them.



Perfectly said!  They exist to make money, and will charge whatever the market will bear.  I would do the same if I had a product to sell.  I would not be trying to make sure everyone in the world could afford my product, because at the end of the day I've got my own life to take care of.


----------



## Casecutter (Oct 29, 2013)

This isn't so much improving competition as just a price drop on Titan in a roundabout way.  It's face-saving... Titan will be permitted to fade with a legacy of 6 months, not quite EoL just yet, although this swiftly relinquishes it as an antiquity.  Memorable, but priced $600 more at the time than the 7970 GHz for about 20% more performance, meaning its performance per dollar was 73% behind the 7970 GHz's. Its legacy should not be any big deal. 

As with Titan... or this new SKU of it, it really will do little to change the competitive landscape if it's still priced $150 more than an R9 290X.

This and the pricing tumble of the GTX 780 tell me Nvidia realizes Hawaii production is "all-in", with a full run of starts at TSMC and presumably good yields.  I wonder whether better performance-per-watt silicon than the first-run chips that went into these reference-cooler cards might be coming.  Could it be that TSMC didn't give AMD the best first risk production run, and AMD had to go to market with what they got?  Could we see AIBs at the end of November working off improved production silicon?

Something tells me Nvidia knows there's trouble on the way, especially with the R9 290 (non-X), and that's why they didn't just go the expected 10-12% but went straight to 22-24%.   I think Nvidia's hope is to unload their geldings before the end of November, at which point AMD AIBs release a full gamut of R9 290X and non-X cards into the market.


----------



## qubit (Oct 29, 2013)

Fourstaff said:


> We all know both companies are stuffed full with geeks coding and designing the next best chip. Well, geeks (aka internet people) loves bacons (and cats), so might as well cut the chase.





Frick said:


> Meaning that in the end they are about making money. Both of them.



Hey guys, I'm just messin' around, acting "shocked" that both companies are just in it for the money, lol.


----------



## Tatty_One (Oct 29, 2013)

Be happy..... people who are not sold on the 290X can get a Titan+ for much cheaper than a Titan. Be grateful for something, at least.


----------



## birdie (Oct 29, 2013)

Maous said:


> I'm very much excited to see amd finally catch up after all this time. Now hold on to your butts (excuse the jurassic Park reference) but let me take you on a little bit of Nvidia chip history starting with Fermi for simplicity's sake since it was recent.



Your post is perfectly fine aside from one tiny problem. I don't know why, but you believe that releasing a new GPU architecture is like eating a piece of cake in the morning. Um, no: Kepler cost NVIDIA over $100 million in development, and they want to capitalize on it as much as possible.

I'd actually praise NVIDIA for having designed such a versatile and flexible architecture which can be tuned and pose a real threat to the competition in no time.


----------



## birdie (Oct 29, 2013)

Casecutter said:


> This isn't so much improving competition more just a price drop on Titan in a roundabout way.  It's face saving... Titan will be permitted to fade with a legacy of 6 months but not just yet EoL although swiftly relinquishes it as an antiquity.  Memorable but priced $600 more at the time than the 7970Ghz with about 20% in performance meaning it performance to dollar was 73% behind the 7970Ghz. IT's legacy should not be any big deal.



Sane people have *never bought* the top of the GPU line; a much better investment would have been an SLI tandem of two GTX 660 (Ti) cards.

So, all this talk about how NVIDIA has raped its customers is annoying and lame.


----------



## Casecutter (Oct 29, 2013)

birdie said:


> So, all this talk about how NVIDIA has raped its customers is annoying and lame.


Did I say someone was maltreated against their will?  Nobody forces folks to spend $1000; they do it willingly.  I simply pointed out the metric of performance per cost; how anyone interprets or justifies it, I don't get into.


----------



## NeoXF (Oct 29, 2013)

To Hell with this stupid card; if it were $599 it would've made much more sense.

Now it's way too expensive for AMD to care, so no price drops on the Rx 200 series at the high end. It's also mostly irrelevant, seeing as how 20 nm is getting closer and closer... so I don't think AMD will bother with an R9 290XT... or R9 295X or whatever... not to say that the R9 290X (especially once custom versions hit the market) will have much of a problem besting it in most situations, to begin with.




Tatty_One said:


> be happy..... people who are not sold on the 290X can get a Titan+ for much cheaper than a Titan, be grateful for something at least.



Eh. I'd never be grateful for corporations playing us like tools in a toolbox and, for lack of a better word, abusing our (so-called) needs.

But I would be excited (and I kinda am) to be able to get a better setup for my gaming rig for less money tho. So in an indirect sense, some (well, a lot of) evil is better than only evil.


----------



## Casecutter (Oct 29, 2013)

NeoXF said:


> 20nm is getting closer and closer...


TSMC spokeswoman Elizabeth Sun: "TSMC will begin 20 nanometer process production in the first quarter of next year and the new operation will contribute to its revenues from the second quarter".  
From that, some indicate the earliest we might see a GPU card is perhaps June 2014, but some are predicting the bulk of card releases might end up on a late-summer-to-fall timeline.


----------



## NeoXF (Oct 29, 2013)

Casecutter said:


> TSMC spokeswoman Elizabeth Sun, "TSMC will begin 20 nanometer process production in the first quarter of next year and the new operation will contribute to its revenues from the second quarter".
> From that some indicate at earilest we might see a GPU card is perhap June 2014; but some are perdicting the bulk of card releases might end up being late summer-fall timeline.



You know what they say (I think)... no matter how far off something is, it sure as acid rain is closer than it was 6 months ago...


----------



## Slizzo (Oct 29, 2013)

For those complaining about the Titan's price, does the AMD R9 290X have the capabilities of a FirePro card for a fraction of the price?

No?

Then quit your whining.  The Titan is a baby Quadro for all intents and purposes for about a third of the price. Why gamers were buying it I have no idea, but if you're a graphic designer who happens to game, the Titan is the best product for you.


----------



## TheHunter (Oct 29, 2013)

Slizzo said:


> For those complaining about the Titan's price, does the AMD R9 290X have the capabilities of a FirePro card for a fraction of the price?
> 
> No?
> 
> Then quit your whining.  The Titan is a baby Quadro for all intents and purposes for about a third of the price. Why gamers were buying it I have no idea, but if you're a graphic designer who happens to game, the Titan is the best product for you.



Actually it has unlocked DP just like the GK110 Titan... same with the 7970, full DP at 1 TFLOP/s.


But in nvidia's eyes that's a premium feature that needed a $1k price tag lol




1d10t said:


> To be honest i'm absolutely confused with nVidia.They released $700 GTX 780Ti out of nowhere and beating his own $1000 card?
> 
> 
> 
> Suppose nVidia did,will you pay 2 grand for it?



The 780 Ti is faster in single precision and games, but it will be slower in double precision... the same crippled DP as the GTX 780, ~190-200 GFLOP/s, so the Titan "can" stay as it is.

Imo the Titan was never really worth it, lol, only for e-peen most of the time. If I want cheap DP then I'll just get one 7970 GHz or 280X.


----------



## the54thvoid (Oct 29, 2013)

TheHunter said:


> Actually it has unlocked DP just like GK110 Titan.. Same with 7970 full DP 1tflop/s



Not sure but found this...



> We've also come to learn that AMD changed the double-precision rate from 1/4 to 1/8 on the R9 290X, yielding a maximum .7 TFLOPS. The FirePro version of this configuration will support full-speed (1/2 rate) DP compute, giving professional users an incentive to spring for Hawaii's professional implementation.



http://www.tomshardware.com/reviews/radeon-r9-290x-hawaii-review,3650.html
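The quoted figure checks out arithmetically: DP throughput is just the SP figure scaled by the rate fraction. A small sketch (the ~5.6 TFLOPS SP value for the 290X and ~4 TFLOPS for the GTX 780 are approximate launch-spec numbers, used here for illustration):

```python
def dp_tflops(sp_tflops, dp_rate):
    """Double-precision throughput as a fraction of single-precision."""
    return sp_tflops * dp_rate

# R9 290X: ~5.6 TFLOPS SP at the 1/8 rate Tom's quotes -> ~0.7 TFLOPS DP
print(round(dp_tflops(5.6, 1 / 8), 2))   # 0.7
# GTX 780: ~4 TFLOPS SP at the GeForce Kepler 1/24 rate -> ~0.17 TFLOPS DP
print(round(dp_tflops(4.0, 1 / 24), 2))  # 0.17
```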


----------



## qubit (Oct 30, 2013)

Isn't it a bit ironic how NVIDIA will launch a card that's faster yet name it lower down the range? I'd have thought something like Titan Ultra would have been a great name for it and make for great marketing. I'd certainly like to own something with that name.


----------



## leeb2013 (Oct 30, 2013)

I sometimes think people forget that Nvidia is a business that needs to make a profit, and we as customers don't have the right to determine how much money they make on each product. We do have the right to choose what we buy. It amazes me that a company which has made a unique, world-leading product that no one else has managed to produce gets so much flak for charging a premium price for it. Do people really think that customers should decide how much they charge for their top-end cards? Should Nvidia really make sure that most people can afford a Titan? The choice is simple. If AMD makes a GPU with the price and performance you like, buy that. If Nvidia does, buy that instead. Simples!


----------



## DannibusX (Oct 30, 2013)

The word "rape" is being thrown around quite a bit here.  It's completely pointless to use it, people that have bought nVidia's cards _wanted_ them, otherwise they'd have sprung for AMD.

nVidia charges a premium for their GPUs.  I don't care.  If I felt the value I got out of a card was worth the price I would buy it.  I jumped to ATI when the 5870 launched, and I'm still pleased by its performance (not for long) and I am just as likely to return to nVidia as I am to snag an AMD card (which is likely; price to performance) when I build my new PC.

The ATI/AMD v. nVidia debate is stale, whether talking performance or price.  Get what you want.  Don't bring rape into it, rape is a horrible thing.


----------



## Goodman (Oct 30, 2013)

Not saying anything but...

AMD is ready this time around; they've got an R9 295X ready to go if need be...


----------



## Tatty_One (Oct 30, 2013)

leeb2013 said:


> I sometimes think people forget that Nvidia is a business that needs to make a profit and we as customers don't have the right to determine how much money they make of each product. We do have the right to choose what we buy. It amazes me, that a company which has made a unique, world leading product, that no one else has managed to produce, gets so much flak for charging a premium price for it. Do people really think that customers should decide how much they charge for their top end cards? Should Nvidia really make sure that most people can afford a Titan? The choice is simple. If AMD made a GPU with the price and performance you like, buy that. If Nvidia do, buy that instead. Simples!



This is where we disagree to a certain extent, because the consumer does and has every right to determine the price of a product. Now in the case of a GTX 780 Ti, which is a niche market in itself, probably accounting for 5% of NVidia's sales (probably less, actually), a company has little to lose in hiking the price, simply because a product with such a low consumer base won't affect the sales profile very much. However, if the consumer market decides it is not prepared to pay the price of that product, then sales are poor and costs are not recovered, let alone profit. Normally a company with a poor-selling product decides on one of two routes: either discontinue the product or lower its price. If they choose the first, everyone loses; if they choose the second, the consumer gains and eventually the company does also... hopefully. Of course, that only works if the lowered price does not produce a loss, otherwise the product is discontinued anyway.


----------



## TheHunter (Oct 30, 2013)

Ok folks, nothing to see here 

http://i.imgur.com/X57uD1z.jpg

Finally! I kinda knew it would be the full chip, though. Hopefully it's also a refreshed version, aka GK180.


----------



## Crap Daddy (Oct 30, 2013)

TheHunter said:


> Ok folks nothing to see here
> 
> http://i.imgur.com/X57uD1z.jpg
> 
> ...



Saw this on other forums, and the idea is that there's at least one guy who appears to have access to a 780 Ti, and this GPU-Z shot is fake. But... he hints the CUDA core count is correct.


----------



## NeoXF (Oct 30, 2013)

^Yeah, because internet people are strangers to photo-manipulation, right? Source or better yet, official confirmation, or bust.



qubit said:


> Isn't it a bit ironic how NVIDIA will launch a card that's faster yet name it lower down the range? I'd have thought something like Titan Ultra would have been a great name for it and make for great marketing. I'd certainly like to own something with that name.



Sorry, but you don't sound like a very smart shopper to me, LOL. I mean, on occasion I buy something more expensive, or something that's not quite the best (but not the worst either) but looks better in my case, or stuff like that, but I never went as far as buying something just because it sounds cool... Not to mention, in about 2-3 years' time such a name would sound hella hilarious... pathetic, even, considering low-end cards would best it...


----------



## qubit (Oct 30, 2013)

NeoXF said:


> *Sorry, but you don't sound like a very smart shopper to me*, LOL. I mean, on occasion I buy something more expensive, or something that's not quite the best (but not the worst either) but looks better in my case, or stuff like that, but I never went as far as buying something just because it sounds cool... Not to mention, in about 2-3 years' time such a name would sound hella hilarious... pathetic, even, considering low-end cards would best it...



Really? Gosh, I guess I'd better start taking buying lessons from you, then! 

My post is rather logical and clear, so if you don't understand it, well, perhaps _you're_ not very smart...


----------



## MxPhenom 216 (Oct 30, 2013)

New screenshot of 780 Ti specs... I'm going to call fake, but regardless, one can hope.


----------



## NeoXF (Oct 30, 2013)

^ If you would've bothered scrolling up a bit, you'd see someone else already posted that screenshot. Also, why doesn't anyone bother posting sources!?



qubit said:


> Really? Gosh, I guess I'd better start taking buying lessons from you, then!
> 
> My post is rather logical and clear, so if you don't understand it, well, perhaps _you're_ not very smart...



Congrats on the free insult, you sure look better than me now. /s

But congratulations on sounding like a butthurt rager on the first try, tho (if that was your intention). Next time try to re-read posts before you reply to them; I'm not gonna bother re-explaining what I meant. Anyway, whatever, I don't want this conversation to go any further.


----------



## qubit (Oct 30, 2013)

NeoXF said:


> ^ If you would've bothred scrolling up a bit, you'd see someone else already posted that screenshot. Also, why doesn't anyone bother posting sources as well!?
> 
> 
> 
> ...



You actually insulted me first, I simply replied in kind.

That's how your post reads, anyway.


----------



## TheHunter (Oct 31, 2013)

http://videocardz.com/47491/nvidia-geforce-gtx-780-ti-3gb-memory-gk110-gpu


It's still a secret whether it's 14 or 15 SMX.. Imo 15 SMX at higher clocks would be possible on a refreshed GK180 with a lower TDP, but you never know


----------



## Crap Daddy (Oct 31, 2013)

http://videocardz.com/47508/videocardz-nvidia-geforce-gtx-780-ti-2880-cuda-cores

Videocardz sez it's the big full mofo GK110. 2880 CC


----------



## the54thvoid (Oct 31, 2013)

Crap Daddy said:


> http://videocardz.com/47508/videocardz-nvidia-geforce-gtx-780-ti-2880-cuda-cores
> 
> Videocardz sez it's the big full mofo GK110. 2880 CC



If true, they need to address the power limits.  One of the two reasons quoted by some folk for Titan clocking slower before hitting limits than a 780 is its much higher core count.  So if the Ti has even more cores than Titan, well, those cores need power.  The mitigating factor might be the 3 GB of memory requiring less power.

If of course, this news is legitimate.


----------



## Tatty_One (Nov 1, 2013)

the54thvoid said:


> If true, they need to address the power limits.  One of the two reasons quoted by some folk for Titan clocking slower before hitting limits than a 780 is its much higher core count.  So if the Ti has even more cores than Titan, well, those cores need power.  The mitigating factor might be the 3 GB of memory requiring less power.
> 
> If of course, this news is legitimate.



Although the OP shows the Ti as having 6 GB of memory, it will be interesting to see what the max power draw is. I would guess more than 300 W, which won't be good, not because a 300 W pull on a very high-end graphics card is bad, but because all of those NVidia fans who have criticised the 290X consistently quote its huge power draw at 315 W


----------



## erocker (Nov 1, 2013)

I'd pay $699 if it came with a water block.


----------



## the54thvoid (Nov 1, 2013)

Tatty_One said:


> Although the OP shows the Ti as having 6 GB of memory, it will be interesting to see what the max power draw is. I would guess more than 300 W, which won't be good, not because a 300 W pull on a very high-end graphics card is bad, but because all of those NVidia fans who have criticised the 290X consistently quote its huge power draw at 315 W



I thought large power draw and lots of heat was a good thing?  Everyone's raving about that (j/k)

But seriously, it all comes down to power v performance.  The 290X power draw might well come down with cooler models (literally).  Does lots of heat not create waste power?

Then of course the AIBs can add better power circuitry to compensate.  I think custom 290Xs will be pretty overpowering, but of course we need to see a good _clock for clock_ comparison against the 780 Ti, with power being known.  Then we'll see if the price premium Nvidia will be charging is relevant.



erocker said:


> I'd pay $699 if it came with a water block.




One of our UK etailers does just that with the 290X: a custom-fitted block, with the air cooler included as well.

http://www.overclockers.co.uk/showproduct.php?prodid=WC-054-TL&groupid=701&catid=56&subcat=1752

http://www.overclockers.co.uk/showproduct.php?prodid=WC-055-TL&groupid=701&catid=56&subcat=1752


----------



## EarthDog (Nov 1, 2013)

Well, considering in TPU's testing it has about ~60 W to take that crown (Titan pulled 263 W "max" here)... it has a ways to go. As a fan of both camps, it will be interesting to see where it lands, though. To me, if it does pull more power, I do hope it beats it out performance-wise.



> But seriously, it all comes down to power v performance. The 290X power draw might well come down with cooler models (literally). Does lots of heat not create waste power?


Maybe? I never heard of that one before... but thinking about it, at least from an extreme cooling perspective you are right. For example, I need 1.45 V for 5.1 GHz on my 4770K at ambient temps, while I can hit 5.4 GHz with the same voltage under LN2... BUT that is also a difference of nearly 140 °C (-120 °C, as I have a coldbug, and assuming 20 °C ambient). I am not sure the difference between air coolers or even water would make a noticeable difference.


----------



## RCoon (Nov 1, 2013)

780ti confirmed to be fully unlocked GK110 chip


----------



## Fourstaff (Nov 1, 2013)

the54thvoid said:


> good _clock for clock_ comparison against the 780Ti, with power being known.



Not sure why this comes up every so often; I am pretty sure both the 780 Ti and 290X are designed with different target clock speeds in mind, making the highest sustainable performance much more useful than any clock-for-clock advantage.


----------



## Tatty_One (Nov 1, 2013)

EarthDog said:


> Well, considering for TPUs testing it has about ~60W to take that crown (Titan pulled 263W "max" here)... it has a ways to go. As a fan of both camps, it will be interesting to see where it lands though. To me, if it does pull more power, I do hope that it beats it out performance wise.
> 
> Maybe? I never heard of that one before...but thinking about it, at least from an extreme cooling perspective you are right. For example I need 1.45v for 5.1Ghz on my 4770K at ambient temps, while I can hit 5.4Ghz with the same voltage under LN2... BUT that is also a difference of nearly 140C (-120C I have a coldbug and assuming 20C ambient). I am not sure the difference between air coolers or even water would make a noticeable difference.



Agreed. If you look at the original Titan review in the overclock section: even without the extra shader count, once you add a little voltage for the higher stock clocks it's pushed beyond 300 W. Although to be totally honest, if I was buying a card with this kind of performance, a few watts here and there would make little difference to me.


----------



## EarthDog (Nov 1, 2013)

Tatty_One said:


> although to be totally honest If I was buying a card with this kind of performance a few watts here and there would make little difference to me.


Without a doubt. I only look at power consumption for such cards just to make sure I have enough juice. 

So far my sentiment, "A quality 550W PSU will be plenty for a single CPU and GPU setup overclocked" holds true. 

Hell, if I can run a 3570K (4.5GHz) and GTX 690 (overclocked) on a 550W PSU, even an AMD FX-8xxx and a 290x will be fine... maybe. Damn those FX CPUs...
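That rule of thumb is easy to sketch as a power budget: add up the big draws and leave some headroom on the PSU's rated output. All the wattages below are illustrative guesses for this sketch, not measured figures:

```python
# Rough PSU sizing check in the spirit of "a quality 550 W PSU is plenty
# for a single CPU + GPU setup overclocked". Component draws are guesses.

def psu_ok(psu_watts, component_watts, headroom=0.8):
    """True if total draw fits within ~80% of the PSU's rated output."""
    return sum(component_watts.values()) <= psu_watts * headroom

build = {
    "cpu_oc": 125,             # overclocked quad-core, rough guess
    "gpu_oc": 263,             # Titan-class "max" draw quoted above
    "board_ram_drives": 50,    # everything else, rough guess
}

print(psu_ok(550, build))  # 438 W total vs a 440 W budget: True, but barely
```

Which also shows why the "maybe" is warranted: swap in a hungrier CPU like an FX-8xxx and the same 550 W budget no longer closes.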


----------



## Recus (Nov 1, 2013)

DX 11.2 will give some pain to someone. : D


----------



## the54thvoid (Nov 1, 2013)

Recus said:


> http://videocardz.com/images/2013/10/Galaxy-GeForce-GTX-780-Ti.jpg
> 
> DX 11.2 will give some pain to someone. : D



Wow, official. It's like Titan Ultra with a memory deficit and presumably less DP compute power? 

Reviews should be interesting.


----------



## qubit (Nov 1, 2013)

the54thvoid said:


> Wow, official. It's like Titan Ultra with a memory deficit and presumably less DP compute power?
> 
> Reviews should be interesting.



Very interesting. If the price is right then I'll see about getting one and then another later on.

Mind you, nvidia's pricing is never really "right" is it? lol


----------



## RCoon (Nov 1, 2013)

qubit said:


> Very interesting. If the price is right then I'll see about getting one and then another later on.
> 
> Mind you, nvidia's pricing is never really "right" is it? lol



Nope it isn't, not to mention they always seem to stay a notch behind on how much VRAM they ship their cards with. Never understood the persistent discrepancy between AMD and NVidia on VRAM.


----------



## newtekie1 (Nov 1, 2013)

hardcore_gamer said:


> Still overpriced. Nvidia was raping customers with the $1000 price. Now they reduced the raping to one hole.



No one is forcing anyone to buy Titan or the 780 Ti.  NVidia has plenty of competitively priced cards that compete directly with every card AMD has.  If you are stupid enough to pay several hundred dollars more for ~10% better performance, you deserve what you get.


----------



## the54thvoid (Nov 1, 2013)

newtekie1 said:


> If you are *stupid* enough to pay several hundred dollars more for ~10% better performance, you deserve what you get.



Did someone call me there?   

But seriously, really, I'm interested.....


----------

