# 6 GB Standard Memory Amount for GeForce Titan



## btarunr (Feb 5, 2013)

NVIDIA's next high-end graphics card, the GeForce "Titan" 780, is shaping up to be a dreadnought of sorts. It reportedly ships with 6 GB of GDDR5 memory as standard. It's known from the GK110 block diagrams released alongside the Tesla K20X GPU compute accelerator that the chip features a 384-bit wide memory interface. With 4 Gbit memory chips still eluding the mainstream, it's quite likely that NVIDIA could cram twenty-four 2 Gbit chips onto the card to total 6,144 MB. The chips would then have to be spread across both sides of the PCB, and the back-plate could make a comeback on NVIDIA's single-GPU lineup.
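The 24-chip figure follows from the bus width: each GDDR5 chip presents a 32-bit interface, so twelve chips fill a 384-bit bus once, and mounting a second set on the back of the PCB ("clamshell" mode) doubles that. A quick sanity check of the arithmetic (the 32-bit chip interface and clamshell doubling are standard GDDR5 details, not from the report):

```python
# Back-of-envelope check of the rumored memory configuration.
BUS_WIDTH_BITS = 384        # GK110 memory interface
CHIP_INTERFACE_BITS = 32    # per-chip GDDR5 interface width
CHIP_DENSITY_GBIT = 2       # 2 Gbit = 256 MB per chip

chips_per_side = BUS_WIDTH_BITS // CHIP_INTERFACE_BITS       # 12 chips fill the bus
total_chips = chips_per_side * 2                             # 24 in clamshell mode
total_mb = total_chips * CHIP_DENSITY_GBIT * 1024 // 8       # Gbit -> MB

print(chips_per_side, total_chips, total_mb)  # 12 24 6144
```

That lines up with the rumored 6,144 MB and explains why a back-plate (covering the rear-mounted chips) would be back.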

On its Radeon HD 7900 series single-GPU graphics cards based on the "Tahiti" silicon (which features the same memory bus width), AMD used 3 GB as the standard amount, while 2 GB is standard for the GeForce GTX 680; non-reference 4 GB and 6 GB variants of the GTX 680 and HD 7970, respectively, are nevertheless quite common. SweClockers also learned that NVIDIA is preparing to price the new card in the neighborhood of $899.





*View at TechPowerUp Main Site*


----------



## Nordic (Feb 5, 2013)

A titan out there... and coming... 

What will be Mt. Olympus's champion in the coming battle?


----------



## Protagonist (Feb 5, 2013)

Oh, my head hurts ever since the news of the GeForce Titan, and I'm feeling dizzy too... I think I should take a vacation until it launches in the real world.


----------



## syeef (Feb 5, 2013)

james888 said:


> A monster is out there... and coming...



When they release it and you buy one, they'll be announcing a 12 GB next-gen GPU.


----------



## Nordic (Feb 5, 2013)

syeef said:


> When they release it and you buy one, they'll be announcing a 12 GB next-gen GPU.



I couldn't care less about the 6 GB of memory. We barely use 1 GB of memory in most games at the popular resolutions. I was talking about everything else: 7.1 billion transistors.

I will also be keeping my 7970 for, I hope, the next two generations. It's a great card and more than I need right now, even before overclocking.


----------



## Enmitynz (Feb 5, 2013)

$899 USD, yikes. I wonder how long until we hear of a dual-GK110 card in the near-$2K price range? Lol. As long as the performance is there... NVIDIA will charge the earth, and tbh, why wouldn't they?


----------



## FordGT90Concept (Feb 5, 2013)

I was interested until I saw the $899 price tag.  If they offered it for $300, I'd be tempted to have another go at NVIDIA.


----------



## buildzoid (Feb 5, 2013)

$899 for it, but if my calculations are right it should consume around 300 W, which I wonder how they're gonna cool. Most likely the clocks will be pulled down to 800 MHz so that the TDP drops, but then performance should be about 39% greater than that of a GTX 680, at which point the 80% price increase is stupid. If it has unlocked voltage I would buy it, but most likely this will end the same way as the 600 series: with Boost and locked volts.
So I will stick with AMD.
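For what it's worth, the poster's value argument is easy to check on its own terms. A sketch using their numbers (the ~$500 GTX 680 reference price and the 39% gain are the poster's assumptions, not confirmed figures):

```python
# Sketch of the poster's perf-per-dollar argument (their numbers, unverified).
GTX_680_PRICE = 500.0   # assumed GTX 680 street price, USD
TITAN_PRICE = 899.0     # rumored Titan price, USD
PERF_GAIN = 1.39        # Titan performance relative to GTX 680 (poster's estimate)

price_increase = TITAN_PRICE / GTX_680_PRICE - 1          # fractional price increase
perf_per_dollar = PERF_GAIN / (TITAN_PRICE / GTX_680_PRICE)  # relative value

print(round(price_increase, 2))   # 0.8  -> the "80% price increase"
print(round(perf_per_dollar, 2))  # 0.77 -> worse value per dollar than a 680
```

So under those assumptions the Titan would deliver about 23% less performance per dollar than a GTX 680, which is the crux of the complaint.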


----------



## The Von Matrices (Feb 5, 2013)

james888 said:


> We barely use 1gb of memory in most games.



I beg to differ.  If this truly is a high-end card, then it needs all the memory it can get.  This card is expected to be used by people with 4K and greater resolutions.  When I run Skyrim at 5760x1200 with high-resolution textures I can easily use 2.5 GiB of graphics memory.  I only expect that to become the norm with new-generation consoles, and therefore more detailed games, coming out.
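To put rough numbers on why resolution drives memory demand: raw render targets scale linearly with pixel count, and the textures that dominate figures like the 2.5 GiB above are sized to match. A sketch of the render-target scaling only (RGBA8 and triple buffering are illustrative assumptions, not measured figures):

```python
# Rough scaling of raw render-target size with resolution. Actual VRAM use
# is dominated by textures, so this only illustrates the growth trend.
def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=3):
    """Approximate triple-buffered RGBA8 framebuffer size in MiB."""
    return width * height * bytes_per_pixel * buffers / 2**20

for w, h in [(1920, 1080), (2560, 1600), (5760, 1200), (3840, 2160)]:
    print(f"{w}x{h}: {framebuffer_mib(w, h):.0f} MiB")
```

The buffers themselves stay small, but a 5760x1200 surround setup pushes over three times the pixels of 1080p, and everything sized per-pixel (G-buffers, AA samples, mip levels actually resident) grows with it.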


----------



## Nordic (Feb 5, 2013)

The Von Matrices said:


> I beg to differ.  If this truly is a high-end card, then it needs all the memory it can get.  This card is expected to be used by people with 4K and greater resolutions.  When I run Skyrim at 5760x1200 with high-resolution textures I can easily use 2.5 GiB of graphics memory.  I only expect that to become the norm with new-generation consoles, and therefore more detailed games, coming out.



You are right. I was thinking of 1080p. If one has a card like this I hope they have higher than 1080p.


----------



## The Von Matrices (Feb 5, 2013)

On another note, I think this would be the most memory chips ever on a reference single-GPU graphics card, at 24. The GTX 295 still holds the reference-card crown at 28, and the ASUS ARES III would hold the overall record at 32 if it were released.


----------



## D007 (Feb 5, 2013)

899, let the raping begin.. I'm not paying 899..lol..


----------



## repman244 (Feb 5, 2013)

buildzoid said:


> $899 for it, but if my calculations are right it should consume around 300 W, which I wonder how they're gonna cool. Most likely the clocks will be pulled down to 800 MHz so that the TDP drops, but then performance should be about 39% greater than that of a GTX 680, at which point the 80% price increase is stupid. If it has unlocked voltage I would buy it, but most likely this will end the same way as the 600 series: with Boost and locked volts.
> So I will stick with AMD.



300 W of heat isn't such a problem IMO; look at all the dual-GPU cards that are rated at 375 W (and go way beyond that when overclocked).


----------



## dj-electric (Feb 5, 2013)

I feel bad for non-reviewers :X
Even if it costs $800


----------



## silapakorn (Feb 5, 2013)

Don't forget that the GTX 690 is now roughly $899-950. The price for Titan seems quite normal to me.


----------



## SIGSEGV (Feb 5, 2013)

> SweClockers also learned that NVIDIA preparing to price the new card in the neighborhood of $899.



The PlayStation 4 “Orbis” is going to debut on the market later this year (on February 20, 2013) at price-points starting from *$350 to $400.*


----------



## Optimis0r (Feb 5, 2013)

Still more than happy with my GTX 680 at this point, even if this turns out to be a monster. I'm only gaming at 1080p anyway, so all that RAM would be useless to me.

The last card I had was a GTX 560 Ti, and before that an HD 5870 - i.e. gaming cards. GTX 680 = gaming card. GK110 just doesn't make sense to me as a gaming card. Like the GTX 480/580, it would obviously consume much more power. For HPC, yes, these GPUs will do a job, but for gaming I think it's not the best/optimum solution.

Yes, I understand people want the best of the best, but at that alleged price point it's ridiculous. That's not to say I'm not excited about this, though. Unless AMD does miracles with GCN, the performance crown would almost certainly, without argument, go to NVIDIA. But if AMD's pricing is very competitive, combined with great gaming bundles, then who knows how the market will turn out. YAY, GPU wars.


----------



## Nordic (Feb 5, 2013)

I wonder how it would be for WCG


----------



## Prima.Vera (Feb 5, 2013)

I smell dual GPU here....


----------



## xenocide (Feb 5, 2013)

SIGSEGV said:


> The PlayStation 4 “Orbis” is going to debut on the market later this year (on February 20, 2013) at price-points starting from *$350 to $400.*



It's likely being *announced* on 2/20/2013, but it won't be available until the end of the year.


----------



## RCoon (Feb 5, 2013)

Prima.Vera said:


> I smell dual GPU here....



Me too. Same price as a 690. Almost the same TDP too.


----------



## dj-electric (Feb 5, 2013)

^Is the Tesla K20 a dual-gpu card?


----------



## iO (Feb 5, 2013)

$900 is not much if you consider the size and cost of such a huge die. They won't make much profit at this pricing.


----------



## seronx (Feb 5, 2013)

http://www.nvidia.com/content/PDF/kepler/Tesla-K20X-BD-06397-001-v05.pdf

World's best rumors...


----------



## progste (Feb 5, 2013)

With this one, 4K gaming should be a piece of cake.


----------



## dj-electric (Feb 5, 2013)

Really? Does the GeForce GTX 660 handle games at 1920x1080 like a piece of cake? Well, that's the equation here...


----------



## progste (Feb 5, 2013)

Dj-ElectriC said:


> Really? Does the GeForce GTX 660 handle games at 1920x1080 like a piece of cake? Well, that's the equation here...



well, yes it does


----------



## dj-electric (Feb 5, 2013)

Yes, in some games it is, in some it's on its knees. Unless you give up on some major details


----------



## progste (Feb 5, 2013)

Dj-ElectriC said:


> Yes, in some games it is, in some it's on its knees. Unless you give up on some major details



On its knees? Not really; it can keep acceptable framerates in everything with max settings. Maybe not 60 fps, but 30 in pretty much every game.


----------



## Prima.Vera (Feb 5, 2013)

As far as I know and am aware, the architecture of Tesla is a "little" different from that of a gaming card. Heck, even the Quadro series is a little different from gaming cards. So I won't be in a hurry to compare this Titan with Tesla just yet, even if some of the specs coincide...


----------



## Ikaruga (Feb 5, 2013)

progste said:


> On its knees? Not really; it can keep acceptable framerates in everything with max settings. Maybe not 60 fps, but 30 in pretty much every game.



Last time I checked 30 fps was only acceptable on consoles


----------



## progste (Feb 5, 2013)

Ikaruga said:


> Last time I checked 30 fps was only acceptable on consoles



=p


----------



## Lemosopher (Feb 5, 2013)

There are many complaints about the price tag, but people have to keep in mind that the desktop market doesn't rule electronics anymore. I expected flagship GPUs, as well as other components, to go up in price. They simply are not going to sell as many of these components as they have in years past.


----------



## tokyoduong (Feb 5, 2013)

Ikaruga said:


> Last time I checked 30 fps was only acceptable on consoles



30 FPS is acceptable if it's consistent. And don't flame-bait; you really don't need to mention consoles there, as they pretty much work the same as a locked-down PC. 

I think this card is awesome even if it's only 75% faster than a GTX 680. Although I would never buy it at $900. I just refuse to pay that much for a video card regardless of its performance.


----------



## Prima.Vera (Feb 5, 2013)

Agreed. Anything more than $500 is not worth it, especially in the era of console ports.


----------



## Solidstate89 (Feb 5, 2013)

Prima.Vera said:


> As far as I know and I am aware, the architecture of Tesla is a "little" different from the one of a gaming card. Heck, even the Quadro series is a little different from gaming cards. So I won't be in a hurry in comparing this Titan with Tesla just yet, even is some of the specs coincide...



The architecture of the GeForce, Quadro, and Tesla cards is identical. The only differences are driver optimizations and the fact that, with the Quadro and Tesla cards, NVIDIA doesn't artificially limit double-precision FP operations like it does with the GeForce cards.

Beyond those differences in drivers and artificial limitations (and usually the amount of memory as well), the actual architecture of the GPU is identical.


----------



## Prima.Vera (Feb 5, 2013)

Then try to install a modded GeForce driver on a Tesla card and see what happens...


----------



## progste (Feb 5, 2013)

The GPU architecture should be very similar, but the rest of the card is different.


----------



## Ikaruga (Feb 5, 2013)

tokyoduong said:


> I think this card is awesome even if it's only 75% faster than a GTX 680. Although I would never buy it at $900. I just refuse to pay that much for a video card regardless of its performance.


First of all, I love that NVIDIA gives us faster cards, and I'm a big fan of Kepler when it comes to gaming, tbh; since I never said otherwise, I assume that part was not aimed at me.



tokyoduong said:


> 30 FPS is acceptable if it's consistent. And don't flame bait, you really don't need to mention consoles in there as they pretty much work the same as a locked down PC.


Unless this is an enthusiast site....


----------



## tacosRcool (Feb 5, 2013)

6 GBs of RAM..... is everybody readying for 4K gaming in the future?


----------



## Frick (Feb 5, 2013)

tacosRcool said:


> 6 GBs of RAM..... is everybody readying for 4K gaming in the future?



Forget about 4K, GIVE ME MY AFFORDABLE 1440/1600 MONITORS!


----------



## progste (Feb 5, 2013)

Frick said:


> Forget about 4K, GIVE ME MY AFFORDABLE 1440/1600 MONITORS!


There are lots of nice 1080p monitors around $100.


----------



## Frick (Feb 5, 2013)

progste said:


> there are lots of nice 1080p monitors around 100$



2560x1440/1600


----------



## progste (Feb 5, 2013)

Frick said:


> 2560x1440/1600



oh, i misunderstood


----------



## Shihab (Feb 5, 2013)

I wouldn't mind shelling out $600-700 for that card, IF the performance lives up to its _rumoured_ specs. Dunno, but this card seems too good to be true.

6 GB is way too much, though; shave off 3 GB and drop the price a few hundred dollars, please. 




Prima.Vera said:


> Agree. Anything more than 500$ is not worth it. Especially in the era of console ports.



Indeed, but let's be honest here and give credit to the people at DICE, Crytek, and (it hurts me to say it) Ubisoft >_> There are still games worth playing on an HEDT.
I'm waiting for BF4, Crysis 3 looks amazing, and Far Cry 3 proved the need for fast hardware well.


----------



## Lionheart (Feb 5, 2013)

Wonder if it could handle 3x 4K monitors in Nvidia surround lolz


----------



## Solidstate89 (Feb 5, 2013)

Prima.Vera said:


> Then try to install a modded driver of a GeForce to a Tesla card and see what happens...



It prevents you, because the firmware is different.

That has absolutely *nothing* to do with the architecture of the GPU itself. It's all software.


----------



## Prima.Vera (Feb 5, 2013)

Solidstate89 said:


> It prevents you as the firmware is different.
> 
> That has absolutely *nothing* to do with the architecture of the GPU itself. It's all software.



So you're saying that the only difference between Tesla, Quadro and GeForce is in software?


----------



## Optimis0r (Feb 5, 2013)

Well, Fermi (GF100/110) was the one GPU to rule them all: it shared the same die between the gaming cards (GeForce), the HPC cards (Tesla), and the professional cards (Quadro). So yes, it's down to software differences, i.e. accelerating games vs. HPC vs. professional software such as CAD.


----------



## nickbaldwin86 (Feb 5, 2013)

Finally... when it comes to high res, all I want is more memory. 4 GB isn't enough.


----------



## hardcore_gamer (Feb 5, 2013)

899 bucks? I'd rather save my money and buy both a PS4 and an Xbox 720.


----------



## Easy Rhino (Feb 5, 2013)

I can see the marketing campaign now!

GEFORCE TITAN 780..Easily Satisfying the Demands of All Your Console Ports!


----------



## Solidstate89 (Feb 5, 2013)

Prima.Vera said:


> So you're saying that the only difference between Tesla, Quadro and GeForce is in software?



And the amount of memory placed on the PCB, and the firmware - yes. There are no architectural differences in the GPU itself. The most you'd see is some disabled CUDA clusters here and there, but that doesn't change the underlying architecture.


----------



## Hilux SSRG (Feb 5, 2013)

hardcore_gamer said:


> 899 bucks ? I'd rather save my money for buying both PS4 and XBOX 720.



Sure, if you want to stay at 1080p. 

I'm rather interested in the remaining pieces of the 700 series. Possibly GTX 760 = current GTX 670?


----------



## nickbaldwin86 (Feb 5, 2013)

Why are you guys worrying about 1080p? I would hope that anyone who spends $900 on a card has a 1600p or 3D Surround setup.


----------



## Filiprino (Feb 5, 2013)

$900 USD, lol.


----------



## Fluffmeister (Feb 5, 2013)

Too expensive, who needs this to play console ports... blah blah blah.

This place never gets old.


----------



## Cortex (Feb 5, 2013)

$900, still no improvement in perf/USD, 2bad. Yawn. That's not how you popularize PC gaming.

Tough times for NVIDIA: Tesla has a problem with Xeon Phi, Tegra 4 apparently sucks, the next-gen consoles use AMD hardware (a mistake IMHO, both in absolute performance and perf/Watt terms), and GeForce might be next in trouble (I certainly hope so, because of their price politics, PhysX restrictions, and the 1/24 double-precision performance of the GTX 600 series; they practically invented, and then killed, GPGPU for consumers)...


----------



## tokyoduong (Feb 5, 2013)

It used to be that PC gaming was clearly superior. Graphics were light years ahead, controls were... more than four buttons, lol. And games could be extensive and lengthy.
Now that difference is much smaller, while the difference in price remains a huge gap. A $250 console is ready to game with any TV and works as soon as you turn it on, without much hassle. PC gaming needs over $1,000 and comes with a lot more problems. 

From my experience, Xbox Live seems to have far fewer server problems than any of the PC titles. There tend to be fewer hackers and cheaters on Xbox Live. However, the big downside is that there are a lot of screechy 10-year-old boys who will curse at anything unfavorable to them.

The answer is clear for the vast majority of people who want to game. It's cheaper, it's simpler, it works much more often, it's cheaper, it's much more reliable, it eats less juice, it's cheaper, and... it requires no upgrades, so therefore it's cheaper.


----------



## MxPhenom 216 (Feb 5, 2013)

This card will allow you to go from 100fps in console ports to over 9000fps!


----------



## yogurt_21 (Feb 5, 2013)

silapakorn said:


> Don't forget that GTX690 is now roughly 899-950$. The price for Titan is quite normal to me.



Which is exactly what makes me think the Titan is a dual-GPU card. Plus, the memory amount makes more sense as 2x 3 GB than as a single-GPU 6 GB card. Then I'd expect the single-GPU variant to be in the $450-500 range with 3 GB while offering 10-20% more performance than the 680.


----------



## Shihab (Feb 5, 2013)

Cortex said:


> Tough times for nVidia



"Problem?"

-Nvidia



tokyoduong said:


> From my experience, xbox live seems to have much less server problems than any of the PC titles.


Well, you see, it's not as easy to maintain a 64-player server as it is a 24-player server.




tokyoduong said:


> it requires no upgrades so therefore it's cheaper.



By that logic, all mainstream/performance laptops would be considered the peak of economic efficiency. 

PCs don't "require" upgrades; advanced game engines do. If you are going to stall that, well....


----------



## Frick (Feb 5, 2013)

tokyoduong said:


> It used to be that PC gaming was clearly superior. Graphics were light years ahead, controls were....more than 4 buttons lol. And the fact that the game can be extensive and lengthly.



The graphics bit might be true now, but historically it's not. Look at the old SNES JRPGs, for instance (for extensive and lengthy), or the original Tomb Raider and GoldenEye for the N64 (for graphics).

And in certain genres the console has always been superior (fighting and sports come to mind).


----------



## Shihab (Feb 5, 2013)

yogurt_21 said:


> which is exactly what makes me think Titan is a dual gpu. Plus the memory amount  2x 3Gb makes more sense then a single gpu 6gb card. Then I'd expect the single gpu variant to be in the 450-500$ range with 3gb while offering 10-20% more performance than the 680.




Point to note: what's known for sure is that the Titan will sport a GK110, which is faster than the GK104 used in the GTX 690.
Now, it's difficult to believe that NVIDIA would retail a card based on _two_ state-of-the-art/flagship GPUs for _less_ than another card with two lesser GPUs.


----------



## radrok (Feb 5, 2013)

How did this become a PC vs. console thread? 

There's no doubt PC has its own advantages and consoles have theirs.

One thing is for sure: you can't put all the passion you put into your PC into a console.


On topic : If this is a single GPU and will have unlocked voltage then count me in for one, I've been longing for a worthy 6990 upgrade.


----------



## Hayder_Master (Feb 5, 2013)

But in this case the GTX 690 will be a better deal than this one at 1080p resolution.


----------



## HumanSmoke (Feb 5, 2013)

Hayder_Master said:


> But in this case the GTX 690 will be a better deal than this one at 1080p resolution.


Yup...because the target market for $900 graphics cards uses 1920x1080 


To those bitching and moaning about the graphical detail of "console ports": I suggest that software development generally follows hardware development. I'd be more concerned with a game's content (originality) than with its textures' ability to saturate the framebuffer, or with looping post-process compute functions whose end result yields minimal image-quality improvement while heavily decreasing framerate.


----------



## Easy Rhino (Feb 5, 2013)

HumanSmoke said:


> Yup...because the target market for $900 graphics cards uses 1920x1080
> 
> 
> To those bitching and moaning about the graphical detail of "console ports", I suggest that software development generally follows hardware development. I think I'd be more concerned with the content (originality) of a game before its textures' ability to saturate the framebuffer, or looping post-process compute function for an end result that yields minimal image quality improvements while heavily decreasing framerate.



Yes, software development generally follows hardware development, but it also follows the money. In case you haven't noticed, for the past six years developers have been heavily focused on console titles. The hardware doesn't change, so you get a nice static API to work with, which means more efficient developers, which leads to higher profit margins. This is not going to change.

Also note that yes, PC sales have been up the past couple of years, but that is mainly because the games being developed are not pushing the hardware. Developers are not bothering!


----------



## HumanSmoke (Feb 5, 2013)

Easy Rhino said:


> yes, software development generally follows hardware development but it also follows the money. in case you havn't noticed, for the past 6 years developers have been heavily focused on console titles. the hardware doesn't change so you get a nice static API to work with which means more efficient developers which leads to higher profit margin. this is not going to change


It probably has to. Integrated graphics are evolving faster than discrete, and that benefits the one company that couldn't give a shit about game-dev relations. With screen resolutions and APIs (D3D, OGL) raising the graphics horsepower requirement only fractionally, and rasterization set to remain the only form of gaming rendering, the ball is in NVIDIA's court - and increasingly AMD's (with their console hardware and Gaming Evolved program).

On a related note, here's a graph I made (numbers averaged between JPR and Mercury Research where both were available) for an article (still under consideration at another site) tracing the history of graphics from the 1950s military simulators to the present day. The trend is pretty self-explanatory.


----------



## brandonwh64 (Feb 5, 2013)

With market share like that, Intel should make a GPU that is decent to game on and they would skyrocket.


----------



## Protagonist (Feb 5, 2013)

Easy Rhino said:


> I can see the marketing campaign now!
> 
> GEFORCE TITAN 780..Easily Satisfying the Demands of *All* Your Console Ports!



Don't forget that not all console games come to PC. E.g. I would have loved to see MGS4: Guns of the Patriots come to PC, but it was a PS3 exclusive, so I had to get a PS3 just to play the game.

As someone wrote, I'd rather save the $899 and buy both a PS4 and an Xbox 720 just so I can play the exclusive titles that I love. I wish those exclusives were multi-platform so I could just use the PC, but it's not like that.


----------



## Easy Rhino (Feb 5, 2013)

brandonwh64 said:


> With a market share like that intel should make a GPU that is decent to game on and they would skyrocket.



The point is they don't have to. It is probably more profitable not to build a discrete GPU. 

If history is any indicator, the GPU card is going the way of the sound card, Ethernet card, and RAID card. Unless you need the absolute best, you simply don't need one.


----------



## Tatty_One (Feb 5, 2013)

Am I missing something here?  They will get thousands of complaints..... all those 32-bit systems with no RAM left to play the game with...... keen gamers are not necessarily tech enthusiasts.


----------



## HumanSmoke (Feb 5, 2013)

Tatty_One said:


> Am I missing something here?  They will get *thousands* of complaints..... all those *32bit systems *with no Ram left to play the game with...... keen gamers are not necessarily tech enthusiasts



/not sure if you're playing the ironic card

Just a wild guess on my part, but there could be an outside possibility that someone spending $900 on graphics could possibly be using a 64-bit OS. And I'm not sure that "thousands" of prospective Titan owners would still be tied to a 32-bit operating system...if indeed, thousands of Titan cards are actually produced.


----------



## Prima.Vera (Feb 5, 2013)

Maybe Crysis 3 or Witcher 3 to maximize the VRAM on 3 monitors with 8xFSAA and 18xAF?


----------



## [H]@RD5TUFF (Feb 5, 2013)

I do hope this is true; with 4K approaching, I am excited to get my hands on one of these!


----------



## D007 (Feb 5, 2013)

Dj-ElectriC said:


> Yes, in some games it is, in some it's on its knees. Unless you give up on some major details



Lmfao, you must be kidding.. What games?  
Don't just say "blah blah blah, the 6 is bad and blah blah, games run bad" - 
post proof or it never happened..

As it is, every single game I have played with my 680 is maxed out, and it breezes through like a dream..

As for 4K gaming.. Yeah, $1,000 for the card, $5,000 for the TV.. Have fun with that..


----------



## erocker (Feb 5, 2013)

The price sets a bad precedent. Aren't the next generation of cards supposed to perform better? Are we to expect a price increase with every new generation of cards? Double the performance (if that is to be believed), so what. The GTX 680 was released a year ago; this card should be at the (already inflated) price of the 680. I can easily afford the $899 price tag, but it is asking too much in a world of console ports and dying PC exclusives.


----------



## Casecutter (Feb 5, 2013)

Just a halo product... not the second coming (7xx)... Just "Titan", so that for those who insisted it couldn't be done, Jen-Hsun Huang can put a check in the box. It's not supposed to make sense to us. As for price: if you have to take a pry bar to your wallet, step aside; they'll have enough fish that step up! With limited production they can control the channel and cut it off once sales taper off. I'd say it will have Boost and will probably be voltage locked. We wait for its eventual release.

Is it that hard to understand?


----------



## erocker (Feb 5, 2013)

Casecutter said:


> Is it that hard to understand?



If you're replying to me, no it is not. Best of luck to those who shell out money for it. Good for them.


----------



## GSquadron (Feb 6, 2013)

seronx said:


> http://www.nvidia.com/content/PDF/kepler/Tesla-K20X-BD-06397-001-v05.pdf
> 
> World's best rumors...



Thanks, it looks very important!


----------



## HumanSmoke (Feb 6, 2013)

erocker said:


> The price sets a bad precedent.


With all due respect, it sets nothing.

The latest'n'greatest single-GPU cards of this generation debuted at $499-549, and NVIDIA seems keen on distancing the Titan from the GTX nomenclature, indicating that the card, like other limited editions, resides outside of the standard consumer model.

If $899 were setting a precedent, then we would already be experiencing it, since the GeForce 6800 Ultra 512MB (14 March 2005) debuted at that exact same price (and $999 for the BFG OC'd version).


----------



## erocker (Feb 6, 2013)

HumanSmoke said:


> With all due respect, it sets nothing.
> 
> The latest'n'greatest single-GPU cards of this generation debuted at $499-549, and NVIDIA seems keen on distancing the Titan from the GTX nomenclature, indicating that the card, like other limited editions, resides outside of the standard consumer model.
> 
> If $899 were setting a precedent, then we would already be experiencing it, since the GeForce 6800 Ultra 512MB (14 March 2005) debuted at that exact same price (and $999 for the BFG OC'd version).



Yes, I'm referring to the past few years. I'm fully aware of how expensive cards used to be. But that's fine; people can pay what they want. The point of my post was that I will not. We are also getting pretty darn close to the time frame of the "next generation" with this high-performance "limited edition" card.


----------



## Fluffmeister (Feb 6, 2013)

The card will sell just fine, and nVidia knows it.


----------



## HumanSmoke (Feb 6, 2013)

erocker said:


> Yes, I'm referring to the past few years. I'm fully aware how expensive cards used to be. But that's fine, people can pay what they want, the point of my post was that I will not.


You won't be alone. 99.99999% of consumers wouldn't buy the card either... but then, someone who buys a $450 pre-built would probably say the same thing about buying a card at $349. If you can't evince an interest in OTT high-dollar tech on a tech enthusiast site, where are you gonna go?
If value for money and performance-per-dollar were the ultimate criteria for everyone (and not just the 99.99999% majority), then where does that leave multi-GPU, bespoke water/chiller cooling, and the like?
As a passion and hobby, I personally don't have to find justification for the expenditure. It is what it is.


erocker said:


> We are also getting pretty darn close to the time frame of the "next generation" with this high-performance "limited edition" card.


Firstly, I doubt whether any of the refreshed GK114/Curacao parts are going to top this card, so from a purely performance POV the only real argument is whether spending $1K on SLI/CFX from the upcoming generation makes a convincing case for those with the cash, and whether the 6 GB framebuffer is required for the intended workload. As for perf/$, the Titan is already way down the list without even taking the next gen into account, since SLI'd OC'd GTX 670/680s or CFX'd HD 7970s are guaranteed to beat it in most (if not all) benchmarks... That hasn't stopped a slew of AMD's board partners selling HD 7990s, or Sapphire hawking the ridiculously priced 6 GB 7970 Toxic.

By all accounts TSMC's 20nm process could be late arriving, so I don't see any huge performance increases coming from refreshes of the current parts - not unless NVIDIA and AMD want to throw die size and power budget out the window.


----------



## erocker (Feb 6, 2013)

You seem to be mistaking me for someone who thinks no one will buy this card. I never said that, nor do I dispute anything you have said.


So, okay, I'm glad you stated your opinion.


----------



## Tatty_One (Feb 6, 2013)

HumanSmoke said:


> /not sure if you're playing the ironic card
> 
> Just a wild guess on my part, but there could be an outside possibility that someone spending $900 on graphics could possibly be using a 64-bit OS. And I'm not sure that "thousands" of prospective Titan owners would still be tied to a 32-bit operating system...if indeed, thousands of Titan cards are actually produced.



Last time I looked there were considerably more 32-bit OS sales than 64-bit, and that was including OEMs; when I say considerably more, it was about four times more. But that was probably a year or 18 months ago, so things may have significantly changed...... As I said, in the big wide world, avid gamers are not necessarily hardware enthusiasts. However, you are right: thousands of the cards probably won't be sold, so I should have put something like "all those 32-bit OS owners who buy this card are going to get very upset". At the very least, limiting potential sales by being tied to an OS is not really a good idea IMO. What would be, though, is to ditch 32-bit possibly altogether.... I dunno..... a lot of businesses would probably be unhappy with that.


----------



## Ikaruga (Feb 6, 2013)

Tatty_One said:


> Last time I looked, there were considerably more 32-bit OS sales than 64-bit, and that was including OEMs; when I say considerably more, it was about four times more, but that was probably a year or 18 months ago, so things may have significantly changed...



Just a quick note: I think the target audience (e.g. gamers) is more likely to use 64-bit Win7. The Steam survey also reflects this, showing Win7 x64 with a massive 55.55%, ahead of all the rest.


----------



## Deleted member 67555 (Feb 6, 2013)

I love how people are expecting 4K games to become the norm because a new gen of consoles will be out soon... which, btw, will be another gen of 1080p consoles, just with better frame rates, field of view and overall speed... which means PCs will get slightly better ports...

News flash: 4K still isn't standard for anything yet. Game devs aren't going to tailor to the PC market, and the glory days where a single piece of PC hardware was acceptably priced at $900 are long gone, especially when anyone can build a better-than-console gaming machine for $500.

There's PC enthusiasm and there's PC reality, and right now the reality is that ports are sucking the enthusiasm right out of PCs.


----------



## Prima.Vera (Feb 6, 2013)

Ikaruga said:


> Steam survey also reflects this showing win7x64 a massive 55.55% lead over all the rest.



55% is a massive lead to you??


----------



## Recus (Feb 6, 2013)

Prima.Vera said:


> 55% is a massive lead to you??



Sure.


----------



## Prima.Vera (Feb 6, 2013)

Recus said:


> Sure.
> 
> http://s7.postimage.org/ydlxoun57/image.png



Yeah, I get it now, but it's kinda misleading. They should have a comparison between Win7/8 x64 and x32 just to see the real percentages.


----------



## Tatty_One (Feb 6, 2013)

Ikaruga said:


> Just a quick note: I think the target audience (e.g. gamers) is more likely to use 64-bit Win7. The Steam survey also reflects this, showing Win7 x64 with a massive 55.55%, ahead of all the rest.



I agree, but I am willing to bet there will still be a decent proportion of 32-bit gamers out there. Let's face it, 50% of users probably don't know the difference between 32-bit and 64-bit, and your Steam survey link kind of suggests there are millions of 32-bit OS users. I am not suggesting it will be an issue, although it will be to *any* 32-bit users who do buy the card.


----------



## Solidstate89 (Feb 6, 2013)

Prima.Vera said:


> Yeah, I get it now, but it's kinda misleading. They should have a comparison between Win7/8 x64 and x32 just to see the real percentages.



They do. Look at the versions without the x64.


----------



## Ikaruga (Feb 6, 2013)

Tatty_One said:


> I agree, but I am willing to bet there will still be a decent proportion of 32-bit gamers out there. Let's face it, 50% of users probably don't know the difference between 32-bit and 64-bit, and your Steam survey link kind of suggests there are millions of 32-bit OS users. I am not suggesting it will be an issue, although it will be to *any* 32-bit users who do buy the card.



Indeed, but I only posted it to add some data to the conversation, not to argue. 
Btw, they don't have to know whether it's 32-bit or 64-bit, because the survey is automatic once the user agrees to participate. They have more than 20 million active accounts and the sample they take is quite large, so we can consider it a good representation of PC gamers. So it's *about* 13 million on 64-bit Windows vs. 5 million on 32-bit on Steam (Win7, Vista and XP combined).
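Those figures are easy to sanity-check with a quick back-of-envelope script. The OS shares below are illustrative stand-ins roughly in the ballpark of the early-2013 survey (only the 55.55% Win7 x64 figure comes from this thread), applied to an assumed 20 million active accounts:

```python
# Rough estimate of 64-bit vs 32-bit Windows users on Steam.
# Shares are illustrative placeholders, not exact survey numbers.
active_accounts = 20_000_000

os_share = {
    "Win7 x64":  0.5555,   # figure cited in the thread
    "Vista x64": 0.02,
    "XP x64":    0.01,
    "Win7 x86":  0.10,
    "Vista x86": 0.03,
    "XP x86":    0.12,
}

# Scale each share up to the assumed active-account count.
users_64bit = sum(active_accounts * s for os, s in os_share.items() if "x64" in os)
users_32bit = sum(active_accounts * s for os, s in os_share.items() if "x86" in os)

print(f"~{users_64bit / 1e6:.0f} M on 64-bit vs ~{users_32bit / 1e6:.0f} M on 32-bit")
```

With these placeholder shares the split comes out near the 13 million vs. 5 million mentioned above, i.e. roughly 70% of Steam's Windows users on 64-bit.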


----------



## drdeathx (Feb 6, 2013)

Dj-ElectriC said:


> I feel bad for non-reviewers :X
> Even if it costs 800$



This will undoubtedly be a loaner card for most reviewers.


----------



## AsRock (Feb 6, 2013)

Makes me so happy that AMD did just a refresh, as at least they should not be at these prices, although I do hope they did some nice tweaks for the non-OEM versions.


----------



## TheMailMan78 (Feb 6, 2013)

900 dollar card for console ports.


----------



## jihadjoe (Feb 7, 2013)

Those Steam results are interesting. Left to my own devices, I never would have guessed 64-bit adoption was already that high.


----------



## xtremesv (Feb 7, 2013)

$900!!!??? Anyway, I would buy the 770 instead, but if this price is right for the 780 then the 770 would cost around $700!!! I ain't paying that, gentlemen.


----------



## Crap Daddy (Feb 7, 2013)

Listed in Denmark, 6GB

http://www.proshop.dk/Grafikkort/ASUS-GeForce-GTX-Titan-6GB-GDDR5-2394804.html

A quick currency conversion says 975 euros. An EVGA GTX 690 on the same site is at 990 euros.


----------



## BarbaricSoul (Feb 7, 2013)

james888 said:


> I wonder how it would be for WCG



This is what I'm thinking about, and the primary reason I'm planning on buying one.


----------



## brandonwh64 (Feb 7, 2013)

BarbaricSoul said:


> This is what I'm thinking about, and the primary reason I'm planning on buying one.



It would be a toss-up, due to NVIDIA not being as good as AMD at WCG.


----------



## xenocide (Feb 7, 2013)

jihadjoe said:


> Those Steam results are interesting. Left to my own devices, I never would have guessed 64-bit adoption was already that high.



I believe Windows 7 was the first OS that launched with both, and would install either 32-bit or 64-bit based on what hardware you had, but I could be wrong.



brandonwh64 said:


> It would be a toss-up, due to NVIDIA not being as good as AMD at WCG.



I thought that was only the 6xx series, because of its lacking GPGPU functionality?


----------



## BarbaricSoul (Feb 7, 2013)

Same here


----------



## Nordic (Feb 7, 2013)

A 580 puts out 30k PPD, from what I have read. My 7970 puts out 130k PPD.


----------



## HumanSmoke (Feb 7, 2013)

WCCF is reporting an 8+2 phase power design for the GTX Titan (including a pic of the PCB layout).


----------

