Monday, February 18th 2013

NVIDIA GeForce GTX Titan Graphics Card Pictured in Full

Here it is, folks: the first pictures of NVIDIA's newest pixel-crunching dreadnought, the GeForce GTX Titan. Pictures leaked by various sources east of the Greenwich Meridian reveal a reference board design that's similar in many ways to that of the GeForce GTX 690, thanks to the magnesium alloy cooler shroud, a clear acrylic window letting you peep into the aluminum fin stack, and a large lateral blower. The card features a glowing "GeForce GTX" logo much like the GTX 690, draws power from a combination of 6-pin and 8-pin PCIe power connectors, and features two SLI bridge fingers letting you pair four of them to run 3DMark Fire Strike as if it were a console port from last decade.
The GeForce GTX Titan PCB reveals that NVIDIA isn't using a full-coverage IHS on the GK110 ASIC, but rather just a support brace. This allows enthusiasts to apply TIM directly to the chip's die. The GPU is wired to a total of twenty-four 2 Gbit GDDR5 memory chips, twelve on each side of the PCB. The card's VRM appears to be a 6+2 phase design that uses tantalum capacitors, slimline chokes, and driver MOSFETs. The PCB features a 4-pin PWM fan power output, and a 2-pin power output for the logo LED that's software-controllable.
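As a quick sanity check on what those chips add up to, here is a minimal back-of-the-envelope sketch (assuming the rumored 2 Gbit per-chip density holds):

```python
# Rough capacity check: 24 GDDR5 chips (12 per side of the PCB) at a rumored 2 Gbit each.
chips = 24
gbit_per_chip = 2
total_gbit = chips * gbit_per_chip   # 48 Gbit
total_gb = total_gbit / 8            # 6 GB
print(f"{total_gbit} Gbit total = {total_gb:.0f} GB of GDDR5")
```

That lines up with the 6 GB figure in the rumored specifications further down.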

Given the rumored specifications of the GTX Titan, the card could be overkill even for 2560 x 1600, and as such could be designed for NVIDIA 3D Vision Surround (three-display) setups. Display outputs include two dual-link DVI connectors, one HDMI, and one DisplayPort.

According to most sources, the card's specifications look something like this (a rough bandwidth estimate derived from these figures follows the list):
  • 28 nm GK110-based ASIC
  • 2,688 CUDA cores ("Kepler" micro-architecture)
  • 224 TMUs, 48 ROPs
  • 384-bit GDDR5 memory interface
  • 6 GB memory
  • Clocks:
    o 837 MHz core
    o 878 MHz maximum GPU Boost
    o 6008 MHz memory
  • 250W board power
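For reference, a minimal sketch of the theoretical peak memory bandwidth implied by the rumored 384-bit interface and 6008 MHz (effective) memory clock, treating that clock as a 6008 MT/s data rate:

```python
# Theoretical peak memory bandwidth from the rumored figures above.
bus_width_bits = 384
data_rate_mtps = 6008                                          # effective GDDR5 data rate, MT/s
bandwidth_gbs = bus_width_bits / 8 * data_rate_mtps / 1000     # bytes per transfer x MT/s -> GB/s
print(f"~{bandwidth_gbs:.0f} GB/s peak memory bandwidth")      # ~288 GB/s
```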
Sources: Egypt Hardware, VideoCardz

118 Comments on NVIDIA GeForce GTX Titan Graphics Card Pictured in Full

#26
KashunatoR
They should have only given it 4 GB of VRAM and shaved off 100 bucks...
#27
Nordic
zolizoli said: Not really. Even in the time of the GTX 280 it was affordable even at release, and it was the fastest single chip for a while.
I think they turned the GREED ENGINE on with the GTX 500 series, which was just a refreshed 400 series.
The 680 is insanely priced, and the GK104 wasn't even designed to be high end. But it performed better than expected, so why not fool the customers and rip them off as long as they can, keep yesterday's tech on a shelf, and when the customers recover financially, sell it as the future's wonder tech?
These greedy corporations are holding back our technological evolution.
Funding our future tech evolution.
#28
renz496
^ AFAIK even 3 GB is plenty for triple-monitor gaming. Maybe they (NVIDIA) want to brag about their performance with 4K-resolution monitors...
#29
micropage7

Somehow reminds me of the RoboCop theme, with the silver and black :laugh:
#30
Fluffmeister
zolizoli said: Not really. Even in the time of the GTX 280 it was affordable even at release, and it was the fastest single chip for a while.
I think they turned the GREED ENGINE on with the GTX 500 series, which was just a refreshed 400 series.
The 680 is insanely priced, and the GK104 wasn't even designed to be high end. But it performed better than expected, so why not fool the customers and rip them off as long as they can, keep yesterday's tech on a shelf, and when the customers recover financially, sell it as the future's wonder tech?
These greedy corporations are holding back our technological evolution.
Cool story bro!
#31
1c3d0g
EGGcellent. :rockout: This should be an amazing number-crunching GPU for us BOINC'ers and Folding@Home users. Too bad the price will be too high (~$900 last I heard), but at least by the end of the year the GTX 780 and crew will be available for a more reasonable amount.
#34
D4S4
That's one big-ass die. 700 cid big-block Chevy big.
#35
lastcalaveras
Should be a nice card, and the cut-down version for the 760 Ti should be even better for the price. However, there are two dreadful words that make me think twice: "NVIDIA Greenlight".
#36
Naito
zolizoli said: The 680 is insanely priced, and the GK104 wasn't even designed to be high end. But it performed better than expected, so why not fool the customers and rip them off as long as they can, keep yesterday's tech on a shelf, and when the customers recover financially, sell it as the future's wonder tech?
:shadedshu I'd say it would be highly unlikely that it was Nvidia's intention to 'fool' their customers. Whether the reason was due to yield on a new fabrication node or simply because AMD's card performed less than Nvidia had anticipated, it doesn't matter; it allowed AMD to stay very competitive. As for the 680, the card well and truly performs where one would expect a high-end card to perform, regardless of whether or not it was originally designed to be as such.
#37
Kaynar
renz496 said: ^ AFAIK even 3 GB is plenty for triple-monitor gaming. Maybe they (NVIDIA) want to brag about their performance with 4K-resolution monitors...
If this card really is 100% faster than the current top end, then 6 GB makes more sense, because this card will be plenty for at least 4-5 years of gaming, so they have to account for future needs... several titles already need nearly 3 GB of VRAM on one screen today.
#38
the54thvoid
Super Intoxicated Moderator
lastcalaveras said: Should be a nice card, and the cut-down version for the 760 Ti should be even better for the price. However, there are two dreadful words that make me think twice: "NVIDIA Greenlight".
Cut-down version? This is very much a standalone product. I have no idea about NVIDIA's refresh schedule, but there's a chance they'll release the 7xx series as a 6xx refresh. This card is intended to stand out and not even be considered a 6- or 7-series model.
#39
hardcore_gamer
This card will make me buy..


..an Xbox 720 and a PS4.
#41
BigMack70
Looks like the paper launch got delayed till tomorrow... :(
#42
the54thvoid
Super Intoxicated Moderator
BigMack70 said: Looks like the paper launch got delayed till tomorrow... :(
Unless the NDA lifts at 8 am Pacific Time. That means it's 4 pm here in the UK.
#43
symmetrical
Holy crap, it looks like a graphics card!
#44
Prima.Vera
Kaynar said: ... several titles already need nearly 3 GB of VRAM on one screen today.
Which titles? At 1080p or 1440p? And please don't tell me about the latest mods for Skyrim. I can run it just fine with only 1 GB of VRAM. :toast:
#45
symmetrical
zolizoli said: Not really. Even in the time of the GTX 280 it was affordable even at release, and it was the fastest single chip for a while.
I think they turned the GREED ENGINE on with the GTX 500 series, which was just a refreshed 400 series.
The 680 is insanely priced, and the GK104 wasn't even designed to be high end. But it performed better than expected, so why not fool the customers and rip them off as long as they can, keep yesterday's tech on a shelf, and when the customers recover financially, sell it as the future's wonder tech?
These greedy corporations are holding back our technological evolution.
The thing is though, regardless of what something was "meant" to be, if the price is right and people are willing to pay, people are going to pay, and people did pay.

Although I admit I copped my GTX 680 second hand for $400 3 months after launch because I didn't want to pay $550+ like I did with my GTX 580.

But at the time, the GTX 680's performance compared to a 580 was 1.5 to 1.8 times. And like it or not, it was the fastest single-GPU NVIDIA card.

I mean, if we lived in a fair, wonderful utopia, then the Titan would "only" be $500. But we live in reality, and in reality there is a thing called business. And NVIDIA's business dictates that the card's price will be in the neighborhood of $800+.
#47
symmetrical
Prima.Vera said: Which titles? At 1080p or 1440p? And please don't tell me about the latest mods for Skyrim. I can run it just fine with only 1 GB of VRAM. :toast:
The only game that ever allocated the full 2 GB on my GTX 680 was the Crysis 3 beta at 1080p, which is a rarity in itself.
#48
Prima.Vera
symmetrical said: The only game that ever allocated the full 2 GB on my GTX 680 was the Crysis 3 beta at 1080p, which is a rarity in itself.
Interesting. Haven't played that yet. What AA were you using?
#50
symmetrical
Prima.Vera said: Interesting. Haven't played that yet. What AA were you using?
FXAA and TXAA 4X

Although there was a bug in the beta in which TXAA made foliage look like crap.

But yeah, most other games barely hit 1 GB of VRAM, so I don't know what the other guy is talking about either.