Monday, January 21st 2013

NVIDIA to Name GK110-based Consumer Graphics Card "GeForce Titan"

2013 started off on a rather dull note for the PC graphics industry. NVIDIA launched its game console platform "Project: Shield," while AMD rebranded its eons-old GPUs to the Radeon HD 8000M series. Apparently, that could all change in late February, with the arrival of a new high-end single-GPU graphics card based on NVIDIA's GK110 silicon, the same big chip that powers the company's Tesla K20 compute accelerator.

NVIDIA has drawn some flak for stretching its "GTX" brand too far into the mainstream and entry-level segments, and wants its GK110-based card to stand out. It is reported that NVIDIA will carve out a new brand extension: GeForce Titan. Incidentally, the current fastest supercomputer in the world bears that name (Titan, a Cray XK7 located at Oak Ridge National Laboratory). The GK110 silicon physically packs 15 SMX units, totaling 2,880 CUDA cores, and features a 384-bit wide GDDR5 memory interface.
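For context, the rumored specs pencil out to the following theoretical memory bandwidth. This is a back-of-the-envelope sketch, assuming the K20X-like 5.2 GHz effective GDDR5 data rate from the SweClockers report; final retail clocks are unconfirmed:

```python
# Theoretical memory bandwidth = (bus width in bytes) x (effective data rate).
# Assumes the rumored K20X-like GDDR5 clock; retail clocks may differ.

BUS_WIDTH_BITS = 384
DATA_RATE_GTPS = 5.2  # effective GDDR5 data rate, gigatransfers/s per pin

bandwidth_gbps = BUS_WIDTH_BITS / 8 * DATA_RATE_GTPS  # bytes per transfer x GT/s
print(f"{bandwidth_gbps:.1f} GB/s")  # prints "249.6 GB/s"
```

That would be roughly 30 percent more than the 192.3 GB/s of a single GTX 680's 256-bit bus at 6 GHz effective.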
Source: SweClockers

203 Comments on NVIDIA to Name GK110-based Consumer Graphics Card "GeForce Titan"

#1
Naito
A possible late-February launch? Very interesting indeed.

Geforce Titan doesn't quite roll with me. Nvidia has lost the plot even further with their naming conventions.

Rumour of only 14 out of 15 shader clusters is a little disappointing considering how long they have been producing Kepler chips, but I guess it is a much larger core. Is this going to remain branded as 600 series (not mentioned in article)? If so, it could be possible the core may be hitting a performance range that might not slot in well with the future 700 series performance targets, thus they trimmed an SMX unit off.

Edit: Maybe it will slot in near the GTX 690 or even supersede it? GTX 690 3072 shaders vs GK110 2688 shaders without possible SLI overhead? If not, might just slot in under, as to not annoy the people who bought the GTX 690.

Edit 2: Checked original article. Claims 85% of the performance of the GTX 690. Possibility of limited edition card? (when considering naming): "Partner companies must comply with Nvidia's reference design to the letter and may not even put their stickers on graphics cards."
Posted on Reply
#2
tastegw
Cores: 1536->2688
Interface: 256->384
Ram: 2/4->6 (overkill but nice)
I think I'll take two

So this isn't the 780?
Posted on Reply
#3
DarkOCean
"The same sources claim that the GeForce Titan will be released in late February and has a suggested retail price of 899 USD." holy Jesus :eek:
Posted on Reply
#4
sc
Time to put my 690 up for sale. Damn consumerism...
Posted on Reply
#5
Optimis0r
Not an average consumer gaming graphics card then.
Posted on Reply
#6
kniaugaudiskis
Well, with such a price tag and a non-GTX branding this card may be a limited production enthusiasts' dream.
Posted on Reply
#8
Shihab
Anyone interested in a cheap Kidney?
Posted on Reply
#9
qubit
Overclocked quantum bit
I'll bet it will perform similarly to a GTX 690 and be priced extortionately between £800 to £1000. :rolleyes:
Posted on Reply
#10
1c3d0g
Awesome! :cool: This fucker will be an incredible asset for us BOINC/Folding@Home enthusiasts, enabling us to achieve even better results for battling nasty diseases, researching black holes/entire galaxies, understanding vague physics processes and more! :D
Posted on Reply
#11
Samskip
If it really packs 6GB of RAM it would be an awesome card for driving 4K displays.
Posted on Reply
#12
Filiprino
Whoa, 6 GB of GDDR5 is great. A TDP of 235W isn't.
At least make it SLI-friendly with a single-slot bracket instead of two. That way you can use a waterblock and keep one more PCIe slot on your motherboard available.
Posted on Reply
#13
dj-electric
Waiting for the SuperFunTime I'll have with this card
Posted on Reply
#14
the54thvoid
Super Intoxicated Moderator
sc: Time to put my 690 up for sale. Damn consumerism...
Why?
Naito: ...Checked original article. Claims 85% of the performance of the GTX 690
Lose performance....
Posted on Reply
#15
Fluffmeister
the54thvoid: Why?
Lose performance....
A single GPU is always going to be preferable. Sure, the 690 is a wonderful piece of kit, but it still relies on decent SLI profiles to perform at its best, and even then not every title scales that well.

Throw in some extra functionality while still packing the same number of CUDA cores as TWO 670's, and I'd think you'd be on to a winner.
Posted on Reply
#16
BigMack70
Hmmmmmmmmmm... depending on what actually happens with price/performance I could see myself selling off my 7970 Lightnings and going for one of these bad boys... If I can get anywhere near 7970 CF performance from a single GPU, I'm in.

Multi-GPU is a hassle comparatively.
Posted on Reply
#17
_JP_
I've got a biiig pair of Titans!! Wanna see them?

But can it run Battlefield 4 @ 2560x1600 & 16xAF & 4xAA & Ultra settings?

:p
Posted on Reply
#18
blibba
Given the increased memory bandwidth, this might only manage 85% of the throughput of the 690, but I bet it'll have lower 99th percentile frame times.
Posted on Reply
#19
sc
the54thvoid: Why?
Lose performance....
Because... there will be a dual GK110 board.
Posted on Reply
#20
Kaynar
What is the broad estimation of release date for these cards? Easter? Summer? Autumn?
Posted on Reply
#21
the54thvoid
Super Intoxicated Moderator
Google translated from Sweclockers:
When NVIDIA released the GeForce GTX 680 in March 2012, it was clear that the new flagship was not based on the full version of the Kepler GPU (GK110). Instead, it used a cut-down version (GK104) to hold down manufacturing costs. Now, almost a year later, full-fledged Kepler is heading to a consumer-class GeForce.
Multiple independent sources tell SweClockers that GK110 will appear in the GeForce Titan, an upcoming high-end graphics card. The name alludes to the world's fastest supercomputer, Titan at Oak Ridge National Laboratory in the USA, which is built from 18,688 Nvidia Tesla K20X accelerators based on this very GK110.

The Tesla K20X compute card is based on GK110 with 2,688 CUDA cores, a 384-bit memory bus, and 6 GB of GDDR5 memory. The chip actually contains 2,880 cores, but NVIDIA has disabled one cluster (SMX), presumably for yield reasons. Clock frequencies stop at a relatively low 732 MHz for the GPU and 5.2 GHz effective for the GDDR5.

According to SweClockers' sources, the launch of the GeForce Titan will resemble that of the GeForce GTX 690: partner manufacturers must follow Nvidia's reference design to the letter and cannot even put their own stickers on the graphics cards. Performance is estimated at about 85 percent of a GeForce GTX 690.

The same sources claim that the GeForce Titan will be released in late February at a suggested retail price of 899 USD.
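For what it's worth, those quoted numbers pencil out like this. A rough sketch assuming 2 FLOPs per CUDA core per clock (one fused multiply-add) and the rumored 732 MHz core clock; actual retail clocks are unconfirmed:

```python
# Rough single-precision throughput from the rumored GK110 specs.
# Assumes 2 FLOPs/core/clock (FMA); real clock behavior is unconfirmed.

CUDA_CORES = 2688
CORE_CLOCK_GHZ = 0.732
FLOPS_PER_CORE_PER_CLOCK = 2  # one fused multiply-add per clock

gflops = CUDA_CORES * FLOPS_PER_CORE_PER_CLOCK * CORE_CLOCK_GHZ
print(f"{gflops:.0f} GFLOPS")  # prints "3935 GFLOPS"
```

Nearly 4 TFLOPS single precision from one GPU, if the figures hold.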
Posted on Reply
#22
Fluffmeister
Sounds like it's gonna be another sexy reference design card using magnesium alloys and the like.

Bring it on. :toast:
Posted on Reply
#23
ThE_MaD_ShOt
900 bones is a little steep for most to justify. Wow
Posted on Reply
#24
Shihab
Filiprino: Whoa, 6 GB of GDDR5 is great. A TDP of 235W isn't.
At least make it SLI friendly with only 1 slot of connectors instead of two. In that way you can use a waterblock and have 1 more PCIe connector of your motherboard available.
Don't get greedy now, the ol' GF110 was rated at 244w when fitted to a GTX 580, the current 680's rated at 190-ish, and the current GTX 690's given a 300w TDP.

235W isn't much, IMO, if it packs the processing power it promises.
Posted on Reply
#25
tastegw
ThE_MaD_ShOt: 900 bones is a little steep for most to justify. Wow
Cheaper than two 680's, but many still went that route.
Posted on Reply