
NVIDIA GeForce GTX TITAN 6 GB Review


Introduction



Had someone told us a month ago that NVIDIA was going to launch the GeForce GTX Titan around this time, we'd have politely asked them to go jump off a skyscraper. Why? Because NVIDIA simply doesn't need it in its product stack right now. The performance lead AMD's HD 7970 GHz Edition has over the GeForce GTX 680 is disputed at best, and the latter ships with better energy efficiency and acoustics. The GeForce GTX 670 continues to beat the HD 7950 Boost Edition, and the GTX 660 Ti trades blows with the HD 7950 while posting much better power and noise numbers. It's only with the "Pitcairn" based HD 7800 series that AMD appears to have a solid footing. Even in the ultra-high-end segment, the dual-GPU GeForce GTX 690 scales near-perfectly over the GTX 680. So what prompted NVIDIA to rush out the GTX Titan? Is it even being rushed out to begin with? We'll have to look back at 2012 for some answers.

When AMD launched its Radeon HD 7970 in December 2011, it appeared for a brief moment as though AMD was set for 2012. Brief, because there was more than just arrogance in NVIDIA's dismissal of AMD's new flagship GPU and the architecture that drives it. NVIDIA's "Kepler" GPU architecture was designed under the assumption that the HD 7970 would be much faster than it ended up being, so when AMD's flagship fell short of those expectations, the company realized its second-best chip, the GK104, had a fair shot against the HD 7900 series.



The GK104 really was just a successor of the GF114 that drives the performance-segment GeForce GTX 560 Ti. What followed was a frantic attempt by NVIDIA to re-package the GK104 into a high-end product, the GeForce GTX 680, while shelving its best but expensive chip, the GK110 (which drives the GTX Titan we're reviewing today). The gambit paid off when the GTX 680 snatched the performance crown from the HD 7970 in March. AMD may have responded with the faster HD 7970 GHz Edition in June, but it flunked energy-efficiency and fan-noise tests big time. The GK110 wore a business suit before a t-shirt, since NVIDIA built the Tesla K20 GPU compute accelerator that powers the Titan supercomputing array out of it. Normally, you'd want your ASIC to pass consumer applications before enterprise ones, and Intel usually sells Core processors on new silicon before Xeon. The Titan supercomputer, by the way, is where the GTX Titan got its name from.

2013 is more than just another year for PC GPU makers. It's the year console giants Microsoft, Sony, and Nintendo each plan to ship their next-generation game consoles. The Wii U is already out, the PlayStation 4 was unveiled yesterday, and the Xbox "Durango" could follow closely. A rare commonality between the three is that they're all rumored to be driven by AMD graphics. Although we don't expect any of these consoles to match the graphics processing prowess of even 2010-era PCs, their mere introduction could draw an entire generation of gamers to adopt them, which could be bad for the PC platform. Now more than ever, the PC hardware scene needs some action, even if it means launching hardware such as the GTX Titan, which the product stack doesn't really need.

There's yet another factor at play, and this one is more local to the PC platform. AMD recently announced its Never Settle Reloaded offer, in which most of its performance-through-extreme segment graphics cards across AIB partners ship with games that are extremely relevant to the season. These include Crysis 3, Tomb Raider (the new one), Bioshock Infinite, and DmC: Devil May Cry; CrossFire HD 7900 series buyers are rewarded with up to six games (add Far Cry 3, Hitman Absolution, and Sleeping Dogs to that mix). In comparison, NVIDIA's "Free to Play" bundle, which gives you about $25 to $50 worth of in-game currency (per game) for free-to-play titles such as Hawken, World of Tanks, and PlanetSide 2, only makes NVIDIA look bad. AMD, in a recent teleconference, called the GeForce GTX Titan a reaction by NVIDIA to "Never Settle Reloaded" and to AMD getting cozy with leading game studios in general.

The introduction of the GeForce GTX Titan at this time could, hence, be a product of unorthodox but effective market foresight on NVIDIA's part.

In this review, we will take a single GeForce GTX Titan for a spin by taking it apart and exploring some of its new features. We have also posted a GeForce GTX Titan SLI and Tri-SLI review.

GeForce GTX Titan Market Segment Analysis
                    GeForce     Radeon      HD 7970     GeForce     GeForce     GeForce     GeForce
                    GTX 580     HD 7970     GHz Ed.     GTX 680     GTX 590     GTX Titan   GTX 690
Shader Units        512         2048        2048        1536        2x 512      2688        2x 1536
ROPs                48          32          32          32          2x 48       48          2x 32
Graphics Processor  GF110       Tahiti      Tahiti      GK104       2x GF110    GK110       2x GK104
Transistors         3000M       4310M       4310M       3500M       2x 3000M    7100M       2x 3500M
Memory Size         1536 MB     3072 MB     3072 MB     2048 MB     2x 1536 MB  6144 MB     2x 2048 MB
Memory Bus Width    384 bit     384 bit     384 bit     256 bit     2x 384 bit  384 bit     2x 256 bit
Core Clock          772 MHz     925 MHz     1050 MHz    1006 MHz+   607 MHz     837 MHz+    915 MHz+
Memory Clock        1002 MHz    1375 MHz    1500 MHz    1502 MHz    855 MHz     1502 MHz    1502 MHz
Price               $430        $390        $400        $480        $750        $1000       $1000

Architecture

The GeForce GTX Titan is based on NVIDIA's biggest chip for the "Kepler" micro-architecture, codenamed "GK110." First introduced as part of the Tesla K20 GPU compute accelerator, the chip is built on the 28 nanometer silicon fabrication process and packs a staggering 7.1 billion transistors. That's over three times the transistor count of Intel's 8-core Xeon "Sandy Bridge-EP" processor. Its component hierarchy is identical to that of other GPUs based on the architecture.

While the GK104 features four graphics processing clusters (GPCs) with two streaming multiprocessors (SMXs) each, the GK110 features five GPCs with three SMXs each. The SMX design hasn't changed: each SMX still houses 192 CUDA cores, so the physical CUDA core count of the chip works out to 2,880. The GeForce GTX Titan exposes only 2,688 of them, since one of the 15 SMXs is disabled. This probably helps NVIDIA harvest GK110 wafers better by giving TSMC room to get one SMX "wrong."
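For reference, the core counts fall out of that configuration with some quick arithmetic; the Python below is just an illustrative back-of-the-envelope check using the per-SMX figure quoted above:

```python
# Back-of-the-envelope CUDA core math for GK110 and the GTX Titan configuration.
GPCS = 5                  # graphics processing clusters on GK110
SMX_PER_GPC = 3           # streaming multiprocessors per GPC
CORES_PER_SMX = 192       # CUDA cores per SMX (unchanged from GK104)

physical_smx = GPCS * SMX_PER_GPC                  # 15 SMXs on the full chip
physical_cores = physical_smx * CORES_PER_SMX      # 2,880 CUDA cores physically present
titan_cores = (physical_smx - 1) * CORES_PER_SMX   # 2,688 with one SMX disabled

print(physical_cores, titan_cores)                 # 2880 2688
```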



The GK110 features a total of 240 texture-mapping units (TMUs), but since TMUs are contained within SMXs, the GTX Titan ends up with 224. Since raster operations processors (ROPs) are tied to the memory bus width on the Kepler family of GPUs in general, the GK110 features 48 of them compared to 32 on the GK104. Speaking of memory, the GK110 features a 384-bit wide GDDR5 memory interface. To our surprise, NVIDIA made 6 GB the standard memory amount. While no game needs that much memory, even at 2560 x 1600, NVIDIA is probably looking to stabilize 3D Vision Surround performance. The GTX Titan lets you run a 3D Vision Surround setup across three 2560 x 1600-pixel displays.
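Those counts follow the same per-unit logic. The figures of 16 TMUs per SMX and eight ROPs per 64-bit memory controller below are our assumption of how Kepler partitions these units, shown purely to illustrate the arithmetic:

```python
# Illustrative arithmetic for the TMU and ROP counts quoted above (per-unit figures assumed).
TMUS_PER_SMX = 16               # texture units sit inside each SMX
ROPS_PER_64BIT_CONTROLLER = 8   # Kepler ties ROP partitions to 64-bit memory controllers

print(15 * TMUS_PER_SMX, 14 * TMUS_PER_SMX)       # 240 on the full chip, 224 on the Titan
print((384 // 64) * ROPS_PER_64BIT_CONTROLLER,    # 48 ROPs with GK110's 384-bit bus
      (256 // 64) * ROPS_PER_64BIT_CONTROLLER)    # 32 ROPs with GK104's 256-bit bus
```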

In terms of clock speeds, the GeForce GTX Titan core doesn't get anywhere close to the 1 GHz mark, which is characteristic of every big chip from NVIDIA. The core is clocked at 837 MHz with a nominal GPU Boost frequency of 876 MHz, while the memory runs at 6008 MHz (GDDR5 effective), which gives the card a memory bandwidth of 288 GB/s.
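The bandwidth figure follows directly from the effective data rate and the bus width; a quick check:

```python
# Memory bandwidth: effective GDDR5 data rate times bus width (in bytes).
effective_clock_mhz = 6008     # GDDR5 effective data rate
bus_width_bits = 384

bandwidth_gb_s = effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9
print(round(bandwidth_gb_s, 1))   # 288.4 GB/s
```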



With the GeForce GTX Titan, NVIDIA also introduced its second-generation GPU Boost technology. The technology allows the GPU to increase its core clock speed and voltage beyond nominal values while respecting a set power target. The new GPU Boost 2.0 takes GPU temperature, in addition to power draw, into account when adjusting core clock speeds and supporting voltages. This ensures that even stressful applications get the benefit of higher clock speeds so long as the GPU isn't overheating, which should particularly please enthusiasts who overclock their cards using such sub-zero cooling methods as liquid nitrogen or dry-ice evaporators.
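To give a feel for what such a control loop does, here is a deliberately simplified sketch. The loop structure, limits, and step size are our own illustrative assumptions (the 250 W and 80 °C figures mirror the Titan's advertised power and temperature targets), not NVIDIA's actual algorithm:

```python
# Simplified, illustrative Boost-style control loop: step the clock up while both the
# power target and the temperature target have headroom, step back toward base otherwise.
# Values and step size are assumptions for illustration, not NVIDIA's implementation.
BASE_CLOCK_MHZ = 837
POWER_TARGET_W = 250     # board power target
TEMP_TARGET_C = 80       # the thermal target GPU Boost 2.0 adds to the equation

def next_clock(current_mhz, power_w, temp_c, step_mhz=13):
    if power_w < POWER_TARGET_W and temp_c < TEMP_TARGET_C:
        return current_mhz + step_mhz                      # headroom left: keep boosting
    return max(BASE_CLOCK_MHZ, current_mhz - step_mhz)     # limit hit: back off toward base

print(next_clock(876, 230, 71))   # 889 -> both targets have headroom
print(next_clock(993, 255, 83))   # 980 -> power/temperature limited
```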

The Card

Graphics Card Front
Graphics Card Back

Visually, the GeForce GTX Titan resembles NVIDIA's GTX 690. It uses the same sexy unibody design with a magnesium alloy shell and Plexiglas window. Unlike on the GTX 690, there is only a single window, and the fan has been moved further toward the back, which makes sense as there is only one GPU to cool now. The card is 27 cm long, which is a bit shorter than the GTX 690.

Graphics Card Height

Installing the card requires two slots in your system.

Monitor Outputs, Display Connectors

Display connectivity options include two DVI ports, one full-size DisplayPort, and one full-size HDMI port. You may use all outputs at the same time, so triple-monitor surround gaming is possible with one card.

The GPU also includes an HDMI sound device. It is HDMI 1.4a compatible, which includes support for HD audio and Blu-ray 3D movies.


Two SLI connectors are available, which means you can pair the GTX Titan with up to three other GTX Titans for a Quad-SLI rig - if you can afford it.

Graphics Card Teardown PCB Front
Graphics Card Teardown PCB Back

Pictured above are the front and back, showing the disassembled board. High-res versions are also available (front, back). If you choose to use these images for voltmods, etc., please include a link back to this site, or let us post your article.

A Closer Look

Graphics Card Cooler Front
Graphics Card Cooler Back

NVIDIA's cooler uses a complex heatsink base with vapor-chamber technology to cool the GPU, memory chips, and secondary components. A backplate is not included.


In order to measure real-time power consumption of the card, NVIDIA has placed a single Texas Instruments INA3221 power sensor on the card. This chip replaces the three INA219 sensors used on earlier cards, saving board space and cutting cost.
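For readers who want to watch that power figure themselves, the value the driver exposes (fed by the card's on-board power monitoring) can be polled through NVML. A minimal sketch using the pynvml bindings, assuming an NVIDIA driver and the package are installed:

```python
# Poll the board power reported by the NVIDIA driver (fed by on-board power monitoring).
# This reads the driver's figure through NVML; it does not talk to the INA3221 directly.
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetName, nvmlDeviceGetPowerUsage)

nvmlInit()
try:
    handle = nvmlDeviceGetHandleByIndex(0)                # first GPU in the system
    watts = nvmlDeviceGetPowerUsage(handle) / 1000.0      # NVML reports milliwatts
    print(nvmlDeviceGetName(handle), f"{watts:.1f} W")
finally:
    nvmlShutdown()
```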

Graphics Card Power Plugs

The card requires one 6-pin and one 8-pin PCI-Express power cable for operation. This configuration is good for up to 300 W of power draw: 75 W from the PCI-Express slot, 75 W from the 6-pin connector, and 150 W from the 8-pin connector.


NVIDIA uses an OnSemi NCP4206 voltage controller on the Titan, a controller we have seen on many designs before. It is a cost-effective solution that does not provide an I2C interface, so advanced monitoring is not possible. Please note how it sits on its own small PCB, which means we could see different voltage controllers in the future. The GTX 680 used a similar approach, but the variety of voltage controllers on those cards remained relatively low.

Graphics Card Memory Chips

The GDDR5 memory chips are made by Samsung and carry the model number K4G20325FD-FC03. They are specified to run at 1500 MHz (6000 MHz GDDR5 effective).

Graphics Chip GPU

NVIDIA's GK110 graphics processor was first introduced as a Tesla-only product to power demanding GPU compute applications. NVIDIA has now released it as a GeForce GPU too. It uses 7.1 billion transistors on a die size that we measured to be 561 mm². The GPU is produced on a 28 nanometer process at TSMC, Taiwan.