Sunday, January 18th 2015

First PCB Shots of GeForce GTX TITAN-X Surface

Here are the first PCB shots of NVIDIA's next-gen flagship graphics card, the GeForce GTX TITAN-X. At the heart of this beast is the swanky new 28 nm GM200 silicon, which is rumored to feature 3072 CUDA cores based on the "Maxwell" architecture, a 384-bit wide GDDR5 memory interface, with NVIDIA's latest memory bandwidth management mojo, a staggering 96 ROPs, and 12 GB of memory. The design goal is probably 4K to 5K gaming with a single card, at reasonably high settings. The GM200 silicon appears slightly bigger than the GK110, NVIDIA's previous big chip.
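For context on that 384-bit memory interface, here is a quick back-of-envelope bandwidth calculation. The 7 Gbps effective GDDR5 data rate is an assumption (it matches GTX 780 Ti-class memory), not a confirmed TITAN-X spec:

```python
# Rough peak-bandwidth sketch for a GDDR5 bus: bus width in bytes
# multiplied by the effective per-pin data rate.
# 7 Gbps is an assumed data rate, not a confirmed TITAN-X figure.

def gddr5_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return (bus_width_bits / 8) * data_rate_gbps

print(gddr5_bandwidth_gb_s(384, 7.0))  # 336.0 GB/s
```

At an assumed 7 Gbps that works out to 336 GB/s, which would put the rumored card in the same ballpark as NVIDIA's previous big-chip flagships before any of the "memory bandwidth management mojo" the article mentions.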

The display I/O of this card looks identical to that of the GTX 980. We're not sure if the DVI connector will make it to the final design. There are no shots of the VRM, although given this architecture's track record, we don't expect the TITAN-X to have any heavier power requirements than the GTX 780 Ti (6-pin + 8-pin power inputs). NVIDIA is expected to launch the GTX TITAN-X within this quarter. Don't hold off on your GTX 980 purchases just yet, because NVIDIA tends to overprice its "TITAN" branded graphics cards.
Source: Baidu Tieba Communities

37 Comments on First PCB Shots of GeForce GTX TITAN-X Surface

#26
Trompochi
First Titan Z..... Now Titan X..... Next will be Titan Y, then we can combine them into the Titan XYZ card with 100000 attack power and 75000 defense points!!!11!!1!!!:roll:
#27
XL-R8R
Prima.Vera: Sure. Hopefully with 4GB of VRAM. More is kinda useless...
Do you ever play modern games? :twitch:

Far Cry 4 with everything maxed out uses about 3.3GB - 3.6GB of VRAM @ 1080p.... that failure of a game, CoD Ghosts, uses around 3.5GB when all in-game eye candy options are enabled (and it doesn't crash!)... and this is without any driver enhancements added.

Those two examples are just the tip of the ever-growing iceberg of games that can and WILL use more than 4GB frame buffers, given the chance.....

So, will 4GB be enough** for 1080p in a year's time? I doubt it, if current trends continue.


However, this Titan naming scheme that nVidia insists on using of late for all its overly expensive cards (price/perf ratio) is getting a bit old..... where are those ultra-low power cards that the 750Ti showed were possible on the new fab?

The sooner they roll out, the better.


** "enough" is subjective; enough to me means 'can it max this game out and not run outta VRAM and/or HP!?'
#28
THU31
Is this not the Quadro M6000 or something that is supposed to launch on Jan 22nd along with the GTX 960?
#29
Sasqui
Why the heck don't they just call this the 990? The Titan name has some bad mojo going on.
#30
GhostRyder
Sasqui: Why the heck don't they just call this the 990? The Titan name has some bad mojo going on.
Because the X90 name is associated with dual-GPU cards, I would suppose, is the reason for that (though I guess the Titan-Z was the last dual-GPU card they made, so it could change).

On top of that, names can sell a product better than people imagine, and Titan is something that has a good ring to it when you say your machine has a GTX Titan inside.
#31
Dave65
Will stick with my Gigabyte 970; this thing will be so overpriced..
#32
Xzibit
Harry Lloyd: Is this not the Quadro M6000 or something that is supposed to launch on Jan 22nd along with the GTX 960?
I think they will save the announcement for GDC in March.

They could announce a consumer variant at GTC, two weeks after GDC. The Titan Z was announced at GTC, if anyone still remembers that.
#33
RyneSmith
Dave65: Will stick with my Gigabyte 970; this thing will be so overpriced..
Of course it will, as a gaming card. As a development card, this is where it's at.

I wish people on a tech forum that centers around products like this would understand that the Titan is NOT meant to be a gaming card. It's marketed as one, which causes confusion, but plenty of people on this forum have repeated that this is a developer's card due to CUDA.

People will never learn.
#34
HumanSmoke
RyneSmith: Of course it will, as a gaming card. As a development card, this is where it's at.
I wish people on a tech forum that centers around products like this would understand that the Titan is NOT meant to be a gaming card. It's marketed as one, which causes confusion, but plenty of people on this forum have repeated that this is a developer's card due to CUDA.
People will never learn.
To be fair, Nvidia would market the Titan as decorative art to visiting aliens if there was money to be made doing it. If you have a product that can be marketed across numerous users and workloads, why not target each and every one with a specific marketing campaign? This is exactly what Nvidia does, and has done in graphics since it realized SGI raked in bucketloads of cash catering to a nascent workstation/prosumer market in the 1990s... and then promptly acquired the division (for next to nothing).
#35
$ReaPeR$
Trompochi: First Titan Z..... Now Titan X..... Next will be Titan Y, then we can combine them into the Titan XYZ card with 100000 attack power and 75000 defense points!!!11!!1!!!:roll:
LMFAO mate!!!
#37
xorbe
The original Titan was already at the limit at $999, imho. $1350 is going to stop even the most flush ballers. (If true.)