Sunday, January 18th 2015
First PCB Shots of GeForce GTX TITAN-X Surface
Here are the first PCB shots of NVIDIA's next-gen flagship graphics card, the GeForce GTX TITAN-X. At the heart of this beast is the swanky new 28 nm GM200 silicon, which is rumored to feature 3072 CUDA cores based on the "Maxwell" architecture, a 384-bit wide GDDR5 memory interface, with NVIDIA's latest memory bandwidth management mojo, a staggering 96 ROPs, and 12 GB of memory. The design goal is probably 4K to 5K gaming with a single card, at reasonably high settings. The GM200 silicon appears slightly bigger than the GK110, NVIDIA's previous big chip.
The display I/O of this card looks identical to that of the GTX 980. We're not sure if the DVI connector will make it to the final design. There are no shots of the VRM, although given this architecture's track record, we don't expect the TITAN-X to have any heavier power requirements than the GTX 780 Ti (6-pin + 8-pin power inputs). NVIDIA is expected to launch the GTX TITAN-X within this quarter. Don't hold off on your GTX 980 purchases just yet, because NVIDIA tends to overprice its "TITAN" branded graphics cards.
Source:
Baidu Tieba Communities
37 Comments on First PCB Shots of GeForce GTX TITAN-X Surface
Far Cry 4 with everything maxed out uses about 3.3 GB - 3.6 GB of VRAM @ 1080p... that failure of a game, CoD Ghosts, uses around 3.5 GB when all in-game eye candy options are enabled (and it doesn't crash!)... this is without any driver enhancements added.
Those two examples are just the tip of the ever-growing iceberg of games that can and WILL use more than 4GB frame buffers, given the chance.....
So, will 4 GB be enough** for 1080p in a year's time? I doubt it, if current trends continue.
However, this recent Titan naming scheme that NVIDIA insists on using for all its overly expensive cards (poor price/perf ratio) is getting a bit old... where are the ultra-low-power cards that the 750 Ti showed were possible on the new fab process?
The sooner they roll out, the better.
** "enough" is subjective; enough to me being 'can it max this game out without running outta VRAM and/or horsepower!?'
On top of that, a name can sell a product better than people imagine, and "Titan" has a good ring to it when you say your machine has a GTX Titan inside.
They could announce a consumer variant at GTC two weeks after GDC. Titan Z was announced at GTC if anyone still remembers that.
I wish people on a tech forum that centers around products like this would understand that the Titan is NOT meant to be a gaming card. It's marketed as one, which causes confusion, but plenty of people on this forum have pointed out that it's a developer's card, thanks to its CUDA capabilities.
People will never learn.