Friday, March 18th 2016

NVIDIA's Next Flagship Graphics Cards will be the GeForce X80 Series

With the GeForce GTX 900 series, NVIDIA has exhausted its GeForce GTX nomenclature, according to a sensational scoop from the rumor mill. Instead of going with a GTX 1000 series, which has one digit too many, the company is turning the page on the GeForce GTX brand altogether. Its next-generation high-end graphics card series will be the GeForce X80 series. Based on the performance-segment "GP104" and high-end "GP100" chips, the GeForce X80 series will consist of the performance-segment GeForce X80, the high-end GeForce X80 Ti, and the enthusiast-segment GeForce X80 TITAN.

Based on the "Pascal" architecture, the GP104 silicon is expected to feature as many as 4,096 CUDA cores. It will also feature 256 TMUs, 128 ROPs, and a GDDR5X memory interface, with 384 GB/s memory bandwidth. 6 GB could be the standard memory amount. Its texture- and pixel-fillrates are rated to be 33% higher than those of the GM200-based GeForce GTX TITAN X. The GP104 chip will be built on the 16 nm FinFET process. The TDP of this chip is rated at 175W.
Moving on, the GP100 is a whole different beast. It's built on the same 16 nm FinFET process as the GP104, and its TDP is rated at 225 W. A unique feature of this silicon is its memory controllers, which are rumored to support both GDDR5X and HBM2 memory interfaces. There could be two packages for the GP100 silicon, depending on the memory type: the GDDR5X package will look simpler, with a large pin count to wire out to the external memory chips, while the HBM2 package will be larger, housing the HBM stacks on the package itself, much like AMD's "Fiji." The GeForce X80 Ti and the X80 TITAN will hence be two significantly different products, beyond their differing CUDA core counts and memory amounts.

The GP100 silicon physically features 6,144 CUDA cores, 384 TMUs, and 192 ROPs. On the X80 Ti, you'll get 5,120 CUDA cores, 320 TMUs, 160 ROPs, and a 512-bit-wide GDDR5X memory interface holding 8 GB of memory, with a bandwidth of 512 GB/s. The X80 TITAN, on the other hand, features all the CUDA cores, TMUs, and ROPs present on the silicon, plus a 4096-bit-wide HBM2 memory interface holding 16 GB of memory, at a scorching 1 TB/s of memory bandwidth. Both the X80 Ti and the X80 TITAN double the pixel and texture fill rates of the GTX 980 Ti and GTX TITAN X, respectively.
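The same arithmetic accounts for both quoted GP100 bandwidth figures; the per-pin rates below are inferred from the rumor's numbers, not stated in it:

```python
def memory_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: (bus width / 8 bits per byte) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# X80 Ti: a 512-bit GDDR5X bus reaching the quoted 512 GB/s implies 8 Gbps per pin.
print(memory_bandwidth_gb_s(512, 8.0))   # 512.0 GB/s

# X80 TITAN: a 4096-bit HBM2 interface at HBM2's 2 Gbps per pin lands
# exactly on the claimed 1 TB/s.
print(memory_bandwidth_gb_s(4096, 2.0))  # 1024.0 GB/s, i.e. ~1 TB/s
```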
Source: VideoCardz

180 Comments on NVIDIA's Next Flagship Graphics Cards will be the GeForce X80 Series

#1
Slizzo
Well shit, to get HBM2 you gotta shell out for the Titan...

Hmmm... what to do...
#2
jchambers2586
Time to look at AMD for a new GPU if wattage goes down.
#3
zedn
It finally looks like a Titan.
#4
the54thvoid
Super Intoxicated Moderator
Guru3D almost didn't want to post this 'rumour', it's that unsubstantiated...
#5
msamelis
16 GB of HBM2? This sounds unreal... Plus, it will probably cost much more than the current Titan. It also seems like quite a jump in performance from the previous generation, which is not something we are accustomed to - not that I will be complaining if it turns out to be true.
#6
RejZoR
Slizzo: "Well shit, to get HBM2 you gotta shell out for the Titan... Hmmm... what to do..."
AMD Fury X 2 :P Yeah, it's stupid to have to buy Titan to get HBM.
#7
purplekaycee
Hopefully when the new line of cards is released, it will drive down the price of the great GTX 980 Ti.
#8
oinkypig
Double the floating point, so I wouldn't expect AMD to close the performance gap for another year or so after the release of their Polaris. Should 16 nm be quite as impressive as it is on paper, we'll be in for a real treat.
#9
matar
So we're gonna miss the old GTX name.
#10
oinkypig
If you think anyone ever missed anti-aliasing... Crytek's CryEngine 1, which supported DX10, wreaked havoc on the first-gen GTXs for their lack of driver support, and one year later DX11 was released. DirectX 12 should be plenty of fun.
#11
Legacy-ZA
There are already games out that use more than 6 GB of VRAM. It will be laughable as well as pathetic if these new-generation graphics cards don't have more than 8 GB of VRAM for things like higher resolutions, anti-aliasing, HD texture mods for games, etc.

Well, it is only rumors, but if it turns out to be true, I will both laugh and cry at the same time. I still don't know which.
#12
BiggieShady
So what's after X80, X80 Ti, and X80 TITAN... does that mean a dual-GPU card would be X90, and next-gen silicon will go with X180, X180 Ti, X180 TITAN, and X190 for a dual Volta GPU?
#13
Gungar
THIS IS FAKE! The "GTX 1080" has 8 GB of VRAM, and it only has 6 here!!!
#14
RejZoR
BiggieShady: "So what's after X80, X80 Ti, and X80 TITAN... does that mean a dual-GPU card would be X90, and next-gen silicon will go with X180, X180 Ti, X180 TITAN, and X190 for a dual Volta GPU?"
Yeah, I was wondering that too. I guess it's the only way to progress the model numbers like this.
#15
P4-630
ATi X80... X80 Ti :nutkick:
#16
medi01
I missed what the source of this information was.

Is it "some TPU guy just made this up"? =/
#17
RejZoR
Well, knowing NVIDIA, this sounds more legit than them going full HBM on everything with the new series, as initially suggested.
#18
Frick
Fishfaced Nincompoop
BiggieShady: "So what's after X80, X80 Ti, and X80 TITAN... does that mean a dual-GPU card would be X90, and next-gen silicon will go with X180, X180 Ti, X180 TITAN, and X190 for a dual Volta GPU?"
Then X800, and then they HAVE to make an X850XT PE for the lulz.
#19
uuuaaaaaa
Frick: "Then X800, and then they HAVE to make an X850XT PE for the lulz."
I have a Radeon X850XT PE AGP (R481 chip :)) and this sounds offensive xD It still stands as the best GPU I have ever owned!
#20
efikkan
GP100 with 512-bit GDDR5 and HBM2? It's not going to happen.

This is basically just some guys "randomly" guessing what a new generation could look like by increasing every feature by 50-100%, purely for the attention. Almost everyone who has been following the news could probably guess the next generation's specs with about 80% accuracy, unless there is a huge change in architecture like Kepler was. These early wild guesses have been the norm for every generation, e.g. Fermi, Kepler, Maxwell... and they've never been right. The actual specs are usually leaked within a month or so before the product release.
#21
Vayra86
medi01: "I missed what the source of this information was. Is it 'some TPU guy just made this up'? =/"
No, some random source made it up, and not a very bright one either. They just picked old information and reapplied it to the upcoming releases, with a pinch of salt and a truckload of wishful thinking. The number of issues with this chart is endless.

- X80 does not really sound right when you think of cut-down chips. X75?? For the Pascal 970? Nahhh
- X does not fit the lower segments at all. GTX in the low end becomes a GT. So X becomes a... Y? U? Is Nvidia going Intel on us? It's weird.
- X(number) makes every card look unique and does not denote an arch or a gen. So the following series is... X1? That would result in an exact repeat of the previous naming scheme with two letters omitted. Makes no sense.
- They are risking exact copies of previous card names, especially those of the competitor.

Then on to the specs.

- Implementing both GDDR5 and HBM controllers on one chip is weird
- How are they differentiating beyond the Titan? Titan was never the fastest gaming chip. It was and is always the biggest waste of money with a lot of VRAM. They also never shoot all out on the first big chip release. The X80 Ti will come 'after Titan', not before.
- HBM2 had to be tossed in here somehow, so there it is. Right? Right.
- How are they limiting bus width on the cut-down versions? Why 6 GB when the previous-gen AMD midrange already hits 8?
#23
Prima.Vera
Legacy-ZA: "There are already games out that use more than 6 GB of VRAM..."
Which games, on what settings and resolution, please? Otherwise I'm calling this BS.
#24
Frick
Fishfaced Nincompoop
Prima.Vera: "Which games, on what settings and resolution, please? Otherwise I'm calling this BS."
Some actually do, but I have no idea if it impacts performance.
#25
Ruru
S.T.A.R.S.
I guess they invented the name in a couple of seconds.