Friday, March 18th 2016

NVIDIA's Next Flagship Graphics Cards will be the GeForce X80 Series

With the GeForce GTX 900 series, NVIDIA has exhausted its GeForce GTX nomenclature, according to a sensational scoop from the rumor mill. Rather than move to a GTX 1000 series, which would carry one digit too many, the company is reportedly turning the page on the GeForce GTX brand altogether. Its next-generation high-end graphics card series will be the GeForce X80 series. Based on the performance-segment "GP104" and high-end "GP100" chips, the GeForce X80 series will consist of the performance-segment GeForce X80, the high-end GeForce X80 Ti, and the enthusiast-segment GeForce X80 TITAN.

Based on the "Pascal" architecture, the GP104 silicon is expected to feature as many as 4,096 CUDA cores. It will also feature 256 TMUs, 128 ROPs, and a GDDR5X memory interface with 384 GB/s of memory bandwidth; 6 GB could be the standard memory amount. Its texture and pixel fill rates are rated 33% higher than those of the GM200-based GeForce GTX TITAN X. The GP104 chip will be built on the 16 nm FinFET process, and its TDP is rated at 175 W.
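For reference, the quoted bandwidth figure is consistent with simple interface arithmetic: peak bandwidth is the bus width in bytes times the per-pin data rate. A minimal sketch, assuming a 256-bit bus and a 12 Gbps GDDR5X data rate (neither figure is confirmed by the rumor):

```python
# Peak theoretical memory bandwidth from bus width and per-pin data rate.
# bandwidth (GB/s) = bus width (bits) / 8 * data rate (Gbps per pin)
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# A hypothetical 256-bit GDDR5X interface at 12 Gbps matches the quoted 384 GB/s:
print(mem_bandwidth_gbs(256, 12.0))  # → 384.0
```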
Moving on, the GP100 is a whole different beast. It's built on the same 16 nm FinFET process as the GP104, and its TDP is rated at 225 W. A unique feature of this silicon is its memory controllers, which are rumored to support both GDDR5X and HBM2 memory interfaces. There could be two packages for the GP100 silicon, depending on the memory type: the GDDR5X package will look simpler, with a large pin count to wire out to the external memory chips, while the HBM2 package will be larger, to house the HBM stacks on the package, much like AMD "Fiji." The GeForce X80 Ti and the X80 TITAN will hence be two significantly different products, beyond their differing CUDA core counts and memory amounts.

The GP100 silicon physically features 6,144 CUDA cores, 384 TMUs, and 192 ROPs. On the X80 Ti, you'll get 5,120 CUDA cores, 320 TMUs, 160 ROPs, and a 512-bit wide GDDR5X memory interface holding 8 GB of memory, with a bandwidth of 512 GB/s. The X80 TITAN, on the other hand, features all the CUDA cores, TMUs, and ROPs present on the silicon, and adds a 4096-bit wide HBM2 memory interface holding 16 GB of memory, at a scorching 1 TB/s of memory bandwidth. Both the X80 Ti and the X80 TITAN double the pixel and texture fill rates of the GTX 980 Ti and GTX TITAN X, respectively.
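The same arithmetic checks out for both quoted bandwidth figures, and the fill-rate claims follow from unit counts times clock speed. A hedged sketch (the 8 Gbps GDDR5X and 2 Gbps HBM2 per-pin rates are assumptions, not part of the rumor):

```python
def mem_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    # bandwidth (GB/s) = bus width (bits) / 8 * per-pin data rate (Gbps)
    return bus_width_bits / 8 * data_rate_gbps

print(mem_bandwidth_gbs(512, 8))   # X80 Ti, 512-bit GDDR5X → 512.0 GB/s
print(mem_bandwidth_gbs(4096, 2))  # X80 TITAN, 4096-bit HBM2 → 1024.0 GB/s (~1 TB/s)

# Fill rates scale linearly with unit counts at a fixed clock:
# pixel fill (GPixel/s) = ROPs * clock (GHz); texture fill (GTexel/s) = TMUs * clock (GHz)
def fill_rate(units, clock_ghz):
    return units * clock_ghz
```

This is why doubling ROP and TMU counts at comparable clocks doubles the respective fill rates.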
Source: VideoCardz

180 Comments on NVIDIA's Next Flagship Graphics Cards will be the GeForce X80 Series

#126
PP Mguire
Frick: To the 4K users: how do old games look on it? Say Diablo 2 unmodded.
A little later today I'm going to try putting my 98 box on the 4k screen :roll:
#127
FRAGaLOT
BiggieShady: So what's after X80, X80 Ti and X80 Titan ... does that mean that dual gpu card would be x90 and next gen silicon will go with X180, X180 Ti, X180 Titan and X190 for dual volta gpu?
I don't think nvidia has made a dual-GPU single card in ages, reference or 3rd party. That's more something AMD tends to do; nvidia has never had the need to make such a product since the competition is so far behind, plus it would probably be way more expensive than just SLIing two cards.
#128
FRAGaLOT
Frick: To the 4K users: how do old games look on it? Say Diablo 2 unmodded.
Wasn't Diablo 2 just updated recently by Blizzard to run on modern PCs? Doubt it supports 4k tho.
#129
PP Mguire
FRAGaLOT: I don't think nvidia has made a dual-GPU single card in ages, reference or 3rd party. That's more something AMD tends to do; nvidia has never had the need to make such a product since the competition is so far behind, plus it would probably be way more expensive than just SLIing two cards.
Titan Z was last gen, and it was Nvidia. They allowed AIB partners to sell the card, but like all Titan cards, they couldn't modify it. The last dual GPU that was modified to any extent was the 590.
FRAGaLOT: Wasn't Diablo 2 just updated recently by Blizzard to run on modern PCs? Doubt it supports 4k tho.
I believe so, but idk what the update all has. Was never really a Diablo fan.
#130
FRAGaLOT
EarthDog: You do know that AMD processors overclock more % wise on average, right?
That means nothing, since Intel CPUs still outperform AMD CPUs. If overclocking an AMD CPU yielded that much more performance than what you can squeeze out of an Intel CPU, then everyone would be using AMD CPUs like it was the early 2000s again.
#131
FRAGaLOT
PP Mguire: Titan Z was last gen and Nvidia. They allowed AIB partners to sell the card but like all Titan cards they couldn't modify it. The last dual GPU that was modified to any extent was the 590. I believe so, but idk what the update all has. Was never really a Diablo fan.
Yeah, i just couldn't recall the last time nvidia ever did a dual-GPU card, which AMD does all the time at every revision.

As far as i know, the Diablo 2 update allows the game to run on modern PCs and operating systems (64-bit OSes), considering most people were playing it on Windows 9x back in the day.
#132
trog100
EarthDog: That is how things work though, trog. It all depends on the quality of the silicon, binning, and sales, actually. ;)
the chip i mentioned was on pre-order i had one of the first in the country.. it was a brand new line.. intel knew full well what it was capable of.. but knowing this they chose to clock it at 60% of what it could have been clocked at.. fast enough to clearly beat the opposition but not fast enough to make fools of them..

8 years later my current end of the line intel cpu is running at 4.5 gig.. very similar to what my 8 year old intel chip could do.. and very similar to what the latest generation intel chips can do..

it does make one wonder just what would have happened if that chip had not been deliberately down clocked.. where would we be now.. he he

one thing is for sure.. intel have had a pretty easy 8 years.. :)

trog
#133
EarthDog
FRAGaLOT: That means nothing since Intel CPUs still outperform AMD cpus. If overclocking an AMD cpu was that much of a performance improvement more than what you can squeeze out of an Intel CPU, then everyone would be using AMD CPUs like it was the early 2000s again.
while that is true.. that wasn't remotely a part of what we were discussing. Look at the post I quoted. ;)
#134
arbiter
EarthDog: while that is true.. that wasn't remotely a part of what we were discussing. Look at the post I quoted. ;)
Ok, let's get back on the topic of the Nvidia GPU and drop the whole AMD CPU discussion, as that probably belongs in some other thread.
#135
Frick
Fishfaced Nincompoop
FRAGaLOT: As far as i know the Diablo 2 update allows the game to run on modern PCs and operating system (64bit OSes), considering most people were playing it on Windows 9x back in the day.
This is getting way off topic, but it ran just fine on modern OS's before.
#136
Ubersonic
There's a certain comedy to this: after the 9000 series they went to the 200 series instead of an X000 series, and now after the 900 series they're going to the X00 series lol.

No points for consistency Nvidia :P

(Yeah, I know there was a 100 series before the 200, but it was OEM-only rebrands.)
#137
Basard
Prima.Vera: LOL. Saw that actually. 800x600 stretched on a 4K monitor....Blur pr0n fest. Like watching a very low res video on your 1080p monitor.... :)))

However, if you use a hack, it looks like this :)))
That's the most beautiful thing I've ever seen!
#138
RejZoR
Then again they maybe want to differentiate the series more by making drastic name changes instead of logical ones...
#139
laszlo
as nvidia like green they shall name their cards :
-leprechaun
-goblin
-grinch
-ipkiss
-lantern
-t.m.n.t.
-wazowski
-hulk
-kermit
-yoda
-shrek
.....

plenty of green characters to choose from....

i couldn't resist :D:D:D:D:D:D:D:D
#140
64K
FRAGaLOT: I don't know why nvidia won't name the products after the cores they have in them, "GeForce GP104" sounds like a fine product name along with "GeForce GP104 Ti" and "GeForce GP104 Titan" and whatever other gimped versions of the "GP10x" they make of this core for cheap cards.
Cool names sell hardware. Look at all the products that carry the "Gaming" logo. There is a reason manufacturers do that. Calling a card by the engineering term won't help to sell cards.
#141
InVasMani
Prima.Vera: Which games, on what settings and resolution, please? Otherwise I'm calling this BS.
Rainbow Six Siege on Ultra can easily surpass 6 GB of VRAM, same with the latest CoD, and even more so depending on how much system memory you have installed. From what I've seen, higher VRAM can offset the need for more system memory, up to a point, at least for gaming.
#142
EarthDog
That makes little sense considering what is in vRAM is different than what is in the System Ram...
#143
InVasMani
FRAGaLOT: Wasn't Diablo 2 just updated recently by Blizzard to run on modern PCs? Doubt it supports 4k tho.
Just play Grim Dawn at 4K instead; it's a vastly better game, and the price is pretty cheap all things considered. Plus, it's getting modding tools on top of an already brilliant ARPG that doesn't feature Korean spam bots selling gold and items like every Blizzard game; those require bnet, and they even spam you on that outside of the games themselves.
EarthDog: That makes little sense considering what is in vRAM is different than what is in the System Ram...
www.techpowerup.com/217308/black-ops-iii-12-gb-ram-and-gtx-980-ti-not-enough.html

What's in storage goes into system memory and CPU cache, and from there into VRAM; where do you think it fetches and accesses those textures from? If you run out of VRAM, your next fastest high-density tier is system memory, followed by your actual storage, which acts more as a non-volatile cache for memory.

Try playing a game that loads textures on the fly, like World of Warcraft or Diablo 3, with a traditional HDD, and see what happens: stutter, stutter, stutter. When you run out of available faster-access VRAM on the GPU, the same thing happens: stutter, stutter, stutter.

That's also where higher-bandwidth, lower-latency RAM helps, and it explains why it so drastically impacts APUs, as they don't have dedicated VRAM and rely on system memory.
#144
EarthDog
Ahhhhhhhhhhh, I see...after thinking a bit more critically about it.... it makes sense. If you are using more vRAM than what your GPU allows, it 'pages' out to the System RAM....this also depends on the settings and the card so it is not a hard/fast rule.

Please edit posts instead of double posting... they don't like that 'round these parts. :p
#145
PP Mguire
Blops 3 actually scales to your VRAM and system RAM then uses all that it can. It can run smooth on a 4GB card but on my machine will utilize around 11.5GB.
#146
anubis44
oinkypig: double the floating point so i wouldnt expect AMDs to close the performance gap for another year or so after release of their polaris "". should 16nm be quite as impressive as it is on paper. we'll be in for a real treat.
So you think AMD's Polaris will be like chocolate soft-serve ice cream with eyeballs? Very interesting.
#147
Frick
Fishfaced Nincompoop
InVasMani: Just play Grim Dawn at 4K instead it's a vastly better game and the price is pretty cheap all things considered plus it's getting modding tools on top of a all ready brilliant ARPG that doesn't feature Korean spam bots selling gold and items like every Blizzard game because they require bnet and they even spam you on that outside of the games themselves.
I was about to write something but realized you're probably talking about Diablo 3. Crisis averted.
#148
InVasMani
Frick: I was about to write something but realized you're probably talking about Diablo 3. Crisis averted.
It's better than any of the Diablo games, really; it doesn't matter which one, seeing as it's got better game mechanics, graphics, story, humor, and overall fun factor. The game has literally improved every time I've played it over the last year and a half or so. Diablo 2 wasn't a bad game by any means, but even compared to that, Grim Dawn is still head and shoulders the superior ARPG. It has so many great things going for it that I wish your typical EQ/WoW class-based MMOs were more like it, in fact.
#149
medi01
FRAGaLOT: ...since the competition is so far behind...
Seriously?

#150
Frick
Fishfaced Nincompoop
InVasMani: It's better than any of the Diablo games really it doesn't matter which one seeing as it's got better game mechanics, graphics, story, humor and overall fun factor. The game literally has improved every time I've played it for the last year and half or so. Diablo 2 wasn't a bad game by any means, but even in comparison to that Grim Dawn is still head and shoulders a superior ARPG it has so many great things going for it I wish your typical EQ/WoW class based MMO's were more like it in fact.
I won't take your word for it and I make no apologies for it.