Friday, March 18th 2016

NVIDIA's Next Flagship Graphics Cards will be the GeForce X80 Series

With the GeForce GTX 900 series, NVIDIA has exhausted its GeForce GTX nomenclature, according to a sensational scoop from the rumor mill. Instead of going with a GTX 1000 series, which would have one digit too many, the company is turning the page on the GeForce GTX brand altogether. The company's next-generation high-end graphics card series will be the GeForce X80 series. Based on the performance-segment "GP104" and high-end "GP100" chips, the GeForce X80 series will consist of the performance-segment GeForce X80, the high-end GeForce X80 Ti, and the enthusiast-segment GeForce X80 TITAN.

Based on the "Pascal" architecture, the GP104 silicon is expected to feature as many as 4,096 CUDA cores. It will also feature 256 TMUs, 128 ROPs, and a GDDR5X memory interface, with 384 GB/s memory bandwidth. 6 GB could be the standard memory amount. Its texture- and pixel-fillrates are rated to be 33% higher than those of the GM200-based GeForce GTX TITAN X. The GP104 chip will be built on the 16 nm FinFET process. The TDP of this chip is rated at 175W.
Moving on, the GP100 is a whole different beast. It's built on the same 16 nm FinFET process as the GP104, and its TDP is rated at 225 W. A unique feature of this silicon is its memory controllers, which are rumored to support both GDDR5X and HBM2 memory interfaces. There could be two packages for the GP100 silicon, depending on the memory type: the GDDR5X package will look simpler, with a large pin count to wire out to the external memory chips, while the HBM2 package will be larger, to house the HBM2 stacks on the package, much like AMD's "Fiji." The GeForce X80 Ti and the X80 TITAN will hence be two significantly different products, beyond just their CUDA core counts and memory amounts.

The GP100 silicon physically features 6,144 CUDA cores, 384 TMUs, and 192 ROPs. On the X80 Ti, you'll get 5,120 CUDA cores, 320 TMUs, 160 ROPs, and a 512-bit wide GDDR5X memory interface holding 8 GB of memory, with a bandwidth of 512 GB/s. The X80 TITAN, on the other hand, features all the CUDA cores, TMUs, and ROPs present on the silicon, and adds a 4096-bit wide HBM2 memory interface holding 16 GB of memory, at a scorching 1 TB/s of memory bandwidth. Both the X80 Ti and the X80 TITAN double the pixel and texture fill rates of the GTX 980 Ti and GTX TITAN X, respectively.
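The quoted bandwidth figures line up with the usual formula: bandwidth equals bus width times per-pin data rate divided by eight. The per-pin rates below (8 Gbps for the GDDR5X configuration, 2 Gbps for HBM2) are implied by the rumored numbers rather than stated in the leak, so treat this as a consistency check only.

```python
def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s = bus width (bits) x per-pin data rate (Gbps) / 8."""
    return bus_width_bits * data_rate_gbps / 8

# X80 Ti: 512-bit GDDR5X; 512 GB/s implies roughly 8 Gbps per pin
print(bandwidth_gbs(512, 8))      # 512.0 GB/s

# X80 TITAN: 4096-bit HBM2 at 2 Gbps per pin (256 GB/s per stack x 4 stacks)
print(bandwidth_gbs(4096, 2))     # 1024.0 GB/s, i.e. ~1 TB/s

# GP104's quoted 384 GB/s would fit either a 256-bit bus at 12 Gbps
# or a 384-bit bus at 8 Gbps; the rumor doesn't say which.
print(bandwidth_gbs(256, 12), bandwidth_gbs(384, 8))   # 384.0 384.0
```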
Source: VideoCardz

180 Comments on NVIDIA's Next Flagship Graphics Cards will be the GeForce X80 Series

#51
GhostRyder
trog100: can we assume there will be an X60 and an X70 series.. :)

the naming makes sense the rest of it sounds a tad on the optimistic side.. but who knows.. he he..

6 gigs of memory is a realistic "enough" the fact that some amd cards have 8 is pure marketing hype.. no current game needs more than 4 gigs.. people who think otherwise are wrong..

i will be bit pissed off if my currently top end cards become mere mid range all in one go but such is life.. :)

trog
I don't get that; it's better that the next cards put the old cards to shame, because that means they're worth upgrading to. This round, with Maxwell and the R9 3XX, the upgrades were very minimal compared to the previous generation, which to me was poor because it made me wonder why I would want to upgrade at all. We need reasons to upgrade, and a card that puts the old generation's flagship to shame is the perfect thing in my book, as long as it's done on the pure performance of the cards (and not by gimping performance on older-generation cards).

As for the 8 GB part, it is overkill in most cases, but going beyond 4 GB itself is not. It just means that if there is ever a need for more than 4 GB, we will still be okay. I do agree it's overkill, but I'd say better safe than sorry.
Posted on Reply
#52
PP Mguire
trog100: can we assume there will be an X60 and an X70 series.. :)

the naming makes sense the rest of it sounds a tad on the optimistic side.. but who knows.. he he..

6 gigs of memory is a realistic "enough" the fact that some amd cards have 8 is pure marketing hype.. no current game needs more than 4 gigs.. people who think otherwise are wrong..

i will be bit pissed off if my currently top end cards become mere mid range all in one go but such is life.. :)

trog
My cards have 12GB, it's definitely not marketing hype.

People who think future titles and 4K gaming are realistic with 4 GB are wrong. Actually, if people thought 4 GB was enough for 1080p gaming, there wouldn't still be artards complaining about the 970 like it's a way to troll.
Posted on Reply
#53
AuDioFreaK39
nickbaldwin86: You should apply for their "Lead of naming things in marketing Director of name stuff" position. I hear they have a new opening at the launch of every new card series. Bit of a long title, but with a small font you can get it on a business card ;)

Sorry, just love how people get hung up on the name!

You must remember that marketing names these, and it is based on feel-good rather than technical reasons. However they can sell more! An X in front of anything sells!!!!!
How about this instead:

GeForce PX 4000
GeForce PX 5500 Ti
GeForce PX 6500
GeForce PX 7500
GeForce PX 8500 Ti
GeForce PX2 9500 (dual-GPU)
GeForce PX Titan
GeForce PX Titan M
Posted on Reply
#54
overclocking101
This just doesn't seem legit to me. One would think that, with the progression of cards now, the new NVIDIA cards would come with 8 GB of VRAM. I'll believe it when I see it, though the new naming scheme would make sense, so that part of it may be true. We will see, I guess.
Posted on Reply
#55
rtwjunkie
PC Gaming Enthusiast
AuDioFreaK39: How about this instead:

GeForce PX 4000
GeForce PX 5500 Ti
GeForce PX 6500
GeForce PX 7500
GeForce PX 8500 Ti
GeForce PX2 9500 (dual-GPU)
GeForce PX Titan
GeForce PX Titan M
I feel this gets too close to their prior naming schemes in the 4000, 5000, 6000, 7000, and 8000 series. I'm guessing Nvidia thought so too.

The very first things I thought of when I saw this were the Ti 4200 and Ti 4400.
Posted on Reply
#56
PP Mguire
overclocking101: This just doesn't seem legit to me. One would think that, with the progression of cards now, the new NVIDIA cards would come with 8 GB of VRAM. I'll believe it when I see it, though the new naming scheme would make sense, so that part of it may be true. We will see, I guess.
Nah. The GP104 chip is midrange, à la the 980. In other words, they're upping the high-end midrange to 6 GB, with 8 GB for the enthusiast card and then 16 GB of HBM2 for the Titan-level card. It falls into their typical strategy, really. Don't expect huge amounts of memory until midrange cards start going HBM, thanks to stacking; getting high amounts of GDDR5 on a board means lots of chips.
Posted on Reply
#57
nickbaldwin86
AuDioFreaK39: How about this instead:

GeForce PX 4000
GeForce PX 5500 Ti
GeForce PX 6500
GeForce PX 7500
GeForce PX 8500 Ti
GeForce PX2 9500 (dual-GPU)
GeForce PX Titan
GeForce PX Titan M
I hope you don't get the job
Posted on Reply
#58
trog100
EarthDog: +1... That is just the way it works.

But look at the bright side... you neuter yourself (limit FPS) so when you need the horsepower, you have some headroom built in from your curious methods. :p
not quite the same thing though is it.. my gpu power will be more than adequate for quite some time to come.. but more than adequate aint quite the same as top end though.. he he

but as other have said keeping at the leading edge is f-cking expensive and as soon as you catch up the goal posts get f-cking moved.. he he

as old as i am e-peen still plays a part in my thinking.. maybe it shouldnt but at least i am honest enough to admit that it does. .:)

years ago when Rolls Royce were asked the power output of their cars they had one simple reply.. they used to say "adequate".. i aint so sure that "adequate" is good enough now though.. :)

trog
Posted on Reply
#59
trog100
PP Mguire: My cards have 12GB, it's definitely not marketing hype.

People who think future titles and 4K gaming are realistic with 4 GB are wrong. Actually, if people thought 4 GB was enough for 1080p gaming, there wouldn't still be artards complaining about the 970 like it's a way to troll.
i think it is which is why i bought a 6 gig 980 TI instead of a titan x.. at least for playing games on 12 gig isnt of much use.. but i was really thinking of certain lower power amd cards when i made the market hype comment..

but back on topic i recon 6 gigs is okay for the next generation of graphics cards.. there will always be something higher for those with the desire and the cash for more..

in this day and age market hype aint that uncommon.. he he.. who the f-ck needs 4K anyway.. 4K is market hype.. maybe the single thing that drives it all.. i would guess the next thing will be 8K.. the whole game will start again then.. including phones with 6 inch 8K screens.. he he

i like smallish controlled jumps with this tech malarkey.. that way those who bought last year aint that pissed off this year.. i recon its all planned this way anyway.. logically it has to be..

trog

ps.. if one assumes the hardware market drives the software market which i do 6 gigs for future cards is about right.. the games are made to work on mass market hardware.. not the other way around..
Posted on Reply
#60
64K
6 GB should be plenty even for the high end of the midrange Pascal cards for the next few years. I would like to have an X80 Ti with 8 GB, though. That quote from W1zzard in my post #33, that 4 GB of VRAM is enough for all games to date, was from a year ago, and VRAM demands will continue to go up, especially for people running DSR and ultra settings.
Posted on Reply
#61
xfia
Pretty speculative on VRAM amounts too, seeing the difference that DX12 can make in an actual game.

Lower system latency means faster rendering..
Posted on Reply
#62
PP Mguire
trog100: i think it is which is why i bought a 6 gig 980 TI instead of a titan x.. at least for playing games on 12 gig isnt of much use.. but i was really thinking of certain lower power amd cards when i made the market hype comment..

but back on topic i recon 6 gigs is okay for the next generation of graphics cards.. there will always be something higher for those with the desire and the cash for more..

in this day and age market hype aint that uncommon.. he he.. who the f-ck needs 4K anyway.. 4K is market hype.. maybe the single thing that drives it all.. i would guess the next thing will be 8K.. the whole game will start again then.. including phones with 6 inch 8K screens.. he he

i like smallish controlled jumps with this tech malarkey.. that way those who bought last year aint that pissed off this year.. i recon its all planned this way anyway.. logically it has to be..

trog

ps.. if one assumes the hardware market drives the software market which i do 6 gigs for future cards is about right.. the games are made to work on mass market hardware.. not the other way around..
12GB is great because I have literally never capped my VRAM usage, even at 4K. I have seen 11 GB used at the highest, and that's with the newer engines barely coming out and DX12 barely making its way onto our PCs. It'll be necessary later on, trust me.

4K is not market hype; it literally looks a ton crisper than 1080p and 1440p. You can't sit here and say 4x the average resolution is "market hype"; it's the next hump in the road whether some people want to admit it or not. I got my TV for less than most large-format 4K monitors, so I took the dive knowing Maxwell performance isn't up to par, but at 4K I don't need any real amount of AA either, 2x at most in some areas depending on the game. That being said, I'd rather not have small incremental jumps in performance just because some either can't afford it or can't find a way to afford it. That's called stagnation, and nobody benefits from that. Just look at CPUs for a clean-cut example of why we don't need stagnation.

Games are made to work on consoles, not for PCs, unless otherwise specified. That hardly happens, because the money is in the console market. That doesn't mean that shit ports won't drive up the amount of VRAM needed, regardless of texture streaming. I agree, though, that 6 GB for midrange next gen should be plenty, as I doubt adoption of 4K will be great during the Pascal/Polaris generation. That'll be left to Volta.
Posted on Reply
#63
nickbaldwin86
PP Mguire: 12GB is great because I have literally never capped my VRAM usage, even at 4K. I have seen 11 GB used at the highest, and that's with the newer engines barely coming out and DX12 barely making its way onto our PCs. It'll be necessary later on, trust me.

4K is not market hype; it literally looks a ton crisper than 1080p and 1440p. You can't sit here and say 4x the average resolution is "market hype"; i
It isn't; he has just never played at 4K and is just jelly. The only person who would say 4K sucks is someone who hasn't experienced it.

I don't "like"/"want" 4K right now, only because of 60 Hz, but that is another subject completely!

When 4K hits 144 Hz I will move up, but going from 144 Hz to 60 Hz hurts my eyes in a real way. 4K is amazing to look at, just don't move fast LOL
Posted on Reply
#64
PP Mguire
nickbaldwin86: It isn't; he has just never played at 4K and is just jelly. The only person who would say 4K sucks is someone who hasn't experienced it.

I don't "like"/"want" 4K right now, only because of 60 Hz, but that is another subject completely!

When 4K hits 144 Hz I will move up, but going from 144 Hz to 60 Hz hurts my eyes in a real way. 4K is amazing to look at, just don't move fast LOL
I have no issues with 60 and I came from over a year's use of 100+. Then again, my Battlefield playing days are basically over. It's all boring to me now.
Posted on Reply
#65
xfia
I feel the same.. when I play a lot of games now, zzz... I get the people that like consoles for the two-player and having fun with friends. Some would say that dreamers live another reality when they sleep, because of how long they spend doing so. If I have multiple realities to live, I don't want one of them to be zeros and ones.
Posted on Reply
#66
arbiter
geon2k2: They forgot to mention the prices I think.

Let me correct that:

X80 - 1000 $
X80ti - 1500$
X80 Titan - 3000$
Let me guess, you're an AMD fanboy who conveniently forgot that the last two price drops were the result of NVIDIA setting prices lower than AMD? The 290X was a $500+ card; here comes the GTX 970, just as fast, for $330. Then the 980 Ti for $650, and oh look, Fury had to be priced the same to compete.

Besides the AMD red-fanboy claims about pricing, GP100 having both HBM and GDDR5(X) on that chart is a bit suspect as to how close to true it is. It could well be the X80 Ti using HBM, but not coming out until later this year, like the current Ti did.
Posted on Reply
#68
xorbe
TPU should be ashamed of the headline, if they read the original post.
Posted on Reply
#69
Brusfantomet
AuDioFreaK39: I do not like the name "X80." It seems like a backwards step for such a powerful architecture, especially from "GTX 980," "GTX 780," "GTX 680," etc. The other issue is that many people may be inclined to believe that "GTX 980" is more powerful simply because it sounds as such.

It honestly looks like anyone could have typed this up in Excel and posted to Imgur, as the table is very generic and does not have indicators of Nvidia's more presentation-based "confidential" slides.

History of notable GPU naming schemes:
GeForce 3 Ti500 (October 2001)
GeForce FX 4800 (March 2003)
GeForce 6800 GT (June 2004)
GeForce GTX 680 (March 2012)

GeForce PX sounds more appropriate for Pascal, considering they already branded the automotive solution Drive PX 2. I would have no problem purchasing a GPU called "GeForce PX 4096," "GeForce PX 5120 Ti" or "GeForce PX Titan," where 4096 and 5120 indicate the number of shaders.

They could even do the same with Volta in 2018: GeForce VX, GeForce VX Ti, and GeForce VX Titan, etc.
There was never a GeForce FX 4000 series, only an FX 5000 series.
Posted on Reply
#70
AuDioFreaK39
Brusfantomet: There was never a GeForce FX 4000 series, only an FX 5000 series.
Ah yes, thank you. I had it confused with the Quadro FX 4800 from 2008.
Posted on Reply
#71
HumanSmoke
AuDioFreaK39: It honestly looks like anyone could have typed this up in Excel and posted to Imgur, as the table is very generic and does not have indicators of Nvidia's more presentation-based "confidential" slides.
I actually thought that was exactly what had happened. Slow news week, especially after the Capsaicin non-reveal, so why not plant some guesswork and watch the page hits flow.
AuDioFreaK39: GeForce PX sounds more appropriate for Pascal
Too close to Nvidia's (plus the ex-3dfx GPU design team's) NV30 series, I suspect. When PCI-Express first launched, Nvidia differentiated models from AGP with the PCX name. I suspect PX might not have the marketing cachet.
PP Mguire: Leaks, PR, and Nvidia have all said Pascal is bringing back compute performance for deep learning, which could limit all but the Titan GPU to GDDR5X. The reasoning is that the compute cards were already pretty much confirmed to have HBM2, and a minimum of 16GB at that; if that's the case, it could also mean the Titan will have the double-duty classification brought back with HBM2, leaving the rest with GDDR5X. This really isn't an issue, but I suppose people will argue otherwise.
IMO, I'd think GP100 would be the only real "compute" (FP64/double-precision) chip. I'd really expect GP104 to pull double duty as a high-end gaming GPU and an enthusiast-level mobile option, so adding power-hungry SFUs might not be an option. As far as I'm aware, all Pascal chips will feature mixed compute allowing for half-precision (FP16) ops (as will AMD's upcoming chips), since game engines as well as other applications can utilize it effectively. I really think the days of a full-compute second (and lower) tier gaming GPU are well and truly behind us - both Nvidia and AMD have sacrificed double precision of late in the name of a more balanced approach to power, die size, and application.
Posted on Reply
#72
arterius2
Vayra86: No, some random source made it up and not a very bright one either. They just picked old information and reapplied it to the upcoming releases, with a pinch of salt and a truckload of wishful thinking. The amount of issues with this chart is endless.

- X80 does not really sound well when you think of cut down chips. X75?? For the Pascal 970? Nahhh
- X does not fit the lower segments at all. GTX in the low end becomes a GT. So X becomes a.... Y? U? Is Nvidia going Intel on us? It's weird.
- X(number) makes every card look unique and does not denote an arch or a gen. So the following series is... X1? That would result in an exact repeat of the previous naming scheme with two letters omitted. Makes no sense.
- They are risking exact copies of previous card names, especially those of the competitor.

Then on to the specs.

- Implementing both GDDR5 and HBM controllers on one chip is weird
- How are they differentiating beyond the Titan? Titan was never the fastest gaming chip. It was and is always the biggest waste of money with a lot of VRAM. They also never shoot all out on the first big chip release. X80ti will be 'after Titan' not before.
- HBM2 had to be tossed in here somehow, so there it is. Right? Right.
- How are they limiting bus width on the cut down versions? Why 6 GB when the previous gen AMD mid range already hits 8?
No, the names sound believable, as Nvidia probably wants to simplify its naming schemes. I always thought the whole GTX/GT/GTS shenanigans were stupid from the start; this is a move in the right direction.

X means 10 in Roman numerals; it looks even more stupid when you write shit like "GTX 1080."

Nobody is copying from competitors; AMD does not have any cards named X80.
Posted on Reply
#73
Ithanul
Prima.Vera: Which games, on what settings and resolution, please? Otherwise I'm calling BS on this
Unless the person was talking about a heavily modded Skyrim or other game. Then again, I have not touched any of the past year's new AAA games (none interest me at all).

Though, finally! Compute is back, so we'll be able to have workhorses again for other things. :p

It will be interesting, though, to see whether the X80 and/or X80 Ti get GDDR5 or GDDR5X. I would like to see a comparison of those, overclocked, against an overclocked Titan with HBM2, just to see if there is a difference in performance from the RAM. Though, I couldn't care less about RAM speed most of the time, since I don't even game at high resolutions. *sits in corner with 1080p 60 Hz screen*
Posted on Reply
#74
arterius2
Ithanul: Unless the person was talking about a heavily modded Skyrim or other game. Then again, I have not touched any of the past year's new AAA games (none interest me at all).

Though, finally! Compute is back, so we'll be able to have workhorses again for other things. :p

It will be interesting, though, to see whether the X80 and/or X80 Ti get GDDR5 or GDDR5X. I would like to see a comparison of those, overclocked, against an overclocked Titan with HBM2, just to see if there is a difference in performance from the RAM. Though, I couldn't care less about RAM speed most of the time, since I don't even game at high resolutions. *sits in corner with 1080p 60 Hz screen*
You should seriously upgrade your screen before anything else in your case.
Posted on Reply
#75
rtwjunkie
PC Gaming Enthusiast
There was the ATI/AMD X1000 series in 2004 or 2005, IIRC, so it definitely would have been confusing had NVIDIA gone with the 1080, etc.
Posted on Reply