Friday, March 18th 2016

NVIDIA's Next Flagship Graphics Cards will be the GeForce X80 Series

With the GeForce GTX 900 series, NVIDIA has exhausted its GeForce GTX nomenclature, according to a sensational scoop from the rumor mill. Rather than move to a GTX 1000 series, which would carry one digit too many, the company is reportedly turning the page on the GeForce GTX brand altogether. Its next-generation high-end graphics card series will be the GeForce X80 series. Based on the performance-segment "GP104" and high-end "GP100" chips, the lineup will consist of the performance-segment GeForce X80, the high-end GeForce X80 Ti, and the enthusiast-segment GeForce X80 TITAN.

Based on the "Pascal" architecture, the GP104 silicon is expected to feature as many as 4,096 CUDA cores. It will also feature 256 TMUs, 128 ROPs, and a GDDR5X memory interface offering 384 GB/s of memory bandwidth, with 6 GB likely being the standard memory amount. Its texture and pixel fill rates are rated 33% higher than those of the GM200-based GeForce GTX TITAN X. The GP104 chip will be built on the 16 nm FinFET process, with its TDP rated at 175 W.
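For what it's worth, the quoted 33% fill-rate gain lines up with the unit counts alone: 256 vs. 192 TMUs and 128 vs. 96 ROPs is a 4:3 ratio in each case. A minimal Python sketch, assuming equal clock speeds on both chips (the 1,000 MHz figure is a placeholder, not part of the rumor):

```python
# Rough sanity check on the quoted 33% fill-rate uplift: peak fill rate scales
# with unit count times clock, so at equal clocks the gain is just the unit ratio.

def fillrate(units, clock_mhz):
    """Peak fill rate in gigatexels/gigapixels per second."""
    return units * clock_mhz / 1000.0

clock = 1000  # MHz, assumed equal for both chips (placeholder, not a leaked spec)

gm200_tex, gm200_pix = fillrate(192, clock), fillrate(96, clock)   # GTX TITAN X: 192 TMUs, 96 ROPs
gp104_tex, gp104_pix = fillrate(256, clock), fillrate(128, clock)  # rumored GP104: 256 TMUs, 128 ROPs

print(f"texture fill rate uplift: {gp104_tex / gm200_tex - 1:.0%}")  # 33%
print(f"pixel fill rate uplift:   {gp104_pix / gm200_pix - 1:.0%}")  # 33%
```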
Moving on, the GP100 is a whole different beast. It's built on the same 16 nm FinFET process as the GP104, and its TDP is rated at 225 W. A unique feature of this silicon is its memory controllers, which are rumored to support both GDDR5X and HBM2 memory interfaces. There could be two packages for the GP100 silicon, depending on the memory type: the GDDR5X package will look simpler, with a large pin-count to wire out to the external memory chips, while the HBM2 package will be larger, housing the HBM stacks on the package itself, much like AMD "Fiji." The GeForce X80 Ti and the X80 TITAN will hence be two significantly different products, beyond their differing CUDA core counts and memory amounts.

The GP100 silicon physically features 6,144 CUDA cores, 384 TMUs, and 192 ROPs. On the X80 Ti, you'll get 5,120 CUDA cores, 320 TMUs, 160 ROPs, and a 512-bit wide GDDR5X memory interface holding 8 GB of memory, with a bandwidth of 512 GB/s. The X80 TITAN, on the other hand, features all the CUDA cores, TMUs, and ROPs present on the silicon, paired with a 4096-bit wide HBM2 memory interface holding 16 GB of memory, at a scorching 1 TB/s of memory bandwidth. Both the X80 Ti and the X80 TITAN double the pixel and texture fill rates of the GTX 980 Ti and GTX TITAN X, respectively.
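As a sanity check, the quoted bandwidths follow directly from bus width times per-pin data rate; a minimal Python sketch, where the per-pin rates are simply the values implied by the rumored figures rather than confirmed specs:

```python
# Peak memory bandwidth: bus width (bits) x per-pin rate (Gbps) / 8 bits-per-byte.

def bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits * data_rate_gbps / 8

# X80 Ti: 512-bit GDDR5X; 8 Gbps per pin is the rate implied by the quoted 512 GB/s.
print(bandwidth_gbs(512, 8))    # 512.0 GB/s

# X80 TITAN: 4096-bit HBM2; 2 Gbps per pin is the rate implied by the quoted ~1 TB/s.
print(bandwidth_gbs(4096, 2))   # 1024.0 GB/s, i.e. roughly 1 TB/s
```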
Source: VideoCardz

180 Comments on NVIDIA's Next Flagship Graphics Cards will be the GeForce X80 Series

#76
ppn
There are 8 GHz chips only in 8 Gb density, so 384-bit makes 12 GB, and 1.5 V will run them very hot. Better to avoid that; definitely go for HBM2.
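For reference, the 384-bit-makes-12-GB arithmetic follows from each GDDR5/GDDR5X chip having a 32-bit interface, so the bus width fixes the chip count and the per-chip density fixes the capacity. A minimal Python sketch (the 256-bit line is just an extra illustration):

```python
def capacity_gb(bus_width_bits, chip_density_gbit, bits_per_chip=32):
    """Total VRAM in GB given bus width, per-chip density, and per-chip interface width."""
    chips = bus_width_bits // bits_per_chip   # GDDR5/GDDR5X chips use a 32-bit interface
    return chips * chip_density_gbit / 8      # gigabits -> gigabytes

print(capacity_gb(384, 8))   # 12 chips x 8 Gb = 12.0 GB
print(capacity_gb(256, 8))   #  8 chips x 8 Gb =  8.0 GB
```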
#77
PP Mguire
HumanSmoke: I actually thought that was exactly what had happened. Slow news week, especially after the Capsaicin non-reveal, so why not plant some guesswork and watch the page hits flow.

Too close to Nvidia's (+ ex-3Dfx's GPU design team) NV30 series, I suspect. When PCI-Express first launched, Nvidia differentiated models from AGP with the PCX name. I suspect PX might not have the marketing cachet.

IMO, I'd think GP100 would be the only real "compute" (FP64/double-precision) chip. I'd really expect GP104 to pull double duty as a high-end gaming GPU and an enthusiast-level mobile option, so adding power-hungry SFUs might not be an option. As far as I'm aware, all Pascal chips will feature mixed compute allowing for half-precision (FP16) ops (as will AMD's upcoming chips), since game engines as well as other applications can utilize it effectively. I really think the days of a full-compute second (and lower) tier gaming GPU are well and truly behind us - both Nvidia and AMD have of late sacrificed double precision in the name of retaining a more balanced approach to power, die size, and application.
By compute I was really only referencing the Titans because they should be the only ones with HBM2 like the Teslas.
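For readers unfamiliar with the half-precision point quoted above, here is a minimal NumPy sketch of what FP16 buys and costs; it is a generic illustration, not anything specific to Pascal:

```python
# FP16 halves the storage (and hence memory traffic) per value compared to FP32,
# at the cost of precision - which is why it suits some rendering workloads
# but not FP64-class compute.
import numpy as np

values = np.linspace(0.0, 1.0, 1000000)

fp32 = values.astype(np.float32)
fp16 = values.astype(np.float16)

print(fp32.nbytes, fp16.nbytes)   # 4000000 vs 2000000 bytes: half the memory footprint
print(float(np.float16(0.1)))     # 0.0999755859375: the precision you give up
```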
#78
64K
arterius2: Should seriously upgrade your screen before anything in your case.
@Ithanul mostly uses GPUs for Folding.
#80
Frick
Fishfaced Nincompoop
rtwjunkie: There was the ATI/AMD X1000 series in 2004 or 2005 IIRC, so it definitely would have been confusing had NVIDIA gone with the 1080, etc.
Not really confusing, because those GPUs don't exist outside collector basements, and it was X1800/X1900, so some ways off numerically speaking.

Edit: No man, it's coming back to me: they went down to X12xx, but those were integrated graphics; the dedicated GPUs started at X1300.
#81
Ikaruga
Another piece of quality nonsense....
#82
efikkan
To my knowledge neither HBM2 nor GDDR5X will ship in volume products until Q4, so don't expect any high-end models any time soon.

I wouldn't give any credence to these suggested specs.
#83
PP Mguire
efikkan: To my knowledge neither HBM2 nor GDDR5X will ship in volume products until Q4, so don't expect any high-end models any time soon.

I wouldn't give any credence to these suggested specs.
GDDR5X goes into mass production this summer.
#84
Ikaruga
efikkan: To my knowledge neither HBM2 nor GDDR5X will ship in volume products until Q4, so don't expect any high-end models any time soon.

I wouldn't give any credence to these suggested specs.
Who would? That table says the same chip (GP100) will be available in both GDDR5X and HBM2 memory configurations. That alone is a dead giveaway, and let's not mention the rest.
#85
MxPhenom 216
ASIC Engineer
PP Mguire: My cards have 12 GB; it's definitely not marketing hype.

People who think future titles and 4K gaming are realistic with 4 GB are wrong. Actually, if people thought 4 GB was enough for 1080p gaming, there wouldn't be people still complaining about the 970 like it's still a way to troll.
Because a $1k GPU with 12 GB means it's not marketing hype... when it's ENTIRELY marketing surrounding that product.
#86
MxPhenom 216
ASIC Engineer
xorbe: TPU should be ashamed of the headline, if they read the original post.
There is nothing wrong with it.
#87
PP Mguire
MxPhenom 216: Because a $1k GPU with 12 GB means it's not marketing hype... when it's ENTIRELY marketing surrounding that product.
I'm talking about the VRAM amount, not the card.
#88
MxPhenom 216
ASIC Engineer
PP Mguire: I'm talking about the VRAM amount, not the card.
No shit...
#89
Fouquin
AuDioFreaK39: History of notable GPU naming schemes:
GeForce 3 Ti500 (October 2001)
GeForce FX 4800 (March 2003)
GeForce 6800 GT (June 2004)
GeForce GTX 680 (March 2012)
GeForce FX 5800*

Also not sure why you picked up on the GTX 680 as being notable, when that naming system began with GT200 and the GTX 280. (Or the GTS 150, depending on who you ask.)
AuDioFreaK39: GeForce PX sounds more appropriate for Pascal
While that's cool, it seems weird to have put so much time and money into the build-up of the "GeForce GTX" brand only to kill it off just because they need to figure out some different numbers to put in the name. It seems more likely that they'll just decide on some new numbers, keep the decade-old branding that everyone already recognizes, and move on.
#90
Caring1
arterius2: Nobody is copying from competitors; AMD does not have any cards named X80.
But they did have the All-In-Wonder series, the X800.
#91
kiddagoat
I miss the All-In-Wonder series cards... I wish they'd bring those back. AMD/ATi always did a top-notch job on the multimedia side of their cards, and those media center remotes some of those cards came with were amazing for their time.
#92
Octopuss
PP Mguire: 4K is not market hype; it literally looks a ton crisper than 1080p and 1440p. You can't sit here and say 4x the average resolution is "market hype"; it's the next hump in the road whether some people want to admit it or not. I got my TV for less than most large-format 4K monitors, so I took the dive knowing Maxwell performance isn't up to par, but with 4K I don't need any real amounts of AA either. 2x at most in some areas, depending on the game. That being said, I'd rather not have small incremental jumps in performance because some either can't afford it or can't find a way to afford it. That's called stagnation, and nobody benefits from that. Just look at CPUs for a clear-cut example of why we don't need stagnation.
Some people are more than happy with 1080p/1200p and don't intend to buy larger monitors, you know?
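For the record, the "4x the average resolution" figure quoted above is exact in raw pixel terms; a quick Python check:

```python
# Pixel counts for the resolutions discussed in this thread, relative to 1080p.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K UHD": (3840, 2160)}

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / (1920 * 1080):.2f}x 1080p)")
# 1080p: 2,073,600 pixels (1.00x 1080p)
# 1440p: 3,686,400 pixels (1.78x 1080p)
# 4K UHD: 8,294,400 pixels (4.00x 1080p)
```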
#93
Prima.Vera
Personally, I am waiting for the new-gen cards as a reason to buy one of those 3440x1440 21:9 curved monitors...
#94
trog100
i expect a 25% to 30% performance increase for the same money from the next generation of cards..

which in reality means not all that much.. 80 fps instead of 60 fps or 40 fps instead of 30 fps..

my own view is that once you add in G-Sync or FreeSync, anything much over 75 fps doesn't show any gains..

4K gaming for those that must have it will become a little more do-able, but not by much.. life goes on..

affordable VR will also become a bit more do-able..

trog
#95
Keullo-e
S.T.A.R.S.
nickbaldwin86: I see what they did there.

X80 = X = 10 = 1080 ;) so they thought X sounded "cooler" because marketing!

Time for a set of X80 Ti's ... :)
That's what ATI did 12 years ago. After the 9800 came the X800 (= 10800); nothing new in this.


damn, should have read other comments too.. :)
#96
RejZoR
AMD's current naming scheme is also good. The R9, R7, and R5 designators tell you what class a card belongs to, and the actual model digits follow that. Plus it sounds good: R9 390X, or R9 Fury X.
#97
the54thvoid
Intoxicated Moderator
I think we'll all find the next Nvidia flagship card is called:

1) "That's so too super expensive - you must be fanboyz to buy it", or
2) "Nvidia crushed AMD, Team red is going bust lozers", or
3) some other alpha numerical variant with a possible 'Titan' slipped in.

And yes, I'm pushing this pointless 'news' piece to get to the 100-post mark. Come on everyone, chip in to make futility work harder.
#98
PP Mguire
Octopuss: Some people are more than happy with 1080p/1200p and don't intend to buy larger monitors, you know?
The same people said the same thing about their 19" 1280x1024 Dell monitors, and now I bet they're all running 1080p or 1440p. I too once said 24" 1200p was all I needed for great gaming in '09, and now I'm running a 48" 4K TV. Times change, people change, and some faster than others. Even Sony and Microsoft are releasing revamped consoles to support 4K. Then again, nobody has a gun to your head saying upgrade, either. You like 1080p? Cool, a 980 Ti is an absolute monster for 1080p.
#99
trog100
printers are rated in dots per inch.. DPI..

maybe monitors need rating the same way.. PPI.. pixels per inch.. your 4K 48 inch TV makes sense to me but when i see 4K on a 17 inch laptop it just makes a nonsense of it..

it's like the megapixel race with still cameras.. for web viewing you don't need that many.. to make errr 48 inch prints you do though..

4K is 8 million pixels.. at what size point (or viewing distance) it simply becomes unnoticeable i haven't a clue, but there must be one..

my 1080 24 inch monitor at my normal viewing distance looked okay to me.. my 1440 27 inch monitor at the same viewing distance still looks okay to me..

however quite what sticking my nose 12 inches away from a 48 inch TV would make of things i don't know.. :)

4K doesn't come free.. at a rough guess i would say it takes 4x the gpu power to drive a game than 1080 does..

i would also guess that people view a 48 inch TV from a fair distance away.. pretty much like they do with large photo prints..

but unless viewing distances and monitor size are taken into account it's all meaningless..

trog
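The PPI point above is easy to quantify: pixel density is the diagonal pixel count divided by the diagonal size in inches. A quick Python sketch for the displays mentioned in this thread, using the sizes as quoted by the posters:

```python
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    """Pixel density: diagonal pixel count divided by diagonal size in inches."""
    return hypot(width_px, height_px) / diagonal_in

displays = [
    ("24-inch 1080p monitor", 1920, 1080, 24),
    ("27-inch 1440p monitor", 2560, 1440, 27),
    ("48-inch 4K TV",         3840, 2160, 48),
    ("17-inch 4K laptop",     3840, 2160, 17),
]

for name, w, h, d in displays:
    print(f"{name}: {ppi(w, h, d):.0f} PPI")
# The 48-inch 4K TV lands at ~92 PPI, about the same as a 24-inch 1080p monitor,
# while a 17-inch 4K laptop panel is ~259 PPI - hence the viewing-distance argument.
```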
#100
64K
Tech relentlessly moves forward. It always does. Always has. Always will. What is considered adequate this year is considered obsolete in a few years.

I remember when 320x200 resolution was considered cool on my C64. I used display list interrupts (raster interrupts) in machine language to put more sprites on screen at once than would normally have been possible, just for the hell of it. Rose-colored glasses? Yes, but those were good times for us.