Friday, March 18th 2016

NVIDIA's Next Flagship Graphics Cards will be the GeForce X80 Series

With the GeForce GTX 900 series, NVIDIA has exhausted its GeForce GTX numbering, according to a sensational scoop from the rumor mill. Instead of going with a GTX 1000 series, which would carry one digit too many, the company is reportedly turning the page on the GeForce GTX brand altogether. Its next-generation high-end graphics card series will be the GeForce X80 series. Based on the performance-segment "GP104" and high-end "GP100" chips, the lineup will consist of the performance-segment GeForce X80, the high-end GeForce X80 Ti, and the enthusiast-segment GeForce X80 TITAN.

Based on the "Pascal" architecture, the GP104 silicon is expected to feature as many as 4,096 CUDA cores, along with 256 TMUs, 128 ROPs, and a GDDR5X memory interface good for 384 GB/s of bandwidth; 6 GB could be the standard memory amount. Its texture and pixel fill rates are rated 33% higher than those of the GM200-based GeForce GTX TITAN X. The chip will be built on the 16 nm FinFET process, with a TDP rated at 175 W.
Moving on, the GP100 is a whole different beast. It is built on the same 16 nm FinFET process as the GP104, and its TDP is rated at 225 W. A unique feature of this silicon is its memory controllers, which are rumored to support both GDDR5X and HBM2. There could be two packages for the GP100, depending on the memory type: the GDDR5X package would be the simpler of the two, with a large pin count wiring out to external memory chips, while the HBM2 package would be larger, housing the HBM stacks on the package itself, much like AMD's "Fiji." The GeForce X80 Ti and the X80 TITAN would hence differ in more than just CUDA core counts and memory amounts.

The GP100 silicon physically features 6,144 CUDA cores, 384 TMUs, and 192 ROPs. The X80 Ti gets 5,120 CUDA cores, 320 TMUs, 160 ROPs, and a 512-bit wide GDDR5X memory interface holding 8 GB of memory, with 512 GB/s of bandwidth. The X80 TITAN, on the other hand, enables every CUDA core, TMU, and ROP on the silicon, and pairs them with a 4096-bit wide HBM2 memory interface holding 16 GB of memory at a scorching 1 TB/s. Both the X80 Ti and the X80 TITAN double the pixel and texture fill rates of the GTX 980 Ti and GTX TITAN X, respectively.
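As a rough plausibility check, the quoted bandwidth figures follow directly from bus width and per-pin data rate (bandwidth in GB/s = bus width in bytes × data rate in Gbps). The Python sketch below assumes 8-12 Gbps per pin for GDDR5X, about 2 Gbps per pin for HBM2, and a 256-bit bus on the GP104; none of these values are stated in the leak itself.

```python
# Sanity check of the rumored memory bandwidth figures.
# The per-pin data rates and the GP104 bus width are assumptions, not part of the leak.

def bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak bandwidth in GB/s = bus width in bytes x per-pin rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

configs = {
    "X80 (GP104), 256-bit GDDR5X @ 12 Gbps (assumed)":    (256, 12.0),
    "X80 Ti (GP100), 512-bit GDDR5X @ 8 Gbps (assumed)":  (512, 8.0),
    "X80 TITAN (GP100), 4096-bit HBM2 @ 2 Gbps (assumed)": (4096, 2.0),
}

for name, (width, rate) in configs.items():
    print(f"{name}: {bandwidth_gbs(width, rate):.0f} GB/s")

# Output: roughly 384, 512 and 1024 GB/s, which lines up with the
# 384 GB/s, 512 GB/s and ~1 TB/s figures quoted above.
```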
Source: VideoCardz

180 Comments on NVIDIA's Next Flagship Graphics Cards will be the GeForce X80 Series

#101
xfia
:toast:102:toast:
#102
PP Mguire
trog100: printers are rated in dots per inch.. DPI..

maybe monitors need rating the same way.. PPI.. pixels per inch.. your 4K 48 inch TV makes sense to me but when I see 4K on a 17 inch laptop it just makes a nonsense of it..

it's like the megapixel race with still cameras.. for web viewing you don't need that many.. to make errr 48 inch prints you do though..

4K is 8 million pixels.. at what size point (or viewing distance) it simply becomes unnoticeable I haven't a clue but there must be one..

my 1080 24 inch monitor at my normal viewing distance looked okay to me.. my 1440 27 inch monitor at the same viewing distance still looks okay to me..

however quite what sticking my nose 12 inches away from a 48 inch TV would make of things I don't know.. :)

4K doesn't come free.. at a rough guess I would say it takes 4x the GPU power to drive a game than 1080 does..

I would also guess that people view a 48 inch TV from a fair distance away.. pretty much like they do with large photo prints..

but unless viewing distances and monitor size are taken into account it's all meaningless..

trog
You're forgetting what a higher rendering resolution does too, it's not all just screen and PPI. My 4K TV sits at the same distance as my 27" ROG Swift did. The clarity difference in gaming is substantial even from 1440p to 4K, regardless of screen size. Before, I was using copious amounts of AA, and now the most I use is 2x for aggressive lines like fences or rooftops. Naturally it depends on the game in the end, but even silly things like Rocket League show a major improvement. To kind of keep it on topic, this is where my whining about wanting a newer GPU architecture comes into play. Yes, it does take more power. I wouldn't say 4x the GPU power, but a bit more. Tbh I didn't expect to keep the TV as it was supposed to be handed off to my gf, but I liked it so much I sold my Swift and now I'm paying the consequences :laugh:
That's ok. I'm on an old-school gaming kick again so most of my time is spent on my 98 box lately. It'll tide me over enough, and 2x Titan X under water is more than plenty to max the frame cap in Rocket League, even with INI mods.
#103
xfia
ya know, the other day I held a Galaxy S4 and an S6 next to each other and took some pics.. even at such a small screen size the winner was clear.
#104
Dethroy
trog100: [...]
4K doesn't come free.. at a rough guess I would say it takes 4x the GPU power to drive a game than 1080 does..
[...]
trog
3840 / 1920 = 2
2160 / 1080 = 2
2 x 2 = 4

Good guess. But it only takes one glance at the numbers to see that it is 4x the pixels :cool:

But the 16:9 aspect ratio is so yesteryear. Can't believe the adoption of 21:9 is taking so long that even tech-savvy people here on TPU still don't have it on their radar.
I'd prefer 3440x1440 over 3840x2160 any day. Can't wait for the arrival of 1440p ultrawide 144 Hz HDR. 4K is dogshit and needs 67.44% more horsepower - but anyone foolish enough to believe all that marketing crap deserves to pay the price...
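For what it's worth, the arithmetic above checks out; a minimal Python sketch, using raw pixel count as a rough proxy for GPU load (it ignores everything else that changes with resolution):

```python
# Pixel-count comparison behind the figures in the post above.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "ultrawide 1440p": (3440, 1440),
    "4K UHD": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

# 4K vs 1080p: exactly 4x the pixels.
print(pixels["4K UHD"] / pixels["1080p"])            # 4.0

# 4K vs 3440x1440: ~1.674x, i.e. the ~67.44% extra "horsepower" quoted above.
print(pixels["4K UHD"] / pixels["ultrawide 1440p"])  # ~1.6744
```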
#105
PP Mguire
Dethroy: 3840 / 1920 = 2
2160 / 1080 = 2
2 x 2 = 4

Good guess. But it only takes one glance at the numbers to see that it is 4x the pixels :cool:

But the 16:9 aspect ratio is so yesteryear. Can't believe the adoption of 21:9 is taking so long that even tech-savvy people here on TPU still don't have it on their radar.
I'd prefer 3440x1440 over 3840x2160 any day. Can't wait for the arrival of 1440p ultrawide 144 Hz HDR. 4K is dogshit and needs 67.44% more horsepower - but anyone foolish enough to believe all that marketing crap deserves to pay the price...
I have a 34" 3440x1440 Dell at work and I don't like it. I also owned an LG FreeSync 29" ultrawide and didn't like that either. The vertical size is way too small for me. Most of my PC buds have the same issue with it.
#106
Frick
Fishfaced Nincompoop
To the 4K users: how do old games look on it? Say Diablo 2 unmodded.
#107
Prima.Vera
Frick: To the 4K users: how do old games look on it? Say Diablo 2 unmodded.
LOL. Saw that actually. 800x600 stretched on a 4K monitor.... blur pr0n fest. Like watching a very low-res video on your 1080p monitor.... :)))

However, if you use a hack, it looks like this :)))
#108
trog100
PP Mguire: You're forgetting what a higher rendering resolution does too, it's not all just screen and PPI. My 4K TV sits at the same distance as my 27" ROG Swift did. The clarity difference in gaming is substantial even from 1440p to 4K, regardless of screen size. Before, I was using copious amounts of AA, and now the most I use is 2x for aggressive lines like fences or rooftops. Naturally it depends on the game in the end, but even silly things like Rocket League show a major improvement. To kind of keep it on topic, this is where my whining about wanting a newer GPU architecture comes into play. Yes, it does take more power. I wouldn't say 4x the GPU power, but a bit more. Tbh I didn't expect to keep the TV as it was supposed to be handed off to my gf, but I liked it so much I sold my Swift and now I'm paying the consequences :laugh:
That's ok. I'm on an old-school gaming kick again so most of my time is spent on my 98 box lately. It'll tide me over enough, and 2x Titan X under water is more than plenty to max the frame cap in Rocket League, even with INI mods.
oddly enough I think using copious amounts of AA is more habit than anything else..

I have been experimenting with turning it off at 1440.. running the Heaven or Valley benchmarks and looking very carefully is a good test to use.. 8x AA creates a blurry image.. turning it off creates a sharper, more detailed image.. can I see jaggies.. maybe in certain situations but as a general rule the image just looks sharper with better defined edges..

but I am currently running my games at 1440 with AA either off or very low.. I am in power-saving mode which is why I don't just run everything balls out on ultra settings..

again this all comes down to screen size and viewing distances.. I know for a fact it does with high-quality still images.. so far I can't see any reason the same principle should not apply to moving images..

currently I am lowering settings to see just what I can get away with before it noticeably affects gameplay visuals..

trog
#109
Prima.Vera
Dethroy: But the 16:9 aspect ratio is so yesteryear. Can't believe the adoption of 21:9 is taking so long that even tech-savvy people here on TPU still don't have it on their radar.
I'd prefer 3440x1440 over 3840x2160 any day. Can't wait for the arrival of 1440p ultrawide 144 Hz HDR. 4K is dogshit and needs 67.44% more horsepower - but anyone foolish enough to believe all that marketing crap deserves to pay the price...
Completely agree. Personally I am waiting for this card to see if it can push games on 3440x1440 without any issues, because the 34 21:9 incher is already in my plans.
#110
Ithanul
arterius2: Should seriously upgrade your screen before anything in your case.
Have no interest in upgrading my screen. I don't game with both my Tis anyways. Note, I do have two higher-res screens, but neither of them is ever going to see gaming since they are for other things. *Dell UltraSharp 30 inch 1600p and a Wacom Cintiq 24HD that weighs a ton*

Plus, I have not gamed on my rig for the past several months. Been busy playing the Wii U instead. :p
64K: @Ithanul mostly uses GPUs for Folding.
Yep, 24/7 to the WALL! Plus, I abuse the crap out of GPUs. I have a dead GTX Titan to my name (good thing for the EVGA warranty). Though, I do more than just fold. My main rig is my powerhouse jack of all trades: folding, BOINCing, 3D rendering, RAW photos, etc. I also just got two old IBM servers. Woot, now to get some RAM and SCSI HDDs for them and get them working.
#111
Katanai
Well, personally I'm waiting for the next generation of GPUs to be released because I have to upgrade soon. I don't know if it will be exactly like this article says, but I know Nvidia will be superior once again, and I'll have to read countless posts on TPU about how AMD is better although it isn't, how Nvidia is evil, bla bla bla. All I have to say is: see you in 2017, AMD fanboys, when Nvidia will be superior to AMD like it was in 2016, 2015, 2014, etc... and I will be really sad about it:

#112
PP Mguire
trog100: oddly enough I think using copious amounts of AA is more habit than anything else..

I have been experimenting with turning it off at 1440.. running the Heaven or Valley benchmarks and looking very carefully is a good test to use.. 8x AA creates a blurry image.. turning it off creates a sharper, more detailed image.. can I see jaggies.. maybe in certain situations but as a general rule the image just looks sharper with better defined edges..

but I am currently running my games at 1440 with AA either off or very low.. I am in power-saving mode which is why I don't just run everything balls out on ultra settings..

again this all comes down to screen size and viewing distances.. I know for a fact it does with high-quality still images.. so far I can't see any reason the same principle should not apply to moving images..

currently I am lowering settings to see just what I can get away with before it noticeably affects gameplay visuals..

trog
How AA affects your image quality entirely depends on the type of AA being used. Multi-sampling in general only smooths geometry edges, super-sampling basically renders the entire scene at a higher resolution (which is why Metro killed many rigs with it enabled), and techniques like FXAA use a blurring method to reduce jaggies at a much lower cost to performance. Of course this is really simplified, but in general MS and SS shouldn't blur your image. Moving from 1080p/1440p to 4K is basically SS in itself, which is why I now prefer it. I use 2x MS if the situation calls for it, which is hardly ever.
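A back-of-the-envelope way to see the point about 4K acting as built-in super-sampling: pixel-shading work roughly tracks the number of samples actually shaded. The sketch below is a deliberately simplified cost model (it ignores bandwidth, the MSAA resolve, the FXAA pass and everything else a real GPU does), not a description of any actual renderer:

```python
# Simplified AA cost model: how many samples get run through the pixel shader.
def shaded_samples(width: int, height: int, ss_factor: float = 1.0) -> int:
    """Approximate pixel-shader invocations per frame."""
    return int(width * height * ss_factor)

native_1440p  = shaded_samples(2560, 1440)       # plain 1440p, no AA
ssaa_4x_1440p = shaded_samples(2560, 1440, 4.0)  # 4x SSAA: ~4x the shading work
msaa_4x_1440p = shaded_samples(2560, 1440, 1.0)  # 4x MSAA: ~1x shading, extra coverage samples/bandwidth
native_4k     = shaded_samples(3840, 2160)       # rendering at 4K instead

print(native_4k / native_1440p)  # 2.25 -> 4K shades ~2.25x as many pixels as 1440p,
                                 # which is why it behaves like modest super-sampling
```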
#113
Caring1
the54thvoid: And yes, I'm pushing this pointless 'news' piece to get to the 100-post mark. Come on everyone, chip in to make futility work harder.
Futility seems pointless, I'm still thinking about procrastinating. :nutkick: :laugh:
#114
medi01
L.ccd: And yet the same unsubstantiated crap is linked and talked about on a number of hardware sites. Why does TPU relay that made-up crap?
'Cause moar clicks and the most active comments section, I guess...
#115
Frick
Fishfaced Nincompoop
Prima.Vera: LOL. Saw that actually. 800x600 stretched on a 4K monitor.... blur pr0n fest. Like watching a very low-res video on your 1080p monitor.... :)))

However, if you use a hack, it looks like this :)))
And that is just daft IMO. :( Higher res is good up to a point, but in most old games I've found it best to stay on the low side. Everything becomes so tiny.
#116
the54thvoid
Super Intoxicated Moderator
Katanai: Well, personally I'm waiting for the next generation of GPUs to be released because I have to upgrade soon. I don't know if it will be exactly like this article says, but I know Nvidia will be superior once again, and I'll have to read countless posts on TPU about how AMD is better although it isn't, how Nvidia is evil, bla bla bla. All I have to say is: see you in 2017, AMD fanboys, when Nvidia will be superior to AMD like it was in 2016, 2015, 2014, etc... and I will be really sad about it:

Spoken like a true fanboy. FWIW, if Fiji clocked 10-15% higher, it would be faster than the 980 Ti. The only reason the 980 Ti is favoured is that it was designed to be more energy efficient and therefore clocks way higher. And before you say I'm a fanboy, my avatar is a 75.9% ASIC 980 Ti Kingpin with a hideously expensive US-imported Bitspower WB.

Fury X is a good card - it pretty much equals (or exceeds) an out-of-the-box 980 Ti. It just doesn't match it when overclocked, and as a lot of people point out, most non-techy folk don't overclock.
Caring1: Futility seems pointless, I'm still thinking about procrastinating. :nutkick: :laugh:
Wow! That's a metaphysical state of knowing you're unsure about even knowing if you want to start thinking about it. I bow to your supreme state of mental awareness. :respect:
#117
BiggieShady
Prima.Vera: However, if you use a hack, it looks like this :)))
Mind-boggling. I couldn't resist overlaying the original Diablo 2 game view on your image to get a sense of scale.
Funny coincidence: what you see on screen at 4K is roughly the area of the minimap in the original:
#118
Katanai
the54thvoid: Spoken like a true fanboy. FWIW, if Fiji clocked 10-15% higher, it would be faster than the 980 Ti. The only reason the 980 Ti is favoured is that it was designed to be more energy efficient and therefore clocks way higher. And before you say I'm a fanboy, my avatar is a 75.9% ASIC 980 Ti Kingpin with a hideously expensive US-imported Bitspower WB.
Yeah, yeah, IF Fiji clocked higher and IF the 980 Ti clocked lower and IF we lived on another planet, then it would be faster than the 980 Ti. And IF on that other planet Intel didn't exist, then AMD would have the best CPUs in the world. In the meantime, on planet Earth, TPU is one of the last corners of the internet where people still believe that AMD is worth anything...
#119
trog100
there is a very simple reason one team's chip can overclock more than the other team's.. the winning team sets the pace..

one chip is cruising (can go faster), the other chip is running close to balls out (can't go faster) trying to keep up..

this is also why the winning team's products (at stock) will use less power, generate less heat and make less noise.. it's a three-way win..

in a way it's a rigged race with one team running just fast enough to stay in the lead..

the winning teams currently are Intel and Nvidia.. not a lot else to say..


trog
#120
EarthDog
What??? o_O

You do know that AMD processors overclock more, percentage-wise, on average, right? Also, non-Fury products (read: RX series) tend to overclock quite well... there are exceptions on both sides. ;)

But there are other reasons why performance, power use, and overclocking are different.

Creative, that thinking... I will give you that!
#121
the54thvoid
Super Intoxicated Moderator
Katanai: Yeah, yeah, IF Fiji clocked higher and IF the 980 Ti clocked lower and IF we lived on another planet, then it would be faster than the 980 Ti. And IF on that other planet Intel didn't exist, then AMD would have the best CPUs in the world. In the meantime, on planet Earth, TPU is one of the last corners of the internet where people still believe that AMD is worth anything...
I own a 980 Ti for a reason - it is the fastest single-GPU card you can buy. I know Fiji's core is close to its top end while Maxwell is running easy, but the next chips are unknowns. If AMD get the shrink right they will have a formidable card. Nvidia ripped out Kepler's compute component with Maxwell; that's why they still sell a Kepler chip as one of their top-end HPC parts.

www.techpowerup.com/207265/nvidia-breathes-life-into-kepler-with-the-gk210-silicon.html

That sacrifice on Maxwell allowed the clocks to go higher with a lower power draw. Fiji kept a sizeable chunk of compute, and with the next process shrink it may turn out quite well for them.

I had a 7970 that clocked at 1300 MHz (huge back then, but a lot of them did it). My Kepler card didn't clock so well (compute runs hot and limits clocks), but with Maxwell and lower compute, the clocks are good. The base design of Fiji, which Polaris will be based on(?), has good pedigree for DX12. If Nvidia drop the ball (highly unlikely), AMD could take a huge lead in next-gen graphics.

As it is, I don't believe Nvidia will drop the ball. I think they'll bring their 'A' game with something quite impressive. At the same time, I think AMD will also be running with something powerful. We can only wait and see.

FTR - there are many forums out there that believe AMD are worth something.
#122
trog100
EarthDog: What??? o_O

You do know that AMD processors overclock more, percentage-wise, on average, right? Also, non-Fury products (read: RX series) tend to overclock quite well... there are exceptions on both sides. ;)

But there are other reasons why performance, power use, and overclocking are different.

Creative, that thinking... I will give you that!
it's pretty logical thinking.. one team will always have the edge.. that team will also hold back something in reserve.. all it has to do is beat the other team..

the team that is struggling to keep up doesn't have this luxury.. in simple terms it has to try harder.. trying harder means higher power usage, more heat and more noise..

from a CPU point of view AMD have simply given up trying to compete.. they are still trying with their GPUs..

when you make something for mass sale it needs a 5 to 10 percent safety margin to avoid too many returns..

back in the day when I moved from an AMD CPU to an Intel one.. I bought a just-released Intel chip.. it came clocked at 3 gig.. at 3 gig it pissed all over the best AMD chip.. within a few days I was benching it at 4.5 gig without much trouble..

a pretty good example of what I am talking about.. a 50% overclock is only possible if the chip comes underclocked in the first place..

trog
#123
EarthDog
trog100: a pretty good example of what I am talking about.. a 50% overclock is only possible if the chip comes underclocked in the first place..
That is how things work though, trog. It all depends on the quality of the silicon, binning, and sales, actually. ;)
#125
FRAGaLOT
I don't know why NVIDIA won't name the products after the cores they have in them. "GeForce GP104" sounds like a fine product name, along with "GeForce GP104 Ti" and "GeForce GP104 Titan", and whatever other gimped versions of the "GP10x" core they make for cheap cards.