
NVIDIA's Next Flagship Graphics Cards will be the GeForce X80 Series

There was the ATI/AMD x-1000 series in 2004 or 2005 IIRC, so it definitely would have been confusing had NVIDIA gone with the 1080, etc.
 
There are 8 GHz chips only in 8 Gb density, so 384-bit makes 12 GB, and at 1.5 V they will run very hot. Better to avoid that and definitely go for HBM2.
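For what it's worth, the arithmetic behind that 12 GB figure checks out if you assume standard 32-bit-wide GDDR5-style chips, one per channel (clamshell modes ignored) - a minimal sketch in Python:

```python
# Rough VRAM sanity check, assuming 32-bit-wide GDDR5-style chips
# (one chip per 32-bit channel; clamshell/x16 modes ignored).

def total_vram_gb(bus_width_bits, chip_density_gbit, chip_width_bits=32):
    """Total VRAM in GB for a given bus width and per-chip density."""
    chips = bus_width_bits // chip_width_bits   # one chip per channel
    return chips * chip_density_gbit / 8        # Gbit -> GB

print(total_vram_gb(384, 8))   # 384-bit bus, 8 Gb chips -> 12.0 GB
print(total_vram_gb(256, 8))   # 256-bit bus, 8 Gb chips -> 8.0 GB
```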
 
I actually thought that was exactly what has happened. Slow news week especially after the Capsaicin non-reveal, so why not plant some guesswork and watch the page hits flow.

Too close to Nvidia's (+ ex-3Dfx's GPU design team) NV30 series, I suspect. When PCI-Express first launched, Nvidia differentiated models from AGP with the PCX name. I suspect PX might not have the marketing cachet.

IMO I'd think GP100 would be the only real "compute" (FP64/double-precision) chip. I'd really expect GP104 to pull double duty as a high-end gaming GPU and enthusiast-level mobile option, so adding power-hungry SFUs might not be an option. As far as I'm aware, all Pascal chips will feature mixed compute allowing for half-precision (FP16) ops (as will AMD's upcoming chips), since game engines as well as other applications can utilize it effectively. I really think the days of a full-compute second (and lower) tier gaming GPU are well and truly behind us - both Nvidia and AMD have sacrificed double precision of late in the name of retaining a more balanced approach to power, die size, and application.
By compute I was really only referencing the Titans because they should be the only ones with HBM2 like the Teslas.
 
There was the ATI/AMD x-1000 series in 2004 or 2005 IIRC, so it definitely would have been confusing had NVIDIA gone with the 1080, etc.

Not really confusing, because those GPUs don't exist outside collector basements, and it was X1800/X1900, so some ways off numerically speaking.

Edit: No man, it's coming back to me, they went down to X12xx, but those were integrated graphics; the dedicated GPUs started at X1300.
 
To my knowledge neither HBM2 nor GDDR5X will ship in volume products until Q4, so don't expect any high-end models any time soon.

I wouldn't give any credence to these suggested specs.
 
To my knowledge neither HBM2 nor GDDR5X will ship in volume products until Q4, so don't expect any high-end models any time soon.

I wouldn't give any credence to these suggested specs.
GDDR5x goes into mass production this summer.
 
To my knowledge neither HBM2 nor GDDR5X will ship in volume products until Q4, so don't expect any high-end models any time soon.

I wouldn't give any credence to these suggested specs.
Who would? That table says the same chip (GP100) will be available with GDDR5 and HBM2 memory configurations. That alone is a dead giveaway, and let's not mention the rest.
 
My cards have 12GB, it's definitely not marketing hype.

People who think future titles and 4K gaming are realistic with 4GB are wrong. Actually, if people thought 4GB was enough for 1080p gaming, there wouldn't still be people complaining about the 970 like it's still a way to troll.

Because a $1k GPU with 12GB means it's not marketing hype... when it's ENTIRELY marketing surrounding that product.
 
Because a $1k GPU with 12GB means it's not marketing hype... when it's ENTIRELY marketing surrounding that product.
I'm talking about the VRAM amount, not the card.
 
History of notable GPU naming schemes:
GeForce 3 Ti500 (October 2001)
GeForce FX 4800 (March 2003)
GeForce 6800 GT (June 2004)
GeForce GTX 680 (March 2012)

GeForce FX 5800*

Also not sure why you picked up on the GTX 680 as being notable, when that naming system began with GT200 and the GTX 280. (Or the GTS 150, depending on who you ask.)

GeForce PX sounds more appropriate for Pascal

While that's cool, it seems weird to have put so much time and money into the build-up of the "GeForce GTX" brand only to kill it off just because they need to figure out some different numbers to put in the name. It seems more likely to just decide on some new numbers, keep the decade old branding that everyone already recognizes, and move on.
 
Nobody is copying from competitors, AMD does not have any cards named X80.
But they did have the All In Wonder series, the X800.
 
I miss the All In Wonder series cards..... I wish they'd bring those back.... AMD/ATi always did a top notch job on the multimedia side of their cards, those media center remotes some of those cards came with were amazing for their time.
 
4K is not market hype, it literally looks a ton crisper than 1080p and 1440p. You can't sit here and say 4x the average resolution is "market hype"; it's the next hump in the road whether some people want to admit it or not. I got my TV for less than most large-format 4K monitors, so I took the dive knowing Maxwell performance isn't up to par, but with 4K I don't need any real amounts of AA either - 2x at most in some areas depending on the game. That being said, I'd rather not have small incremental jumps in performance because some either can't afford it or won't find a way to afford it. That's called stagnation, and nobody benefits from that. Just look at CPUs for a clean-cut example of why we don't need stagnation.
Some people are more than happy with 1080/1200p and don't intend to buy larger monitors you know?
 
Personally I am waiting for the new gen cards as a reason to buy those 3440x1440 21:9 curved monitors...
 
i expect a 25% to 30% performance increase for the same money from the next generation of cards..

which in reality means not all that much.. 80 fps instead of 60 fps or 40 fps instead of 30 fps..

my own view is that once you add in g-sync or free-sync anything much over 75 fps doesn't show any gains..

4K gaming for those that must have it will become a little more do-able but not by much.. life goes on..

affordable VR will also become a bit more do-able..

trog
 
I see what they did there.

X80 = X = 10 = 1080 ;) so they thought X sounded "cooler" because marketing!

Time for a set of X80 Ti's ... :)
That's what ATI did 12 years ago. After 9800, came X800 = 10800, nothing new in this.


damn, should have read other comments too.. :)
 
AMD's current naming scheme is also good. The R9, R7 and R5 designators tell you what class it belongs to, and then the actual model digits follow. Plus it sounds good. R9 390X. Or R9 Fury X.
 
I think we'll all find the next Nvidia flagship card is called:

1) "That's so too super expensive - you must be fanboyz to buy it", or
2) "Nvidia crushed AMD, Team red is going bust lozers", or
3) some other alpha numerical variant with a possible 'Titan' slipped in.

And yes, I'm pushing this pointless 'news' piece to get to the 100 post mark. Come on everyone, chip in to make futility work harder.
 
Some people are more than happy with 1080/1200p and don't intend to buy larger monitors you know?
The same people said the same thing about their 19" 1280x1024 Dell monitors and now I bet they're all running 1080p or 1440p. I too once said 24" 1200p is all I need for great gaming in 09 and now I'm running a 48" 4k TV. Times change, people change, and some faster than others. Even Sony/Microsoft are releasing revamped consoles to support 4k. Then again, nobody has a gun to your head saying upgrade either. You like 1080p? Cool, a 980ti is an absolute monster for 1080p.
 
printers are rated in dots per inch.. DPI..

maybe monitors need rating the same way.. PPI.. pixels per inch.. your 4K 48 inch TV makes sense to me but when i see 4K on a 17 inch laptop it just makes a nonsense of it..

it's like the megapixel race with still cameras.. for web viewing you don't need that many.. to make errr 48 inch prints you do though..

4K is 8 million pixels.. at what size point (or viewing distance) it simply becomes unnoticeable i haven't a clue but there must be one..

my 1080 24 inch monitor at my normal viewing distance looked okay to me.. my 1440 27 inch monitor at the same viewing distance still looks okay to me..

however quite what sticking my nose 12 inches away from a 48 inch TV would make of things i don't know.. :)

4K doesn't come free.. at a rough guess i would say it takes 4 x the gpu power to drive a game than 1080 does..

i would also guess that people view a 48 inch TV from a fair distance away.. pretty much like they do with large photo prints..

but unless viewing distances and monitor size are taken into account it's all meaningless..
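Setting viewing distance aside, the pixel-density side of this is easy enough to rough out - a quick sketch in Python using plain diagonal PPI math, with the screen sizes mentioned in the thread:

```python
# Back-of-the-envelope PPI comparison; viewing distance is not modeled.
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch along the diagonal."""
    return hypot(width_px, height_px) / diagonal_in

for name, w, h, diag in [
    ('24" 1080p monitor', 1920, 1080, 24.0),
    ('27" 1440p monitor', 2560, 1440, 27.0),
    ('48" 4K TV',         3840, 2160, 48.0),
    ('17" 4K laptop',     3840, 2160, 17.0),
]:
    print(f"{name}: {ppi(w, h, diag):.0f} PPI")
# -> roughly 92, 109, 92 and 259 PPI respectively

# The rough "4x the GPU power" guess is just the pixel-count ratio:
print((3840 * 2160) / (1920 * 1080))   # -> 4.0
```

(Notice the 48 inch 4K TV and the 24 inch 1080p monitor land at the same PPI, which is why the TV only makes sense at a greater viewing distance.)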

trog
 