# NVIDIA GeForce GTX 1080 Put Through 3DMark



## btarunr (May 5, 2016)

Some of the first 3DMark performance numbers for NVIDIA's upcoming GeForce GTX 1080 graphics card have made it to Futuremark's online database. The results pages hint at samples of the GTX 1080 running on early drivers, on two separate machines (likely from two different sources). The first source, who ran the card on a machine with a Core i7-5820K processor, scored P19005 in 3DMark 11 (performance preset). The second source, who ran the card on a machine with a Core i7-3770K processor, scored 8,959 points in 3DMark Fire Strike Extreme. Both scores point to the GTX 1080 being faster than a GTX 980 Ti.



 



*View at TechPowerUp Main Site*


----------



## RejZoR (May 5, 2016)

Numbers mean nothing to me if I don't have anything to compare them with. Can someone also post numbers for current-gen cards using a similar system?


----------



## EarthDog (May 5, 2016)

RejZoR said:


> Numbers mean nothing to me if I don't have anything to compare them with. Can someone also post numbers for current-gen cards using a similar system?


You can search the FM database........................

Anyway, it's about spot on with, or a bit faster than, a factory-overclocked 980 Ti and a 5820K.


----------



## ZoneDymo (May 5, 2016)

If it does not destroy the GTX 980 Ti, it's a fail in my book.


----------



## MxPhenom 216 (May 5, 2016)

ZoneDymo said:


> If it does not destroy the GTX 980 Ti, it's a fail in my book.



Considering it's likely not big-die Pascal, I disagree. However, it should royally destroy the 980 non-Ti. That's a better comparison.


----------



## badtaylorx (May 5, 2016)

wow, that's disappointing...

OC'd 980ti.... that's it???


----------



## Kursah (May 5, 2016)

That's what the 1080Ti is for.


----------



## the54thvoid (May 5, 2016)

This is not a 'ti'.

It's going to be slightly better (maybe) than a 980 Ti, I'd guess with greater power efficiency. What the videocardz piece also says is that LN2-cooled 980 Ti FM results at similar clocks destroy the 1080. So perhaps Pascal isn't very good until we see daddy Pascal. Same as when the Maxwell 980 wasn't really much better than a 780 Ti; this is the same all over again. It's only because of its high clock speed that it's improving...


----------



## EarthDog (May 5, 2016)

badtaylorx said:


> wow, that's disappointing...
> 
> OC'd 980ti.... that's it???


With a lot less power use... it's also not the flagship.


----------



## ssdpro (May 5, 2016)

This has been the pattern for years and is proper and expected. A new generation of cards will not make the previous gen obsolete or useless. The new 1080 should be on par with or slightly exceed the 980 Ti, the 1070 should meet or slightly exceed the 980, and the 1060 should meet or slightly exceed the 970. The difference is higher performance and less power, for less money. I would consider it a success if the "exceed" part is about 10%, since the lifespan of Maxwell has been quite long.


----------



## ZoneDymo (May 5, 2016)

So the two cards that actually stand for some of the better performance we actually need (and have needed, might I add, for quite a few generations now) will be the highest-priced cards?

Yeah, good stuff... anyone who gives a damn about actual progress, about something actually worth being called "next gen", would want a GTX 1060 at the level of a GTX 980 Ti now, and everything above it should destroy it.
But nope, guess it's another "generation" of mediocre upgrades unless you pay 700 dollars. Great, good stuff.

I mean: http://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_XtremeGaming/7.html

A current top-of-the-line card, which this new, pretty-much-close-to-top-of-the-line card gets similar performance to, gets a mere 43 fps at 2560x1440...
You are buying a card that is incapable from the start.
Let's not play around, people: this is the "80" line of cards, not the 70, not the 60, but the 80, and this is what we can expect from a future card? Not even being able to max a CURRENT game, let alone games in the near future?

I don't see the point.


----------



## medi01 (May 5, 2016)

EarthDog said:


> You can search the FM database........................



Yeah, that's why we read articles, to go dig numbers out there somewhere...


----------



## EarthDog (May 5, 2016)

Sorry, you have to make SOME effort on your own...

Why is that up to the person posting the news??!??


----------



## PP Mguire (May 5, 2016)

I think people tend to forget... a lot... that GP104 is a midrange chip.


----------



## EarthDog (May 5, 2016)

Well, in fairness, you are now the third person that brought it up in 15 posts...


----------



## PP Mguire (May 5, 2016)

EarthDog said:


> Well, in fairness, you are now the third person that brought it up in 15 posts...


And yet others are still going on about how this is shit, when in fact a midrange chip beating a top-end one is quite good.


----------



## EarthDog (May 5, 2016)

God bless the miracle of forums!!!


----------



## ZoneDymo (May 5, 2016)

PP Mguire said:


> And yet others are still going on about how this is shit, when in fact a midrange chip beating a top-end one is quite good.



Oh right, the 980 is midrange; add a "T" and an "i" and it's top end.
Makes sense.....


----------



## EarthDog (May 5, 2016)

ZoneDymo said:


> Oh right, the 980 is midrange; add a "T" and an "i" and it's top end.
> Makes sense.....


It does, actually. Welcome to NVIDIA nomenclature.


----------



## TheGuruStud (May 5, 2016)

This sounds like FUD.


----------



## erocker (May 5, 2016)

ZoneDymo said:


> Oh right, the 980 is midrange; add a "T" and an "i" and it's top end.
> Makes sense.....


That's exactly what they're doing. Makes sense, makes them more money.


----------



## ZoneDymo (May 5, 2016)

EarthDog said:


> It does, actually. Welcome to NVIDIA nomenclature.



the 60 is low-end gaming
the 70 is mid-range gaming
the 80 is high-end gaming
the Ti is a boosted version

Titan is overpriced

That's how the nomenclature worked;
a GTX 960 Ti would not suddenly be better than the 980 purely because of the Ti suffix.


----------



## EarthDog (May 5, 2016)

Yes and no...

The 980 and 980 Ti are DIFFERENT GPUs. It's not simply an overclocked 980. At the mid/lower level, the Tis tend to be overclocked versions, but there is sometimes a GPU difference; it has to be looked at on a case-by-case basis.

But yeah, the 980 and 980 Ti are two different cores under the hood. The 980 Ti and Titan X have the same GPU, which they do NOT share with the 980. So that is not 'what they are doing', at least at the flagship level. I don't know about the midrange... that may be true (you can take the time to look it up if you choose).


----------



## efikkan (May 5, 2016)

And this was pretty much exactly what we expected: performance slightly above the GTX 980 Ti...


----------



## newtekie1 (May 5, 2016)

EarthDog said:


> The 980 and 980 Ti are DIFFERENT GPUs. It's not simply an overclocked 980. At the mid/lower level, the Tis tend to be overclocked versions, but there is sometimes a GPU difference; it has to be looked at on a case-by-case basis.



The only Ti I remember that wasn't high-end was the GTX 560 Ti, and it was a different GPU from the GTX 560.


----------



## Frick (May 5, 2016)

EarthDog said:


> Yes and no...
> 
> The 980 and 980 Ti are DIFFERENT GPUs. It's not simply an overclocked 980. At the mid/lower level, the Tis tend to be overclocked versions, but there is sometimes a GPU difference; it has to be looked at on a case-by-case basis.
> 
> But yeah, the 980 and 980 Ti are two different cores under the hood. The 980 Ti and Titan X have the same GPU, which they do NOT share with the 980. So that is not 'what they are doing', at least at the flagship level. I don't know about the midrange... that may be true (you can take the time to look it up if you choose).



The 980 is still a high end card. No ifs or buts. It's €500. If the 1080 is about as much that too is a high end card.


----------



## EarthDog (May 5, 2016)

True... but I'm not sure what your point is in regard to the previous context (the dude just said the Ti is an overclocked version... it's NOT, from 980 to 980 Ti... it's a different GPU).


----------



## efikkan (May 5, 2016)

newtekie1 said:


> The only Ti I remember that wasn't high end was the GTX560Ti, and it was a different GPU than the GTX560.


What about GTX 750 Ti then? (Low end)


----------



## MagnuTron (May 5, 2016)

.. So it can run games in 1080P? .. he.. he.. my humor is amazing *cries in solitude*


----------



## Toothless (May 5, 2016)

MadsMagnus said:


> .. So it can run games in 1080P? .. he.. he.. my humor is amazing *cries in solitude*


1080i and struggling


----------



## EarthDog (May 5, 2016)

efikkan said:


> What about GTX 750 Ti then? (Low end)


That was the first Maxwell GPU... I think the 750 is a different GPU as well?

There was also the 660 Ti, and I believe it was overclocked.


----------



## xvi (May 5, 2016)

EarthDog said:


> But yeah, the 980 and 980 Ti are two different cores under the hood. The 980 Ti and Titan X have the same GPU, which they do NOT share with the 980.


Oh, my head.







Makes sense in a weird nVidia kind of way, since they release the Ti a bit later. From a marketing perspective, they wouldn't want the x80 slot in their lineup empty, only to be filled in a while later. Instead, I'm guessing they'll advertise the 1080 as top dog until enough competition comes out for them to release their actual top dog, the 1080 Ti.

Edit: Seems like that's going to put a decent performance gap between the 1080 and 1080 Ti, though. Somewhere for AMD to sneak in?


----------



## efikkan (May 5, 2016)

EarthDog said:


> That was the first Maxwell GPU... I think the 750 is a different GPU as well?
> 
> There was also the 660Ti and I believe it was overclocked.


No, they were different binnings of the same chip: GM107-400-A2 and GM107-300-A2.
The GTX 660 was GK106-400-A1 and the GTX 660 Ti was GK104-300-KD-A2.
The "GTX 650 Ti Boost" was an overclocked GTX 650 Ti.


----------



## idx (May 5, 2016)

I think the GTX 1080, or whatever the name will be, will have a low number of shaders and a really high clock speed. This is exactly what NVIDIA did with GM204 and GM200; they were boosting up to 1500 MHz in games.
In general, a low number of shaders means much less power consumption, and they make up for the low shader count with the high clocks.

If AMD ends up again with lower GPU clocks... it will be really bad for the red team. I really hope AMD is also going down the same path.


----------



## efikkan (May 5, 2016)

idx said:


> I think the GTX1080 or whatever the name will be .. will have a low number of shaders and a really high clock speed. this is exactly what nvidia did with GM204 and GM200 it was boosting up to 1500Mhz in game.
> In general low number of shaders means much less power consumption and they do make up for the low shaders with the high clocks.
> 
> if AMD end up again with a lower GPU clocks ... it will be really bad for the red team. I really hope AMD also going the same path.


The ~200mm² GPU from AMD will not be able to compete with the GTX 1080; the closest thing from AMD will be the Fury X. AMD will also have a hard time with the GTX 1070, but will probably be able to match a GTX 1060 (Ti).


----------



## ppn (May 5, 2016)

Well, the 1080 Ti isn't going to impress either. So the 330mm² chip has 2560 CUDA cores, and the 610mm² one sports only 3584 of 3840. If it isn't at least a full 5120 CUDA cores, it's a joke. Better to release the 400mm² chip with the full 3072 now, or just forget about it. This generation begins and ends with the 1060 Ti.


----------



## Sah7d (May 5, 2016)

Well... no point in upgrading my 980 Ti SLI.
So much hype for the "next gen", when actually NVIDIA has ALWAYS managed it this way:
the GREAT new generations of 10% more efficiency for 20% more cost.

In other words, if you want the performance of a 980 Ti in the mid-range,
wait for the GTX 2070 line in 2018, and that's optimistic; but it's a sure thing for the GTX 3070 in 2020.


----------



## idx (May 5, 2016)

efikkan said:


> The ~200mm² GPU from AMD will not be able to compete with the GTX 1080; the closest thing from AMD will be the Fury X. AMD will also have a hard time with the GTX 1070, but will probably be able to match a GTX 1060 (Ti).


Nothing is for sure yet; we will just have to wait and see. There is still one thing that AMD might be doing this time: the number of ROPs. If they go for something really high, like 128 ROPs, then they may really pull off something special this time.
I don't think NVIDIA is going for more than 80 ROPs in GP10x.


----------



## Space Lynx (May 5, 2016)

Kursah said:


> That's what the 1080Ti is for.




14nm was supposed to be a HUGE leap though; the 1080 non-Ti should have smashed a 980 Ti... the 1080 Ti will be a 15% increase max if we're lucky. It's sad, honestly. All the hype said a 50% increase because we're finally going to a new die shrink; the same hype they pushed about Skylake, and my 4.8 GHz 2500K still owns a stock Skylake in games, lulz...


----------



## efikkan (May 5, 2016)

ppn said:


> Well, the 1080 Ti isn't going to impress either. So the 330mm² chip has 2560 CUDA cores, and the 610mm² one sports only 3584 of 3840. If it isn't at least a full 5120 CUDA cores, it's a joke. Better to release the 400mm² chip with the full 3072 now, or just forget about it. This generation begins and ends with the 1060 Ti.


The GTX 1080 Ti isn't going to be based on GP104, and won't arrive for many months yet.

Which 400mm² chip are you talking about?



Sah7d said:


> Well... no point for upgrading my 980Ti SLI
> So much hype for the "next gen" when actually Nvidia has ALWAYS managed this way
> the GREAT new generations of 10% more efficiency for 20% more cost


You are right, there's no sense in upgrading your GTX 980 Tis, but we all knew that already, right? It has been known for a while that the first Pascal will be GP104, so it won't push the high end just yet.



idx said:


> Nothing is for sure yet; we will just have to wait and see. There is still one thing that AMD might be doing this time: the number of ROPs. If they go for something really high, like 128 ROPs, then they may really pull off something special this time.
> I don't think NVIDIA is going for more than 80 ROPs in GP10x.


Neither AMD nor Nvidia is limited by ROP performance at this time, and AMD will only compete in the lower mid-range and low end with Polaris, so I don't see why ROPs are going to matter.


----------



## EarthDog (May 5, 2016)

efikkan said:


> No, they were different binnings of the same chip: GM107-400-A2 and GM107-300-A2.
> The GTX 660 was GK106-400-A1 and the GTX 660 Ti was GK104-300-KD-A2.
> The "GTX 650 Ti Boost" was an overclocked GTX 650 Ti.


Excellent! Thanks for the clarity!


----------



## Tsukiyomi91 (May 5, 2016)

I'll keep my GTX 970s for another 2 years, since Pascal hasn't really taken off, despite promising benchmarks stating it can more or less keep up with a GTX 980 Ti. Seems we need to give Pascal some time to mature, I guess...


----------



## ppn (May 5, 2016)

A hypothetical ~400mm² GP102 would be nice. There is no reason for a lower-clocked, 3584-core GP100-based card if 3072 CUDA cores can do the same job. So we don't know what the Ti is going to be; surely 610mm² seems inefficient when a quarter of it represents dead weight.


----------



## DarkOCean (May 5, 2016)

idx said:


> Nothing is for sure yet; we will just have to wait and see. There is still one thing that AMD might be doing this time: the number of ROPs. If they go for something really high, like 128 ROPs, then they may really pull off something special this time.
> I don't think NVIDIA is going for more than 80 ROPs in GP10x.


Higher clocks = more Gpixels/s, so I doubt it will have more than 64.
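That relationship is just ROPs times core clock. A quick sketch of the math (the clock figures below are illustrative assumptions, not confirmed GP104 specs):

```python
# Theoretical peak pixel fill rate = ROPs x core clock.
# Higher clocks let a narrower back end match or beat a wider, slower one.
def fill_rate_gpixels(rops: int, clock_mhz: float) -> float:
    """Peak fill rate in Gpixels/s."""
    return rops * clock_mhz / 1000.0

# 64 ROPs at a hypothetical ~1800 MHz vs. 96 ROPs at ~1000 MHz:
print(fill_rate_gpixels(64, 1800))  # 115.2 Gpixels/s
print(fill_rate_gpixels(96, 1000))  # 96.0 Gpixels/s
```

So at those clocks, 64 ROPs would already out-fill a much wider but slower part.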


----------



## Casecutter (May 5, 2016)

If it's priced appropriately, below the last "top" mid-range offering, it might offer some counterpoint to this merely commensurate performance increase. To me it feels like a $400 MSRP.


----------



## jabbadap (May 5, 2016)

efikkan said:


> No, they were different binnings of the same chip: GM107-400-A2 and GM107-300-A2.
> The GTX 660 was GK106-400-A1 and the GTX 660 Ti was GK104-300-KD-A2.
> The "GTX 650 Ti Boost" was an overclocked GTX 650 Ti.



The GTX 650 Ti Boost has base and boost clocks, which the GTX 650 Ti does not have. The other difference is that the Boost has GK106's full 192-bit memory interface, while the non-Boost has 128-bit. All added up, the Boost is well faster than the non-Boost.


----------



## idx (May 5, 2016)

efikkan said:


> GTX 1080 Ti isn't going to be based on GP104, and wouldn't arrive for many months yet.
> 
> Which 400mm² chip are you talking about?
> 
> ...



ROPs really do matter, especially if you have a high number of shaders: the more shaders you've got, the more pixels can be pushed through (actually processed). However, there is other work to be done along the pipeline with each pixel, like the vertices (polygons), and here is where the ROPs come in. Yes, with more shaders you can run at high resolutions, but you also need to push a huge number of polygons every frame.
Notice that Fiji cards were actually doing great when it came to running games at high resolutions, but they didn't seem to go that much faster at lower resolutions; that's because the GPU was limited by its much smaller ROP count (only 64). For such a huge GPU with 4K SPs, that was really a bottleneck. If the Fiji GPUs had more ROPs and TMUs, things would have been much better for AMD.
AMD's Raja did actually say something about this, and the reason was something related to 28nm limitations with such a huge number of shaders (honestly, I didn't understand part of what he said, but it was something related to that).

Sorry for getting too nerdy... lol


----------



## efikkan (May 5, 2016)

idx said:


> ROPs really does matter specially if you have high number of shaders, the more shaders you got the more pixels that can be pushed through (actually processed), however there are other stuff to be done along the pipeline with each pixel, like vertices (polygons) and here where ROPs comes in.


ROPs only matter when you increase the amount of rasterization: higher resolution, higher framerate, higher AA, higher-resolution temporary framebuffers, etc. More GFlop/s increases the throughput and does of course increase ROP load if there are no other bottlenecks, but someone would first need to increase that throughput. AMD is currently struggling a lot with their architecture despite having up to 50% more GFlop/s than Nvidia, so the other bottlenecks will have to be addressed first.



idx said:


> notice that Fiji cards where actually doing great when it comes to running games at high res but it didnt seem to go that much faster at lower res, thats because the gpu was limited by its much smaller ROPs (only 64) for such a huge GPU with 4k SP that was really a bottleneck.


As stated above, that's not true at all. Higher resolution increases the load on the ROPs, so that's irrelevant in this case. The problem with the Fury X is the scheduler, which is unable to feed the massive number of cores.
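For reference, the raw-throughput gap mentioned above can be sketched from the public shader counts and clocks (base clocks here; boost behavior varies, so treat the ratio as approximate):

```python
# Peak FP32 throughput: 2 FLOPs per shader per clock (fused multiply-add).
def tflops(shaders: int, clock_ghz: float) -> float:
    """Theoretical single-precision throughput in TFLOPS."""
    return 2 * shaders * clock_ghz / 1000.0

fury_x = tflops(4096, 1.05)     # Fiji: 4096 SPs at 1050 MHz -> ~8.6 TFLOPS
gtx_980_ti = tflops(2816, 1.0)  # GM200: 2816 cores at ~1000 MHz base -> ~5.6 TFLOPS
print(fury_x / gtx_980_ti)      # ~1.5x on paper
```

A ~1.5x paper advantage that doesn't show up in delivered frames is exactly why the bottleneck has to be elsewhere in the pipeline.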


----------



## truth teller (May 5, 2016)

truth teller said:


> you guys are expecting way too much from this, its just a rehash from maxwell, there are no architectural changes. performance gains will be marginal in most cases. and in case of improvement in performance it will most likely be related to ram/bus saturation of the previous gen (inb4 3.5).


i don't really get why people are so surprised by the outcome, we already _knew_ this would happen.

the sad part is nbadia is still going to sell a lot of these "best" gpus to their religious flock...
"hey look im part of that orderof10tacles secrect club stuff! is awesum! i pity u lame poor outsiders"


----------



## EarthDog (May 5, 2016)

truth teller said:


> i dont really get why people are so surprised of the outcome, we already _knew_ this would happen.
> 
> the sad part is nbadia is still going to sell a lot of these "best" gpus to their religious flock...
> "hey look im part of that orderof10tacles secrect club stuff! is awesum! i pity u lame poor outsiders"


Is it just a rehash though? Before we go down this rabbit hole, how do you define 'rehash'? I define it as AMD does it... reBRANDING the same EXACT GPU (with more vRAM and different clocks). THAT is a rehash/rebrand to me.

I am pretty sure Pascal has some architectural differences from Maxwell, as well as being more efficient per clock.


----------



## HumanSmoke (May 5, 2016)

the54thvoid said:


> What the videocardz piece also says is that LN2-cooled 980 Ti FM results at similar clocks destroy the 1080. So perhaps Pascal isn't very good until we see daddy Pascal. Same as when the Maxwell 980 wasn't really much better than a 780 Ti; this is the same all over again. *It's only because of its high clock speed that it's improving.*..


That is at least something. A theoretical clock-for-clock "loss" to GM200 doesn't mean too much in the real world if 99.999% of 980 Ti/Titan X users are capped at <1500-1550 MHz. Like the previous generation, big die to small die isn't the natural upgrade path, but it does take the shine off the previous kings of the hill.
Of more interest (at least for me) will be:
- ROP count (although Maxwell wasn't particularly ROP limited), just as an indicator of what we might see with GP102/GP100
- TMU throughput, something that has limited Maxwell to a degree. It still wouldn't surprise me to see the 980 Ti/Titan X shade GP104 in texture fill and narrow any gap to the new Pascal at high resolutions/DSR/FSAA.
- Board power limits and overclocking headroom.


truth teller said:


> the sad part is nbadia is still going to sell a lot of these "best" gpus to their religious flock...
> "hey look im part of that orderof10tacles secrect club stuff! is awesum! i pity u lame poor outsiders"


Because that is solely an Nvidia trait, obviously...

Demand the best... get a neutered Tonga. No hyperbole there, then.


----------



## truth teller (May 5, 2016)

EarthDog said:


> I am pretty sure Pascal has some architectural differences over Maxwell as well as being more efficient per clock.


does a pig with makeup stop being a pig?


----------



## TissueBox (May 5, 2016)

Really surprised by the community here. The 1080 replaces the 980, and it is on average ~34% faster (assuming GTX 980 Ti @ 1190 MHz performance levels). Similar to the 980 reveal, which was about 30% better than the 780 and just barely faster than the 780 Ti (5-10%).

That's pretty solid. 980 Ti owners should be looking at the 1080 Ti or Pascal Titan as their replacement. 

I'll probably opt for SLI 1080s as it shifts down a price bracket when the 1080 Ti releases.


----------



## idx (May 5, 2016)

efikkan said:


> ROPs only matter when you increase the amount of raserization; higher resolution, higher framerate, higher AA, higher resolution temporary framebuffers, etc. More Gflop/s increases the throughput and does of course increase ROP load if there are no other bottlenecks, but then someone first would need to increase the throughput. AMD is currently struggling a lot with their architecture, despite having up to 50% more GFlop/s than Nvidia, so the other bottlenecks will have to be addressed first.
> 
> 
> As stated above, not true at all. Higher resolution increases the load on the ROPs, so that's irrelevant in this case. The problem with Fury X is the scheduler which is unable to feed the massive amount of cores.



I'm sorry, I don't mean to seem like I'm arguing, but it seems that you don't understand my point.

The processing of pixel shaders and any other kind of shaders is already done by the time the data reaches the *rasterization* stage, and that is where the rasterization of vector graphics happens, at the GPU's ROPs. That's why it is so important to have more ROPs when there is a big number of SPs and TMUs on the GPU.

Again, I really apologize if I sounded like I was arguing. I'm really not; I'm just sharing info (talking from experience).


----------



## Legacy-ZA (May 5, 2016)

ZoneDymo said:


> So the two cards that actually stand for some of the better performance we actually need (and have needed, might I add, for quite a few generations now) will be the highest-priced cards?
> 
> Yeah, good stuff... anyone who gives a damn about actual progress, about something actually worth being called "next gen", would want a GTX 1060 at the level of a GTX 980 Ti now, and everything above it should destroy it.
> But nope, guess it's another "generation" of mediocre upgrades unless you pay 700 dollars. Great, good stuff.
> ...



I am so glad somebody else gets it.


----------



## xvi (May 5, 2016)

I'd be curious to see what kind of mobile offerings this generation spawns. Performance per watt, please.


----------



## N3M3515 (May 5, 2016)

ZoneDymo said:


> Yeah, good stuff... anyone who gives a damn about actual progress, about something actually worth being called "next gen", would want a GTX 1060 at the level of a GTX 980 Ti now, and everything above it should destroy it.
> But nope, guess it's another "generation" of mediocre upgrades unless you pay 700 dollars. Great, good stuff.



This. MEDIOCRE upgrade. And yes, the GTX 1060 should be above every single current GPU.



lynx29 said:


> 14nm was supposed to be a HUGE leap though; the 1080 non-Ti should have smashed a 980 Ti... the 1080 Ti will be a 15% increase max if we're lucky. It's sad, honestly. All the hype said a 50% increase because we're finally going to a new die shrink; the same hype they pushed about Skylake, and my 4.8 GHz 2500K still owns a stock Skylake in games, lulz...



This.



Frick said:


> The 980 is still a high end card. No ifs or buts. It's €500. If the 1080 is about as much that too is a high end card.



This.



ZoneDymo said:


> the 60 is low-end gaming
> the 70 is mid-range gaming
> the 80 is high-end gaming
> the Ti is a boosted version
> ...



This.



ZoneDymo said:


> If it does not destroy the GTX 980 Ti, it's a fail in my book.



Exactly! This is more of an incremental upgrade, as if they were still on 28nm.



TissueBox said:


> Really surprised at the community here. The 1080 replaces the 980, and it is on average ~34% faster (assuming GTX 980 Ti @ 1190MHz performance levels). Similar to the 980 reveal, which was about 30% better than the 780 and just barely faster than the 780 Ti (5-10%).
> 
> That's pretty solid. 980 Ti owners should be looking at the 1080 Ti or Pascal Titan as their replacement.
> 
> I'll probably opt for SLI 1080s as it shifts down a price bracket when the 1080 Ti releases.



Pretty solid? Seriously?
So you expect that a video card on 14nm performs only 10% better than one on 28nm? WTF???
The 1060 should be on par with the 980 Ti! That is the way it always is when there is a node change and a generational change; hell, they even skipped 20nm!

Where is the huge jump in performance from 14nm? Could anybody talk about that?
From my own perspective, AMD and NVIDIA are screwing with us.


----------



## efikkan (May 5, 2016)

idx said:


> I am sorry, I don't mean to look like arguing. but It seems that you dont understand my point.


That's OK 

I completely understood what you said. Your first point was correct (when everything else scales, you need more ROPs as well); your second was not (the one about Fiji being limited by ROPs in 4K).


----------



## efikkan (May 5, 2016)

N3M3515 said:


> So, you expect that a vcard on 14nm performs 10% more than one on 28nm? wtf???
> The 1060 should be on par with 980Ti!, that is the way it always is when there is a node change and generational change, hell, they even skipped 20nm!!!!!!
> 
> Where is the huge jump in performance from 14nm? could anybody talk about that??
> From my own perspective AMD and NVIDIA are screwing with us.


Yes we can!
Well, I think your expectations are too high based on PR. What TSMC calls "16nm" FinFET and Samsung calls "14nm" FinFET would be classified as "20nm" FinFET if it were made by Intel...


----------



## G33k2Fr34k (May 5, 2016)

The 980 Ti with an i7-4770K gets around 12,000 points in Fire Strike Extreme. Correct me if I'm wrong, but isn't the GTX 1080 slower, with a score of ~9,000 running on an i7-3770K?


----------



## TissueBox (May 5, 2016)

N3M3515 said:


> Pretty solid?, seriously?
> So, you expect that a vcard on 14nm performs 10% more than one on 28nm? wtf???
> The 1060 should be on par with 980Ti!, that is the way it always is when there is a node change and generational change, hell, they even skipped 20nm!!!!!!
> 
> ...



It roughly follows the same increase in performance per generation: Kepler to Kepler refresh (20-30%), and Kepler refresh to Maxwell (20-40%). As I recall, the switch to 16nm offers a 40% improvement in performance compared to 28nm, *OR* uses 50% less power.

It performs ~34% better than the similar card in their previous generation, so yes, I would say it's solid (or perhaps I should say expected) and in line with what they've been doing.
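That uplift figure is simple arithmetic. A sketch, where the GTX 980 score is an assumed ballpark (a hypothetical reference; substitute a real database entry before trusting the number):

```python
# Percent uplift of a new benchmark score over an old one.
def uplift_pct(new_score: float, old_score: float) -> float:
    return (new_score / old_score - 1.0) * 100.0

# Leaked GTX 1080 Fire Strike Extreme score vs. an ASSUMED ~6,700
# for a stock GTX 980 on a comparable system:
print(round(uplift_pct(8959, 6700), 1))  # ~33.7
```

Plugging in the leaked score against that assumed baseline lands right around the ~34% being discussed.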


----------



## yogurt_21 (May 5, 2016)

So...
1. It pretty much matches a factory-clocked 980 Ti performance-wise.
2. The commas are missing from the memory, core clock, and memory clock compared with regular entries:
http://www.3dmark.com/fs/6704407
http://www.3dmark.com/fs/6264721
3. There's a 1290 MHz difference between the two core clocks.
4. The memory bus speed is over twice that of a 980 Ti.

I call BS. These are 980 Tis with edited graphics card specs on the entry.


----------



## efikkan (May 5, 2016)

TissueBox said:


> It roughly follows the same increase in performance per generation - Kepler to Kepler refresh (20%-30%), and Kepler refresh to Maxwell (20-40%). As I recall, the switch to 16nm offers a 40% improvement in performance compared to 28nm, OR use 50% less power.
> 
> It performs ~35% better than the similar card in their previous generation, so yes, I would say it's solid (or perhaps I should say expected) and in line with what they've been doing.


Yes, and accounting for the significant performance increase of the Pascal architecture, a ~400mm² chip should be able to achieve up to 70% or so over GM204; but since this is a new node, GP104 is only ~333mm², so a ~35% performance gain over the GTX 980 is exactly what's to be expected.

GV100 (2018) is scheduled to be made on 10nm FinFET, but it wouldn't surprise me if the successor of GP104 is still made on 16nm FinFET, with a slightly bigger chip (~400mm²); with the architectural improvements we'd get another 30% once again...</speculation>
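The area argument above can be put into rough numbers. A sketch where the per-mm² uplift factor is inferred from the thread's own figures (die sizes and the ~35% gain are the values quoted here, not measurements):

```python
# If GP104 (~333 mm^2) delivers ~35% over GM204 (~398 mm^2), the implied
# performance-per-mm^2 gain from the node + architecture combined is:
gm204_area, gp104_area = 398.0, 333.0
observed_gain = 1.35  # ~35% over GTX 980, as quoted in the thread

perf_per_mm2_gain = observed_gain * gm204_area / gp104_area
print(round(perf_per_mm2_gain, 2))  # ~1.61x per mm^2

# A hypothetical ~400 mm^2 successor at the same perf/mm^2, vs. GM204:
print(round(perf_per_mm2_gain * 400.0 / gm204_area, 2))  # ~1.62x
```

So a ~400mm² part at the same density would land in the ballpark of the "up to 70%" figure, give or take clock and memory scaling.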


----------



## PP Mguire (May 5, 2016)

ZoneDymo said:


> Oh right, the 980 is midrange; add a "T" and an "i" and it's top end.
> Makes sense.....


GM204 is midrange, GM200 high end. GP104 is midrange; GP100 next year will be high end. They've done the same release strategy since the 680. I could bet money on them doing the same with Volta in 2018.


----------



## truth teller (May 5, 2016)

yogurt_21 said:


> 1. It pretty much matches a factory clocked 980 Ti performance wise


as expected, i hope this doesnt come as a surprise to you



yogurt_21 said:


> 4. The memory bus speed is over twice that of a 980 ti


again, as expected for gddr5x; not that it's gonna do any better than the previous gddr5.
does your car go faster just because you are on a racing circuit?



yogurt_21 said:


> I call BS.


don't; you will be sad once the cards are launched and definitive reviews are posted


----------



## TheHunter (May 6, 2016)

RejZoR said:


> Numbers mean nothing to me if I don't have anything to compare them with. Can someone also post numbers for current gen cards using similar system?





EarthDog said:


> You can search the FM database........................
> 
> Anyway, its about spot on/a bit faster than a factory overclocked 980Ti and 5820K.




This is my factory-OC'd 980 Ti @ ~1418-1443 MHz boost, vram stock at 1806 MHz (346 GB/s).






So only 300 points more, or a 1-2 fps difference... right.
And the way that the GTX 1080 @ 1800 MHz is clocked, I bet it won't have much room left; probably a 2000 MHz limit.



3DMark Fire Strike Extreme, same 1418 MHz, vram stock 1805 MHz




> Score 8 679 with NVIDIA GeForce GTX 980 Ti(1x) and Intel Core i7-4770K
> Graphics Score
> 9 340
> Physics Score
> ...


http://www.3dmark.com/fs/8357945

Ok, here it's a 600-700 point difference, but that's minimal..



Imo, overall it's nothing special; I suspected it would be like that.. slightly OC'd 980Ti performance at best.

I think sometimes it will be slower, when those extra 16 ROPs and 20-30 GB/s count.

Personally, I don't regret anything; I bought this 980Ti two months ago for 420€, sold my old GTX 780 for 250€ (originally paid 320€), and it's a bomb. For me, only big Pascal or next-next-gen Volta and AMD Navi are really interesting. Think I'll just skip Pascal altogether.


----------



## lemkeant (May 6, 2016)

I'm pretty surprised by the arguments here. There's no way Nvidia or AMD is going to blow their load on the first-gen 14/16nm cards. Why? As we saw with 28nm, this process is going to need to last at least a few years (with Moore's law being mostly dead).

The next set of cards will be good enough to beat the previous cards, but that's it. The next series from there will repeat the same trend, and a few years from now we'll have the 1380Ti and R9 590X still built on the same 14/16nm process, just larger and more power-hungry.


----------



## cedrac18 (May 6, 2016)

Hmmm, is no one expecting better performance when the drivers mature? Not sure why everyone is flipping out already when these results aren't even from a proper review.


----------



## ensabrenoir (May 6, 2016)

......same pro/con arguments....just a new year........its all just business.....Nvidia like intel like everyone else should be, is out to make* consistent * profit. year after year, model after model....business 101.


----------



## truth teller (May 6, 2016)

cedrac18 said:


> Hmmm is no one expecting better performance when the drivers mature? Not sure why is everyone is flipping out already when these results are not even from proper review.


if these "new" cards could have their performance increased by driver tweaks, then Maxwell cards could get the same performance increase. drivers for that generation are pretty mature by now, so it will _not_ happen
why are people still thinking this is an entirely new gpu? jesus christ folks, you have the numbers right in front of you...


----------



## dj-electric (May 6, 2016)

GTX 1070 (350$) slightly stronger than a GTX 970 (330$)? wtf?
GTX 960 (200$) slightly stronger than a GTX 760 (200$)? wtf?
HD 6870 (239$) slightly stronger than an HD 5850 (229$)? wtf?
HD 5850 (260$) slightly stronger than an HD 4890 (250$)? wtf?
HD 4850 (200$) slightly stronger than an 8800 GT (160$)? wtf?

C'mon guys, we've been here many, many times. It's a small step in performance with another in efficiency.
These are all examples of replacements and competitors with an MSRP higher than the card they replaced - *that's how it starts*. Sometimes it doesn't, but mostly it does.


----------



## Caring1 (May 6, 2016)

truth teller said:


> again, as expected for gddr5x, not that its gonna do any better than the previous gddr5
> does you car go faster just because you are in a racing circuit?


Fuck yeah! No silly restrictions to slow me down.


----------



## Prima.Vera (May 6, 2016)

To be honest, after all the hype and giggles regarding the new generation, I was hoping that at least the 1070 would be ~5% faster than the 980Ti, just like the previous gen was over the one before it. Seeing that the 1080 barely beats a stock 980Ti is so disappointing...


----------



## rtwjunkie (May 6, 2016)

Frick said:


> The 980 is still a high end card. No ifs or buts. It's €500. If the 1080 is about as much that too is a high end card.



Um no....it's an upper-midrange, using the GM204 chip which happens to NOT be Nvidia's high end Maxwell chip.  

Cost has nothing to do with whether a GPU is high-end, and everything to do with it just being expensive.


----------



## Prima.Vera (May 6, 2016)

rtwjunkie said:


> Um no....it's an upper-midrange, using the GM204 chip which happens to NOT be Nvidia's high end Maxwell chip.


I always thought that for nVidia, upper-midrange means the *70 series and lower-midrange the *60 series, with the <midrange> gap being filled by AMD...


----------



## rtwjunkie (May 6, 2016)

Prima.Vera said:


> I always thought that from nVidia, upper-midrange means *70 series, lower-midrange *60 series, with the <midrange> gap being filled by AMD...



No, it goes by chip, and the 70s are firmly mid-range. They are always designed to just beat the previous gen's 80 series (but not the 80Ti version). The 80 was on the same chip, topping out performance for GM204. The 960s are very mainstream, but lower mid-range.


----------



## Frick (May 6, 2016)

rtwjunkie said:


> Um no....it's an upper-midrange, using the GM204 chip which happens to NOT be Nvidia's high end Maxwell chip.
> 
> Cost has nothing to do with whether a GPU is high-end, and everything to do with it just being expensive.



It depends on how you look at it, and whether you think "high end" is just one chip or not, but frankly I don't see why you would think that. Is the low end made up solely of the GT 710 because that is the slowest card, and does that make the GT 720 "very-slightly-less-low-end"? The 980 is the second-fastest mainstream card from Nvidia (not counting the Titan). From a market standpoint, cost has everything to do with it, because that is how the market is segmented and perceived. But it's all semantics anyway, and I definitely see the various "end"s as ranges.


----------



## Kissamies (May 6, 2016)

PP Mguire said:


> And yet others still going on about how this is shit, when in fact a midrange beating a top end is quite good.


I'd call it high-end and the big-chip model enthusiast. Just like 980 is high-end and 980Ti is enthusiast class.

If the difference is similar to 980 vs 780Ti, I'd say there's absolutely no sense in upgrading from a 980Ti.


----------



## N3M3515 (May 6, 2016)

TissueBox said:


> It roughly follows the same increase in performance per generation - Kepler to Kepler refresh (20%-30%), and Kepler refresh to Maxwell (20-40%). As I recall, the switch to 16nm offers a 40% improvement in performance compared to 28nm, *OR* use 50% less power.
> 
> It performs ~34% better than the similar card on their previous generation, so yes, I would say it's solid (Or perhaps I should use expected) and in line with what they've been doing.



Maxwell was already spectacular at power consumption.
Do you remember when node changes meant huge leaps, or am I dreaming? What you say is correct for cards on the same node. Look how many years it took for a node change; at the very least one should expect a 60% performance increase from a GTX 960 to a GTX 1060.

Wake up, people.


----------



## sweet (May 6, 2016)

9700 Pro said:


> I'd call it high-end and the big-chip model enthusiast. Just like 980 is high-end and 980Ti is enthusiast class.
> 
> If the difference is similar like 980 vs 780Ti, I'd say that there's absolutely no sense to upgrade from 980Ti.


Don't worry, nVidia's drivers will force you to upgrade soon after Pascal is released. It happened before and it will happen again.


----------



## TissueBox (May 6, 2016)

N3M3515 said:


> Maxwell was already expectacular at power comsumption.
> Do you remember when node changes meant huge leaps or am i dreaming. What you say is correct for vcards at the same node. Look how many years it took for a node change, at the very least one should expect a 60% increase in perf from a gtx 960 to a gtx 1060.
> 
> Wake up people.



No, as a matter of fact, I do not remember the huge 60% leaps from node changes you're referring to. The jump from 65nm (8800 GTX/9800 GTX) to 55nm (GTX 285) was 30%. The jump from 55nm (GTX 285) to 40nm (GTX 480) was 30%. The jump from 40nm (GTX 580) to 28nm (GTX 680) was 15%.

All the reviews can be found on TechPowerUp.


----------



## johnspack (May 6, 2016)

Dammit, the 1070 better not be that much faster, still paying off my damn 970.....


----------



## ShockG (May 6, 2016)

The 1080 is looking mighty impressive here.
Stock clocks beat a GTX 980 G1 Gaming @ 1550MHz (fixed clock) core and 2.1GHz DRAM clock,
which scores ~7060 or so in 3DMark FireStrike Extreme vs ~9,100 for the 1080. Only one PCI-E plug and lower power draw. Rather impressed with this GPU.
Even faster when overclocked, at over P10K, something that was never going to happen unless the 980 was on LN2.
A worthwhile upgrade for 980 owners for sure, and for 980Ti owners wanting to reduce power consumption.


----------



## N3M3515 (May 6, 2016)

TissueBox said:


> No as a matter of fact, I do not remember when node changes meant huge 60% leaps you're referring to. The jump from 65nm (8800 GTX/9800 GTX) to 55nm (GTX 285) was 30%. The jump from 55nm (GTX 285) to 40nm (GTX 480) was 30%. The jump from 40nm (GTX 580) to 28nm (GTX 680) was 15%.
> All the reviews can be found on TechPowerUp.



Oh really?

GTX 280 vs (8800 Ultra / 9800 GTX): that's 60% faster than the 9800 GTX right there; that's solid.
Lol, even the GTX 260 is like 25% faster than the 8800 Ultra, and you're referring to the 9800 GTX, which was even slower. And guess what, the GTX 285 is faster than the GTX 280.
Note: in the chart here on TPU, the 9800 GTX being 72% of a GTX 285 does not mean the GTX 285 is 30% faster. From the perspective of the 9800 GTX, it is 28% slower than a GTX 285; from the perspective of the GTX 285, it is ~40% faster than the 9800 GTX.
And again, look at how the second best of the new gen (GTX 260) is 25% faster than the last gen's fastest (8800 Ultra).
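The baseline point above trips people up constantly: "72% of" and "X% faster" use different reference cards. A minimal sketch (the relative scores are illustrative, not measured):

```python
# If card A scores 72% of card B, "% slower" and "% faster" differ,
# because each figure is computed against a different baseline.
score_a = 72.0   # e.g. 9800 GTX relative score
score_b = 100.0  # e.g. GTX 285 relative score

pct_slower = 1 - score_a / score_b   # A vs B, baseline B -> 28% slower
pct_faster = score_b / score_a - 1   # B vs A, baseline A -> ~39% faster

print(f"A is {pct_slower:.0%} slower than B")
print(f"B is {pct_faster:.0%} faster than A")
```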

GTX 285 vs GTX 480: again, same story, 40% faster.
Again, Nvidia's second best (GTX 470) was faster than the fastest of the previous gen.
EDIT: the GTX 460 was on par with the previous gen's fastest (GTX 285).

Now, GTX 580 vs GTX 680 is an unfair comparison.
And even then, the GTX 680 was 25% faster than the GTX 580, and the GTX 670 by 20%.
EDIT: the GTX 660Ti was on par with the previous gen's fastest (GTX 580).

You are comparing a card that was meant to be a $300 GTX 660 but got renamed to GTX 680 at $500, because AMD had nothing to counter NVIDIA's big full chip.


----------



## Valdas (May 6, 2016)

N3M3515 said:


> ...


If you compare a reference 1080 to a reference 980, you should see the performance leap you're looking for. How much faster is the 980 Ti over the 980? How much faster is an OCed 980 Ti vs a reference 980 Ti?


----------



## Legacy-ZA (May 6, 2016)

Well, I am still waiting for a proper review, so I don't actually want to believe this yet; I really hope it's not true. But if it is: I told everyone this is what they were going to do, and I got berated for it. Not so happy anymore, are we? Enjoy.

I really want this generation to start crunching 4k UHD to bits.


----------



## johnspack (May 6, 2016)

I remember my 480 was a 2.5x improvement over my 285....  so I wouldn't be surprised if an arch change might do that again....


----------



## silentbogo (May 6, 2016)

All this buzz, and no one has even considered that these screenshots might be fake, or not related to the 10-series at all?
I saw the exact same ones about a week or two ago, except there was "Generic VGA" on both of them. No original links to the 3DMark score pages either (though that probably won't make a difference). I had a little theory that this might actually be a Quadro M5000 or something along those lines, but now I'm starting to doubt even that.

But wait...  There's always more:
http://www.3dmark.com/3dm11/11061015


----------



## Prima.Vera (May 6, 2016)

N3M3515 said:


> ... a card that was meant to be $300 GTX 660, but got renamed to GTX 680 at $500 because AMD had nothing to counter the big full chip from NVIDIA.



Oh. Those were good times....NOT! I hope nVidia won't pull that shit again because of AMD's incompetence.


----------



## Caring1 (May 6, 2016)

silentbogo said:


> All this buzz and no one even thought that these screenshots might be fake, or not related to 10-series at all?
> I've seen the exact same ones about a week or two ago, except there was "Generic VGA" on both of them. No original links to 3DMark score pages either (though it won't probably make the difference). I had a little theory that this might actually be a Quadro M5000 or something along these lines, but now I'm starting to doubt even that.
> 
> But wait...  There's always more:
> http://www.3dmark.com/3dm11/11061015


The scores may not be spectacular for the card in that link, but its core and memory clocks seem strange, and it's not quite 8GB of VRAM either.


----------



## silentbogo (May 6, 2016)

That's from the comments on VideoCardz, with the _intentionally sarcastic_ "shocking revelation" that this is a "crippled" 1070.


----------



## Vayra86 (May 6, 2016)

ZoneDymo said:


> the 60 is low end gaming
> the 70 is mid end gaming
> the 80 is high end gaming
> the Ti is a boosted version
> ...



What have you - and a whole herd of other sheep - been smoking the past ten years?

The x70 is mid-range on a budget.
The x80 is overpriced mid-range with mildly better performance (remind yourself of the 670, which could match stock 680 performance with a regular OC, or the 970 doing the same versus a stock 980).
The x80Ti is the big chip, high/top end.

The exception to this rule is ONLY the GTX 780, and that is *only* because the 7xx series was a _Kepler rebrand_, which pushed the 680 into the 770 (just a 680 with faster VRAM). The 7xx series was a strange release because it landed 'between architectures', moving us slowly from Kepler to Maxwell, with the 750Ti as the only Maxwell v1 part.

So, what's new? Exactly nothing, and exactly as I and the rest of the elevated people on this forum (mind you, this is a rather small portion of our forumites, evident in this thread) have been trying to get into your thick, thick skulls.

Move along now.


----------



## RejZoR (May 6, 2016)

cedrac18 said:


> Hmmm is no one expecting better performance when the drivers mature? Not sure why is everyone is flipping out already when these results are not even from proper review.



Drivers play less and less of a role in performance these days, except for SLI/Crossfire profiles. There might be slight changes from per-game optimizations, but nothing that would bring general improvements across all games.

Also, GTX 1070 with 5.5 GB of VRAM instead of 6GB XD


----------



## bug (May 6, 2016)

TissueBox said:


> Really surprised at the community here. The 1080 replaces the 980, and it is on average ~34% faster (assuming GTX 980 Ti @ 1190MHz performance levels). Similar to the 980 reveal, which was about 30% better than the 780 and just barely faster than the 780 Ti (5-10%).
> 
> That's pretty solid. 980 Ti owners should be looking at the 1080 Ti or Pascal Titan as their replacement.
> 
> I'll probably opt for SLI 1080s as it shifts down a price bracket when the 1080 Ti releases.



The problem (as I see it) is that if this is true, driving 4K from a single card is going to remain a no-go for one more generation.
I was really, really hoping that after being stuck at 28nm for all these years, 14nm would at least enable a more respectable jump in performance. Oh, well...


----------



## Vayra86 (May 6, 2016)

bug said:


> The problem (as I see it) is that if this is true, driving 4K from a single card is going to remain a no-go for one more generation.
> I was really, really hoping that after being stuck at 28nm for all these years, 14nm would at least enable a more respectable jump in performance. Oh, well...



This is where it goes wrong every time in people's heads.

Did 14nm bring us a massive performance jump on CPU? Is Skylake 20-30% faster than the 22nm parts? Nope! We gained 10%.
Did 32nm bring us the great jump in performance? No, it was the architecture. Sandy Bridge made it happen, same process as Nehalem. Then we moved to Ivy Bridge on 22nm... and gained 10%.

Did we not see 30%+ performance jumps on GPU in the past years, for similar price parts? Yes we did.

Are we going to see a 30% performance jump this year? Yes, we are. Given that Intel gets to squeeze 10% out of a node shrink, while we get 30% REGARDLESS of node shrinks on GPU, I'd say we have nothing to complain about.

Once again. Move along now


----------



## Prima.Vera (May 6, 2016)

Vayra86 said:


> What have you - and a whole herd of other sheep - been smoking the past ten years? ...
> ... I and the rest of the elevated people on this forum (mind you, this is a rather small portion of our forumites, evident in this thread) have been trying to get into your thick, thick skulls.
> 
> Move along now.



Wow. You are indeed a highly elevated and intelligent human being, throwing out so many insults and considering yourself better and smarter than the others...
Highly elevated indeed...


----------



## Legacy-ZA (May 6, 2016)

Vayra86 said:


> What have you - and a whole herd of other sheep - been smoking the past ten years?
> 
> The x70 is mid range on a budget
> The x80 is overpriced mid range with mildly better performance (please remind yourself of the 670 that could be on 680 stock performance with a regular OC, or the 970 doing the same versus a stock 980)
> ...



I always thought that;

50/Ti was low end
60/Ti was Mid-Range (Budget)
70/Ti was Upper Mid-Range
80/Ti was High End/Top End

In fact, I think a nice article should be made to start putting all this into perspective.

The bars got skewed a lot thanks to the Titan X/Z, which is now the top-end model and which, by the way, replaced the dual-GPU cards, the GTX 590 and GTX 690... To confuse matters more, there are the different amounts of money they are charging for all the different models now.

I am starting to doubt people are actually getting a "free game" with new GPU purchases these days.


----------



## Vayra86 (May 6, 2016)

Prima.Vera said:


> Wow. You are indeed a high elevated and intelligent human being by throwing up so many insults and considering yourself better and smarter than the others...
> High elevated indeed...



Do you also have some relevant input, or do you just want to be butthurt about being told a very real truth? We've been seeing the same comments for years now and nothing has changed, so forgive me if it gets boring and if patterns start to show between some visitors of this forum. Recently, with these 'big leaps' being marketed (yes, marketed, not actually delivered), it seems like the level of stupidity rises with the level of marketing. We just blindly follow, like sheep; hence the comment about the herd.

It's a bit like people being surprised at the sun rising every morning, and then being surprised it goes down again.


----------



## Parn (May 6, 2016)

Considering the move from 28nm to 16nm, which would give the engineers 40% more transistors to play with and better energy efficiency, I would have expected the 1080 to crush the 980Ti. If not, I'll skip this generation.


----------



## ZoneDymo (May 6, 2016)

Vayra86 said:


> Do you also have some relevant input or do you just want to be butthurt about getting told a very real truth? We've been seeing the same comments for years now and nothing has changed, so forgive me if it gets boring and if it starts to show patterns between some visitors of this forum. Recently, with these 'big leaps' that are being marketed (yes, marketed, not actually sold), it seems like the level of stupidity rises with the level of marketing. We just blindly follow, like sheep, hence the comment about herd and sheep.
> 
> It's a bit like people being surprised at the sun rising every morning, and then being surprised it goes down again.



First, you act like you're 12; I mean "butthurt", really? Go back to 4chan if you can't be an adult.
Secondly, it's more about wanting people to wake up and realize they are being milked. I find it amazing that people defend companies/Nvidia in this practice by saying "it's business, they want profit". Yeah, they do, and who is paying for that easy profit? WE ARE!
Why would we stand for that? Why would we continue to buy mediocre upgrades?

If, let's say, Apple brought out a new iPhone that's literally a piece of wood, sure, it would be good profit for them to sell us a piece of wood for 600 dollars; that does not mean we should buy it.
I would be very much inclined to help people "wake up" by informing them that they are buying a piece of wood.

We are not the company; we do not thrive on their profits. In fact, rewarding them for mediocre upgrades hurts us more, because it continually slows down progress in the world of computing.
I would like to be able to do 8K gaming at 200Hz in my lifetime, thank you very much, oh, and beyond would be nice as well....
But nope, instead of making the leaps we want, we take baby steps, and people defend this practice. Well, not me, and maybe I can make others see this as well, which will hopefully cut into the company's profits, which will hopefully spur on some actual progress.

If you want to buy it, go ahead; if you want to defend it by saying "they want to make as much money as possible", go ahead.
For me, that's not a compelling argument for why all this baby-stepping is ok.



Legacy-ZA said:


> I always thought that;
> 
> 50/Ti was low end
> 60/Ti was Mid-Range (Budget)
> ...



It seems to be more about what people's budgets are than where Nvidia actually places their products now.
And yeah, you're right; suddenly this Titan is part of the bracket, yet it's the same people who also claim it's not part of it because it's "meant for more than just gaming". Bit of a having-your-cake-and-eating-it-too kind of thing.

Lastly, about that free game thing, or free anything: never again think anything is added in for free.
You are, in that case, paying for the package: 600 dollars for a card and a game.
They do this, no surprise, as an extra incentive to buy the card; a new game you might want bundled in may sell better for many people (it works better psychologically) than just offering the card with a 60-dollar discount.
It also helps with marketing; suddenly your new game that you want is advertised with this card, and it will "give you the best experience with this kewl new game yo".

You should never see it as getting anything for free; you are paying for both and should wonder whether that is worth it.


----------



## matar (May 6, 2016)

Great, let's hope the GTX 1070 is equal to a GTX 980 Ti.


----------



## rtwjunkie (May 6, 2016)

matar said:


> Great lets Hope the GTX 1070 is equal to a GTX 980 Ti



Except why would it be? Historically, this is not the case. It would exceed the 980, not the 980Ti. Nvidia makes their new models one level higher than the previous gen, or basically equal; that's it.

Just look at the 770 vs the 680, or the 970 vs the 780.


----------



## Legacy-ZA (May 6, 2016)

ZoneDymo said:


> Lastly, that free game thing, or free anything, never ever again think anything is added in for free.
> You are in that case paying for the package, 600 dollars for a card and a game.
> They do this as is no suprise as an extra incentive to buy the card, a new game you might want bundled in might sell better for me people (working better psychologically) then just offering the card with a 60 dollar discount.
> It also helps with marketing, suddenly your new game that you want is advertised with this card and it will "give you the best experience with this kewl new game yo".
> ...



This is exactly my view of the story as well, I am rather glad to see I am not the only one that has noticed this.


----------



## bug (May 6, 2016)

Vayra86 said:


> This is where it goes wrong everytime in people's heads.
> 
> Did 14nm bring us a massive performance jump on CPU? Is Skylake 20-30% faster than 22nm? Nope! We gained 10%.
> Did 32nm bring us the great jump in performance? No it was architecture. Sandy Bridge made it happen, same process as Nehalem. Then we moved to Ivy on 22nm... and gained 10%.
> ...



You're wrong. All of those transitions brought massive performance boosts. But Intel chose to put them all into the iGPU and keep the CPU mostly unchanged. You can't do that on a dGPU, so I was hoping we'd get more performance.
A smaller process means more transistors can fit in the same area. And if more transistors won't give you more performance, why put them there? You would just shrink the die and cut production costs.
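A hedged sketch of why a shrink leaves that choice open. This is the idealized geometric scaling only; node "names" like 28nm and 16nm are partly marketing labels, so real foundry density gains are smaller than this suggests:

```python
# Idealized transistor-density scaling between process nodes.
# Treat the result as an upper bound, not a real foundry figure.
def density_gain(old_nm: float, new_nm: float) -> float:
    """Area per transistor shrinks with the square of the feature size."""
    return (old_nm / new_nm) ** 2

# Going from 28nm to 16nm could in principle triple the transistor budget
# in the same area - or let the vendor keep the transistor count and
# shrink the die to cut cost, as the post above argues.
print(f"28nm -> 16nm ideal density gain: {density_gain(28, 16):.2f}x")
```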


----------



## Tatty_One (May 6, 2016)

bug said:


> You're wrong. All of those transitions brought massive performance boosts. But Intel chose to put them all into the iGPU and keep the CPU mostly unchanged. You can't do that on a dGPU, so I was hoping we'd get more performance.
> A smaller process means more transistors can fit in the same area. And if more transistors won't give you more performance, why put them there? *You would just shrink the die and cut production costs.*



Sadly, this is probably the biggest single motivating factor. I'm not disagreeing with your comments, just suggesting that from a business perspective this is what gives you the margins that often don't exist in a competitive marketplace.


----------



## Frick (May 6, 2016)

Vayra86 said:


> What have you - and a whole herd of other sheep - been smoking the past ten years?
> 
> The x70 is mid range on a budget
> The x80 is overpriced mid range with mildly better performance (please remind yourself of the 670 that could be on 680 stock performance with a regular OC, or the 970 doing the same versus a stock 980)
> ...



You just keep your mighty, erect, throbbing, slobbering brain as far away from my skull as possible dude. You've got the bug in a bad way.


----------



## Prima.Vera (May 6, 2016)

rtwjunkie said:


> Except why would it be? Historically, this is not the case.  It would exceed the 980, not the 980Ti.  Nvidia make theeir new models one level higher than previous gen or basically equal, that's it.
> 
> Just look at 770 to 680; *970 to 780*.



Except that the 970 is 5% faster in most of benches than a *780Ti *


----------



## geon2k2 (May 6, 2016)

According to rumors on the internet, this will be priced somewhere around $700. I've seen estimates from $600 to $800, but seeing that it's better than the 980Ti, I don't expect it to cost less. That never happens in recent history, although it used to be the case some 5 years ago.

Well, with this money you could do so many better things than blowing it on high-end technology.

I know everybody will look for the down-vote button, as there are so many people on this forum who have spent thousands of $ on technology, but currently that's my view. I also see that a 960/380 GPU can easily play everything on the market at decent quality in Full HD, so why spend so much more, just to inflate the bottom line of companies like Nvidia or AMD? Ok, for the latter a bigger pump is required, as somehow they have more holes, and what the consumers manage to pump in is not enough 

And don't get me started on Apple or the latest Android flag-sink-ships. $800 for a phone???? Replacement every 1 or 2 years??? What is this world we live in???? No wonder some are sitting on $200 billion.


----------



## rtwjunkie (May 6, 2016)

Prima.Vera said:


> Except that the 970 is 5% faster in most of benches than a *780Ti *


Negative. I can tell you from personal experience on identical systems: the 780 beat the 970 half the time, and the 970 took the other half.

Just think about that....and how much more potent the 780Ti was than the 780.


----------



## EarthDog (May 6, 2016)

Prima.Vera said:


> Except that the 970 is 5% faster in most of benches than a *780Ti *


Nope.
https://www.techpowerup.com/reviews/MSI/GTX_970_Gaming/27.html

Here is my review: http://www.overclockers.com/gigabyte-gtx-970-extreme-video-card-review/

The 970 worked over a 780, but the 780Ti was faster by a few percent overall.


----------



## Prima.Vera (May 6, 2016)

rtwjunkie said:


> Negative.  I can tell you from personal experience on identical systems, the 780 beat the 970 half the time, and the 970 took the other half.
> 
> Just think about that....and how much more potent a 780Ti was than 780.


Could be, I won't deny it. I was just looking at the tests here on TPU:

(benchmark charts attached)

To be fair, the 780Ti also wins over the 970 in some, mostly older, games...



EarthDog said:


> Nope.
> https://www.techpowerup.com/reviews/MSI/GTX_970_Gaming/27.html
> 
> Here is my review: http://www.overclockers.com/gigabyte-gtx-970-extreme-video-card-review/
> ...



Yep, that was the case before nVidia started screwing with the drivers and halted optimization for new games. Why do I have a feeling the story will repeat itself with the new gen?


----------



## rtwjunkie (May 6, 2016)

Prima.Vera said:


> Why do I have a feeling the story will repeat itself again with the new gen?



You are very likely correct about that! I believe that in about a year and a half (at most), my 980 will receive no more driver optimizations.

But that's pretty much how it always is with drivers. It's a diminishing-returns ratio: most improvements come early in the hardware's life.

In any case, this is fine with me, because I will have upgraded again long before then.


----------



## [XC] Oj101 (May 6, 2016)

Two of these on LN2 beat four 980 Tis on LN2
10 GHz RAM will be a possibility with some of these cards
Frequencies will hit CPU-like speeds in LN2


----------



## EarthDog (May 6, 2016)

[XC] Oj101 said:


> Two of these on LN2 beat four 980 Tis on LN2
> 10 GHz RAM will be a possibility with some of these cards
> Frequencies will hit CPU-like speeds in LN2


Oh? Links please!


----------



## [XC] Oj101 (May 6, 2016)

EarthDog said:


> Oh? Links please!



I am the link


----------



## PP Mguire (May 6, 2016)

9700 Pro said:


> I'd call it high-end and the big-chip model enthusiast. Just like 980 is high-end and 980Ti is enthusiast class.
> 
> If the difference is similar like 980 vs 780Ti, I'd say that there's absolutely no sense to upgrade from 980Ti.


Except I'm not referring to branding, I'm referring to the chip. It's midrange no matter where they put it in the naming/pricing stack.



ZoneDymo said:


> First you act like you are 12, I mean "butthurt", really? go back to 4chan if you cannot be an adult.
> Secondly, its more about wanting people to wake up and realize they are being milked, I find it amazing people defend companies/Nvidia with this practice by saying "its business, the want profit", yeah they do and who is paying for that easy profit? WE ARE!
> Why would we stand for that? why would we continue to buy mediocre upgrades?
> 
> ...


I'm not disagreeing with you, but I'll just say that competition drives performance in this arena. Nvidia and Intel won't get off their ass to really produce something staggering until they have a reason to. It's business 101 not to put all your guns on the table if you can get away with reusing the polished 9mil. In the case of people purchasing, I can't speak for anybody but myself, but I honestly don't give a rat's ass. I sell my top-end lineup, then pick up the latest and greatest for a couple hundred out of pocket. I want the best performance on tap for any game I may want to play (getting rare these days, though). My purchasing or lack thereof isn't going to change Nvidia's release habits. I'd be doing the same on my platform, but we all know how CPU upgrades go, lol. In the end, it's not defending them at all; it's just being realistic. I can get the new cards or not; it doesn't matter to them.


----------



## EarthDog (May 6, 2016)

[XC] Oj101 said:


> I am the link


Waiting for some proof....


----------



## ZoneDymo (May 6, 2016)

PP Mguire said:


> Except I'm not referring to branding, I'm referring to the chip. It's midranged no matter where they put it in the naming/pricing stack.
> 
> I'm not disagreeing with you, but I'll just say that competition drives performance in this arena. Nvidia and Intel won't get off their ass to really produce something staggering until they have a reason to. It's business 101 to not put all your guns on the table if you can get away with reusing the polished 9mil. In the case of people purchasing, I can't speak for anybody else but myself but I honestly don't give a rat's ass. I sell my top end lineup then pickup the latest and greatest for a couple hundred out of pocket. I want the best performance on tap for any game that I may want to play (getting rare these days though). My purchasing or lack thereof isn't going to change Nvidia's release habits. I'd be doing the same on my platform, but we all know how CPU upgrades go lol. In the end it's not defending them at all, it's just being realistic. I can get the new cards or I don't, it doesn't matter to them.



Well, that's kinda where you're wrong; vote with your wallet, so to speak.
If nobody goes and buys their new cards because we stand together and demand better, then yes, better cards will be made; I mean, hell, they do want/need to sell stuff.
I mean, you have the attitude of "I'm but one person, what does it matter if I do something," and if billions of other people have the same mentality, then yeah, nothing will happen.
But if those billions, including you, make a stand and demand better, well, better will come.

"Nvidia and Intel won't get off their ass to really produce something staggering until they have a reason to"

Like people not buying their products anymore; that seems a solid reason to get off their asses, methinks.

And the idea that nobody will join you in this is not what I would describe as realistic, but rather pessimistic.
With such an attitude, a lot of the things we have achieved would never have been achieved, and a lot of the things we strive for now might as well be abandoned.
Pff, electric cars... who wants that...
Renewable energy, pff, why bother, nobody will join in anyway...
A trip to Mars? Eh, nobody will help out with that, just a pipe dream...
Etc., etc.

Sure, change does not happen fast, but spreading the word that you are not happy with sloppy improvements, even if you still buy them because they are unfortunately still the latest and greatest, might go a long way in the long run. Right now you are just a silent sheep buying all this crap.

Lastly:
"I want the best performance on tap for any game that I may want to play"

And that best performance could be so much better. As it is, this GTX 1080 will run a current game like AC Syndicate at 2560x1440 at around 45 fps: a current, already-released game, not even at 4K, on a $500-600 brand-new card, and not even 60 fps.
To each their own, but I find that unacceptable.


----------



## [XC] Oj101 (May 6, 2016)

EarthDog said:


> Waiting for some proof....



Either look at my reputation for releasing info ahead of time, or wait until release.


----------



## EarthDog (May 6, 2016)

I don't know you from a hole in the ground, friend. I'll wait if you won't pony up any proof.


----------



## PP Mguire (May 6, 2016)

ZoneDymo said:


> Well thats kinda where you are wrong, vote with your wallet so to speak.
> If nobody goes and buy their new cards because we stand together and demand better, then yes, better cards will be made, I mean hell they do want/need to sell stuff.
> I mean you have the attitude of, "im but one person, what does it matter if I do something", and if billions of other people have the same mentality then yeah, nothing will happen.
> But if those billions including you make a stand and demand better, well, better will come.
> ...


You're forgetting the massive number of clueless buyers who don't give a shit. That's not pessimistic, that's realistic, and it's where I was coming from. I'd say not even 10% of the buyers of Nvidia chips think like we do or even care; 10% being generous to the cause. That's why I said what I said: I'll go about my daily life not caring what they do and decide whether or not I want to upgrade, because at that scale of people it quite literally doesn't matter. I do believe this was brought up in a couple of other threads too, and it's exactly how I see every thread like this and this very argument. "Speak with your wallet, fellas!" OK, we're an ant-sized group compared to the masses that purchase without a clue and with nary a care for that clue. You don't want to buy it and think it's not enough? Don't. It won't matter to me, anybody else, or Nvidia. You change your mind and buy it? Same thing. It's only a waste of time debating it or caring. Life goes on. I'd be willing to bet that even if the GP104 chip were 50% faster than a 980 Ti, anybody moaning about it still wouldn't pony up on day 1 and would still complain about how Nvidia is screwing us.


----------



## medi01 (May 6, 2016)

EarthDog said:


> Why was that up to the person posting the news?


It's called "journalism". At least, in some countries.


----------



## the54thvoid (May 6, 2016)

ZoneDymo said:


> Well thats kinda where you are wrong, vote with your wallet so to speak.
> If nobody goes and buy their new cards because we stand together and demand better, then yes, better cards will be made, I mean hell they do want/need to sell stuff.
> I mean you have the attitude of, "im but one person, what does it matter if I do something", and if billions of other people have the same mentality then yeah, nothing will happen.
> But if those billions including you make a stand and demand better, well, better will come.
> ...



Expecting a company to try harder because people elect to band together and boycott their product does a few things:
1) The share price crashes.
2) Your pious campaign leads to job layoffs and cuts in R&D.
3) The product actually gets worse.
4) The competition sees the opportunity to make profit for its shareholders and raises its own prices due to market conditions.
5) The slide of the boycotted company continues for years, allowing the competitor to profit even more.

Does that sound familiar? Oh yeah, it's what happened, in a sense, to AMD. You are so outrageously far from business reality it's almost comical. Big business does NOT listen to its consumers; it is dictated to by its shareholders, to whom it owes everything. Shareholders demand a return on investment, and that is acquired through profiteering at our expense. That is capitalism. I don't like it, but I understand it.

The only, and I mean THE ABSOLUTE ONLY, incentive for Nvidia to lower its prices is when AMD has the standout best graphics chip and prices it in such a way that Nvidia will only make sales if it lowers its profit margin.

Stop blaming Nvidia for making profit for its shareholders; blame the absolute lack of high-end competition from AMD. And yes, I am actually very surprised that Fury X didn't claw that back a lot more than it did, because Fury X is a great graphics card, but AMD priced it (initially) at the same price, and hey, guess what: Nvidia didn't need to make their product cheaper. AMD got screwed as soon as they released the HD 7970 at the inflated price they did. That opened Nvidia's floodgate of overpriced 104 chips, made worse by luxury-priced 100 chips.

I'm not having a go at you, Zone, I'm really not. But as admirable as your stance is for better consumer prices, the market reality of capitalist economics doesn't give a shit. Until AMD match Nvidia stride for stride and make their product MORE desirable, Nvidia's prices won't budge.

I studied a module on economics at uni, so I know enough to see the unfortunate picture. I do think that unless Nvidia pulls a rabbit out of the hat, AMD might just start to get a snowball effect if they push the DX12 and GCN message enough. But they also need to push developer adoption of large-queue async compute, because with Pascal's rumoured clock speeds it looks like they might be trying to brute-force async until Vega.


----------



## Prima.Vera (May 6, 2016)

^-- What he said. However, this is starting to feel more and more like price fixing or, worse... a monopoly...


----------



## N3M3515 (May 6, 2016)

Valdas said:


> If you compare reference 1080 to reference 980 then you should be able to see that performance leap you're looking for. How much faster 980 Ti is over 980? How much faster is OCed 980 Ti vs reference 980 Ti?



I compare the latest with the previous fastest.



rtwjunkie said:


> Except why would it be? Historically, this is not the case.  It would exceed the 980, not the 980Ti.  Nvidia make theeir new models one level higher than previous gen or basically equal, that's it.
> 
> Just look at 770 to 680; 970 to 780.



So you take the only exceptions to the rule: the 770 was a refresh of the 680, merely overclocked.
And then you have the boldness to pick the 970, which is on the same node as the 780.

GTX 670 > GTX 580
GTX 470 > GTX 285
GTX 260 > 8800 Ultra
8800GT 256MB > 7900GTX
7800GT > 6800 Ultra
6800GT > 5950 Ultra
5700 Ultra > Ti 4600

See the pattern? The x70 is ALWAYS faster than the previous generation's fastest.


----------



## rtwjunkie (May 6, 2016)

N3M3515 said:


> So you take the only excptions to the rule, 770 was a refresh of the 680, merely oc.
> And then you have the boldness to pick the 970, which is on the same node as the 780.
> 
> GTX 670 > GTX 580
> ...


No, No and No.  

Thank you. That will be all.


----------



## yogurt_21 (May 6, 2016)

truth teller said:


> as expected, i hope this doesnt come as a surprise to you


It's no surprise, considering that editing a 3DMark score to gain notoriety while using an existing card has happened at every launch.


truth teller said:


> again, as expected for gddr5x, not that its gonna do any better than the previous gddr5
> does you car go faster just because you are in a racing circuit?


Umm, no, read what I said: it's well over twice the clock, i.e. a 10 GHz effective memory speed! I told you this is a bogus entry, i.e. not real at all.
http://wccftech.com/micron-gddr5x-memory-analysis-nvidia-pascal-graphic-card/



> the real clock rates of the memory will be the same.




So how do you explain these results showing a 1,290 MHz core clock difference between the two shots, and a 10 GHz effective memory rate, which is over double that of the 980 Ti?

Fun fact: THESE ARE FAKE. Some kid threw whatever numbers they wanted for the specs onto a 980 Ti result.
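For reference, the "effective" memory-speed arithmetic in this exchange can be sketched quickly. This is a minimal illustration, assuming GDDR5's 4x and GDDR5X's 8x effective data-rate multipliers; the 1753 MHz and 1250 MHz real clocks are round illustrative figures, not taken from the leaked entries:

```python
# "Effective" memory speed is the real memory clock times a data-rate
# multiplier; GDDR5X doubles the multiplier, not the physical clock.

def effective_rate_mhz(real_clock_mhz: float, multiplier: int) -> float:
    """Marketing 'effective' rate = real memory clock x data-rate multiplier."""
    return real_clock_mhz * multiplier

# GDDR5 (e.g. a 980 Ti): ~1753 MHz real clock, 4x -> ~7012 MHz ("7 Gbps")
gddr5 = effective_rate_mhz(1753, 4)

# GDDR5X: ~1250 MHz real clock, 8x -> 10000 MHz ("10 Gbps")
gddr5x = effective_rate_mhz(1250, 8)

print(gddr5, gddr5x)   # 7012 10000
```

Per the wccftech piece quoted above, the real clocks stay similar between the two; only the multiplier changes, which is how a 10 GHz effective figure can appear without a doubled physical clock.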


truth teller said:


> dont, you will be sad once the cards are launched and definitive reviews are posted


No, I really won't; these are 980 Tis with the results edited to fool you. Obviously it worked: you took the bait and are hooked right into their gag. Have fun with that.


----------



## the54thvoid (May 6, 2016)

Prima.Vera said:


> ^-- What he said. However, this is starting to feel more and more like a price fixation or worst...monopoly...



<start intercepted transcript>

.........

Lisa Su: "Hi Jen - you want to keep your price at the $599 mark for those new 104 chips?"
JHH: "That was the plan - how's Polaris?"
Lisa Su: "We figure we'll go toe to toe with 980ti so you can drop that 10% and we'll come in at $399"
JHH: "So we have maybe faster chip in GP104 and sell at top price for now, you guys get new Polaris chip snatching at our 980ti sales... sounds good to me."
Lisa Su: "Business as usual.  I just got off the phone to Intel.  They've gone nuts with Broadwell-E.  Said the top part costs $1500 and one kidney..."
JHH: "What you pricing Zen at?"
Lisa Su: "Well, Intel's price is so basket case we think we'll do it at $599 - we should sell a million units in the first week.  We're putting them in the new Nintendo."
JHH: "Ouch, well, we're working on Async on Vega so that'll suit us well, besides GP200 has an async module so at least our guys will be able to get fully functional DX12 in 2017."
Lisa Su: "Just lay off the Gameworks for a bit?"
JHH: "Yeah, sorry babes, that went a bit too far - I blame Ubisoft for that - they bought it hook line and sinker and coded for shit - my bad.  Did you get my Ferrari apology present?"
Lisa Su: "I did, thank you, in my favourite colour too!  Well gotta go, PR's talking silly stuff again - need to reel in Roy."
JHH: "KK, good luck!"

...click...


----------



## okidna (May 6, 2016)

the54thvoid said:


> <start intercepted transcript>
> 
> .........
> 
> ...



Oi Void, that should be *Volta*, not Vega. You'd better edit it before the AMD sheep at *W*ater*C*loset*C*luster*F*uck*tech* take it as a news headline: _"NVIDIA shamelessly steals AMD Vega GPU codename for their next flagship graphics card"_


----------



## [XC] Oj101 (May 6, 2016)

EarthDog said:


> I don't know you from a hole in the ground friend. I'll wait if you won't pony up any proof.



Entirely up to you. For what it's worth, I ran a news site called flyingsuicide.net (now defunct, life got in the way) and was the first person to release accurate GTX Titan performance figures (quoted here http://videocardz.com/39436/nvidia-geforce-gtx-titan-real-performance-revealed and http://www.nordichardware.se/nyheter/geforce-gtx-titan-50-60-kraftfullare-aen-gtx-680.html - I was the author of the non-watermarked bar chart) on 8 February 2013. That was several days before anyone else could say for sure that the X7107 score was fake. At the same time, I confirmed the price at $999.

On OCN (my username there is Oj010), on 24 June 2015 I told people that Fury X would average around 1140 MHz while everybody was hoping for (banking on, believing, whatever) 1300 MHz or more. I also said at the time that voltage would make very little difference; that was months before anyone else.

Also on OCN, on 23 December 2015 I told people that Intel was going to block BCLK overclocking on non-K series CPUs.

I could go on, but those are the easiest verifiable references I can give you off the top of my head.


----------



## N3M3515 (May 6, 2016)

rtwjunkie said:


> No, No and No.
> 
> Thank you. That will be all.



What a smart answer.
Nothing on topic? That's what I thought.


----------



## HumanSmoke (May 6, 2016)

N3M3515 said:


> I compare the latest with the previous fastest.
> 
> 
> 
> ...



That is not exactly an apples-to-apples comparison. You will note that in the past, the second tier SKU has been carved out of the same silicon as the top dog, and for the most part you are dealing with a comparable sized GPU from generation to generation. For example, the G80 (of the 8800GTX/U) compares to the GT200 of the GTX 260, the GT200B of the GTX 285 compares to the GF100 of the GTX 470. This trend is unlikely to continue as architectures bifurcate between gaming-centric and professional usage.

As an aside, your timeline is out of whack:
The succeeding second-tier card following the Ti 4600 was the FX 5800 (in January 2003); the FX 5700U didn't arrive until October.
The succeeding second-tier card following the FX 5950U was the GF 6800 (non-GT) in May 2004; the 6800GT didn't arrive at the second-tier pricing segment until November, when the 6800 (non-GT) moved down to the $299 bracket.
The succeeding second-tier card following the 7900GTX512 was the 8800GTS 640M (G80); the 8800GT 256M didn't arrive until very late in 2007.

Regardless of the hierarchy, your examples work because in the past the succeeding chip has been more complex than the one it replaced:

FX 5800 (NV30, 125m transistors) > Ti 4600 ( NV25 A3, 63m transistors)
GF 6800 (NV41, 222m transistors) > FX 5950U (NV38, 135m transistors)
GF 7800GT (G70/NV47, 302m transistors) > GF 6800U (NV45, 222m transistors)
8800GTS640 (G80, 681m transistors) > 7900GTX512 (G71, 278m transistors)
GTX 260 (GT200, 1400m transistors) > 8800U (G80, 681m transistors)
GTX 470 (GF100, 3100m transistors) > GTX 285 (GT200B, 1400m transistors)
GTX 670 (GK104, 3540m transistors) > GTX 580 (GF110, 3000m transistors)

We are now at a point where this is no longer true. GP104 carries fewer transistors than GM200. So, thanks to increased wafer costs, likely worse yield prediction, and a huge disparity in die area between GM200 and GP104, it is very probable that you can throw out past examples because the rules no longer apply, especially when factoring in salvage parts. And with foundry costs escalating, and GPUs evolving a degree of specialization depending upon market and workload, we probably won't be returning to "the good old days".
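The pattern above, and where it now breaks, can be tabulated as a quick sanity check. A sketch using the transistor counts listed in the post; the GP104 and GM200 figures are the commonly cited ~7,200M and ~8,000M:

```python
# Each historical second-tier chip carried MORE transistors than the previous
# flagship chip it outperformed; GP104 vs GM200 is the first reversal.
# Counts in millions of transistors.

pairs = [
    ("FX 5800 (NV30)",      125,  "Ti 4600 (NV25)",      63),
    ("GF 6800 (NV41)",      222,  "FX 5950U (NV38)",    135),
    ("7800GT (G70)",        302,  "6800U (NV45)",       222),
    ("8800GTS 640 (G80)",   681,  "7900GTX512 (G71)",   278),
    ("GTX 260 (GT200)",    1400,  "8800U (G80)",        681),
    ("GTX 470 (GF100)",    3100,  "GTX 285 (GT200B)",  1400),
    ("GTX 670 (GK104)",    3540,  "GTX 580 (GF110)",   3000),
    ("GTX 1080 (GP104)",   7200,  "GTX 980 Ti (GM200)", 8000),  # trend breaks
]

for new_name, new_t, old_name, old_t in pairs:
    trend = "more" if new_t > old_t else "FEWER"
    print(f"{new_name}: {trend} transistors than {old_name}")
```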


----------



## the54thvoid (May 6, 2016)

okidna said:


> Oi Void, that should be *Volta*, not Vega  You better edit it before AMD sheep at *W*ater*C*loset*C*luster*F*uck*tech* taking it as a news headline : _"NVIDIA shamelessly steal AMD Vega GPU codename for their next flagship graphic card"_



Hell, I'll leave it at Vega - makes it even more of a close relationship.


----------



## rtwjunkie (May 6, 2016)

N3M3515 said:


> What a smart answer.
> Nothing on topic? that's what i thought.



Because it's like arguing with a wall, and I'm smart enough not to do that. Everyone except you and one or two others knows that the 70-series card does not always, and will not this time, beat the previous flagship. Ever since Kepler, whenever the flagship is built on a higher-end chip, the 70-series card cannot beat it. Nor will it this time.

But I'll not argue with the wall anymore.  I will let you be upset when your desires and predictions don't come true...as if any of this is worth getting upset over.


----------



## Dethroy (May 7, 2016)

What a sad day 
I've never been more disappointed in the tpu community than right now...


----------



## Fluffmeister (May 7, 2016)

Dethroy said:


> What a sad day
> I've never been more disappointed in the tpu community than right now...



Don't be sad my friend. 

Random leaks ... straight to sweeping conclusions is what the TPU "community" does.

All we know for sure is GPUs are serious business, and Nvidia are mean and winning and successful, and that is most wrong!


----------



## Prima.Vera (May 7, 2016)

For all the naysayers out here: nVidia just claimed during their GTX 1080 presentation that the card is faster than 2x 980 cards in SLI. Guess how fast the 1070 will be then...? ))))


----------



## truth teller (May 7, 2016)

yogurt_21 said:


> not considering editing a 3dmark score to gain notoriety while using an existing card has happened every launch.
> 
> umm no read what I said, it's well over twice the MHZ! ie 10GHZ effective memory speed! I told you this is a bogus entry. Ie not real at all.
> http://wccftech.com/micron-gddr5x-memory-analysis-nvidia-pascal-graphic-card/
> ...


Well, now that Nvidia's presentation is over, I must ask: was there any factual information in that post of yours? Did I really take the bait? And, perhaps most importantly, did it make you sad?



Prima.Vera said:


> For all no-sayers out here, nVidia just claimed over their GTX 1080 presentation that the card is faster than 2x980 cards in SLI. Guess how fast the 1070 will be then...? ))))


I only heard them mention 2x the performance of a Titan X, which isn't such a feat; hell, even a 290X beats a Titan X.


----------



## okidna (May 7, 2016)

[XC] Oj101 said:


> Two of these on LN2 beat four 980 Tis on LN2
> 10 GHz RAM will be a possibility with some of these cards
> Frequencies will hit CPU-like speeds in LN2



Wow, you're right: 2114 MHz boost clock under stock cooling, so indeed CPU-like frequencies could be achieved under LN2:



HumanSmoke said:


> 2114MHz clock air cooled running the demo. Not too shabby at all.
> 
> 
> 
> ...





Prima.Vera said:


> Guess how fast the 1070 will be then...? ))))



They claimed it to be faster than Titan-X : http://www.techpowerup.com/forums/threads/nvidia-also-announces-faster-than-titan-x-gtx-1070.222280/


----------



## HumanSmoke (May 7, 2016)

Fluffmeister said:


> All we know for sure is GPU's are serious business, and Nvidia are mean and winning and successful and that is most wrong!


After the GTX 1080/1070 presentation, I think it all went up a notch. The wccftech AMD fanboys have just gone to DEFCON 1 (comparing their collective brown-stained pants for Rorschach-test candidacy).

If the minimum guaranteed boost is 1733 MHz ( http://www.geforce.com/hardware/10series/geforce-gtx-1080 ), then that 2114 MHz core / 11000 MHz effective memory augurs well for overclocking and a raft of AIB custom cards.
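For what it's worth, the headroom those two core clocks imply is easy to compute; a trivial sketch, using the numbers quoted above:

```python
# Overclocking headroom: how far the demoed clock sits above the
# guaranteed minimum boost, as a percentage.

def headroom_pct(observed_mhz: float, guaranteed_mhz: float) -> float:
    return (observed_mhz / guaranteed_mhz - 1) * 100

# 2114 MHz demoed on air vs 1733 MHz guaranteed boost
print(round(headroom_pct(2114, 1733), 1))  # 22.0
```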


----------



## Prima.Vera (May 7, 2016)

Yeah, just saw the 1070 article.

So, a 1070 card faster than a 980 Ti for $380...
Not bad at all. This card could even go lower than $350 if AMD pulls two rabbits from its hat.

Exciting times ahead!


----------



## rtwjunkie (May 7, 2016)

Prima.Vera said:


> So 1070 card faster than a 980Ti for 380$...


Check your reading comprehension. Faster than Titan-X, not 980Ti.

SMFH....

The level of wishful thinking a few people on here are exhibiting is astounding.


----------



## Prima.Vera (May 7, 2016)

rtwjunkie said:


> Check your reading comprehension. Faster than Titan-X, not 980Ti.
> 
> SMFH....
> 
> The level of wishful thinking a few people on here are exhibiting is astounding.


Getting butt-hurt all of a sudden?
I know, it takes a lot for one to admit he was wrong in front of everybody. It's simpler just to go forward with the insults and smart-assing.

Cheers!


----------



## rtwjunkie (May 7, 2016)

Prima.Vera said:


> Getting butt-hurt all of the sudden?
> I know, it takes a lot for one to admit that he was wrong in front of everybody. Is simpler just to go forward with the insults and smart-assing
> 
> Cheers!



Wait, where did you admit you were wrong? Because you are on record multiple times saying the 1070 would beat a 980 Ti. Now that that is not the case (as I pointed out to you beforehand that it wouldn't), you are unable to admit your wishful thinking was wrong? Wow.


----------



## Prima.Vera (May 7, 2016)

*@rtwjunkie*

Congrats! You just got awarded the Troll (_I really hope you are trolling and not being stupid_) of the Year award.

Now move along kid, you're bothering me...


----------



## rtwjunkie (May 7, 2016)

Prima.Vera said:


> *@rtwjunkie*
> 
> Congrats! You just got awarded the Troll (_I really hope you are trolling and not being stupid_) of the Year award.
> 
> Now move along kid, you're bothering me...


You're oblivious. I'm way older than you. The only one who has been trolling is you, who has been practically slobbering like a rabid fanboy at the prospect that a 1070 would beat a 980 Ti.

I on the other hand have attempted to be a voice of reason and rational thought.  Only in your Bizzaro-World is the voice of reason labeled a troll.


----------



## N3M3515 (May 7, 2016)

HumanSmoke said:


> That is not exactly an apples-to-apples comparison. You will note that in the past, the second tier SKU has been carved out of the same silicon as the top dog, and for the most part you are dealing with a comparable sized GPU from generation to generation. For example, the G80 (of the 8800GTX/U) compares to the GT200 of the GTX 260, the GT200B of the GTX 285 compares to the GF100 of the GTX 470. This trend is unlikely to continue as architectures bifurcate between gaming-centric and professional usage.
> 
> As an aside, your timeline is out of whack:
> The succeeding second tier card following the Ti 4600 was the FX 5800 (in January 2003). The FX 5700U didn't arrive until October.
> ...



Now that's an excellent explanation.


----------



## EarthDog (May 7, 2016)

[XC] Oj101 said:


> Two of these on LN2 beat four 980 Tis on LN2
> 10 GHz RAM will be a possibility with some of these cards
> Frequencies will hit CPU-like speeds in LN2


----------



## [XC] Oj101 (May 7, 2016)

EarthDog said:


>



See?


----------



## Caring1 (May 7, 2016)

Gotta love TPU: we get intelligent responses, some butt-hurt, and a smattering of smart-arse to lighten the mood.


----------



## Vayra86 (May 7, 2016)

the54thvoid said:


> Expecting a company to try harder because people elect to band together and boycott their product does a few things:
> 1) Share price crash
> 2) Your pious campaign leads to job lay offs and cut's in R&D.
> 3) The product actually gets worse.
> ...



Thank you, sir, for the dose of realism.

People seem to think they live in Utopia sometimes, especially when the hype train for a new GPU release is starting again. It never ceases to amaze me. Just a month ago everyone 'needed HBM2'.


----------



## Prima.Vera (May 7, 2016)

My guess is that HBM2 will be used a little later for the new Titan, and maybe the 1080 Ti (I really hope so).


----------



## Fluffmeister (May 7, 2016)

The GP100 is currently lapping up the HBM2 supply for Nvidia's big-bucks HPC client deals.


----------



## william_homyk (May 8, 2016)

The question is how these benchmarks compare to a 980 Ti. The 27,000 score is from 3DMark 11 (an old 720p test), on which my 980 Ti SC+ scores 26,000, and the Fire Strike Extreme score (1440p) of 10,000 compares to the 9,200 my 980 Ti SC+ got. For reference, my 980 Ti is OC'd to 1,300 MHz from its 1,191 MHz base clock. I'll be generous and go by the stock 980 Ti benchmarks in Fire Strike Extreme (8,500) and 3DMark 11 (24,500). So at best you're getting a 15-20% increase over a stock 980 Ti, and for someone with an overclocked 980 Ti you're getting basically no increase at all.
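Spelling out the percentage comparisons in this post (scores as quoted by the poster; a rough sketch, not a re-benchmark):

```python
# Percent increase of the leaked 1080 scores over 980 Ti scores quoted above.

def pct_gain(new: float, old: float) -> float:
    return (new / old - 1) * 100

# 3DMark 11: leaked 27,000 vs stock 980 Ti 24,500, and vs his OC'd 26,000
print(round(pct_gain(27000, 24500), 1))  # 10.2
print(round(pct_gain(27000, 26000), 1))  # 3.8

# Fire Strike Extreme: leaked 10,000 vs stock 8,500, and vs his 9,200
print(round(pct_gain(10000, 8500), 1))   # 17.6
print(round(pct_gain(10000, 9200), 1))   # 8.7
```

So the gains land at roughly 10-18% over stock and only a few percent over a well-overclocked card, in line with the post's conclusion.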


----------



## EpicGrog (May 10, 2016)

RejZoR said:


> Numbers mean nothing to me if I don't have anything to compare them with. Can someone also post numbers for current gen cards using similar system?



I'm getting 12,168 with 2x GTX 980 SLI on Fire Strike Extreme. That's with a 12-core Xeon @ 4.5 GHz on an Asus ROG X58!


----------



## TheHunter (May 10, 2016)

This is a 980 Ti @ factory boost 1405 MHz:

http://www.3dmark.com/fs/8357945
Graphics Score: 9,340
Physics Score: 13,411
Combined Score: 4,214

Around 700 points (or maybe 2-3 fps) behind that 1850 MHz GTX 1080. If that's the base frequency, then it means it's OC'ed too; I assume so, since both the GTX 780 and 980 Ti read out base frequency.


----------



## medi01 (May 10, 2016)

xvi said:


> From a marketing perspective, they wouldn't want the x80 slot empty in their lineup only to be filled in a while later.


Dude. Fury. 
Ti was to spoil it.


----------



## redeye (May 14, 2016)

I looked up my benchmark:
http://www.3dmark.com/fs/5090497
and if the 1080 benchmark is not bogus... the GTX 1080 will be equivalent to two GTX 980s...

Firestrike Extreme: http://www.3dmark.com/fs/5090798

Interesting... my OC'd Firestrike Ultra bench
http://www.3dmark.com/fs/5311970
is 6240, compared to the "leaked" GTX 1080 OC score of 6232. See the post below...
My non-OC benchmark is 5750: http://www.3dmark.com/fs/5153181

Of course, when you set the render output (resolution to 1080p) in 3DMark Ultra and use a 4K G-Sync monitor (XB280) you can get a boost in the score, to 6156 (GTX 980 SLI stock): http://www.3dmark.com/fs/5153245

I REALLY hope that a 1080 is equal to 980 SLI!... The buttery smoothness of a single card is heaven!
(OTOH, as long as the min fps on a single GTX 980 does not go below 30 fps, SLI is "smooth")


----------



## basco (May 15, 2016)

more benches:

http://videocardz.com/59882/nvidia-geforce-gtx-1080-3dmark-overclocking-performance


----------



## D007 (May 15, 2016)

Frankly, the $600 card should be the Ti, and these people have been ripping us off for far too long. Do not expect results as good as they have been stating; I'm calling bullshit.


----------



## medi01 (May 15, 2016)

D007 said:


> 600 dollar card


Last time I checked it was $699, no less, any time soon.

But that's a fair price for 10x Maxwell. (Although Maxwell was in no way a lesser physicist than Pascal; I'd say rather the opposite.)


----------



## TheHunter (May 16, 2016)

^^
That has nothing to do with the GTX 1080. It applies only to double-precision and FP16 deep-learning CUDA apps. A misleading slide, IMO.


----------



## D007 (May 16, 2016)

medi01 said:


> Last time I checked it was 699$, no less, any time soon.
> 
> But that's a fair price for 10x Maxwell. (Although, Maxwell was in no way lesser physicist, than Pascal, I'd say, rather on the opposite)



What you call "fair" I call price gouging.


----------



## medi01 (May 17, 2016)

Germany EUR 789
ROFLMAO


----------

