Thursday, May 5th 2016

NVIDIA GeForce GTX 1080 Put Through 3DMark

Some of the first 3DMark performance numbers for NVIDIA's upcoming GeForce GTX 1080 graphics card have made it to Futuremark's online database. The results pages hint at samples of the GTX 1080 running on early drivers, on two separate machines (likely from two different sources). The first source, who ran the card on a machine with a Core i7-5820K processor, scored P19005 in 3DMark 11 (performance preset). The second source, who ran the card on a machine with a Core i7-3770K processor, scored 8959 points in 3DMark Fire Strike Extreme. Both scores point to the GTX 1080 being faster than a GTX 980 Ti.
Source: VideoCardz

163 Comments on NVIDIA GeForce GTX 1080 Put Through 3DMark

#76
Frick
Fishfaced Nincompoop
rtwjunkieUm no....it's an upper-midrange, using the GM204 chip which happens to NOT be Nvidia's high end Maxwell chip.

Cost has nothing to do with whether a GPU is high-end, and everything to do with it just being expensive.
It depends on how you look at it, and whether you think "high end" is just one chip or not, but frankly I don't see why you would think that. Is the low end made up solely of the GT 710 because that is the slowest card, and does that make the GT 720 "very-slightly-less-lower-low-end"? It's the second-fastest mainstream card from Nvidia (not counting the Titan). From a market standpoint cost has everything to do with it, because that is how the market is segmented and perceived. But it's all semantics anyway, and I definitely see the various "end"s as ranges.
Posted on Reply
#77
Ruru
S.T.A.R.S.
PP MguireAnd yet others still going on about how this is shit, when in fact a midrange beating a top end is quite good.
I'd call it high-end and the big-chip model enthusiast. Just like the 980 is high-end and the 980 Ti is enthusiast class.

If the difference is similar to 980 vs 780 Ti, I'd say there's absolutely no sense in upgrading from a 980 Ti.
Posted on Reply
#78
N3M3515
TissueBoxIt roughly follows the same increase in performance per generation - Kepler to Kepler refresh (20%-30%), and Kepler refresh to Maxwell (20-40%). As I recall, the switch to 16nm offers a 40% improvement in performance compared to 28nm, or 50% lower power consumption.

It performs ~34% better than the similar card in their previous generation, so yes, I would say it's solid (or perhaps I should say "expected") and in line with what they've been doing.
Maxwell was already spectacular at power consumption.
Do you remember when node changes meant huge leaps, or am I dreaming? What you say is correct for video cards on the same node. Look how many years it took for a node change; at the very least, one should expect a 60% increase in performance from a GTX 960 to a GTX 1060.

Wake up people.
Posted on Reply
#79
sweet
9700 ProI'd call it high-end and the big-chip model enthusiast. Just like 980 is high-end and 980Ti is enthusiast class.

If the difference is similar like 980 vs 780Ti, I'd say that there's absolutely no sense to upgrade from 980Ti.
Don't worry. nVidia's drivers will force you to upgrade soon after Pascal is released. It happened before and will happen again :pimp:
Posted on Reply
#80
TissueBox
N3M3515Maxwell was already spectacular at power consumption.
Do you remember when node changes meant huge leaps, or am I dreaming? What you say is correct for video cards on the same node. Look how many years it took for a node change; at the very least, one should expect a 60% increase in performance from a GTX 960 to a GTX 1060.

Wake up people.
No, as a matter of fact, I do not remember when node changes meant the huge 60% leaps you're referring to. The jump from 65nm (8800 GTX/9800 GTX) to 55nm (GTX 285) was 30%. The jump from 55nm (GTX 285) to 40nm (GTX 480) was 30%. The jump from 40nm (GTX 580) to 28nm (GTX 680) was 15%.

All the reviews can be found on TechPowerUp.
Posted on Reply
#81
johnspack
Here For Good!
Dammit, the 1070 better not be that much faster; still paying off my damn 970.....
Posted on Reply
#82
ShockG
The 1080 is looking mighty impressive here.
Stock clocks beat a GTX 980 G1 Gaming model at 1550MHz (fixed clock) core and 2.1GHz DRAM clock, which scores ~7060 in 3DMark Fire Strike Extreme vs ~9,100 for the 1080. Only one PCI-E plug and lower power draw. Rather impressed with this GPU.
Even faster when overclocked, at over P10K, something that was never going to happen on a 980 unless it was on LN2.
Worthwhile upgrade for 980 owners for sure, and for 980 Ti owners wanting to reduce power consumption.
Posted on Reply
#83
N3M3515
TissueBoxNo, as a matter of fact, I do not remember when node changes meant the huge 60% leaps you're referring to. The jump from 65nm (8800 GTX/9800 GTX) to 55nm (GTX 285) was 30%. The jump from 55nm (GTX 285) to 40nm (GTX 480) was 30%. The jump from 40nm (GTX 580) to 28nm (GTX 680) was 15%.
All the reviews can be found on TechPowerUp.
Oh really?

GTX 280 vs (8800 Ultra / 9800 GTX): that's 60% faster than the 9800 GTX right there; that's solid.
Lol, even the GTX 260 is like 25% faster than the 8800 Ultra, and you are referring to the 9800 GTX, which was even slower. And guess what: the GTX 285 is faster than the GTX 280.
Note: in the chart here on TPU, the 9800 GTX being at 72% of a GTX 285 does not mean the GTX 285 is 28% faster. From the perspective of the 9800 GTX, it is 28% slower than a GTX 285; from the perspective of the GTX 285, it is about 39% faster than the 9800 GTX.
And again, take a look at how the second best from the new gen (GTX 260) is 25% faster than the last gen's fastest (8800 Ultra).
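The asymmetry in that note is easy to verify with a quick sketch (the 72% figure is taken from the TPU chart mentioned above; the exact review numbers may differ):

```python
# Relative-performance asymmetry: if card A scores 72% of card B,
# A is 28% slower than B, but B is roughly 39% faster than A,
# because the two percentages use different baselines.
score_9800gtx = 0.72  # 9800 GTX, normalized to the GTX 285
score_gtx285 = 1.00   # GTX 285 baseline

# From the 9800 GTX's perspective (baseline = GTX 285):
slower_pct = (score_gtx285 - score_9800gtx) / score_gtx285 * 100  # 28.0

# From the GTX 285's perspective (baseline = 9800 GTX):
faster_pct = (score_gtx285 - score_9800gtx) / score_9800gtx * 100  # ~38.9

print(f"9800 GTX: {slower_pct:.1f}% slower; GTX 285: {faster_pct:.1f}% faster")
```

The gap between the two numbers grows as the performance difference grows, which is why "X% slower" and "X% faster" claims about the same pair of cards rarely match.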

GTX 285 vs GTX 480: again, same story, 40% faster.
Again, NVIDIA's second best (GTX 470) was faster than the fastest from the previous gen.
EDIT: the GTX 460 was on par with the previous gen's fastest (GTX 285).

Now, GTX 580 vs GTX 680 is an unfair comparison.
And even then, the GTX 680 was 25% faster than the GTX 580, and the GTX 670 beat it by 20%.
EDIT: the GTX 660 Ti was on par with the previous gen's fastest (GTX 580).

You are comparing against a card that was meant to be a $300 GTX 660 but got renamed to the GTX 680 at $500, because AMD had nothing to counter NVIDIA's big full chip.
Posted on Reply
#84
Valdas
N3M3515...
If you compare a reference 1080 to a reference 980, you should be able to see the performance leap you're looking for. How much faster is the 980 Ti over the 980? How much faster is an OCed 980 Ti vs a reference 980 Ti?
Posted on Reply
#85
Legacy-ZA
Well, I am still waiting for a proper review, so I don't want to believe this yet; I really hope it's not true. But if it is, I told everyone this is what they were gonna do, and I got berated for it. Not so happy anymore, are we? Enjoy.

I really want this generation to start crunching 4k UHD to bits.
Posted on Reply
#86
johnspack
Here For Good!
I remember my 480 was a 2.5x improvement over my 285... so I wouldn't be surprised if an arch change did that again...
Posted on Reply
#87
silentbogo
All this buzz, and no one even considered that these screenshots might be fake, or not related to the 10-series at all?
I've seen the exact same ones about a week or two ago, except there was "Generic VGA" on both of them. No original links to the 3DMark score pages either (though that probably won't make a difference). I had a little theory that this might actually be a Quadro M5000 or something along those lines, but now I'm starting to doubt even that.

But wait... There's always more:
www.3dmark.com/3dm11/11061015
Posted on Reply
#88
Prima.Vera
N3M3515... a card that was meant to be $300 GTX 660, but got renamed to GTX 680 at $500 because AMD had nothing to counter the big full chip from NVIDIA.
Oh. Those were good times....NOT! I hope nVidia wont pull that shit over again because of AMD's incompetence.
Posted on Reply
#89
Caring1
silentbogoAll this buzz, and no one even considered that these screenshots might be fake, or not related to the 10-series at all?
I've seen the exact same ones about a week or two ago, except there was "Generic VGA" on both of them. No original links to the 3DMark score pages either (though that probably won't make a difference). I had a little theory that this might actually be a Quadro M5000 or something along those lines, but now I'm starting to doubt even that.

But wait... There's always more:
www.3dmark.com/3dm11/11061015
The scores may not be spectacular for the card in that link, but its core and memory clocks seem strange, and it's not quite 8 GB of VRAM either.
Posted on Reply
#90
silentbogo
That's from the comments on VideoCardz, with the intentionally sarcastic "shocking revelation" that this is a "crippled" 1070 :banghead: :nutkick::roll:
Posted on Reply
#91
Vayra86
ZoneDymothe 60 is low end gaming
the 70 is mid end gaming
the 80 is high end gaming
the Ti is a boosted version

Titan is overpriced

That's how the nomenclature worked; a GTX 960 Ti would not suddenly be better than the 980 purely because of the Ti suffix.
What have you - and a whole herd of other sheep - been smoking the past ten years?

The x70 is mid-range on a budget
The x80 is overpriced mid-range with mildly better performance (please remind yourself of the 670 that could match stock 680 performance with a regular OC, or the 970 doing the same versus a stock 980)
The x80 Ti is the big chip, high/top end.

The only exception to this rule is the GTX 780, and that is only because the 7xx series was a Kepler rebrand, which pushed the 680 into becoming the 770, which was just a 680 with faster VRAM. The 7xx series was a strange release because it landed 'between architectures', moving us slowly from Kepler to Maxwell, with the 750 Ti as the only Maxwell v1 part.

So, what's new? Exactly nothing, and exactly as I and the rest of the elevated people on this forum (mind you, this is a rather small portion of our forumites, evident in this thread) have been trying to get into your thick, thick skulls.

Move along now.
Posted on Reply
#92
RejZoR
cedrac18Hmmm, is no one expecting better performance once the drivers mature? Not sure why everyone is flipping out already when these results are not even from a proper review.
Drivers play less and less of a role in performance these days, except for SLI/Crossfire profiles. There might be slight changes from per-game optimizations, but nothing that would bring general improvements across all games.

Also, GTX 1070 with 5.5 GB of VRAM instead of 6GB XD
Posted on Reply
#93
bug
TissueBoxReally surprised at the community here. The 1080 replaces the 980, and it is on average ~34% faster (assuming GTX 980 Ti @ 1190MHz performance levels). Similar to the 980 reveal, which was about 30% better than the 780 and just barely faster than the 780 Ti (5-10%).

That's pretty solid. 980 Ti owners should be looking at the 1080 Ti or Pascal Titan as their replacement.

I'll probably opt for SLI 1080s as it shifts down a price bracket when the 1080 Ti releases.
The problem (as I see it) is that if this is true, driving 4K from a single card is going to remain a no-go for one more generation.
I was really, really hoping that after being stuck at 28nm for all these years, the new node would at least enable a more respectable jump in performance. Oh, well...
Posted on Reply
#94
Vayra86
bugThe problem (as I see it) is that if this is true, driving 4K from a single card is going to remain a no-go for one more generation.
I was really, really hoping that after being stuck at 28nm for all these years, the new node would at least enable a more respectable jump in performance. Oh, well...
This is where it goes wrong every time in people's heads.

Did 14nm bring us a massive performance jump on CPUs? Is Skylake 20-30% faster than 22nm parts? Nope! We gained 10%.
Did 32nm bring us the great jump in performance? No, it was architecture. Sandy Bridge made it happen, on the same 32nm process as Westmere. Then we moved to Ivy Bridge on 22nm... and gained 10%.

Did we not see 30%+ performance jumps on GPUs in past years, for similarly priced parts? Yes, we did.

Are we going to see a 30% performance jump this year? Yes, we are. Given that Intel squeezes 10% out of a node shrink while we get 30% regardless of node shrinks on GPUs, I'd say we have nothing to complain about.

Once again. Move along now :)
Posted on Reply
#95
Prima.Vera
Vayra86What have you - and a whole herd of other sheep - been smoking the past ten years? ...
... I and the rest of the elevated people on this forum (mind you, this is a rather small portion of our forumites, evident in this thread) have been trying to get into your thick, thick skulls.

Move along now.
Wow. You are indeed a highly elevated and intelligent human being, throwing out so many insults and considering yourself better and smarter than the others...
Highly elevated indeed...
Posted on Reply
#96
Legacy-ZA
Vayra86What have you - and a whole herd of other sheep - been smoking the past ten years?

The x70 is mid-range on a budget
The x80 is overpriced mid-range with mildly better performance (please remind yourself of the 670 that could match stock 680 performance with a regular OC, or the 970 doing the same versus a stock 980)
The x80 Ti is the big chip, high/top end.

The only exception to this rule is the GTX 780, and that is only because the 7xx series was a Kepler rebrand, which pushed the 680 into becoming the 770, which was just a 680 with faster VRAM. The 7xx series was a strange release because it landed 'between architectures', moving us slowly from Kepler to Maxwell, with the 750 Ti as the only Maxwell v1 part.

So, what's new? Exactly nothing, and exactly as I and the rest of the elevated people on this forum (mind you, this is a rather small portion of our forumites, evident in this thread) have been trying to get into your thick, thick skulls.

Move along now.
I always thought that:

50/Ti was low end
60/Ti was Mid-Range (Budget)
70/Ti was Upper Mid-Range
80/Ti was High End/Top End

In fact, I think a nice article should be made to start putting all this into perspective.

The bars got skewed a lot thanks to the Titan X/Z, which is now the top-end model, by the way replacing the dual-GPU cards, the GTX 590 and GTX 690... To confuse matters more, there are the different amounts of money they are charging for all the different models now.

I am starting to doubt people are actually getting a "Free Game" with new GPU purchases these days.
Posted on Reply
#97
Vayra86
Prima.VeraWow. You are indeed a highly elevated and intelligent human being, throwing out so many insults and considering yourself better and smarter than the others...
Highly elevated indeed...
Do you also have some relevant input, or do you just want to be butthurt about being told a very real truth? We've been seeing the same comments for years now and nothing has changed, so forgive me if it gets boring and if patterns start to show among some visitors of this forum. Recently, with these 'big leaps' being marketed (yes, marketed, not actually sold), it seems like the level of stupidity rises with the level of marketing. We just blindly follow, like sheep; hence the comment about the herd and sheep.

It's a bit like people being surprised at the sun rising every morning, and then being surprised when it goes down again. :banghead:
Posted on Reply
#98
Parn
Considering the move from 28nm to 16nm which would give 40% more transistors to the engineers to play with and better energy efficiency, I would have expected the 1080 to crush 980Ti. If not, I'd skip this generation.
Posted on Reply
#99
ZoneDymo
Vayra86Do you also have some relevant input, or do you just want to be butthurt about being told a very real truth? We've been seeing the same comments for years now and nothing has changed, so forgive me if it gets boring and if patterns start to show among some visitors of this forum. Recently, with these 'big leaps' being marketed (yes, marketed, not actually sold), it seems like the level of stupidity rises with the level of marketing. We just blindly follow, like sheep; hence the comment about the herd and sheep.

It's a bit like people being surprised at the sun rising every morning, and then being surprised when it goes down again. :banghead:
First, you act like you are 12. I mean, "butthurt", really? Go back to 4chan if you cannot be an adult.
Secondly, it's more about wanting people to wake up and realize they are being milked. I find it amazing that people defend companies/Nvidia in this practice by saying "it's business, they want profit". Yeah, they do, and who is paying for that easy profit? WE ARE!
Why would we stand for that? Why would we continue to buy mediocre upgrades?

If, let's say, Apple brought out a new iPhone that's literally a piece of wood, I'm sure it would be good profit for them to sell us a piece of wood for 600 dollars; that does not mean we should buy it.
I would be very much inclined to help people "wake up" by informing them that they are buying a piece of wood.

We are not the company; we do not thrive on their profits. In fact, rewarding them for mediocre upgrades hurts us more, because it continually slows down progress in the world of computing.
I would like to be able to do 8K gaming at 200Hz in my lifetime, thank you very much, and beyond would be nice as well...
But nope, instead of making the leaps we want, we take baby steps, and people defend this practice. Well, not me, and maybe I can make others see this as well, which will hopefully cut into the company's profits, which will hopefully spur on some actual progress.

If you want to buy it, go ahead; if you want to defend it by saying "they want to make as much money as possible", go ahead.
For me, it's not a compelling argument as to why all this baby stepping is OK.
Legacy-ZAI always thought that:

50/Ti was low end
60/Ti was Mid-Range (Budget)
70/Ti was Upper Mid-Range
80/Ti was High End/Top End

In fact, I think a nice article should be made to start putting all this into perspective.

The bars got skewed a lot thanks to the Titan X/Z, which is now the top-end model, by the way replacing the dual-GPU cards, the GTX 590 and GTX 690... To confuse matters more, there are the different amounts of money they are charging for all the different models now.

I am starting to doubt people are actually getting a "Free Game" with new GPU purchases these days.
It seems to be more about what people's budgets are than where Nvidia actually places their products now.
And yeah, you are right: suddenly this Titan is part of the bracket, yet it's the same people who also claim it's not part of it because it's "meant for more than just gaming". A bit of a having-your-cake-and-eating-it-too kind of thing.

Lastly, about that free game thing, or free anything: never, ever again think anything is added in for free.
You are in that case paying for the package: 600 dollars for a card and a game.
They do this, unsurprisingly, as an extra incentive to buy the card; a new game you might want bundled in may sell better for some people (working better psychologically) than just offering the card with a 60 dollar discount.
It also helps with marketing: suddenly the new game you want is advertised with this card, and it will "give you the best experience with this kewl new game yo".

You should never see it as getting anything for free; you are paying for both and should wonder whether that is worth it.
Posted on Reply
#100
matar
Great, let's hope the GTX 1070 is equal to a GTX 980 Ti.
Posted on Reply