# NVIDIA GeForce GTX 1080 8 GB



## W1zzard (May 17, 2016)

NVIDIA's GeForce GTX 1080 was announced recently. Today, we have the first review! Performance is incredible, doubling GTX 970 performance levels. Efficiency is also sky high, nearly doubling everything we've seen from NVIDIA's Maxwell architecture. Our GTX 1080 review compares 10 cards in 16 games at up to 4K resolution.

*Show full review*


----------



## sabre23 (May 17, 2016)

Overpriced shit in India, showing at Rs. 80,000 ($1,180)...

Hopefully Polaris will be available under $350.


----------



## mrthanhnguyen (May 17, 2016)

Wonder why the temp was 67 at the event.


----------



## KarymidoN (May 17, 2016)

Great job from Nvidia. Now AMD had better make a good move, because we need better prices; in some places these GPUs will be badly overpriced.


----------



## FMinus (May 17, 2016)

Are all reviewers instructed by nvidia to bench against stock nvidia cards? I hope you stick to that when benching future AMD cards too.


----------



## Mussels (May 17, 2016)

the power usage is stunning here.

as someone who sees a 50W jump in power just because a webpage has a video ad, that 7W power use for bluray is incredible.


----------



## W1zzard (May 17, 2016)

FMinus said:


> Are all reviewers instructed by nvidia to bench against stock nvidia cards?


No, I've benched like that for about 10 years; you're the first to comment negatively on it.
All comparison cards in the review are reference versions. When no reference version is available, I go out and buy the closest one and clock it at reference clocks.


----------



## KainXS (May 17, 2016)

It looks about as expected, to be honest: roughly GTX 980 SLI performance and 20-30% faster than a 980 Ti.

But it looks like the rumors about the board power limiting overclocking are true (the Founders version will have the better board).

Pretty much as expected nonetheless.

Not bad at all.

edit:

Did not see that the GPU reviewed was the Founders version; looks like non-reference is going to be what to buy.


----------



## Krekeris (May 17, 2016)

Hmmmm, I was going to get an ultrawide FreeSync monitor and something from the upcoming AMD line-up. But I don't know anymore. The GTX 1080 is really impressive, but I still don't want to pay extra for G-Sync crap.


----------



## W1zzard (May 17, 2016)

KainXS said:


> founders version


Added a note to the conclusion that the GPU on FE is not specially selected for overclocking or anything else.


----------



## AlienIsGOD (May 17, 2016)

More excited for the GTX 1070. @W1zzard, will TPU have a 1070 review?


----------



## Fluffmeister (May 17, 2016)

Extremely impressive piece of kit; looks like the 1070 is gonna kick my 980 Ti in the teeth too, and for a fraction of the price and power consumption.

Can't wait to see what the custom AIB partner cards will do.


----------



## General Lee (May 17, 2016)

Fantastic performance, but the cooler and price tag really don't impress. I really hope we get $599 custom AIB cards with the capability to keep those 2100 MHz boost OC clocks Nvidia showcased. Which leads to the question: how did that Nvidia OC demo show such low temps if it was supposedly the reference cooler? Does the cooler operate at such low RPM by default, and was the OC done with 100% fan?

In reality I expect AIBs to price their custom cards for 700€+ here.


----------



## Absolution (May 17, 2016)

AOTS should become standard as part of the testing suite too.

Any way to tell if async is real or Maxwell-style supported?


----------



## The Quim Reaper (May 17, 2016)

Not really worth replacing my two 970s with one of these; looks like that will only come once the 1080 Ti lands. Oh well.


----------



## W1zzard (May 17, 2016)

Absolution said:


> AOTS should become standard as part of the testing suite too


No plans to add Ashes; it feels more like a synthetic benchmark than a game. The next addition will be DOOM.


----------



## the54thvoid (May 17, 2016)

Absolution said:


> AOTS should become standard as part of the testing suite too
> 
> Anyway to tell if async is real or maxwell style supported?



Well, DX12 was used on Tomb Raider and it didn't do well there... no wait, it romped home. DX12 doesn't need massive Async workloads, it's only one part of the API.

But I agree, it would be good to see a heavily AMD biased title tested.


----------



## Luka KLLP (May 17, 2016)

Those efficiency charts make me droooool


----------



## Pandora's Box (May 17, 2016)

mrthanhnguyen said:


> Wonder why the temp was 67 at the event.



They clearly used a frame rate limiter to cap the fps at 61; the card was probably at 60% usage.


----------



## WarEagleAU (May 17, 2016)

Looks pretty good. I'm interested in the 1070 as that's more in my wheelhouse based on price. Think I may go Nvidia this go round.


----------



## droopyRO (May 17, 2016)

Krekeris said:


> GTX1080p is really impressive but still I don`t want to pay extra for G-sync crap.


Welcome to the club: you used to have to choose between two GPU makers; now you also have to choose what monitor you want with your video card.
There goes my idea to SLI two GTX 970s; I will save money and buy a single 1080.


----------



## W1zzard (May 17, 2016)

droopyRO said:


> Welcome to the club, you had to choose from two GPU makers now you have to choose what monitor you want with your video card.


I had2lol


----------



## Absolution (May 17, 2016)

W1zzard said:


> No plans to add Ashes, feels more like a synthetic benchmark than a game. Next addition will be DOOM



Aw, no DX12 runs for Hitman?
Edit: Ah its buggy - reread the thing

Just wanted to see Pascals async compute capabilities. Could be maxwell style implementation with just raw power thrown in.


----------



## neatfeatguy (May 17, 2016)

The jump the 1080 has over the 980 is certainly impressive. It's also impressive that the 1080 (stock) runs around 15-20% faster than my 980 Ti (overclocked) in most games benched here.

I'm curious as to what the 1080 Ti can do. If the performance gap between the 1080 and 1080 Ti is similar to the 980/980 Ti... that will be one heck of a card, and I'm guessing it'll be flirting with an $800 price tag.


----------



## nickbaldwin86 (May 17, 2016)

Would love to see how it stacked up next to SLI 980s.... NV was claiming that the 1080 was going to beat out SLI 980s in the launch media right?


----------



## TheLostSwede (May 17, 2016)

Absolution said:


> AOTS should become standard as part of the testing suite too
> 
> Anyway to tell if async is real or maxwell style supported?


----------



## deemon (May 17, 2016)

Great review! 

Some thoughts...
1) The Fast Sync latency slide does not make sense. How is the VSYNC ON latency so big in this picture? Normally, on a 60 Hz monitor, the maximum should be 2x the VSYNC OFF latency (16.67 ms), so 33.33 ms max. Why does the slide show 88 ms+? Did they assume some 5 pre-rendered frames? Nobody does that; max pre-rendered frames should never be more than 1.
2) The HDR inclusion surprised me, since I had only read about AMD's HDR hype and NVIDIA was rather quiet about it. Glad to see they also decided to add it.
3) I am slightly sad about how they implemented Fast Sync. More precisely, I would have loved another VSYNC "mode" that tries to sync GPU frame output to a fixed-refresh display by predicting when to start rendering the next frame ahead of time, instead of brute-forcing maximum frames out of the GPU.
4) Really sad about no mention from NVIDIA of VESA Adaptive-Sync support. Nvidia must hate us :-(
5) SMP is a cool feature...
6) Ansel is just a gimmick for most of us.
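The arithmetic in point 1 can be sanity-checked with a quick sketch, assuming a simple model in which each pre-rendered frame in the queue adds one refresh interval of delay (the slide's actual measurement methodology is not documented here):

```python
# Worst-case display latency with classic VSYNC ON on a fixed-refresh
# monitor: the frame waits one interval per queued pre-rendered frame,
# plus one interval for the next refresh. Simplified model, not NVIDIA's
# published methodology.

def worst_case_vsync_latency_ms(refresh_hz: float, prerendered_frames: int) -> float:
    frame_ms = 1000.0 / refresh_hz
    return frame_ms * (prerendered_frames + 1)

print(worst_case_vsync_latency_ms(60, 1))  # ~33.3 ms, the 2x figure above
print(worst_case_vsync_latency_ms(60, 4))  # ~83.3 ms, close to the slide's 88 ms+
```

So the slide's number is only plausible if a queue of four or five pre-rendered frames is assumed.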


----------



## DuxCro (May 17, 2016)

Hmmm. Not sure if it's time to upgrade my HD 4850 or if I should wait for a bigger performance jump.  Joke aside, I'll wait to see what AMD has to offer before I upgrade.


----------



## TheLostSwede (May 17, 2016)

Absolution said:


> Aw, no DX12 runs for Hitman?
> 
> Just wanted to see Pascals async compute capabilities. Could be maxwell style implementation with just raw power thrown in.


----------



## W1zzard (May 17, 2016)

Absolution said:


> Aw, no DX12 runs for Hitman?


DX12 was completely broken in Hitman at the time of the rebench (last April). It's apparently better now. I'll look at DX12 again for the next rebench; maybe they'll get more fixes out.


----------



## deemon (May 17, 2016)

TheLostSwede said:


>



This slide shows that 1080 async is as bad/good as it was with the 980 Ti. The 1080 is just ~33% faster at 4K than the 980 Ti... thus the number is ~33% bigger here as well. Did I understand that right?


----------



## Absolution (May 17, 2016)

TheLostSwede said:


>



Any idea what the differences between the Extreme and Crazy quality settings are? Dunno why they picked a lower setting :/


----------



## W1zzard (May 17, 2016)

deemon said:


> I would love, when they added another "mode" to VSYNC, that tries to sync GPU frames output to static sync display by predicting when to start rendering next frame ahead of time instead


FastSync lets the game render as quickly as possible, it buffers the rendered frames and when the monitor starts a new frame it will send the last completed frame from the GPU buffer. This approach requires no game developer support.
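The behavior described above can be sketched in a few lines (an illustrative sketch only, not NVIDIA's actual driver logic):

```python
# Fast Sync in miniature: the game renders at full speed into a buffer
# pool; at each monitor refresh, scanout takes the newest *completed*
# frame, and older unshown frames are simply dropped. No tearing, and
# no back-pressure on the game.

class FastSyncBuffers:
    def __init__(self):
        self.last_complete = None  # newest fully rendered frame

    def frame_completed(self, frame):
        # GPU finished a frame; it replaces (drops) any unshown one
        self.last_complete = frame

    def scanout(self):
        # monitor refresh: always send the latest finished frame
        return self.last_complete

buf = FastSyncBuffers()
for f in ["frame 1", "frame 2", "frame 3"]:  # GPU outruns the display
    buf.frame_completed(f)
print(buf.scanout())  # only "frame 3" is ever shown
```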



----------



## GreiverBlade (May 17, 2016)

And here it is ... the "Founders Edition that costs $100 more". Am I the only one who finds NVIDIA particularly despicable for that? Custom models are (generally) far above this one in terms of cooler and already more expensive than a stock one (everybody knows the price of a custom one will not be $599), so why make the stock one wear the name "Founders" just to add a $100 premium?

On the other hand, the performance is decent, though +36% doesn't warrant a change from my 980 (not even for 4 GB more, nor the 41% in the 4K case).

Oh ... so the fps is "barely acceptable" for 4K? Oh well, at least you only need one card for it.


----------



## chodaboy19 (May 17, 2016)

Thank you for the comprehensive review.

Were there any improvements or changes to G-Sync?


----------



## Frick (May 17, 2016)

So clock for clock it's not much faster than the 980 Ti? It has crazy clocks though...

Now bring the efficiency to mid/low-end cards.

Also, I really like that backplate. Like, a lot.


----------



## $ReaPeR$ (May 17, 2016)

LOL, $700 and still not able to pull 4K @ 60 fps... and with the Ti version coming, I really don't get why someone would buy this card.


----------



## Absolution (May 17, 2016)

The interesting part is it can't run 60 fps @ 4K in most titles; guess the 1080 Ti will be the one for it.


----------



## Mussels (May 17, 2016)

what glorious pricing the australian overlords have given me :/


----------



## GreiverBlade (May 17, 2016)

Mussels said:


> what glorious pricing the australian overlords have given me :/


Don't worry ... Switzerland will be worse ... after all, $699 is the standard price of a custom-model 980, $1,299 being the one for a 980 Ti ...

Funny enough, in the end a 295X2 is more attractive than this one ...


----------



## Legacy-ZA (May 17, 2016)

Well, I can finally say there is a card that can keep most games above 100 FPS for 1080p gaming, and no doubt for a long while yet thanks to its 8 GB VRAM. Although the card is more expensive than I would have liked, I will probably buy it. I plan to stick with 1080p for a long time, that is, until we see 100 FPS+ results for 1440p and 4K gaming. I can't stand anything below 100 FPS; my brain doesn't lag.


----------



## W1zzard (May 17, 2016)

chodaboy19 said:


> Were there any improvements or changes to G-Sync?



NVIDIA did not mention G-Sync at all, so I think it's safe to assume that there were no changes


----------



## GreiverBlade (May 17, 2016)

I noticed the power consumption; well, at least it's on par with the 980. (Not that I care about power consumption, but they promised decreased power consumption, while that's not really the case.)


----------



## Air (May 17, 2016)

Looks like blower-cooler technology has already reached its limits, if for $100 that's all it can do...


----------



## GC_PaNzerFIN (May 17, 2016)

I have never felt this happy being robbed of 1,400 euros. This month's salary incoming... aand it's gone!


----------



## ViperXTR (May 17, 2016)

Looking at the results, I could imagine the 1070 being in Titan X/980 Ti territory (rumors being 1920 cores, ~1.6 GHz boost, 150 W TDP).


----------



## Sempron Guy (May 17, 2016)

Is it just me, or does that Witcher 3: Wild Hunt Nvidia estimate translate to 0.75x faster than the 980?


----------



## rtwjunkie (May 17, 2016)

FMinus said:


> Are all reviewers instructed by nvidia to bench against stock nvidia cards? I hope you stick to that when benching future AMD cards too.



He always does the initial reference version, and then most of the AIB versions that follow.  It's great to have a baseline.


----------



## LightningJR (May 17, 2016)

Great review, Wizz. The power consumption is unreal; even the low-power states like Blu-ray are crazy. I didn't expect anything from this cooler tbh, all fluff from Nvidia... I am sure manufacturers will have amazing coolers and 8+6 pin on some of their cards that'll push the 16 nm Pascal architecture to its limits. Now I wanna see it under water or dry ice.


----------



## medi01 (May 17, 2016)

DX11 vs DX12 for 1080 (in short, no gains)


----------



## Slizzo (May 17, 2016)

Can't wait for board-partner designs that add a 6-pin power connector as well, so we're not power-draw limited when overclocking.


----------



## Suka (May 17, 2016)

Great review, TPU @W1zzard. This is one card I would like to have. Great job, Nvidia. That price though...


----------



## Octopuss (May 17, 2016)

This might be interesting when custom PCB variants (with real coolers) start to appear.


----------



## Steevo (May 17, 2016)

Looks to be a great improvement in process and architecture; it's quite awesome to see it trade blows with and beat the 295X2 given its power draw and features, plus enough brute force to play anything thrown at it.


I almost wish Nvidia could or would have cut out some of the gimmick features like Ansel and VR and released a pure performance GPU, to see what kind of animal they could have created.


----------



## SmokingCrop (May 17, 2016)

W1zzard said:


> NVIDIA did not mention G-Sync at all, so I think it's safe to assume that there were no changes


http://www.eteknix.com/nvidia-reaffirms-plans-support-g-sync/
Nothing new, just a reaffirmation that they are not going with FreeSync.


----------



## W1zzard (May 17, 2016)

Steevo said:


> I almost wish Nvidia could or would have cut out some of the gimmic features likes Ansel and VR and released a pure performance GPU to see what kind of animal they could have created.



Ansel is a pure software feature that will work on older cards, too.

VR improvements like simultaneous multi-projection could end up providing extra performance due to driver optimizations. I doubt NVIDIA is putting in hardware units just to make VR faster, which they know is a small subset of their customers


----------



## iO (May 17, 2016)

Great GPU, but the PCB, cooler, and especially the pricing are just obscene...


----------



## redeye (May 17, 2016)

Forget one GTX 1080; SLI 1080s might just be future-proof.

BTW, with the GTX 1080 being $700... expect a 1080 Ti to be $1,000 and a Titan ZEUS edition (LOL) to be $1,400 US...


----------



## wrathchild_67 (May 17, 2016)

The most surprising thing from this review is how capable the 295x2 still is. I remember that card selling for $500-550 at one point years ago. I remember the Geforce 690 having some amazing staying power for years after its release as well.


----------



## Steevo (May 17, 2016)

W1zzard said:


> Ansel is a pure software feature that will work on older cards, too.
> 
> VR improvements like simultaneous multi-projection could end up providing extra performance due to driver optimizations. I doubt NVIDIA is putting in hardware units just to make VR faster, which they know is a small subset of their customers





Good to know, and the review deserves another read to catch what I *may* have missed while trying to get to the numbers.
It's good to be back on track with process changes that allow for these kinds of gains.


----------



## HD64G (May 17, 2016)

Pandora's Box said:


> they clearly used a frame rate limiter to cap the fps at 61. the card was probably @ 60% usage.


If that's so, we are talking about fraud...

And for anyone who didn't do the comparison to the most recently tested OC'd 980 Ti by W1z:

4K: the 1080 wins by 13-14% on average.
1440p: the 1080 wins by 16-17% on average.

So, imho, not a big jump in performance, only in efficiency. The 1080 Ti will be the real jump in performance. And the 1080 isn't worthy of its FE price, methinks.


----------



## btarunr (May 17, 2016)

Frick said:


> So clock for clock it's not much faster than the 980ti?



Yeah, good luck running a 980 Ti at these clocks.


----------



## LightningJR (May 17, 2016)

mrthanhnguyen said:


> Wonder why the temp was 67 at the event.





General Lee said:


> Fantastic performance, but the cooler and price tag really don't impress. I really hope we get 599$ custom AIB cards with the capability to keep those 2100 MHz Boost OC clocks Nvidia showcased. Which leads to the question, how did that Nvidia OC demo show such low temps if it was supposedly a reference cooler used? Does the cooler operate at such low RPM by default and the OC was done with 100% fan?
> 
> In reality I expect AIBs to price their custom cards for 700€+ here.





Pandora's Box said:


> they clearly used a frame rate limiter to cap the fps at 61. the card was probably @ 60% usage.





HD64G said:


> If that's so, we are talking about a fraud...



Honestly I thought the same, either a frame limiter or vsync, since the fps was hovering at 60-61... But I just saw HWC's review, and it looks like the fan can push very high RPM. He had his fan at 55% and the card overclocked to ~2100 MHz @ 99% usage, getting 58°C... so... yeah...

Actually, a confirmation would be great @W1zzard: are these temps attainable at that overclock with a higher fan speed? How high can the fan speed go? I trust you more.


----------



## W1zzard (May 17, 2016)

wrathchild_67 said:


> The most surprising thing from this review is how capable the 295x2 still is.


Count the number of games that don't show CF scaling.

Then assume Pro Duo is 2x the performance of Fury X when it scales, and 1x when it doesn't scale.

Use the relative-performance-at-4K numbers and you'll have an estimate of how Pro Duo does against the 1080.

Share your math! I've got a Doom Steam key to give away for the best math.
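The recipe above, sketched with made-up inputs (the real per-game CrossFire-scaling count and the Fury X's relative 4K performance have to be read off the review's charts, not this snippet):

```python
# Estimate Pro Duo vs. GTX 1080: assume 2x Fury X in games where
# CrossFire scales and 1x where it doesn't, then weight by game count.
# All numbers below are placeholders, not figures from the review.

def pro_duo_estimate(fury_x_rel_perf: float, games_scaling: int, games_total: int) -> float:
    avg_scale = (games_scaling * 2.0 + (games_total - games_scaling) * 1.0) / games_total
    return fury_x_rel_perf * avg_scale

# e.g. if Fury X sat at 70% of a GTX 1080 at 4K and CF scaled in 12 of 16 games:
print(f"Pro Duo ≈ {pro_duo_estimate(0.70, 12, 16):.2f}x the GTX 1080")
```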


----------



## LightningJR (May 17, 2016)

W1zzard said:


> I got a Doom Steam key to give away for best math



You're lucky I don't like Doom, or I would math the shit outta that.


----------



## wrathchild_67 (May 17, 2016)

W1zzard said:


> Count the number of games that don't show CF scaling.
> 
> Then assume Pro Duo is 2x the performance of Fury X when it scales, and 1x when it doesn't scale.
> 
> ...




I wasn't advising anyone to go out and buy a 295x2. But if you did buy one when they were $500 a few years ago, you basically made one of the best choices for years to come.


----------



## W1zzard (May 17, 2016)

wrathchild_67 said:


> I wasn't advising anyone to go out and buy a 295x2. But if you did buy one when they were $500 a few years ago, you basically made one of the best choices for years to come.


You are right, of course. The 295X2 was a great deal at that pricing, and still is, if it scales.



LightningJR said:


> Actually a confirmation would be great @W1zzard, are these temps attainable at the overclock with a higher fan speed? How high can the fan speed go?


Fan speed at 100% in Furmark at stock = 63°C, so yeah, I think that claim is realistic.


----------



## Ferrum Master (May 17, 2016)

Fury Duo should be only ~15% faster if CFX works.

But... yeah a nice card. A win for nvidia.


----------



## Chaitanya (May 17, 2016)

sabre23 said:


> Overpriced shit in India, showing @ Rs. 80,000 (1180 $) ..
> 
> Hopefully Polaris will be available under 350 $..


That's shitty taxation for you; official pricing is lower, but not by much. Apart from Nvidia's greed in jacking up the price this generation, there's the idiotic pricing for the reference card. I don't think either the 1080 or 1070 will sell as much as the 970/980/Ti did before.


----------



## arnold_al_qadr (May 17, 2016)

@W1zzard, do you get two cards for sli review?


----------



## W1zzard (May 17, 2016)

arnold_al_qadr said:


> @W1zzard, do you get two cards for sli review?


No. I'll try to get a second one from another reviewer, though, but I'm not sure.


----------



## qurotro (May 17, 2016)

my Titan X is still going on and on.....


----------



## sith'ari (May 17, 2016)

> *W1zzard said:*
> This means that with one quick swoop NVIDIA's GeForce GTX 1080 has obsoleted everything in the high-end market.



Exactly that!! I'm simply stunned by the GTX 1080's performance numbers!!


----------



## rootuser123 (May 17, 2016)

@W1zzard Can you upload the GTX 1080 BIOS? I want to analyse it in Maxwell BIOS Tweaker. Thanks.


----------



## medi01 (May 17, 2016)

qurotro said:


> my Titan X is still going on and on.....



But... Do you play VR? I mean, Steam VR?















PS
In other news "VRFuckYouWorks" incoming.


----------



## xkm1948 (May 17, 2016)

Awesome 16 nm final product. Truly amazing performance and power consumption; Nvidia apparently polished it as well as it can be. Great job, team Green! Pushing the technology forward is always great.


----------



## z1tu (May 17, 2016)

medi01 said:


> But... Do you play VR? I mean, Steam VR?



What is your point? I see a difference of only 0.2%.


----------



## HD64G (May 17, 2016)

sew3333 said:


> Hello. So my card is Gigabyte 980 ti Gaming Xtreme boost is 1468mhz. Is any sense to sell this card and buy 1080 gtx or even aftermarket oc 1080 gtx?


In my previous post I posted the calculation results of the stock 1080 vs. your very GPU. So, for only 13-17% better performance, I can confidently say to keep your great GPU until the 1080 Ti or Vega is out. Then you can decide again whether they are worth it.


----------



## Legacy-ZA (May 17, 2016)

W1zzard said:


> Ansel is a pure software feature that will work on older cards, too.


As I understand it, 600 series and up.


----------



## D007 (May 17, 2016)

nickbaldwin86 said:


> Would love to see how it stacked up next to SLI 980s.... NV was claiming that the 1080 was going to beat out SLI 980s in the launch media right?


This is all I care about, and I think this is one of the most important tests for this series; it should really be in there.
As it is, the Guru review is better, due to the lack of 980s here.


----------



## Devon68 (May 17, 2016)

I'm so tempted to get this card especially now that I see it performs better than 2 970's in SLI, but I know it would not make any sense with my current system.


----------



## z1tu (May 17, 2016)

Devon68 said:


> I'm so tempted to get this card especially now that I see it performs better than 2 970's in SLI, but I know it would not make any sense with my current system.


Yeah, looking at your specs, if those are current, I would say upgrading the CPU and adding some RAM first would be best.


----------



## Aquinus (May 17, 2016)

Great review, but I do find it funny that when AMD releases a GPU with 8 GB of VRAM it gets a con for "8 GB VRAM provides no tangible benefits", yet when nVidia does it, suddenly it's not a con anymore. In fact, there is practically no mention of the 8 GB of VRAM beyond the title, intro, and a single bullet in the Pro list in the conclusion. I just find that interesting. Did 8 GB suddenly become worthwhile? Are these games utilizing more than 4 GB on the 1080 when they didn't on other nVidia cards? I might just be reading too much into it, but I always see cons for large amounts of VRAM, and I'm wondering why it's not a con anymore. Very interesting timing...


----------



## Valdas (May 17, 2016)

nickbaldwin86 said:


> Would love to see how it stacked up next to SLI 980s.... NV was claiming that the 1080 was going to beat out SLI 980s in the launch media right?


There are some youtube videos showing that it does outperform 980 SLI in some games but not in all and not on all resolutions.


----------



## Sah7d (May 17, 2016)

You ALL must be blind or just stupid. I mean... GREAT JUMP IN IMPROVEMENT!?

The GTX 980 Ti made the same jump over the GTX 780 Ti, and now that the "improvement" is AGAIN about 20% at best, you are all saying it's a miracle. C'mon, people.
Yes, it's a new card, but it's the same gen after gen; the NEW things it brings are more VR support and less power consumption.


Great review, and still waiting for the GTX 1070, but watching this I would say it hardly beats an SLI of GTX 970s; the GTX 1080 is barely over GTX 970 SLI by 4-10 FPS.


----------



## Casecutter (May 17, 2016)

the54thvoid said:


> But I agree, it would be good to see a heavily AMD biased title tested


Or at least drop some of the non-demanding titles, as so many just punch up the averages.


While it showed some gains at 4K, to me that's not so much from architectural improvements as from the faster clocks and still managing 320 GB/s on a cost-effective 256-bit bus. It packs a punch at lower resolutions, and that's fine, but I see W1zzard's 9.1 score as telling on value: for 1440p it's a lot to pay, while for 4K it might work, although the FE might not be a good purchase. Even buying today as a way to tide one over until larger gains arrive, I don't think you will recoup the premium once the days of Vega and GP100 come. For 1440p this offers excellent FPS, but even then I'd say wait to see the 1070, and to a lesser extent Polaris, because you shouldn't have to approach even $400 anymore to get great 1440p.

As to the fan cooler, I kind of understand, as W1zzard said, "I just hope that this will not tempt custom board manufacturers to go for ultra-low temperatures while ignoring fan noise." I see this cooler stepping back from what the 980/980 Ti could offer, especially knowing how much power/heat this new GPU would need to shed.
I think most AIB customs should have no issue controlling clocks, temperature, and noise, as it seems Nvidia has provided fairly wide headroom in this FE cooler. Honestly, is there any real difference between a 980 Ti reference and the FE? The picture from W1zzard's previous 980 Ti review shows only passing adjustments to PCB components; I'd say the vapor chamber and cooling fins inside are hardly changed, if at all. Does anybody see this cooler as anything more than changes to the shroud's aesthetics? I almost see the FE as a way to hold the reins on what's probably going to be a fairly unsaturated channel for some time: you have the opportunity to pay the "tribute" price, or be stigmatized as one of those who just can't/won't pay up until that $600 card appears in the market.


----------



## farlandprince (May 17, 2016)

I have the feeling that AMD will win this round ...

Any news about their new cards release date?


----------



## lemkeant (May 17, 2016)

Great review. I think the extra $100 for the Founders Edition is just good marketing, but horrible for consumers/gamers.

I just finished struggling through Quantum Break with my 680. Since the patches, that game plays well and is very demanding; it would be cool to see it tested someday.


----------



## Valdas (May 17, 2016)

Sah7d said:


> You ALL must be blind or just stupid I mean.... GREAT JUMP IMPROVEMENT !??
> 
> The GTX980Ti has the same jump with the GTX780Ti and now that the "improvement" is AGAIN about 20% at best you all are saying is a miracle, c´mon people.
> Yes is a new card but is the same gen after gen, the NEW thing that comes was more RV support and less power consumption.
> ...


You do realize that you're comparing a 980 Ti, with 96 ROPs / 8,000M transistors / 2,816 CUDA cores, vs. a 780 Ti at 48 / 7,100M / 2,880, vs. a 1080 at 64 / 7,200M / 2,560. To make a proper comparison you should compare 780/980/1080 or 780 Ti/980 Ti/1080 Ti.


----------



## SandroX (May 17, 2016)

I wonder... why are the results for other graphics cards so different between reviews?

For example, the Titan X here vs. in the review of the Gigabyte GTX 980 Ti XtremeGaming 6GB:

73.4 vs 53.4 ... How come??


----------



## newtekie1 (May 17, 2016)

Fluffmeister said:


> looks like the 1070 is gonna kick my 980 Ti in the teeth too and for a fraction of the price and power consumption.



And everyone said I was crazy to sell off my 980 Ti and 970s a month ago. "They aren't going to go down in value," they said. "The 980 Ti should be just as fast as the GTX 1080," they said...


----------



## erocker (May 17, 2016)

SandroX said:


> I wounder.... why in this reviews results of other graphics cars are so very different?!
> 
> For example Titan X
> 
> ...



Different drivers?


----------



## SandroX (May 17, 2016)

erocker said:


> Different drivers?


Really? A new driver gives +20 fps? Sounds like a joke...


----------



## erocker (May 17, 2016)

No, probably not a joke. Then again, Nvidia has had a few awful driver releases in the past months that could possibly be considered jokes.


----------



## MxPhenom 216 (May 17, 2016)

What a killer card. God damn I wish I had a job this summer instead of taking classes...


----------



## JJJJJamesSZH (May 17, 2016)

I think an EVGA engineer confirmed that the 1080 only supports 2-way SLI?
I think the performance boost comes from the increase in frequency rather than the architecture.


----------



## Ferrum Master (May 17, 2016)

SandroX said:


> 73.4 vs 53.4 ... How come ??



Different RIGS.


----------



## LightningJR (May 17, 2016)

newtekie1 said:


> And everyone said I was crazy to sell off my 980Ti and 970s a month ago.  "They aren't going to go down in value" they said "980Ti should be just as fast as the GTX1080" they said...



idk who this "they" guy is but he's a dick 


The process shrink should have been evidence enough; they should have realized that...


----------



## SandroX (May 17, 2016)

Ferrum Master said:


> Different RIGS.


RIGS? The test systems are the same.


----------



## qubit (May 17, 2016)

Very impressive performance, as expected. Looking forward to what Big Pascal can do and hope it's affordable as that's the one I'd want to buy. It should be very capable indeed at 4K too, sitting well above 60fps especially if quality isn't fully maxed out and the game isn't super-taxing.

It will probably be the MSI Gaming edition, as those coolers are fantastic. I've currently got one in GTX 780 Ti form.


----------



## erocker (May 17, 2016)

SandroX said:


> RIGS? Test systems are same.


W1zzard must hate Titan X. Death to Titan X!!!

Really though, drivers.


----------



## Ferrum Master (May 17, 2016)

SandroX said:


> RIGS? Test systems are same.



No, they are not; they change all the time. Look at the test setup revisions.


----------



## cadaveca (May 17, 2016)

SandroX said:


> I wounder.... why in this reviews results of other graphics cars are so very different?
> 73.4 vs 53.4 ... How come ??



NVidia driver BS so they could claim the 1080 is more than it really is? It's brilliant marketing that NVidia has employed many times over by now. We all know that after some time, drivers pull back performance of the previous generation as driver focus shifts to the newer one.

I personally have noticed retracted performance on my 980: Dirt Rally used to give me an average of 45 FPS with my usual settings @ 2560x1600, but now it's 32-34 FPS. That's a considerable drop.

Dirt Rally has new "Daily" races every single day, with certain tracks repeated over and over again but using different cars. So I play it every day, and I noticed the difference right away since I have the Steam overlay display FPS in all titles.


----------



## SandroX (May 17, 2016)

Ferrum Master said:


> No they are not, they change all the time. Look in test setup revisions.



I did, and that's why I wrote that they are the same.


----------



## SandroX (May 17, 2016)

erocker said:


> Really though, drivers.


Heh, that's sad; impossible to compare them then.


----------



## erocker (May 17, 2016)

Ferrum Master said:


> No they are not, they change all the time. Look in test setup revisions.


In the two examples he provided they're the same. Different drivers. 361.43's were bloody awful for some people, I'm not surprised.


----------



## Ferrum Master (May 17, 2016)

erocker said:


> In the two examples he provided they're the same. Different drivers. 361.43's were bloody awful for some people, I'm not surprised.



You sure mate?

http://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_XtremeGaming/5.html 

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/7.html


----------



## z1tu (May 17, 2016)

cadaveca said:


> NVidia driver BS so they could claim 1080 is more than it really is? It's brilliant marketing that NVidia has employed many times over now. We all know that after some time drivers pull back performance of previous generation as the driver focuses on newer generation?


But how come tho, AMD cards that were performing worse than nvidia cards still perform worse after countless driver updates for older gen nvidia cards?


----------



## erocker (May 17, 2016)

SandroX said:


> Heh, that sad, unpossible to compare them.


No it isn't. If you have the card in question, you can easily find previous drivers on Nvidia's website.


----------



## newtekie1 (May 17, 2016)

erocker said:


> In the two examples he provided they're the same. Different drivers. 361.43's were bloody awful for some people, I'm not surprised.



Test system Rev. 41 vs Rev. 42.  You are right, the hardware stayed the same, but the drivers changed.  But I'm also under the impression that the games are updated and possibly settings adjusted between revisions as well.


----------



## cadaveca (May 17, 2016)

z1tu said:


> But how come tho, AMD cards that were performing worse than nvidia cards still perform worse after countless driver updates for older gen nvidia cards?


Dunno. I just know I see less performance with 980 myself than I did before. Whether this is a marketing tactic, or because they did something to make drivers more stable, I do not know. I also know that W1zz is very reliable in the numbers he posts, and impartial to any brand in reviews.


----------



## rtwjunkie (May 17, 2016)

SandroX said:


> I wounder.... why in this reviews results of other graphics cars are so very different?!
> 
> For example Titan X
> 
> ...



Pretty sure W1zz changed his entire PC out since then.  That's why he retests a lot of old cards each time he does new ones.  That way you CAN compare them.  You can't compare old test vs. new card test.  Hence, why W1zzard goes the extra mile and RE-tests.

*Edit*, nm...looks like his change was before then.

@Aquinus I don't think there's anything weird with W1zzard now saying 8GB is fine.  He's now included games like Arkham Knight (now that it runs pretty well), which will use more than 4GB RAM.  I don't recall that being used prior.  There may be others.


----------



## erocker (May 17, 2016)

Ferrum Master said:


> You sure mate?
> 
> http://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_XtremeGaming/5.html
> 
> https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/7.html


Yeah, that isn't the review he linked.


----------



## z1tu (May 17, 2016)

cadaveca said:


> Dunno. I just know I see less performance with 980 myself than I did before. Whether this is a marketing tactic, or because they did something to make drivers more stable, I do not know.


I can also see that in previous reviews, can't say my 780 has been too affected tho. Other than the fact I can't OC my board anymore because of the heat, it's still going strong on 1080p.


----------



## cadaveca (May 17, 2016)

z1tu said:


> I can also see that in previous reviews, can't say my 780 has been too affected tho. Other than the fact I can't OC my board anymore because of the heat, it's still going strong on 1080p.


I would have to put my 780 back in my rig and test, as I have not used my 780 in some time as it is now in my son's rig, but I do know that I saw the same drop in performance when new gen came out with 780 Ti. When that happened, I got rid of 780 Ti and moved up to 980 SLi. Soon I'll upgrade to 1080 SLi, unless AMD has something to compete with 1080.


----------



## W1zzard (May 17, 2016)

SandroX said:


> I wounder.... why in this reviews results of other graphics cars are so very different?!
> 73.4 vs 53.4 ... How come ??


I think the new bench is running different settings for Witcher. Note VGA Test System Rev. 41, vs. 42.



JJJJJamesSZH said:


> I think a EVGA engineer confirmed that 1080 only supports 2-way SLI?


NVIDIA said several times that 3 and 4-way SLI is possible. They also said that EVGA engineer was misinformed.


----------



## the54thvoid (May 17, 2016)

Umm. Are the past dozen posts all conspiracy crap? Every card is updated from the previous review. Fury X is up from the 40's to the 60's.
All cards are performing better, even the 780 Ti. Maybe W1zz changed a game setting by mistake, because if every card from both brands is significantly better, it's a generic change, not proprietary.


----------



## Moofachuka (May 17, 2016)

W1zzard said:


> I think the new bench is running different settings for Witcher. Note VGA Test System Rev. 41, vs. 42.
> 
> 
> NVIDIA said several times that 3 and 4-way SLI is possible. They also said that EVGA engineer was misinformed.



I read somewhere that the 1080 supports 3-4 way SLI only through a normal bridge. The HB bridge is only for 2-way SLI.


----------



## [502] (May 17, 2016)

This card blows nearly all the competition out of the water, just like Nvidia promised. No need for them to release any Ti version AFAIK. AMD should release an equal card, or Nvidia will milk this card dry, and that's not good for us end consumers.


----------



## phoenixwings (May 17, 2016)

It would be awesome if you guys could post a ROM dump for the GTX 1080 8GB, or upload it to the GPU database. I am really curious how the GPU Boost 3.0 looks like.

Thank you.


----------



## deemon (May 17, 2016)

Legacy-ZA said:


> Well, I can finally say; there is a card that can keep most games above 100FPS for 1080p gaming and no doubt for a long while yet, thanks to it's 8GB VRAM. Although the card is more expensive than what I would have liked, I will probably buy the card. I plan to stick with 1080p for a long time, that is, until we see 100FPS+ results for 1440p and 4k gaming. I can't stand anything below 100FPS, my brain doesn't lag.



and now with the superior SMP you can play games on 3 x 1080p monitors even better than before ... and they still require 25% less GPU power than 4K


----------



## lukart (May 17, 2016)

Crazy how Nvidia manipulates the masses. Announces the card at 599 USD, but you won't find any at that price.
Then you see the temperatures in the presentation, 65C ... and in real life, 80+.

They never change.


----------



## z1tu (May 17, 2016)

lukart said:


> Crazy how nvidia manipulates the masses. Announces the card at 599USD but you won't find any at this price.
> Then you see temperatures in the presentation, 65c .. and real life 80+
> 
> They never change.



The reference card will be 599, aftermarket ones will obviously be higher priced as has always been with nvidia and amd.


----------



## MxPhenom 216 (May 17, 2016)

lukart said:


> Crazy how nvidia manipulates the masses. Announces the card at 599USD but you won't find any at this price.
> Then you see temperatures in the presentation, 65c .. and real life 80+
> 
> They never change.





z1tu said:


> The reference card will be 599, aftermarket ones will obviously be higher priced as has always been with nvidia and amd.




The Founders Edition is the 699 variant, which is basically the reference version, but they wanted to call it something else. Cards from the AIBs that use their own coolers and all that will be around the 599 MSRP.


----------



## z1tu (May 17, 2016)

MxPhenom 216 said:


> The founders edition is the 699 varient, which is basically the reference version, but they wanted to call it something else. Cards from the AIBs that use their own coolers and all that will be around the 599 MSRP.



From what I understood, there will be a reference nvidia card and the founders edition with the "special cooler". Or maybe I got that wrong...


----------



## MxPhenom 216 (May 17, 2016)

z1tu said:


> From what I understood, there will be a reference nvidia card and the founders edition with the "special cooler". Or maybe I got that wrong...




Nope. The Founders Edition is the reference card with the newly designed Titan-looking cooler with all the angles on it. It's really just a reference card with a fancy name.

http://www.theverge.com/circuitbrea...nvidia-founders-edition-geforce-gtx-explained

Official statement from Nvidia:



> "Relative to previous NVIDIA designs, we have upgraded the following areas: The power supply has been upgraded from 4-phases to a 5-phase dual-FET design and tuned for bandwidth, phase balancing and acoustics. We added extra capacitance to our filtering network, and optimised the power delivery network on the PCB for low impedance. As a result, power efficiency increased by roughly 6% compared to the GTX 980, and peak to peak voltage noise was reduced from 209mV to 120mV for improved overclocking.
> 
> "The GTX 1080 uses a vapour chamber cooling, a first for a sub-250W NVIDIA designed graphics card. Finally, we have a new industrial design with a faceted body and a new low profile backplate. The backplate features a removable section to allow better airflow between multiple graphics cards in adjacent SLI configuration."


----------



## rtwjunkie (May 17, 2016)

z1tu said:


> The reference card will be 599, aftermarket ones will obviously be higher priced as has always been with nvidia and amd.



Not this time.  NVIDIA has deliberately priced the reference card (founders edition) higher.  Why, I don't know, since it's a baffling move.


----------



## z1tu (May 17, 2016)

rtwjunkie said:


> Not this time.  NVIDIA has deliberately priced the reference card (founders edition) higher.  Why, I don't know, since it's a baffling move.


Well hopefully partners don't use that as an excuse to jack prices.


----------



## GreiverBlade (May 17, 2016)

just to add

+30-40% more than a 980 is not blowing anything out of the water ...

I expected a little more, but I guess the 1080 is a letdown for me



rtwjunkie said:


> Not this time.  NVIDIA has deliberately priced the reference card (founders edition) higher.  Why, I don't know, since it's a baffling move.


because they just wanted to make it more expensive, aka camouflaging a price raise by adding a fancy "Founders" edition, fooling early buyers into thinking they have a unique piece of art of a cooler, while it's only a reference cooler like the previous one (which received so much praise that Nvidia chose to make people pay for it on the next gen). Pure shady move.



z1tu said:


> Well hopefully partners don't use that as an excuse to jack prices.


they will, and they already are ... for me, even the 980 Ti is on par with the MSRP of a Titan X ... I fear the 1080 and 1080 Ti will be ridiculously high priced ... remember, ~699 was supposed to be the 980 Ti ... naaaahhh nope nope nope ... my 980 Poseidon is already near that price BNIB.

edit: after a quick check I noticed some custom 980 and 980 Ti prices were going down a bit ... a bit, mind you: -1 to 5% of the initial price, that's what I call a reduction


----------



## MxPhenom 216 (May 17, 2016)

rtwjunkie said:


> Not this time.  NVIDIA has deliberately priced the reference card (founders edition) higher.  Why, I don't know, since it's a baffling move.



Seriously I dont understand their train of thought on it. 



z1tu said:


> Well hopefully partners don't use that as an excuse to jack prices.



Time will tell. All I know is my next card will be an MSI Gaming variant.


----------



## z1tu (May 17, 2016)

MxPhenom 216 said:


> Seriously I dont understand their train of thought on it.
> 
> 
> 
> Time will tell. All I know is my next card will be an MSI Gaming variant.



How can you be sure though? I mean, my SuperJetstream was awesome and very good price/performance, but for Maxwell it was very disappointing. I know the Gaming variants have proven very good in the past, but I'll be waiting for reviews before I reach a verdict.


----------



## MxPhenom 216 (May 17, 2016)

z1tu said:


> How can you be sure tho? I mean my SuperJetsream was awesome and it was very good price/performance but for maxwell it was very disappointing. I know the Gaming variants have proven very good in the past but I'll be waiting for reviews until I make a verdict.



I have built like 3 systems with MSI Gaming cards and love them, and those systems were for other people, so now I want one myself.


----------



## mroofie (May 17, 2016)

GreiverBlade said:


> just to add
> 
> *+30-40% more than a 980 is not blowing any other out of the water ...*
> 
> ...



Might want to check those fps figures again 

Bit more than 40% as you claim 

#rip


----------



## medi01 (May 17, 2016)

Comparing to the 980 is wrong to begin with, as the card is priced higher than a 980 Ti; if you check DE (German) pricing, 789 Euros, it's at "980 Ti++" levels.

So in the end you get 20-25% over the 980 Ti, for 10-20% more money.




cadaveca said:


> and impartial to any brand in reviews.





Aquinus said:


> when AMD releases a GPU with 8GB of VRAM and it gets a Con for "8 GB VRAM provides no tangible benefits"


Lovely.


----------



## newconroer (May 17, 2016)

Some improvement, sure, though overall it seems underwhelming and more a showcase of their new (yet questionable whether really useful to anyone?) hardware tech.

Given the power draw, heat output, noise level, fabrication, die shrink and relative performance, I don't see how this reference card is anything special.
Let's hope the third parties turn Nvidia's silly 'market space' approach upside down and give us something worth more for our cash.

Let's also hope that the Ti model actually launches at $699 like the reference 1080; then it may be worth it. Pascal doesn't feel like Pascal without HBM2 anyway.


----------



## HD64G (May 17, 2016)

mroofie said:


> Might want to check those fps figures again
> 
> Bit more than 40% as you claim
> 
> #rip


The GTX 1080 performs 32% better than a stock 980 Ti at 4K and 1440p, and 36% better at 1080p.


----------



## Valdas (May 17, 2016)

medi01 said:


> So in the end you get 20-25% over 980Ti, for 10-20% more.


It's more like 35%-ish over 980 Ti, unless you're comparing it with OCed 980Ti.


----------



## the54thvoid (May 17, 2016)

HD64G said:


> GTX1080 performs 32% better from stock 980Ti at 4K and 1440P and 36% better at 1080P.



Taking the percentages as fps (which is allowed given the 1080 is rated 100): at 4K & 1440p, the 1080 is 100 and the 980 Ti is 73. 100-73=27. 27/73 = 36.98 (37). That means the upshot is the 1080 has 137% of the comparable performance of a 980 Ti (where 100% would be the same, not to be confused with a 100% performance _increase_).

At 1080p the comparable performance is 131.5% (76 versus 100).

Also, as far as the shortsighted comments regarding the 8GB of VRAM suddenly being great: the VR environment has changed the need for VRAM (looking forward). Also, the 8GB on a 390 was (although a slight boost to high-res performance) insignificant for 4K, as the chip itself was not powerful enough to perform at 4K (unless CrossFired).

The Fury X's 4GB is not enough looking forward, but its sheer speed helps it massively.


----------



## dirtyferret (May 17, 2016)

FINALLY!!  A card I can play Hearstone with, The way it's meant to be played.


----------



## ironwolf (May 17, 2016)

This is the 3rd review today I have read for the 1080.  After reading each one my jaw hit the floor with the performance % increases.  Very impressed with the performance.  I'm afraid that my 970 is going to start crying when I get home and mention these reviews.


----------



## GreiverBlade (May 17, 2016)

mroofie said:


> Might want to check those fps figures again
> 
> Bit more than 40% as you claim
> 
> #rip


mmhhh? #rip ... ? a childish hashtag to say "you virtually murdered me"? ...

I see a 30 to 40% increase overall over a stock 980 (talking 1080p and 4K respectively) ... it's still not a good enough card to warrant replacing a 980.

oh wait ... at 4K the 1080 is 41% above the 980 ... my bad ... at 1080p it's 36% ... my bad again ...



 







HD64G said:


> GTX1080 performs 32% better from stock 980Ti at 4K and 1440P and 36% better at 1080P.


you added one Ti too many ... in the charts, the 980 clearly has no Ti behind it ... at least the one that is 36% and 41% under a 1080, since the 980 Ti is 24% under a 1080 (1080p) and 27% at 4K ...


----------



## newconroer (May 17, 2016)

ironwolf said:


> This is the 3rd review today I have read for the 1080.  After reading each one my jaw hit the floor with the performance % increases.  Very impressed with the performance.  I'm afraid that my 970 is going to start crying when I get home and mention these reviews.


Why?

You could get another one and still be in the same performance bracket as the 1080.  The 1080 can't hold 96+ fps at 1440p in a lot of titles (especially non-AAA ones), so what exactly are you gaining other than the additional 4GB VRAM?

970 SLI now, 1080 TI later (maybe).


----------



## fullinfusion (May 17, 2016)

I believe I'm packing up and leaving the red camp...

I never thought I'd ever say that, but after reading this review I'm all in on this bad ass card!

Sorry AMD but ya did it to yourself


----------



## medi01 (May 17, 2016)

Valdas said:


> It's more like 35%-ish over 980 Ti, unless you're comparing it with OCed 980Ti.


Aren't most 980Ti-s overclocked? (and don't they OC much better than what we see 1080 do?)


----------



## radrok (May 17, 2016)

Awesome review 

I guess it's finally time to upgrade...


----------



## z1tu (May 17, 2016)

fullinfusion said:


> I believe I'm packing up and leaving the red camp...
> 
> I never thought I'd ever say that but after reading this review I'm all in on this bas ass card!
> 
> Sorry AMD but ya did it to yourself


I don't understand why you would say that before seeing what AMD have to offer.


----------



## BiggieShady (May 17, 2016)

That Simultaneous Multi-Projection is a big deal ... the point is running the geometry pipeline only once and getting up to 16 projections in a single pass. Possible only on the new Pascal arch; there is an SMP block in the updated PolyMorph Engine in each SM.


----------



## GreiverBlade (May 17, 2016)

also, "Founders Edition" +$100 = $699, and "AIBs will count on a $599 MSRP" my @$$. AIBs will never price their customs lower than the reference, and since the Founders IS the reference model, they will build on a base price of $699; why would they do otherwise?



medi01 said:


> Comparison to 980 is wrong, to begin with, as card is priced higher than 980Ti, and, if check DE pricing, 789 Euros, it's on "980Ti++" levels.
> 
> So in the end you get 20-25% over 980Ti, for 10-20% more.


I will compare the 980 Ti with a 1080 Ti ... and not the other way around. Furthermore, the prices of the two lines are similar where I live, 980 to 1080, though with custom prices coming in at 1200-ish ... indeed, I would pay around $600 more for a 1080 than my actual 980's price ... eh? way advantageous indeed ... (which would place them at a 1080 price = 980 Ti price, for even less performance increase than over a 980).

nope, that card is not the one that will make me change my 980: overpriced (Nvidia, what else), with a questionable "Founders" label to put them at $699 while saying AIBs will build on a $599 MSRP (yeah yeah, that cooler is worth $100 ... right)



fullinfusion said:


> I believe I'm packing up and leaving the red camp...
> 
> I never thought I'd ever say that but after reading this review I'm all in on this bas ass card!
> 
> Sorry AMD but ya did it to yourself


meh... on my side, I am currently waiting on what they will release before saying that ... the 290X/390X are still more than capable cards (so is the 295X2 over the 1080, if we set aside the CFX issues)



z1tu said:


> I don't understand why you would say that before seeing what AMD have to offer.


exactly.


----------



## z1tu (May 17, 2016)

BiggieShady said:


> That Simultaneous Multi-projection is a big deal ... the point is doing the geometry pipeline only once and have up to 16 projection in a single pass. Possible only on new pascal arch, there is a smp block in a new updated polymorph engine in each SM


I recognize some of those words!


----------



## medi01 (May 17, 2016)

fullinfusion said:


> I believe I'm packing up and leaving the red camp...



Team red might offer something for you in October this year at the earliest; since you've got a 290, the 480 will be roughly on par with it.
Although, I'd say, the 1070/1080 is an overpriced flop that is not worth it.




GreiverBlade said:


> i will compare the 980Ti with a 1080Ti ... and not the other way,


Then compare it to R9 380.
After all, it also ends with 80.
Makes as much sense.

Price wise, the 1080 currently sits between the 980 Ti and Titan X; the 980 is a different tier entirely. (789 Euros FFS)


----------



## Champ (May 17, 2016)

Kinda regret limiting myself with a FreeSync monitor now. It'll be another two gens before AMD touches this. They'll always be way behind


----------



## erocker (May 17, 2016)

MxPhenom 216 said:


> Seriously I dont understand their train of thought on it.
> 
> 
> 
> Time will tell. All I know is my next card will be an MSI Gaming variant.


No, through what they've said about it, the "Founders Edition" is the reference card. They release it first at a price premium, then after, they let their partners battle it out amongst themselves. So, they've taken the normally "cheaper" card and made it the more expensive card.

I think I too am going to get one of MSI's 1080's.


----------



## rtwjunkie (May 17, 2016)

z1tu said:


> I don't understand why you would say that before seeing what AMD have to offer.



You took the words out of my mouth.  I'd prefer to await as many findings as possible first.

Really, this is an exciting card, but I will need to see game requirements vastly increase before I move on from my 980, which is quite a capable card in these recent tests, still.  So an upgrade is not currently in the cards.


----------



## GreiverBlade (May 17, 2016)

medi01 said:


> Then compare it to R9 380.
> After all, it also ends with 80.
> Makes as much sense.


clever ... but an R9 380 I would compare to an R9 480, if they ever come out ... and if I actually owned one ...

makes sense ... yep


----------



## medi01 (May 17, 2016)

GreiverBlade said:


> . but a R9 380 i will compare it to a R9 480 if they ever come up ...and if i did effectively own one ...


That will make sense, provided, it's priced accordingly. (which it very likely will be)



Champ said:


> It'll be another two gens before AMD touches this


Lolwhat?
Jeez, and you post on a tech review site, what a shame... =(


----------



## xorbe (May 17, 2016)

Mussels said:


> as someone who sees a 50W jump in power just because a webpage has a video ad, that 7W power use for bluray is incredible.



Also, VirtualBox sessions keep my Titan X spun up.  The 960 (used one for a while; it's in the HTPC now) was nice in that it hurt a lot less this way (heating up my room upstairs).  Reduced power would be a nice reason to swap in a 1080.  I don't really need a new card though ...


----------



## gasolina (May 17, 2016)

Some of the local MSI staff in Vietnam claim that their FE 1080 is "binned", charge around $870 for it, and say it could be overclocked to 2800 MHz under watercooling. I just don't know if that's a fraud or not, since here you can make it to 2060. I can barely believe a non-reference PCB design with an aftermarket cooler/waterblock will do around 2200 MHz; an OC'd base clock around there will be the maximum, I guess. Is there any source claiming they could reach 2500 MHz on some custom design?


----------



## droopyRO (May 17, 2016)

Would it be compatible with an ArcticCooling Xtreme III or IV ?


----------



## GC_PaNzerFIN (May 17, 2016)

GreiverBlade said:


> mmhhh? #rip.... ? childish hashtag to say "you virtually murdered me"? ...
> 
> i see 30 to 40% increase all over a stock 980 ... (talking 1080p and 4k respectively ...) its still not a good enough card to warrant a 980 replacement )
> 
> oh wait ... 4k ... the 1080 is 41% above 980 ... my bad ... 1080P it's 36% ... my bad again ...



GTX 980 is _36% slower_ than  GTX1080 at Full HD res. That means GTX 1080 is _*56% faster*_ than 980.


----------



## m&m's (May 17, 2016)

GreiverBlade said:


> i see 30 to 40% increase all over a stock 980 ... (talking 1080p and 4k respectively ...) its still not a good enough card to warrant a 980 replacement )
> 
> oh wait ... 4k ... the 1080 is 41% above 980 ... my bad ... 1080P it's 36% ... my bad again ...
> View attachment 74688 View attachment 74689



That's mathematically incorrect. You can't do 100-64=36. What you're saying is that the 1080 is 36% faster than 64% of the 980, which doesn't make sense. You have to do a *rule of three*.
100*100/64=156.25%, so 56.25% faster at 1080p compared to the 980.
100*100/59=169.49%, so 69.49% faster at 4K.
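For anyone who wants to double-check the rule of three, here is a minimal Python sketch (the helper name is just for illustration, not from the review):

```python
# Convert a relative-performance chart value (reference card = 100)
# into "the reference card is X% faster": a ratio, not a subtraction.
def percent_faster(baseline: float, reference: float = 100.0) -> float:
    # e.g. a card scored 64 on a chart where the 1080 = 100
    return (reference / baseline - 1.0) * 100.0

print(round(percent_faster(64), 2))  # GTX 980 at 1080p -> 56.25
print(round(percent_faster(59), 2))  # GTX 980 at 4K    -> 69.49
```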


----------



## GC_PaNzerFIN (May 17, 2016)

droopyRO said:


> Would it be compatible with an ArcticCooling Xtreme III or IV ?



Yes, although I would recommend installing some heatsinks directly on the VRM chips too. Not just the silly backplate thing. 

http://videocardz.com/60151/nvidia-geforce-gtx-1080-tested-with-aftermarket-cooler


----------



## the54thvoid (May 17, 2016)

medi01 said:


> Aren't most 980Ti-s overclocked? (and don't they OC much better than what we see 1080 do?)



Yes and no.  Most 980 Tis come overclocked, but the 450 MHz overclock on this amounts to a 28% overclock, versus a 26% like-for-like on the 'stock' 980 Ti (TPU review with W1zzard's base 980 Ti and 1080).  Moreover, the ceiling for non-exotic cooling (LN2 aside) is about 1550 MHz on the 980 Ti.  The rumours suggest that, once power-unlocked and water-cooled, these will reach up to 2.5 GHz.  That will be the factory-overclocked versions with additional OC (again, if rumours are correct).

Partner cards with unlocked power, cooled with a great air solution or water, will clock far higher than a Founders Edition (again, if rumours are to be believed, and tbh, given this review and others hitting 2100+ MHz, I'd be believing them rumours).


----------



## the54thvoid (May 17, 2016)

GC_PaNzerFIN said:


> Yes, although I would recommend installing some heatsinks directly on the VRM chips too. Not just the silly backplate thing.
> 
> http://videocardz.com/60151/nvidia-geforce-gtx-1080-tested-with-aftermarket-cooler



Needs more powah.


----------



## W1zzard (May 17, 2016)

BiggieShady said:


> That Simultaneous Multi-projection is a big deal ... the point is doing the geometry pipeline only once and have up to 16 projection in a single pass. Possible only on new pascal arch, there is a smp block in a new updated polymorph engine in each SM


Just to burst any theory bubbles: SMP can only work from the same viewpoint, so imagine in a 3d shooter being able to use the mouse to look anywhere you want, but you can not walk somewhere else


----------



## Nihilus (May 17, 2016)

m&m's said:


> That's mathematically incorrect. You can't do 100-64=36. What you're saying is the 1080 is 36% faster than the 64% of the 980. Which doesn't make sense. You have to do a* rule of three*.
> 100*100/64=156.25% so 56.25% faster at 1080P compared to the 980.
> 100*100/59=169.49% so 69.49% faster at 4K.





GC_PaNzerFIN said:


> GTX 980 is _36% slower_ than  GTX1080 at Full HD res. That means GTX 1080 is _*56% faster*_ than 980.



Thank you.  Griever's math was making my head hurt.  A 56% improvement at 1080p and a 69% improvement at 4k while getting the same power consumption is NOTHING to sneeze at.  
Price is high now, but it will go down once Vega launches.


----------



## nem.. (May 17, 2016)

13%, what a bargain. (min 23:24)


----------



## GreiverBlade (May 17, 2016)

Nihilus said:


> Thank you.  Griever's math was making my head hurt.  A 56% improvement at 1080p and a 69% improvement at 4k while getting the same power consumption is NOTHING to sneeze at.
> Price is high now, but it will go down once Vega launches.


I'll answer in reverse ... Vega will change nothing in price (unless Vega is a huge leap in performance over the actual line) ... when the Fury came out, no 980/980 Ti prices were lowered ... on the contrary, they went up ... (might be situational depending on the region and country ... ok)


ok, so the numbers in the chart mean nothing if not taken with another calculation in between, I see ... my bad
not everyone is aware of that kind of additional step needed to get the effective numbers ...

my bad for applying a logic to something that requires more steps ... and thanks @m&m's for effectively explaining the issue there (albeit in a manner that does not suit me totally), but at least someone explained it instead of treating me like an idiotic ignorant for not applying any math in that case ... pfeh.

rule of three ... well, it's not really something that you would take into account automatically ... cross multiplication, I see ... sorry, I am an idiot ... thanks for making me realize it

still not making that card worth it though ... since AIBs and retailers will overprice it, thanks Nvidia.



nem.. said:


> 13% what a bargain. min 23:24


not worth it over a 980 Ti indeed ... for a 980 it might be ... though still no games bring a 980 to her knees (ok ok, 4K ... the new "standard" ... yep, that case makes it worth it, and maybe a 144 Hz screen)





medi01 said:


> That will make sense, provided, it's priced accordingly. (which it very likely will be)


well, in that case my 980 to 1080 makes sense ... unless you didn't read about the prices where I live: a 980 is already at the same price as a 1080 "fooled" edition (albeit I forgot the "rule of three")


----------



## Nihilus (May 17, 2016)

^^^ 1080 fps vs 980ti fps

Bench 1: 51.7 vs 45  (8:18)
Bench 2: 98.3 vs 86.3 (8:47)
Bench 3: 129 vs 129 (9:12)  CPU LIMITED!!!!
Bench 4: 49.0 vs 50.3 (dx11) ; 58.5 vs 74.8 (dx12)
Bench 5: 39.0 vs 56.9 (dx11) ; 41.9 vs 56.1 (dx12)
Bench 6: 42.3 vs 56.0 (12:10)
Bench 7: 107 vs 125  (12:23)  cpu limited????
Bench 8: 202 vs 157 (13:04)  um the 980 gets 134 here making the 980ti only 17% faster than the 980.   GARBAGE benchmarks. 
Bench 9: 51.7 vs 68 (13:42)

Where did you guys dig up this burnout?  Ignoring the DX11 Ashes of the Singularity runs and the CPU bottlenecks in benches 3, 7 and 8, it is much bigger than 13%.

Also, is the 980 that much better than a 780 Ti?  In most older games, they are very close.  Why would you expect a 1080 to blow away a 980 Ti?  COMPARE a 1080 to a 980.  What is so hard about that?!


----------



## Kanan (May 17, 2016)

Without reading previous comments (no time):

A custom 980 Ti is barely slower, maybe 3-5% estimated. A custom GTX 1080 could be 15% faster, tops, compared to a custom GTX 980 Ti, but costs a lot more; not worth it. It's just Maxwell on speed; this GPU is a pretty big disappointment. Enthusiast gamers, keep your fingers crossed for Vega10 with HBM2 + 4096 shaders in October, it will easily crush this GPU.


----------



## xorbe (May 18, 2016)

No CPU speeds seem to be disclosed in the video review or the linked article, so I guess stock settings and not pegged at 4.3-4.7 GHz like the typical die-hard enthusiast. That would explain the one bench that was CPU limited.


----------



## kithylin (May 18, 2016)

WHY DO YOU NOT SHOW US THE MINIMUMS! WE NEED MINIMUM NUMBERS!

YOUR ENTIRE REVIEW IS USELESS AND A WASTE OF TIME!


----------



## rtwjunkie (May 18, 2016)

kithylin said:


> WHY DO YOU NOT SHOW US THE MINIMUMS! WE NEED MINIMUM NUMBERS!
> 
> YOUR ENTIRE REVIEW IS USELESS AND A WASTE OF TIME!



Damn... I guess he should change his respected reviews just for your prima donna ass, huh?

Not gonna happen.


----------



## xorbe (May 18, 2016)

kithylin said:


> WHY DO YOU NOT SHOW US THE MINIMUMS! WE NEED MINIMUM NUMBERS!
> 
> YOUR ENTIRE REVIEW IS USELESS AND A WASTE OF TIME!



A single number is useless. Other reviews have provided some proper frame percentile charts.


----------



## Kanan (May 18, 2016)

kithylin said:


> WHY DO YOU NOT SHOW US THE MINIMUMS! WE NEED MINIMUM NUMBERS!
> 
> YOUR ENTIRE REVIEW IS USELESS AND A WASTE OF TIME!


MORE NUMBERS!  Here you go:
http://www.tomshardware.com/reviews/nvidia-geforce-gtx-1080-pascal,4572.html


----------



## Absolution (May 18, 2016)

z1tu said:


> I don't understand why you would say that before seeing what AMD have to offer.



Exactly. I had plans to go GTX 1080 (from the Nano), but given the prices (in AU) and the not-so-great improvement in AMD titles, I think I'm going to wait a bit. The same seems to apply when AMD hardware runs slower in Nvidia-backed games. I really wish there was a better way to measure performance.

Basically, the whole Nvidia vs. AMD thing is going to pan out over which vendor is backing the most titles, heh...


----------



## Kanan (May 18, 2016)

Absolution said:


> Exactly. I had plans to go GTX 1080 (from the Nano), but given the prices (in AU) and the not-so-great improvement in AMD titles, I think I'm going to wait a bit. The same seems to apply when AMD hardware runs slower in Nvidia-backed games. I really wish there was a better way to measure performance.
> 
> Basically, the whole Nvidia vs. AMD thing is going to pan out over which vendor is backing the most titles, heh...


Wait for the Vega10 GPU, the 1st 14nm true enthusiast GPU, coming to you in October.  Jokes aside, I think the Nano is still very good, but if you need something better, wait for Vega.


----------



## natr0n (May 18, 2016)

Nice card very impressive stuff.

Only thing that kinda bothers me it's unnecessarily long.


----------



## Fluffmeister (May 18, 2016)

I feel sorry for AMD, even their own fanboys compare this to the 980 Ti.


----------



## buggalugs (May 18, 2016)

Fluffmeister said:


> I feel sorry for AMD, even their own fanboys compare this to the 980 Ti.



With every generation we compare the new high end with last gen's high-end single-GPU card. It seems like a reasonable card, but at around 27% better performance, even less if you compare to last gen's non-reference/overclocked cards, it's not groundbreaking. To me, this is the bare minimum improvement to expect from a new process node. Personally I would have expected at least 50% improvement over the 980 Ti/Fury X. They obviously left plenty in reserve for a refresh or two and the Ti versions.

The kind of people interested in spending the money on this card are 980 Ti and Fury X owners, and there's not a huge incentive to upgrade. With the temp situation, even less so; I would wait for a non-reference version.

It looks like GPUs are going the same way as CPUs at smaller process nodes: super low voltage at stock, but with any overclocking, temps rise really fast.


----------



## Kanan (May 18, 2016)

buggalugs said:


> With every generation we compare the new high end with last gen's high-end single-GPU card. It seems like a reasonable card, but at around 27% better performance, even less if you compare to last gen's non-reference/overclocked cards, it's not groundbreaking. To me, this is the bare minimum improvement to expect from a new process node. Personally I would have expected at least 50% improvement over the 980 Ti/Fury X. They obviously left plenty in reserve for a refresh or two and the Ti versions.


The problem with Pascal is that it's a compute GPU architecture, whereas Maxwell was a pure gaming architecture; that's why it's not such a big jump (resources wasted on transistors that aren't helpful for games).


----------



## Fluffmeister (May 18, 2016)

The problem is this isn't high-end Pascal, the fact that it beats everything at the moment is just the inconvenient truth and they are charging accordingly.

We all look forward to AMD's response.


----------



## Caring1 (May 18, 2016)

As a negative, I would have listed the back plate.
Being two pieces doesn't give it the needed support; in fact, the removable section is dead weight which may add to the card's distortion as time goes on.
The high temperatures under load may be a reason for the power limit, hopefully something remedied by the partners as they release cards.
AMD, you almost had it right with the Fury X, but failed to implement HBM2 in time.


----------



## rtwjunkie (May 18, 2016)

Caring1 said:


> As a negative, I would have listed the back plate.
> Being two pieces doesn't give it the needed support; in fact, the removable section is dead weight which may add to the card's distortion as time goes on.
> The high temperatures under load may be a reason for the power limit, hopefully something remedied by the partners as they release cards.
> AMD, you almost had it right with the Fury X, but failed to implement HBM2 in time.



Actually, that metal shroud is quite strong and fitted firmly to the pcb, just like the non-stealth-angled one on the 780 and Titan before.  It is quite enough support, trust me.  On that Founder's edition, the back plate serves no purpose but looks.


----------



## Caring1 (May 18, 2016)

rtwjunkie said:


> Actually, that metal shroud is quite strong and fitted firmly to the pcb, just like the non-stealth-angled one on the 780 and Titan before.  It is quite enough support, trust me.


I thought I had read that the fan shroud was plastic, nice if it is an alloy of some kind.


----------



## Frick (May 18, 2016)

cadaveca said:


> NVidia driver BS so they could claim 1080 is more than it really is? It's brilliant marketing that NVidia has employed many times over now. We all know that after some time drivers pull back performance of previous generation as the driver focuses on newer generation?
> 
> I personally have noticed retracted performance on my 980; Dirt Rally used to give me average 45 FPS with my used settings @ 2560X1600, but now, it's 32-34 FPS. That's a considerable drop.
> 
> Dirt Rally has new "Daily" races every single day, with certain tracks repeated over and over again, but using different cars. So I play it every day, noticed the difference right away since I have STEAM overlay display FPS in all titles.



If this is factually correct and verified as such, why isn't the entire Internet boiling about it, and why aren't the lawsuits flying?


----------



## rtwjunkie (May 18, 2016)

Caring1 said:


> I thought I had read that the fan shroud was plastic, nice if it is an alloy of some kind.



Hmm, I will have to go re-read! 
If they cheaped out, instead of the alloy shroud they used to use, that would make the $100 premium even less explicable.


----------



## Caring1 (May 18, 2016)

Frick said:


> If this is factually correct and verified as such, why isn't the entire Internet boiling about it, and why aren't the lawsuits flying?


I think it is because they can justify the changes in drivers and refer to older cards as legacy, no longer supported by new drivers.
Users can revert to known working drivers, or "upgrade" their GPU, which is a win-win for the manufacturer.


----------



## nem.. (May 18, 2016)




----------



## cadaveca (May 18, 2016)

Frick said:


> If this is factually correct and verified as such, why isn't the entire Internet boiling about it, and why aren't the lawsuits flying?


I dunno?  Because you can restore past performance by changing back to older drivers? I asked questions, and made one statement about one titles performance. Not the first time new drivers released for upcoming titles affect other things.


----------



## Xavier Gonzalez (May 18, 2016)

I think the 295x2 is a lot more impressive, to be honest. That thing is holding up well!

Any actual benchmarks with the R9 Pro Duo in the future? I expect it to be better than the 1080 if the 295x2 is holding up this well.



Xavier Gonzalez said:


> I think the 295x2 is a lot more impressive, to be honest. That thing is holding up well!
> 
> Any actual benchmarks with the R9 Pro Duo in the future? I expect it to be better than the 1080 if the 295x2 is holding up this well.



And not trying to take away from the 1080. Certainly a huge performance upgrade. Let's see what AMD can do in response.


----------



## Frick (May 18, 2016)

cadaveca said:


> I dunno?  Because you can restore past performance by changing back to older drivers? I asked questions, and made one statement about one titles performance. Not the first time new drivers released for upcoming titles affect other things.



You said they've done it many times. Dropping old cards from new drivers is one thing, but if new drivers intentionally cripple cards that aren't even old... that's a different thing entirely.


----------



## Valdas (May 18, 2016)

medi01 said:


> Then compare it to R9 380.
> After all, it also ends with 80.
> Makes as much sense.
> 
> Price wise, 1080 currently sits between 980Ti and Titanium X, 980 is a different tier entirely. (789 Euros FFS)


Nobody is forcing you to buy Founders Edition. Wait a few weeks and get a cheaper custom card. It's not like every 1080 will be priced at $699 and above.


----------



## the54thvoid (May 18, 2016)

Kanan said:


> Wait for the Vega10 GPU, the 1st 14nm true enthusiast GPU, coming to you in October.  Jokes aside, I think the Nano is still very good, but if you need something better, wait for Vega.



Vega is 5 months out at a minimum. The card now is the 1080. 5 months after Vega you get daddy Pascal (almost twice the chip), which may well be GP102, tweaked to be more gaming-centric. The big Pascal chip, if it clocks similarly to the 1080, will obliterate anything before it.
Worse still, from your standpoint: 5 months after Vega 10 you will also have the full Vega chip.
Recommending a half-year wait for Vega 10 just ignores tech market conditions. If you wait like that, you may as well wait for the 'proper' Vega chip.

For now, the FE 1080 is, by reviewer conclusions, an incredible feat, but too expensive. If you can get partner OC and power-relaxed cards for the same or a lower price, then you'll have a ridiculously fast piece of kit.
In fact, rather than wait for Vega 10, people ought to just wait until June for the partner cards. I'll wait for EK to release some custom partner blocks too.


----------



## ViperXTR (May 18, 2016)

Then comes the Pascal refresh at 14nm: GTX 1085, 1075, etc.


----------



## geon2k2 (May 18, 2016)

I've always supported the underdog for the sake of competition, so that's AMD, but this is like OM(F)G.
Incredibly fast, incredibly efficient. Well done, nVidia. Too bad I still sit on a 1080p monitor ... and this doesn't make much sense for me, but maybe the 1060 ... will.


----------



## RejZoR (May 18, 2016)

I have to say, at 1080p (which is what I use) it's basically 40% faster than the GTX 980. That's quite significant. Also the overclock scaling: in a given game, OC gave the user 20 additional frames per second. That's a lot. It may vary depending on the game, but still.

Btw, the GPU Boost thingie is actually a legit cheat tool. Look at it from a different perspective: in the actual games people play, the card will always heat up so much that the clock drops quite a lot, meaning you're not really getting the performance advertised in a card review, where short benchmarks shoot up the clocks, the card is benchmarked at those clocks, and it never gets stressed long enough for GPU Boost to drastically downclock the GPU. And between different games, it has time to cool down enough to boost up again for the next testing cycle.

I think reviewers should pre-heat the graphics card to its normal operating temperature and then benchmark, to show users the actual performance they'll be getting from the card. This is especially important considering how far Pascal is pushing the clocks and how that affects performance. Sure, it looks great in a review, and that 40% boost over the GTX 980 is amazing, but is it really 40% in actual gaming when you play for an hour or two and the card heats up properly? I think it's not. And that is important, imo.


----------



## Assimilator (May 18, 2016)

Aquinus said:


> Great review but, I do find it funny that when AMD releases a GPU with 8GB of VRAM and it gets a Con for "8 GB VRAM provides no tangible benefits" but, when nVidia does it, suddenly it's not a con anymore. In fact there is practically no mention of the 8GB of VRAM beyond the title, intro, and a single bullet in the Pro list in the conclusion. I just find that interesting. Did 8GB suddenly become worth while? Are these games utilizing more than 4GB with the 1080 when it didn't with other nVidia cards? I might just be looking into it too much but, I always see cons for large amounts of VRAM and I'm wondering why it's not a con anymore. It's very interesting timing...



AMD cards with 8GB of VRAM were all using Hawaii, which doesn't have enough horsepower to drive the higher resolutions/VR that would actually use that extra VRAM. Pascal does, hence why the 8GB is actually useful.


----------



## BiggieShady (May 18, 2016)

W1zzard said:


> Just to burst any theory bubbles: SMP can only work from the same viewpoint, so imagine in a 3d shooter being able to use the mouse to look anywhere you want, but you can not walk somewhere else


They are using SMP for simultaneous stereo in VR, and each eye has a different position: one eye has 4 projections from one position and the other eye 4 projections from another, for a total of 8 projections.
If they are really imposing restrictions on the projection origin, it's obvious from the VR example that it's not because it's required.
However, complete and total freedom with projections could mess up the ability to partially load the game world and to use level of detail and algorithms like frustum culling or occlusion culling, but IMO that's not a reason to disable it.


----------



## cadaveca (May 18, 2016)

Frick said:


> You said they've done it many times. Dropping old cards from new drivers is one thing, but if new drivers intentionally cripples cards that aren't even old... That's a different thing entirely.


It's not nefarious. They reach the pinnacle of performance for a design, and then focus on the new gen. So a driver optimized for the newer generation doesn't run older cards as well as a driver from their own generation did, especially when there are significant changes in architecture. I may have given the wrong idea in my post, but that's my sarcasm, yet again. They intentionally focus on the new cards, and this affects the older cards. It only sucks if you play all games as they come out and are constantly changing drivers; if you stick to older drivers and play games of the same generation as the cards, there aren't any problems. So I keep updating the cards I use and deal with it; to me it's no different than having to buy a new console every time they release one.

As to NVidia not causing problems by doing so, and nobody complaining about it, there are countless articles and threads on countless sites complaining of newer drivers affecting performance in specific titles or outright stability issues.


----------



## FordGT90Concept (May 18, 2016)

NVIDIA has placed the bar high.  Here comes AMD...





I hope not but I am a realist.


----------



## BiggieShady (May 18, 2016)

cadaveca said:


> They intentionally focus on the new cards, and this affects the older cards.


They are supposed to maintain different codepaths for all supported architectures in all critical segments being optimized ... I'd say someone over there doesn't know how to properly merge git branches.


----------



## laszlo (May 18, 2016)

farlandprince said:


> I have the feeling that AMD will win this round ...
> 
> Any news about their new cards release date?



if you mean taking the crown for fastest single GPU? I don't think so...

it will be a win for them to launch a card within the 980's performance range but at much lower prices; not everyone can afford to spend over $300 on a GPU just because a new one hit the market

for me, best bang for the buck decides the next upgrade

I must congratulate NV for this new card, as performance-wise it's like the new Veyron


----------



## RejZoR (May 18, 2016)

I don't think projections work that simply, imo. Sure, they say it's a single-pass thing, but that doesn't mean there's no performance hit or that it operates at a perfect double rate. After all, you do have to process two different sets of pixels because they are shifted a bit (VR) or shifted a lot (multi-monitor). In the end, you still have to process double the amount of pixels; it's just the way the GPU takes them from the scene that is now more efficient.


----------



## matar (May 18, 2016)

I am so glad I waited and skipped Kepler. I was going to buy Maxwell but was upset that the 600, 700, and 900 series were all still on 28nm, so I skipped them, and I'm very happy I did. The performance of the GTX 1080 is like NVidia skipped one full GTX series and gave us this wow of a performance with Pascal.
The GTX 1070 is my next GPU, can't wait...


----------



## ivan375 (May 18, 2016)

Waiting to see what Polaris is about. GTX 1080 and 1070 are nice but expensive and overkill for a single 1080p panel.


----------



## mroofie (May 18, 2016)

HD64G said:


> GTX1080 performs 32% better than a stock 980Ti at 4K and 1440p, and 36% better at 1080p.



He said 980


----------



## mroofie (May 18, 2016)

FordGT90Concept said:


> NVIDIA has placed the bar high.  Here comes AMD...
> 
> 
> 
> ...


After what happened last time, I'm inclined to agree.


----------



## mroofie (May 18, 2016)

ivan375 said:


> Waiting to see what Polaris is about. GTX 1080 and 1070 are nice but expensive and overkill for a single 1080p panel.


Lmao


----------



## Nosada (May 18, 2016)

laszlo said:


> i must congr. NV for this new card as perf. wise is like the new veyron


One can only hope AMD somehow pulls off a Kawasaki H2R.


----------



## Warrgarbl (May 18, 2016)

As awesome as this card is, it worries me that Nvidia is on its way to a de facto monopoly in the GPU market. The first sign of that is the Founders Edition. If AMD can't put out a product that can compete, we are going to have a lot more price gouging and other bullshit from Nvidia coming for us. That being said, it really IS an awesome card.


----------



## GreiverBlade (May 18, 2016)

rtwjunkie said:


> Hmm, I will have to go re-read!
> If they cheaped out, instead of the alloy shroud they used to use, that would make the $100 premium even less explicable.


even with an alloy it would still be inexplicable; it's the exact same shroud as the 9XX series with a more edgy design. Nvidia only did that because their previous iteration made a good impression, and now they want to charge $100 more to early adopters...



Valdas said:


> Nobody is forcing you to buy Founders Edition. Wait a few weeks and get a cheaper custom card. It's not like every 1080 will be priced at $699 and above.


yeah, right, because custom cards from AIBs will be priced lower than the reference model (the Founders Edition is just the reference model, hence the reference price). $699 is the reference price, albeit Nvidia "says" the MSRP will be $599; also, AIBs most of the time price their customs above MSRP, so even if they follow Nvidia's MSRP they will not be cheaper.



Assimilator said:


> AMD cards with 8GB of VRAM were all using Hawaii, which doesn't have enough horsepower to drive the higher resolutions/VR that would actually use that extra VRAM. Pascal does, hence why the 8GB is actually useful.


well, the 295x2 could use 8 GB for each chip ... so an 8 GB 390X CFX setup would be enough then, since those 8 GB were meant for VR/4K but in CFX ... ok, technically, being useful only in the multi-GPU case can make it pass for a con.


----------



## trog100 (May 18, 2016)

very impressive.. especially the low power usage.. but i think the "founders edition" trick is just a way of making the new card seem cheaper than it really is.. the real price will be $699 and not $599..

a new-model 980 that comes in more expensive than the old-model 980 Ti doesn't look that good.. they are cleverly trying to hide the fact.. he he

trog


----------



## medi01 (May 18, 2016)

Assimilator said:


> Pascal does


Mostly 1440p.



Valdas said:


> Nobody is forcing you to buy Founders Edition. Wait a few weeks...


Why weeks? Why not wait a few month and claim "that" is the true release price (whatever the price will be by that time).



Fluffmeister said:


> I feel sorry for AMD, even their own fanboys compare this to the 980 Ti.


Aristotle is rolling in his grave, as The (n)Grandmaster of (n)Logic has spoken.

PS
The 980 Ti is the best card of the previous generation; it was released to spoil the Fury launch (and it did spoil it quite well). At stock it is hardly spectacular, but OCed into the sky it trounces anything in the previous gen and spoils the 1080 with its pathetic 12% OC.

PPS
10x Maxwell, dude.
2.5 Titan at VR.
For mere 599$ or less, some day.
Peace & Harmony.


----------



## Fluffmeister (May 18, 2016)

You're damn right the 980 Ti was the best card, glad to see everyone finally accepting that. 

But you go girl, fight the power!


----------



## Frick (May 18, 2016)

laszlo said:


> i must congr. NV for this new card as perf. wise is like the new veyron



Nope. Not even close. This is like a new Opel, tops. A Veyron would have to be a super-limited quad card running at 2.5 GHz, scaling perfectly in every game ever made, and being inaudible to boot.

What it in reality is, is a new GPU with, above all, great efficiency and decent power, which is only what it should be.


----------



## Basard (May 18, 2016)

Interesting, it's even better than I expected.


----------



## rtwjunkie (May 18, 2016)

Caring1 said:


> I thought I had read that the fan shroud was plastic, nice if it is an alloy of some kind.



I found a little explanation. Magnesium alloy.  So, fitted firmly to the pcb, like the 780, etc...it is all the support that pcb needs. https://techarx.com/nvidia-gtx-10801070-a-look-after-cooler-shroud-taken-apart/


----------



## Valdas (May 18, 2016)

medi01 said:


> Why weeks? Why not wait a few month and claim "that" is the true release price (whatever the price will be by that time)


Whatever floats your boat. You should remember that the MSRP for the 980 Ti on its launch day was $650, while the MSRP for the 1080 is $599. You should be able to get this "beauty" at around that price on day one.


----------



## rtwjunkie (May 18, 2016)

Valdas said:


> Whatever floats your boat. You should remember that the MSRP for the 980 Ti on its launch day was $650, while the MSRP for the 1080 is $599. You should be able to get this "beauty" at around that price on day one.



Except, you are comparing a card made with the mid-level gp104 to the previous gen flagship.  The real price comparison should be the 980.

With the price coming in not very far under the 980 Ti, and way above the 980, whose place in the lineup the 1080 occupies, it's not a good buy monetarily at all.


----------



## Valdas (May 18, 2016)

rtwjunkie said:


> Except, you are comparing a card made with the mid-level gp104 to the previous gen flagship.  The real price comparison should be the 980.
> 
> With the price coming in not very far under the 980 Ti, and way above the 980, whose place in the lineup the 1080 occupies, it's not a good buy monetarily at all.


You're correct, but medi01 insists on comparing it with the 980 Ti, so the same should be applied to price, not just performance.
The 980 was priced at $549 MSRP, so we're getting ~60% more performance for ~10% higher price compared to the 980.
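That perf/$ claim is easy to sanity-check: divide the performance ratio by the price ratio. A small sketch using the figures above (~60% more performance, $599 vs. the 980's $549 MSRP):

```python
def perf_per_dollar_gain(perf_ratio, price_new, price_old):
    """Relative performance-per-dollar of the new card vs. the old one."""
    return perf_ratio / (price_new / price_old)

# ~1.60x the performance at $599 vs. a $549 GTX 980:
gain = perf_per_dollar_gain(1.60, 599, 549)
print(round(gain, 2))  # → 1.47, i.e. ~47% better perf/$
```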


----------



## Caring1 (May 18, 2016)

Valdas said:


> Whatever floats your boat. You should remember that MSRP for 980 Ti at it's launch day was $650 while MSRP for 1080 is $599.


Except the reference 1080 is $699, and the partners most likely won't undercut that by much, if at all.
Realistically, it wouldn't make sense for partners to put money into new designs and then sell cheaper.


----------



## Valdas (May 18, 2016)

Caring1 said:


> Except the reference 1080 is $699, and the partners most likely won't undercut that by much, if at all.
> Realistically, it wouldn't make sense for partners to put money into new designs and then sell cheaper.


True, that as well. But with the 1080 we have two price points, not one, so it's not entirely correct to pick the highest either. In the end everyone has their own preferences; I'll be looking for the Gigabyte Extreme Gaming a few weeks later (rumor).


----------



## newconroer (May 18, 2016)

xorbe said:


> Also VirtualBox sessions keep my Titan X spun up.  The 960 (used one for a while, it's in the htpc now) was nice in that it hurt a lot less this way (heating up my room upstairs).  Reduced power would be a nice reason to swap out for a 1080.  I don't really need a new card though ...


Is that on a server box, or...? I'm trying to picture the use of a high-end graphics card in VirtualBox.



kithylin said:


> WHY DO YOU NOT SHOW US THE MINIMUMS! WE NEED MINIMUM NUMBERS!
> 
> YOUR ENTIRE REVIEW IS USELESS AND A WASTE OF TIME!



Erm, Wizz is a bit old school, and not in a good way...


----------



## Aquinus (May 18, 2016)

Assimilator said:


> AMD cards with 8GB of VRAM were all using Hawaii, which doesn't have enough horsepower to drive the higher resolutions/VR that would actually use that extra VRAM. Pascal does, hence why the 8GB is actually useful.


No, the reasoning was that nothing really could take advantage of 8GB, not that it can't use it. I've seen some games use more than 4GB on my 390 and it runs just fine, but there are other GPUs that use up to 3GB or 4GB that still run just as well. W1zz has even stated that most games, even at 4k, don't tend to need more than 4GB yet, but there are games that will use it if it's there. The problem with saying that Hawaii doesn't have the ability to drive the VRAM is that it depends on the workload. Generally speaking, it isn't compute that takes up a lot of VRAM, it's textures, and Hawaii has quite a large number of TMUs and some pretty significant texturing capability. I was able to play Farcry 4 in surround with AA off without too much problem, and there were occasions where I used just over 4GB. Same deal with Elite Dangerous: in some situations (with the 64-bit client) more than 4GB of VRAM could be used, in fact I saw usage almost as high as 5GB, but that isn't to say the GPU needs all of it at once or that the 390 can't handle it.

Either way, I still want to know the reasoning behind 8GB not being a con. It's either because nVidia is doing it, which now makes it "normal," or games have evolved enough that 8GB can actually provide some tangible benefit.

Also, even if I were to play devil's advocate and say that the 390 can't handle 8GB worth of whatever gets put in there, I would argue that wouldn't be the case in CFX, as I can attest from personal experience that more memory for multi-GPU setups is generally a good thing, having come from CFX 6870s.


----------



## medi01 (May 18, 2016)

Tell me, guys, *why labels matter to you*.

If 1080 was called "3.1415" would you suddenly be comparing it to a, uh, oh, something else?

To which AMD card will you compare it, to one with "80" in it, or one that is priced accordingly?

Price is the ultimate metric that tells you what tier a card really is.
And 699$/789€ (fuck you, nZilla)  1080 is in a tier of its own, between 980Ti and Titanium X.


----------



## Parn (May 18, 2016)

The performance and efficiency gains are really impressive (sort of expected moving from 28nm straight to 16nm; if Maxwell had been built on 20nm, maybe the gap wouldn't be this big). Now waiting for AMD to bring out something that can compete with this, so as to lower its price to a more reasonable range.


----------



## laszlo (May 18, 2016)

Frick said:


> Nope. Not even close. This is like a new Opel, tops. A Veyron would have to be a super-limited quad card running at 2.5 GHz, scaling perfectly in every game ever made, and being inaudible to boot.
> 
> What it in reality is, is a new GPU with, above all, great efficiency and decent power, which is only what it should be.



I don't quite understand your comparison ..... as I see it, this is now the fastest single GPU on the market; in your view, which is the Veyron of single GPUs?


----------



## peche (May 18, 2016)

As was posted in the review, Founders cards aren't made for OC, so why are they more expensive?
My mind can't work out the reason why they cost more... if the cooler also takes the card to its limits...

Regards,


----------



## redeye (May 18, 2016)

Stop moaning about the cost of the 1080; it is twice the performance of the GTX 970, at twice the price, in 6-12 months. Right now, is it worth the extra 100 dollars NOT to have the SLI problems of a 970 SLI? Is it worth 100 dollars more to have twice the performance of an SLI 970, and smooth as butter?


What can I say, when everybody is complaining about the cost of the 1080?...
It is a new product.
Looking at the Steam hardware survey, 4% have 970s, 1% have 980s. A 980 is 200 dollars more than a 970; if Nvidia had only 980s and priced them at 50 dollars more than a 970, would they have sold more? Or was it people buying the 970 thinking it is a 980?

TL;DR... Would everybody replace their 970/980 with a 1080 at 500? Or would they have some excuse that their current card is good enough?

Looking at the most recent GTX 980 Ti review, the GTX 1080 at stock is as fast as the overclocked GTX 980 Ti: https://www.techpowerup.com/reviews/Gigabyte/GTX_980_Ti_XtremeGaming/26.html
(stock GTX 1080: 137.3 fps vs. overclocked GTX 980 Ti: 136 fps)


----------



## apoe (May 18, 2016)

People mention the price, but it has a preeetty impressive price/performance ratio, especially for a higher-tier card. Interested to see where the 1070 lands.


----------



## FreedomEclipse (May 18, 2016)

Looking for one of these to replace my 970s... And bring balance to the force


----------



## GhostRyder (May 18, 2016)

Looks like a great card, as expected. I only see a few problems, one of which being that cooler, which they charge a lot for and which only delivers OK performance. Guess I was expecting more from the way they were touting it at the event, but whatever, since I never buy those cards for the coolers.

I really want to see some Classified variants or similar. I want to push that overclock well beyond the 2100 MHz mark!!! I want to actually get a pair of these with the new SLI bridge, water cool them, overclock the life out of them, then get a nice ultrawide G-Sync monitor to complement them. Great performance from the card; it's nice to see a decent performance gap.


----------



## peche (May 18, 2016)

Is the Founders Edition the reference-cooled one, or the VR-ready one?


----------



## chinmi (May 18, 2016)

If AMD can't make a better card than the 1080, it's the nail in the coffin for AMD.


----------



## HD64G (May 18, 2016)

mroofie said:


> He said 980


Price- and performance-wise, the 1080's opposition is the 980 Ti and Fury X. So, GTX 1080 vs. GTX 980 Ti Gigabyte Xtreme: the 1080 is a bit more pricey with 13-17% more performance. That's its competition in the GPU market atm for anyone willing to spend a lot for the best single-GPU card.


----------



## medi01 (May 18, 2016)

redeye said:


> Stop moaning about the cost of the 1080, it is twice the performance of the gtx970, at twice the price in 6-12 months.



Look, kid:

1) Pricing for cards in each tier normally stays the same, since people tend to spend roughly the same amount of money (regardless of the tier they stick to)
2) Progress, aka "we got a new fab process", allows producing faster cards for roughly the same money
3) This leads to HIGHER perf/$ numbers

Now, when you get a card that is two times faster but 3 times more expensive... that's pure pwnage of you, the customer.
Which I, for one, welcome!
Please, dear Huang (Juang? Anyhow, CEO of nZilla), make it even moar "foundation"-al; many dudes out there do deserve it!


----------



## cadaveca (May 18, 2016)

medi01 said:


> Look, kid:
> 
> 1) Pricing for cards in each tier normally stay the same, since people tend to spend roughly the same amount of money (regardless of the tier they stick to)
> 2) Progress, aka "we got a new fab process" allows to produce faster cards for roughly the same money
> ...




"If you build it, they will come."


----------



## geon2k2 (May 18, 2016)

redeye said:


> Stop moaning about the cost of the 1080, is is twice the performance of the gtx970, at twice the price in 6-12 months. Right now, is it worth the extra 100 dollars NOT to have SLI problem's of a 970SLI?... Is it worth 100 dollars more  to have twice the performance of a SLI970 and SMOOTH as butter?.
> 
> 
> What can i say, when everybody is complaining about the cost of the 1080?...
> ...



Please consider that NA and a few EU countries have salaries, for exactly the same job with exactly the same working hours and mostly the same prices for food and goods, up to 10 times higher than in the rest of the world. In some countries that 700E is about a yearly salary; I've heard stories about Chinese Walmart employees for whom that is 2 years' pay. In others it's 2 average monthly salaries, out of which bills have to be paid and children fed. So yeah, it is crazy expensive. The fact that you were fortunate enough to be born in a western country doesn't suddenly make this cheap.

Please check this :


----------



## z1tu (May 18, 2016)

geon2k2 said:


> Please consider that NA and few countries from EU have salaries for exactly the same job, with exactly the same working hours and mostly the same prices for food and goods, up to 10 times higher than the rest of the world. In some countries that 700E is about yearly salary, heard stories about some Chinese Walmart employees for which that is 2 years pay, in some other 2 average monthly salaries, from which bills have to be paid, children to be fed. So yeah it is crazy expensive. The fact that you are fortunate to be born in a western country doesn't mean that suddenly this is cheap.
> 
> Please check this :


People who have children and 700E yearly salaries have more important things to focus on than the price of a new video card; this makes no sense.


----------



## TheGuruStud (May 18, 2016)

Ha, what a failure. I just benched GTA V (tons of crap in background no less) with same settings and at 1440, I got 99 fps on my OCed 980ti. Sorry, 12% increase from OCing 1080 won't cut it.

NEXT

Logan's numbers make more sense.


----------



## Assimilator (May 19, 2016)

Aquinus said:


> No, the reasoning was that nothing really could take advantage of 8GB, not that it can't use it. I've seen some games use more than 4GB on my 390 and it runs just fine but, there are other GPUs that use upto 3GB or 4GB that still run just as well. W1zz has even stated that most games, even at 4k don't tend to need more than 4GB yet but, there are games that will use it if it's there. The problem with saying that Hawaii doesn't have the ability to drive the VRAM because it depends on the workload. Generally speaking, it isn't compute that takes up a lot of VRAM, it's textures, and Hawaii has quite a large number of TMUs and has some pretty significant texturing capability. I was able to play Farcry 4 in surround with AA off without too much problem and there were occasions where I used just over 4GB. Same deal with Elite Dangerous, in some situations (with the 64-bit client,) more than 4GB of VRAM could be used, in fact I saw usages almost as high at 5GB but, that isn't to say the GPU needs all of it at once or that the 390 can't handle it.
> 
> Either way, I still want to know the reasoning behind 8GB not being a con. It's either because nVidia is doing it which now makes it "normal," or games have evolved enough where 8GB can actually provide some level of tangible benefits.
> 
> Also, even if I were to play devil's advocate and say that the 390 can't handle 8GB worth of whatever gets put in there, I would argue that wouldn't be the case in CFX as I can attest to personal experience that more memory for multi-GPU setups is generally a good thing having come from CFX 6870s.



As you rightly pointed out, there are games that take advantage of >4GB GPU memory, but those are still few and far between and most importantly, they are generally more texture-heavy than processing-heavy (open-world games like Far Cry, Elite, GTA, Skyrim with extra texture packs). If those games are your primary use-case, then yes Hawaii + 8GB is going to be better for you than Hawaii + 4GB.

But in my opinion, games that are "imbalanced" in this way (heavy on textures, light on GPU) aren't (or at least, shouldn't be) the future of gaming. Unfortunately, due to certain realities (consoles and the age of DirectX 11), game engines just haven't been able to properly balance great textures with great particle effects, highly detailed models, etc. DirectX 12 will (I fervently hope) end this stagnation and give us a new generation of game engines that need the horsepower that Pascal has and Hawaii doesn't. I'm talking the graphical leaps we saw going from Unreal Engine 1 to 2 to 3, not slightly more detailed tessellation.

In all honesty, Fiji was the right time to make 8GB the new normal because Fiji was the first arch that truly had the horsepower to drive 4K; unfortunately just as Hawaii + 8GB is mostly hamstrung by Hawaii, Fiji + 4GB was mostly hamstrung by the 4GB limit. AMD mitigated that somewhat with the system memory cache, but in all honesty Fiji with 8GB HBM (and maybe a few extra ROPs) would've blown GTX 980 Ti out of the water, and I'll be very interested to see how Vega (which should essentially be Fiji+ coupled to 8GB+ memory) will perform in regards to validating that theory. nVIDIA just happens to have not fumbled the ball in regards to coupling 8GB memory to a GPU that can actually use it well, and probably they don't really deserve that credit, just as AMD doesn't really deserve getting pissed on for trying to push 8GB ahead of its time. But such is the way the chips have fallen.

I purposefully ignored Crossfire for the simple reason that it's mostly irrelevant, because the vast majority of people don't have CF setups; they have single GPUs. That means developers are going to optimise for single GPUs, and that means there are really no games that were coded to exploit the horsepower of 2x Hawaii. But Pascal has that amount of horsepower and 8GB memory on a single card, which means it's going to become the new optimisation target - and also means, ironically, that 8GB Hawaiis in CF might be a better investment now than they were when first released...


----------



## Fluffmeister (May 19, 2016)

I love my 980 Ti too, but this card is just plain better in every way, and it isn't even its direct replacement (which people just don't seem to get).


----------



## Assimilator (May 19, 2016)

TheGuruStud said:


> Ha, what a failure. I just benched GTA V (tons of crap in background no less) with same settings and at 1440, I got 99 fps on my OCed 980ti. Sorry, 12% increase from OCing 1080 won't cut it.
> 
> NEXT
> 
> Logan's numbers make more sense.



Yes dear, we all know that performance is the only metric that matters and that small things like power consumption, noise, and heat output are irrelevant. The latter three, strangely enough, are precisely the reason why I sidegraded from a 980Ti to a 980; the Ti ran too hot to sustain its overclock, even though it had a custom cooler whose fans were loud enough to make me want to scream.


----------



## Assimilator (May 19, 2016)

medi01 said:


> Look, kid:
> 
> 1) Pricing for cards in each tier normally stay the same, since people tend to spend roughly the same amount of money (regardless of the tier they stick to)
> 2) Progress, aka "we got a new fab process" allows to produce faster cards for roughly the same money
> ...



Wrong. New fabrication processes are more expensive initially because yields are always poorer than established processes (this is something you'd know if you had *any* sort of knowledge of the semiconductor industry).

And, again, for the billionth time, nobody is forcing anybody to buy the Founders Edition cards. If you can afford to buy on day one and choose to do so, good for you. If you can't, or you have patience, you wait for the cheaper custom designs and save $100. It's entirely up to you, the buyer.


----------



## TheGuruStud (May 19, 2016)

Assimilator said:


> Yes dear, we all know that performance is the only metric that matters and that small things like power consumption, noise, and heat output are irrelevant. The latter three, strangely enough, are precisely the reason why I sidegraded from a 980Ti to a 980; the Ti ran too hot to sustain its overclock, even though it had a custom cooler whose fans were loud enough to make me want to scream.



Mine may break into the 70s in the summer. It's a Zotac AMP! Fans aren't loud, either, and I have a custom profile to crank it up. 

Yes, performance is all that matters lol. If the Fury X wasn't a pile of crap, I would have bought it.


----------



## Aquinus (May 19, 2016)

Assimilator said:


> As you rightly pointed out, there are games that take advantage of >4GB GPU memory, but those are still few and far between and most importantly, they are generally more texture-heavy than processing-heavy (open-world games like Far Cry, Elite, GTA, Skyrim with extra texture packs). If those games are your primary use-case, then yes Hawaii + 8GB is going to be better for you than Hawaii + 4GB.
> 
> But in my opinion, games that are "imbalanced" in this way (heavy on textures, light on GPU) aren't (or at least, shouldn't be) the future of gaming. Unfortunately, due to certain realities (consoles and the age of DirectX 11), game engines just haven't been able to properly balance great textures with great particle effects, highly detailed models, etc. DirectX 12 will (I fervently hope) end this stagnation and give us a new generation of game engines that need the horsepower that Pascal has and Hawaii doesn't. I'm talking the graphical leaps we saw going from Unreal Engine 1 to 2 to 3, not sightly more detailed tessellation.
> 
> ...


Ah, okay. I see where you're going. I think I need to do a little more explanation on what's going through my head. You're right that everything isn't going to be about textures but, that's where the extra VRAM is important. If we're going to talk about compute and texturing, you have to keep in mind we're talking about two different parts of the GPU. On top of that, they're implemented differently between AMD and nVidia which further muddies the waters.

Texturing is going to have a huge preference for higher-bandwidth memory; this is why the huge number of TMUs and the wide memory bus on Hawaii excel at texturing. Textures are big, so they're not going to easily reside in cache, and it's important to get this data into the TMUs as quickly as possible when it's needed. If you want higher resolutions to look pretty, you need higher-resolution textures, which will immediately put more strain on both the GPU's memory controller and the TMUs.

On the compute side, Hawaii most definitely has a weakness. It's the same damn weakness AMD introduced with Bulldozer, and that's "omg more cores." If you look at the 390, it has 40 CUs with a preference for lower clocks, whereas nVidia prefers fewer compute clusters at higher clocks. Just like with normal CPUs, this makes serial-like workloads much faster, so unless an engine is capable of taking advantage of all 40 CUs at once with whatever 3D API it's using, you need to do some serious work. Async shaders favor AMD because AMD GPUs have more parallel throughput relative to serial throughput compared to nVidia's GPUs, so there are a lot of untapped resources in that respect.

I'm seriously not trying to say which one is better, I'm just saying that there are certain workloads each GPU is good at, and that the choices AMD has made have been pretty consistent, just as nVidia's have. I will agree that with pre-DX12 games nVidia will almost always wipe the floor clean, but DX12 has some perks that can heavily favor the way GCN has been architected. For that reason, I think Hawaii might fare better than you expect.

Either way, the question still remains regardless of which camp is better. What changed to make 8GB not a con?


----------



## xorbe (May 19, 2016)

Fluffmeister said:


> I love my 980 Ti too, but this card is just plain better in every way, and it isn't even it's direct replacement (which people just don't seem to get).



980 Ti is $649?  1080 is $599/699?  Seems like a direct replacement ...


----------



## Fluffmeister (May 19, 2016)

xorbe said:


> 980 Ti is $649?  1080 is $599/699?  Seems like a direct replacement ...



Good point.... it's faster, more efficient and offers more VRAM to boot. I can see why you'd think that.

Those pesky GXX04 chips certainly do fly.


----------



## Kanan (May 19, 2016)

the54thvoid said:


> Vega is 5 months out at a minimum. Card now is 1080. 5 months after Vega you get daddy Pascal (almost twice the chip) which may well be GP102 and tweaked to provide more gaming centric purpose. The main Pascal chip if clocking similar to 1080 will obliterate anything before it.
> Worse thing from your standpoint is that 5 months after Vega 10 you will also have the full Vega chip.
> Saying half a year wait for Vega 10 is just ignorant of tech market conditions. If you wait like that, you may as well wait for the 'proper' Vega chip.
> 
> ...


The 1080 is only good for users who don't own a Fury X or GTX 980 Ti, and only worthwhile with custom cooling. The reference-cooled version is severely hindered by its cooler = lower boost clocks + lower OC capability.

The "5 months after" argument is moot. You could always say that. Yeah, and after "daddy Pascal" there comes "daddy Vega". What now? And after "daddy Vega" comes a new architecture from Nvidia (kind of a "Maxwell 2" architecture without the productivity features that gamers don't need and that waste a lot of transistors). And after that there comes something new from AMD again. Like I said, a moot point.

No, what you say is ignorant. People buy things when they need them; they usually DON'T wait (they only do if it's a few weeks, tops, not 5 months, lol). What I said was just that the first GPU worthwhile for enthusiast users (who obviously already have an enthusiast GPU) is most probably Vega 10, end of story.


----------



## -1nf1n1ty- (May 19, 2016)

I was worried a bit that I should have just waited for the new cards but my R9 390 seems to hold up pretty well. This isn't a HUGE performance boost to me like nvidia was claiming. The power draw is definitely great though. Wonder what companies like Gigabyte and EVGA are gonna do.

Or maybe I'm blind!


----------



## HumanSmoke (May 19, 2016)

Aquinus said:


> Great review but, I do find it funny that when AMD releases a GPU with 8GB of VRAM and it gets a Con for "8 GB VRAM provides no tangible benefits" but, when nVidia does it, suddenly it's not a con anymore


I think the difference is that previous 8GB cards were "upgrades" over existing 4GB cards. The underlying architecture wasn't strong enough to utilize 8GB (most comparisons between 4GB and 8GB only showed a difference when the 4GB framebuffer was saturated with HD textures; less a win for 8GB than a loss for 4GB). When the "cons" were voiced, the instances where 8GB made a tangible difference were much fewer and further between. The GTX 1080 is only available as an 8GB card. No other option exists.


JJJJJamesSZH said:


> I think an EVGA engineer confirmed that the 1080 only supports 2-way SLI?


Not actually true. Via Nvidia via Kyle Bennett:


> While NVIDIA no longer recommends 3 or 4 way systems for SLI, we know that true enthusiasts will not be swayed…and in fact some games will continue to deliver great scaling beyond two GPUs. For this class of user we have developed an Enthusiast Key that can be downloaded off of NVIDIA’s website and loaded into an individual’s GPU.


Catering for the HWBot crowd.


GreiverBlade said:


> also, "founders edition" +100$ = 699 and AIB will table on a 599$ MSRP my @$$, AIB will never price their custom lower than a reference and since the founders IS the reference model they will table on a base price of 699$, why would they do otherwise?


Because some vendors see the value of selling reduced BoM cards for those consumers who value savings over bling. Sapphire did it for years with their Flex cards.
I'm pretty sure the high-volume sellers will be more than happy to sell at $599. Care to make a wager that they won't? Palit/Galax have already shown a reduced-BoM GTX 1080 using a recycled reduced-BoM GTX 980 / GTX 980 Ti HSF shroud.


Caring1 said:


> Except the reference 1080 is $699 and the partners most likely won't undercut that by much, if at all.
> To be realistic it wouldn't make sense for partners to put money in to new designs and sell cheaper.


That's why they recycle previous cooling solutions and plonk them on top of the reference PCB. The largest graphics AIBs also act as full manufacturers, not just for their own house brands (i.e. Palit with GALAX and Gainward), but for AIBs that have no manufacturing base (i.e. the aforementioned Palit building cards for PNY). ODMs like Asus, EVGA, and Gigabyte probably won't bother selling at the low end of the 1080 market, but it has already been shown that the really large OEMs selling huge volumes of cards are quite eager to sell at every price point to maximize opportunity and keep the manufacturing and assembly lines fully occupied.


medi01 said:


> Then compare it to R9 380.
> After all, it also ends with 80.
> Makes as much sense.


Only makes sense if the GPU tier is the same. GP104 is a second tier GPU, the R9 380 is a third tier GPU.
Comparison should be by GPU tier, not numbers. GM204 is a second tier GPU of the Maxwell architecture, GP104 is a second tier GPU of the Pascal architecture.


----------



## BoyGenius (May 19, 2016)

Surrounding air @ 20 °C and the fan at 100% leads to that temperature.
Reference



mrthanhnguyen said:


> Wonder why the temp was 67 at the event.


----------



## xorbe (May 19, 2016)

To me it sounds like they basically want to move 3- and 4-way support to the enterprise class, and by handing out keys one at a time, ensure that only a few gamer end-users get the legacy functionality for free.


----------



## GreiverBlade (May 19, 2016)

HumanSmoke said:


> I'm pretty sure the high volume sellers will be more than happy to sell at $599. Care  to make a wager that they won't?


Nope, I don't care to (mainly because the 980 was supposed to launch like that or lower, and it ended up with the 980 at 980 Ti MSRP and the 980 Ti nearing Titan X MSRP for me ...)

because:


HumanSmoke said:


> Palit/Galax have already shown a reduced BoM GTX 1080 using a recycled reduced BoM GTX 980 / GTX 980 Ti HSF shroud.


these brands are not widely distributed, especially where I am, though Palit has some good reviews going for it .... Galax, on the other hand ...


----------



## medi01 (May 19, 2016)

Assimilator said:


> New fabrication processes are more expensive initially


Or mature processes become cheaper? Oh wait...
That thing is called "investment", "Research and Development".
This is something you'd know if you had *any* sort of knowledge of pretty much any (manufacturing and not) industry.

And in this very context, it's just a silly attempt to justify blatant price gouging with some "expensive tech".



Kanan said:


> The "5 months after"-argument is moot. You could always say that.


It is THE FIRST TIME EVER we see GPU manufacturers advance over 2 node processes in one go. Oh and that lovely HBM2 too.

We'll see vastly faster cards wiping floor with previous gen cards, with 2+ times performance jump.

It is "always like that", my arse.


----------



## deemon (May 19, 2016)

One question I haven't found an answer to yet: DOES THE 1080 DO HAIRWORKS and the like of nvidia's gimmicks any better than previous generations? I mean, apart from the general performance boost, does HairWorks now scale even better than the average performance increase, or not?


also really sad to see that the 1080's "async is now OK" claim is a lie (it seems to run off the CPU / drivers in software):
http://e.infogr.am/74b6f789-48a6-4a83-a44e-845be2a8cb3c?src=embed


----------



## Aquinus (May 19, 2016)

HumanSmoke said:


> I think the difference is that previous 8GB cards are "upgrades" over existing 4GB cards. The underlying architecture wasn't strong enough to utilize 8GB (Most comparisons between 4GB and 8GB only showed difference when the 4GB framebuffer was saturated with HD textures. Less a win for 8GB than a loss for 4GB). When the "cons" were voiced, the instances when 8GB made a tangible difference much more fewer and further between .The GTX 1080 is only available as an 8GB card. No other option exists.


What? The only time memory capacity impacts performance is when there wasn't enough memory in the first place. Once again, I'm calling shenanigans on the claim that it can't handle it, because that really depends on the workload. Since games don't really need over 4GB yet, you're not going to see a slowdown; the 8GB is merely not used or required. So let's say you have 8GB of system memory in two DIMMs but you haven't utilized all of it yet, then you upgrade to 16GB: are you still going to expect a performance boost? I would say, generally speaking, no. I think GPUs are the same way. It's not that VRAM capacity improves performance, it's that it stops performance from degrading nearly as quickly when you eventually do need more than 4GB, as opposed to having to resort to streaming textures, which always hinders performance to some degree.

So it's not about gaining performance, it's about not losing performance... because performance tanks when you run out of VRAM.


----------



## Artas1984 (May 19, 2016)

It's amazing that at 1440p the R9 280X is now *significantly* faster than the GTX770 and equal to the GTX780, while the R9 270X is equal to the GTX960 and GTX770. If you thought about mid-range AMD cards' performance back in 2013, there was no way you'd ever have thought about that kind of catch-up with NVIDIA.


----------



## Air (May 19, 2016)

Artas1984 said:


> It's amazing that at 1440p the R9 280X is now *significantly* faster than the GTX770 and equal to the GTX780, while the R9 270X is equal to the GTX960 and GTX770. If you thought about mid-range AMD cards' performance back in 2013, there was no way you'd ever have thought about that kind of catch-up with NVIDIA.



Yeah, I also noticed this. When I bought my R9 270X (for $200) it had ~10% less performance than the GTX 760 (priced at ~$240, I think). Now it has the same performance as a GTX 770, which was $400 at the time. If that's correct, clueless me made a great choice back then.


----------



## rtwjunkie (May 19, 2016)

Fluffmeister said:


> Good point.... it's faster, more efficient and offers more VRAM to boot. I can see why you'd think that.
> 
> Those pesky GXX04 chips certainly do fly.



So many people think price is what determines the next gen replacement, when in reality it's the chip used and where it falls in the product line.  Price is immaterial!


----------



## TheinsanegamerN (May 19, 2016)

Artas1984 said:


> It's amazing that at 1440p the R9 280X is now *significantly* faster than the GTX770 and equal to the GTX780, while the R9 270X is equal to the GTX960 and GTX770. If you thought about mid-range AMD cards' performance back in 2013, there was no way you'd ever have thought about that kind of catch-up with NVIDIA.


OTOH, nvidia users don't need to wait a year+ to get proper performance from their cards. If the 280X, which is a 7970, could be faster than a 680/770, it should not have taken 3+ years to get there. It's simply a matter of when you want the best performance: now or years from now?

And it doesn't help AMD either. Their cards age too well to be replaced consistently.


----------



## Valdas (May 19, 2016)

medi01 said:


> It is THE FIRST TIME EVER we see GPU manufacturers advance over 2 node processes in one go. Oh and that lovely HBM2 too.
> 
> We'll see vastly faster cards wiping floor with previous gen cards, with 2+ times performance jump.
> 
> It is "always like that", my arse.


Fury X was released with HBM. How much faster was it compared to the 980 Ti running GDDR5? You assume >2x without any data to back it up.


----------



## red_stapler (May 19, 2016)

So should I finally consider replacing my 7950?  Is the 1080 "fast enough" that I'll see a difference?


----------



## Valdas (May 19, 2016)

red_stapler said:


> So should I finally consider replacing my 7950?  Is the 1080 "fast enough" that I'll see a difference?


In the 7950 review it averages 34 fps at 1440p and 52 fps at 1080p in BF3, while the 1080 runs it at 137 fps at 1440p and 199 fps at 1080p. While it's not an entirely apples-to-apples comparison, it should give you a rough idea.
Edit: You can also assume that 7950 ≈ 270X, which is present in the 1080 review's performance summary.
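For a rough sense of scale, those quoted BF3 numbers can be turned into a speedup estimate (a sketch only, using just the fps figures cited above; the two reviews used different test systems, so treat the result as approximate):

```python
# Rough speedup from the BF3 figures quoted above (7950 review vs.
# GTX 1080 review; not an apples-to-apples comparison).
bf3_fps = {
    "1440p": {"HD 7950": 34, "GTX 1080": 137},
    "1080p": {"HD 7950": 52, "GTX 1080": 199},
}

for resolution, cards in bf3_fps.items():
    speedup = cards["GTX 1080"] / cards["HD 7950"]
    print(f"{resolution}: ~{speedup:.1f}x faster")  # ~4.0x and ~3.8x
```

In other words, roughly a fourfold jump either way, which is why the 270X row in the summary chart is a reasonable stand-in.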


----------



## rtwjunkie (May 19, 2016)

So, after a more thorough look at the fps in games, I have concluded that this is such a big leap ahead of my 980, it's not even funny. It is getting more FPS at 1440p than the 980 gets in the same games at 1080p. This means that with a 1440p monitor AND a 1080, it would still perform better than the 980 did at 1080p.

I can't afford it now, and I haven't yet found anything I can't play the way I want on a 980, but this is the kind of revelation that makes people start saving and planning for someday... even if it's a year from now.


----------



## Grings (May 19, 2016)

Given that the cooler seems to be the limiting factor in wizz's review, I am interested to see how the decent non-reference cards do.

While I'm sure there will be Kingpin- and 8Pack-branded cards even more expensive than the Founders Edition, it would be nice to see some come in cheaper AND perform even better.


----------



## nem.. (May 19, 2016)

recycled architecture, recycled slides...
BILLIONS of $$ rofl


----------



## CounterSpell (May 19, 2016)

According to this site, a GTX 1080 system consumes 314W: http://www.legitreviews.com/nvidia-geforce-gtx-1080-founders-edition-video-card-review_181298/12

So, can a 500W PSU (a good one) hold this system?

I have a CX500 from Corsair and would like to know: if I upgrade my graphics card to a 1080, will I have to change my PSU?


----------



## Kanan (May 19, 2016)

CounterSpell said:


> according to this site, a gtx 1080 system consumes 314w - http://www.legitreviews.com/nvidia-geforce-gtx-1080-founders-edition-video-card-review_181298/12
> 
> so, can a 500psu (a good one) hold this system?
> 
> i have a cx500 from corsair and would like to know if i upgrade my vga to a 1080, will i have to change my psu?


Your PSU is good enough for it, enjoy. You even have a lot of reserve with that combination.


----------



## CounterSpell (May 20, 2016)

Kanan said:


> Your PSU is good enough for it, enjoy. Even a lot of reserves with that combination.


for real or trolling?


----------



## Caring1 (May 20, 2016)

CounterSpell said:


> for real or trolling?


314W will be peak usage, not constant draw during gameplay.
A decent 500W PSU should be capable of supporting the 1080.


----------



## Kanan (May 20, 2016)

CounterSpell said:


> for real or trolling?


Why would I troll? All my answers here are serious, at least when it comes to hardware questions. An i5 2500K is maybe 150W tops (with OC; under 100W without), motherboard, RAM, etc. another 50W, so that's 200W tops, and a GTX 1080 on top of that would be 180 or 220W (in OC mode) tops. So you have a peak power usage of 380 to 420W (!), and the average is much lower, maybe 300 to 350W. As I said, you have a lot of reserve; it's no problem.
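The budget above can be sketched as a quick back-of-the-envelope check (a sketch only; the component wattages are the worst-case estimates from this post, and real draw varies with load and efficiency):

```python
# Back-of-the-envelope PSU headroom check, using the worst-case
# component estimates quoted in the post above.
psu_watts = 500  # Corsair CX500

peak_draw_watts = {
    "i5 2500K (overclocked)": 150,
    "motherboard, RAM, etc.": 50,
    "GTX 1080 (OC mode)": 220,
}

total = sum(peak_draw_watts.values())
print(f"estimated peak draw: {total} W")              # 420 W
print(f"headroom on the PSU: {psu_watts - total} W")  # 80 W
```

Even summing worst cases the build stays under the 500W rating, which is the point: typical gaming draw sits well below this total.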


----------



## CounterSpell (May 20, 2016)

Kanan said:


> Why should I troll? All my answers here are serious at least if it's about hardware questions. i5 2500K is maybe 150 W tops (with OC, without under 100W), mainboard ram etc. 50W, thats 200W tops, and the graphics card GTX 1080 on top of that would be 180 or 220W (in OC mode) tops, so you have a power usage of 380 to 420W tops (!), the average is much lower, maybe 300 to 350W. As I said, you have a lot of reserves, it's no problem.



It's because everyone says we need at least 750W for a decent GPU....


----------



## red_stapler (May 20, 2016)

CounterSpell said:


> It's because everyone says we need at least 750W for a decent GPU....



You might want to go tri-SLI in the future!


----------



## Kanan (May 20, 2016)

CounterSpell said:


> It's because everyone says we need at least 750W for a decent GPU....


That's for multi-GPU. A high-power GPU like the 980 Ti or Fury X needs a 500 or 600W PSU to run (depends on the CPU too, of course). My system is a good example of that. The GTX 1080 uses much less power than those GPUs, so it's no problem.


----------



## Tran Hong Duc (May 20, 2016)

How the hell did the temp show only 67 degrees at the launch event? Perhaps they fitted an air conditioner in the case, lol.


----------



## okidna (May 20, 2016)

Tran Hong Duc said:


> How the hell did the temp show only 67 degrees at the launch event? Perhaps they fitted an air conditioner in the case, lol.



V-sync.


----------



## FordGT90Concept (May 20, 2016)

Yeah, pretty much any game with vsync off, they'll run up to the thermal throttle.  Even FortressCraft: Evolved can hit 100% GPU load with vsync off.


----------



## trog100 (May 20, 2016)

Kanan said:


> That's for multi-GPU. A high power GPU like 980 Ti or Fury X needs a 500 or 600W PSU to run (depends on CPU too, of course). My system is a good example for that. The GTX 1080 uses much less power than these GPUs, so it's no problem.



it's the low power usage that interests me.. similar performance at half the power draw.. my pair of 980 Ti cards has more than enough performance to last for a while, so any upgrade for me would be based purely on power/heat/noise factors.. burning (and having to cool) 300 watts I don't need will probably niggle me a bit..

trog


----------



## okidna (May 20, 2016)

I just realised that @W1zzard changed his method for obtaining the clock-comparison percentage for the GTX 1080 default vs. overclock. I don't know why he changed the method, but maybe it's related to GPU Boost 3.0 behavior.

You can see it for yourself :

For 980 he compared the BASE CLOCK vs overclocked base clock, 1127 to 1350 so 20% overclock : https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_980/29.html



> Maximum overclock of our sample is *1350 MHz GPU base clock (20% overclock)*



For 980 Ti he compared the BASE CLOCK vs overclocked base clock, 1000 to 1260 so 26% overclock : https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_980_Ti/34.html



> Maximum overclock of our sample is *+260 MHz to the GPU's base clock* *(26% overclock)*



For Titan X he compared the BASE CLOCK vs overclocked base clock, 1000 to 1110 so 11% overclock : https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_X/32.html



> Maximum overclock of our sample is *+110 MHz to the GPU base clock (11% overclock)*



But for GTX 1080 he changed the method to comparing the BOOST CLOCK : http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1080/30.html



> Maximum overclock of our sample is +453 MHz to the GPU's base clock, which *increases max Boost from 1898 MHz to 2114 MHz (11% overclock)*



So if we use the old method (default base clock vs. overclocked base clock), the GTX 1080 still gets a healthy 28% base-clock overclock.
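
For what it's worth, the two methods really do give very different headline numbers. A quick sketch using the clocks quoted above (the GTX 1080's 1607 MHz reference base clock plus the +453 MHz offset, and the 1898 → 2114 MHz max boost figures):

```python
def oc_percent(stock_mhz, oc_mhz):
    """Overclock expressed as a rounded percentage gain over the stock clock."""
    return round((oc_mhz / stock_mhz - 1) * 100)

# Old method: reference base clock vs. overclocked base clock
base_method = oc_percent(1607, 1607 + 453)   # 1607 -> 2060 MHz, ~28%

# New method: max boost clock vs. overclocked max boost clock
boost_method = oc_percent(1898, 2114)        # 1898 -> 2114 MHz, ~11%

print(base_method, boost_method)
```

Same card, same overclock; only the baseline changes, which is why the 1080's "11%" looks so much smaller than the 980 Ti's "26%".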


----------



## the54thvoid (May 20, 2016)

Kanan said:


> The 1080 is only good for users who don't own a Fury X or GTX 980 Ti, and only worthwhile with custom cooling. The ref-cooled version is severely hindered by its cooler = lower boost clocks + lower OC capability.
> 
> The "5 months after" argument is moot. You could always say that. Yeah, and after "daddy Pascal" there comes "daddy Vega". What now? And after "daddy Vega" comes a new architecture from Nvidia (kinda a "Maxwell 2" architecture without the productivity crap that gamers don't need and that wastes a lot of transistors). And after that there comes something new from AMD again. Like I said, a moot point.
> 
> No, what you say is ignorant. People buy things when they need them, they usually DON'T wait (they only do if it's a few weeks, tops, not 5 months, lol). What I said was just that the first GPU worthwhile for enthusiast users (who obviously already have an enthusiast GPU) is most probably Vega10, end of story.



Good grief man. You're the one saying wait till Vega, when there is a card out now with unequivocally higher performance metrics. My point is neither ignorant nor moot. Vega will bring HBM2, so what? So will consumer GP100 and big Vega. So the whole thing of saying ignore GP104, wait for real next gen, is hypocritical.
I know what you are implying, that GP104 isn't proper next gen, but frankly, we know so little about Vega that it may not be hugely different from Fiji. HBM is no longer new tech. HBM2 is to HBM as GDDR5X is to GDDR5.

Long story short, if you pass on GP104 because you want true next gen, you'll be upset when the proper (not crippled, again rumours) Vega part is released a few months after. Moral of the story: buy when you want to buy. And also, nobody needs any of these powerful desktop gfx cards except for work.


----------



## SPARTAN11810 (May 20, 2016)

What map are you testing in Battlefield 4?


----------



## W1zzard (May 20, 2016)

okidna said:


> changed his method to obtain clock comparison percentage for GTX 1080 default vs overclock


I changed the wording to better reflect the OC'd clocks on Boost 3.0


----------



## BoyGenius (May 20, 2016)

Give it a try & see for yourself 
It will be more than twice as fast as far as benchmarks go.


red_stapler said:


> So should I finally consider replacing my 7950?  Is the 1080 "fast enough" that I'll see a difference?


----------



## Rautate (May 20, 2016)

*Overclocked Nvidia GTX 1080 VS 980 Ti | Benchmarks & SLI - Various Games*


----------



## CrAsHnBuRnXp (May 20, 2016)

@W1zzard Could you please tell me how you bench WoW: WoD? I've always been curious, but never figured out how it's benched since it doesn't have a benchmarking mode.


----------



## Frick (May 21, 2016)

They've been listed in Swedish stores for €775.


----------



## Rautate (May 21, 2016)

Frick said:


> They've been listed in Swedish stores for €775.



any link?


----------



## Rautate (May 21, 2016)

*Nvidia GeForce GTX 1080 & GTX 1070 UK release date, price, features and specification*

The GTX 1080 will be made available on 27 May 2016
GTX 1080 will cost £619 in the UK

GTX 1070 will come slightly later with a release date of 10 June 2016

PS: GTX 1080 Founders Edition will cost a whopping £619, a ridiculous £139 difference between the US and UK price



http://www.pcadvisor.co.uk/new-prod...release-date-price-specs-new-confirm-3639751/


----------



## nem.. (May 21, 2016)




----------



## Frick (May 21, 2016)

Rautate said:


> any link?



http://cdon.se/hemelektronik/evga-g...kt_se_PC+-+Comp+-+Video+Card_home-electronics

https://www.netonnet.se/art/kompone...a-geforce-gtx1080-founders-edito/232999.8989/


----------



## Valdas (May 21, 2016)

Rautate said:


> PS: GTX 1080 Founders Edition will cost a whopping £619, a ridiculous £139 difference between the US and UK price


You forgot to account for the VAT difference. It's more like 40 pounds, not 140.
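
Back-of-the-envelope, that checks out. A sketch assuming the $699 Founders Edition US MSRP (US sticker prices exclude sales tax), a GBP/USD rate of roughly 1.46 around launch, and 20% UK VAT -- all approximate figures, not from the article:

```python
usd_price = 699.0        # assumed Founders Edition US MSRP, excludes sales tax
gbp_per_usd = 1 / 1.46   # assumed May 2016 exchange rate
uk_vat = 0.20            # UK VAT rate

pre_vat_gbp = usd_price * gbp_per_usd        # ~479 GBP before tax
with_vat_gbp = pre_vat_gbp * (1 + uk_vat)    # ~575 GBP tax-inclusive
premium = 619 - with_vat_gbp                 # ~44 GBP actual UK premium

print(round(with_vat_gbp), round(premium))
```

So once VAT is included, the UK markup over the converted US price is on the order of £40-£45, not £139.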


----------



## Rautate (May 21, 2016)

Valdas said:


> You forgot to account for VAT difference. It's more like 40 pounds, not 140.



No mate, Ebuyer.com has the price at £633! So it's no joke (too expensive).


----------



## Rautate (May 21, 2016)

Frick said:


> http://cdon.se/hemelektronik/evga-geforce-gtx-1080-founders-edition-8gb-gddr5x-dvi-hdmi-3xdisplayport-p37725125?utm_source=prisjakt&utm_medium=cpc&utm_term=EVGA+GeForce+GTX+1080+Founders+Edition++8GB+GDDR5X+DVI,+HDMI,+3xDisplayPort+NVIDIA&utm_content=37725125&utm_campaign=prisjakt_se_PC+-+Comp+-+Video+Card_home-electronics
> 
> https://www.netonnet.se/art/kompone...a-geforce-gtx1080-founders-edito/232999.8989/




Out of stock (like all sites); it will be available on 27 May 2016.


----------



## Frick (May 21, 2016)

Rautate said:


> out of stock ( like all sites ) will be available on 27 May 2016



"Listed". Not "available".


----------



## efikkan (May 21, 2016)

I would appreciate it if reviews were performed in a closed case, with a ~5 minute warmup of FurMark or something, to give a more realistic measurement of the GPU's performance. These days GPUs boost more and more aggressively, which means short benchmarks will diverge quite a lot from actual real-world performance.

The GTX 1080 is definitively the fastest card yet, but benchmarks tend to exaggerate the performance gain.


----------



## Rautate (May 22, 2016)

*Other rumors: GeForce GTX 1080 Titan, GTX 1080 Ti* >> source https://www.chiphell.com/thread-1588175-1-1.html

GTX 1080Ti from AIBs: 799 USD

GTX 1080Ti Founders Edition: 849 USD

GTX Titan (Founders Edition only): 999 USD

There's also an interesting chiphell thread with a graphics card classification table.. I don't understand who made it, or whether it's correct (I haven't checked).


----------



## Xzibit (May 22, 2016)

efikkan said:


> I would appreciate if reviews were performed in a closed case and with a ~5 minute "warmup" of FurMark or something to give a more realistic measurement of the GPU's performance. These days GPUs are more and more aggressive boosting which means that short benchmarks will diverge quite a lot from actual real-world performance.
> 
> GTX 1080 is definetively the fastest card yet, but benchmarks tend to give exaggerate the performance gain.



The closest test to do such a thing was from ComputerBase.de. They checked clocks during 4K gaming inside a case.






Toms Hardware did a heat up test on an Open Air test bed.


----------



## efikkan (May 22, 2016)

Rautate said:


> other rumors  Geforce GTX 1080 Titan , GTX 1080 Ti >> source https://www.chiphell.com/thread-1588175-1-1.html


Probably pure speculation at this point, but there _is_ in fact a GP102 in testing. The approximate size, CUDA core count, and memory configuration are, though, close to my guess. And since even GP102 wouldn't be fast enough to require HBM2 for gaming, I would actually prefer GDDR5X until big Volta arrives.

I wonder what they will call the new Titan, "Titan X" (X for 10) is already taken. So "Titan 10" then?



Xzibit said:


> The closest test to do such a thing was from ComputerBase.de  They checked clocks at 4K gaming inside a case


That's right. That review is part of the reason why I mentioned it.


----------



## Am* (May 22, 2016)

Am I the only one not impressed in the slightest with this card?

The GTX 980 came out with very similar performance gains, used the same manufacturing process and memory type as its predecessor, and was priced almost £150 less than this thing at launch... I really don't see the point of this card. The games where it beats the last-gen 980 Ti and Titan X the most already run well enough on last-gen cards (from 80 to 100 FPS etc. -- what is the point), except at 4K, where performance is still pathetic and unacceptable. The new VR features are unproven and need developer support, so they might as well be fluff as far as I'm concerned -- and still no clear explanation of how good Nvidia's Async Compute support is this gen (which makes me presume their cards are brute-forcing it for now, hoping AMD changes plans and abandons dedicating die space purely to compute).

As for their Founders Edition -- should've renamed it the Sucker's Edition with guaranteed buyer's remorse. £619 for what is, at best, a mid-range stopgap card...and I thought my Titan X was badly overpriced at £700 almost a year ago. Ridiculous...

P.S. and what's with that cooler design? Is that Nvidia's idea of trying to be "edgy"?


----------



## Rautate (May 22, 2016)

*NVIDIA Launches Tesla M10 Quad Maxwell Graphics Card – 32 GB Memory, 2560 Cores Aimed at Virtual Computing http://wccftech.com/nvidia-tesla-m10-graphics-card/*


----------



## the54thvoid (May 22, 2016)

Pascal's async is no better than Maxwell's. Some reviews have done a DX11 versus DX12 comparison and the score doesn't improve.
What should worry AMD is that exact same thing. Without dedicated hardware built specifically for async compute, Pascal is so fast, it still beats the Async specialist Fiji. I don't know if AMD can throw any more hardware at async but given Pascal beats Fury in practically all DX12 tests, despite not having great hardware for it, I'd say the pressure is on AMD.


----------



## efikkan (May 22, 2016)

the54thvoid said:


> Pascal's async is no better than Maxwell's. Some reviews have done a DX11 versus DX12 comparison and the score doesn't improve.
> What should worry AMD is that exact same thing. Without dedicated hardware built specifically for async compute, Pascal is so fast, it still beats the Async specialist Fiji. I don't know if AMD can throw any more hardware at async but given Pascal beats Fury in practically all DX12 tests, despite not having great hardware for it, I'd say the pressure is on AMD.


That is utter nonsense.

First of all, it's a total myth that AMD has better hardware for Direct3D 12. There is no such thing as specialized hardware for each API. Nvidia chose to bring the driver-level changes of Direct3D 12 to all APIs, which of course makes them show a lower relative gain from Direct3D 11 to 12. (1)(2)

The whole point of async shaders is for the shaders to utilize different resources in the GPU. Yet, AMD uses async shaders to compensate for the huge inefficiencies in their scheduler, proven by their performance gain from doing compute and rendering at the same time. The reason why Nvidia doesn't even bother isn't because their architecture can't handle it, it's because it wouldn't gain anything as both the Maxwell and Pascal schedulers are able to achieve optimal utilization. This is why Nvidia is prioritizing general optimizations rather than a feature that would give them less than 1% gain.


----------



## the54thvoid (May 22, 2016)

efikkan said:


> That is utter nonsense.
> 
> First of all, it's a total myth that AMD has better hardware than Direct3D 12. There is no such thing as specialized hardware for each API. Nvidia chose to bring the driver level changes of Direct3D 12 to all APIs, which of course makes them have a lower relative gain from Direct3D 11 to 12. (1)(2)
> 
> The whole point of async shaders is for the shaders to utilize different resources in the GPU. Yet, AMD uses async shaders to compensate for the huge inefficiencies in their scheduler, proven by their performance gain from doing compute and rendering at the same time. The reason why Nvidia doesn't even bother isn't because their architecture can't handle it, it's because it wouldn't gain anything as both the Maxwell and Pascal schedulers are able to achieve optimal utilization. This is why Nvidia is prioritizing general optimizations rather than a feature that would give them less than 1% gain.



Thank you for correcting me. So AMD should worry then and everyone should stop talking up async like it's the best thing in the world?


----------



## efikkan (May 22, 2016)

the54thvoid said:


> Thank you for correcting me. So AMD should worry then and everyone should stop talking up async like it's the best thing in the world?


AMD is in panic mode, and if the rumors of Polaris are true 2016 is going to be their worst year yet. Remember that Nvidia has still not brought their "big guns".

Async shaders will be useful for what they're intended for, but most people on the forums have no idea what their purpose really is. During the rendering of a single frame, most of the work is computationally intensive (typically >95% of the time). But some of the smaller tasks are not, such as texture compression, data transfers from system memory, video encoding/decoding, etc. Without async shaders most of the GPU will be idle during these tasks, so the sole purpose of async shaders is to run different workloads simultaneously, utilizing different GPU resources. Async shaders can be used to do things like streaming of textures without a performance penalty, but we are still talking about a 1-3% performance gain.

The reason why AMD is getting a "big" performance boost in games like Ashes of the Singularity is that their inefficient architecture leaves over 30% of the GPU idle during a single intensive task. So the game is actually compensating for the GPU's inefficiency; it's not a testament to AMD's ability to utilize async shaders. It's a solution to a problem of AMD's own making...
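
As a toy model of that argument (every number here is invented for illustration, nothing is measured): a GPU that leaves a chunk of its units idle during the main rendering task can absorb a small secondary task "for free" via async queues, while a GPU already near full utilization gains almost nothing.

```python
def frame_time(render_ms, copy_ms, idle_fraction, overlap):
    """Toy model of one frame: a big render task plus a small secondary task.
    idle_fraction is the share of the GPU left unused during rendering;
    with overlap, the idle units absorb the secondary task up to that budget."""
    if not overlap:
        return render_ms + copy_ms               # serialized: tasks run back to back
    absorbed = min(copy_ms, render_ms * idle_fraction)
    return render_ms + copy_ms - absorbed

# An architecture that is ~30% idle during rendering hides the extra work...
serial = frame_time(16.0, 2.0, 0.30, overlap=False)     # 18.0 ms
overlapped = frame_time(16.0, 2.0, 0.30, overlap=True)  # 16.0 ms

# ...while one already at ~98% utilization barely benefits.
efficient = frame_time(16.0, 2.0, 0.02, overlap=True)   # 17.68 ms

print(serial, overlapped, efficient)
```

In this sketch the "gain" from async overlap scales with how idle the GPU was to begin with, which is the crux of the argument above.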


----------



## nem.. (May 23, 2016)

efikkan said:


> AMD is in panic mode, and if the rumors of Polaris are true 2016 is going to be their worst year yet. Remember that Nvidia has still not brought their "big guns".
> 
> Async shaders will be useful for what it's intended for, but most people on the forums have no idea what the purpose of it really is. During rendering of a single frame, most of work is computational intensive (typically >95% of the time). But some of the smaller tasks are not, such as texture compression, data transfer from system memory, video encoding/decoding etc. Without async shaders most of the GPU will be idle during these tasks, so the sole purpose of async shaders is to do different workloads simultaneously utilizing different GPU resources. Async shaders can be utilized to do things like streaming of textures without a performance penalty, but we are still talking about 1-3% performance gain.
> 
> The reason why AMD is getting a "big" performance boost in games like Ashes of the Singularity is because their inefficient architecture has over 30% of the GPU idle during a single intensive task. So the game is actually compensating for the GPU's inefficiency, and it's not a testament to AMD's ability to utilize async shaders. It's used as a solution to a problem of AMD's own making...


 
ahahah still preemption


----------



## nem.. (May 23, 2016)




----------



## Kanan (May 23, 2016)

the54thvoid said:


> Good grief man. You're  the one saying wait till Vega. When there is a card out now with unequivocally higher performance metrics. My point is neither ignorant or moot. Vega will bring HBM2, so what? So will consumer GP100 and big Vega. So the whole thing of saying, ignore GP104, wait for real next gen is hypocritical.


No it's not, I see you're not getting my point. To explain it to you directly: the GTX 1080 isn't an enthusiast card, because the performance gain over the 980 Ti or Titan X is too small (it's also the successor of the GTX 980, a high-end-level card). Exactly that would change with a bigger chip like Vega, which is probably better suited for a real upgrade than the GTX 1080. You're hyping the GTX 1080, nothing more. I'm not, that's the difference between us. Read some reviews of the GTX 1080 (with OC and without) vs. custom or overclocked 980 Ti/Titan X (I like this one: http://www.pcgameshardware.de/Nvidi...5598/Specials/Benchmark-Test-Video-1195464/2/ ) - I already did, and the result is that 980 Ti / Titan X users can ignore this card and wait for the next big thing. Vega, GTX 1080 Ti, Titan Pascal, something like that. 



> I know what you are implying, that GP104 isn't proper next gen but frankly, we know so little about Vega that it may not be hugely different from Fiji. HBM is no longer new tech. HBM2 is to HBM as GDDR5X is to GDDR5.


HBM isn't really important to me; even on Fiji it was mainly there to enable lower power consumption so a card with a lower TDP was possible, the bandwidth isn't really needed (2x (CF) R9 380X have the same specs as a Fury X and manage to get along with much less bandwidth, for example). The same holds true for Vega if they decide to ship it with HBM2 instead of GDDR5X, because I think GDDR5X would be sufficient. About the architecture: what I know is that it will be something really new, whereas Polaris was something along the lines of Fiji, with small or no architectural changes while everything else got changed, so that the shaders get better utilization + HDMI 2.0a, DP 1.3 etc. 





New Command Processor: better utilization in APIs that have suboptimal usage, like DX11. 



> Long story short, if you pass GP104 because you want true next gen, you'll be upset when the proper (not crippled, again rumours) Vega part is released a few months after.  Moral of story, buy when you want to buy. And also, nobody needs any of these powerful desktop gfx cards except for work.


This is exactly what I said earlier, just in other words: buy when you want or need it. There is no fixed "plan" to buy something; I don't think anyone waits 5 months if he needs something new now. My point with the GTX 1080 is that it's not exactly a big upgrade over a GTX 980 Ti or Titan X, so it's better to wait for something like Vega. Why should I be upset? I'm not an enthusiast user, I tend to buy GPUs in the range of 300€ maximum, which I think is the best price-to-performance point. I only know that if I wanted an enthusiast GPU now, I would certainly not buy a GTX 1080; with prices of the GTX 980 Ti that low, I'd get one of those instead and have my fun with it.


----------



## cadaveca (May 23, 2016)

Am* said:


> P.S. and what's with that cooler design? Is that Nvidia's idea of trying to be "edgy"?



I think it's more meant to improve airflow when cards are in SLI and in tight spaces. That's been a big issue with running two cards on mATX boards; the top card severely overheats because of lack of airflow. Think about the parts of the backplate that can be removed... why?


----------



## Legacy-ZA (May 23, 2016)

I see a lot of RGB lighting marketing going around the board partners, I sure hope this doesn't mean the $599 bracket will be destroyed for a tiny bit of LED shiny lights. I just hope we can turn the lights off.


----------



## Am* (May 23, 2016)

cadaveca said:


> I think it's more meant to improve airflow when cards are in SLI and in close spaces. That's been a big issue with running two cards in mATX boards; top card severely overheats because of lack of airflow. Think about the parts of backplate that can be removed... why?


I meant the random sharp edges added to their stock silver/vapor chamber heatsink, not the slimmer backplate... the backplate is more than welcome.


----------



## rtwjunkie (May 23, 2016)

Am* said:


> the backplate is more than welcome



The backplate is superfluous on the reference (FE) model, as all the PCB support it needs is from the tightly affixed magnesium alloy shroud.


----------



## the54thvoid (May 23, 2016)

Kanan said:


> No it's not, I see you're not geting my point. To explain it do you directly: the GTX 1080 isn't a enthusiast card, because the performance gain over 980 Ti or Titan X is too little (also it's the successor of GTX 980, a high end level card). Exactly that would change with a  bigger chip like Vega, which is probably more suited for a real upgrade than GTX 1080. You're hyping GTX 1080, nothing more. I'am not, that's the difference between us. Read some reviews with GTX 1080 (with OC and without) vs. custom or overclocked 980 Ti/Titan X (I like this one: http://www.pcgameshardware.de/Nvidi...5598/Specials/Benchmark-Test-Video-1195464/2/ ) , I already did this and the result is that 980 Ti / Titan X users can ignore that card and wait for the next big thing. Vega, GTX 1080 Ti, Titan Pascal, something like that.
> 
> 
> HBM isn't really important to me, even for Fiji it was just for the purpose to enable less power consumption so a card with 00W TDP is possible, the bandwidth isn't really needed (2x (CF) R9 380X have same specs as a Fury X and manage to come along with much less bandwidth for example). The same holds true for Vega, if they decide to ship it with HBM2 instead GDDR5X, because I think GDDR5X would be sufficient. About the architecture, what I know is, that the architecture will be something really new, whereas Polaris was something along the lines of Fiji, small or no architectural changes with everything else geting changed, so that the shaders get better utilization + HDMI 2.0a, DP 1.3 etc.
> ...



I understand everything you are saying and trust me, I own a Kingpin, I know high end...... but....

there is no point



Kanan said:


> Wait for the Vega10 GPU, 1st 14nm true enthusiast GPU - coming to you October.  Jokes, I think the Nano is still very good, but if you need something better, wait for the Vega.



Vega 10 is being rushed forward to combat the 1080.  It might not even beat it.  Vega 11 is the proper chip.  My exception to your post is that Vega 10 will not be the leap from a 980 Ti either.  *You need to wait for Vega 11 OR big Pascal*.


----------



## cadaveca (May 23, 2016)

Am* said:


> I meant the random sharp edges added to their stock silver/vapor hamber heatsink, not the slimmer backplate...the backplate is more than welcome.


That's exactly what I was talking about.. those edges create pathways for air to flow into the fan... you might notice how they pretty much all radiate out from the fan....


----------



## Kanan (May 24, 2016)

the54thvoid said:


> Vega 10 is being rushed forward to combat 1080.  It might not even beat it.  Vega 11 is the proper chip.  My exception to your post is that vega 10 will not be the leap from a 980ti either.  *You need to wait for vega 11 OR big Pascal*.


If Vega is an HBM2 card it may very well beat the GTX 1080, as the GTX 1080 isn't that fast compared to the 980 Ti, and Vega will easily be faster than a Fury X / 980 Ti. And if all that is true, the next big thing is Vega10, not 11 - 11 only if you want to wait even longer, yes, again X months. I already told you waiting is no option for a lot of people, and that it's a moot point if you can have higher performance now. A cut-down Vega11 could very well be fast enough to warrant a purchase.


----------



## Frick (May 26, 2016)

I just noticed my GTX 760 is still in the charts!


----------



## Caring1 (May 27, 2016)

Just skimmed through the review, and this card is good, but I feel the numbers for core and memory clocks were switched.


----------



## Steevo (May 27, 2016)

So just as I thought: very limited availability, Nvidia cashing in on early adopters, and perhaps only partial availability after the 16th of next month, according to many sites.


----------



## Aquinus (May 27, 2016)

Meanwhile, AMD's stock continues to grow, now at 4.60. That's a 5% gain today. It's funny, because launching the 1080 didn't seem to do much good for nVidia's stock. Maybe the market knows something we don't?

Edit: In fact, AMD's stock gained when the date of the NDA lift for Polaris 10 reviews was released; the magic date appears to be June 29th. Maybe there really is something we don't know, but that could be me being optimistic, or maybe it's me being optimistic because the market is. Let's see how the market looks in the next week or two. A few days of optimism isn't enough to say it'll be sliced-bread good.
Source


----------



## FordGT90Concept (May 28, 2016)

It appears to be official that Pascal sucks at async shaders.  It doesn't suck as badly as Maxwell, but it still sucks:
http://wccftech.com/nvidia-gtx-1080-async-compute-detailed/




Pascal, like Maxwell, is still very much a DX11 card.  GCN is a DX12/Mantle/Vulkan card.


AMD's stock is up because they'll soon have a product they can sell to the masses at prices and in quantities NVIDIA can't compete with.  Additionally, news of the new Xbox likely bolstered it, because it translates to millions of orders for AMD.


----------



## Prima.Vera (May 28, 2016)

Are there any cards out there that can push more than a 25% OC, like nVidia advertised in their Doom demo??
So far all the cards released from various vendors are uber expensive and OC to +20% at most...


----------



## Eagleye (May 29, 2016)

Bring back those days when Wizz blocked the air vents of the card (the 290X I think) to see what it does, and the video comparisons of fan noise.

I agree we should stick with closed systems and warm up prior to benching.


----------



## Prima.Vera (May 30, 2016)

Does anyone know when we can see an SLI review for 1080 cards?


----------



## jabbadap (May 30, 2016)

FordGT90Concept said:


> It appears to be official that Pascal sucks at async shaders.  It doesn't suck as bad Maxwell but it still sucks:
> http://wccftech.com/nvidia-gtx-1080-async-compute-detailed/
> 
> 
> ...



Afaik async compute is disabled by the game engine when an Nvidia card is detected; AotS needs an update to really support the GTX 1080/1070.


----------



## mcraygsx (May 30, 2016)

Great Review, I always enjoy hi-res images of PCB. That Radeon 295x2 still performs very well against the latest 1080.


----------



## FordGT90Concept (May 31, 2016)

jabbadap said:


> Afaik async compute is disabled by the game engine when nvidia card is detected, aots need update to really support gtx1080/1070.


Makes sense.  We'll have to wait until Oxide comments on it.


----------



## sweet (Jun 3, 2016)

jabbadap said:


> Afaik async compute is disabled by the game engine when nvidia card is detected, aots need update to really support gtx1080/1070.





FordGT90Concept said:


> Makes sense.  We'll have to wait until Oxide comments on it.



LOL, isn't that a parody of PhysX?


----------



## jabbadap (Jun 3, 2016)

sweet said:


> LOL, isn't that a parody of PhysX?



Perhaps not. With Maxwell and Kepler cards the async path in the game engine makes the game unplayable (or negatively affects performance). So it's hard-coded by Oxide to disable that path when an Nvidia card ID is rendering (don't know if it's in use when an AMD card renders and the Nvidia card is just used as a slave). But seriously though, we don't know if Pascal's async compute is any better until Oxide updates its engine.


----------



## FordGT90Concept (Jun 3, 2016)

sweet said:


> LOL, isn't that a parody of PhysX?


PhysX = NVIDIA doesn't bother with it, nor share source code to make it work on competing hardware.
Async Compute = Oxide optimized the code for AMD (leave it on) and NVIDIA (disable it) to get the maximum frames per second.

TL;DR:
PhysX = no fucks given by NVIDIA
Async = doing what game developers do


----------



## D1RTYD1Z619 (Jun 4, 2016)

GreiverBlade said:


> and here it is... the "Founders Edition that costs $100 more". Am I the only one who finds Nvidia particularly despicable for that one? Custom models are (generally) far above this one in terms of cooler, and already more expensive than a stock one (everybody knows the price of a custom one will not be $599...), so why make the stock one wear the name "Founders" just to add a $100 overprice?
> 
> on the other hand, the perf is correct, though +36% is no warrant for a change over my 980 (not even for 4 GB more, nor 41% in the 4K case)
> 
> ...



Aside from the power draw, that 295X2 is still kicking ass.


----------



## 64K (Jun 5, 2016)

Well, for all the Tech Gomer Pyles who were barking all over the interwebs that Pascal would only be 20% faster than Maxwell... here it is. The GTX 1080 Ti, or whatever they call it, will run circles around my 980 Ti. What remains to be seen is whether Nvidia goes the original Titan (Kepler) route or the Titan X route. They could probably get away with charging between $1,000 and $1,300 for the Pascal Titan and scoop up a few gamers in a hurry to buy the best card out there, but will they instead offer a gimped gaming card (as the 980 Ti was for Maxwell) and leave the Titan version for those with deep pockets, like they did with Maxwell? We'll see.

Radeon Tech will drop the hammer with their Flagship 14nm gaming GPU in the near future and we will see some strong competition when the boys each pull out their big guns. Good times for PC gamers. 

For anyone that isn't in a hurry to upgrade, don't worry. This will be a resting spot for a few years with GPU tech. Wait for prices to settle down, if you can, when supply channels are fully stocked. My guess is 3 years at least before the next process/architecture.


----------



## Eroticus (Jun 6, 2016)

D1RTYD1Z619 said:


> A side from the power draw that 295x2 is still kicking ass.



Yep! I want to see Pro Duo performance vs the 295x +P.


----------



## Artas1984 (Jun 6, 2016)

For me, techpowerup reviews are the best on the internet.

1) They provide the largest number of games for benchmarking, with a vast range of resolutions.
2) The performance summary is the main reason I choose TP over other sites; it helps me compare cards not included in the tests relative to the given ones.

Can't wait for the GTX 1070 review, and I'll tell you why - very different results all over the internet right now concerning GTX 1070 vs. GTX 980 Ti vs. GTX Titan X.

"Guru3D" claims the GTX 1070 is absolutely superior to the GTX 980 Ti and as strong as or even stronger than the GTX Titan X, while "Hardware Unboxed" claims the GTX 1070 is only equal to the GTX 980 Ti and nowhere near as good as the GTX Titan X.

I wonder what kind of conclusion TP will provide.


----------



## W1zzard (Jun 6, 2016)

Artas1984 said:


> TP


Appreciate the great feedback. GTX 1070 review this week if all goes well. Nearly everyone on the Internet abbreviates us as "TPU", as do we ourselves


----------



## FordGT90Concept (Jun 7, 2016)

And because TP = Toilet Paper.


----------



## 64K (Jun 7, 2016)

Looking forward to the GTX 1070 review. I expect Nvidia will hit a home run with this one, just like the 970.


----------



## robal (Jun 9, 2016)

The big question now is:
Is 2 x GTX 1080 (SLI) enough for smooth 4K? Or do we wait for the GTX 1080 Ti?


----------



## ThomasS31 (Jun 10, 2016)

W1zzard said:


> Appreciate the great feedback. GTX 1070 review this week if all goes well. Nearly everyone on the Internet abbreviates us as "TPU", so do we ourselves



Will you be able to test custom GTX 1080s soon?

I also think your reviews are overall among the best and most thorough on the net...

I am really interested in which custom 1080 has the quietest and best cooling.
Looking forward to your tests.

Hope it will happen soon.


----------



## Rexxcastle (Jul 12, 2016)

On the Warcraft section: where in-game were you guys pulling these huge FPS numbers? I'm running an i7 4790K at 4.4 with 16 GB of RAM, a Samsung 950 Pro PCIe SSD, and an EVGA Superclocked 1080, and I'm not getting NEAR the numbers you listed at 1920x1080. That 209 FPS you had listed is far from realistic.


----------



## Mussels (Jul 12, 2016)

Rexxcastle said:


> ON the Warcraft section where in game were you guys pulling these huge FPS numbers?  IM running I7 4790k at 4.4 with 16GB ram,  Samsung 950 pro PCIe SSD, and a EVGA Superclocked 1080 and not getting NEAR those numbers you listed at 1920x1080.  That 209 FPS you had listed is far from realistic



i think they do FRAPS recordings for repeatable footage, which removes any CPU loads/limitations and goes pure GPU.


----------



## Rexxcastle (Jul 12, 2016)

Mussels said:


> i think they do FRAPS recordings for repeatable footage, which removes any CPU loads/limitations and goes pure GPU.



How is that even realistic at that point? That's like dropping a car out of a plane to test MPH.


----------



## Mussels (Jul 12, 2016)

Rexxcastle said:


> How is that even realistic at that point?  Thats like dropping a car out a plane to test MPH




i've asked about it, but never got much of a response. it might be good for comparing GPUs precisely, but the numbers have no relevance to actual gameplay that way.


----------

