Thursday, May 5th 2016

NVIDIA GeForce GTX 1080 Put Through 3DMark

Some of the first 3DMark performance numbers of NVIDIA's upcoming GeForce GTX 1080 graphics card made it to Futuremark's online database. The results pages hint at samples of the GTX 1080 running on early drivers, on two separate machines (likely from two different sources). The first source, who ran the card on a machine with a Core i7-5820K processor, scored P19005 in 3DMark 11 (performance preset). The second source, who ran the card on a machine with a Core i7-3770K processor, scored 8959 points in 3DMark FireStrike Extreme. Both scores point to the GTX 1080 being faster than a GTX 980 Ti.
Source: VideoCardz

163 Comments on NVIDIA GeForce GTX 1080 Put Through 3DMark

#126
yogurt_21
truth telleras expected, I hope this doesn't come as a surprise to you
Not considering that editing a 3DMark score to gain notoriety while using an existing card has happened every launch.
truth telleragain, as expected for GDDR5X; not that it's gonna do any better than the previous GDDR5
does your car go faster just because you are on a racing circuit?
Umm, no, read what I said: it's well over twice the MHz, i.e. a 10 GHz effective memory speed! I told you this is a bogus entry, i.e. not real at all.
wccftech.com/micron-gddr5x-memory-analysis-nvidia-pascal-graphic-card/
The real clock rates of the memory will be the same.

So how do you explain these results showing a 1290 MHz core clock difference between the two shots, and a 10 GHz effective memory rate, which is over double that of the 980 Ti?
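The disagreement here hinges on how "effective" memory speed is quoted. A hedged sketch of the arithmetic, assuming the usual transfers-per-clock multipliers for GDDR5 and GDDR5X; the clock values are illustrative, not confirmed specs:

```python
# GDDR5 transfers data at 4x the command clock ("quad data rate"); GDDR5X
# doubles the prefetch, giving 8 transfers per clock. So the same real clock
# yields roughly double the quoted "effective" speed.

def effective_rate_mhz(real_clock_mhz: float, transfers_per_clock: int) -> float:
    """Marketing-style effective memory speed in MHz (MT/s per pin)."""
    return real_clock_mhz * transfers_per_clock

# Illustrative clocks, not confirmed specs:
print(effective_rate_mhz(1753, 4))  # 980 Ti GDDR5: ~7012 ("7 GHz effective")
print(effective_rate_mhz(1251, 8))  # GDDR5X at a similar real clock: ~10008 ("10 GHz effective")
```

This is why a "10 GHz" figure does not by itself require the real memory clock to double.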

Fun fact: THESE ARE FAKE. Some kid threw up whatever numbers they wanted for the specs onto a 980 Ti result.
truth tellerdon't, you will be sad once the cards are launched and definitive reviews are posted
No, I really won't; these are 980 Tis with the results edited to fool you. Obviously it worked: you took the bait and are hooked right into their gag. Have fun with that.
Posted on Reply
#127
the54thvoid
Super Intoxicated Moderator
Prima.Vera^-- What he said. However, this is starting to feel more and more like a price fixation or worst...monopoly...
<start intercepted transcript>

.........

Lisa Su: "Hi Jen - you want to keep your price at the $599 mark for those new 104 chips?"
JHH: "That was the plan - how's Polaris?"
Lisa Su: "We figure we'll go toe to toe with 980ti so you can drop that 10% and we'll come in at $399"
JHH: "So we have maybe faster chip in GP104 and sell at top price for now, you guys get new Polaris chip snatching at our 980ti sales... sounds good to me."
Lisa Su: "Business as usual. I just got off the phone to Intel. They've gone nuts with Broadwell-E. Said the top part costs $1500 and one kidney..."
JHH: "What you pricing Zen at?"
Lisa Su: "Well, Intel's price is so basket case we think we'll do it at $599 - we should sell a million units in the first week. We're putting them in the new Nintendo."
JHH: "Ouch, well, we're working on Async on Vega so that'll suit us well, besides GP200 has an async module so at least our guys will be able to get fully functional DX12 in 2017."
Lisa Su: "Just lay off the Gameworks for a bit?"
JHH: "Yeah, sorry babes, that went a bit too far - I blame Ubisoft for that - they bought it hook line and sinker and coded for shit - my bad. Did you get my Ferrari apology present?"
Lisa Su: "I did, thank you, in my favourite colour too! Well gotta go, PR's talking silly stuff again - need to reel in Roy."
JHH: "KK, good luck!"

...click...
Posted on Reply
#128
okidna
the54thvoid<start intercepted transcript> <snip: transcript quoted in full in post #127 above> ...click...
Oi Void, that should be Volta, not Vega :D:D:D You better edit it before the AMD sheep at WaterClosetClusterFucktech take it as a news headline: "NVIDIA shamelessly steals AMD's Vega GPU codename for their next flagship graphics card"
Posted on Reply
#129
[XC] Oj101
EarthDogI don't know you from a hole in the ground friend. I'll wait if you won't pony up any proof. :)
Entirely up to you. For what it's worth, I ran a news site called flyingsuicide.net (now defunct; life got in the way) and was the first person to release accurate GTX Titan performance figures (quoted here: videocardz.com/39436/nvidia-geforce-gtx-titan-real-performance-revealed and www.nordichardware.se/nyheter/geforce-gtx-titan-50-60-kraftfullare-aen-gtx-680.html - I was the author of the non-watermarked bar chart) on 8 February 2013. That was several days before anyone else could say for sure that the X7107 score was fake. At the same time I confirmed the price at $999.

On OCN (my username there is Oj010), on 24 June 2015, I told people that Fury X would average around 1140 MHz while everybody was hoping for (banking on, believing in, whatever) 1300 MHz or more. I also said at the time that voltage would make very little difference - that was months before anyone else.

Also on OCN, on 23 December 2015, I told people that Intel was going to block BCLK overclocking on non-K series CPUs.

I could go on, but those are the most easily verifiable references I can give you off the top of my head.
Posted on Reply
#130
N3M3515
rtwjunkieNo, No and No. :slap:

Thank you. That will be all.
What a smart answer.
Nothing on topic? That's what I thought.
Posted on Reply
#131
HumanSmoke
N3M3515I compare the latest with the previous fastest.



So you take the only exceptions to the rule: the 770 was a refresh of the 680, merely an overclock.
And then you have the boldness to pick the 970, which is on the same node as the 780.

GTX 670 > GTX 580
GTX 470 > GTX 285
GTX 260 > 8800 ultra
8800GT 256MB > 7900GTX
7800GT > 6800 ultra
6800GT > 5950 ultra
5700 ultra > 4600 Ti

See the pattern? The x70 is ALWAYS faster than the previous gen's fastest.
That is not exactly an apples-to-apples comparison. You will note that in the past, the second tier SKU has been carved out of the same silicon as the top dog, and for the most part you are dealing with a comparable sized GPU from generation to generation. For example, the G80 (of the 8800GTX/U) compares to the GT200 of the GTX 260, the GT200B of the GTX 285 compares to the GF100 of the GTX 470. This trend is unlikely to continue as architectures bifurcate between gaming-centric and professional usage.

As an aside, your timeline is out of whack:
The succeeding second-tier card following the Ti 4600 was the FX 5800 (in January 2003). The FX 5700U didn't arrive until October.
The succeeding second-tier card following the FX 5950U was the GF 6800 (non-GT) in May 2004. The 6800GT didn't arrive at the second-tier pricing segment until November, when the 6800 (non-GT) moved down to the $299 bracket.
The succeeding second-tier card following the 7900GTX512 was the 8800GTS 640MB (G80). The 8800GT 256MB didn't arrive until very late in 2007.

Regardless of the hierarchy, your examples work because in the past the succeeding chip has been more complex than the one it replaced:

FX 5800 (NV30, 125m transistors) > Ti 4600 ( NV25 A3, 63m transistors)
GF 6800 (NV41, 222m transistors) > FX 5950U (NV38, 135m transistors)
GF 7800GT (G70/NV47, 302m transistors) > GF 6800U (NV45, 222m transistors)
8800GTS640 (G80, 681m transistors) > 7900GTX512 (G71, 278m transistors)
GTX 260 (GT200, 1400m transistors) > 8800U (G80, 681m transistors)
GTX 470 (GF100, 3100m transistors) > GTX 285 (GT200B, 1400m transistors)
GTX 670 (GK104, 3540m transistors) > GTX 580 (GF110, 3000m transistors)

We are now at a point where this is no longer true. GP104 carries fewer transistors than GM200. So, thanks to increased wafer costs, likely worse yield prediction, and a huge disparity in die area between GM200 and GP104, it is very probable that you can throw out past examples because the rules no longer apply - especially when factoring in salvage parts. And with foundry costs escalating, and GPUs evolving a degree of specialization depending upon market and workload, we probably won't be returning to "the good old days".
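The complexity jump in each pairing above reduces to a simple ratio. A quick sketch using the transistor counts quoted in the post (millions, as stated there):

```python
# (succeeding second-tier chip, previous flagship) transistor counts in
# millions, as quoted above. Each ratio > 1 shows the newer chip was the
# more complex one - the pattern GP104 vs GM200 would break.
pairs = {
    "FX 5800 > Ti 4600":       (125, 63),
    "GF 6800 > FX 5950U":      (222, 135),
    "7800GT > 6800U":          (302, 222),
    "8800GTS640 > 7900GTX512": (681, 278),
    "GTX 260 > 8800U":         (1400, 681),
    "GTX 470 > GTX 285":       (3100, 1400),
    "GTX 670 > GTX 580":       (3540, 3000),
}

for name, (newer, older) in pairs.items():
    print(f"{name}: {newer / older:.2f}x")
```

Every historical ratio comes out above 1.0; a GP104-vs-GM200 entry would be the first below it.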
Posted on Reply
#132
the54thvoid
Super Intoxicated Moderator
okidnaOi Void, that should be Volta, not Vega :D:D:D You better edit it before AMD sheep at WaterClosetClusterFucktech taking it as a news headline : "NVIDIA shamelessly steal AMD Vega GPU codename for their next flagship graphic card"
Hell, I'll leave it at Vega - makes it even more of a close relationship. :laugh:
Posted on Reply
#133
rtwjunkie
PC Gaming Enthusiast
N3M3515What a smart answer.
Nothing on topic? that's what i thought.
Because it's like arguing with a wall, and I'm smart enough not to do that. Everyone except you and one or two others knows that the 70 series does not always beat, and will not always beat, the previous flagship. Ever since Kepler, when the flagship is built on a higher-end chip, the 70 series cannot beat it. Nor will it this time.

But I'll not argue with the wall anymore. I will let you be upset when your desires and predictions don't come true...as if any of this is worth getting upset over.
Posted on Reply
#134
Dethroy
What a sad day :twitch:
I've never been more disappointed in the tpu community than right now...
Posted on Reply
#135
Fluffmeister
DethroyWhat a sad day :twitch:
I've never been more disappointed in the tpu community than right now...
Don't be sad my friend.

Random leaks ... straight to sweeping conclusions is what the TPU "community" does.

All we know for sure is GPUs are serious business, and Nvidia are mean and winning and successful, and that is most wrong!
Posted on Reply
#136
Prima.Vera
For all the naysayers out here, nVidia just claimed during their GTX 1080 presentation that the card is faster than 2x 980 cards in SLI. Guess how fast the 1070 will be then...? :)))))
Posted on Reply
#137
truth teller
yogurt_21<snip: post #126 quoted in full above>
Well, now that nvidia's presentation is over I must ask: was there any factual information in that post of yours? Did I really take the bait? And, perhaps most importantly, did it make you sad?
Prima.VeraFor all no-sayers out here, nVidia just claimed over their GTX 1080 presentation that the card is faster than 2x980 cards in SLI. Guess how fast the 1070 will be then...? :)))))
I only heard them mentioning 2x the performance of a Titan X, which isn't such a feat; hell, even a 290X beats a Titan X.
Posted on Reply
#138
okidna
[XC] Oj101Two of these on LN2 beat four 980 Tis on LN2
10 GHz RAM will be a possibility with some of these cards
Frequencies will hit CPU-like speeds in LN2
Wow, you're right: 2114 MHz boost clock under stock cooling, so indeed CPU-like frequencies could be achieved under LN2:
HumanSmoke2114MHz clock air cooled running the demo. Not too shabby at all.


Product page up now: www.geforce.com/hardware/10series/geforce-gtx-1080

Partial specs:
2560 cores
DP 1.4 and HDMI 2.0b output
1607MHz base, 1733MHz nominal boost

$599 MSRP
$379 MSRP for GTX 1070
Prima.VeraGuess how fast the 1070 will be then...? :)))))
They claimed it to be faster than Titan-X : www.techpowerup.com/forums/threads/nvidia-also-announces-faster-than-titan-x-gtx-1070.222280/
Posted on Reply
#139
HumanSmoke
FluffmeisterAll we know for sure is GPU's are serious business, and Nvidia are mean and winning and successful and that is most wrong!
After the GTX 1080/1070 presentation, I think it all went up a notch. The wccftech AMD fanboys have just gone to DEFCON 1 (comparing their collective brown-stained pants for Rorschach test candidacy).

If the minimum guaranteed boost is 1733MHz ( www.geforce.com/hardware/10series/geforce-gtx-1080 ) then that 2114MHz core / 11000MHz effective memory augurs well for overclocking and a raft of AIB custom cards.
Posted on Reply
#140
Prima.Vera
Yeah, just saw the 1070 article.

So a 1070 card faster than a 980Ti for $380...
Not bad at all. This card could even go lower than $350 if AMD pulls two rabbits from its hat.

Exciting times ahead!
Posted on Reply
#141
rtwjunkie
PC Gaming Enthusiast
Prima.VeraSo a 1070 card faster than a 980Ti for $380...
Check your reading comprehension. Faster than Titan-X, not 980Ti.

SMFH....:banghead:

The level of wishful thinking a few people on here are exhibiting is astounding. :rolleyes:
Posted on Reply
#142
Prima.Vera
rtwjunkieCheck your reading comprehension. Faster than Titan-X, not 980Ti.

SMFH....:banghead:

The level of wishful thinking a few people on here are exhibiting is astounding. :rolleyes:
Getting butt-hurt all of a sudden? :D :D
I know, it takes a lot for one to admit that he was wrong in front of everybody. It's simpler just to go forward with the insults and smart-assing ;)

Cheers!
Posted on Reply
#143
rtwjunkie
PC Gaming Enthusiast
Prima.VeraGetting butt-hurt all of a sudden? :D :D
I know, it takes a lot for one to admit that he was wrong in front of everybody. It's simpler just to go forward with the insults and smart-assing ;)

Cheers!
Wait, where did you admit you were wrong? Because you are on record multiple times saying the 1070 would beat the 980Ti. Now that it is not the case (as I pointed out beforehand that it wouldn't be), you are unable to admit your wishful thinking was wrong? Wow.
Posted on Reply
#144
Prima.Vera
@rtwjunkie

Congrats! You just got awarded the Troll (I really hope you are trolling and not being stupid) of the Year award.

Now move along kid, you're bothering me...
Posted on Reply
#145
rtwjunkie
PC Gaming Enthusiast
Prima.Vera@rtwjunkie

Congrats! You just got awarded the Troll (I really hope you are trolling and not being stupid) of the Year award.

Now move along kid, you're bothering me...
You're oblivious. I'm way older than you. The only one who has been trolling is you, who has been practically slobbering like a rabid fanboy at the prospect that a 1070 would beat a 980Ti.

I, on the other hand, have attempted to be a voice of reason and rational thought. Only in your Bizarro-World is the voice of reason labeled a troll.
Posted on Reply
#146
N3M3515
HumanSmokeThat is not exactly an apples-to-apples comparison. <snip: post #131 quoted in full above>
Now that's an excellent explanation.
Posted on Reply
#147
EarthDog
[XC] Oj101Two of these on LN2 beat four 980 Tis on LN2
10 GHz RAM will be a possibility with some of these cards
Frequencies will hit CPU-like speeds in LN2
:respect: :fear:
Posted on Reply
#149
Caring1
Gotta love TPU: we get intelligent responses, some butt-hurt, and a smattering of smart-arse to lighten the mood :lovetpu::clap::clap::clap:
Posted on Reply
#150
Vayra86
the54thvoidExpecting a company to try harder because people elect to band together and boycott their product does a few things:
1) Share price crash.
2) Your pious campaign leads to job layoffs and cuts in R&D.
3) The product actually gets worse.
4) The competition sees the opportunity to make profit for its shareholders and raises its own prices due to market conditions.
5) The slide of the other company continues for years, allowing the other to profit even more.

Does that sound familiar? Oh yeah - it's what happened, in a sense, to AMD. You are so outrageously far away from business reality it's almost comical. Big business does NOT listen to its consumers - it is dictated to by its shareholders, to whom it owes everything. Shareholders demand a return on investment, and that is acquired through profiteering at our expense. That is capitalism. I don't like it but I understand it.

The only, and I mean THE ABSOLUTE ONLY, incentive for Nvidia to lower its prices is when AMD has the stand-out best gfx chip and prices it in such a way that Nvidia will only make sales if it lowers its profit margin.

Stop blaming Nvidia for making profit for its shareholders - blame the absolute lack of high-end competition from AMD. And yes, I am actually very surprised that Fury X didn't claw that back a lot more than it did, because Fury X is a great gfx card, but AMD priced it (initially) at the same price and hey, guess what - Nvidia didn't need to make their product cheaper. AMD got screwed as soon as they released the HD7970 at the inflated price they did. That opened Nvidia's floodgate of overpriced 104 chips, made worse by luxury-priced 100 chips.

I'm not having a go at you, Zone, I'm really not - but as admirable as your stance is for better consumer prices, the market reality of capitalist economics doesn't give a shit. Until AMD match Nvidia stride for stride and make their product MORE desirable, Nvidia prices won't budge.

I studied a module on economics at uni, so I know enough to see the unfortunate picture. I do think, unless Nvidia pulls a rabbit out of the hat, that AMD might just start to get a snowball effect if they push the DX12 and GCN message enough. But they also need to push developer adoption of large-queue Async because, with Pascal's rumoured clock speeds, it looks like they might be trying to brute-force Async until Vega.
Thank you, sir, for the dose of realism.

People seem to think they live in Utopia sometimes, especially when the hype train for a new GPU release is starting again. It never ceases to amaze me. Just a month ago everyone 'needed HBM2'. :rolleyes:
Posted on Reply