
NVIDIA GeForce GTX 1080 Put Through 3DMark

So you take the only exceptions to the rule: the 770 was a refresh of the 680, merely an overclock.
And then you have the boldness to pick the 970, which is on the same node as the 780.

GTX 670 > GTX 580
GTX 470 > GTX 285
GTX 260 > 8800 ultra
8800GT 256MB > 7900GTX
7800GT > 6800 ultra
6800GT > 5950 ultra
5700 ultra > 4600 Ti

See the pattern? The x70 is ALWAYS faster than the previous gen's fastest.
No, No and No. :slap:

Thank you. That will be all.
 
As expected, I hope this doesn't come as a surprise to you.
Not surprising, considering editing a 3DMark score to gain notoriety while using an existing card has happened at every launch.
Again, as expected for GDDR5X; not that it's going to do any better than the previous GDDR5.
Does your car go faster just because you are on a racing circuit?
Umm, no, read what I said: it's well over twice the MHz, i.e. 10GHz effective memory speed! I told you this is a bogus entry, i.e. not real at all.
http://wccftech.com/micron-gddr5x-memory-analysis-nvidia-pascal-graphic-card/

the real clock rates of the memory will be the same.


So how do you explain these results showing a 1290MHz core clock difference between the two shots, and a 10GHz effective memory rate, which is over double that of the 980 Ti?
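For anyone who wants to sanity-check those memory numbers, here's a quick back-of-the-envelope in Python. The 980 Ti's 7Gbps GDDR5 on a 384-bit bus is the known spec; the 256-bit bus on the GP104 card is an assumption taken from the leaks.

Code:
# "Effective MHz" in these screenshots is really the per-pin data rate in Mbps,
# not the clock the DRAM actually runs at.

def bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    # peak bandwidth = per-pin data rate * bus width / 8 bits per byte
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gb_s(7, 384))   # GTX 980 Ti: 7 Gbps GDDR5, 384-bit      -> 336.0 GB/s
print(bandwidth_gb_s(10, 256))  # rumoured GP104: 10 Gbps GDDR5X, 256-bit (assumed) -> 320.0 GB/s

# GDDR5X reaches 10 Gbps by transferring 8 bits per command clock instead of
# GDDR5's 4, so the real command clock sits around 1250 MHz rather than doubling.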

Fun fact: THESE ARE FAKE. Some kid threw whatever numbers they wanted for the specs onto a 980 Ti result.
Don't; you will be sad once the cards are launched and definitive reviews are posted.
No, I really won't. These are 980 Tis with the results edited to fool you. Obviously it worked: you took the bait and are hooked right into their gag. Have fun with that.
 
^-- What he said. However, this is starting to feel more and more like price fixing or worse... a monopoly...

<start intercepted transcript>

.........

Lisa Su: "Hi Jen - you want to keep your price at the $599 mark for those new 104 chips?"
JHH: "That was the plan - how's Polaris?"
Lisa Su: "We figure we'll go toe to toe with 980ti so you can drop that 10% and we'll come in at $399"
JHH: "So we have maybe faster chip in GP104 and sell at top price for now, you guys get new Polaris chip snatching at our 980ti sales... sounds good to me."
Lisa Su: "Business as usual. I just got off the phone to Intel. They've gone nuts with Broadwell-E. Said the top part costs $1500 and one kidney..."
JHH: "What you pricing Zen at?"
Lisa Su: "Well, Intel's price is so basket case we think we'll do it at $599 - we should sell a million units in the first week. We're putting them in the new Nintendo."
JHH: "Ouch, well, we're working on Async on Vega so that'll suit us well, besides GP200 has an async module so at least our guys will be able to get fully functional DX12 in 2017."
Lisa Su: "Just lay off the Gameworks for a bit?"
JHH: "Yeah, sorry babes, that went a bit too far - I blame Ubisoft for that - they bought it hook line and sinker and coded for shit - my bad. Did you get my Ferrari apology present?"
Lisa Su: "I did, thank you, in my favourite colour too! Well gotta go, PR's talking silly stuff again - need to reel in Roy."
JHH: "KK, good luck!"

...click...
 
JHH: "Ouch, well, we're working on Async on Vega so that'll suit us well, besides GP200 has an async module so at least our guys will be able to get fully functional DX12 in 2017."

Oi Void, that should be Volta, not Vega :D:D:D You better edit it before the AMD sheep at WaterClosetClusterFucktech take it as a news headline: "NVIDIA shamelessly steals AMD's Vega GPU codename for their next flagship graphics card"
 
I don't know you from a hole in the ground, friend. I'll wait, if you won't pony up any proof. :)

Entirely up to you. For what it's worth, I ran a news site called flyingsuicide.net (now defunct, life got in the way) and was the first person to release accurate GTX Titan performance (quoted here http://videocardz.com/39436/nvidia-geforce-gtx-titan-real-performance-revealed and http://www.nordichardware.se/nyheter/geforce-gtx-titan-50-60-kraftfullare-aen-gtx-680.html - I was the author of the non-watermarked barchart) on 8 February 2013. That was several days before anyone else could say for sure that the X7107 score was fake. At the same time I confirmed the price at $999.

On OCN (my username there is Oj010) on 24 June 2015 I told people that Fury X would average around 1140 MHz while everybody was hoping (banking on, believing, whatever) for 1300 MHz or more. I also said at the time that voltage would make very little difference - that was months before anyone else.

Also on OCN, on 23 December 2015, I told people that Intel was going to block BCLK overclocking on non-K series CPUs.

I could go on, but those are the easiest verifiable references I can give you off the top of my head.
 
I compare the latest with the previous fastest.



So you take the only exceptions to the rule: the 770 was a refresh of the 680, merely an overclock.
And then you have the boldness to pick the 970, which is on the same node as the 780.

GTX 670 > GTX 580
GTX 470 > GTX 285
GTX 260 > 8800 ultra
8800GT 256MB > 7900GTX
7800GT > 6800 ultra
6800GT > 5950 ultra
5700 ultra > 4600 Ti

See the pattern? The x70 is ALWAYS faster than the previous gen's fastest.

That is not exactly an apples-to-apples comparison. You will note that in the past, the second tier SKU has been carved out of the same silicon as the top dog, and for the most part you are dealing with a comparable sized GPU from generation to generation. For example, the G80 (of the 8800GTX/U) compares to the GT200 of the GTX 260, the GT200B of the GTX 285 compares to the GF100 of the GTX 470. This trend is unlikely to continue as architectures bifurcate between gaming-centric and professional usage.

As an aside, your timeline is out of whack:
The succeeding second tier card following the Ti 4600 was the FX 5800 (in January 2003). The FX 5700U didn't arrive until October.
The succeeding second tier card following the FX 5950U was the GF 6800 (non-GT) in May 2004. The 6800GT didn't arrive at the second tier pricing segment until November, when the 6800 (non-GT) moved down to the $299 bracket.
The succeeding second tier card following the 7900GTX512 was the 8800GTS 640MB (G80). The 8800GT 256MB didn't arrive until very late in 2007.

Regardless of the hierarchy, your examples work because in the past the succeeding chip has been more complex than the one it replaced:

FX 5800 (NV30, 125m transistors) > Ti 4600 ( NV25 A3, 63m transistors)
GF 6800 (NV41, 222m transistors) > FX 5950U (NV38, 135m transistors)
GF 7800GT (G70/NV47, 302m transistors) > GF 6800U (NV45, 222m transistors)
8800GTS640 (G80, 681m transistors) > 7900GTX512 (G71, 278m transistors)
GTX 260 (GT200, 1400m transistors) > 8800U (G80, 681m transistors)
GTX 470 (GF100, 3100m transistors) > GTX 285 (GT200B, 1400m transistors)
GTX 670 (GK104, 3540m transistors) > GTX 580 (GF110, 3000m transistors)

We are now at a point where this is no longer true. GP104 carries fewer transistors than GM200. So, thanks to increased wafer costs, likely worse yield prediction, and a huge disparity in die area between GM200 and GP104, it is very probable that you can throw out past examples because the rules no longer apply - especially when factoring in salvage parts... and with foundry costs escalating, and GPUs evolving a degree of specialization depending upon market and workload, we probably won't be returning to "the good old days".
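To put a number on that last point, here is a quick Python pass over the list above, adding the widely reported ~7,200m transistors for GP104 against GM200's 8,000m (those two figures are not from the list, so treat them as the assumption here):

Code:
# Transistor count of each second-tier part vs the flagship it displaced (millions).
pairs = [
    ("FX 5800 (NV30)",         125, "Ti 4600 (NV25)",        63),
    ("GF 6800 (NV41)",         222, "FX 5950U (NV38)",      135),
    ("7800GT (G70)",           302, "6800U (NV45)",         222),
    ("8800GTS640 (G80)",       681, "7900GTX512 (G71)",     278),
    ("GTX 260 (GT200)",       1400, "8800U (G80)",          681),
    ("GTX 470 (GF100)",       3100, "GTX 285 (GT200B)",    1400),
    ("GTX 670 (GK104)",       3540, "GTX 580 (GF110)",     3000),
    ("GTX 1070/1080 (GP104)", 7200, "GTX 980 Ti (GM200)",  8000),
]

for newer, t_new, older, t_old in pairs:
    print(f"{newer:>24} vs {older:<22} {t_new / t_old:.2f}x")

# Every historical pairing lands well above 1.00x; GP104 vs GM200 is the first
# one below it, at 0.90x.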
 
Oi Void, that should be Volta, not Vega :D:D:D You better edit it before the AMD sheep at WaterClosetClusterFucktech take it as a news headline: "NVIDIA shamelessly steals AMD's Vega GPU codename for their next flagship graphics card"

Hell, I'll leave it at Vega - makes it even more of a close relationship. :laugh:
 
What a smart answer.
Nothing on topic? That's what I thought.

Because it's like arguing with a wall, and I'm smart enough not to do that. Everyone except you and one or two others knows that the 70 series does not always beat, and will not be beating, the previous flagship. Ever since Kepler, when the flagship is built on a higher-end chip, the 70 series cannot beat it. Nor will it this time.

But I'll not argue with the wall anymore. I will let you be upset when your desires and predictions don't come true...as if any of this is worth getting upset over.
 
What a sad day :twitch:
I've never been more disappointed in the tpu community than right now...
 
What a sad day :twitch:
I've never been more disappointed in the tpu community than right now...

Don't be sad my friend.

Random leaks ... straight to sweeping conclusions is what the TPU "community" does.

All we know for sure is GPUs are serious business, and Nvidia are mean and winning and successful, and that is most wrong!
 
For all the naysayers out here, NVIDIA just claimed during their GTX 1080 presentation that the card is faster than two 980 cards in SLI. Guess how fast the 1070 will be then...? :)))))
 
Not surprising, considering editing a 3DMark score to gain notoriety while using an existing card has happened at every launch.

Umm, no, read what I said: it's well over twice the MHz, i.e. 10GHz effective memory speed! I told you this is a bogus entry, i.e. not real at all.
http://wccftech.com/micron-gddr5x-memory-analysis-nvidia-pascal-graphic-card/




So how do you explain these results showing a 1290MHz core clock difference between the two shots, and a 10GHz effective memory rate, which is over double that of the 980 Ti?

Fun fact: THESE ARE FAKE. Some kid threw whatever numbers they wanted for the specs onto a 980 Ti result.

No, I really won't. These are 980 Tis with the results edited to fool you. Obviously it worked: you took the bait and are hooked right into their gag. Have fun with that.
Well, now that NVIDIA's presentation is over, I must ask: was there any factual information in that post of yours? Did I really take the bait? And, perhaps most importantly, did it make you sad?

For all the naysayers out here, NVIDIA just claimed during their GTX 1080 presentation that the card is faster than two 980 cards in SLI. Guess how fast the 1070 will be then...? :)))))
I only heard them mentioning 2x the performance of a Titan X, which isn't such a feat; hell, even a 290X beats a Titan X.
 
Two of these on LN2 beat four 980 Tis on LN2
10 GHz RAM will be a possibility with some of these cards
Frequencies will hit CPU-like speeds in LN2

Wow, you're right, 2114MHz boost clock under stock cooling, so indeed CPU-like frequencies could be achieved under LN2:

2114MHz clock, air cooled, running the demo. Not too shabby at all.


Product page up now: http://www.geforce.com/hardware/10series/geforce-gtx-1080

Partial specs:
2560 cores
DP 1.4 and HDMI 2.0b output
1607MHz base, 1733MHz nominal boost

$599 MSRP
$379 MSRP for GTX 1070
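Those partial specs are already enough to sanity-check the headline compute figure from the keynote; a two-line Python calc, assuming the usual 2 FLOPs per CUDA core per clock (FMA):

Code:
cores = 2560
boost_clock_ghz = 1.733
# FP32 throughput = cores * 2 FLOPs per clock (FMA) * clock
tflops = cores * 2 * boost_clock_ghz / 1000
print(f"{tflops:.2f} TFLOPS")   # ~8.87 TFLOPS at rated boost, in line with the ~9 TFLOPS NVIDIA quoted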

Guess how fast the 1070 will be then...? :)))))

They claimed it to be faster than Titan-X : http://www.techpowerup.com/forums/threads/nvidia-also-announces-faster-than-titan-x-gtx-1070.222280/
 
All we know for sure is GPUs are serious business, and Nvidia are mean and winning and successful, and that is most wrong!
After the GTX 1080/1070 presentation, I think it all went up a notch. The wccftech AMD fanboys have just gone to DEFCON 1 (comparing their collective brown-stained pants for Rorschach test candidacy).

If the minimum guaranteed boost is 1733MHz ( http://www.geforce.com/hardware/10series/geforce-gtx-1080 ) then that 2114MHz core / 11000MHz effective memory augurs well for overclocking and a raft of AIB custom cards.
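In other words (quick Python on the same numbers, with the 2114MHz demo clock taken at face value):

Code:
base, rated_boost, demo_clock = 1607, 1733, 2114   # MHz, product page + the live demo
print(f"demo clock vs rated boost: +{(demo_clock / rated_boost - 1) * 100:.0f}%")   # ~+22%
print(f"demo clock vs base clock:  +{(demo_clock / base - 1) * 100:.0f}%")          # ~+32%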
 
Yeah, just saw the 1070 article.

So a 1070 faster than a 980Ti for $380...
Not bad at all. This card could even go lower than $350 if AMD pulls two rabbits out of its hat.

Exciting times ahead!
 
So a 1070 faster than a 980Ti for $380...
Check your reading comprehension. Faster than Titan-X, not 980Ti.

SMFH....:banghead:

The level of wishful thinking a few people on here are exhibiting is astounding. :rolleyes:
 
Check your reading comprehension. Faster than Titan-X, not 980Ti.

SMFH....:banghead:

The level of wishful thinking a few people on here are exhibiting is astounding. :rolleyes:
Getting butt-hurt all of a sudden? :D :D
I know, it takes a lot for one to admit that he was wrong in front of everybody. It's simpler just to go forward with the insults and smart-assing ;)

Cheers!
 
Getting butt-hurt all of a sudden? :D :D
I know, it takes a lot for one to admit that he was wrong in front of everybody. It's simpler just to go forward with the insults and smart-assing ;)

Cheers!

Wait, where did you admit you were wrong? Because you are on record multiple times saying the 1070 would beat the 980Ti. Now that it is not the case (as I pointed out to you beforehand that it wouldn't be), you are unable to admit your wishful thinking was wrong? Wow.
 
@rtwjunkie

Congrats! You just got awarded the Troll (I really hope you are trolling and not being stupid) of the Year award.

Now move along kid, you're bothering me...
 
@rtwjunkie

Congrats! You just got awarded the Troll (I really hope you are trolling and not being stupid) of the Year award.

Now move along kid, you're bothering me...
You're oblivious. I'm way older than you. The only one who has been trolling is you, who has been practically slobbering like a rabid fanboy at the prospect that a 1070 would beat a 980Ti.

I on the other hand have attempted to be a voice of reason and rational thought. Only in your Bizarro World is the voice of reason labeled a troll.
 
That is not exactly an apples-to-apples comparison. You will note that in the past, the second tier SKU has been carved out of the same silicon as the top dog, and for the most part you are dealing with a comparable sized GPU from generation to generation. [...] We are now at a point where this is no longer true. GP104 carries fewer transistors than GM200.

Now that's an excellent explanation.
 
Gotta love TPU, we get intelligent responses, some butt hurt and a smattering of smart arse to lighten the mood :lovetpu::clap::clap::clap:
 