Tuesday, February 28th 2017

NVIDIA Announces the GeForce GTX 1080 Ti Graphics Card at $699

NVIDIA today unveiled the GeForce GTX 1080 Ti graphics card, its fastest consumer graphics card based on the "Pascal" GPU architecture, positioned to be more affordable than the flagship TITAN X Pascal at USD $699, with market availability from the first week of March 2017. Based on the same "GP102" silicon as the TITAN X Pascal, the GTX 1080 Ti is slightly cut down. While it features the same 3,584 CUDA cores as the TITAN X Pascal, the memory amount is lower, at 11 GB, over a slightly narrower 352-bit GDDR5X memory interface, which translates to 11 memory chips on the card. On the bright side, NVIDIA is using newer memory chips than the ones it deployed on the TITAN X Pascal; they run at 11 Gbps (GDDR5X-effective), for a memory bandwidth of 484 GB/s.
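That bandwidth figure follows directly from the bus width and the effective data rate; a minimal sketch of the arithmetic, using nothing beyond the specs quoted above:

```python
# Quick check of the quoted memory specs (illustrative arithmetic only).
bus_width_bits = 352      # 11 chips x 32-bit channels
data_rate_gbps = 11       # GDDR5X-effective speed per pin

bandwidth_gbs = bus_width_bits * data_rate_gbps / 8   # bits -> bytes
print(bandwidth_gbs)      # 484.0 GB/s
```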

Besides the narrower 352-bit memory bus, the ROP count is lowered to 88 (from 96 on the TITAN X Pascal), while the TMU count is unchanged at 224. The GPU core is clocked at a boost frequency of up to 1.60 GHz, with the ability to overclock beyond the 2.00 GHz mark. The GTX 1080 Ti also gets memory advancements not found on other "Pascal" based graphics cards: newer memory chips and an optimized memory interface running at 11 Gbps. NVIDIA has also finally announced its Tiled Rendering technology publicly; a feature it has kept under wraps since the GeForce "Maxwell" architecture, it is one of the secret sauces behind NVIDIA's lead.
Tiled Rendering brings about huge improvements in memory bandwidth utilization by splitting the render process into square screen-space tiles, instead of drawing each polygon across the whole frame. The geometry and textures of the object being processed thus stay on-chip (in the L2 cache), which reduces cache misses and memory bandwidth requirements.
Together with its lossless memory compression tech, NVIDIA expects Tiled Rendering and its companion caching scheme, Tiled Caching, to more than double, and in some cases nearly triple, the effective memory bandwidth of the GTX 1080 Ti over its physical bandwidth of 484 GB/s.
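To illustrate the idea only (a toy sketch; NVIDIA has not published how its hardware binning actually works, and the helper methods here are hypothetical), tiling amounts to finishing one small tile of the screen at a time, so that tile's working set can stay resident in cache:

```python
# Toy contrast between immediate-mode and tile-based traversal (conceptual only).
TILE = 16  # hypothetical tile size in pixels

def immediate_mode(triangles, shade):
    # Each triangle is drawn across the whole frame before the next one,
    # so framebuffer and texture data may be fetched from DRAM repeatedly.
    for tri in triangles:
        for px in tri.covered_pixels():            # hypothetical helper
            shade(px, tri)

def tiled_mode(triangles, shade, width, height):
    # Triangles are processed tile by tile; each tile is finished in one pass,
    # so its color/depth/texture working set can stay in on-chip cache.
    for ty in range(0, height, TILE):
        for tx in range(0, width, TILE):
            for tri in triangles:
                for px in tri.covered_pixels_in_tile(tx, ty, TILE):  # hypothetical helper
                    shade(px, tri)
```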
NVIDIA is making sure it doesn't run into the thermal and electrical issues of previous-generation reference-design high-end graphics cards, by deploying a new 7-phase dual-FET VRM that reduces the load (and thereby the temperature) per MOSFET. The underlying cooling solution is also improved, with a new vapor-chamber plate and a denser aluminium channel matrix.
Watt for watt, the GTX 1080 Ti should hence be up to 2.5 dBA quieter than the GTX 1080, or up to 5°C cooler. The card draws power from a combination of 8-pin and 6-pin PCIe power connectors, with the GPU's TDP rated at 220 W. The GeForce GTX 1080 Ti is designed to be anywhere between 20-45% faster than the GTX 1080 (35% on average).
The GeForce GTX 1080 Ti is widely expected to be faster than the TITAN X Pascal out of the box, despite its narrower memory bus and fewer ROPs; the higher boost clocks and 11 Gbps memory make up for the deficit. What's more, the GTX 1080 Ti will be available in custom-design boards with factory-overclocked speeds, so it will end up being the fastest consumer graphics option until there's competition.

160 Comments on NVIDIA Announces the GeForce GTX 1080 Ti Graphics Card at $699

#76
newtekie1
Semi-Retired Folder
chaosmassiveThe reason we see a weird memory configuration here is that it has 6 ROPs disabled. We know that ROPs are tied to the memory controllers, like on the GTX 970, where the 4 disabled ROPs caused the last 32-bit portion to be 'hiked' along with another ROP connected to a memory controller (3.5+0.5 GB), IIRC.

Now Nvidia has learned its lesson: to avoid a GTX 970-style fiasco, they simply disabled the last 32-bit MC completely, along with 6 ROPs, hence the 11 GB over a 352-bit bus you see now. If they tried to 'normalize' the advertising to 12 GB (11+1), it would revive the old stigma.
Not exactly. You are correct that the ROPs are tied to the memory controllers. However, nVidia didn't actually disable any ROPs on the GTX970. The GTX970 still had all 64 ROPs enabled, because it had all 8 memory controllers enabled.
It was disabling the L2 that caused the issue. Each memory controller, and its ROPs, is linked to a block of L2. When they disabled a block of L2 in the GTX970, that block's memory controller and ROPs had to be jumpered over to another block of L2.
The ROPs in the jumpered section were technically still active. However, nVidia designed their driver not to use them, because using them would have actually resulted in slower performance.

In the case of the GTX1080Ti, they likely also lowered the amount of L2. We won't know for sure, because L2 is not an advertised spec. And you are probably right that, in this case, they also just went ahead and disabled the memory controller and its associated ROPs to avoid any kind of fiasco.
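For a rough picture of how one fused-off controller produces the odd-looking 11 GB figure (a back-of-the-envelope sketch consistent with the counts in the article; the actual crossbar and L2 wiring is more complicated than this):

```python
# GP102-style partitioning per the article's figures: 12 x 32-bit memory
# controllers, each with 8 ROPs and one 1 GB GDDR5X chip (illustrative sketch).
BITS_PER_CTRL, ROPS_PER_CTRL, GB_PER_CTRL = 32, 8, 1

def gp102_config(active_controllers, gbps):
    return {
        "bus_bits": active_controllers * BITS_PER_CTRL,
        "vram_gb": active_controllers * GB_PER_CTRL,
        "rops": active_controllers * ROPS_PER_CTRL,
        "bandwidth_gbs": active_controllers * BITS_PER_CTRL * gbps / 8,
    }

print(gp102_config(12, 10))  # TITAN X Pascal: 384-bit, 12 GB, 96 ROPs, 480 GB/s
print(gp102_config(11, 11))  # GTX 1080 Ti:    352-bit, 11 GB, 88 ROPs, 484 GB/s
```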
EarthDogwuh...what? What's this 100% 80% nonsense you are talking?

DP is a digital signal. It either gets there or it doesn't. I have some of the cheapest DP cables I could find driving a 4K and a 2560x1440 monitor... DP is equal or better in every way, last I understood.
The only way I see his statement making sense is if he was using a DP -> DVI adapter. I've had some of those that really suck.

But he's really complaining about nothing for two reasons:

1.) This is just the reference output design. AIBs can change it however they want, and I'm sure some will add a DVI port.
2.) It has an HDMI port. Since DVI and HDMI use the exact same signal, he can just pick up a cheap HDMI -> DVI adapter or cable.
qubitA slightly crippled GPU and a weird 11GB RAM on their top GTX? Now that's just fugly.
Just like so many great GPUs before it.
R0H1TSeriously, I'd expect anyone on this site to be able to read the graph.

How about adding something like *at constant GPU load? If Nvidia were really smart they would've simply put the noise delta on the X axis; otherwise this is what people can interpret, & they're not wrong either ~
It doesn't matter which axis is which; the graph would read the same. However, the point nVidia was making was that the 1080Ti cooler gives lower temperatures than the 1080 cooler. So most people expect the 1080Ti line to be lower on the graph than the 1080's, not shifted a little to the left. Visually, if your point is that something is lower than something else, you orient your graph axes so that it is visually lower on the graph.

And they did put, in clear as day letters, that they were testing both at 220w.
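As a toy illustration of that orientation point (the curve values below are made up purely for the plot and are not NVIDIA's measurements), swapping which quantity sits on the Y axis changes which cooler visually reads as "lower", even though the relationship being plotted is identical:

```python
import matplotlib.pyplot as plt

# Hypothetical temperature/noise pairs at a fixed 220 W load (illustrative only).
temp_1080 = [86, 82, 78, 75, 72]    # °C
noise_1080 = [30, 32, 34, 36, 38]   # dBA
temp_ti = [84, 80, 76, 73, 70]      # "better" cooler: cooler and quieter
noise_ti = [29, 31, 33, 35, 37]

fig, (left, right) = plt.subplots(1, 2, figsize=(9, 4))

# NVIDIA-style orientation: temperature on X, noise on Y -> better cooler shifts left.
left.plot(temp_1080, noise_1080, label="GTX 1080 cooler")
left.plot(temp_ti, noise_ti, label="GTX 1080 Ti cooler")
left.set_xlabel("GPU temperature (°C)"); left.set_ylabel("Noise (dBA)"); left.legend()

# Flipped orientation: noise on X, temperature on Y -> better cooler sits visually lower.
right.plot(noise_1080, temp_1080, label="GTX 1080 cooler")
right.plot(noise_ti, temp_ti, label="GTX 1080 Ti cooler")
right.set_xlabel("Noise (dBA)"); right.set_ylabel("GPU temperature (°C)"); right.legend()

plt.tight_layout()
plt.show()
```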
#77
Air
After the "do you get lower room temperatures with a bigger cpu cooler?", the new TPU complex science challenge is understanding the nvidia cooler graph. Seriously i can even understand what people isn't understanding.

On the cooler subject, I'd like to point out that there are 2 things that favor the 1080 Ti's cooling over the 1080's:

1. No dvi port - increased flow, lower turbulence.
2. Bigger die area - higher heat transfer at same power.

I bet those two make up for most of this 5°C difference at the same power, and Nvidia changed nothing, or almost nothing.
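Point 2 is easy to put rough numbers on (a back-of-the-envelope sketch; the roughly 314 mm² GP104 and 471 mm² GP102 die sizes are assumed here, and real heat flux is far from uniform across a die):

```python
# Rough power density at the same 220 W load (illustrative only).
GP104_MM2 = 314   # GTX 1080 die area, approx.
GP102_MM2 = 471   # GTX 1080 Ti die area, approx.
POWER_W = 220

for name, area_mm2 in (("GTX 1080 (GP104)", GP104_MM2), ("GTX 1080 Ti (GP102)", GP102_MM2)):
    print(f"{name}: {POWER_W / area_mm2:.2f} W/mm^2")
# The larger die spreads the same power over ~50% more area, so the vapor
# chamber sees a lower heat flux and can hold the core a few degrees cooler.
```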
#79
EarthDog
Says 699 in what you linked.. and is correct?

Edit: you meant the 1080...in a 1080ti thread.... and didn't say it. LOL!
#80
Captain_Tom
So 35% stronger than the 1080? It's just a Titan with a different name.


Vega should have no trouble defeating this if they want it to.
#81
Fluffmeister
Yeah Vega is a beast, at least 50% faster I've heard, should cost only $400 too and come with a free HTC Vive.

Nvidia are doomed.
#82
newtekie1
Semi-Retired Folder
AirAfter the "do you get lower room temperatures with a bigger cpu cooler?", the new TPU complex science challenge is understanding the nvidia cooler graph. Seriously i can even understand what people isn't understanding.

On the cooler subject, I'd like to point out that there are 2 things that favor the 1080 Ti's cooling over the 1080's:

1. No dvi port - increased flow, lower turbulence.
2. Bigger die - higher heat transfer at same power.

I bet those two make up for most of this 5°C difference at the same power, and Nvidia changed nothing, or almost nothing.
I think the interesting thing is that everyone is arguing about the reference cooler. Something almost none of us will use, because it's still crap compared to the 3rd party coolers that will be used by the AIBs.
#83
londiste
i would not call it exactly crap. there are tradeoffs to be made with the cooler having to be a blower.
there is not much you can do to a blower-type cooler beyond what nvidia currently has on the 1080/1080ti/titanxp, at least not in a reasonable price range.
#84
ZeroFM
1080ti/titan xp pcb should be the same? I want to order a waterblock.
#85
Steevo
If anyone honestly thinks the price is due to anything other than wanting to capture market share against the competition, even from older cards and AMD...

Stop being delusional.

Also, it looks amazing, when are the actual reviews out?
#86
Captain_Tom
FluffmeisterYeah Vega is a beast, at least 50% faster I've heard, should cost only $400 too and come with a free HTC Vive.

Nvidia are doomed.
Do you realize how silly you sound? You are acting like it's insane that after 2 years AMD could make a card 50% stronger than their previous flagship.


They have done that every generation lol.
#87
londiste
Captain_TomDo you realize how silly you sound? You are acting like it's insane that after 2 years AMD could make a card 50% stronger than their previous flagship.
They have done that every generation lol.
you are right that amd should be able to do 50% over its previous flagship.
however, amd's previous flagship was the fury x. 50% on top of fury x would put performance only slightly above the gtx1080.

depending on where exactly they want vega to be it might not be enough.
#88
dalekdukesboy
OneCool11gb of VRAM screams something isn't right...W1z...Come on back my old ass up here.... Core math doesn't hold up on this lol...2+2 isn't almost 4
With modern math AKA common core math it certainly is! Probably more like -14.5 with where our math education is going. And to the 2+2 that probably = wtf you want it to!
#89
dalekdukesboy
MrGeniusOh for crying out loud. How hard can this be to understand people? They made a mistake. Plain and simple.

Here. I fix.



How difficult was that?
Ask Nvidia, they fucked up a simple graph:).
#90
theGryphon
Captain_TomDo you realize how silly you sound? You are acting like it's insane that after 2 years AMD could make a card 50% stronger than their previous flagship.


They have done that every generation lol.
In his sarcasm I bet he meant 50% faster than the 1080Ti.

In all seriousness, AMD should be able to do way more than +50% over the Fury X, which came with a gimped HBM infrastructure. My bet, though: top Vega won't beat the 1080Ti, but will come within about 10% of it.
#91
theGryphon
dalekdukesboyAsk Nvidia, they fucked up a simple graph:).
oh I so hope this was sarcasm...
#92
dalekdukesboy
Yes... yet no. As you pointed out, they didn't technically screw up the graph or get it "wrong"; it shows the proper relationship between cooler noise and temperature. So yes, I was just being funny and sarcastic. But as others have stated, it isn't an incorrect graph necessarily, but it is poorly executed, with parameters like fan speed not set or stated, which would have stopped everyone from questioning it. So yes, the graph seems accurate and I get what it is trying to say, but it did take me a moment to look at it and see what they were doing. It works, it just could have been better executed.
#93
londiste
noise level and fan speed would be on the same axis and the graph would largely be the same. maybe they chose noise as it drew straighter lines?

now that i looked closer at that temp/noise graph though, 1080@220w and 1080ti@220w is misleading as hell. the 1080's reference tdp is 180w, the 1080 ti's is 250w. even with a better cooler the actual end result will be worse.
#94
R0H1T
londistenoise level and fan speed would be on the same axis and the graph would largely be the same. maybe they chose noise as it drew straighter lines?

now that i looked closer at that temp/noise graph though, 1080@220w and 1080ti@220w is misleading as hell. the 1080's reference tdp is 180w, the 1080 ti's is 250w. even with a better cooler the actual end result will be worse.
Of course, news at 11: the 180W TDP cooler is less efficient/noisier than the 220W TDP cooler.
This is what many are ignoring; it's also the reason why the graph didn't make sense to me at first, not to mention I couldn't recall the 1080's TDP immediately.
#95
Air
dalekdukesboyYes... yet no. As you pointed out, they didn't technically screw up the graph or get it "wrong"; it shows the proper relationship between cooler noise and temperature. So yes, I was just being funny and sarcastic. But as others have stated, it isn't an incorrect graph necessarily, but it is poorly executed, with parameters like fan speed not set or stated, which would have stopped everyone from questioning it. So yes, the graph seems accurate and I get what it is trying to say, but it did take me a moment to look at it and see what they were doing. It works, it just could have been better executed.
The graph is perfect. The RPM value is meaningless; what matters is noise level and cooling performance, both shown on the graph. But, as I said before, it is not an apples-to-apples comparison because of the difference in die area and outlet design. So I'm not buying the "better cooler" claim.
#96
nickbaldwin86
throw a water block on it and you get a single slot card!!! FINALLY!! the day of DVI is over!
#97
rtwjunkie
PC Gaming Enthusiast
nickbaldwin86throw a water block on it and you get a single slot card!!! FINALLY!! the day of DVI is over!
It's not. AIBs are 99% likely to put one on there, because the vast majority of buyers have DVI monitors. Yes, you can use an adapter, but they won't want to alienate buyers.
#98
Steevo
Probably 11GB due to the die cuts directly affecting the memory interface, and Nvidia trying to avoid a much slower 12th GB.
#99
Prince Valiant
I love official graphs for this stuff, always making minor differences look huge :laugh:. The percentage comparison at the end is solid gold :roll:. It'll be interesting to see what the non-reference boards manage.
#100
GhostRyder
nickbaldwin86throw a water block on it and you get a single slot card!!! FINALLY!! the day of DVI is over!
Yea, I am thankful there's no DVI, at least on the reference card. I prefer this 1 HDMI and 3 DP design (same with AMD) well over having anything on the top.

Interested in how this thing will perform and how this cut up card handles things in the memory department. For the price, I may trade up my Titan XP for a pair of these instead of grabbing a second Titan XP (Also depends on how it overclocks).