Tuesday, February 28th 2017

NVIDIA Announces the GeForce GTX 1080 Ti Graphics Card at $699

NVIDIA today unveiled the GeForce GTX 1080 Ti graphics card, its fastest consumer graphics card based on the "Pascal" GPU architecture. Positioned to be more affordable than the flagship TITAN X Pascal at USD $699, it reaches the market in the first week of March 2017. Based on the same "GP102" silicon as the TITAN X Pascal, the GTX 1080 Ti is slightly cut down. While it features the same 3,584 CUDA cores as the TITAN X Pascal, the memory amount is lower, at 11 GB, over a slightly narrower 352-bit wide GDDR5X memory interface. This translates to 11 memory chips on the card. On the bright side, NVIDIA is using newer memory chips than the ones it deployed on the TITAN X Pascal, which run at 11 GHz (GDDR5X-effective), so the memory bandwidth works out to 484 GB/s.
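The quoted 484 GB/s figure follows directly from the bus width and the per-pin data rate; a quick arithmetic check in Python:

```python
# Peak memory bandwidth = (bus width in bytes) x (effective per-pin data rate).
bus_width_bits = 352      # 11 x 32-bit GDDR5X channels
data_rate_gbps = 11       # GDDR5X-effective rate, per pin
bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
print(bandwidth_gb_s)     # 484.0 GB/s
```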

Besides the narrower 352-bit memory bus, the ROP count is lowered to 88 (from 96 on the TITAN X Pascal), while the TMU count is unchanged at 224. The GPU core is clocked at a boost frequency of up to 1.60 GHz, with the ability to overclock beyond the 2.00 GHz mark. It gets better: the GTX 1080 Ti features certain memory advancements not found on other "Pascal" based graphics cards: a newer memory chip and an optimized memory interface running at 11 Gbps. NVIDIA's Tiled Rendering technology has also finally been announced publicly; a feature NVIDIA has kept quiet about since the GeForce "Maxwell" architecture, it is one of the secret sauces behind NVIDIA's performance lead.
The Tiled Rendering technology brings about huge improvements in memory bandwidth utilization by optimizing the render process to work on small square tiles of the screen instead of drawing whole polygons at once. Thus, the geometry and textures of a processed object stay on-chip (in the L2 cache), which reduces cache misses and memory bandwidth requirements.
Together with its lossless memory compression tech, NVIDIA expects Tiled Rendering, and its storage tech, Tiled Caching, to more than double, or even close to triple, the effective memory bandwidth of the GTX 1080 Ti, over its physical bandwidth of 484 GB/s.
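The idea can be pictured with a toy sketch (purely illustrative; the actual GP102 rasterizer design is not public). Primitives are shaded one small screen tile at a time, so each tile's working set stays cache-resident; for simplicity, "primitives" here are axis-aligned colored rectangles rather than real triangles.

```python
# Highly simplified, illustrative sketch of tile-based rendering.
# Primitives are modeled as axis-aligned rectangles (x0, y0, x1, y1, color);
# none of this reflects NVIDIA's actual (undisclosed) hardware design.
TILE = 16  # hypothetical tile size in pixels

def render_tiled(prims, width, height):
    fb = {}
    # Walk the screen one tile at a time, so each tile's geometry and
    # color data can stay resident in on-chip cache while it is shaded.
    for ty in range(0, height, TILE):
        for tx in range(0, width, TILE):
            for (x0, y0, x1, y1, color) in prims:
                # Clip the primitive's bounds to the current tile.
                cx0, cy0 = max(x0, tx), max(y0, ty)
                cx1, cy1 = min(x1, tx + TILE), min(y1, ty + TILE)
                for y in range(cy0, cy1):
                    for x in range(cx0, cx1):
                        fb[(x, y)] = color  # later primitives overwrite
    return fb

prims = [(0, 0, 20, 20, "red"), (10, 10, 30, 30, "blue")]
fb = render_tiled(prims, 32, 32)
print(fb[(5, 5)], fb[(15, 15)])  # red blue
```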
NVIDIA is making sure it doesn't run into the thermal and electrical issues of previous-generation reference design high-end graphics cards, by deploying a new 7-phase dual-FET VRM that reduces loads (and thereby temperatures) per MOSFET. The underlying cooling solution is also improved, with a new vapor-chamber plate, and a denser aluminium channel matrix.
Watt for watt, the GTX 1080 Ti will hence be up to 2.5 dBA quieter than the GTX 1080, or up to 5°C cooler. The card draws power from a combination of 8-pin and 6-pin PCIe power connectors, with the GPU's TDP rated at 220 W. The GeForce GTX 1080 Ti is designed to be anywhere between 20-45% faster than the GTX 1080 (35% on average).
The GeForce GTX 1080 Ti is widely expected to be faster than the TITAN X Pascal out of the box, despite its narrower memory bus and fewer ROPs; the higher boost clocks and 11 Gbps memory make up for the deficit. What's more, the GTX 1080 Ti will be available in custom-design boards with factory-overclocked speeds, so it will end up being the fastest consumer graphics option until there's competition.

160 Comments on NVIDIA Announces the GeForce GTX 1080 Ti Graphics Card at $699

#101
Hotobu
Just posting for posterity in this quasi tragedy called a thread. LOL @ people who can't read the most basic of graphs and thought the original was reversed. Like... how hard is it to correlate a higher noise level with higher fan speed and lower temps?

What really sucks is that this embarrassment has to be on the main page. Hard to be a reputable site when a guy with genius (lol?) in his name goes through the trouble of fixing something that isn't broken. What's even worse is that some people probably STILL haven't figured it out.
Posted on Reply
#102
EarthDog
If 256bit handled it just fine, 352 will... :)
Posted on Reply
#103
dalekdukesboy
Hotobu: Just posting for posterity in this quasi tragedy called a thread. LOL @ people who can't read the most basic of graphs and thought the original was reversed. Like... how hard is it to correlate a higher noise level with higher fan speed and lower temps?

What really sucks is that this embarrassment has to be on the main page. Hard to be a reputable site when a guy with genius (lol?) in his name goes through the trouble of fixing something that isn't broken. What's even worse is that some people probably STILL haven't figured it out.
Also he originally misread the graph and was first to be confused by it ironically..."genius" guy that is. Site is very reputable, I don't think TPU is any less or more due to certain posters and hey, maybe they just didn't have their coffee before posting:).
Posted on Reply
#104
NTM2003
Any idea when pre-orders start? I keep refreshing the Amazon page but nothing yet, not even price drops.
Posted on Reply
#105
Captain_Tom
londiste: you are right that amd should be able to do 50% over previous flagship.
however, amd's previous flagship was fury x. 50% on top of fury x would put performance at only slightly faster than gtx1080.

depending on where exactly they want vega to be it might not be enough.
Not at all true.

tpucdn.com/reviews/Gigabyte/GTX_1080_Aorus_Xtreme_Edition/images/perfrel_3840_2160.png


That's 15%+ higher than a stock 1080, and that's using a list that includes many older games. All of this is just the bare minimum too. If you actually look at the architectural enhancements and leaked specs, it could be as much as twice as strong - we just don't know yet.


The point is that the 7970 was practically twice as strong as the 6970, and the 290X was 50 - 65% stronger than the 7970. We should expect at least that much from AMD considering how long it has been.
Posted on Reply
#106
Hotobu
I was considering waiting for Vega, but now I don't see a reason to. Even if it does beat out this card I don't see that happening with a large enough performance increase or price difference to justify the wait.
Posted on Reply
#107
GhostRyder
NTM2003: any Idea when pre order starts I keep refreshing the amazon page but nothing yet not even price drops
You need to buy from Nvidia's store. They are doing the FE versions first only on the NVidia store.

Still debating if I want to trade up.
Posted on Reply
#108
TheoneandonlyMrK
Wow, I've a comment but I'm keeping it to myself, it errr... lolz lots. Enough said.
Posted on Reply
#110
NTM2003
I want the Ti, but the normal 1080 is a major upgrade from my 960. But then again, the price on the Ti made me want that more now. Can't wait, I just hope my CPU will handle it. I'm sure it will.
Posted on Reply
#111
Grings
I was expecting 799 (didnt last gen titan cost that?)

Annoyingly, Nvidia have finally gone for 1:1 $>£ currency conversion, so its £699, i was hoping it might be £649

Seems they took that opportunity to jack the price of the Titan from £1099 to £1179 too:(
Posted on Reply
#112
NTM2003
I thought they went around $1099 or $1199 when the 980ti was released
Posted on Reply
#113
Kyuuba
Perfect card to upgrade from 780ti.
Wallet ready to put one inside my case!
Posted on Reply
#114
efikkan
This is by far the greatest top consumer model Nvidia has released in ages, with a great price, 35% extra performance, outstanding efficiency and thermals, and still some decent overclocking headroom. Many of you dismiss this product, yet you create hype about Vega, which will not even compete with this one.
kruk: No founders edition tax? Only $699? 11 GB of RAM? Does anybody else feel AMD tricked them into releasing this card so early with their Vega "reveal"? They could keep charging that much for the 1080 and get more profits.
Nvidia was not tricked; in fact, the GTX 1080 Ti was postponed: it was supposed to arrive at the end of 2016 but was delayed due to supply issues. Why are so many people complaining about the memory size/bus width? The memory controllers in a GPU are separate and work independently; you can have as many 32-bit controllers as you want, it's not a technical problem.
qubit: A slightly crippled GPU and a weird 11GB RAM on their top GTX? Now that's just fugly. :shadedshu: I'll wait for the reviews and Vega before buying, but this puts me off the card and might just stick to a 1080. The thing was plenty fast anyway.
And exactly how did the 11 GB memory put you off?
It's not like Vega is going to beat this anyway.
chr0nos: maybe another gtx970 memory fiasco :wtf:
How will this be a fiasco?
None of the memory controllers or chips are crippled in any way.
Aenra: (off topic, but..)
Why the joy at no DVI?
Because the DVI port is blocking ~30% of the exhaust area of the GTX 1060/1070/1080, even though it's not that useful any more.
Posted on Reply
#115
londiste
Grings: I was expecting 799 (didnt last gen titan cost that?)
Annoyingly, Nvidia have finally gone for 1:1 $>£ currency conversion, so its £699, i was hoping it might be £649
Seems they took that opportunity to jack the price of the Titan from £1099 to £1179 too:(
nvidia has very little to do with that.
for £ prices, thank brexit. especially titan x one. nvidia actually has a press release or something about that, saying they needed to adjust prices to due £/$ rate changes.
for the rest - usd has been gaining a lot against other currencies, € is also almost 1:1 now.
efikkan: The memory controllers in a GPU are separate and work independently, you can have as many 32-bit controllers you want, it's not a technical problem.
not really separate controllers.
idea is correct though, memory controller can work with somewhat arbitrary width of memory bus. memory bus width increments are defined by the data bus width of a single memory chip, in case of gddr5(x) that is 32 bits.
nitpicky, i know. sorry. :)
Posted on Reply
#116
kruk
efikkan: This is by far the greatest top consumer model Nvidia has released in ages, with a great price, 35% extra performance, outstanding efficiency and thermals, and still some decent overclocking headroom. Many of you dismiss this product, yet you create hype about Vega, which will not even compete with this one.
Man, you sure know a lot of things about a card which hasn't been released and benchmarked yet. And Vega. Do you work in/for the GPU industry? Just curious, not accusing you of anything ...
Posted on Reply
#117
Kyuuba
I need some advice here: my monitor is 144 Hz capable, but in order to run at 144 Hz it needs the DVI port. Will the DVI adapter included in the 1080 Ti package allow my VG248QE to run at 144 Hz?
Posted on Reply
#118
efikkan
londiste: not really separate controllers.
idea is correct though, memory controller can work with somewhat arbitrary width of memory bus. memory bus width increments are defined by the data bus width of a single memory chip, in case of gddr5(x) that is 32 bits.
nitpicky, i know. sorry. :)
I'm sorry, but that's incorrect.
Modern GPUs work by having multiple separate 32-bit memory controllers, each complete with their own ROPs. GTX 1080 Ti has one of these disabled, which is why it also has fewer ROPs. This is one of the nice modular features of modern GPUs.

When a cluster of cores wants to access a block of memory it addresses the respective memory controller.
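The partitioning described above can be pictured with a toy address-interleaving scheme (hypothetical stride and mapping; NVIDIA's real address hashing is undisclosed): consecutive blocks of memory round-robin across the active controllers, so all partitions contribute bandwidth in parallel.

```python
# Illustrative sketch of interleaving addresses across independent 32-bit
# memory controllers. The stride and mapping are hypothetical; GP102's
# actual address-hashing scheme is not public.
NUM_CONTROLLERS = 11   # GTX 1080 Ti: 12 partitions on the die, one disabled
STRIDE = 256           # hypothetical interleave granularity in bytes

def controller_for(address):
    # Consecutive 256-byte blocks cycle round-robin through the active
    # controllers, spreading traffic evenly across all partitions.
    return (address // STRIDE) % NUM_CONTROLLERS

print([controller_for(a) for a in range(0, 1024, 256)])  # [0, 1, 2, 3]
```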
Posted on Reply
#119
iO
Yeah all nice and fast but 819€ for a cut down GPU in a ref design is still not cool...
Posted on Reply
#120
efikkan
iO: Yeah all nice and fast but 819€ for a cut down GPU in a ref design is still not cool...
Why does it matter if it's "cut down"?
Isn't all that matters what it actually gives you?
If Nvidia needs to add more cores/controllers/etc. to get decent yields and keep the prices low, precisely how is this bad for the end user?
Posted on Reply
#121
the54thvoid
Super Intoxicated Moderator
Captain_Tom: Not at all true.

tpucdn.com/reviews/Gigabyte/GTX_1080_Aorus_Xtreme_Edition/images/perfrel_3840_2160.png


That's 15%+ higher than a stock 1080, and that's using a list that includes many older games. All of this is just the bare minimum too. If you actually look at the architectural enhancements and leaked specs, it could be as much as twice as strong - we just don't know yet.


The point is that the 7970 was practically twice as strong as the 6970, and the 290X was 50 - 65% stronger than the 7970. We should expect at least that much from AMD considering how long it has been.
Using your own rationale, the increase in performance from a 390X to Fury X was only 30%. Vega has the same core count as Fiji, so the arch tweaks and clock speeds will be the difference. I can't see Vega being 100% faster than Fury X. Not even 75% faster. I'd love to be wrong, but the history doesn't back it up.
Posted on Reply
#122
TheoneandonlyMrK
the54thvoid: Using your own rationale, the increase in performance from a 390X to Fury X was only 30%. Vega has the same core count as Fiji. So the arch tweaks and clockspeeds will be the difference. I can't see Vega being 100% faster than Fury X. Not even 75% faster. I'd love to be wrong but the history doesn't back it up.
in some cases it will be 100%, but how many, being realistic? All yawn and no action for me, this Ti.
Posted on Reply
#123
efikkan
If 1080 Ti is boring (even though it's the most exciting high end model in recent history), then Vega is going to bore you to death.
Posted on Reply
#124
jabbadap
BTW, there's now a giveaway on the NVIDIA site:
SIGN UP FOR GEFORCE EXPERIENCE AND GET REWARDED
Being a member of the GeForce Experience community means you can receive a ton of great giveaways—from game codes to graphics cards and more!

CHECK OUT OUR NEWEST GIVEAWAY!
Want to be one of the first to experience the power of NVIDIA's new flagship GPU: the GeForce® GTX 1080 Ti? We're giving away 108 GTX 1080 Tis to lucky members of GeForce Experience. All you need to do is download, log in and opt-in with GeForce Experience 3.0. This card, based on NVIDIA Pascal™ architecture, is packed with extreme gaming horsepower, next-gen 11 Gbps GDDR5X memory, and a massive 11 GB frame buffer.

WHAT DO I NEED TO DO TO BE ELIGIBLE?
Just download and log in to GeForce Experience 3.0 and then opt-in to communications from NVIDIA. We'll notify lucky recipients by email on March 7th.
Posted on Reply
#125
Fluffmeister
efikkan: If 1080 Ti is boring (even though it's the most exciting high end model in recent history), then Vega is going to bore you to death.
Indeed, all credit to the AMD fanboys for their patience waiting for that mythical beast.
Posted on Reply