Saturday, May 7th 2016

NVIDIA GeForce GTX 1080 Specifications Released

After unveiling its (claimed) shockingly fast GeForce GTX 1080 and GTX 1070 graphics cards, NVIDIA has posted specifications of the former. The two are based on NVIDIA's swanky new 16 nm "GP104" silicon, built on its "Pascal" GPU architecture. The architecture is detailed in our older article, here. The GeForce GTX 1080 leads the pack, featuring four graphics processing clusters holding 2,560 CUDA cores. The core runs at a scorching 1607 MHz, with a GPU Boost frequency of 1733 MHz. In one of its demos, NVIDIA overclocked the chip to over 2100 MHz on its reference air cooler, and the GPU barely reached 67 °C under stress. The GTX 1080 features a 256-bit wide GDDR5X memory interface holding 8 GB of memory. The memory is clocked at 2500 MHz (10 Gbps effective), working out to a memory bandwidth of 320 GB/s.
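As a quick sanity check on the quoted figure, here is a minimal sketch of how the 256-bit bus and 10 Gbps effective data rate work out to 320 GB/s; the numbers are taken straight from the specifications above, nothing here is measured:

```python
# Back-of-the-envelope check of the GTX 1080's quoted peak memory bandwidth.
# All figures come from the spec sheet above; this is arithmetic, not a benchmark.
bus_width_bits = 256          # GDDR5X memory interface width
effective_gbps_per_pin = 10   # 2500 MHz base clock, 10 Gbps effective per pin

bandwidth_gb_s = bus_width_bits * effective_gbps_per_pin / 8  # bits -> bytes
print(f"Peak memory bandwidth: {bandwidth_gb_s:.0f} GB/s")    # 320 GB/s
```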

API support includes DirectX 12 (feature-level 12_1), OpenGL 4.5, and Vulkan. Display outputs include three DisplayPort 1.4 connectors, one HDMI 2.0b, and one dual-link DVI. The reference-design card is 10.5 inches long and double-slot; it draws power from a single 8-pin PCIe power connector, and its typical board power is rated at 180 W. With the GeForce "Pascal" family, instead of ceding multi-GPU to DirectX 12's native approach, NVIDIA developed its SLI technology further with the new SLI HB (high-bandwidth) bridge standard. It is essentially a 2-way bridge in which both SLI fingers of each card are used, doubling bandwidth between the two cards and enabling higher display resolutions and multi-monitor setups with high-resolution displays. The GeForce GTX 1080 will be available from May 27, 2016, starting at US $599. Specifications of the $379 GTX 1070 will be revealed closer to its June 10, 2016 market availability.

103 Comments on NVIDIA GeForce GTX 1080 Specifications Released

#1
Xzibit
btarunrAPI support includes DirectX 12 (feature-level 12_1), OpenGL 4.5, and Vulkan. Display outputs include three DisplayPort 1.4 connectors, one HDMI 2.0b, and one dual-link DVI. The reference-design card is 10.5 inches long and double-slot; it draws power from a single 8-pin PCIe power connector, and its typical board power is rated at 180 W. With the GeForce "Pascal" family, instead of ceding multi-GPU to DirectX 12's native approach, NVIDIA developed its SLI technology further with the new SLI HB (high-bandwidth) bridge standard. It is essentially a 2-way bridge in which both SLI fingers of each card are used, doubling bandwidth between the two cards and enabling higher display resolutions and multi-monitor setups with high-resolution displays. The GeForce GTX 1080 will be available from May 27, 2016, starting at US $599. Specifications of the $379 GTX 1070 will be revealed closer to its June 10, 2016 market availability.
Sorry for pointing this out, but I think it's important if the card doesn't support all of the features.

If you read the fine print, they're not DP 1.4 connectors; they are DP 1.2 connectors that are DP 1.3 & 1.4 ready. They don't support some features, I'm guessing:
Nvidia 1080 Specs page:
1 - 7680x4320 at 60 Hz RGB 8-bit with dual DisplayPort connectors, or 7680x4320 at 60 Hz YUV420 8-bit with one DisplayPort 1.3 connector.
2 - DisplayPort 1.2 Certified, DisplayPort 1.3/1.4 Ready.
3 - Recommendation is made based on a PC configured with an Intel Core i7 3.2 GHz processor. Pre-built systems may require less power depending on configuration.
On the bright side, it has HDMI 2.0b. It should be able to take advantage of the HDMI equivalent of DisplayPort's adaptive-sync (dynamic sync) if Nvidia chooses to support it.
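A rough bandwidth estimate shows why footnote 1 needs either dual connectors or YUV420 at 8K 60 Hz; this is a sketch with an assumed ~5% blanking overhead, not NVIDIA's own math:

```python
# Why 7680x4320 @ 60 Hz RGB 8-bit needs dual DisplayPort connectors, while
# YUV 4:2:0 8-bit fits on a single DP 1.3 link. Blanking overhead is assumed.
H, V, HZ = 7680, 4320, 60
BLANKING = 1.05                 # assumed ~5% overhead for reduced-blanking timings
DP13_PAYLOAD_GBPS = 25.92       # HBR3: 4 lanes x 8.1 Gbps, after 8b/10b encoding

pixel_rate = H * V * HZ * BLANKING  # pixels per second, including blanking
for mode, bits_per_pixel in [("RGB 8-bit", 24), ("YUV420 8-bit", 12)]:
    gbps = pixel_rate * bits_per_pixel / 1e9
    verdict = "fits one DP 1.3 link" if gbps <= DP13_PAYLOAD_GBPS else "needs two connectors"
    print(f"{mode}: {gbps:.1f} Gbps -> {verdict}")
```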
#2
the54thvoid
Super Intoxicated Moderator
XzibitOn the bright side, it has HDMI 2.0b. It should be able to take advantage of the HDMI equivalent of DisplayPort's adaptive-sync (dynamic sync) if Nvidia chooses to support it.
You have to wonder whether there couldn't be a driver hack in that instance to force adaptive sync on Nvidia cards. There have to be people with the know-how to do it.
#4
walker15130
XzibitIf you read the fine print, they're not DP 1.4 connectors; they are DP 1.2 connectors that are DP 1.3 & 1.4 ready. They don't support some features, I'm guessing
AnandtechOfficially the cards are being called “DisplayPort 1.2 Certified, DisplayPort 1.3/1.4 Ready,” the distinction being that the latter is not currently certified, though I suspect the DP 1.3/1.4 certification process may itself not be ready yet.
#5
badtaylorx
Hmmm, I wonder if it'll do * portrait with one card???
#6
MxPhenom 216
ASIC Engineer
Fuck me I need a job. I want that 1080.
#7
Tsukiyomi91
This card is one blazing beast. 2.1 GHz on air cooling and it barely touched 70 °C under load? I'm impressed.
#8
Darksword
We should all learn from history and wait for the Ti variant to come out instead.
#9
Fluffmeister
MxPhenom 216Fuck me I need a job. I want that 1080.
20 days and counting. :eek:
#11
Xzibit
walker15130Officially the cards are being called “DisplayPort 1.2 Certified, DisplayPort 1.3/1.4 Ready,” the distinction being that the latter is not currently certified, though I suspect the DP 1.3/1.4 certification process may itself not be ready yet.
That's believable for DP 1.4, which was released this March, but not so much for DP 1.3, which was released back in 2014.
#12
TheGuruStud
XzibitThat's believable for DP 1.4, which was released this March, but not so much for DP 1.3, which was released back in 2014.
Holding on to G-Sync, are they?
#13
arbiter
TheGuruStudHolding on to G-Sync, are they?
Not-so-FreeSync is an optional part of the spec, not required.
#14
TheGuruStud
arbiterNot-so-FreeSync is an optional part of the spec, not required.
Except that Nvidia uses adaptive sync on mobile, so they're full of shit.
#15
Xzibit
arbiterNot-so-FreeSync is an optional part of the spec, not required.
If you're going to be certified for DP 1.4, you have to be backwards compatible.
#16
arbiter
TheGuruStudExcept that Nvidia uses adaptive sync on mobile, so they're full of shit.
First off, you don't know that it's that and not still G-Sync; the module and all its circuitry could be integrated with the GPU.
XzibitIf you're going to be certified for DP 1.4, you have to be backwards compatible.
I guess you don't know what OPTIONAL means. Yes, OPTIONAL means you can be certified for DP 1.4 and not have adaptive sync.
#17
bug
XzibitThat's believable for DP 1.4, which was released this March, but not so much for DP 1.3, which was released back in 2014.
Read carefully: it says the certification process may not be finalized yet. The specifications for both 1.3 and 1.4 are readily available.
Tbh, I didn't catch that until the third time I read the news myself.
#18
TheGuruStud
arbiterFirst off, you don't know that it's that and not still G-Sync; the module and all its circuitry could be integrated with the GPU.
You're a very anxious apologist. Sorry, it has been known for several months that mobile G-Sync officially uses adaptive sync.

The laptop panels are not special and have nothing added to them. They're just eDP.

Deal with it. Nvidia is not your pal.
#19
medi01
Even the 1070 at its $379 is in the "way too expensive" category for me.
I can afford it, but can't justify buying it. (a mid-range card, FFS)

Yet, this might be a very good strategy for nVidia. I see hype everywhere.
The 1080 will likely create a great halo effect. (It already does, despite the lack of any real-world benchmarks whatsoever.)
AMD, judging from its Polaris 10 GPU size, cannot compete with these performance-wise.

The 1070's specs are apparently yet to be finalized, waiting on AMD's release, poised to spoil it.

Now, AMD's "we wanna go multi" approach might make perfect sense (better yelds, console like etc) but I don't see how it could help here and now.


So, so far there are no signs of AMD recovering even a bit (count me among the pessimists), which is too bad for consumers. I wish I were wrong.
#20
arbiter
medi01Even the 1070 at its $379 is in the "way too expensive" category for me.
I can afford it, but can't justify buying it. (a mid-range card, FFS)
It's pretty close to the launch price of a 970, but a little higher due to the new process node, which adds cost to the GPU. The price is about where it should be compared to the card it replaced.
#21
TheLostSwede
News Editor
I think a lot of people who are complaining about the price of these cards forget that you get not only a faster GPU, but also twice as much RAM on the 1070, and 25% more and faster RAM on the 1080, which doesn't come free for Nvidia. Memory tends to be a relatively consistent cost, and although Nvidia is clearly making sure it gets a bit of extra profit here as well, the prices aren't utterly crazy for what you get.
#22
medi01
arbiterIt's pretty close to the launch price of a 970
$380 is not pretty close to $330.
TheLostSwedeprices aren't utterly crazy
Because of some yada-yada "memory costs" you just made up, I see.
Because 8 GB is such a big increase over what, say, the 390 has. Oh wait...
#23
TheLostSwede
News Editor
medi01Because of some yada-yada "memory costs" you just made up, I see.
I didn't make shit up; 8 GB is much more expensive than 4 GB, that's a fact. It's also a fact that GDDR5X is going to be far more costly, at least for now, due to limited supply compared to GDDR5. I guess you simply don't understand the cost of things, so if you want to be mad at Nvidia for charging more for these cards, that's your problem.
#24
HumanSmoke
medi01Even the 1070 at its $379 is in the "way too expensive" category for me.
Well, that's the price of doing business with a new process node.
The cost per wafer (once yields are on par with 28nm) isn't prohibitively larger ($4,800 per 16nmFF+ wafer vs. $3,500 for 28nm), which would only add around $4 per GPU, assuming the 333 mm² estimate (1:1.25 die ratio, 16.33 mm × 20.4 mm) is accurate... but that also assumes yields would be the same between a very mature 28nm process and a very immature 16nmFF+ process, which I highly doubt is the case at present.
The other major contributor would be the overall chip design cost for a chip with a much higher transistor density and a couple of extra metal layers (at least):
But perhaps the biggest issue is cost. The average IC design cost for a 28nm device is about $30 million, according to Gartner. In comparison, the IC design cost for a mid-range 14nm SoC is about $80 million. “Add an extra 60% (to that cost) if embedded software development and mask costs are included,” Gartner’s Wang said. “A high-end SoC can be double this amount, and a low-end SoC with re-used IP can be half of the amount.” If that’s not enough, there is also a sizable jump in manufacturing costs. In a typical 11-metal-level process, there are 52 mask steps at 28nm. With an 80% fab utilization rate at 28nm, the loaded manufacturing cost is about $3,500 per 300mm wafer, according to Gartner.
At 1.3 days per lithography layer, the cycle time for a 28nm chip is about 68 days. “Add one week minimum for package testing,” Wang said. “So, the total is two-and-half months from wafer start to chip delivery.”
At 16nm/14nm, there are 66 mask steps. With an 80% fab utilization rate at 16nm/14nm, the loaded cost is about $4,800 per 300mm wafer, according to Gartner. “It takes three months from wafer start to chip delivery,” he added.
On top of that, it takes 100 engineer-years to bring out a 28nm chip design. “Therefore, a team of 50 engineers will need two years to complete the chip design to tape-out. Then, add 9 to 12 months more for prototype manufacturing, testing and qualification before production starts. That is if the first silicon works,” he said. “For a 14nm mid-range SoC, it takes 200 man-years. A team of 50 engineers will need four years of chip design time, plus add nine to 12 months for production.”
Comparing two chips of vastly different complexity, made on two different process nodes (one very mature, one very immature), and trying to justify the same end-user cost is an exercise in futility. I doubt the BoM of the assembled cards differs too greatly, although 8 Gbps GDDR5 ICs would carry a reasonable price premium over 7 Gbps chips of half the density (4 Gb, rather than the 8 Gb used for the GTX 1070).
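For anyone wanting to reproduce the rough per-die arithmetic, here is a minimal sketch using the quoted Gartner wafer prices and the 333 mm² die estimate from the post above; the gross-die approximation and the lack of a yield model are simplifying assumptions, so the exact dollar figures will shift with them:

```python
import math

# Rough per-die cost comparison based on the Gartner wafer prices quoted above.
# The die size is the estimate from the post; die packing and yield are
# simplified, so treat the output as an order-of-magnitude figure only.
WAFER_DIAMETER_MM = 300
DIE_W_MM, DIE_H_MM = 16.33, 20.4   # ~333 mm^2 estimate

def gross_dies_per_wafer(die_w, die_h, diameter=WAFER_DIAMETER_MM):
    """Common approximation: wafer area / die area, minus an edge-loss term."""
    wafer_area = math.pi * (diameter / 2) ** 2
    die_area = die_w * die_h
    return int(wafer_area / die_area - math.pi * diameter / math.sqrt(2 * die_area))

dies = gross_dies_per_wafer(DIE_W_MM, DIE_H_MM)
for node, wafer_cost in (("28nm", 3500), ("16nmFF+", 4800)):
    print(f"{node}: ~{dies} gross dies per wafer, ~${wafer_cost / dies:.0f} per die before yield loss")
```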
#25
R-T-B
The 980 debuted in the same price segment. I'm not finding this any worse.