Monday, March 20th 2017

Gigabyte GeForce GTX 1080 Ti AORUS Xtreme Edition Graphics Card Detailed

It was only a matter of time before Gigabyte applied its custom treatment to the GeForce GTX 1080 Ti. The company has released pictures of its upcoming AORUS Xtreme Edition - its take on what is currently the world's most powerful gaming graphics card. As an AORUS-branded card, the AORUS Xtreme will feature Gigabyte's Windforce cooler (triple-slot, 3x 100 mm fans) with RGB lighting (16.8 million colors). Aiding its triple-fan cooling prowess is a direct-contact copper plate with a six-heatpipe design, as well as a built-in backplate.

The 1080 Ti AORUS Xtreme Edition has only a single VR-link HDMI port on its front corner (the GTX 1080 version had two). On the rear I/O, however, you'll find 2x HDMI ports (ideal for VR-link), 3x DisplayPort, and 1x DVI. No information on pricing or clock speeds is available at the moment, though the card is expected to hit shelves in mid-April.

Update: Clock speeds have since been revealed by Gigabyte itself. The card's OC Mode delivers a 1746 MHz boost and a 1632 MHz base clock, while its Gaming Mode lowers those to a 1721 MHz boost and a 1607 MHz base clock.
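For a rough sense of how far those factory clocks sit above NVIDIA's reference speeds, the minimal sketch below compares them against the commonly listed GTX 1080 Ti Founders Edition clocks of 1480 MHz base and 1582 MHz boost (those reference figures are an assumption here, not part of Gigabyte's announcement).

# Compare the AORUS Xtreme's announced clocks against assumed GTX 1080 Ti
# reference clocks (1480 MHz base / 1582 MHz boost, Founders Edition).
REF_BASE_MHZ, REF_BOOST_MHZ = 1480, 1582

modes = {
    "OC Mode":     {"base": 1632, "boost": 1746},
    "Gaming Mode": {"base": 1607, "boost": 1721},
}

for name, clocks in modes.items():
    base_gain = 100 * (clocks["base"] / REF_BASE_MHZ - 1)
    boost_gain = 100 * (clocks["boost"] / REF_BOOST_MHZ - 1)
    print(f"{name}: base +{base_gain:.1f}%, boost +{boost_gain:.1f}%")
# On those assumed reference figures, OC Mode works out to roughly a 10%
# factory overclock and Gaming Mode to roughly 9%.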

18 Comments on Gigabyte GeForce GTX 1080 Ti AORUS Xtreme Edition Graphics Card Detailed

#1
GhostRyder
I am really digging the new Gigabyte designs; however, in my opinion it's ruined by adding back the DVI port, as I think the outputs look much cleaner without it.
#2
Joss
The plastic shroud covers the heatsink too much; it's bound to impede airflow somewhat.
#3
peche
Thermaltake fanboy
I just realized that I love the LED saying 'GeForce GTX' more than any other name or brand... no brand logo, no fan-stop LED, nothing but 'GeForce GTX'. Adding NVIDIA's logo would be nice too.

Also, Gigabyte's cooler shroud should be a slim metallic frame... and this is a big card; I can't imagine the weight.
#4
chodaboy19
Does the overlapping fan setup work without issues? Looks pretty interesting!
#5
dj-electric
Trusting GB's fans is like trusting farts.

Anybody who has worked in a lab repairing parts and PCs knows what I'm talking about.
#6
peche
Thermaltake fanboy
Dj-ElectriCTrusting GB's fans is like trusting farts.

Anybody who has worked in a lab repairing parts and PCs knows what I'm talking about.
Well... I could say that about Zotac or some ASUS cards; I have never faced problems with their cards, though I do know about some issues with the GTX 1000 series...
#7
alucasa
Overengineered for epeen?
#8
Prima.Vera
What's with the triple slots recently???
#9
Joss
Prima.VeraWhat's with the triple slots recently???
What's with the triple fans recently??? And the RGB, and the over-the-top designs, and monster coolers for a 1060 that could do with passive cooling... etc.
The answer is: they don't design for what we want but for what they think we should want. The explanation is out of place here because it's political/historical.
#10
DecanFrost
I just got the 1080 version of this and I have to say the cooling on this card is simply awesome. I had an MSI 980 Ti and that card was at max temp in every game (locked at 84°C); with the new card it sits at about 70°C while gaming.
#11
Ja.KooLit
DecanFrostI just got the 1080 version of this and I have to say the cooling on this card is simply awesome. I had an MSI 980 Ti and that card was at max temp in every game (locked at 84°C); with the new card it sits at about 70°C while gaming.
One reason that I know of is that Pascal is a much cooler chip than Maxwell.
#12
ddferrari
GhostRyderI am really digging the new Gigabyte designs; however, in my opinion it's ruined by adding back the DVI port, as I think the outputs look much cleaner without it.
Who cares how "clean" the end of the GPU looks? No one ever complained about DVI ports until the reference 1080 Ti omitted one. Now every monkey-see-monkey-doer out there is hopping on the "Get rid of the DVI port, man!" train, although they have no idea why. It makes no difference on a card that doesn't use the reference rear-exhaust cooler.

A lot of overclockable 1440p Korean monitors have only one input: DVI - which is why they're overclockable. So no, nothing is ruined by including it, and no, I'm not going to try to use an adapter and hope it can handle my 120 Hz refresh rate. Spend a lot of time admiring the back of your PC, do you?
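As a side note on the 1440p-at-120 Hz-over-DVI point raised above: a back-of-the-envelope pixel-clock estimate (a sketch only, using assumed reduced-blanking figures rather than any specific monitor's actual timings) shows why driving these panels at 120 Hz is referred to as monitor "overclocking", and why a single-link adapter is not an option.

# Rough pixel clock needed for 2560x1440 @ 120 Hz versus the DVI limits.
# Blanking values below are assumptions in the spirit of reduced blanking,
# not the timings of any particular monitor.
H_ACTIVE, V_ACTIVE, REFRESH_HZ = 2560, 1440, 120
H_BLANK, V_BLANK = 160, 41                    # assumed blanking overheads

h_total = H_ACTIVE + H_BLANK                  # 2720 pixels per scanline
v_total = V_ACTIVE + V_BLANK                  # 1481 lines per frame
pixel_clock_mhz = h_total * v_total * REFRESH_HZ / 1e6

SINGLE_LINK_MHZ = 165                         # single-link DVI / passive adapter
DUAL_LINK_MHZ = 2 * SINGLE_LINK_MHZ           # dual-link DVI spec ceiling

print(f"Required pixel clock: ~{pixel_clock_mhz:.0f} MHz")  # ~483 MHz
print(f"Dual-link DVI spec:   {DUAL_LINK_MHZ} MHz")
print(f"Single-link limit:    {SINGLE_LINK_MHZ} MHz")

On those numbers the panel needs roughly one and a half times the dual-link DVI spec, which only works because these scaler-less monitors accept an out-of-spec pixel clock, and a 165 MHz single-link adapter falls far short of it.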
#13
GhostRyder
ddferrariWho cares how "clean" the end of the GPU looks? No one ever complained about DVI ports until the reference 1080 Ti omitted one. Now every monkey-see-monkey-doer out there is hopping on the "Get rid of the DVI port, man!" train, although they have no idea why. It makes no difference on a card that doesn't use the reference rear-exhaust cooler.

A lot of overclockable 1440p Korean monitors have only one input: DVI - which is why they're overclockable. So no, nothing is ruined by including it, and no, I'm not going to try to use an adapter and hope it can handle my 120 Hz refresh rate. Spend a lot of time admiring the back of your PC, do you?
First of all, I have mentioned it plenty, including how much better the outputs looked when the newer R9 Fury line came out and how it's really unnecessary. Second, most people who don't want it feel that way not only because it looks bad on the back of the card, but because it also blocks airflow for a tech that has become outdated and unnecessary/unused on modern setups. Third, it makes it impossible to make a card single-slot without physically modding it. I like water-cooling cards like others do, and some upgraded cards have better VRM sections/better yields for overclocking, so they are likely to be purchased even if the person is not using the cooler that comes with the card.

Most monitors in this day and age use HDMI or DisplayPort. DVI is no longer current tech and cannot handle all the new features and higher resolutions being supported. Just because some monitors are still around or still being purchased does not mean we should keep supporting the old tech. Adapters are pretty cheap and easily available if you want to keep using the outdated tech, similar to what VGA went through for a while. I know they do not work on those "overclockable" monitors, but that is a tiny niche compared to the rest of the market, which now uses HDMI or DP, and the industry is pretty much only supporting those two on modern monitors (some keep the connector, but in many cases it can't even support the monitor's full performance).

I stated it was my opinion that I did not like the look because of it and wished it was not there (it's called my opinion for a reason). Manufacturers do not need to keep antiquated tech going forever, and DVI's time is up.
#14
ddferrari
GhostRyderFirst of all, I have mentioned it plenty, including how much better the outputs looked when the newer R9 Fury line came out and how it's really unnecessary. Second, most people who don't want it feel that way not only because it looks bad on the back of the card, but because it also blocks airflow for a tech that has become outdated and unnecessary/unused on modern setups. Third, it makes it impossible to make a card single-slot without physically modding it. I like water-cooling cards like others do, and some upgraded cards have better VRM sections/better yields for overclocking, so they are likely to be purchased even if the person is not using the cooler that comes with the card.

Most monitors in this day and age use HDMI or DisplayPort. DVI is no longer current tech and cannot handle all the new features and higher resolutions being supported. Just because some monitors are still around or still being purchased does not mean we should keep supporting the old tech. Adapters are pretty cheap and easily available if you want to keep using the outdated tech, similar to what VGA went through for a while. I know they do not work on those "overclockable" monitors, but that is a tiny niche compared to the rest of the market, which now uses HDMI or DP, and the industry is pretty much only supporting those two on modern monitors (some keep the connector, but in many cases it can't even support the monitor's full performance).

I stated it was my opinion that I did not like the look because of it and wished it was not there (it's called my opinion for a reason). Manufacturers do not need to keep antiquated tech going forever, and DVI's time is up.
"...most people who don't want it is not only because it looks bad on the back of the card"
Please. It's one more jack on the back of your computer. It doesn't look bad, or good... it's just there. I doubt anyone's date walked out when they spot that ghastly DVI port. Nor did those LAN party invites just dry up.

"...it also block(s) airflow..."
Not on the non-reference design. AIB cards don't exhaust out of the case. In fact, the reference designs, sans DVI port, are hitting 84° C and throttling in reviews, while the AIB cards with DVI are staying in the 60's. No reviewer has openly questioned the card's cooling ability because it has DVI.

"...for a tech that has become outdated"
Incorrect. Korean 1440p overclockable monitors are still very much current and available now on Newegg. They all use DVI exclusively. See: Crossover 2795QHD; Pixio PX277; QNIX QX2710; Overlord Tempest X2700C; etc. They are likely no smaller a niche than 4K owners at this time.

"...and unnecessary/unused on modern setups.
1440p @120Hz is still very much a "modern" setup, and many of those monitors use DVI-D. It is quite necessary.

"DVI is no longer an updated tech and cannot handle all the new techs and higher resolutions."
It handles my 1440p 120Hz signal perfectly. As shown above, this is hardly old tech.

Thankfully AIB partners also disagree with your opinion of DVI, so I have nothing to prove here. But it rubs me the wrong way when someone basically states "I don't need it myself, so they should get rid of it all together". It's just so... Millennial.
#15
Caring1
ddferrari.... it rubs me the wrong way when someone basically states "I don't need it myself, so they should get rid of it altogether". It's just so... Millennial.
So are generalizations :slap:
Everyone is entitled to their own opinion here, just don't insist yours is the only correct one.
#16
ddferrari
Caring1So are generalizations :slap:
Everyone is entitled to their own opinion here, just don't insist yours is the only correct one.
A factual debate on a forum?
#17
GhostRyder
ddferrari"...most people who don't want it is not only because it looks bad on the back of the card"
Please. It's one more jack on the back of your computer. It doesn't look bad, or good... it's just there. I doubt anyone's date walked out when they spot that ghastly DVI port. Nor did those LAN party invites just dry up.

"...it also block(s) airflow..."
Not on the non-reference design. AIB cards don't exhaust out of the case. In fact, the reference designs, sans DVI port, are hitting 84° C and throttling in reviews, while the AIB cards with DVI are staying in the 60's. No reviewer has openly questioned the card's cooling ability because it has DVI.

"...for a tech that has become outdated"
Incorrect. Korean 1440p overclockable monitors are still very much current and available now on Newegg. They all use DVI exclusively. See: Crossover 2795QHD; Pixio PX277; QNIX QX2710; Overlord Tempest X2700C; etc. They are likely no smaller a niche than 4K owners at this time.

"...and unnecessary/unused on modern setups.
1440p @120Hz is still very much a "modern" setup, and many of those monitors use DVI-D. It is quite necessary.

"DVI is no longer an updated tech and cannot handle all the new techs and higher resolutions."
It handles my 1440p 120Hz signal perfectly. As shown above, this is hardly old tech.

Thankfully AIB partners also disagree with your opinion of DVI, so I have nothing to prove here. But it rubs me the wrong way when someone basically states "I don't need it myself, so they should get rid of it all together". It's just so... Millennial.
In order:
1: Uhh, well, unfortunately it looks cleaner without it and allows for single-slot modifications. My opinion is that it does not look good on the card and does not fit what I would intend the card for, and that's important to me!!!
2: Aftermarket cards still blow some air out the back; not all of it like blowers do, but they still push some air that way, and that is intrusive, so the point still stands.
3: You can still purchase a monitor that only supports VGA; that does not make it modern. Just because a few monitors only have that connection does not make it a necessity for everyone else. If you go online and pick a monitor at random, it's most likely going to have HDMI. Not to mention, again, adapters exist for those still on the older tech. But I would say overclocking a monitor is more niche than 4K is... (yes, I am aware monitor overclocking does not work with the adapters).
4: It is a modern resolution and refresh rate, yes, but it's using an older tech at pretty much the maximum it will ever handle unless they decide to revive the standard. It's the same thing VGA (D-SUB) went through.
5: Congrats, it's basically at its limit, so yeah, it's modern in that sense, but everything else about it is dated.

AIB partners also build parts in quantity, and all the upper NVIDIA cards had the same outputs, which included DVI, so do the math. Also, if you're going to try and insult me and say I am being a "Millennial" for stating my opinion about a graphics card design, then I think you need to look in a mirror, because:
A: You're stating your opinion is more valid than mine because you said it.
B: You're stating that because you own one of these niche monitors, they need to keep supporting it because it suits you.

So if anything, by your definition of a "Millennial" (which is actually a term mostly used for people born from the early 1980s to the early 2000s, which by that definition I am), you are acting more that way than I am. Either way, my point is proven; if you wish to continue arguing over an opinion I merely stated as my own, not directed at anyone nor intended to insult anyone, then by all means go ahead. You are almost guaranteed to receive one like from fluffmeister for every comment, for your trouble.
#18
ddferrari
GhostRyderIn order:
1: Uhh, well, unfortunately it looks cleaner without it and allows for single-slot modifications. My opinion is that it does not look good on the card and does not fit what I would intend the card for, and that's important to me!!!
2: Aftermarket cards still blow some air out the back; not all of it like blowers do, but they still push some air that way, and that is intrusive, so the point still stands.
3: You can still purchase a monitor that only supports VGA; that does not make it modern. Just because a few monitors only have that connection does not make it a necessity for everyone else. If you go online and pick a monitor at random, it's most likely going to have HDMI. Not to mention, again, adapters exist for those still on the older tech. But I would say overclocking a monitor is more niche than 4K is... (yes, I am aware monitor overclocking does not work with the adapters).
4: It is a modern resolution and refresh rate, yes, but it's using an older tech at pretty much the maximum it will ever handle unless they decide to revive the standard. It's the same thing VGA (D-SUB) went through.
5: Congrats, it's basically at its limit, so yeah, it's modern in that sense, but everything else about it is dated.

AIB partners also build parts in quantity, and all the upper NVIDIA cards had the same outputs, which included DVI, so do the math. Also, if you're going to try and insult me and say I am being a "Millennial" for stating my opinion about a graphics card design, then I think you need to look in a mirror, because:
A: You're stating your opinion is more valid than mine because you said it.
B: You're stating that because you own one of these niche monitors, they need to keep supporting it because it suits you.

So if anything, by your definition of a "Millennial" (which is actually a term mostly used for people born from the early 1980s to the early 2000s, which by that definition I am), you are acting more that way than I am. Either way, my point is proven; if you wish to continue arguing over an opinion I merely stated as my own, not directed at anyone nor intended to insult anyone, then by all means go ahead. You are almost guaranteed to receive one like from fluffmeister for every comment, for your trouble.
"Aftermarket cards still blow out the back, not all the air like blowers but they still push some air that way and that is intrusive so the point still stands."

Take a quick peek at the chart below from TechPowerUp. No AIB review from any site I've read has stated that. I'll let you take a mulligan on that one if you'd like. ;)

GPU Temperature Comparison (Idle / Load / Gaming Noise):

Gigabyte GTX 1080 Ti Xtreme Gaming (DVI): 46°C / 71°C / 33 dBA
MSI GTX 1080 Ti Gaming X (DVI): 53°C / 72°C / 35 dBA
ASUS GTX 1080 Ti STRIX OC (DVI): 46°C / 69°C / 33 dBA
GTX 1080 Ti FE (No DVI): 34°C / 84°C / 39 dBA
Titan X Pascal (No DVI): 42°C / 84°C / 39 dBA

www.techpowerup.com/reviews/Gigabyte/GTX_1080_Ti_Xtreme_Gaming/34.html
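To make the point of that chart explicit, here is a tiny sketch that averages the load temperatures quoted above for the DVI-equipped AIB cards against the DVI-less reference boards (figures copied from the table, nothing new assumed).

# Load temperatures (degrees C) taken from the comparison table above.
aib_with_dvi = {
    "Gigabyte GTX 1080 Ti Xtreme Gaming": 71,
    "MSI GTX 1080 Ti Gaming X": 72,
    "ASUS GTX 1080 Ti STRIX OC": 69,
}
reference_no_dvi = {
    "GTX 1080 Ti FE": 84,
    "Titan X Pascal": 84,
}

aib_avg = sum(aib_with_dvi.values()) / len(aib_with_dvi)
ref_avg = sum(reference_no_dvi.values()) / len(reference_no_dvi)
print(f"AIB (with DVI) average load temp: {aib_avg:.1f} C")   # ~70.7 C
print(f"Reference (no DVI) average load:  {ref_avg:.1f} C")   # 84.0 C
print(f"Delta: {ref_avg - aib_avg:.1f} C cooler on the AIB designs")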

A: You're stating your opinion is more valid than mine because you said it.

No, I'm saying my opinion is more valid than yours because the entire AIB industry's actions support my view, and don't align with your assertions at all.
The only reason they omitted DVI on reference cards is because there are major heat/throttling issues (see above chart) and they need all the help they can get, not because DVI is dead. I doubt they'd bother including the adapters if it was as antiquated as you say.

Over half the world's population now uses the internet, over 3.5 billion people. If a modest 1 in every 1,000 users has one of these monitors, that equates to 3.5 million of them out there worldwide. It's not nearly as "niche" as you claim.

B: You're stating that because you own one of these niche monitors, they need to keep supporting it because it suits you.

And you're stating that they should stop supporting DVI-D because you think the port looks icky on the back of your case. Okay... (o_O)
The difference here is that your opinion is all about you, while mine represents me and the other +/- 3.5 million folks, still using modern and desirable monitors, who would be shafted.
So, yes: Millennial