Monday, March 20th 2017
Gigabyte GeForce GTX 1080 Ti AORUS Xtreme Edition Graphics Card Detailed
It was only a matter of time before Gigabyte applied its custom treatment to the GeForce GTX 1080 Ti. The company has released some pictures of its upcoming AORUS Xtreme Edition, its take on what is currently the world's most powerful gaming graphics card. As an AORUS-branded card, the AORUS Xtreme will feature Gigabyte's Windforce cooler (triple-slot, 3x 100 mm fans) with RGB lighting (16.8 million colors). Aiding its triple-fan cooling prowess are a direct-contact copper base with a six-heatpipe design, as well as a built-in backplate.
The 1080 Ti AORUS Xtreme Edition has only a single VR-link HDMI port on its front corner (the GTX 1080 version had two). On the rear I/O, however, you'll find 2x HDMI (ideal for VR-link), 3x DisplayPort, and 1x DVI. No information on pricing or clock speeds is available at the moment, though the card is expected to hit shelves in mid-April.
Update: Clock speeds have been revealed by Gigabyte itself; the card's OC Mode runs a 1746 MHz boost and a 1632 MHz base clock, while its Gaming Mode lowers those to a 1721 MHz boost and a 1607 MHz base clock.
18 Comments on Gigabyte GeForce GTX 1080 Ti AORUS Xtreme Edition Graphics Card Detailed
Also, Gigabyte's cooler shroud should be a slim metallic frame... also, this is a big card; I can't imagine the weight.
Anybody who has worked in a lab repairing parts and PCs knows what I'm talking about.
The answer is: they don't design for what we want, but for what they think we should want. The full explanation is out of place here because it's political/historical.
A lot of overclockable 1440p Korean monitors have only one input, DVI, which is why they're overclockable. So no, nothing is ruined by including it, and no, I'm not going to try an adapter and hope it can handle my 120Hz refresh rate. Spend a lot of time admiring the back of your PC, do you?
Most monitors in this day and age use HDMI or DisplayPort. DVI is no longer an actively updated tech and cannot handle all the newer features and higher resolutions now being supported. Just because some monitors with it are still around or still being purchased does not mean we should keep supporting the old tech. Adapters are cheap and easily available if you want to keep using the outdated connector, much as they were for VGA for a while. I know they do not work on those "overclockable" monitors, but that is a tiny niche compared to the rest of the market, which now uses HDMI or DP, and the industry pretty much only supports those two on modern monitors (some keep the connector around, but in many cases it cannot even drive the monitor at its full performance).
I stated it was my opinion that I did not like the look because of it and wished it were not there (it's called my opinion for a reason). Manufacturers do not need to keep an antiquated tech going forever, and DVI's time is up.
Please. It's one more jack on the back of your computer. It doesn't look bad, or good... it's just there. I doubt anyone's date walked out when they spotted that ghastly DVI port. Nor did those LAN party invites dry up.
"...it also block(s) airflow..."
Not on the non-reference designs. AIB cards don't exhaust out of the case. In fact, the reference designs, sans DVI port, are hitting 84°C and throttling in reviews, while the AIB cards with DVI are staying in the 60s. No reviewer has openly questioned the card's cooling ability because it has DVI.
"...for a tech that has become outdated"
Incorrect. Korean 1440p overclockable monitors are still very much current and available now on Newegg. They all use DVI exclusively. See: Crossover 2795QHD; Pixio PX277; QNIX QX2710; Overlord Tempest X2700C; etc. They are likely no smaller a niche than 4K owners at this time.
"...and unnecessary/unused on modern setups.
1440p @120Hz is still very much a "modern" setup, and many of those monitors use DVI-D. It is quite necessary.
"DVI is no longer an updated tech and cannot handle all the new techs and higher resolutions."
It handles my 1440p 120Hz signal perfectly. As shown above, this is hardly old tech.
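For anyone wondering how that works out, here's a rough back-of-the-envelope pixel-clock sketch in Python. The blanking figures are assumptions in the spirit of reduced-blanking timings (the monitor's actual EDID will differ), and 2x165 MHz is just dual-link DVI's nominal spec limit:

```python
# Rough pixel-clock estimate for 2560x1440 at various refresh rates.
# H/V blanking values below are assumptions; real monitor timings vary.
H_ACTIVE, V_ACTIVE = 2560, 1440
H_BLANK, V_BLANK = 160, 40          # assumed horizontal / vertical blanking

DUAL_LINK_DVI_MHZ = 2 * 165         # nominal spec limit: two 165 MHz TMDS links

def pixel_clock_mhz(refresh_hz: float) -> float:
    """Total pixels per frame times refresh rate, in MHz."""
    return (H_ACTIVE + H_BLANK) * (V_ACTIVE + V_BLANK) * refresh_hz / 1e6

for hz in (60, 96, 120):
    clk = pixel_clock_mhz(hz)
    verdict = "within" if clk <= DUAL_LINK_DVI_MHZ else "beyond"
    print(f"{hz:3d} Hz -> ~{clk:.0f} MHz ({verdict} the 330 MHz nominal limit)")
```

By that rough math, 60 Hz sits comfortably inside the dual-link spec while 96 and 120 Hz land well past it, which is consistent with why these scaler-less panels are called "overclockable" in the first place: driven directly, they accept pixel clocks beyond the nominal DVI limit, something a cheap passive adapter generally won't do.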
Thankfully, AIB partners also disagree with your opinion of DVI, so I have nothing to prove here. But it rubs me the wrong way when someone basically states, "I don't need it myself, so they should get rid of it altogether." It's just so... Millennial.
Everyone is entitled to their own opinion here, just don't insist yours is the only correct one.
1: Uhh, well, unfortunately it looks cleaner without it and allows for single-slot modifications. My opinion is that it does not look good on the card and does not fit what I intend the card for, and that's important to me!
2: Aftermarket cards still blow some air out the back; not all of it like blowers do, but they still push some that way, and the port is in the way of it, so the point still stands.
3: You can still purchase a monitor that only supports VGA; that does not make it modern. Just because a few monitors only offer DVI as a connection does not make it a necessity for everyone else. If you went online and picked a monitor at random, it would most likely have HDMI. Not to mention, again, adapters exist for those still on the older tech. But I would say overclocking a monitor is more of a niche than 4K is... (yes, I am aware monitor overclocking does not work through the adapters).
4: It is a modern resolution and refresh rate, yes, but it's running over an older tech, and at pretty much the maximum that tech will ever handle unless they decide to revive it. It's the same thing VGA (D-SUB) went through.
5: Congrats; it's basically at its limit, so yes, it's modern in that sense, but everything else about it is dated.
AIB partners also build parts in quantity, and all the upper-tier Nvidia cards had the same outputs, DVI included, so do the math. Also, if you're going to try to insult me by saying I am being a "Millennial" for stating my opinion about a graphics card design, then I think you need to look in a mirror, because:
A: You're stating your opinion is more valid than mine because you said it.
B: You're stating that, because you own one of these niche monitors, they need to keep supporting it because it suits you.
So, if anything, by your definition of a "Millennial" (which is actually a term covering people born from the early 1980s to the early 2000s, a definition I fall under), you are acting more that way than I am. Either way, my point is proven; if you wish to continue arguing over an opinion I merely stated as my own, not directed at anyone nor intended to insult anyone, then by all means go ahead. You're almost guaranteed to receive one like from fluffmeister for every comment for your trouble.
Take a quick peek at the chart below from TechPowerUp. No AIB review from any site I've read has stated that. I'll let you take a mulligan on that one if you'd like. ;)
GPU Temperature Comparison (Idle / Load / Gaming Noise):
Gigabyte GTX 1080 Ti Xtreme Gaming (DVI): 46°C / 71°C / 33 dBA
MSI GTX 1080 Ti Gaming X (DVI): 53°C / 72°C / 35 dBA
ASUS GTX 1080 Ti STRIX OC (DVI): 46°C / 69°C / 33 dBA
GTX 1080 Ti FE (no DVI): 34°C / 84°C / 39 dBA
Titan X Pascal (no DVI): 42°C / 84°C / 39 dBA
www.techpowerup.com/reviews/Gigabyte/GTX_1080_Ti_Xtreme_Gaming/34.html
A: You're stating your opinion is more valid than mine because you said it.
No, I'm saying my opinion is more valid than yours because the entire AIB industry's actions support my view, and don't align with your assertions at all.
The only reason they omitted DVI on the reference cards is that there are major heat/throttling issues (see the chart above) and they need all the help they can get, not because DVI is dead. I doubt they'd bother including the adapters if it were as antiquated as you say.
Over half the world's population now uses the internet: over 3.5 billion people. If a modest 1 in every 1,000 users has one of these monitors, that equates to 3.5 million of them worldwide. It's not nearly as "niche" as you claim.
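Spelling that estimate out (to be clear, the 1-in-1,000 ownership rate is an assumed figure for illustration, not survey data):

```python
# Hypothetical back-of-the-envelope estimate; the rate is an assumption.
internet_users = 3.5e9              # roughly half the world's population online
assumed_ownership_rate = 1 / 1000   # assumed share owning a DVI-only 1440p panel
print(f"{internet_users * assumed_ownership_rate:,.0f}")  # prints 3,500,000
```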
B: You're stating that, because you own one of these niche monitors, they need to keep supporting it because it suits you.
And you're stating that they should stop supporting DVI-D because you think the port looks icky on the back of your case. Okay... (o_O)
The difference here is that your opinion is all about you, while mine represents me and roughly 3.5 million other folks, still using modern and desirable monitors, who would be shafted.
So, yes: Millennial