Monday, March 6th 2017
GeForce GTX 1080 Ti Overclocked Beyond 2 GHz Put Through 3DMark
An NVIDIA GeForce GTX 1080 Ti reference-design graphics card was overclocked to 2062 MHz core and 11404 MHz (GDDR5X-effective) memory, and put through the 3DMark suite. The card was able to sustain its overclock without breaking a sweat, with its core temperature hovering around 63°C. Apparently, the card's power limit was manually raised to 122% to sustain the overclock. In the standard FireStrike benchmark (1080p), the card churned out a graphics score of 31,135 points, followed by 15,093 points in FireStrike Extreme (1440p), and 7,362 points in the 4K Ultra HD version of the benchmark, FireStrike Ultra. The card also scored 10,825 points in the TimeSpy DirectX 12 benchmark. Overall, the card appears to perform some 30-40% better than an overclocked GTX 1080.
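As a rough sketch, the reported clocks can be compared against NVIDIA's published reference specifications for the GTX 1080 Ti (1582 MHz boost, 11000 MHz effective GDDR5X); the reference figures come from the public spec sheet, not from this post:

```python
# Rough math on the reported overclock vs. NVIDIA's reference
# GTX 1080 Ti clocks (1582 MHz boost, 11000 MHz effective GDDR5X).

def oc_gain(overclocked_mhz, stock_mhz):
    """Return the overclock as a percentage gain over stock."""
    return (overclocked_mhz / stock_mhz - 1) * 100

core_gain = oc_gain(2062, 1582)    # gain over reference boost clock
mem_gain = oc_gain(11404, 11000)   # gain over reference memory clock

print(f"core: +{core_gain:.1f}%, memory: +{mem_gain:.1f}%")
```

That works out to roughly +30% on the core but under +4% on the memory, which suggests most of the extra benchmark headroom here comes from the GPU clock rather than the GDDR5X.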
Sources:
ChipHell, VideoCardz
72 Comments on GeForce GTX 1080 Ti Overclocked Beyond 2 GHz Put Through 3DMark
User A: "my card OC @ 2025mhz UUU!1!1!!!"
User B: "wow, my card doesn't even go past 1706 MHz"
User C: ".... i think User A is talking about boost ..."
User B: "oh .. well if so ... mine in boost goes up to 2100 MHz..."
:rolleyes:
edit: yep, after a second look at the pics: max boost frequency 2062 MHz, meaning an OC clock of 1531 MHz (well, if it does 2062 for 1531, that's a better boost ratio than my 2100/2088 for 1706, although I keep my card at 1557 since it's not really underperforming for now)
BOOOORING errrr... @Caring1, as much as I like you and your post, I have to write: if the card performs better in gaming but slightly worse in benchmarks, then it's the "winner". Benchmarks serve nothing but self-contentment (well, gaming too... but you don't interact with a benchmark).
For me, a (non-game) benchmark is worth horse sh!t when it comes to deciding which card is better.
The game engines, however... hmm. Most of them are unoptimized and don't fully use all the resources of the CPU/GPU, but especially, they are optimized whichever way the wind blows. Or the cash flow from you know who... ;)
Ah, TPU GPU reviews use benchies? I always saw games :D (and Heaven 4.0 sometimes... since Heaven is a GPU hog, that one is a tad better for GPU reviews)
Though if card A performs better in gaming than card B and worse in benchmarks, card A is the better of the two... (not literally speaking; if you focus on breaking benchmark scores, well, yep, you take card B... or a Titan XP/1080 Ti...)
Let me re-edit my post... I should generalize ;)
IMO it depends on the game played which is better between the 1060 and 480.
Despite all that's been said, there is another thing to be thankful for: this war helped us get better price tags on several things...
I've already provided you with the performance-per-FLOP figures for the GPUs: you can't seriously claim that Polaris "closed most of the […] gap"; that's insane.
And in terms of performance per watt it's even worse. GP104 is ~80% and GP102 ~85% more efficient than Polaris, which is a larger gap than with Maxwell. I'm pointing this out because AMD needs to solve all these problems before they can become competitive. It would be an achievement of the century to do it in a single year! So please stop terrorizing everyone who bursts your AMD bubble by pointing out the facts and realistic estimates for AMD products!
:toast:
What I don't get is why so many people are impressed by the performance of the Titan/1080 Ti. I mean, it's bloody 2017 and we are supposed to be impressed with BARELY acceptable 4K performance for $700?! I got a Fury @ 1135/525 a year ago for $310, and the FAR newer 1080 Ti is only ~70% stronger for 225% the money. Wake me up when we have ANY card that can do some 4K at 100Hz+ gaming...
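The poster's price/performance math can be sketched as follows; note that the ~70% performance gap and both prices are the poster's own estimates, not measured data:

```python
# Sanity-checking the claim: a $700 card that is ~70% faster
# than a $310 card. All inputs are the poster's estimates.

fury_price, ti_price = 310, 700
ti_perf_ratio = 1.70   # claimed: 1080 Ti ~70% faster than the OC'd Fury

price_ratio = ti_price / fury_price            # ~2.26x the money ("225%")
perf_per_dollar = ti_perf_ratio / price_ratio  # Ti's relative value

print(f"price ratio: {price_ratio:.2f}x, "
      f"relative perf/$: {perf_per_dollar:.2f}")
```

Under those assumed figures, the 1080 Ti delivers only about three-quarters of the Fury's performance per dollar, which is the gist of the complaint.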
At the same time, yes, I also await the day when a mid range card will push 4k, that will be quite an achievement. But that day will not be in 2017 or in 2018. Hopefully in 2019.
Efficiency, like price, is a design choice. Anyone can do it.
The fact is, you are sugar-coating what hasn't come to market yet (Vega), because the 1080 Ti can't even do 4K/ultra/60 fps. The logic is lost here. Vega will NOT be able to push that setting either; I'll provide you that bit of realism right now.
Also, you are STILL double posting.