
Is the 1080 Ti the Best GPU Ever?

(Poll closed; 147 total voters.)
A few cards that were memorable for me personally, either for outright crazy performance at the time or for the value offered, with a special mention for efficiency.
  • GeForce 4 Ti 4200 - value and performance
  • Radeon 9700/9800 Pro - value and performance
  • GeForce 6800 Ultra - performance - this is my personal favorite GPU of all time, such a rich story to be told in that era
  • GeForce 6600GT - value
  • GeForce 8800 GTX/GT - performance / value
  • Radeon 4870 - value and performance
  • Radeon 5870 - value and performance
  • GeForce GTX 460 - value
  • GeForce GTX 750Ti - efficiency
  • GeForce GTX 970 - value
  • GeForce GTX 1080/Ti - performance and efficiency
  • GeForce GTX 1060 - value
  • GeForce RTX 3080 - value and performance (for those few who got it at launch at MSRP like me)
  • GeForce RTX 4090 - performance

It's totally subjective, but I mostly like this list. It also highlights a sad recent tendency: with every new generation it gets harder to find value in anything other than the top products (which, incidentally, tend to last well for longer and longer), and "value" in general now scales with rising prices, so it's pure upselling. The approach seems to be: make the best flagship possible, then add more and more disappointments with every model further down the stack, to the point where it's hard to imagine the arrival of another rock-solid midrange card with the potential to become legendary like the 1060. Worse, this tendency has already lasted for years (IMO the last solid lower-end offerings were the 970, the 1060, and the 16XX Super cards), so it doesn't look like Nvidia and the rest are merely experimenting with a lineup that's obviously unattractive to us; it's simply selling well...
 
2060 would've been an okay card with a lower price tag as well.
 

I agree, but the price bump and the lack of a VRAM uplift make it an unworthy successor to the 1060 in my opinion. That's exactly the point: in the lower end of the stack we just don't get cards anymore that are both a performance value and solid overall, i.e. free of the disappointments mentioned above, like skimpy VRAM, an insultingly narrow bus, or a cut-down PCIe interface. Cheap or heavily discounted Radeons used to be something recently, but sadly mostly as an answer to Nvidia's lineup: not AMD being the nice guys giving you nice cards, just cards that seem a little better than whatever Nvidia proposed first. And personally, the lack of DLSS and weaker RT capability is a turn-off no matter what tier of card we're talking about.
 
4) Forced generations. We're well into the 4xxx series...but the 1660 and 3060 are still huge hitters...seems like they either have staying power...or the "new features" aren't ready to drive sales yet


As such, the survey absolutely shows popularity, can give a soft representation of value, and shows what is "best that people can afford." That last one assumes a bit...but it's how companies like Nvidia and AMD can map price points to relative performance based on real data. While it is by no means a 1:1 representation, there's good reason to conclude that the 3060 is a great performer at whatever its price point is...just based on how many are active. It pains me to say this...but sometimes the data is clear. I think the 3060 is overpriced for its relative performance...but the market states it's the best current option. This is from someone whose most popular GPU is about half as represented as the 3060...but I'd still say the 3060 has a better value-to-performance ratio than most other options on the market.
IMO the Steam survey really isn't a valid picture in your context. To put it simply: because of the prebuilt systems present among Steam users, which I believe account for a high percentage.

Let me explain: when grandparents or parents buy their child a gaming PC, they probably get a prebuilt. Likewise, the average person with very little knowledge about gaming PCs will just pick a "gaming PC" off the shelf, not knowing that it pairs a high-end CPU (the brain) with disabilities in eyesight (GPU and screen), hands (not enough RAM, and in single channel), and legs (a slow 5400 RPM HDD instead of 7200 RPM), not enough cooling, etc.

The seller will equip the prebuilt system with a high-end CPU but a weak GPU, not enough RAM, and maybe a bad PSU and case.
Why? Because said GPUs are highly available (1660, 3050, 3060), but also to encourage future upgrades at high prices for the gaming PC they just sold, so there's more profit to be made.
For example, in 2004-2009 those marketing practices built around the CPU alone were very common: "Intel Core Duo this", "latest Intel technology that", and so on, while the support around that CPU was very weak.
I used to call them "wheelchair PCs".

The marketing point is the high-end CPU and some flashy LEDs. To overcome the GPU's disability, they throw in a cheap screen with HD resolution instead of FHD; that GPU won't struggle at HD, users will see enough FPS in Fortnite and be happy. Some users will notice the weakness of their GPU and may proceed to upgrade, if they have the funds and the courage, but until then they'll be counted by the Steam survey, showing cards like the 1660 and 3060 as the aforementioned "hitters".
Those users might have been able to afford a 3070 instead of a 3060 if the prebuilder had cut the unnecessary spend on the latest CPU and equipped the gaming PC with a more balanced CPU/GPU ratio.

In conclusion, what you see in the Steam survey is not what people can afford or want in their gaming PC, but what is given to them.

Also, I really don't think you can say the 1660 and 3060 are the most popular; they're what companies like Dell or CyberPowerPC choose to equip the average gaming PC with, in order to sell more and faster, make more profit, and extract some more profit in the future through upgrades.
Those "hitters" make more profit for them and are highly available, but they're not necessarily the best deal for the money.


Regarding the 1080 Ti: for me, it's longevity.

I'm happy with my 1080 Ti FTW3, even though the cost of deshrouding and thermal putty made it more expensive.
I bought it used for £425 during the scalping times of 2021 and sold my Zotac 1060 6GB for £215-230. I have to add the cost of 3x Noctua 92mm fans plus putty, another £70, minus £12 for selling the original shroud and fans.
However, with the deshrouding and thermal putty instead of the leaky thermal pads, I can safely hash on this card, with VRAM temps staying 5-7°C below GPU temps in benchmarks and gaming.

Gaming at 1440p I get 60-80 FPS in any game I want, including late titles like AC Mirage, with low temps of 50-60°C.

The longevity of this card is great; the only concern is the high power draw compared with, say, a 4070.

Question: does anybody know the watts per frame of the 1080 Ti?
I believe it's lower than the 2080 Ti's 6.4W, but I'd like to know the exact figures. Thanks.
 
I don't know if I responded to this thread earlier, but I've scored four 1080 Ti's for $150 or less in the past two years, including a liquid-cooled EVGA in mint condition for $75 from a 15-year-old kid on Facebook Marketplace. The PC I'm on at the moment has a 1080 Ti FTW3 in it (I have two more of those as well), and I'll run it until it stops working. I'm not a gamer, and using 50" Vizio M-series 4K TVs as monitors, the 1080 Ti is great for web browsing and watching movies. While my Asus 4090 TUF OC is 4x faster at video upscaling, I don't have any reason to use that PC on a 24/7 basis like I do this one. I also have rigs with XFX 6800 XT and 6900 XT GPUs that I use for A/V production. I will say the 1080 Ti has better image quality than the 16GB Vega Frontier Edition I have stashed away, as well as an XFX 6700 I used for a short while.

For the prices I paid, the 1080 Ti's are stellar deals for systems that don't have PCIe 4.0. I have several former top-end GPUs going back to an ATI Radeon 9800 in my collection, and nearly all of them are retired for good.
 
Question: does anybody know the watts per frame of the 1080 Ti?
I believe it's lower than the 2080 Ti's 6.4W, but I'd like to know the exact figures. Thanks.
Unless you lock all variables (power limits, voltage limits, frequency limits, cooler/temperature limits, etc.), it's impossible to measure this reliably.
Also, because all cards boost to the end of their V/F curve, the results WILL be biased toward all cards looking the same (assuming the CPU is fast enough).
Have an attempt at it though (higher = better):

[Chart: Metro Exodus, 1080p, FPS per watt]

Since the chart shows FPS/W, you have to divide 1 by the number from the graph ;)
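For anyone who wants to flip the chart's numbers around, the inversion is trivial; a minimal sketch (the 0.2 FPS/W input is a made-up example, not a value read off the chart):

```python
def fps_per_watt_to_watts_per_fps(fps_per_watt: float) -> float:
    """Invert an efficiency figure: FPS per watt -> watts per FPS."""
    return 1.0 / fps_per_watt

# Hypothetical chart reading of 0.2 FPS/W:
print(fps_per_watt_to_watts_per_fps(0.2))  # 5.0 W per FPS
```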
 
1. 1080Ti and 9700 Pro
2. Rest
 
Do you guys remember when Nvidia dropped the $700 GTX 780 Ti? Fully enabled GK110B silicon, faster than the OG $1000 Titan, though with half the VRAM. I would say that's even more impressive than the 1080 Ti was.
The world would lose its mind if Nvidia dropped a fully enabled GB202 with all 24,576 CUDA cores, a 512-bit bus, and 16GB of VRAM at $1500.

Even just a cut-down GB202 with half the VRAM: 384-bit, 24GB, 144 SMs (18,432 CUDA cores) and 144 RT cores. Call it the 5080 Ti and price it at $1200. They will never give us a full-fat GB202 die in a gaming card.
 
Question: does anybody know the watts per frame of the 1080 Ti?
Significantly worse than the 2080 Ti. Cyberpunk, 4K "Ultra" preset (involves FSR: Quality).

48 FPS at ~260 W versus 35 FPS at ~240 W, i.e. 5.42 W per FPS vs 6.85 W per FPS.
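The per-frame figures here are just average board power divided by average FPS; a quick check of the arithmetic:

```python
def watts_per_fps(watts: float, fps: float) -> float:
    """Average board power divided by average FPS."""
    return watts / fps

# Figures from the Cyberpunk run quoted above:
print(round(watts_per_fps(260, 48), 2))  # 2080 Ti: 5.42
print(round(watts_per_fps(240, 35), 2))  # 1080 Ti: 6.86 (quoted as ~6.85)
```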

In games like Alan Wake 2 the difference is even greater, because the GTX 1080 Ti lacks DX12_2 features and runs the game far worse than its raw compute power would suggest (mitigated by DXVK/DX11 modes, but not ideal).

In all games where DLSS is available, you can credit the 2080 Ti with roughly another 20 to 40 percent of an edge over its predecessor, because FSR is that much worse: DLSS Quality can be acceptable at 1080p in some games, whereas you'd usually avoid FSR even at 1440p.

// XeSS is out of the question, because it runs like total garbage on Pascal GPUs
 
Anyone who didn't reply 8800 GTX is wrong.

The first and only time we got an approximately 100% performance uplift in a single generation. They made sure never to repeat that mistake.
 
The GeForce 6800 Ultra feels forgotten...
Also, you usually only get a 2x perf uplift when something in the previous generation was seriously broken.

NV40 is a "fixed" NV35, which was absolutely terrible at DX9 code.
G80 is the first card that doesn't waste shader resources, since the shaders are unified.

You also "skip" the problem of some series not launching the big chip even though the next one had it (GTX 6xx vs. GTX 7xx); I'm pretty sure full GF110 vs. full GK110 is close to a 100% jump.

Lastly, testing on period-correct hardware may not let some card generations show their full potential.
 

Agree. Also, full GT200 to full GF110 was a doubling. But I guess, just like with the 600/700 series, it was more like 1.5 generations, since we didn't get the full Fermi chip in the 400 series.

[Screenshots: Crysis benchmark runs on a GTX 280 and a GTX 580, both with a 4790K @ 4.7 GHz]
 
Good value back then; great value used now. Enough performance for modern titles at 1080p or maybe higher.
I bought one for my main PC for €110. I play most things at medium to very high settings on a 1080p 180Hz monitor, and I'll continue to use it for probably quite a while. The only things I'm missing compared to an RTX 4060 are DX12_2 and hardware ray tracing, and you definitely can't get that card for €110 currently.

Is it the best card for everyone, and should you sell your RTX card to get a 1080 Ti? Probably not. But is it an awesome option for the average gamer, especially on a budget, that has stood the test of time? I think so.
 
Saw this on YouTube, and I didn't even remember that the 980 Ti is still a somewhat capable card at almost 10 years old.


Upgraded from a 980 Ti to a 1080 Ti about 4 years ago, and both cards were dope.
 
Best card ever? Based on performance alone I'd say no. But if you ask how it does in performance per dollar, I'd say it's right up there with the best.
 
IMO it's between the 8800 GTX and the GTX 1080 Ti.

The 8800 GTX was 2x faster but 20% more expensive than the 7900 GTX ($600 vs $500), and with a much higher TDP (155W vs 84W).

The 1080 Ti was ~68% faster but 7.7% more expensive than the 980 Ti ($699 vs $649), and both had a 250W TDP.

PS: the 6800 Ultra was also a lot faster than the FX 5950 Ultra (sometimes 2x in some games), but had almost the same TDP (81W vs 76W) and the same $500 MSRP.
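One way to weigh these launches against each other is perf gain divided by price gain; a rough sketch using the MSRP and uplift figures above (the metric itself is my own framing, not something from a review):

```python
def perf_per_dollar_gain(perf_uplift: float, old_price: float, new_price: float) -> float:
    """Relative perf-per-dollar vs. the predecessor:
    (new perf / old perf) divided by (new price / old price)."""
    return (1.0 + perf_uplift) / (new_price / old_price)

# 8800 GTX vs 7900 GTX: 2x the perf, $600 vs $500
print(round(perf_per_dollar_gain(1.00, 500, 600), 2))  # 1.67
# 1080 Ti vs 980 Ti: ~1.68x the perf, $699 vs $649
print(round(perf_per_dollar_gain(0.68, 649, 699), 2))  # 1.56
```

Both launches improved perf-per-dollar over their predecessors, with the 8800 GTX slightly ahead on this measure.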


That era is long gone for sure; if only the RTX 5090 could be as amazing as any of them were back then! :cry:
 
The 6800 GT was a much better choice, since it had a full NV40 at slightly lower clocks. I've never heard of a GT that didn't OC to Ultra clocks.

Cut the 5090's price to 1/3 and then it may attain legendary status :rolleyes:
 
For me it's still my old MSI GeForce 4 Ti 4600 128MB, still working in an Athlon XP 2500+ system with Win98/WinXP.
 
Nobody who had that jaw-dropping moment the first time they saw 3D-accelerated graphics in the late 90s on a 3dfx Voodoo card (aware there were other accelerators, but they were all pants) could ever entertain the idea of any other graphics chip being considered "the best ever". The phrase "jaw-dropping" is overused, but honestly, in a world where software-rendered Quake was the best graphics anyone had seen on a home PC, when you saw GL Quake on a 3dfx card at 30fps+, all you could do was gawp in slack-jawed amazement.

Yeah others had a longer service life, obviously were much more powerful, but have any made you genuinely shocked that a rendered image of that quality actually moves? Has anything changed the entire face of gaming like the 3DFX Voodoo?

It's hard to express to younger gamers how enormous that first leap was.
 
I do remember the insanity of putting a 3dfx Voodoo3 2000 PCI into my unassuming Compaq Presario back in 1999, and seeing games appear totally different in Glide mode vs D3D: Need for Speed, Tomb Raider. It was the difference between a painting and a photograph. I still have the card and original box too. And to think, it only cost $130!
 
I really loved my Voodoo Banshee 16MB. Before that I owned a RIVA TNT 8MB, and I swapped it for the Banshee; yeah, back then Glide was way ahead of D3D...
 
The first 3D graphics card I had was a Diamond Stealth 3D 3000 4MB; the ViRGE chipset couldn't do anything well, really. The first really awesome, jaw-dropping graphics came with a Diamond Viper V550 16MB AGP RIVA TNT card. It could run UT99, Quake 2/3, and many others really well!
 
Ahh, my first 3D graphics card was also a Diamond, but it was the Viper V330 4MB. Yeah, it was not that good, but still a big difference compared to software rendering.
 
The Diamond Stealth... I had one of those...

I ended up getting a Voodoo Banshee to play Quake with Glide, and my mind was blown.
 
Cut the 5090's price to 1/3 and then it may attain legendary status :rolleyes:
I watched High Yield's video, and he says a GB202 chip costs $300-400 to make, the 32GB of GDDR7 about $350 more, and then there's the PCB, VRM, cooler, packaging, shipping, etc. (without taking R&D into account), so a full 5090 probably costs Nvidia around $900 to manufacture.
So the chances of seeing a price drop are close to none, unfortunately.

Start video around 13:00
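The ~$900 figure follows from simply adding up the quoted component estimates; a tiny sketch (the "other" bucket is a loose stand-in for PCB, VRM, cooler, packaging, and shipping, not a number from the video, and the die uses the midpoint of the $300-400 range):

```python
# Rough bill-of-materials sum for a 5090, using the estimates quoted above:
bom_usd = {
    "GB202 die": 350,   # quoted as $300-400; midpoint assumed
    "32GB GDDR7": 350,
    "other": 200,       # PCB, VRM, cooler, packaging, shipping (assumed)
}
print(sum(bom_usd.values()))  # 900
```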
 