
A 2080 Ti mined for 2 months, or an old 1080 Ti used for constant gaming?

I have mined on 20 or so cards for over a year, 24/7. Every single one of them still worked like the day it was new.

Most people in this thread do not have a clue what they are talking about; at least I do.

trog
I have bought a number of pre-mined cards that range from doing okay to barely surpassing the base frequency. It's not a temperature issue either; it's just that the silicon has degraded so hard that the maximum allowed voltage only gets it up to the base clocks.

I also mined on a few cards for a year straight, and they did show signs of slight degradation to the maximum clock speed compared to before mining.

I ran mine at 75% though, and 2 were completely water-cooled...

I'll take a chance on a mined card if it's a fraction of the cost of another used card. So for a 2080 Ti, $150 is the most I would pay.
 

That is the same symptom as a GTX 780 rev. 2 (GK110B type) that I have here. My brother purchased it for relatively cheap during the mining craze. In this card's case, it's not that it was mined; it's that the previous owner ran it from 2014 onwards without ever opening it for maintenance, for years on end at 90°C load, until it started crashing. It's purely a silicon degradation problem; it's just not designed for very-long-term high-stress situations like this.

The only solution I've found for it is to reduce its clock speeds to around 550 MHz. The card is fully functional then, but probably performs slower than your average GTX 580. Better than nothing, but disgusting regardless.

Still a bit early to call that one. There seems to be just as much concern over everything before the 4xxx series being made obsolete as there is over cutting off the large share of the gaming market that decided to wait things out. The former coming to fruition among the very top AAA titles is considerably more likely than the latter. The 9xx series would be a more fitting place to hang the legacy-hardware tag.

Ultimately it comes down to what games the OP will be playing and what requirements need to be met long term.

See it from my perspective: this is a 6-year-old architecture that was designed at a time when none of these new technologies existed and most weren't even planned. Sure, it was a great performer, and one may argue that it still holds up... holds up on what? DirectX 11 games? Then yeah. Get out of that comfort zone, and it'll go belly up.

The 900 series GPUs share practically all of the same limitations and are already "dead men walking". Their 8 years (NV's lifecycle policy) have already elapsed, and it's a matter of time until Nvidia disables Maxwell support on the drivers.
 
The 980 Ti still has support in the latest driver. The latest driver is from May 2, 2023... The R9 Fury X's latest driver is from 2021... it's in the LEGACY league already...
Cheers and respect NVIDIA !!!! :toast:
 


The R9 Fury X had a bit of a premature end of support date. They waited until the card was 5 years old on the clock to pull the plug, but something tells me they wanted to get rid of it much earlier than that. AMD supported GCN 1 until the same end of support date, so Radeon HD 7970 owners got basically 9 years out of their hardware. That was a good run to some folks, but an equally awful run to others such as yourself. Radeon RX Vega is already older than R9 Fury X was when its driver support was discontinued, and AMD still updates it, likely due to Radeon VII being barely 4 years old at this point and the Vega architecture still being used in their APUs and semicustom business, so I guess it won't be going anywhere for now.

NVIDIA's official support is 8 years of Game Ready drivers, and then they pass the hardware into the legacy class. Beginning with Kepler, they've also promised a few additional years of security fixes past that end-of-support date; the last security-update driver for the GTX 600 and 700 series was released on March 30. The Maxwell cards (750 Ti/900 series) are around 9 years old at this point, so their time is essentially up.
 
Does anyone know how many years of driver support Intel Arc GPUs get? I think I will go to the blue camp. :)
 

It is a brand-new product that only recently became available, so I expect Intel to support it for quite some time. It's an option for you, by the way. RTX 3060-like performance, 16 GB... I think you may be pleased. Do keep in mind that it requires a newish PC which supports Resizable BAR - your X299 motherboard may or may not have received an update to support this. In any case, Intel has a compatibility check tool on their website which you can download and run.

By the way: Arc may still have some bugs, but development pace is fast and they are very approachable on their Discord server. Worth a try for you, I'd say.
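
If you want to sanity-check Resizable BAR yourself before committing, here's a minimal sketch, assuming a Linux live environment with pciutils installed; the Intel compatibility tool mentioned above is the easier route on Windows:

```python
# Minimal sketch: list PCI devices that advertise the Resizable BAR capability.
# Assumes Linux with pciutils installed; run as root so lspci can read the
# extended capability list (otherwise it may print "access denied").
import subprocess

def rebar_devices():
    out = subprocess.run(["lspci", "-vv"], capture_output=True, text=True).stdout
    # lspci separates device blocks with blank lines; keep the header line
    # of every block that mentions the Resizable BAR capability.
    return [block.splitlines()[0] for block in out.split("\n\n")
            if "Resizable BAR" in block]

if __name__ == "__main__":
    hits = rebar_devices()
    print("\n".join(hits) if hits else "No Resizable BAR capability reported")
```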
 
I'd say neither the 2080 Ti nor the 1080 Ti; not worth the hassle.
I think gaming is your passion, so think of buying a GPU as an investment for years to come.
Buy new: you get a GPU that hasn't been touched, and you have a warranty if anything goes wrong with it.

I think someone mentioned a 6800 XT in this thread, and personally I think it's a great card. If you want team green, a good GPU is the 4070; it's solid, but the price sucks big time.
Just save up a bit more :) It's not like your R9 Fury is on the verge of death, is it?
 
See it from my perspective: this is a 6-year-old architecture that was designed at a time when none of these new technologies existed and most weren't even planned. Sure, it was a great performer, and one may argue that it still holds up... holds up on what? DirectX 11 games? Then yeah. Get out of that comfort zone, and it'll go belly up.

Believe me, I do see your perspective; the world moved on, for better or worse.

ARC may end up being recognized as a good long term choice.
 
@El-Presidente
A mining card can be in better condition than a gaming card, like @dgianstefani and @demirael said.
Just make sure the VBIOS is not flashed.

And was the 2080 Ti not used for anything before the 2 months of mining? Was it unboxed, mined, and then sold to you? Then it's OK.
 

Considering 2080 Tis are now 4 years old, it's highly unlikely it was only used for 2 months. But while he was making up his mind, the 2080 Ti sold. Good thing, since I just can't trust it.

IMO, if they're willing to try out Arc, that might be the best option for them right now.
 
The mining GPU has roughly 1,440 hours on it (2 months at 24/7). If the gamer played 8 hours a day, that's ~2,900 hours in just one year.

The only real difference is the heating/cooling cycles, which the gaming card had plenty of and the mining card didn't.

So is it better to buy the card that has had many heating/cooling cycles, or very few? (Yes, I saw the 2080 Ti isn't an option anymore.)
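
For reference, the back-of-the-envelope math behind those numbers (a quick sketch; the 8-hours-a-day-for-a-year figure is the assumption used above):

```python
# Rough hour counts behind the comparison above.
mining_hours = 2 * 30 * 24   # ~2 months of 24/7 mining  -> 1,440 hours
gaming_hours = 365 * 8       # a year at 8 hours per day -> 2,920 hours

print(f"mined 2080 Ti: ~{mining_hours} h, gamed card: ~{gaming_hours} h per year")
```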
 
Just save up a bit more :) It's not like your R9 Fury is on the verge of death, is it?
It's been dead for 2 years already, since AMD decided to put this card into legacy mode. But I can still play Diablo III on this card. :)
I played the Diablo IV open beta on this card, and it's still decent. :)

 
A mining card that's been running at 60% TDP under steady state thermal load will be in better condition than a gaming card that's run overclocked at twice the power draw for the same amount of time.
When I mined, I had two GTX 1070s. I underclocked the core, cranked down the power limit, but did slightly OC the RAM. The cards were run out in the open, with temps way cooler than gaming temps.
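
For anyone curious what that kind of power-limited setup looks like in practice, here's a minimal sketch assuming an NVIDIA card and the stock nvidia-smi tool; the 110 W target is just an example value, and memory/core offsets would still be applied separately in something like MSI Afterburner:

```python
# Sketch: query and then cap an NVIDIA GPU's power limit, the way a careful
# miner would. Assumes nvidia-smi is on PATH and the script has admin/root
# rights; the driver rejects values outside the card's allowed range.
import subprocess

GPU_INDEX = "0"       # first GPU in the system
TARGET_WATTS = "110"  # example value only, not a recommendation

# Show the current power draw and limit.
subprocess.run(["nvidia-smi", "-i", GPU_INDEX,
                "--query-gpu=power.draw,power.limit", "--format=csv"])

# Apply the reduced power limit (resets on reboot unless persistence is set up).
subprocess.run(["nvidia-smi", "-i", GPU_INDEX, "-pl", TARGET_WATTS])
```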

Ironically, I worked out that if I had just invested the money I spent on mining hardware straight into the currency, I would have made far more; I ended up just about getting my money back.
 

Did you have to pay for power?
 
Yes, that was factored into my saying I just about got my money back.

If I had invested directly instead, I would have had the coins right away, pre-value-boost, versus having to wait for the coins to be mined. Plus, on top of that, no power costs.
 
The mining GPU has roughly 1,440 hours on it (2 months at 24/7). If the gamer played 8 hours a day, that's ~2,900 hours in just one year.

The only real difference is the heating/cooling cycles, which the gaming card had plenty of and the mining card didn't.

So is it better to buy the card that has had many heating/cooling cycles, or very few? (Yes, I saw the 2080 Ti isn't an option anymore.)

Heating and cooling cycles aren't anywhere near as much of a problem as the sheer amount of electrical current being pumped into the ASIC day and night. You're looking at practically a decade of running the hardware at high voltages and intense heat to degrade a chip the way the mining workload will in a matter of weeks.
 
This is why eBay has buyer protection, and by the way, it very much favors buyers over sellers - same with PayPal.

You got burned online once, so has just about everyone at one point or another.
The few times it happened to me was of course frustrating (I don't buy from non-established sellers either), but eBay/PayPal made it right.

What was far more frustrating was having things go south as a seller...

I had someone claim a package was never delivered (it was); the dispute was open with eBay for nearly a month and I ultimately had to get a claim from UPS.
Another person claimed a gift card I sold didn't work: immediate refund and a loss to me, with no help from eBay or PayPal.

Buying anything used involves a certain level of risk.
That risk can be mitigated to an extent though if you buy something in warranty, from a reputable seller and on a platform with buyer protection.

There's nothing wrong with preferring to buy new, I do myself with most things.
Just because it's a preference though, doesn't mean buying used is a near guaranteed burn for someone on a budget.

I bought a GTX 780 once off eBay. The seller sent it via USPS, but I don't think he paid for insurance. I never received the card and filed an investigative request with the USPS. After MONTHS, they sent me the top of the box that had my name, address, and the eBay auction number on the label. That was all the USPS had: the top of the box. I felt bad for the seller, but I was fully refunded by eBay. To this day, I think someone working at the USPS stole that video card.
 
A mining card that's been running at 60% TDP under steady state thermal load will be in better condition than a gaming card that's run overclocked at twice the power draw for the same amount of time.
Preach. People don't understand thermal cycling on gaming cards and how terrible it is.

I have three 1080 Tis that mined for I don't know how long, 4-5 years. All of them are in pristine condition. Even the fans, somehow, which I find surprising.

Also have a 3090 but that only mined for 16 months.
 
@fevgatos
But what about the memory degradation? Or is that just a myth about cryptomining?
If you keep memory temps under control, it's fine. If you don't, then yeah, stuff happens. My 3090 has degraded memory, but nothing noticeable. Nothing changed for daily usage, but for benchmarking it used to do +2100 MHz on the memory before mining; that dropped to around +1800 after.

With that said, I wouldn't buy a card that mined for like 2-3 years straight, even if temps were under control. I'd be more worried about the VRMs than anything else.
 
Is this site telling the truth or not? Is the RTX 2080 Ti too weak for the i7-7820X at 3840x2160 resolution?

 


Bottleneck calculators are probably one of the oldest memes on the internet. Don't listen to any of that, it's rubbish.

That said, the i7-7820X is what might hold the 2080 Ti back in a few games. Not because it's weak, but because Skylake's getting long in the tooth and the high-end desktop variants (for LGA 2066) tend to be slower than their mainstream (LGA 1151 and 1200) counterparts at gaming. That 7820X is no state of the art CPU anymore, but it's no slouch either.

Honestly, just buy the Arc A770 like you suggested earlier, or the Radeon RX 6700 XT, instead of an old, beat-up, 4-year-old GPU that has likely been mined to death. Nvidia will not fit in your pocket this time around.
 
@Dr. Dro, does LGA 2066 even support Resizable BAR? Because if it doesn't, it'll seriously nerf the performance of Intel Arc video cards.
 

Yep, looks like they added support for the OP's board back in 2021 with a BIOS update.
 

I'm late, but yeah, @Double-Click beat me to it.

Just gotta upgrade the BIOS if it's not up to date.
 