
NVIDIA Seemingly Begins Resupplying GeForce GTX 1050 Ti GPUs

Raevenlord

News Editor
In a move that speaks volumes about the current state of the semiconductor market, NVIDIA has apparently begun reseeding retailers with the 5-year-old, Pascal-based GTX 1050 Ti graphics card. At some retailers (namely, Newegg), the card can still be found at $499, a vestige of tight supply since its discontinuation and a result of the constrained GPU market. However, retailers that have received fresh supply of the 14 nm, 4 GB GDDR5-toting graphics card have it at $179 - still above the card's launch price of five years ago, which was set at $139. The GTX 1050 Ti features a 128-bit memory bus and a whopping 768 shading units.

Resupplying this card means that customers looking at the lower end of the spectrum now have a feasible alternative to the practically non-existent RTX 3000-series solutions. Equivalent models in the 2000-series are also hard to come by, and marred by much higher pricing. The choice of the GTX 1050 Ti, with its 4 GB of GDDR5 memory, isn't an innocent one; it actually skirts two problems with current-generation hardware. First, constraints on GDDR6 memory allocation, which is becoming a bottleneck for new graphics card production on account of the increasing number of chips employed in each individual card, as well as its deployment in latest-gen consoles. And second, 4 GB of VRAM is no longer enough to fit the current Ethereum mining DAG fully into memory, which means these cards also skirt mining demand. It is, however, a sobering moment for the industry and for any enthusiast who wants to see the progress we have been so readily promised.
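For context on the mining point: the Ethash DAG starts at 1 GiB and grows by about 8 MiB every 30,000-block epoch, with the byte size rounded down until the count of 128-byte mixes is prime. Below is a minimal Python sketch of those published sizing rules, assuming the standard Ethash spec constants; it estimates the first epoch whose full DAG no longer fits in 4 GiB (real 4 GB cards drop off slightly earlier, since the driver reserves some VRAM):

Code:
# Sketch of the Ethash dataset sizing rules (standard spec constants assumed).
DATASET_BYTES_INIT = 2**30    # 1 GiB at epoch 0
DATASET_BYTES_GROWTH = 2**23  # ~8 MiB added per epoch
MIX_BYTES = 128               # the DAG is sized in 128-byte mixes
EPOCH_LENGTH = 30_000         # blocks per epoch

def is_prime(n: int) -> bool:
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def dag_size(epoch: int) -> int:
    # Byte size, rounded down until the mix count is prime (spec rule).
    size = DATASET_BYTES_INIT + DATASET_BYTES_GROWTH * epoch - MIX_BYTES
    while not is_prime(size // MIX_BYTES):
        size -= 2 * MIX_BYTES
    return size

epoch = 0
while dag_size(epoch) <= 4 * 2**30:  # find the first epoch over 4 GiB
    epoch += 1
print(f"DAG exceeds 4 GiB at epoch {epoch} (~block {epoch * EPOCH_LENGTH:,})")

Run as written, this lands at epoch 385 (around block 11.55 million, i.e. late December 2020), which lines up with 4 GB cards falling off Ethash shortly before this resupply.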



View at TechPowerUp Main Site
 
 
And second, 4 GB of VRAM is no longer enough to fit the current Ethereum mining DAG fully into memory, which means these cards also skirt mining demand.
If that's the reason, then it's not a bad move to reduce the shortage for low-end gaming, though I'm surprised nVidia hasn't released a GTX 3050 4GB / increased production of 1650 Supers, whose 4GB "limitation" is a plus right now, and then added a "Ti" version with more memory post-mining-bubble. As for the 1050 Ti, I used to own one. It's not bad for older games (runs games like Skyrim, Bioshock Infinite, Dishonored, Alien Isolation, etc., just fine at 1080p, 60-110 fps) and most indies, though it shouldn't be more than $150.
 
Fun fact: in 2016 the release MSRP was 139 USD :roll:
 
Meanwhile at ZOTAC (archived it since the OG post was deleted after a shitstorm from the community):
[archived screenshot of ZOTAC's deleted post]
 
"Equivalent models in the 2000-series are also hard to come by" -Naturally, since those models have never existed in the first place. The lowest 2000-series card is the 2060 which is incomparable to the 1050 Ti.

Anyway, quite interesting move - but also quite sensible in these times when the race for performance has been halted by supply issues.

I still have a passively cooled variant on the shelf in case my 5700 XT fails. It's still fine for some low spec gaming.
 
I can't believe NVIDIA went with such old technology. If they have to pull a trick like this, then how about at least going back just one gen to Turing? I've got a 2080 SUPER and it works fine. A low end version of this would have done the trick. Create a new low-ish end model called "RT 2040" or something with a suitably cut down GPU and people will buy it. Or heck, just make an "RT 3040" equivalent or something. Personally, I'd buy a card called "RT 3030" just because of the name, lol.
 
$500 for a 1050 card???? :kookoo:
I have the GIGABYTE GeForce GTX 1050 Ti and this card is no longer good for anything... it runs new (AAA) games at ~20 fps at 1080p.
 
Man, here I was saving for a 3070, but all I can afford is a damn 1050 Ti.


thanks, I guess
 
Man, here I was saving for a 3070, but all I can afford is a damn 1050 Ti.


thanks, I guess

lol, no way I'm paying that much for 5-year-old tech that was slow as shit to start with.

Meanwhile at ZOTAC (archived it since the OG post was deleted after a shitstorm from the community):
[archived screenshot of ZOTAC's deleted post]
Isn't it ironic that the card says "Zotac Gaming" and not "Zotac Mining"?
 
It wasn't that long ago when we were selling GTX 1070s for about $150-170 a pop.

Never been that low in my country, not even used and abused (mining) cards....
 
lol, no way I'm paying that much for 5-year-old tech that was slow as shit to start with.


Isn't it ironic that the card says "Zotac Gaming" and not "Zotac Mining"?
Tags are even better.
 
It wasn't that long ago when we were selling GTX 1070s for about $150-170 a pop.
Indeed. Paid CA$150 each for eleven 1070 cards exactly one year ago. I could easily double my money now, if I were that kind of human. But I bought them to mine, so until they die, mine they will!
 
Is it possible that Samsung restarted the 14 nm line, or are these just refurbished miner cards that were doing work until now? Isn't it suspicious that they showed up at just the right time, when 4 GB can't fit the DAG anymore? Unless it costs $99, it's a great way to waste money. Just make a $199 GTX 1670 4GB and I would buy it: refresh the 2070 as a 1670 4GB, cut all the mining crap out of it, and I would take it so I can press my little comfort buttons in games like a 40-year-old baby for a little while longer. They're doing something like it right now with the 1660 Super based on TU106, but it seems stupid to disable that much of the chip.
 
I can't believe NVIDIA went with such old technology. If they have to pull a trick like this, then how about at least going back just one gen to Turing? I've got a 2080 SUPER and it works fine. A low end version of this would have done the trick. Create a new low-ish end model called "RT 2040" or something with a suitably cut down GPU and people will buy it. Or heck, just make an "RT 3040" equivalent or something. Personally, I'd buy a card called "RT 3030" just because of the name, lol.

Binning, I guess.

They probably also looked at which cards were the easiest and quickest to produce with the resources they had, and found out they had a tonne of old 1050 Ti chips.
 
Binning, I guess.

They probably also looked at which cards were the easiest and quickest to produce with the resources they had, and found out they had a tonne of old 1050 Ti chips.
That. Besides, creating a new SKU, or a new version of an old SKU, requires R&D on both NVIDIA's and the partners' sides, while reusing a completely old design does not.
 
Wow.... that's.... pretty depressing. Knock $400 off the Newegg price and that's about what a 1050 Ti is worth. :kookoo:
 
So the 1050 Ti, which performs about the same as the 960 4GB I got for 200 EUR 5-6 years ago, is being re-released for 500 EUR. What the actual fuck?

P.S. Just opened up my case, dusted off my 960, called it a king, apologized that I ever wanted to replace it, encouraged it for 5 more minutes, and prayed to lord and saviour Gaben that it keeps working.
 
Absolute waste of time and money. These cards aren't viable in 2021; they play modern games at 15-25 fps on LOW! I believe these are just leftover chips they had in storage. Gamers should obviously H A R D P A S S on this, especially paying over the original price for a 5-year-old card that can't play shit. This is not the right direction to go and does not solve the real issues at all.
 
The problem is that companies need to sell something to maintain jobs, profits, and so on; it's not that users should buy whatever is offered just because it's the only hardware available on the market.

Sitting out isn't a choice, not now and not ever; until new chips come from NVIDIA and AMD, this is the best option, poor as it is, to keep things going.
 