Wednesday, February 17th 2021

NVIDIA Seemingly Begins Resupplying GeForce GTX 1050 Ti GPUs

In a move that speaks volumes about the current state of the semiconductor market, NVIDIA has apparently begun reseeding retailers with Pascal-based GTX 1050 Ti graphics cards, a design launched back in 2016. At some retailers (namely, Newegg), the card can still be found at $499 - a vestige of tight supply since its discontinuation, and a symptom of the constrained GPU market. However, retailers that have received fresh supply of the 14 nm, 4 GB GDDR5-toting graphics card have it at $179 - still above the $140 asking price set at release. The GTX 1050 Ti features a 128-bit memory bus and 768 shading units.

Resupplying this card means that customers looking at the lower end of the spectrum now have a feasible alternative to non-existent RTX 3000-series solutions. Equivalent models in the RTX 2000 series are also hard to come by, and marred by much higher pricing. The choice of the GTX 1050 Ti with its 4 GB of GDDR5 isn't an innocent one; it actually skirts two problems with current-generation hardware. First, constraints on GDDR6 memory supply, which is becoming a bottleneck for new graphics card production on account of the increasing number of memory chips employed in each individual card, as well as its deployment in latest-gen consoles. Second, 4 GB of VRAM is no longer enough to fit the current Ethereum DAG fully into memory, which means these cards also skirt mining demand. It is, however, a sobering moment for the industry and for any enthusiast who wants to see the progress we have been so readily promised.
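The Ethash angle is easy to verify: the DAG's size is a simple function of the chain's epoch, so one can check whether it still fits in 4 GB of VRAM with a few lines of code. Below is a minimal sketch using the constants published in the Ethash spec; the block height is an approximation for mid-February 2021:

```python
# Ethash DAG size per the published spec constants; a rough check of
# whether the DAG at a given block height still fits in 4 GB of VRAM.

DATASET_BYTES_INIT = 2**30    # 1 GiB dataset at epoch 0
DATASET_BYTES_GROWTH = 2**23  # dataset grows ~8 MiB per epoch
MIX_BYTES = 128               # the DAG is sized in 128-byte rows
EPOCH_LENGTH = 30_000         # blocks per epoch

def is_prime(n: int) -> bool:
    """Simple trial-division primality test (fine at this scale)."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def dag_size(block_number: int) -> int:
    """Full dataset size in bytes for the epoch containing block_number."""
    epoch = block_number // EPOCH_LENGTH
    size = DATASET_BYTES_INIT + DATASET_BYTES_GROWTH * epoch - MIX_BYTES
    # The spec shrinks the dataset until its row count is prime.
    while not is_prime(size // MIX_BYTES):
        size -= 2 * MIX_BYTES
    return size

# Ethereum mainnet was around block ~11.9 million in mid-February 2021.
size = dag_size(11_900_000)
print(f"DAG size: {size / 2**30:.2f} GiB")        # ~4.09 GiB
print("Fits in 4 GiB of VRAM?", size < 4 * 2**30) # False
```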
Sources: Tech YES City @ YouTube, via Videocardz

50 Comments on NVIDIA Seemingly Begins Resupplying GeForce GTX 1050 Ti GPUs

#26
xorbe
Probably card manufs in China are just setting up mining farms and not even sending cards at this point.
Posted on Reply
#27
efikkan
qubitI can't believe NVIDIA went with such old technology. If they have to pull a trick like this, then how about at least going back just one gen to Turing? I've got a 2080 SUPER and it works fine.
I'm pretty sure they used a surplus of chips they already had; it's not uncommon to do this at the end of a product's lifecycle, but normally this doesn't get much attention.

Starting production of a new batch of old chips would require ~4 months for the chips plus ~1 month for assembly, provided they had free production capacity, which they probably don't.
ppnIs it possible that Samsung restarted the 14nm line
Foundry production lines usually don't stop until they are worn out or are upgraded to a newer node. The demand for all the "16nm class" nodes is tremendous, and they are probably fully booked. I doubt Nvidia could get many extra wafers without paying someone else to give up their reserved capacity.
ppnor are those just refurbished miner cards that were doing work until now.
Highly unlikely.
It's also illegal to market refurbished cards as new.
Posted on Reply
#28
1d10t
For the same amount of money I got an AIB 5700 XT last year; dang, we're really going away from home gaming.
Posted on Reply
#29
RJARRRPCGP
IIRC, in 2018, I saw a 1050 series for $220, the same as my 1060 3 GB cost in November 2016!

That reminds me, why not bring the 1060 3 GBs back?
Posted on Reply
#30
Max(IT)
A few weeks ago there were rumors about Nvidia reintroducing the 2060, but we found none in the stores.

The 2060 would be a much more interesting option...
Posted on Reply
#31
Legacy-ZA
ST.o.CHThe problem is that companies need to sell something to maintain jobs, profits, etc., not that users should buy whatever is offered just because it's the only hardware available on the market.

Sitting out is not a choice, not now and not ever; until new chips come from Nvidia and AMD, this is the best option, poor as it is, to keep things going.
You know, they bitch about the memory chip shortage and other components like VRMs, capacitors, etc., but really, if that is the case and they are so scarce, aren't they worsening the problem then? Why not use them to produce more of the latest-series GPUs... Are people really this naive? It's obvious, THEY ARE LYING! They are just re-selling old stock for major profits.
Posted on Reply
#33
Casecutter
While we hate to see this, there is a case for bringing such cards back to the market for gamers; although, yes... at reasonable prices. Stuff miners can't really use anymore from a perf/power standpoint, but that still offers something good for 1080p/1440p, all while using otherwise unused wafer/fab production.

I would think AMD could shuffle back to GloFo and work Polaris 30 (12 nm LP) back in, then release stuff like an RX 675 ($160) & RX 695 ($200) with nothing more than a slight tweak to clocks.
Posted on Reply
#34
evernessince
On a side note, has anyone else noticed Newegg is filled with Chinese sellers lately? For me the search results are garbage now; I look for gaming mice and a bunch of ripoff Chinese brands come up. Somehow Amazon is better now, and that's a low bar.
Posted on Reply
#35
lexluthermiester
DeathtoGnomesand @lexluthermiester is driving!!
And I'll get us there faster than a screaming banshee!!
evernessinceOn a side note, has anyone else noticed Newegg is filled with Chinese sellers lately?
I generally select the "Shipped by Newegg" and "Sold by Newegg" options. Filters out all the crap sellers.
Posted on Reply
#36
TheoneandonlyMrK
They will be re-releasing the 920 next; if they could get a graphics output out of toast, they would be selling it again.

I'm thinking the price could possibly be comedy.
Posted on Reply
#37
ironwolf
I get 1-3 calls a day with people asking if I have any RTX 3000 series cards at all. It pains me, as a gamer myself, to tell them nope nope and nope and no ETA. :(
Posted on Reply
#38
Nihilus
ironwolfI get 1-3 calls a day with people asking if I have any RTX 3000 series cards at all. It pains me, as a gamer myself, to tell them nope nope and nope and no ETA. :(
I think it is hilarious.

"*Yawn* wake me up when consoles can play 4k ultra like the RTX 3090ti" they would say.

The flagship GPUs, although profitable, have simply taken too many resources to sustain.

Now they look on like desperate peasants with their 1050ti while console gamers are playing with much more serious hardware.
Posted on Reply
#39
Thefumigator
Everyone will end up with Intel discrete GPUs.

C'mon Raja, you can do this....
Posted on Reply
#40
Vayra86
qubitI can't believe NVIDIA went with such old technology. If they have to pull a trick like this, then how about at least going back just one gen to Turing? I've got a 2080 SUPER and it works fine. A low end version of this would have done the trick. Create a new low-ish end model called "RT 2040" or something with a suitably cut down GPU and people will buy it. Or heck, just make an "RT 3040" equivalent or something. Personally, I'd buy a card called "RT 3030" just because of the name, lol.
Because Turing was done on a shit node with pretty big dies for what they offered.

The gen was crap. And it still is.
NihilusI think it is hilarious.

"*Yawn* wake me up when consoles can play 4k ultra like the RTX 3090ti" they would say.

The flagship GPUs, although profitable, have simply taken too many resources to sustain.

Now they look on like desperate peasants with their 1050ti while console gamers are playing with much more serious hardware.
Said it many times... RT is too costly.
Posted on Reply
#41
qubit
Overclocked quantum bit
Vayra86Because Turing was done on a shit node with pretty big dies for what they offered.

The gen was crap. And it still is.
I've got a 2080 SUPER and can tell you that there's nothing shit about it. It's first gen RTX, that's all, so it pushed the envelope and my card works very nicely, with no glitches at all and great temps.

I don't know if NVIDIA used the latest node to make it or not, but you can't just blanket call it "shit", like you're some kind of expert, which you're not. Besides being offensive with language like that, what do you know about the compromises that NVIDIA had to make due to the resources available to them? In particular, what the fab is able to offer them seems to be the biggest one. New, cutting edge nodes are usually fully subscribed and aren't able to push out as much volume as a more mature node, so they've gone with what's best at the time, given all the variables.

Hence, my comment stands: it doesn't justify releasing 5-year-old tech, which by definition has to perform worse and have fewer features.
Posted on Reply
#42
Vayra86
Okay buddy, sorry if I hurt your pretty GPU ;)

Offensive? When did you become a snowflake? Wow, man.

If you can get past your emotions... you may note my comment is aimed at the margins of Turing and therefore its potential in the market. Those dies are big and there isn't a lot of fab capacity, and on top of that, the node was a one-off for TSMC and for Nvidia. Pascal was made in much larger volumes on a cheaper node and it also doesn't contain RT cores.

Yes, I'm blanket calling it shit, like it was since release and like I'll always do. It's clear as day. The dies are too big and the gap with Pascal is too small. Why did Ampere leap ahead in absolute perf? Because Turing was such a weak gen compared to Pascal. Ampere leaps not only on shader perf but also on RT perf. Ergo, the Turing performance delta per square mm is just too low to repeat.

As for expertise... you assume too much. This is why I'm correcting your comment: it does justify their re-release of Pascal. I don't like it either, but doing more Turing is a clear no.
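To put a rough number on that per-mm² point, here is a minimal sketch; the die areas are the public figures, and treating the GTX 1080 and RTX 2070 as roughly equal in raster performance is an assumption based on typical review results:

```python
# Back-of-the-envelope: raster performance per mm^2 of die, Pascal vs.
# Turing. Die areas are the public figures; treating the GTX 1080 and
# RTX 2070 as roughly equal in raster performance is the assumption here.

dies = {
    "GTX 1080 (GP104, Pascal, 16 nm)": {"area_mm2": 314, "rel_perf": 1.00},
    "RTX 2070 (TU106, Turing, 12 nm)": {"area_mm2": 445, "rel_perf": 1.00},
}

for name, d in dies.items():
    density = d["rel_perf"] / d["area_mm2"] * 1000
    print(f"{name}: {density:.2f} rel. perf per 1000 mm^2")

# Turing spends ~42% more silicon for similar raster throughput, with the
# extra area going to RT and tensor cores.
print(f"Extra area for the same performance: {445 / 314 - 1:.0%}")
```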
efikkanI'm pretty sure they used a surplus of chips they already had; it's not uncommon to do this at the end of a product's lifecycle, but normally this doesn't get much attention.
Plausible, but at the current rate I wouldn't abandon the idea. The demand is for graphics cards. Not RT-enabled ones in particular; quite the opposite, in fact, because all cards are overpriced now, and adding cost is not an option. It is fast becoming viable to do something along the lines of restarting production on older nodes. You also have to consider that the demand problem isn't new; it's been present for half a year now at least, and was already building before that.
Posted on Reply
#43
nfineon
There is no way in hell they are wasting new wafers on this old-ass chip/architecture. There are only two viable options:

1. These were leftover surplus chips NVIDIA is trying to offload onto customers who don't know any better.

2. These cards are refurb/rebranded models being offloaded en masse by mining farms, as they can no longer mine Ethereum with 4 GB of memory.

I suspect more and more that it's the latter, as the timing is pretty much dead-on for a mass dump of these cards onto the market by miners. They could literally just slap a new heatsink/fan/shroud on the same PCB (or just the shroud, really) and re-sell it as a 1050 Ti. Avoid these trash cards at all costs; even at MSRP they just aren't worth it for anything other than an EMULATOR machine.
Posted on Reply
#44
lexluthermiester
ironwolfI get 1-3 calls a day with people asking if I have any RTX 3000 series cards at all. It pains me, as a gamer myself, to tell them nope nope and nope and no ETA. :(
Pretty much the same at my store. NVIDIA and AMD both, but at least we're getting a few Radeons here and there.
Posted on Reply
#45
MrGRiMv25
I can get hold of GTX 1650/1660s in the UK; they're in stock in multiple places. But I still wouldn't buy one at around £170-280, even though they're much better than the 1050 Tis on the same sites going for the same amount of money...

I still need to replace my broken RX 570, but I can wait until the end of the year, when hopefully things will have settled down a bit. I refuse to pay anything over £600 for an RTX 3070, so until they come back down to something closer to MSRP I'm gonna keep the money in my pocket.
Posted on Reply
#46
efikkan
Vayra86Plausible, but at the current rate I wouldn't abandon the idea. The demand is for graphics cards. Not RT enabled ones, particularly not in fact, because all cards are overpriced now. Adding cost is not an option. It is fast becoming viable to do something along the lines of restarting production on older nodes. You also have to consider the demand problem isn't new, it's been present for a half year now at least, and was already gearing up before that.
If they were running new batches, then they would very likely be running some of the bigger dies, which give more profit per wafer.
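A minimal sketch of that per-wafer math, using the standard gross-dies-per-wafer approximation and a simple Poisson yield model; the defect density and per-chip prices are purely illustrative assumptions:

```python
# Rough per-wafer economics: the classic gross-dies-per-wafer
# approximation plus a simple Poisson yield model. The defect density
# and per-chip prices below are purely illustrative assumptions.
import math

WAFER_DIAMETER_MM = 300.0
DEFECT_DENSITY = 0.001  # defects per mm^2 (assumed)

def gross_dies(die_area_mm2: float) -> int:
    """Wafer area over die area, minus a term for edge losses."""
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def good_dies(die_area_mm2: float) -> int:
    """Poisson yield: larger dies are exponentially more likely to be hit."""
    return int(gross_dies(die_area_mm2) * math.exp(-DEFECT_DENSITY * die_area_mm2))

# GP107 (1050 Ti) is ~132 mm^2; GP104 (1070/1080) is ~314 mm^2.
# Chip prices are hypothetical, but big dies sell for disproportionately more.
for name, area_mm2, chip_price in [("GP107", 132, 25), ("GP104", 314, 100)]:
    n = good_dies(area_mm2)
    print(f"{name}: {n} good dies x ${chip_price} = ${n * chip_price:,} per wafer")
```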
Posted on Reply
#47
kapone32
CasecutterWhile we hate to see this, I see there is a proposition for gamers to bring back into the market such cards; although, yes... at reasonable prices. Stuff miner can't necessarily work with anymore from a perf/power, but still gives something good for 1080p/1440p, all while using unrealized wafer/fab production.

I would think AMD could shuffle back to Gloflo and work Polaris 30 (12nm LP) back in, and then release stuff like a RX 675 ($160) & RX 695 ($200) with nothing more than slight tweak to clocks.
If they want to serve gamers, the best they could do is release cards with 4 GB of VRAM for 1080p gaming. Having said that, I wager that a large percentage of these are cards coming from the distribution market at $179, while the ones that are old inventory in retail channels are the uber-expensive ones, as the pricing algorithms may not be smart enough to see that they are the same card.
Posted on Reply
#48
henok.gk
The 1050 Ti operates on a 128-bit bus, not 192-bit. Also, whoever buys it for 500 bucks is lost beyond any help.
Posted on Reply
#49
qubit
Overclocked quantum bit
Vayra86Okay buddy, sorry if I hurt your pretty GPU ;)

Offensive? When did you become a snowflake? Wow, man.

If you can get past your emotions... you may note my comment is aimed at the margins of Turing and therefore its potential in the market. Those dies are big and there isn't a lot of fab capacity, and on top of that, the node was a one-off for TSMC and for Nvidia. Pascal was made in much larger volumes on a cheaper node and it also doesn't contain RT cores.

Yes, I'm blanket calling it shit, like it was since release and like I'll always do. It's clear as day. The dies are too big and the gap with Pascal is too small. Why did Ampere leap ahead in absolute perf? Because Turing was such a weak gen compared to Pascal. Ampere leaps not only on shader perf but also on RT perf. Ergo, the Turing performance delta per square mm is just too low to repeat.

As for expertise... you assume too much. This is why I'm correcting your comment: it does justify their re-release of Pascal. I don't like it either, but doing more Turing is a clear no.
None of what you say actually negates my post, really.

I've already said that NVIDIA had to make compromises, so yeah, Turing isn't quite as good as it could have been, and it's first-gen RTX too; the reviews said so. It's not the end of the world though, and they can still make decent cards out of it, so I think they should use those instead of Pascal. In the end, neither of us has all the facts in front of us to fully understand why NVIDIA made this decision, so we shouldn't be too judgemental about Turing.

The big performance leap also reflects further development of RTX as well as a better process node - and let's not forget the competition from AMD that simply wasn't there when Turing came out. NVIDIA wants to be top dog at almost any cost, so it's not that surprising that Ampere has a big performance uplift to make it as hard as possible for AMD to catch up, and so far they're winning. We'll see in a year or two if this still holds true, but I think it will.

Anyway, you worry too much about my emotions, which have nothing to do with it. Blanketing the whole Turing line as "shit" without knowing those facts is just being judgemental and a bit ignorant, really.
Posted on Reply
#50
Vayra86
efikkanIf they were running new batches, then they would very likely be running some of the bigger dies, which give more profit per wafer.
qubitNone of what you say actually negates my post, really.

I've already said that NVIDIA had to make compromises, so yeah, Turing isn't quite as good as it could have been, and it's first-gen RTX too; the reviews said so. It's not the end of the world though, and they can still make decent cards out of it, so I think they should use those instead of Pascal. In the end, neither of us has all the facts in front of us to fully understand why NVIDIA made this decision, so we shouldn't be too judgemental about Turing.

The big performance leap also reflects further development of RTX as well as a better process node - and let's not forget the competition from AMD that simply wasn't there when Turing came out. NVIDIA wants to be top dog at almost any cost, so it's not that surprising that Ampere has a big performance uplift to make it as hard as possible for AMD to catch up, and so far they're winning. We'll see in a year or two if this still holds true, but I think it will.

Anyway, you worry too much about my emotions, which have nothing to do with it. Blanketing the whole Turing line as "shit" without knowing those facts is just being judgemental and a bit ignorant, really.
Fair enough. Good points.
Posted on Reply