Wednesday, February 17th 2021
NVIDIA Seemingly Begins Resupplying GeForce GTX 1050 Ti GPUs
In a move that speaks volumes about the current state of the semiconductor market, NVIDIA has apparently begun resupplying retailers with 5-year-old Pascal-based GTX 1050 Ti graphics cards. At some retailers (namely, Newegg), the card can still be found at $499, a vestige of tight supply since its discontinuation and a result of the constrained GPU market. However, retailers that have received fresh stock of the 14 nm, 4 GB GDDR5-toting graphics card are listing it at $179 - still above the card's asking price at release five years ago, which was set at $140. The GTX 1050 Ti features a 128-bit memory bus and a whopping 768 shading units.
Resupplying this card means that customers looking at the lower end of the spectrum now have a feasible alternative to non-existent options in the RTX 3000 series. Equivalent models in the 2000 series are also hard to come by, and marred by much higher pricing. The choice of the GTX 1050 Ti with its 4 GB of GDDR5 isn't an innocent one; it actually skirts two problems with current-generation hardware. First, constraints on GDDR6 memory allocation, which is becoming a bottleneck for new graphics card manufacturing on account of the increasing number of chips employed in each individual card, as well as its deployment in latest-gen consoles. Second, 4 GB of VRAM is no longer enough to fit the current Ethereum mining workload fully into memory, which means these cards also skirt mining demand. It is, however, a sobering moment for the industry and for any enthusiast who wants to see the progress we have been so readily promised.
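The 4 GB cutoff follows from how Ethereum's Ethash DAG grows over time. As a rough sketch - using the spec's initial-size and per-epoch growth constants, and ignoring the prime-adjustment step that trims each epoch's size slightly - the DAG reaches 4 GiB around epoch 384, i.e. late 2020:

```python
# Minimal sketch (constants taken from the Ethash spec): estimate the
# epoch at which Ethereum's DAG outgrows a 4 GiB card. The real DAG is
# slightly smaller due to a prime-adjustment step, which this ignores.
DATASET_BYTES_INIT = 2**30    # ~1 GiB DAG at epoch 0
DATASET_BYTES_GROWTH = 2**23  # ~8 MiB added per epoch
EPOCH_LENGTH = 30000          # blocks per epoch

def approx_dag_bytes(epoch: int) -> int:
    """Approximate DAG size in bytes at a given epoch."""
    return DATASET_BYTES_INIT + DATASET_BYTES_GROWTH * epoch

# Find the first epoch whose DAG no longer fits in 4 GiB of VRAM.
epoch = 0
while approx_dag_bytes(epoch) < 4 * 2**30:
    epoch += 1

print(epoch)                 # first epoch at or above 4 GiB: 384
print(epoch * EPOCH_LENGTH)  # corresponding block number: 11520000
```

Block ~11.5 million fell in late December 2020, which is when 4 GB cards effectively dropped out of Ethereum mining (in practice a little earlier, since not all of a card's VRAM is usable for the DAG).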
Sources:
Tech YES City @ YouTube, via Videocardz
50 Comments on NVIDIA Seemingly Begins Resupplying GeForce GTX 1050 Ti GPUs
Starting production of a new batch of old chips would require ~4 months for the chips + ~1 month for assembly, providing they had free production capacity, which they probably don't. Foundry production lines usually don't stop until they are worn out or are upgraded to a newer node. The demand for all the "16nm class" nodes is tremendous, and they are probably fully booked. I doubt Nvidia could get many extra wafers without paying someone else to give up their reserved capacity. Highly unlikely.
It's also illegal to market refurbished cards as new.
That reminds me, why not bring the 1060 3 GB back?
The 2060 would be a much more interesting option...
I would think AMD could shuffle back to GloFo and work Polaris 30 (12 nm LP) back in, and then release something like an RX 675 ($160) and RX 695 ($200) with nothing more than a slight tweak to clocks.
I'm thinking the pricing could well be comical.
"*Yawn* wake me up when consoles can play 4k ultra like the RTX 3090ti" they would say.
The flagship GPUs, although profitable, have simply taken too many resources to sustain.
Now they look on like desperate peasants with their 1050ti while console gamers are playing with much more serious hardware.
C'mon Raja, you can do this....
The gen was crap. And it still is. Said it many times... RT is too costly.
I don't know if NVIDIA used the latest node to make it or not, but you can't just blanket call it "shit", like you're some kind of expert, which you're not. Besides being offensive with language like that, what do you know about the compromises that NVIDIA had to make due to the resources available to them? In particular, what the fab is able to offer them seems to be the biggest one. New, cutting edge nodes are usually fully subscribed and aren't able to push out as much volume as a more mature node, so they've gone with what's best at the time, given all the variables.
Hence, my comment stands that it doesn't justify releasing 5 year old tech, which by definition has to perform worse and have less features.
Offensive? When did you become a snowflake? Wow, man.
If you can get past your emotions... you may note my comment is aimed at the margins of Turing and therefore its potential in the market. Those dies are big and there isn't a lot of fab capacity, and on top of that, the node was a one-off for TSMC and for Nvidia. Pascal was made in much larger volumes on a cheaper node and it also doesn't contain RT cores.
Yes, I'm blanket calling it shit, like I have since release and like I always will. It's clear as day. The dies are too big and the gap over Pascal is too small. Why did Ampere leap ahead in absolute performance? Because Turing was such a weak gen compared to Pascal. Ampere leaps not only in shader perf but also in RT perf. Ergo, the Turing performance delta per square mm is just too low to repeat.
As for expertise... you assume too much. Hence why I'm correcting your comment: it does justify their re-release of Pascal. I don't like it either, but doing more Turing is a clear no. It may sound implausible, but at the current rate I wouldn't dismiss the idea. The demand is for graphics cards - not specifically RT-enabled ones, especially since all cards are overpriced now. Adding cost is not an option. Restarting production on older nodes is fast becoming viable. You also have to consider that the demand problem isn't new; it's been present for at least half a year now, and was already building before that.
1. These are leftover surplus chips NVIDIA is trying to offload onto customers who don't know any better.
2. These cards are refurb/rebranded models being offloaded en masse by mining farms, as they can no longer mine Ethereum with 4 GB of memory.
I suspect more and more that it's the latter, as the timing is pretty much dead-on for a mass dump of these cards onto the market from miners. They could literally just slap a new hs/fan/plastic shroud on the same PCB (or just the plastic shroud, really) and re-sell it as a 1050 Ti. Avoid these trash cards at all costs; even at MSRP they just aren't worth it for anything other than an emulator machine.
I still need to replace my broken RX570 but I can wait until the end of the year when hopefully things will have settled down a bit. I refuse to pay anything over £600 for a RTX 3070, so until they come back down to something closer to the MSRP I'm gonna keep the money in my pocket.
I've already said that NVIDIA had to make compromises, so yeah, Turing isn't quite as good as it could have been and it's first gen RTX too, the reviews said so. It's not the end of the world though and they can still make decent cards out of it, so I think they should use those instead of Pascal. In the end, neither of us have all the facts in front of us to fully understand why NVIDIA made this decision, so we shouldn't be too judgemental about Turing.
The big performance leap also reflects further development of RTX as well as a better process node - and let's not forget the competition from AMD that simply wasn't there when Turing came out. NVIDIA wants to be top dog at almost any cost, so it's not that surprising that Ampere has a big performance uplift to make it as hard as possible for AMD to catch up, and so far they're winning. We'll see in a year or two if this still holds true, but I think it will.
Anyway, you worry too much about my emotions, which have nothing to do with it. Blanketing the whole Turing line as "shit" without knowing those facts is just being judgemental and a bit ignorant, really.