Monday, May 21st 2018

NVIDIA GeForce GTX 1050 3GB Memory Bus-width Confirmed: A Major Trade-off

NVIDIA's entry-level GeForce GTX 1050 launched in a new 3 GB variant earlier this month, with 50 percent more memory than the 2 GB of the original GTX 1050. But there's a major catch relegated to the fine print of the card's specifications on NVIDIA's website, one that most NVIDIA AIC (add-in-card) partners won't blare on their product packaging anywhere near as loudly as the memory amount: the memory bus width. The 3 GB GTX 1050 has 50 percent more memory than the original GTX 1050, but a 25 percent narrower memory bus, at just 96-bit.

When you look at a GTX 1050 3 GB graphics card PCB, you'll likely find only three 8 Gb (1 GB) memory chips, with one set of memory-chip traces blanked out. It's not as if NVIDIA compensated for the narrower memory bus with higher memory clocks, either: the chips run at the same 7 Gbps as the original's, yielding just 84 GB/s of memory bandwidth, compared to the original's 112 GB/s. The CUDA core count of the GTX 1050 3 GB matches that of the GTX 1050 Ti, at 768 CUDA cores, which twiddle their thumbs while data is moved between the GPU and memory via Pony Express. Besides the additional CUDA cores, the GPU clocks are marginally higher, at 1392 MHz base and 1518 MHz GPU Boost, compared to 1354/1455 MHz for the original. NVIDIA, which recently sermonized the industry on "making products easier for consumers to identify" with its stillborn GPP, is once again caught concealing a major specification. To find it, you'll need to visit the GTX 1050's product page, scroll all the way down to the specs sheet, and click "view full specs" to reveal the memory bus width.
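The bandwidth figures above follow directly from bus width and per-pin data rate; a quick sketch of the arithmetic:

```python
def mem_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width (bits) x per-pin data rate (Gbps) / 8."""
    return bus_width_bits * data_rate_gbps / 8

# Original GTX 1050: 128-bit bus, 7 Gbps GDDR5
print(mem_bandwidth_gb_s(128, 7.0))  # 112.0 GB/s
# GTX 1050 3 GB: 96-bit bus at the same 7 Gbps
print(mem_bandwidth_gb_s(96, 7.0))   # 84.0 GB/s
```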

43 Comments on NVIDIA GeForce GTX 1050 3GB Memory Bus-width Confirmed: A Major Trade-off

#26
newtekie1
Semi-Retired Folder
I'd be interested to see the performance of this card, I bet it will be better than a 2GB 1050 in most games (and not because of the extra memory). I know when I'm playing games on my GTX 1050, the memory controller load is pretty much never over 65%, even on pretty demanding recent games like FC5. I mean, honestly, a 128-bit memory bus was probably a little overkill for a 1050 anyway.
Posted on Reply
#27
owen10578
Why does NVIDIA keep pushing out these BS misleading products? And then blare about trying to be transparent with GPP?? I mean wtf.
Posted on Reply
#28
Ruru
S.T.A.R.S.
cucker tarlson: 780 Ti will destroy 1050. Even in 2018 games.
Absolutely.

Posted on Reply
#29
Adamkalnoki
I was looking at the picture with the specs and thinking what's wrong with it, then I saw the 96-bit memory interface :D
Posted on Reply
#30
geon2k2
So there are some defective chips which must be sold.
Nice. I think that unless there is clearly written proof, all 1050s should be considered as having a 96-bit bus when making the buying decision, and cost/performance should be adjusted accordingly.
If you are lucky enough to get a better one, good for you; otherwise you get what you paid for.
Posted on Reply
#31
megamanxtreme
I always believed that the GTX 1050 Ti was the real 1050 and the 1060 3 GB is the real 1050 Ti.
Either way, let's see how these cards perform; here's hoping for an 1150 performing at the level of the 1060 3 GB.
Posted on Reply
#33
Caring1
ssdpro: NVIDIA concealed that memory bus width so bad - it is amazing how well they "concealed" it on the public specifications page. Only the most hardened investigative consumer would know to look at the product specifications buried deep inside something as complicated and little known as "the internet".
Unless it is marked clearly on the box, the average mom and pop buying junior an upgrade wouldn't know the difference.
Posted on Reply
#34
T4C Fantasy
CPU & GPU DB Maintainer
Caring1: Unless it is marked clearly on the box, the average mom and pop buying junior an upgrade wouldn't know the difference.
very laughable when the kid hits the parent with the box calling them a loser
Posted on Reply
#35
newtekie1
Semi-Retired Folder
Caring1: Unless it is marked clearly on the box, the average mom and pop buying junior an upgrade wouldn't know the difference.
Or likely care.
Posted on Reply
#36
Ruru
S.T.A.R.S.
But just why, NVIDIA, why? These aren't for gaming, and if someone needs a cheap NVIDIA card, there's already the GT 1030..
Posted on Reply
#37
Unregistered
dj-electric: Price and performance will decide if there's a market for the GTX 1050 3GB, and nothing else. It can have a 64-bit memory controller for all I care.
People should stop obsessing over paper specs, this isn't 2008 anymore.
Besides, memory is the easiest to OC.
#38
Assimilator
Oh COME ON NOW. This blatant clickbait has become bloody fucking unconscionable, btarunr.

Let's look at what you wrote when AMD did the same thing for the RX 560:
btarunr: The phenomenon of Radeon RX 560 graphics cards with 896 stream processors is more widespread than earlier thought. It looks like RX 560 cards with 896 stream processors will be more widely available than the previously thought Greater China region; with AMD silently editing the specifications of the SKU to have either 896 or 1,024 stream processors, as opposed to the 1,024 it originally launched with. There are no clear labeling guidelines or SKU names to distinguish cards with 896 stream processors from those with 1,024.

The Radeon RX 560 and the previous-generation RX 460 are based on the 14 nm "Polaris 11" silicon, which physically features 16 GCN compute units (CUs), each packed with 64 stream processors. The RX 560 originally maxed this silicon out, with all 16 CUs being enabled, while the RX 460 has two CUs locked. The decision to change specs of the RX 560 effectively makes it a re-brand of the RX 460, which is slower, and provides fertile grounds for bait-and-switch lawsuits.
Do you see the difference between that article and this one? The fact that the RX 560 news post is almost entirely factual and devoid of FUD, while this GTX 1050 "news" is full of opinionated anti-NVIDIA half-truths and snide remarks? The fact that you obviously have an agenda? The fact that said agenda makes it clear that TPU cannot be relied upon to report impartially, and thus I might as well go to Wccftech or SemiAccurate to get news?

It takes a decade to build a reputation, and 5 minutes to destroy it. You're doing pretty well on the latter front.
Posted on Reply
#41
DeathtoGnomes
Assimilator: Oh COME ON NOW. This blatant clickbait has become bloody fucking unconscionable, btarunr.

Let's look at what you wrote when AMD did the same thing for the RX 560:
Do you see the difference between that article and this one? The fact that the RX 560 news post is almost entirely factual and devoid of FUD, while this GTX 1050 "news" is full of opinionated anti-NVIDIA half-truths and snide remarks? The fact that you obviously have an agenda? The fact that said agenda makes it clear that TPU cannot be relied upon to report impartially, and thus I might as well go to Wccftech or SemiAccurate to get news?

It takes a decade to build a reputation, and 5 minutes to destroy it. You're doing pretty well on the latter front.
Instead of attacking the OP, why not gather your own facts and press releases and post your own news? It's pretty easy to point fingers and say someone else is doing this or that wrong without walking in their shoes first; might as well drive from the back seat too.

So feel free to crawl back in your hole, no one is stopping you. :toast:
Posted on Reply
#42
dj-electric
DeathtoGnomes: Instead of attacking the OP, why not gather your own facts and press releases and post your own news. Its pretty easy to point fingers and say someone else is doing this or that wrong without walking in their shoes first, might as well drive from the backseat too.

So feel free to crawl back in your hole, no one is stopping you. :toast:
@btarunr is here to post news, and he should stick to the most neutral point of view possible, IMO. Just keeping it clean.
Now, it seems like @Assimilator is very passionate in his opinion on the matter, but he isn't completely wrong.
It is no secret that bta doesn't really like NVIDIA atm..
btarunr: For this GPP assholery alone, my next VGA purchase will be an AMD Radeon, no matter how bad its performance/Watt is.
And I do agree that this anger might... will, it seems... completely spill into news pieces. I get bta's anger, 100%, but I think it is best to keep the reporting neutral.
Posted on Reply
#43
eidairaman1
The Exiled Airman
qubit: One of the reasons I upgraded from the 780 Ti was because 3GB RAM just wasn't enough with some games, even at 1080p. It certainly had enough GPU power left over. The other reason was DX12 capability.

It should have been 4GB for this new 1050 or discontinue the card, in my opinion.
These are like OEM cards. (Waiting for more people to come here complaining they got a fake card that reads 4 GB or 6 GB when it only has 3 GB.)
Posted on Reply