Monday, May 28th 2018

Gigabyte Introduces Their GeForce GTX 1050 3GB OC Video Card

We've gone from rumors, through somewhat disappointing listed specs, to an actual AIB product hitting store shelves. Gigabyte appears to be the first NVIDIA partner out of the gate with its own version of NVIDIA's GeForce GTX 1050 3 GB video card - a card whose specs have been heavily cut down from the original's, and one that has nonetheless generated more than its fair share of buzz among the tech crowds.

Whether or not performance is severely hampered by the 96-bit bus width of the new NVIDIA graphics card remains to be seen (it can't be good for performance, now can it?). Even though NVIDIA already increased core count and clock speeds to compensate for the reduced memory bandwidth, Gigabyte has naturally introduced a small OC of its own, allowing the card to boost up to 1582 MHz (1417 MHz base and 1556 MHz boost in Gaming Mode; 1442 MHz base and up to 1582 MHz boost in OC Mode). Gigabyte employs its Windforce 2X cooler with two 80 mm fans to keep the card cool and allow maximum boost headroom. Connectivity-wise, there's 1x DVI-D, 1x HDMI 2.0b and 1x DisplayPort 1.4 (up to three simultaneous displays are supported).
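For context on the bandwidth cut, here is a quick back-of-the-envelope sketch. It assumes the 7 Gbps GDDR5 used on reference GTX 1050 boards; Gigabyte has not published the memory clock for this model, so treat the data rate as an assumption:

```python
# Peak theoretical memory bandwidth = bus width (bits) / 8 * effective data rate (GT/s)
# Assumes 7 Gbps GDDR5, as on reference GTX 1050 boards.

def bandwidth_gbs(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return bus_width_bits / 8 * data_rate_gtps

full_bus = bandwidth_gbs(128, 7.0)  # GTX 1050 2GB: 112.0 GB/s
cut_bus = bandwidth_gbs(96, 7.0)    # GTX 1050 3GB: 84.0 GB/s

print(f"128-bit: {full_bus:.1f} GB/s, 96-bit: {cut_bus:.1f} GB/s")
print(f"Bandwidth lost: {1 - cut_bus / full_bus:.0%}")  # 25%
```

In other words, dropping one of the four 32-bit memory channels (one GDDR5 chip) costs a quarter of the card's peak bandwidth, which is what most of the debate below revolves around.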
Source: Gigabyte
Add your own comment

22 Comments on Gigabyte Introduces Their GeForce GTX 1050 3GB OC Video Card

#1
Caring1
3% boost, much wow.
Although I'm sure they couldn't do much more without adding a supplementary power connector, as the card currently draws all of its power through the PCIe slot.
Posted on Reply
#2
dj-electric
This card was not murdered by anyone but the media itself, which proclaimed its death before reviewers even had a chance to show its performance.
How about cutting some slack on those ridiculous superlatives and being less dramatic about something like this?

Nvidia has proven time and time again that it can do magic with tight-memory-bandwidth cards. Prejudging feeds misconception and misinformation.
Posted on Reply
#3
Space Lynx
Astronaut
Nvidia has 80% market share in the discrete gaming GPU arena; they don't care how much we hate their illogic or bullying. They are swimming in all the money and driver optimization deals.
Posted on Reply
#4
Vya Domus
dj-electricThis card was not murdered by anyone but the media itself, which proclaimed its death before reviewers even had a chance to show its performance.
Except it has nothing to do with that: this card is named "GTX 1050", yet it shares almost nothing with the original product. Nvidia is slowly but surely destroying the relevance of its naming scheme in order to confuse buyers. They are selling 1060s which aren't really 1060s, and 1030s which aren't really 1030s, and this is no exception. Introducing a new product under the name of an already existing one is more problematic than its performance; that sparks true misinformation, not prejudice.
dj-electricNvidia has proven time and time again that it can do magic with tight-memory-bandwidth cards.
There's no such thing as magic: you either use higher-speed memory with fewer memory chips, or vice versa. Here, they just decreased the memory bandwidth by removing one chip from the memory subsystem; that's no magic.
Posted on Reply
#5
dj-electric
Vya Domusthis card is named "GTX 1050", yet it shares almost nothing with the original product. Nvidia is slowly but surely destroying the relevance of its naming scheme in order to confuse buyers. They are selling 1060s which aren't really 1060s, and 1030s which aren't really 1030s
Nvidia is the only one defining what a GTX 1060 is, and what a GT 1030 is. This card is named GTX 1050 3GB, and while it certainly may confuse people, that is its name. Whether it is meant solely to confuse customers - that's up for debate and within speculation.
Vya DomusThere's no such thing as magic: you either use higher-speed memory with fewer memory chips, or vice versa.
Memory speed may correlate directly with performance, but it is never in a 1:1 ratio with actual performance. This is exactly why a card, or any product, should be judged on its performance and not on its paper specs. I've seen many people say they will avoid the GTX 1060 6GB only because it has a 192-bit bus, with no other real justification.
Posted on Reply
#6
Readlight
There is no better choice if you cannot afford a GTX 1070, but it will have problems running the best new games at full monitor refresh rate, or on a 4K TV like mine. It can still produce better graphics than a PS4.
Posted on Reply
#7
jabbadap
Caring13% boost, much wow.
Although I'm sure they couldn't do much more without adding a supplementary power connector, as the card currently draws all of its power through the PCIe slot.
It would be quite pointless to put a 6-pin PCIe connector on it; that would ruin the purpose of this card and throw perf/W out of the window. Per TPU's review, the full-fat GP107 draws 60 W at peak and 57 W on average while gaming. This one has one GDDR5 chip fewer (roughly 2 W less), which makes it possible to ramp up clock speeds a bit. Still, I don't think power consumption will be anywhere near the marketed 75 W.
Posted on Reply
#8
Vayra86
dj-electricNvidia is the only one defining what a GTX 1060 is, and what a GT 1030 is. This card is named GTX 1050 3GB, and while it certainly may confuse people, that is its name. Whether it is meant solely to confuse customers - that's up for debate and within speculation.

Memory speed may correlate directly with performance, but it is never in a 1:1 ratio with actual performance. This is exactly why a card, or any product, should be judged on its performance and not on its paper specs. I've seen many people say they will avoid the GTX 1060 6GB only because it has a 192-bit bus, with no other real justification.
Come on. First Nvidia releases a 1060 6GB and a 3GB variant. It becomes very clear to many that these are not just VRAM-differentiated cards, but that there is also a 10% shader-count gap. Yet even today, lots of people STILL miss it. They read 6GB and 3GB, conclude they don't need more than 3GB, and they're done.

Then, in the aftermath of Pascal, we see this: a smaller-VRAM variant of the 1050. Logic would suggest there is also a 10% shader-count gap here, but no, this is a completely different tier of product we're looking at. This is not a 1050; it's a 1040. And the flak it receives is exactly because of that: the name is a bad and obviously misleading choice. You say it's up for debate; I say it's blatantly misleading. And it seems a lot of the press agrees.

Nvidia has always been horrifyingly hard to get clarity on in the entire segment below x60 (and even x60 isn't safe; look at the GTX 660 OEM variants). They keep screwing with everything and you never know what's what. Kepler refresh: we get a 750 Ti... that is in fact a Maxwell card. :kookoo: Mobile: there are warehouses full of misleading nomenclature, including cards that got new names but still carried architectures from years back. Surprisingly enough, they never do this in the midrange to high-end segments, where buyers are actually looking at specs. There, they now differentiate with 'Max-Q'... which effectively means paying full price for a card that performs a full tier below what you're seeing on the box - now it's not shader count or VRAM, but clocks and 'efficiency'.

You can say it's just a name, but it's not. This is a carefully crafted and constantly expanded encyclopedia of misdirection, and all of it is aimed at either pushing consumers to a higher-end product or handicapping them with an underperforming one 'because they chose the cheaper route' - the latter being the real-world effect of these naming schemes. It's not Nvidia's fault, the consumer chose for himself... while in fact he just couldn't make a properly informed choice.
Posted on Reply
#9
dj-electric
You're getting a little confused here. The GTX 1050 has 2GB as default. This one increased VRAM by 50%.
Posted on Reply
#10
Vayra86
dj-electricYou're getting a little confused here. The GTX 1050 has 2GB as default. This one increased VRAM by 50%.
You're right, I had the 1050 Ti in my head. See? It's working.

It also makes the naming EVEN worse given that bus width. It's a seemingly 'higher' model that will perform worse.
Posted on Reply
#11
dj-electric
No, that's just pure confusion on your end. The product does not have "Ti" anywhere in its name. If anything, this product is somewhat of an upgrade over a regular GTX 1050.

Media-generated anger > people lighting torches. This is exactly what I was referring to in my first post here.
There are hundreds if not thousands of models in the world that are named very closely to others: power tools, cameras, cars and many more. It's people's job to do proper market research and know what they are buying. A D3300 camera and a 3300D camera are very different things.
Posted on Reply
#12
jabbadap
Vayra86You're right, I had the 1050 Ti in my head. See? It's working.

It also makes the naming EVEN worse given that bus width. It's a seemingly 'higher' model that will perform worse.
Have you seen benchmarks somewhere, or how do you know it will perform worse? The only word about its performance comes from Nvidia, which says it will be 10% faster than the 2GB version. Yeah, I know IHVs' numbers should always be taken with a grain of salt, but in pure FP32 grunt it will have 25% more. And yes, it will be more bandwidth-limited than the 2GB version. But remember that one has to dial down settings quite a lot with the 2GB version too, so it's obviously limited by pure FP32 performance as well.
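The 25% FP32 figure is consistent with the published specs. A quick sanity check, assuming the reference boost clocks of 1518 MHz (3GB) and 1455 MHz (2GB) - factory-OC models like Gigabyte's clock higher:

```python
# Theoretical FP32 throughput = CUDA cores * 2 ops/clock (one FMA) * clock
def fp32_gflops(cuda_cores: int, boost_mhz: int) -> float:
    """Peak FP32 throughput in GFLOPS at the given boost clock."""
    return cuda_cores * 2 * boost_mhz / 1000

gtx1050_2gb = fp32_gflops(640, 1455)  # ~1862 GFLOPS
gtx1050_3gb = fp32_gflops(768, 1518)  # ~2332 GFLOPS

print(f"Uplift: {gtx1050_3gb / gtx1050_2gb - 1:.0%}")  # ~25%
```

So the 3GB model has about 25% more raw shader throughput while giving up 25% of the memory bandwidth, which is why the "which limit bites first" question can only be settled by benchmarks.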
Posted on Reply
#13
B-Real
ReadlightThere is no better choice if you cannot afford a GTX 1070, but it will have problems running the best new games at full monitor refresh rate, or on a 4K TV like mine. It can still produce better graphics than a PS4.
I can't really get your point. There are the GTX 1060 and RX 570/580 between the GTX 1050 Ti and the 1070/Vega 56. And what can make better graphics than a PS4? A 1050 or 1050 Ti? Yes, it can, and a PS4 costs about as much as a GTX 1060 3GB.
dj-electricMedia-generated anger > people lighting torches.
It doesn't change the fact that this card doesn't make any sense. Why make a card between the 1050 and 1050 Ti? There is only a ~20% difference between the two. What is the point of releasing a card at the end of the generation that may be 10% faster than a GTX 1050?
Posted on Reply
#14
Vayra86
jabbadapHave you seen benchmarks somewhere, or how do you know it will perform worse? The only word about its performance comes from Nvidia, which says it will be 10% faster than the 2GB version. Yeah, I know IHVs' numbers should always be taken with a grain of salt, but in pure FP32 grunt it will have 25% more. And yes, it will be more bandwidth-limited than the 2GB version. But remember that one has to dial down settings quite a lot with the 2GB version too, so it's obviously limited by pure FP32 performance as well.
I seriously doubt this card's consistency when it comes to minimum FPS and stutter. A 96-bit bus is next to nothing, and 128-bit was already pushing it. The core can still be pretty quick, so if it IS faster than the 2GB version, that also means it will be faster at the cost of consistency: it will do well in game A and fall apart in game B. The vanilla 1050 is already such a card; imagine this one.
dj-electricNo, that's just pure confusion on your end. The product does not have "Ti" anywhere in its name. If anything, this product is somewhat of an upgrade over a regular GTX 1050.

Media-generated anger > people lighting torches. This is exactly what I was referring to in my first post here.
There are hundreds if not thousands of models in the world that are named very closely to others: power tools, cameras, cars and many more. It's people's job to do proper market research and know what they are buying. A D3300 camera and a 3300D camera are very different things.
Let's agree to disagree. You're describing a utopian world of savvy consumers that will never exist, and I'm telling you what I see, what many others experience, and how this marketing works in reality.

It doesn't mean you're wrong, but it does mean that Nvidia is playing exactly the game I'm saying they play. It's a similar discussion to the one many have seen with regard to GPP. One half keeps repeating that they never care about branding, while the other half can support their argument with real-world results that show it to be true ;)

What applies to you and to an ideal world does not apply to the rest of it; keep that in mind...
Posted on Reply
#15
dj-electric
B-RealIt doesn't change the fact that this card doesn't make any sense. Why make a card between the 1050 and 1050 Ti? There is only a ~20% difference between the two. What is the point of releasing a card at the end of the generation that may be 10% faster than a GTX 1050?
It might make the GTX 1050 a bit less obsolete once its 2GB gets full.
Does the 1070 Ti make sense? It's 5% away from a GTX 1080.
How about Vega 56? It's within 10%.
The RX 560D is a story about trimming hardware as well.
Take a peek 11 years back at the X1950 XT and X1950 XTX.

That's the nature of the beast.
Vayra86It doesn't mean you're wrong, but it does mean that Nvidia is playing exactly the game I'm saying they play. It's a similar discussion to the one many have seen with regard to GPP. One half keeps repeating that they never care about branding, while the other half can support their argument with real-world results that show it to be true ;)

What applies to you and to an ideal world does not apply to the rest of it; keep that in mind...
I am not disagreeing at all that this product could, and probably should, be named differently. I'm just saying that this isn't the worst thing in history, and there's probably not much we can do about it right now.
Posted on Reply
#16
Vya Domus
dj-electricDoes the 1070 Ti make sense? It's 5% away from a GTX 1080.
And it has its own distinct name and product entry. I am sure you can tell the difference.
Posted on Reply
#17
newtekie1
Semi-Retired Folder
Vayra86I seriously doubt this card's consistency when it comes to minimum FPS and stutter. A 96-bit bus is next to nothing, and 128-bit was already pushing it. The core can still be pretty quick, so if it IS faster than the 2GB version, that also means it will be faster at the cost of consistency: it will do well in game A and fall apart in game B. The vanilla 1050 is already such a card; imagine this one.
Actually, as someone who owns a GTX 1050, I can say the memory bus is never maxed out when playing games. In fact, when I'm maxing out the GPU in something like GTA V, memory controller load is usually in the 60-65% area.

The reality is that a 128-bit memory bus is probably a little overkill for a 1050 and about right for a 1050 Ti. What we have with the GTX 1050 3GB is a slightly slower GTX 1050 Ti, so calling it a GTX 1050 makes sense. And at the same time, it probably isn't going to perform nearly as poorly as people believe.

And people need to stop obsessing over the memory bus width on nVidia cards. The reality is that nVidia cards simply do not need large memory buses to perform well. Hell, the GTX 1070 has roughly three times as many shaders as a 1050, but only twice the memory bus width. That should tell you something.

Posted on Reply
#18
Vayra86
newtekie1Actually, as someone who owns a GTX 1050, I can say the memory bus is never maxed out when playing games. In fact, when I'm maxing out the GPU in something like GTA V, memory controller load is usually in the 60-65% area.

The reality is that a 128-bit memory bus is probably a little overkill for a 1050 and about right for a 1050 Ti. What we have with the GTX 1050 3GB is a slightly slower GTX 1050 Ti, so calling it a GTX 1050 makes sense. And at the same time, it probably isn't going to perform nearly as poorly as people believe.

And people need to stop obsessing over the memory bus width on nVidia cards. The reality is that nVidia cards simply do not need large memory buses to perform well. Hell, the GTX 1070 has roughly three times as many shaders as a 1050, but only twice the memory bus width. That should tell you something.

A refreshing view on things, to me. It also runs counter to my experience with lower-end Nvidia, coming mostly from a GTX 660. Then again, the 660 and more notably the 660 Ti DID have a major bandwidth issue.

I guess I need to refer to my own sig.
Posted on Reply
#19
dj-electric
"give a man a GPU and he will use it to judge others, teach a man to review hardware and he will have graphics cards for a lifetime"

(yea i know its the other one, but i swear from personal experience, it is true)
Posted on Reply
#20
jabbadap
B-RealI can't really get your point. There are the GTX 1060 and RX 570/580 between the GTX 1050 Ti and the 1070/Vega 56. And what can make better graphics than a PS4? A 1050 or 1050 Ti? Yes, it can, and a PS4 costs about as much as a GTX 1060 3GB.

It doesn't change the fact that this card doesn't make any sense. Why make a card between the 1050 and 1050 Ti? There is only a ~20% difference between the two. What is the point of releasing a card at the end of the generation that may be 10% faster than a GTX 1050?
Nvidia's silly restriction for 4K Netflix is 3GB of VRAM, so this one can play 4K Netflix while the 2GB version can't. But yeah, there are a couple of reasons why they would do this: a) there might be a bunch of perfectly working dies with one broken memory channel, or b) the memory channels might be working fine (it's very rare to get chips with defunct memory channels), but the current GDDR5 shortage keeps memory prices high: with 12 VRAM chips they can make four cards instead of three, and although each sells for a little less than a GTX 1050 Ti, the savings on memory make it more profitable.
Posted on Reply
#21
B-Real
dj-electricIt might make the GTX 1050 a bit less obsolete once its 2GB gets full.
Does the 1070 Ti make sense? It's 5% away from a GTX 1080.
How about Vega 56? It's within 10%.
The RX 560D is a story about trimming hardware as well.
Take a peek 11 years back at the X1950 XT and X1950 XTX.

That's the nature of the beast.



I am not disagreeing at all that this product could, and probably should, be named differently. I'm just saying that this isn't the worst thing in history, and there's probably not much we can do about it right now.
The 1070 Ti didn't make sense either, and I expressed my opinion about that when (and before) it was released. Vega 56 is a product from the other GPU supplier, not from the same company. Your X1950 XT/XTX example fails because those were released together. My main problem is that they are releasing a GPU between the two entry-level gaming GPUs at the END of the GPU generation. If this variety of choice had existed at the beginning (say, the first six months), it would have been nice. This way, it sucks.
Posted on Reply
#22
AnarchoPrimitiv
newtekie1Actually, as someone who owns a GTX 1050, I can say the memory bus is never maxed out when playing games. In fact, when I'm maxing out the GPU in something like GTA V, memory controller load is usually in the 60-65% area.

The reality is that a 128-bit memory bus is probably a little overkill for a 1050 and about right for a 1050 Ti. What we have with the GTX 1050 3GB is a slightly slower GTX 1050 Ti, so calling it a GTX 1050 makes sense. And at the same time, it probably isn't going to perform nearly as poorly as people believe.

And people need to stop obsessing over the memory bus width on nVidia cards. The reality is that nVidia cards simply do not need large memory buses to perform well. Hell, the GTX 1070 has roughly three times as many shaders as a 1050, but only twice the memory bus width. That should tell you something.

Calling the 3GB 1050 a "1050" only really makes sense if the 2GB 1050 had been called a 1040....

It is confusing. Don't forget, I'm sure some of these are bought by very uninformed parents and family members as gifts, or by people just getting into PC building who are too trusting. Either way, I really don't understand the individuals here acting as apologists for Nvidia (unless you own stock). What provokes someone to act as the self-appointed defender of an abstract corporate entity? I'm seriously asking...
Posted on Reply