# MSI GeForce GTX 1650 Gaming X 4 GB



## W1zzard (Apr 23, 2019)

The GeForce GTX 1650 is NVIDIA's latest Turing release, going head-to-head with AMD's Radeon RX 570. MSI's Gaming X is a premium rendition of this chip with a much improved cooler, idle-fan stop, and large overclock out of the box. Noise levels are amazing and make the card whisper quiet.



----------



## Kaotik (Apr 23, 2019)

Why is VESA Adaptive-Sync listed as a positive? The card doesn't support it; it only supports a limited number of Adaptive-Sync displays.


----------



## W1zzard (Apr 23, 2019)

Kaotik said:


> The card doesn't support it; it only supports a limited number of Adaptive-Sync displays.


The "limited number of monitors" are tested by NVIDIA and marked as "compatible". You can still enable Adaptive Sync in the NVIDIA control panel manually, on all FreeSync monitors.


----------



## dirtyferret (Apr 23, 2019)

I know the reason for the six-pin connector is if you want to OC the card, but I thought the whole point of these GTX 1650s was for people who just want to stick one into their Dell, Acer, HP, etc., and not worry about swapping in a higher-wattage PSU.


----------



## A.Stables (Apr 23, 2019)

Thanks for taking the time to review, but if it's head-to-head you should compare against an overclocked 570 (and the 8 GB variant, as it's the same price as this card); there's no reference 1650 to compare to either. It's not like you've been waiting on drivers or anything.


----------



## Deleted member 178884 (Apr 23, 2019)

No wonder Nvidia pulled that driver nonsense with reviewers: mediocre performance and poor value compared to an RX 570.


----------



## Rowsol (Apr 23, 2019)

Sigh. I was hoping this would be a better value than a 1660 but as I feared, it's not.


----------



## dj-electric (Apr 23, 2019)

RX 570 4 GB models are going for less than $130 on Amazon...

Dunno about that one, chief. This is a tough sell.


----------



## John Naylor (Apr 23, 2019)

Have to agree with Wiz's conclusion ... The MSI 1650 overclocked 17%, so it kinda "catches" the Sapphire 570 in performance, "one might say" ... not me though, as 99% of the time I read such a statement, it doesn't account for the fact that the other card can be overclocked too ... just not as much as the 570. With both OC'd, the 570 still has a 10% advantage. And while we know without looking that the 1650 will have the edge in noise (29 vs. 31 dBA), power (80 vs. 180 W), and heat (65 vs. 74 °C), I don't see that as a real attraction: those secondary considerations generally only come into play when performance differences are in the single digits, and here the 570 maintains a 10% edge with both cards OC'd.

Its niche, if ya can call it that, will be folks whose PSUs don't provide 6/8-pin PCIe cables. Those in areas with high electricity costs should also take note. At the US average of $0.11/kWh and 30 hours a week of usage with a Bronze PSU, the 570 will cost an extra $18.42 per year ($55.26 over 3 years). I pay $0.24, so for me that's $44.20 a year ($132.60 over 3 years), which makes the 1650 almost free.

So while AMD doesn't have a horse in the race above this level, it would seem they are well positioned to keep this one.
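For anyone wanting to redo that electricity math at their own rates, here's a quick sketch; the ~100 W gaming delta and ~85% Bronze-PSU wall efficiency are my round assumptions, so tweak to taste:

```python
# Ballpark yearly cost of the gaming-power gap between two cards.
# Assumptions: ~100 W delta (180 W vs 80 W, per the post above) and
# roughly 85% wall efficiency for an 80 Plus Bronze PSU.
def annual_cost(delta_watts, hours_per_week, price_per_kwh, psu_efficiency=0.85):
    # The wall draw exceeds the DC load by the PSU's efficiency factor.
    kwh_per_year = (delta_watts / psu_efficiency) * hours_per_week * 52 / 1000
    return kwh_per_year * price_per_kwh

print(round(annual_cost(100, 30, 0.11), 2))  # -> 20.19 (US average rate)
print(round(annual_cost(100, 30, 0.24), 2))  # -> 44.05 (a pricier region)
```

Close enough to the figures above; the small differences come down to what efficiency and weeks-per-year you assume.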


----------



## jabbadap (Apr 23, 2019)

Yeah, not a bad card, but that price is just too high for what you get. I'm more curious to see the Palit StormX performance numbers, though, than this rather pointless highest-end model of such a tiny GPU. @W1zzard, apparently you don't have any non-OC model either, or do you?

PS. Sorry to be a constant a**, but Hitman 2 got that DX12 patch in the March update.


----------



## xkm1948 (Apr 23, 2019)

The RX 570 is way better than this. Really pointless card. It shouldn't be priced higher than $100.


----------



## THANATOS (Apr 23, 2019)

Xx Tek Tip xX said:


> No wonder Nvidia pulled that driver nonsense with reviewers: mediocre performance and poor value compared to an RX 570.


Against the RX 570 you can say everything is poor value. It's not a bad card, but a lower price would be better, or higher performance if it had 1024 CUDA cores instead of 896.


----------



## jabbadap (Apr 23, 2019)

John Naylor said:


> Have to agree with Wiz's conclusion ... The MSI 1650 overclocked 17%, so it kinda "catches" the Sapphire 570 in performance, "one might say" ... not me though, as 99% of the time I read such a statement, it doesn't account for the fact that the other card can be overclocked too ... just not as much as the 570. With both OC'd, the 570 still has a 10% advantage. And while we know without looking that the 1650 will have the edge in noise (29 vs. 31 dBA), power (80 vs. 180 W), and heat (65 vs. 74 °C), I don't see that as a real attraction: those secondary considerations generally only come into play when performance differences are in the single digits, and here the 570 maintains a 10% edge with both cards OC'd.
> 
> Its niche, if ya can call it that, will be folks whose PSUs don't provide 6/8-pin PCIe cables. Those in areas with high electricity costs should also take note. At the US average of $0.11/kWh and 30 hours a week of usage with a Bronze PSU, the 570 will cost an extra $18.42 per year ($55.26 over 3 years). I pay $0.24, so for me that's $44.20 a year ($132.60 over 3 years), which makes the 1650 almost free.
> 
> So while AMD doesn't have a horse in the race above this level, it would seem they are well positioned to keep this one.



Well, the whole point of this GPU is laptops: there it can really shine, and I'm fairly sure it will pick up many design wins on that side of the fence...

Seeing the 1080p performance here, I'm quite sure Nvidia could pair the full TU117 with fast GDDR6 and just about equal the RX 570. But that would probably mean it needs a 6-pin connector and pushes stock GPU clocks past the optimum.


----------



## dirtyferret (Apr 23, 2019)

John Naylor said:


> Its niche, if ya can call it that, will be folks whose PSUs don't provide 6/8-pin PCIe cables.



But it's not a niche. The GTX 1050 Ti is the second most popular card in the Steam survey (and not all of them use a six-pin connector), and the GTX 1050 is the third most popular. Every time I walk into Best Buy or Costco, all I see are prebuilt PCs with the GTX 1050. The mass PC vendors will buy these cards up in large quantities, as will the people who just want their kid to play Fortnite. Yes, DIY builders will pass it up for better cards, but that's obviously not the market Nvidia is targeting with this.


----------



## ShurikN (Apr 23, 2019)

This card would have made sense if it were powered solely through the PCIe slot, as that fills a relatively large market. That would also be a nice advantage over the RX 570.
Putting a 6-pin on it while still being slower and more expensive than a 570 is kinda pointless. It would be interesting to see how it performs at default Nvidia boost clocks, as those allow a 75 W max draw.


----------



## THANATOS (Apr 23, 2019)

xkm1948 said:


> The RX 570 is way better than this. Really pointless card. It shouldn't be priced higher than $100.


If it cost only $100, it would be way better value than the RX 570; would the RX 570 then be the pointless card in your eyes? This is not a pointless card, but the cost could be a bit lower.

Many of you say the RX 570 is way better because of price/performance but totally ignore its much higher power consumption. You also need to pay the bill for that higher power consumption, and in some countries it's high, so you shouldn't ignore this fact.


----------



## Fluffmeister (Apr 23, 2019)

Yeah, not bad. 90% of the performance of an RX 570 at half its power consumption is nothing to be sniffed at, but $150+ is a bit too steep.


----------



## Deleted member 178884 (Apr 23, 2019)

THANATOS said:


> Many of you say the RX 570 is way better because of price/performance but totally ignore its much higher power consumption. You also need to pay the bill for that higher power consumption, and in some countries it's high, so you shouldn't ignore this fact.


Really? If you're that worried about power consumption, I hope you're using expensive, energy-efficient TVs and fridges too.


----------



## THANATOS (Apr 23, 2019)

I pay €0.12/kWh, so if I play 25 hours per week and 96 W is the difference between the cards during gaming, I have to pay about €15 more per year. A new card would stay with me at least 2-3 years, so the RX 570 would cost me €30-45 more in bills!
So I can't say the RX 570 is so much better than this card (GPU).


----------



## danbert2000 (Apr 23, 2019)

Well, I feel better about putting a 570 in my SO's computer. She just wanted to play Civ 6, and clearly the 570 is still ahead. I'm surprised that Nvidia stuck to $150; I guess they're banking on brand and the unique ability to buy a card without power connectors to justify the premium over the more performant card. Also, I guess $130 is not the "typical" price for 570s.

This is going to be quite the laptop chip, though. A nice bump up from the de facto laptop price/performance king that is the 1050 Ti. And I'm guessing they could do a Max-Q version that would make for some impressive battery life during gaming.


----------



## 3125b (Apr 23, 2019)

Too expensive and too slow. A 4 GB RX 570 is under 120€ over the pond (like 99€ excl. taxes).
The only point of the 1650 is as a 75 W heir to the 1050 (Ti), and that's not very interesting, at least to me.


----------



## THANATOS (Apr 23, 2019)

Xx Tek Tip xX said:


> Really? If you're that worried about power consumption, I hope you're using expensive, energy-efficient TVs and fridges too.


That's not the point and you know it.
You were saying this card has mediocre performance and poor value compared to the RX 570; I just pointed out a big disadvantage of the RX 570.

Some of you people are hilarious. You care how much you pay for a card or whether it has the best perf/price ratio, but you totally ignore how much more you'll pay in bills just to play games on it.


----------



## Deleted member 178884 (Apr 23, 2019)

THANATOS said:


> You were saying this card has mediocre performance and poor value compared to the RX 570; I just pointed out a big disadvantage of the RX 570.


My point being, it doesn't make a significant impact on your bill. And there's something called *undervolting*, which works best on Vega but is worth a shot on the RX 570 too; it will drop power consumption a bit, and the card still kills the 1650 in performance.


----------



## _Flare (Apr 23, 2019)

dirtyferret said:


> I know the reason for the six-pin connector is if you want to OC the card, but I thought the whole point of these GTX 1650s was for people who just want to stick one into their Dell, Acer, HP, etc., and not worry about swapping in a higher-wattage PSU.



If you draw more than 5.5 A (66 W) from the PCIe slot's 12 V rail, you could fry your motherboard.
That's one reason the reference GTX 1050 Ti only uses about 51 W.
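For reference, that 66 W figure is just the slot's 12 V current budget times its voltage:

```python
# A PCIe x16 slot's 12 V rail is budgeted at 5.5 A; the rest of the
# 75 W slot budget sits on the 3.3 V rail.
SLOT_12V_AMPS = 5.5
SLOT_12V_VOLTS = 12.0
print(SLOT_12V_AMPS * SLOT_12V_VOLTS)  # -> 66.0
```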


----------



## THANATOS (Apr 23, 2019)

Xx Tek Tip xX said:


> My point being, it doesn't make a significant impact on your bill. And there's something called *undervolting*, which works best on Vega but is worth a shot on the RX 570 too; it will drop power consumption a bit, and the card still kills the 1650 in performance.


If I can buy a faster RX 570 for 130€ or a GTX 1650 for 160€, it looks like a simple choice; but if I add 45€ to the RX 570, it ends up 175€ vs. 160€, and that's not such a simple choice anymore.
Undervolting won't make up the whole 96 W difference, just part of it, and you can't expect everyone to know how to undervolt their card or to bother doing it.
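The break-even point is easy to sketch, too; a quick calculation using the round figures from this thread (30€ price gap, 96 W gaming delta, 0.12 €/kWh, PSU losses ignored, so these are ballpark numbers, not measurements):

```python
# How many gaming hours until the cheaper card's extra power cost
# eats its upfront price advantage. Round figures from this thread:
# 30 EUR price gap, 96 W gaming delta, 0.12 EUR/kWh; PSU losses ignored.
def break_even_hours(price_gap_eur, delta_watts, eur_per_kwh):
    eur_per_hour = delta_watts / 1000 * eur_per_kwh  # extra cost per gaming hour
    return price_gap_eur / eur_per_hour

hours = break_even_hours(30, 96, 0.12)
print(round(hours))       # -> 2604 gaming hours
print(round(hours / 25))  # -> 104 weeks at 25 h/week, i.e. roughly two years
```

So at 25 hours a week, the price gap is consumed in about two years of ownership, which matches the 2-3 year figure above.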


----------



## thomaskoelln (Apr 23, 2019)

"Radeon RX 570 is considerably faster" - All I needed to know.


----------



## Deleted member 67555 (Apr 23, 2019)

At the end it says the RX 570 doesn't overclock that well...
Is that because it's locked at 1425 MHz core?
I think 175 MHz is more than "not that well".

Anyway, I don't think it matters that this card isn't as fast as an RX 570, since most of these will work in any machine that can fit one without any extra purchases, which is the only reason Nvidia can charge what they are for this.

I paid $300 for a higher-end card in 2013 that required 6-pin and 8-pin power connections and was basically only as good as this card.
Six years later, at 50% of the cost and 4x the efficiency, it seems to be a mehhh... mehhh, well, okay... price


----------



## sutyi (Apr 23, 2019)

jmcslob said:


> At the end it says the RX 570 doesn't overclock that well...
> Is that because it's locked at 1425 MHz core?
> I think 175 MHz is more than "not that well".
> 
> ...



Well... to be honest, none of the relatively current architectures clock all that well. Their boost algorithms do most of the job already, and they're usually artificially power-limited anyway...

Pascal or Turing might do +200 MHz over the advertised boost clock, but that's still only 10-15% more. The days of modding your 9500s and 9500 Pros into 9700s are long gone, mah dude.


----------



## Aldain (Apr 23, 2019)

What an utter piece of shat GPU...

And why the hell does this site not re-test all the cards with proper, up-to-date drivers? What is the point of the TPU database when it's obsolete?


----------



## advanced3 (Apr 23, 2019)

THANATOS said:


> That's not the point and you know it.
> You were saying this card has mediocre performance and poor value compared to the RX 570; I just pointed out a big disadvantage of the RX 570.
> 
> Some of you people are hilarious. You care how much you pay for a card or whether it has the best perf/price ratio, but you totally ignore how much more you'll pay in bills just to play games on it.



If you're really worried about the cost of running a GPU over a couple of years, and it's only $45... which is $0.04 a day, you might be in the wrong hobby.

People who buy exotic cars don't worry about how expensive gasoline is.


----------



## W1zzard (Apr 23, 2019)

Aldain said:


> why the hell does this site not re-test all the cards with proper up to date drivers


i'll be ready to retest in the coming weeks, with a ton of new games, 9900k etc. the problem is that these things take time (around two weeks of non-stop work), and i can't tell nvidia "hey, hold your launch until i'm finished retesting".

the differences between drivers are almost negligible, as has been tested countless times by many publications


----------



## danbert2000 (Apr 23, 2019)

W1zzard said:


> i'll be ready to retest in the coming weeks, with a ton of new games, 9900k etc. the problem is that these things take time (around two weeks of non-stop work), and i can't tell nvidia "hey, hold your launch until i'm finished retesting".
> 
> the differences between drivers are almost negligible, as has been tested countless times by many publications



Does your new test regime include DX12 Civilization 6? Ever since Gathering Storm expansion released, it seems to be the fastest API.


----------



## W1zzard (Apr 23, 2019)

danbert2000 said:


> Does your new test regime include DX12 Civilization 6? Ever since Gathering Storm expansion released, it seems to be the fastest API.


Yes


----------



## rruff (Apr 23, 2019)

advanced3 said:


> If you're really worried about the cost of running a GPU over a couple of years, and it's only $45... which is $0.04 a day, you might be in the wrong hobby.
> People who buy exotic cars don't worry about how expensive gasoline is.



The 570 and 1650 are "exotic cars"?! It's more like a Jetta vs a Prius.

Nvidia will sell a ton of these to fit into bargain department store boxes. And laptops.


----------



## HenrySomeone (Apr 23, 2019)

advanced3 said:


> If you're really worried about the cost of running a GPU over a couple of years, and it's only $45... which is $0.04 a day, you might be in the wrong hobby.
> 
> People who buy exotic cars don't worry about how expensive gasoline is.


Not in the $150 range... I mean, yeah, many people don't consider the difference in electricity bills, but they should, especially if they game quite a bit. Besides, a fully OCed 1650 beats the 570, and if you OC that one as well, its already unreasonable power consumption (and that's compared to the Pascal cards) goes through the roof!


----------



## Deleted member 158293 (Apr 24, 2019)

570 holding up well...  Still!


----------



## HossHuge (Apr 24, 2019)

Glad I didn't wait for this card.


----------



## steen (Apr 24, 2019)

Xx Tek Tip xX said:


> No wonder nvidia had the drivers nonsense with reviewers, mediocre performance and poor value in comparison to a RX 570.


It's fairly obvious they needed to control first-impression reviews. They got bumped from driverless cars, so they switched to driverless cards.


----------



## Nima (Apr 24, 2019)

A little bit expensive for the performance. This GPU is choked by its narrow memory bandwidth; they should have used GDDR6 on it.



Xx Tek Tip xX said:


> My point being it doesn't make a significant impact on your bill, and there's something called *undervolting* which works best on vega but it's worth a shot on the RX 570 even which will drop power consumption a bit and it still kills the 1650 in performance.



I always hear this nonsense that "you can undervolt AMD cards" as if only AMD cards can do it. Undervolting works on all cards, including Nvidia's. There's a thing called the power limit on every Nvidia card that works much like undervolting on AMD cards: Nvidia cards lower their voltage automatically to stay within the specified power limit. My GTX 1080, overclocked on both core and memory and with the power limit set to 120 W, is almost as fast as a stock-clocked GTX 1080 at 180 W! And I'm pretty sure undervolting works just as well on the GTX 1650.


----------



## steen (Apr 24, 2019)

Nima said:


> A little bit expensive for the performance. This GPU is choked by its narrow memory bandwidth; they should have used GDDR6 on it.


MSRP would've been even higher.



> I always hear this nonsense that "you can undervolt AMD cards" as if only AMD cards can do it. Undervolting works on all cards, including Nvidia's.


The effect is greater the higher up the frequency/power curve you are. AMD's older uarch has higher power draw given respective process node/target perf levels. NV buttons down power limits even more with Turing.


----------



## londiste (Apr 24, 2019)

@W1zzard do you have a reference/75W card without 6-pin connector to review?
There is Palit StormX in some of the tables, is that going to get a separate review?


----------



## notb (Apr 24, 2019)

dirtyferret said:


> I know the reason for the six-pin connector is if you want to OC the card, but I thought the whole point of these GTX 1650s was for people who just want to stick one into their Dell, Acer, HP, etc., and not worry about swapping in a higher-wattage PSU.


Most 1650s will not have a 6-pin.
This MSI (much like the other reviewed card, from ASUS) is a top model for this GPU, with a factory OC and some headroom left for you. Its predecessors (both the 1050 and 1050 Ti) also had a 6-pin despite the 75 W TDP.


Xx Tek Tip xX said:


> Really? If you're that worried about power consumption, I hope you're using expensive, energy-efficient TVs and fridges too.


It's not about the peak power usage of the household (although I know people who worry about that as well, because of solar power or tariffs).
It's about the total cost of the card.
At some point (hours of GPU usage), the cost of the extra power used will consume the initial price difference. It's not that hard to calculate, either.
Keep in mind power prices really vary around the globe. In many developed countries electricity is 50-100% more expensive than in the US.

And if the final cost is the same, the GTX gets you less noise and heat.
An extra 100 W can really impact room temperature, and it gets noticeable in summer.


Xx Tek Tip xX said:


> My point being, it doesn't make a significant impact on your bill. And there's something called *undervolting*, which works best on Vega but is worth a shot on the RX 570 too; it will drop power consumption a bit, and the card still kills the 1650 in performance.


1650 is a mainstream consumer card. You can't expect these people to undervolt.
And if we're talking about OEM/SI desktops, you may not be allowed to.


----------



## W1zzard (Apr 24, 2019)

londiste said:


> do you have a reference/75W card without 6-pin connector to review?


I do not



londiste said:


> There is Palit StormX in some of the tables, is that going to get a separate review?


yup, coming within the next few hours, it's still overclocked out of the box, with 75 W power limit, so not exactly reference either


----------



## CheapMeat (Apr 24, 2019)

dirtyferret said:


> I know the reason for the six-pin connector is if you want to OC the card, but I thought the whole point of these GTX 1650s was for people who just want to stick one into their Dell, Acer, HP, etc., and not worry about swapping in a higher-wattage PSU.



It gives you the opportunity to take your GPU with you when you swap the case or move to another system, and to boost it a little. Also, some people have a lower total wattage budget because of their OEM PSU, and OCing this card still keeps them within that budget, unlike another card. The option is great for some people because it gives you room to grow with the card without needing a more expensive one right away. Just my opinion.


----------



## kings (Apr 24, 2019)

The card itself is fine, 35% faster than the 1050 Ti at 1080p; the thing that ruins it is the price! Way too high for this performance level.


----------



## notb (Apr 24, 2019)

kings said:


> The card itself is fine, 35% faster than the 1050 Ti at 1080p; the thing that ruins it is the price! Way too high for this performance level.


Yup, very much the expected outcome: 30% more than the predecessor. Typical for Nvidia.
The card is great, but for me it's just way too long. I'm looking forward to the Ventus tests.

That said, I just can't find a reason to buy a GPU without tensor cores in 2019 (since I'm doing way more GPGPU prototyping than gaming). But if my 1050 dies and I can't afford a 2060, this will be a very nice option (and also double the fps for occasional gaming... why not... ;-))


----------



## R0H1T (Apr 24, 2019)

What's the point of advertising "75 W, no power connector needed" when nearly every card released thus far needs one?


----------



## londiste (Apr 24, 2019)

R0H1T said:


> What's the point of advertising "75 W, no power connector needed" when nearly every card released thus far needs one?


Reviewers are getting the fastest versions of the cards. This is similar to the 1050 Ti, where AIB cards could go up to 120 W (although few had the TDP set that high).
There will definitely be cards available with a 75 W TDP and no power connector.


----------



## R0H1T (Apr 24, 2019)

I get that, but Nvidia is selling this as a feature, isn't it? This card is similar to the GTX 950 LP, not really a successor to the 750 Ti lineage.


----------



## londiste (Apr 24, 2019)

R0H1T said:


> I get that, but Nvidia is selling this as a feature, isn't it? This card is similar to the GTX 950 LP, not really a successor to the 750 Ti lineage.


I would say the opposite. The GTX 950 was not really in the GTX 750 Ti lineage, with its primary TDP set at 90 W and only some LP variants at 75 W. The GTX 750 Ti's real successor was the GTX 1050 Ti, and now the GTX 1650. The difference is where the primary TDP is placed: for the GTX 750 Ti, GTX 1050 Ti, and GTX 1650 it is 75 W. All have hungrier variants, but that is where the primary one sits. The GTX 950 was different: its primary TDP was 90 W, with a 75 W LP as the secondary option.

While almost all of us on the forums consider performance per $/€/£ the most important metric, it is not the only one, and not the primary one for everyone. A 75 W TDP, with the implication that the card can run on PCIe slot power and has no need for a 6-pin cable, is an important optimization point and indeed a selling feature. Making office machines, especially brand-name machines with weak or proprietary PSUs, fit for basic gaming brings in a lot of sales. Not to mention OEMs, who love the same thing for mostly the same reasons. Low-profile cards are bound to follow, and while small, it is a curious niche.

The GTX 1650 will be a strong player in laptops as well.

Edit:
I am sure the cards that actually are 75 W will lose more to the RX 570; my guesstimate would be 15-20%. That would put the 75 W GTX 1650 right in the middle between the GTX 1050 Ti and the RX 570 in performance. The problem is that among 75 W cards the GTX 1650 is going to compete with the GTX 1050 Ti (120€ in Europe at the moment) while being 7% more expensive and offering 20% more performance. That is its primary competitor in this space. The next closest cards are the GTX 1050 and RX 560, which are about equal at 20% lower performance; both sell for around 100€.


----------



## R0H1T (Apr 24, 2019)

The 750 Ti had a 60 W TDP, not 75 W as is the case with the x50/Ti variants that followed. The 1650's peak power consumption, like the 1050 Ti's, exceeds 75 W by a fair bit. I do recall the GTX 950 was cut down to fit 75 W, but I didn't realize it was hacked off so much. Besides, do you remember why the RX 480 was criticized early at launch? When load power consumption is this high, buyers should be worried!

The 6-pin power connector isn't just for show; the variants without it will probably have to be considerably slower so as not to exceed 75 W peak power consumption.
Scratch that: the *lesser* versions aren't that big of a deal. The only thing to look out for is the *large performance delta* between them, especially if a 1650 Ti is also sold at a *75 W* TDP.


----------



## jabbadap (Apr 24, 2019)

The cards without a 6-pin power connector (GTX 1050 Ti / GTX 1050) don't exceed that even at peak. It shows in lower clocks and, of course, lower performance, as you said. I think W1zzard took those "stock" GTX 1050 Ti numbers from the Palit KalmX review.

And that is the reason I'm eager to see numbers for that lurking Palit StormX OC that W1zzard has. As ugly as it looks, that is more the real GTX 1650 people are looking for than these out-of-spec monstrosities.


----------



## londiste (Apr 24, 2019)

R0H1T said:


> Besides do you remember why the RX 480 was criticized early at launch, when the power consumption is this much at load buyers should be worried!


According to several sites that measured power consumption directly, the reference RX 480 was drawing 80 W or more from the PCIe slot's 12 V rail. While the coverage of this was overblown, exceeding the spec at stock is something one should generally try to avoid.

The situation with the GTX 1050 Ti is not the same. As far as these were tested, none of the GTX 1050 Ti models without a 6-pin connector consumed more than the allowed 66 W from the PCIe slot's 12 V rail. There are models with a 6-pin connector that have a higher TDP and do consume more power, but they do so within spec. Of course, models without the 6-pin connector are slower, as they are more restricted by power. The GTX 1650 seems to follow the exact same formula so far.

The ASUS Strix variants of the GTX 1050 Ti and GTX 1650 are both overclocked, have an increased TDP limit, and both have a 6-pin connector to feed the power. There will be a performance difference between base models at 75 W and OC models at higher TDP. The Palit StormX OC review should give a pretty good indication of how large that difference is.

Edit:
As you brought up GTX 1050 Ti numbers, I checked TPU's reviews for different cards (card - peak gaming/Furmark power - average/median clock):
- ASUS Strix GTX 1050 Ti - 83 W / 104 W - 1740 / 1772 MHz
- Palit KalmX GTX 1050 Ti - 58 W / 54 W - 1657 / 1683 MHz
ASUS had a 5% performance lead over the Palit, which is honestly less than I was expecting, although it matches the clock-speed difference.


----------



## notb (Apr 24, 2019)

R0H1T said:


> What's the pont of advertising 75W "no power connector" needed when nearly every card released thus far needs it


Most don't. It's just that most gaming websites, obviously, focus on the most powerful models.
Just look at TPU's history with MSI: the "Gaming" and "Lightning" variants are covered, not much else.
https://www.techpowerup.com/reviews/?category=Graphics+Cards&manufacturer=MSI&pp=25&order=date&p=1

The last MSI ITX card TPU tested was a 760, back in 2014, despite the fact that these cards are really popular. It's just not the client segment TPU attracts.


----------



## A.Stables (Apr 24, 2019)

londiste said:


> @W1zzard do you have a reference/75W card without 6-pin connector to review?
> There is Palit StormX in some of the tables, is that going to get a separate review?



This is all the commentators talking about how you don't need PCIe power, but all the cards reviewed on the net have a PCIe power connector. I'm guessing another 10% less performance without it.


----------



## 64K (Apr 24, 2019)

londiste said:


> According to several sites measuring the power consumption directly reference RX 480 was consuming 80W or more from PCI-e slot 12V. While the entire coverage over this was overblown, exceeding spec at stock is something that one should generally try to avoid.
> 
> The situation with GTX 1050Ti is not the same. As much as these were tested, none of the GTX 1050Ti models without 6-pin connector consumed more than the allowed 66W from PCI-e slot 12V. There are models with 6-pin connector that have higher TDP and do consume more power but they do so within spec. Of course, models without 6-pin connector are slower as they are more restricted by power. GTX 1650 seems to follow the exact same formula so far.
> 
> ...



I don't know what to think about PCIe specs. Remember the R9 295X2 card tested here? It had two 8-pin power connectors, so that would be 300 watts, plus 75 watts from the slot. It would have been within spec at 375 watts, yet it averaged 430 watts gaming, peaked at 500 watts gaming, and drew 646 watts running Furmark.


----------



## W1zzard (Apr 24, 2019)

notb said:


> that most gaming websites, obviously, focus on the most powerful models.


I keep telling companies to sample their cheaper SKUs, I'd love to review cards for the masses, but most don't get it.


----------



## notb (Apr 24, 2019)

A.Stables said:


> This all the commentators talking you don't need a PCIE but all the cards on the net reviewed are with PCIE power im guessing another 10% less perf


eTeknix has tested 4 different models already: 2 with a power connector (Gigabyte Gaming, MSI Gaming) and 2 without (ASUS Phoenix, Palit StormX).
Yes, the overclocked ones (with a connector) are slightly faster, usually by 3-5%.
E.g. Metro Exodus, 1080p high settings: 30 vs. 31 fps.
https://www.eteknix.com/asus-gtx-1650-phoenix-graphics-card-review/10/

Despite all that, the MSI and ASUS cards manage to pull almost 20 W more, simply because they can. Of course, this includes the second fan as well (~5 W).

The Palit is significantly slower because of its smaller heatsink.


----------



## tajoh111 (Apr 24, 2019)

It's funny how long it takes for someone to point out a correction when it's in AMD's favor.

I guess that's what happens when you have such a strong team-red viral marketing team.

http://links.em.experience.amd.com/...zMzNjE4NDc4MzQS1&j=MTQwMTYxNzUxNwS2&mt=1&rt=0



*Offer available through participating retailers only. 18+ only. Following purchase, *Coupon Code must be redeemed by April 6, 2019, after which coupon is void.* Residency and additional limitations apply. For full Terms & Conditions, visit www.amdrewards.com.
Campaign period begins November 15, 2018 and *ends February 9, 2019 or when supply of Coupon Codes is exhausted, whichever occurs first. *Eligible AMD Product must be purchased during Campaign Period. Offer void where prohibited.

https://www.amd.com/en/where-to-buy...r&utm_medium=referral&utm_source=DonanimHaber

The promo page for the game bundle is also dead.

The promo has been mostly dead since February 9th(only obtainable if retailer had extra codes and you emailed the seller personally for the codes).

And it has been 100% dead since April 6th. Look for a card on Newegg that says the bundle is included and you can't find anything. See how quickly bad news comes for NVIDIA cards, yet it has taken nearly three weeks to over two months to point out a correction for AMD that definitely affects the value of their cards. The sad thing is I bet all these AMD fans know it has expired but would rather this information not be posted or corrected, because it helps AMD sell cards.


----------



## R0H1T (Apr 24, 2019)

And why does this need to be posted here?


----------



## tajoh111 (Apr 25, 2019)

R0H1T said:


> And why does this need to be posted here?



Because all the reviews on TechPowerUp for the GTX 1650 mention that the RX 570 comes with two AAA games, when in reality it doesn't; they haven't been included for a while. Do you think it's better for people not to be informed of the actual truth? This should have been corrected sooner. W1zzard is not biased whatsoever, but I would have expected the community to inform him sooner... though I suspect those who do know don't want to say.


----------



## W1zzard (Apr 25, 2019)

tajoh111 said:


> the rx570 comes with 2 aaa games when they dont


oh? let me look into that

edit: this has been fixed, thanks for bringing my attention to it!


----------



## HenrySomeone (Apr 25, 2019)

Lmao, yet another case of AMD not delivering what they promise!


----------



## AMX85 (Apr 25, 2019)

WHY? Why don't people realize this review is using old AMD beta drivers (19.2)?

The actual performance difference between the 570 and the 1650 is higher.

Greetings


----------



## John Naylor (Apr 26, 2019)

I usually don't look at cards in this price range, as I've never been asked to build a PC with a $150 graphics card before. I originally gave AMD an easy win for this tier, but have now realized it's a lot closer than I had originally thought. I took another look and saw that the 1650 actually beat the 570 in TPU's OC test, and with that much of an OC it should compete fairly well, which is a lot better than I expected on first read.

At 1080p, the 570 leads 109% to 100% ... but with each card's OC applied:

MSI 1650 = 100% x 1.155 = 115.5
Sapphire RX 570 = 109% x 1.103 = 120.23

Still a win for AMD, but that makes it only ~4% (120.23 / 115.5) faster ....
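The scaling above can be sanity-checked in a few lines of Python; the 109%/100% baseline and the 15.5%/10.3% OC gains are the figures quoted in this post, nothing else is assumed:

```python
# Relative 1080p performance with each card's measured OC headroom applied.
gtx1650_base, rx570_base = 100.0, 109.0        # TPU relative performance (%)
gtx1650_oc_gain, rx570_oc_gain = 0.155, 0.103  # OC gains from the reviews

gtx1650_oc = gtx1650_base * (1 + gtx1650_oc_gain)  # 115.5
rx570_oc = rx570_base * (1 + rx570_oc_gain)        # 120.23

lead = rx570_oc / gtx1650_oc - 1
print(f"RX 570 lead with both cards OC'd: {lead:.1%}")  # -> about 4.1%
```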

The MSI 1650 is $160 on Newegg with a $45 gaming bundle (its real value will of course vary by individual).
The Sapphire Nitro+ is $130 ... so depending on how much ya value Fortnite, the 1650 is effectively costing ya $115 to $160.

On the other hand, the 1650 has the edge in noise (29 / 31 dBA), power (80 / 180 watts), and heat load (65 / 74 °C) ... the electricity savings kick back $55.26 over 3 years at average US electricity cost. However, most folks don't look at it that way ... most won't get beyond the fact that the 570 is cheaper and 4% faster. So I still have to give the 570 the "win", but it's by no means a big one considering less heat, less noise, a whopping 100 watts less power, and $55 "cash back". In short, my thinking is that I don't see NVIDIA making any effort to get more price-competitive in the near future.
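For anyone curious how a ~$55 electricity figure can come about, here's a rough sketch. The ~4 hours of gaming per day and the ~$0.126/kWh rate are my assumptions (roughly the US residential average), not numbers from the review:

```python
# Back-of-envelope electricity cost of the ~100 W gaming power gap.
watt_gap = 100        # RX 570 ~180 W vs. GTX 1650 ~80 W while gaming
hours_per_day = 4     # assumed gaming time
usd_per_kwh = 0.126   # assumed rate, near the US residential average
years = 3

kwh = watt_gap / 1000 * hours_per_day * 365 * years  # ~438 kWh
cost = kwh * usd_per_kwh
print(f"{kwh:.0f} kWh -> ${cost:.2f} over {years} years")
```

With these assumptions the result lands within pennies of the ~$55 quoted above; a heavier or lighter gaming schedule scales it linearly.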

As far as drivers go ... the 570 is over two years old; if it hasn't been tweaked by this point, it never will be. Both cards are useless above 1080p, and no one needs more than 3 GB to run at 1080p.


----------



## HenrySomeone (Apr 26, 2019)

Precisely, yet somehow almost everyone overlooks this angle. If the 1650 were just a bit cheaper, it would simply be a flat-out better buy than the obsolete "gas guzzler" 570, which when OCed to the max can pull north of 200 W, which is frankly insane for an entry-level card...


----------



## rruff (Apr 26, 2019)

John Naylor said:


> a whopping 100 watts less power



Honest question... can AMD ever be in the same league for power efficiency in games without big changes to their architecture? They seem to do alright in mining and maybe some other compute tasks, but...


----------



## John Naylor (Apr 26, 2019)

Who knows ... remember when AMD folks were making fun of the GTX 480 frying eggs?

https://www.tomshardware.co.uk/gf100-fermi-egg-frying-gtx-480,news-33106.html

Green took a 10% hit in market share with that series while they spent time on a total architecture revamp ... Here's why I think that won't happen again. Back then NVIDIA was chasing the console market, and the best thing that ever happened to their graphics card division was letting go of that market. Yes, it's a whole lot of revenue, but margins are teeny and any bump in the road has to get solved "off book".

As to the mining ... remember that electricity cost also hits profits.


----------



## londiste (Apr 26, 2019)

rruff said:


> Honest question... can AMD ever be in the same league for power efficiency in games without big changes to their architecture? They seem to do alright in mining and maybe some other compute tasks, but...


It has been speculated a lot that GCN itself does put some limits to what AMD can do efficiency-wise. Radeon 7 showed that Vega (the current-latest iteration of GCN) needs a full process shrink to get to the same efficiency level as RTX cards. AMD can get into the same league for power efficiency but they either need a big change in the architecture or to compete with a bigger GPU against a smaller one as a lot of AMD cards seem to be a bit too far along the inefficiency curve.


----------



## HenrySomeone (Apr 26, 2019)

rruff said:


> Honest question... can AMD ever be in the same league for power efficiency in games without big changes to their architecture? They seem to do alright in mining and maybe some other compute tasks, but...


Almost certainly not without a major overhaul, or theoretically maybe on a 3x smaller node (2x smaller obviously doesn't cut it, as shown by the Laugheon 7 vs. the (16 nm!) 1080 Ti).


----------



## jabbadap (Apr 26, 2019)

londiste said:


> It has been speculated a lot that GCN itself does put some limits to what AMD can do efficiency-wise. Radeon 7 showed that Vega (the current-latest iteration of GCN) needs a full process shrink to get to the same efficiency level as RTX cards. AMD can get into the same league for power efficiency but they either need a big change in the architecture or to compete with a bigger GPU against a smaller one as a lot of AMD cards seem to be a bit too far along the inefficiency curve.



Hmh, last time I looked the Radeon VII is nowhere near the efficiency levels the RTX cards have. I think you meant to say the same performance levels, or that it needs another shrink from 7 nm?


----------



## londiste (Apr 26, 2019)

jabbadap said:


> Hmh, last time I looked the Radeon VII is nowhere near the efficiency levels the RTX cards have. I think you meant to say the same performance levels, or that it needs another shrink from 7 nm?


Undervolted, the Radeon 7 fairly regularly gets to around 220-230 W of GPU-only draw without a performance hit. Sometimes less, sometimes more. That is in the ballpark. From undervolting results it can be argued that AMD could get the same efficiency if they only reined in their default voltage settings. The primary reason they do not is that they have a performance level they need to hit, which today is the RTX 2080, and at this point in the efficiency curve the variability is large enough that they feel the need to overvolt a little by default.


----------



## jabbadap (Apr 26, 2019)

Spoiler: Uhoh Radeon off topic, apologies






londiste said:


> Undervolted, the Radeon 7 fairly regularly gets to around 220-230 W of GPU-only draw without a performance hit. Sometimes less, sometimes more. That is in the ballpark. From undervolting results it can be argued that AMD could get the same efficiency if they only reined in their default voltage settings. The primary reason they do not is that they have a performance level they need to hit, which today is the RTX 2080, and at this point in the efficiency curve the variability is large enough that they feel the need to overvolt a little by default.



Well, yeah, undervolting is a form of overclocking; some chips excel at it, some don't (the infamous silicon lottery). Then again, the Radeon VII is not even the full Vega 20 chip; with a full-fat Vega 20, AMD would not have needed to clock the card as high as it is now, and the binned voltage could have been a bit lower. The same can be said about the RX Vega 64: it can be very efficient at lower voltages, but because of the competition it's clocked way past its optimum point on the efficiency curve.


----------



## rruff (Apr 26, 2019)

londiste said:


> Undervolted Radeon 7 fairly regularly gets to around 220-230W of GPU-only draw without performance hit. Sometimes less, sometimes more. This is in the ballpark. From undervolting results it can be argued that AMD can get the same efficiency if they only reigned in their default voltage settings.



Isn't NVIDIA in the same boat here? Their cards can be undervolted as well. Considering that they can usually be overclocked further at stock voltages, I suspect they could also be undervolted further at stock clocks. I've never investigated it, though. And people have less incentive to try, since the cards are so efficient to start with.


----------

