# Why are people still buying 1050TI's and not RX 570's?



## JRMBelgium (Jan 13, 2019)

Finally a revisited comparison with modern drivers. Thank god!
Now if only he benchmarked 1440p at medium settings, it would actually show what the card could do at that resolution...







Fact: The RX 570 clearly wins in terms of performance.
Fact: The cheapest 1050 Ti that can be found on Amazon costs $167.
Fact: The cheapest RX 570 (+ 2 full AAA games) that can be found on Amazon costs $170.
Fact: The cheapest RX 580 (+ 2 full AAA games) that can be found on Amazon costs $190.
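
Put into cost-per-performance terms, the facts above work out roughly as follows. A quick sketch (the relative-performance numbers are illustrative assumptions for the sake of the arithmetic, not figures from any particular review):

```python
# Cost per unit of relative performance, using the Amazon prices quoted above.
# rel_perf is an assumed performance index (1050 Ti = 1.00 baseline;
# RX 570 ~40% faster, RX 580 ~55% faster) -- illustrative only.
cards = {
    "GTX 1050 Ti": {"price": 167, "rel_perf": 1.00},
    "RX 570":      {"price": 170, "rel_perf": 1.40},
    "RX 580":      {"price": 190, "rel_perf": 1.55},
}

for name, c in cards.items():
    print(f"{name}: ${c['price'] / c['rel_perf']:.0f} per performance unit")
```

Under those assumed ratios, the 570 and 580 both come out roughly a quarter cheaper per unit of performance than the 1050 Ti, which is the whole point being made here.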

So why can I find five 1050 Tis in the Top 50 best-selling cards on amazon.com, including one in 2nd place, and only one 570 card, at place 45?
And it's not just on Amazon: in my country and also in the Netherlands, these cards are being sold for the same price, and the 1050 Ti is still getting most of the sales.

Seriously, Nvidia fan or not, give me one good argument for buying a 1050 Ti and sacrificing that much performance.
It's like consumers want a monopoly. It's like they want every GPU generation to cost 25% more, because that's the message they send to Nvidia this way.
Wanna bet the 3080 Ti will cost $1600? Why not. People buy it anyway. Performance per dollar doesn't matter anymore...

And people are surprised AMD is no longer trying to compete? Why? It's like PC gamers don't want competition, and want the PC platform to die from prices that are too high compared to consoles...


----------



## Shambles1980 (Jan 13, 2019)

Some people just don't know any better. It takes a long time to shake off a stigma.


----------



## Toothless (Jan 13, 2019)

I'd want the 1050 Ti for the lack of a power connector, depending on the build, and much less heat in an ITX.


----------



## bug (Jan 13, 2019)

Because, with AMD's rebrandings it's rather hard to tell what you're getting?


----------



## silentbogo (Jan 13, 2019)

JRMBelgium said:


> Seriously, Nvidia fan or not, give me one good argument of buying a 1050Ti and sacrificing that much performance?


In some places it's still priced on par with the 1060 3GB. That's the biggest reason.
The next reason is that the 1050 Ti does not require additional power.
The third reason is the actual reason for the upgrade, plus brand loyalty: going from the 9-series to the 10-series is a big leap. Going from RX 400 to RX 500 is a fart in the wind.


----------



## JRMBelgium (Jan 13, 2019)

Toothless said:


> I'd want the 1050 Ti for the lack of a power connector, depending on the build, and much less heat in an ITX.



That actually is a good argument. The only card that comes close from AMD is the 560, but it still loses to Nvidia in terms of performance.


----------



## Fluffmeister (Jan 13, 2019)

At least we can see why AMD chose Strange Brigade to pimp Radeon VII.


----------



## JRMBelgium (Jan 13, 2019)

silentbogo said:


> Going from the 9-series to the 10-series is a big leap. Going from RX 400 to RX 500 is a fart in the wind.



It's a big leap because they have the R&D budget to make it a big leap. But if consumers (non-ITX users) keep buying cards that are slower for the same price, it's a "circle of life" kind of thing: Nvidia ends up with more money, releases a new generation and keeps the "better brand" status, people buy the slower cards for the same price again, Nvidia ends up with more money again, etc...

Every time in AMD's history when they actually were able to compete with or even beat Nvidia, people just kept buying Nvidia. Even when benchmark manipulation, game rendering manipulation, and false advertising (remember 3.5GB on the GTX 970, everyone?) get released to the press, people keep buying. And then those same Nvidia buyers bitch about AMD not competing with Nvidia. Why should they? It's not like those buyers are going to switch. If at some point AMD beats Nvidia again, they will just wait for Nvidia prices to drop, so that they can buy Nvidia again.


----------



## rtwjunkie (Jan 13, 2019)

Fluffmeister said:


> At least we can see why AMD chose Strange Brigade to pimp Radeon VII.


I’m confused. Can you tell me why?



JRMBelgium said:


> remember 3.5GB on the GTX 970, everyone? ) gets released to the press, people keep buying


People kept buying it because it had 4GB of VRAM, which rarely impacted anything at 1080p, and it was a damned good card that was actually a price-to-performance winner.


----------



## Fluffmeister (Jan 13, 2019)

rtwjunkie said:


> I’m confused. Can you tell me why?



Voila!

https://www.techpowerup.com/251369/amd-announces-the-radeon-vii-graphics-card-beats-geforce-rtx-2080


----------



## EarthDog (Jan 13, 2019)

JRMBelgium said:


> Now if he only benchmarked 1440p at medium settings, it would actually show what the card could do at that resolution...


They are barely playable (60 fps) cards at 1440p...


----------



## xkm1948 (Jan 13, 2019)

I would choose 570 over 1050Ti any time. I mean at 1080p you would want as much fps as you can possibly get for an entry level card.


----------



## cdawall (Jan 13, 2019)

It is really easy to see why a huge number of people purchased the 1050 Ti over the 570: there was a time, less than 8 months ago, when the 570 cost 2-4 times as much.


----------



## xkm1948 (Jan 13, 2019)

cdawall said:


> It is really easy to see why a huge number of people purchased the 1050 Ti over the 570: there was a time, less than 8 months ago, when the 570 cost 2-4 times as much.



Totally forgot mining was a thing....

Still, the 570 is a good price now.


----------



## NdMk2o1o (Jan 13, 2019)

I bought a used 580 4GB for £120 a month or two back. You won't even get a 1050 Ti for that (unless you buy one of those special Chinese versions that's actually a GTS 450, lol), let alone a 1060 6GB, which is at the same performance level. No-brainer.


----------



## JRMBelgium (Jan 13, 2019)

EarthDog said:


> They are barely playable (60 fps) cards at 1440p...



Hardware.info always benchmarks on medium and ultra. On medium, performance increases by 97% on the RX 570 at 1440p. But you wouldn't know that, because 99% of all reviews, including the ones on TPU, don't benchmark on medium. They'd rather benchmark budget cards on unplayable settings.



cdawall said:


> It is really easy to see why a huge number of people purchased the 1050 Ti over the 570: there was a time, less than 8 months ago, when the 570 cost 2-4 times as much.



I am talking about current sales. Not about sales from 8 months ago.


----------



## Final_Fighter (Jan 13, 2019)

You can actually get RX 570 non-reference designs on auction sites for under $100, shipping included. That's if you are comfortable with a card that was mined on.


----------



## eidairaman1 (Jan 13, 2019)

Not worried about it too much; in mini-ITX systems you can have an external power connector for the 570.


----------



## EarthDog (Jan 14, 2019)

JRMBelgium said:


> medium, performance increases by 97% on the rx 570 at 1440p.


Link please. I'd love to see a 97% difference from ultra to medium. I bet you can see such improvements in VRAM-limited situations... other cards would likely see similar results as well.

Mostly, I bet any improvements are half that or less.

EDIT: here's the link - https://us.hardware.info/reviews/74...polaris-update-hardwareinfo-performance-score

Anyway, the point is that it's a reach as a 2560x1440 card. Users are required to lower settings to reach 60 fps in most titles. I saw the hw.info review and see they average, on medium, 55 fps for the highest OC'd one at 2560x1440. Those titles aren't exactly GPU killers either.

Like cdawall said, it made sense a short time ago... it doesn't now. But anyone just needs to look at the pricing and performance to see that. This isn't a groundbreaking point you are making.


----------



## Solaris17 (Jan 14, 2019)

JRMBelgium said:


> But you wouldnt know that because 99% of all reviews, including the ones on TPU dont benchmark on medium.



Yeah, because I can’t wait to go out and buy a GPU with the intent to play on medium.


----------



## vega22 (Jan 14, 2019)

Mind share.

People think you need an Nvidia.


----------



## lexluthermiester (Jan 14, 2019)

Toothless said:


> I'd want the 1050 Ti for the lack of a power connector, depending on the build, and much less heat in an ITX.


This about sums it up. In raw performance, the RX 570 is better, but raw results are not the sum total of a purchase choice.


----------



## rtwjunkie (Jan 14, 2019)

Solaris17 said:


> Yeah, because I can’t wait to go out and buy a GPU with the intent to play on medium.


A Thousand Times THIS!!

This is the second thread where he's beaten that medium review settings drum.


----------



## Vario (Jan 14, 2019)

Not a surprise; the 1050 Ti is equivalent to a GTX 680/770. The reason for me to buy a 1050 or Ti would be for non-gaming: it uses less power and doesn't need a separate PCIe power cable. When you consider that it isn't much more than an $80 GT 1030 but is a lot more powerful than the 1030, it makes more sense to buy a 1050, or even a Ti, off eBay. Can be had for ~$100.


----------



## sneekypeet (Jan 14, 2019)

I have to agree with Solaris here. When I game on PC, I buy what it takes to run the resolution I have with the best settings available. If I wanted to play on medium settings, I would buy a console!


----------



## cadaveca (Jan 14, 2019)

sneekypeet said:


> I have to agree with Solaris here. When I game on PC, I buy what it takes to run the resolution I have with the best settings available. If I wanted to play on medium settings, I would buy a console!



I had to do exactly this, and have had to for many years.... can you guess why?


You know why, because we talked in voice for hours for years... so you know how true and real I'm being here, and how I can say that although I can see your side of things, I don't think this guy is trolling about this...


I had a 2560x1600 monitor. Since 2007. When the X1950 was still a thing. Remember?


You bloody well believe I gamed on medium, and you heard me moan about it forever...


Then, I ran eyefinity, which you are also aware of. Also, medium settings in Battlefield 3, which you know. We talked about this lots. You and I, for hours and HOURS AND HOURS...


Now, we have 4k. People are getting 60 FPS @ 4K on ultra settings? Since when?


People who buy the latest and greatest monitors don't really have any choice but medium settings, *and it's been that way for not just years... decades now.* This is a high-end user thing, medium graphical settings, because these monitors aren't cheap, they're cutting-edge stuff, and they kill VGAs like they always have... and they are WHY WE NEEDED MULTIPLE GPUs, and why SLI and Crossfire are real tech....


WHY has everyone forgotten the past?


Anyway, I'm out. Enjoy!


----------



## trog100 (Jan 14, 2019)

I bought a 1050 Ti a while back just to go in a small non-gaming machine.. it gave the machine basic low-resolution gaming abilities on a budget and didn't need any extra power connectors..

I don't think people buy a 1050 Ti for real gaming, more just to be able to play the odd game if they feel like it.. it's the perfect card for that kind of purpose..

trog


----------



## EarthDog (Jan 14, 2019)

cadaveca said:


> Now, we have 4k. People are getting 60 FPS @ 4K on ultra settings? Since when?


Since the 2080 Ti, on many titles (and yes, I get the irony there, save your time).

The thing is, nobody wants to run less than ultra... it's a need for some to do so, however. But people don't go into it looking to do so. It's guided by their resolution and budget for the GPU.


----------



## cadaveca (Jan 14, 2019)

EarthDog said:


> Since the 2080 Ti, on many titles (and yes, I get the irony there, save your time).
>
> The thing is, nobody wants to run less than ultra... it's a need for some to do so, however. But people don't go into it looking to do so. It's guided by their resolution and budget for the GPU.




Well, I've moved on to 8k now, so yeah... still medium.

https://www.techpowerup.com/240668/viewsonic-introduces-new-professional-and-enterprise-monitors

I've talked with so many of you guys about how medium isn't that much different visually at least once at some point in the past...


But TPU has never been the place of true high-end computing, so I'm not really shocked about all of this, to be honest.


----------



## EarthDog (Jan 14, 2019)

A good example, for gaming, of overbuying on resolution and being forced to lower settings to play at the native res.


----------



## sneekypeet (Jan 14, 2019)

Call me strange, but I run 4K with a 1080 Ti, since SLI has limits, and I'm still using ultra settings. Most games I can get close to 60 fps, but the game is still "playable" with fewer fps. Frames don't bother me until the game shudders or hitches.


----------



## eidairaman1 (Jan 14, 2019)

https://www.facebook.com/1622923754667622/posts/2000847243541936


----------



## cadaveca (Jan 14, 2019)

EarthDog said:


> A good example, for gaming, of overbuying on resolution and being forced to lower settings to play at the native res.



One could say that buying for ultra settings is the same, and maybe high is the good spot. I dunno. It's the EXACT same thing, but reversed, no?



sneekypeet said:


> Frames dont bother me until the game shudders or hitches.




But you know that I am personally a bit more sensitive to these things, so I seek to avoid any of that at all costs.


----------



## sneekypeet (Jan 14, 2019)

cadaveca said:


> One could say that buying for ultra settings is the same, and maybe high is the good spot. I dunno.
> 
> 
> 
> ...



That's the thing... to me, I don't need a set amount of fps to play a game. I am busy playing rather than reading an fps counter.


----------



## EarthDog (Jan 14, 2019)

cadaveca said:


> One could say that buying for ultra settings is the same, and maybe high is the good spot. I dunno. It's the EXACT same thing, but reversed, no?
> 
> 
> 
> ...


I suppose it can be, sure. But since getting a 144Hz panel and pumping 144 fps of synced goodness in most titles, it would be tough to go back to 60Hz/fps or less. The difference between my PC and my kids' 60Hz/fps is more than apparent... as are the medium and ultra settings (Fortnite, for example).

The underlying goal of most is simply to play the game as the dev intended... looking how it should. Indeed, it can be hard to distinguish differences, but again, unless there are other constraints, users don't buy a PC to game on medium out of the gate; they do so typically because of other limitations, like budget or too high a resolution for the card.


----------



## toyo (Jan 14, 2019)

I doubt people still buy the 1050 Ti over the 570 these days. They are similarly priced where I live. The cheapest 3GB 1060 is +$35 and the cheapest 4GB 580 is +$50. Probably worth going for the 580 instead.


----------



## cadaveca (Jan 14, 2019)

EarthDog said:


> I suppose it can be, sure. But since getting a 144hz panel and pumping close to 144hz of synced goodness, it would be tough to go back to 60hz/fps or less.
> 
> The underlying goal of many is simply to play the game as the dev intended... looking how it should. Indeed it can be hard to distinguish differences but again, unless there are other constraints, users dont buy a PC to game on medium out of the gate.



Right, but we both know that really, ultra isn't what the dev intended either. That's what Nvidia/Intel/AMD want in order to showcase their hardware features... And this used to be out in the open, and not a big deal. I mean, it's the land of things like PhysX and such...

I dunno, man. I actually do kind of buy a PC to game at medium. I'm an average person, with an average budget. People with lots of cash play on high. I shouldn't be able to play "ultra"... ever, unless I have the top-end VGA of a specific brand. Wasn't that the point? When did that get lost?

You, you review hardware. You don't pay for this stuff. I don't know that your opinion is actually relevant about pricing and such, really. Not to dismiss, but you know... you're spoiled with hardware.



sneekypeet said:


> That's the thing... to me, I don't need a set amount of fps to play a game. I am busy playing rather than reading an fps counter.


Well, back when we were discussing this was before microstutter became a term people used. So now I can say I'm sensitive to that, and people know what I'm talking about. That's not a myth, and it's not watching an FPS counter, and it's been proven that other people get motion sickness from games like I do, so at this point I'm pretty vindicated on all of that. Certain FPS intervals make me feel sick. It's not a big deal, but because of that, I seek a consistent FPS. I don't know what that FPS is... it changes depending on what monitor I am using, really.

There used to be a point in time when the price range the OP is talking about, the 570/1050 Ti range, was one of the most common. All of a sudden, if you don't have the uber-leet 40GB GPU, then what you like isn't important? OK. See, I can't write for that over-indulging audience...

So me, it's funny. I've been looking at these two GPUs specifically to go with my TR 1950X. I kinda want the AMD card so I get an all-AMD build, but man, I like Nvidia too...?


I ended up pulling out my 7970 MATRIX and using that for now. That's the last time that AMD was relevant to me when it comes to GPUs. Now I'm looking at all these questions again...

Well, guess what? You're reviewing hardware too, and now I'm back to buying it. It's kind of opened my eyes again as to how important some of the things I started to take for granted while doing reviews really are, and how maybe, sometimes, what real people want gets missed completely. Oh well.


----------



## sneekypeet (Jan 14, 2019)

I think you are stretching my point, @cadaveca. The point is that when I got a 4K screen, I knew I needed strong GPUs to push it. I don't get motion sick, and I don't see the need for huge fps numbers when the screen tops out at 60Hz. I am not saying others cannot have an issue, but with the right equipment you can make the best of it rather than turning everything down.


----------



## cadaveca (Jan 14, 2019)

sneekypeet said:


> I think you are stretching my point, @cadaveca. The point is that when I got a 4K screen, I knew I needed strong GPUs to push it. I don't get motion sick, and I don't see the need for huge fps numbers when the screen tops out at 60Hz. I am not saying others cannot have an issue, but with the right equipment you can make the best of it rather than turning everything down.



Oh, I tried. I tried and tried and tried. But SLI falls down in so many ways... and we can agree that until the 2080 Ti, which is still new, 4K 60 FPS needed SLI, and that my need for it, well, is a rather stupid one, but still, is medical.

And it's not like I play uncommon titles... some work. Most don't with SLI. Like... these damn monitor purchases... ROFL... heh. You know what I'm on about with that.

I mean, yeah, you're right, you can do differently, but I would just interpose that you, as a hardware reviewer, are actually the least-common, so just because something works for you, doesn't mean it works for everyone... as much as we all might wish it did. We both know that the stupid shit like you buying memory a week after I did means you and I have completely different PCs....

Anyway, the easy way to get that smooth FPS-to-monitor-refresh ratio, with a single card, no matter the game... is to just run medium on a single GPU. I mean, like I said, games used to default to medium anyway.... We all know this. It seems we all forgot. I was stuck in the land of not being able to forget, with my bullshit $2500 monitor purchases.


Because you know, I'm not gaming on a $5k PC with a $500 monitor...


----------



## cdawall (Jan 14, 2019)

cadaveca said:


> ANd its not like I play uncommon titles... some work. Most don't on SLI. Like... these damn monitor purchases... ROFL... heh. You know what I'm on about with that.



Which titles do you play where SLI doesn't work? I play most of the AAA games released in the last few years on my pair of 1080 Tis in SLI at 4K 60fps.


----------



## EarthDog (Jan 14, 2019)

Dave, if I didn't review, I'd still be shooting for ultra settings. I would likely be doing it at the same res and Hz as well. I wouldn't be using a 2080 Ti... likely still the 1080, which allowed the titles I play to be above 100 fps anyway. Most of the gaming public is at 1080p. It doesn't take much to game at 60Hz 1080p and ultra; a 1060 6GB will do there. The point is, the goal for everyone is ultra settings where possible. For some it isn't possible, or it's simply a choice for more fps.

But that is ultimately everyone's goal: to run ultra if possible. I'd be floored if anyone intentionally ran under ultra (glitches, etc. aside) if they could match/surpass their refresh rate. It's just a res/budget/game/setting limit.

Edit: I bought two PCs for my kids, out of pocket, for 1080p 60Hz ultra gaming. So while my daily driver is clearly not a common system, I know what it is like to buy them and shoot for a 1080p 60Hz ultra gaming goal with a budget more in line with the masses.


----------



## cadaveca (Jan 14, 2019)

cdawall said:


> Which titles do you play where SLI doesn't work? I play most of the AAA games released in the last few years on my pair of 1080 Tis in SLI at 4K 60fps.


SLI worked in PUBG since when?

Lots of games. Everything. BFV, most recently, although it's working fine now. The F1 games... dude, there's more to gaming than just AAA games... you're proving my point.

I'm just saying that medium settings aren't all that uncommon... as a multi-GPU user ever since monitor resolution made it necessary, medium settings on just one of those GPUs when SLI has issues is just the norm, really.



EarthDog said:


> Dave, if I didn't review, I'd still be shooting for ultra settings. I would likely be doing it at the same res and Hz as well. I wouldn't be using a 2080 Ti... likely still the 1080, which allowed the titles I play to be above 100 fps anyway. Most of the gaming public is at 1080p. It doesn't take much to game at 60Hz 1080p and ultra; a 1060 6GB will do there. The point is, the goal for everyone is ultra settings where possible. For some it isn't possible, or it's simply a choice for more fps.
>
> But that is ultimately everyone's goal: to run ultra if possible. I'd be floored if anyone intentionally ran under ultra (glitches, etc. aside) if they could match/surpass their refresh rate. It's just a res/budget/game/setting limit.



Yeah, budget. That's my only thing. You're right though, most are at 1080p, and gaming at that res is pretty good.

Which brings me back to the OP, and why I looked at this thread: the 1050 Ti is kind of at the limit of that, and the 570 is well within that territory. Because also, at 1080p, when I can't play ultra with these GPUs, you know for sure medium is going to work...

oh, you made some edits. 


Dude, do you actually talk to anyone in the real world, outside of the PC industry, who plays games? Do you talk to anyone in the 16-21 age group? That's my kids. They don't think like dis... they woke up like dis. They turn on the game and play it and don't fuck with settings, for the most part. They expect the PC to do it for them, and that's why things like GeForce Experience are still in use and Nvidia still makes it, even though every single reviewer out there complains about it...

Like, ultimately, you're right, but also... so am I. Until you all can accept that there are many different types of users and that we need not stick to one usage model... well, I got some shit to say. Like, sorry... but you know... I got into doing reviews to be a different voice out there. Sneekypeet knows this full well since it's really due to him that I ended up doing reviews here on TPU for like nearly a decade...


I still feel the same way about much of it all, but I did get to talk to a lot of people that did agree with my side of things over the years. The average user.. the average gamer.. is at a far more average place than you guys want to think. You guys are high-end.


----------



## LightningJR (Jan 14, 2019)

Like most have said: power. The 570 uses about as much power as a GTX 1080, so we're talking about three times the power of a 1050 Ti. AMD does great on price/performance, especially with the 570, but if someone is upgrading and has a poor PSU, then the 1050 Ti is the way to go, unless they pay more for a new PSU.

I really don't understand why AMD needs so much more power to compete. Is it the extra FP64 cores AMD continues to keep in their consumer GPUs, or something else? Whatever it is, they need to fix it, because it is ridiculous that if you buy AMD, you will be paying more than double the power cost of an Nvidia card with similar performance.
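
For scale, the electricity argument is worth putting in numbers. A rough sketch (the wattages, daily gaming hours, and electricity price below are illustrative assumptions, not figures from this thread):

```python
# Annual electricity-cost difference between two cards under load.
# Assumed figures (illustrative): 1050 Ti ~75 W, RX 570 ~180 W while gaming,
# 3 hours of gaming per day, electricity at $0.15 per kWh.
WATTS_1050TI = 75
WATTS_RX570 = 180
HOURS_PER_YEAR = 3 * 365
PRICE_PER_KWH = 0.15

def annual_cost(watts):
    # watts * hours = watt-hours; / 1000 = kWh; * price = dollars
    return watts * HOURS_PER_YEAR / 1000 * PRICE_PER_KWH

extra = annual_cost(WATTS_RX570) - annual_cost(WATTS_1050TI)
print(f"Extra cost per year: ${extra:.2f}")
```

Under those assumptions the difference works out to under $20 a year, so for most buyers the PSU requirement matters far more than the power bill itself.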


----------



## EarthDog (Jan 14, 2019)

I've seen the two threads here that are essentially the same poll, showing that ultra/highest is a goal.

My kids are unicorns... I get it. 11- and 7-year-olds with a 970 and an R9 270. But both systems were built from scratch, used, for under 1k. Now we Fortnite together, and my kids are building machines, and I stink!

The majority of those around that age are on Alienware, Omen, iBuyPower... canned systems.

I accept there are tons of users... but in the end, we would all run ultra if we could get acceptable fps. There are few reasons not to if we _could_.


----------



## hat (Jan 14, 2019)

@cadaveca is making a lot of sense. Not everybody is building a computer with a graphics card which alone costs more than what I make when I work 60 hours in a single week. Those same people are also likely to not play on ultra settings either. I'm not running ultra settings on my 1070. It doesn't take ultra graphics to have a good time playing a game. As long as my games don't look like smeared dog shit in the mud, like they tend to on the _lowest_ settings, I'm good. I don't play much of the latest games, but I do still play a lot of 7 Days to Die... which my 1070 can probably run at the highest settings, but I don't, because the game is a turd and your FPS will tank no matter what... so I stick to med/high settings to keep good performance. In fact, I play with med/high settings on just about every game, because I don't need photorealistic shadows and stuff dragging down my FPS...


----------



## cadaveca (Jan 14, 2019)

EarthDog said:


> I've seen the two threads here that are essentially the same poll, showing that ultra/highest is a goal.
>
> My kids are unicorns... I get it. 11- and 7-year-olds with a 970 and an R9 270. But both systems were built from scratch, used, for under 1k. Now we Fortnite together, and my kids are building machines, and I stink!
>
> ...



Yeah, I mean, I went back to college for the HVAC stuff. These people are half my age. I made friends with some of them and still talk to them. My oldest is about to graduate high school, and my youngest is almost out of elementary. The youngest... all his friends are all about Fortnite. He's 11. My oldest, he's playing what? BFV, BO, Tarkov? My girls play RPGs. My kids have the same systems that yours do, and I have four of those kids, but ALL of their friends... their friends' PARENTS who play games too...

Dude.... you know what I'm on about here, if you interact with other people, I think.

The ONE commonality, I'd say, with all these people who play games, is that they want to have fun, and that fun means the games WORKING. What "working" means, to some people, seems to be a fluid thing. That just makes me think back to the Nvidia study about gamers being willing to accept artifacts in graphics.... and here we are.

So I wanna turn this back to the OP... why do people buy what they do? DUDE. I really know why. Nobody wants to really admit it.


----------



## Darmok N Jalad (Jan 14, 2019)

Every 570 I check has an 8-pin connector and 500W recommendation, and the cards don’t ship with an adapter cable. That pretty much kills its chances in the OEM desktop world, where a buyer can drop in a 1050 Ti without any doubts. I think the 1050 Ti’s low power requirement puts it into a different category than the 570. AMD needs something better than the RX 560 to manage this, but without the power requirements of the 570. A 12nm 570 with reduced clocks maybe?


----------



## cdawall (Jan 14, 2019)

cadaveca said:


> SLI worked in PUBG since when?
> 
> Lots of games. Everything. BFV, most recently, although its working fine now. F1 games... dude, there's more to gaming than just AAA games... you're proving my point.
> 
> I'm just saying that medium settings isn't all that uncommon... as a multi-GPU user since it was possible due to monitor resolution, medium settings on just one of those GPUs when SLI has issues is just the norm, really.



Does PUBG need even one card to play 4K60? A cellphone can pretty much max that garbage out.

BFV is long since fixed, so that is moot.

The F1 games, I'll give you that one, so we have some non-AAA titles that don't work.

SLI is pretty ingrained in everything right now. Yes, there are games that don't work, but those are getting fewer and fewer.

But I guess that really proves your point that everyone should just deal with medium settings. I'll keep enjoying running almost every game maxed out, though. It looks prettier. With that pair of 1080 Tis, mind you, not even the latest cards.


----------



## JRMBelgium (Jan 14, 2019)

EarthDog said:


> Link please. I'd love to see a 97% difference from ultra to medium. I bet you can see such improvements in VRAM-limited situations... other cards would likely see similar results as well.
>
> Mostly, I bet any improvements are half that or less.
>
> ...



TPU users are so used to outdated reviews with old drivers that they even use them to prove a point. Here is a more recent review, showing almost 120 fps average at 1080p and about 88 fps average at 1440p. And let's not forget that the drivers in the past 6 months have improved even further.

https://be.hardware.info/reviews/83...chips-hertest-hardwareinfo-gpu-prestatiescore

We have to keep in mind, though, that these games were benchmarked on a very fast CPU; but still, 1440p 60fps is not unrealistic on a slower CPU.
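
As a sanity check on those quoted averages (roughly 120 fps at 1080p vs 88 fps at 1440p), it helps to compare the fps drop against the raw pixel-count increase. A quick sketch:

```python
# 1440p pushes ~78% more pixels than 1080p, but the quoted averages only
# drop by a factor of ~1.36, so the card is not purely fill-rate limited
# at these (medium) settings.
px_1080p = 1920 * 1080
px_1440p = 2560 * 1440

pixel_ratio = px_1440p / px_1080p   # 16/9, ~1.78x more pixels
fps_ratio = 120 / 88                # ~1.36x, per the quoted review

print(f"pixel ratio: {pixel_ratio:.2f}x, fps ratio: {fps_ratio:.2f}x")
```

Since the fps falls off more slowly than the pixel count grows, dropping from 1440p back to 1080p (or lowering settings further) buys proportionally less than the resolution numbers alone suggest.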


----------



## londiste (Jan 14, 2019)

cdawall said:


> It is really easy to see why a huge number of people purchased the 1050 Ti over the 570: there was a time, less than 8 months ago, when the 570 cost 2-4 times as much.


At launch, the GTX 1050 Ti was priced against the 4GB RX 460, with an MSRP of $139. The RX 470 MSRP was $179 and the RX 570 MSRP was $169.
That was the case for a long while, and the GTX 1050 Ti is a better card than the RX 460/RX 560. I have no clue what is up with actual prices today, putting it against a card that is one class higher up.


----------



## Zubasa (Jan 14, 2019)

I thought people bought the GT 1030 DDR4 over the RX 570 instead.
On a serious note: because people want GeForce even if it costs $2500, and many have no idea that AMD exists.


----------



## cucker tarlson (Jan 14, 2019)

JRMBelgium said:


> TPU users are so used to outdated reviews with old drivers that they even use them to prove a point. Here is a more recent review, showing almost 120 fps average at 1080p and about 88 fps average at 1440p. And let's not forget that the drivers in the past 6 months have improved even further.
>
> https://be.hardware.info/reviews/83...chips-hertest-hardwareinfo-gpu-prestatiescore
>
> We have to keep in mind, though, that these games were benchmarked on a very fast CPU; but still, 1440p 60fps is not unrealistic on a slower CPU.


In your cherry-picked reviews, the gap between Vega 56 and Vega 64 is higher than the one between the 1080 and the 1080 Ti. Tell me more about good testing methodology.


----------



## Vayra86 (Jan 14, 2019)

JRMBelgium said:


> Finally a revisiited comparison with modern drivers. Thank god!
> Now if he only benchmarked 1440p at medium settings, it would actually show what the card could do at that resolution...
> 
> 
> ...



I think the mistake in your thinking is to approach PC gamers as a homogeneous group of people. They aren't. It's probably the most diverse target audience in any marketplace, with a vast majority having no clue about Nvidia/AMD market share, and only a vague idea about performance, about what settings or framerate they want to target, etc. etc. There are also very large groups that exclusively play one or two games - games like Warframe - that are almost never benched anywhere (and run on a toaster - there would be zero noticeable difference in picking a 570 over the 1050 Ti). And there are groups that want a thin/light/silent rig because they only play casual games on the living room TV. And... I could name another few dozen examples of very specific use cases. A large number of those groups are going to want the smallest, lowest-TDP option with the fewest power connectors and the best performance at that. But those groups still don't explain the entire (major) sales gap, I agree.

Look at the market share of display resolutions in the Steam Survey, for example. Higher resolutions, and even 1080p, aren't as dominant as you might expect. And that happens to be the exact segment these 1050 Tis and RX 570s target: casual gaming. Casual gamers. The part of the PC gaming audience that knows very little about all the things we discuss on this forum, or in TPU reviews.

This is an enthusiast website. HWI is just about as casual as it gets for those who read tech sites. YouTube is another one of those entry-level sources of information. It's the reason you also see a lot of inaccuracies and bad conclusions in those regions. HWI is well known for straight-up bad testing and bench results that are WAY off the mark compared to many others doing the same test. It's fine if you look at HWI only as far as relative performance goes. But that is just about where it stops being relevant.

As for your statements on why to pick one over the other, I agree. I'd also pick the faster GPU for my money. But, again: not everyone is keen on, or even capable of, reading all those reviews _and making sense of the numbers while doing it._ It has no relation, however, to whether or not to test on Medium. The HWI link you posted points that out perfectly: the relative performance of all cards in their line-up is perfectly maintained as resolutions and quality settings rise - bar a few exceptions of cards that were already very close together to begin with. Only when you hit unplayable/less comfy framerates (sub ~50 averages) do lower-end cards start falling off (hard).

I remember another topic you fired off just recently where I also kept hammering on that specific point: the relative performance, the GPU hierarchy of these cards, never really changes in any test approach. It only changes when you present light cards with overweight test scenarios - which isn't good info, it just throws you off balance as a reader.

This means that while you DO see a performance gap between Ultra and Medium, there is still no new information to be had. Fast cards be fast, slow ones be slow. And guess what: a year from now, those same cards will have fallen off a bit further, as benchmarks become heavier at every resolution and quality level and faster cards are released. All of this has nothing to do with a preferred quality setting or resolution and everything to do with _time_.



EarthDog said:


> I suppose it can be, sure. But since getting a 144 Hz panel and pumping 144 fps of synced goodness in most titles, it would be tough to go back to 60 Hz/fps or less. The difference between my PC and my kids' 60 Hz/fps is more than apparent... as are medium and ultra settings (Fortnite, for example).
> 
> The underlying goal of most is simply to play the game as the dev intended... looking how it should. Indeed it can be hard to distinguish differences, but again, unless there are other constraints, users don't buy a PC to game on medium out of the gate; they typically do so because of other limitations like budget or too high a res for the card.



Not so sure about that. While I _personally agree_ with you, I think the underlying goal for most is actually just to play - much like @sneekypeet says he does. Most people simply have a budget and get whatever is best for them (available, suited to their rig, known to them, and an upgrade) at a specific point in time. Whatever settings and resolution that enables is secondary. The people who do target a resolution often don't target a quality level, and those who target a specific framerate (higher than 60) also don't target a quality level. The quality level a game is played at is the one thing that is fluid here. Everyone is stuck with their monitor choice in terms of resolution and refresh rate, but they can play around with quality settings.

And ehm... playing a game 'as it was intended by the developer'... I don't know, that sounds to me like something out of the audiophile or movie-fanatic corner, and it has absolutely no place in gaming, where half of what you see is simply what the engine/API has available for you and the other half is determined/limited by hardware. I mean, how do those console gamers even cope with all that trickery to keep their sacred 30 FPS intact? Is that as the developers intended it? I hardly think so... As you can also see in the many polls about quality we saw lately, many comments say that people disable specific settings such as (motion) blur, vignetting, chromatic aberration, DoF, etc. etc. And let's be honest... think back to all those DX9 games with excessive bloom effects that would burn your retinas out. Those went insta-disable in my gaming... It is exactly _that flexibility that makes PC gaming what it is._ And probably also what elevates it beyond console gaming.


----------



## Recus (Jan 14, 2019)

Is there any info showing that people buy the 1050 Ti over the RX 570? Because Steam Hardware Info is not valid according to the red team.

If you read the reviews from when the 1050 Ti came out, you clearly see it was compared with the RX 460. So why compare 3-year-old sales numbers with today's prices?

You can find a used 980 Ti for RX 570 prices. Why do people buy slower GPUs when you can buy gold on the second-hand market?



eidairaman1 said:


> __ https://www.facebook.com/1622923754667622/posts/2000847243541936



So AMD is promoting 4 GB in 2019, but people bash Nvidia for the RTX 2060's 6 GB. OK.


----------



## JRMBelgium (Jan 14, 2019)

Recus said:


> Is there any info showing that people buy the 1050 Ti over the RX 570? Because Steam Hardware Info is not valid according to the red team.
> 
> If you read the reviews from when the 1050 Ti came out, you clearly see it was compared with the RX 460. So why compare 3-year-old sales numbers with today's prices?



The Amazon best-seller lists are based on current sales, not 3-year-old sales numbers.


----------



## Vya Domus (Jan 14, 2019)

Recus said:


> So AMD promoting 4 GB in 2019 but people bashing Nvidia for RTX 2060 6 GB. OK



Sure thing buddy, except one is $350.


----------



## silentbogo (Jan 14, 2019)

JRMBelgium said:


> The Amazon best-seller lists are based on current sales, not 3-year-old sales numbers.


https://www.sellerapp.com/amazon-best-seller-rank.html


> The rank assignment also considers the current sales of the product and the sales history too.
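The quoted idea - that a best-seller rank blends current sales with sales history - can be sketched with a toy decay-weighted score. This is purely illustrative: Amazon's actual ranking algorithm is unpublished, and the decay factor and sales figures below are made up.

```python
# Toy decay-weighted sales score. NOT Amazon's actual algorithm (which is
# unpublished); it only illustrates how current sales can dominate a rank
# while older sales still contribute.
def sales_score(daily_sales, decay=0.5):
    """daily_sales[0] is today; each older day is discounted by `decay`."""
    return sum(units * decay**age for age, units in enumerate(daily_sales))

# Hypothetical histories: card_a had a big spike months ago (think
# crypto-era 1050 Ti sales), card_b sells steadily right now.
card_a = [5, 5, 5, 100, 100, 100]
card_b = [40, 40, 40, 5, 5, 5]

print(sales_score(card_a))  # old spike, heavily discounted
print(sales_score(card_b))  # recent sales weigh the most
```

With this (arbitrary) decay, the recently selling card scores higher even though its total unit count is far lower, which matches the behavior the linked article describes.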


----------



## plonk420 (Jan 14, 2019)

Toothless said:


> Id want the 1050ti for lack of power connector depending on the build and much less heat in an itx.



I liked the lack of a power connector on my GT 430... until I tried to Hybrid PhysX it on a board where all I could use was a 16x-to-1x PCI-E adapter, and without the power connector, it wouldn't work.

But I get it. The power consumption is nice. But that's about all it has going for it.


----------



## Recus (Jan 14, 2019)

Vya Domus said:


> Sure thing buddy, except one is 350$.



So console class gaming then.



sneekypeet said:


> I have to agree with Solaris here. When I game on PC, I buy what it takes to run the resolution I have with the best settings available. If I wanted to play on medium settings, I would buy a console!


----------



## ArbitraryAffection (Jan 14, 2019)

Got my 570 8GB for £150 + 2 free games. Overclocked to 1410 MHz core and 8440 MHz effective memory, it comes close to 1060 6GB performance, and it's nearly 2x the performance of a 1050 Ti for the same price with twice the VRAM. It is capable of ultra settings at 1080p/60 fps in many titles, and 40+ fps at ultra 1080p in basically every game. This is value you simply can't get with Nvidia.
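For what it's worth, that value claim is easy to put in numbers. A minimal sketch, where the price matches the post but the performance indices are rough assumptions (1050 Ti normalized to 100, RX 570 at roughly 1.9x), not measured benchmark data:

```python
# Back-of-the-envelope performance-per-pound comparison. The performance
# indices here are rough assumptions (1050 Ti = 100 baseline), not benchmarks.
cards = {
    "GTX 1050 Ti": {"price_gbp": 150.0, "perf_index": 100.0},
    "RX 570 8GB": {"price_gbp": 150.0, "perf_index": 190.0},
}

def perf_per_pound(name):
    card = cards[name]
    return card["perf_index"] / card["price_gbp"]

for name in cards:
    print(f"{name}: {perf_per_pound(name):.2f} perf points per GBP")
```

At equal prices the ratio of the two scores is simply the ratio of the assumed performance indices, which is the whole argument in one line.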

Answer to the OP's question: because people are stupid and GeForce mindshare is greater.


----------



## EarthDog (Jan 14, 2019)

Vayra86 said:


> Not so sure about that. While I _personally agree_ with you, I think the underlying goal for most is actually just to play - much like @sneekypeet says he does. Most people simply have a budget and get whatever is best for them (available, suited to their rig, known to them, and an upgrade) at a specific point in time. Whatever settings and resolution that enables is secondary. The people who do target a resolution often don't target a quality level, and those who target a specific framerate (higher than 60) also don't target a quality level. The quality level a game is played at is the one thing that is fluid here. Everyone is stuck with their monitor choice in terms of resolution and refresh rate, but they can play around with quality settings.
> 
> And ehm... playing a game 'as it was intended by the developer'... I don't know, that sounds to me like something out of the audiophile or movie-fanatic corner, and it has absolutely no place in gaming, where half of what you see is simply what the engine/API has available for you and the other half is determined/limited by hardware. I mean, how do those console gamers even cope with all that trickery to keep their sacred 30 FPS intact? Is that as the developers intended it? I hardly think so... As you can also see in the many polls about quality we saw lately, many comments say that people disable specific settings such as (motion) blur, vignetting, chromatic aberration, DoF, etc. etc. And let's be honest... think back to all those DX9 games with excessive bloom effects that would burn your retinas out. Those went insta-disable in my gaming... It is exactly _that flexibility that makes PC gaming what it is._ And probably also what elevates it beyond console gaming.


I am pretty sure about that. You may want to look at what sneeky had to say...



sneekypeet said:


> I have to agree with Solaris here. When I game on PC, *I buy what it takes to run the resolution I have with the best settings available*. If I wanted to play on medium settings, I would buy a console!



...which is what I am saying. Generally, default Ultra. The dual polls (still funny both were allowed to live, and they have the same damn information) also show that the majority of those participating shoot for the stars and lower only as needed, not for giggles. People target both IQ and FPS. It's part of the reason why we see min and rec specs on the box.

I just meant Ultra. Ultra is a setting IN GAME, determined by the developers, with Ultra being as close as possible to their vision while still being able to run on the hardware that is around. People nit-picking individual settings isn't what we are talking about here.


----------



## Vayra86 (Jan 14, 2019)

Recus said:


> You can find used 980Ti for RX 570 prices. Why people buy slower GPUs if you can buy gold in second-hand market?
> So AMD promoting 4 GB in 2019 but people bashing Nvidia for RTX 2060 6 GB. OK



Because the 980 Ti is now a 3+ year old card and the RX 570 is new, while they are almost the same performance-wise. It's not like the 980 Ti makes games playable that the 570 can't run properly; the gap isn't that massive. As for your 2060 example... the card is twice the price of the 4GB alternative being promoted. And the RX 570 also comes in an 8GB flavor... which is still much cheaper than the 2060. So I think it's clear why Nvidia deserves a bit of bashing at that price point for such a meagre GPU, especially when compared to its previous-gen offerings, which had 8GB at that price point.



Recus said:


> So console class gaming then.



Not sure I follow you here.


----------



## EarthDog (Jan 14, 2019)

Vayra86 said:


> Because the 980ti is now a 3+ year old card and the RX570 is new


What does a 570 offer over that card for being 3 years newer?


----------



## Vario (Jan 14, 2019)

Until recently the 570 was twice the price. It's a faulty comparison. Just because they are the same price now does not mean they are in the same product category, nor have they always been the same price. Amazon sales rank is partially determined by historical sales data.

One is a low-power card for light gaming or workstations, the other is a mid-range gaming card. They aren't in the same product category. This is evidenced by the 1050 Ti lacking a PCI-E power input; it is a low-power card that only needs a PCI-E slot to function. Imagine having a prebuilt machine with a proprietary power supply, such as a Dell. You could drop the 1050 Ti in and be able to run games.

In terms of value, the 1050 and 1050 Ti are a better proposition than the 1030: they are over twice as fast but only $20-30 more. If you are building a workstation, need a video accelerator to run your monitors, and you want it to be able to run anything (non-gaming) years into the future, get a 1050-series SKU. New 1050 Tis are available for as low as $115 shipped on sites like eBay.

Consider the historic pricing of the RX 570 on camelcamelcamel.com. Under the crypto boom it was $300-400, and in some cases more. Meanwhile the 1050 Ti was typically going for half that. Most people bought the 1050 Ti because it was the only card actually available under $200 that was suitable for any level of gaming, not because it was a good card for gaming. It has the same performance as the 2012-era GTX 680.

Instead of creating a forum thread that implies 1050 Ti purchasers are fools, maybe it would have been better to create a public service announcement: attention, if you are considering buying the 1050 Ti, be aware that the 570 is currently priced equivalently.




@Vayra86 I agree, the Ultra preset is not something I spend money chasing. I want games to run smoothly, and even if I can run a game on Ultra, I don't care to run Ultra as a preset, because frequently the Ultra preset has things I do not like, such as motion blur, lens flare, and other "special effects". I run a video card for a few years until I have to start setting stuff on Low just to have a playable frame rate, and then it is upgrade time once more.


----------



## Vayra86 (Jan 14, 2019)

EarthDog said:


> I am pretty sure about that. You may want to look at what sneeky had to say...
> 
> 
> 
> ...



Then I must have misread sneeky, because I believe he also said that he doesn't care what FPS the game runs at and it can easily be below 60, while your stance on Ultra includes not only the quality but also an FPS target.

For the rest of it, the market share of GPUs completely contradicts your statement. Ultra is out of reach for the vast majority and they settle for less. When FPS is abysmal, the settings usually go down; it's the first thing you do when your GPU can't run game X properly. Polls on enthusiast tech sites are not representative of 'most people', not even in PC gaming. That is the whole premise of this topic, I believe: what do the lesser GPUs do, why would you not run a lower quality setting, and what do you _really_ miss out on in that case?

Do you run Ultra 'when you can'? Of course. But that is a different question. There are many things people might prefer but won't have, don't have access to, or simply won't get a comfortable experience from. And about nit-picking settings... most games just offer the same range of settings in the exact same way; it has zero to do with developers carefully crafting an artistic experience. It wasn't too long ago that an overall quality setting also determined many aspects of post-processing and draw distance etc., and those games still exist. The Ultra setting has absolutely nothing to do with developers' intent and everything to do with the hardware capabilities that games and quality settings are scaled around. It's not like higher or lower LOD has anything to do with 'how developers meant it', for example, yet that is inherent to a quality setting in many games. In fact, the exact opposite is true: developers always try to make the lower quality settings good enough to still get their 'intent' across, so that the overall experience is the same regardless of your hardware power. And on top of that, there are many games that become straight-up unplayable if you put everything ON and on Ultra, because there's too much crap obscuring your vision all the time (lens flares, glares, flashy lights, way-overdone particles).



EarthDog said:


> What does a 570 offer over that card being 3 years newer?



Age, warranty, resale value. And in the case of AMD, probably also some driver love the 980ti isn't getting much of anymore.



EarthDog said:


> shoot for the stars and lower only as needed and not for giggles. People target both IQ and FPS. It's part of the reason why we see min and rec specs on the box.



Yes, and now back to reality: you buy a card once every few years (much longer for most) and in the meantime, you only really get your desired FPS + Ultra quality at the beginning of that upgrade cycle. The longer you stick with the card, the more you have to adjust to keep _either_ a quality level _or_ an FPS target. People may want both, but they can never have both at every point in time. And I believe that whenever they can't, the settings menu is the first thing they dive into. So, conclusion: Ultra is not the thing that makes or breaks gaming for most people, simply because budget and spending don't allow it to be.


----------



## micropage7 (Jan 14, 2019)

The brand. Many people just think Nvidia when talking about graphics cards; same with Razer.


----------



## EarthDog (Jan 14, 2019)

By fps targets I mean 60 Hz/fps... a 1060 and equivalents from previous gens can do that at 1080p Ultra. He also said he doesn't seem to notice a difference and can play at lower fps.



Vayra86 said:


> Age, warranty, resale value. And in the case of AMD, probably also some driver love the 980ti isn't getting much of anymore.


I meant features- and performance-wise... so, nothing. 980 Tis still cost more than a 570 on the market, right? So the resale value may or may not be there.


----------



## sepheronx (Jan 14, 2019)

The GTX 980 Ti is really holding up well in modern games at Ultra, for sure. But I noticed that here at least, on eBay, a used RX 570 costs almost as much as a 4GB RX 580. I game at 1080p, so a 4GB model may be fine. Although paying an extra $50 for an 8GB model isn't bad either. I know my friend's GTX 970 4GB may not be sufficient for the Resident Evil 2 remake, but I hope it is. If not, then an RX 580 may be ordered soon.


----------



## Vayra86 (Jan 14, 2019)

EarthDog said:


> By fps targets I mean 60 Hz/fps... a 1060 and equivalents from previous gens can do that at 1080p Ultra. He also said he doesn't seem to notice a difference and can play at lower fps.



Up to a point. So what happens when you don't have a few hundred ready to be spent on a GPU, yet your games do produce stutter on Ultra? Settings go down. So the _real_ priority isn't Ultra, it's the FPS. Ultra is the luxury you only have if you can afford it. And when somewhat lower settings get very close to the Ultra 'experience'... guess what MOST people tend to default to?

We don't disagree a whole lot, only about your idea of 'most people'. We are a niche, and when you're in one, it's harder to see that.


----------



## EarthDog (Jan 14, 2019)

Lol, I'm just going by the polls bud... consider taking a look at them again. 

There are two threads with identical information in them. One started by Lexi, the dupe by cucker.


----------



## Vayra86 (Jan 14, 2019)

EarthDog said:


> Lol, I'm just going by the polls bud... consider taking a look at them again.
> 
> There are two threads with identical information in them. One started by Lexi, the dupe by cucker.






EarthDog said:


> I've seen the two threads here which are essentially the same poll showing that ultra/highest is a goal.
> I accept there are tons of users...but in the end, we would all run ultra if we could get acceptable fps. There are few reasons not to if we _could_.



No, you just used the polls to support your argument. And I am saying those aren't representative of your idea of 'most people'.

And here you are saying that, _actually,_ we would only run Ultra if we can do so at acceptable FPS. So it's not Ultra that people default to. It's playable FPS, and then whatever quality settings you can squeeze out of it. All of this doesn't truly start with a quality setting; it starts with a budget, as per the example of your kids. The midrange is where budget is a crucial element, because otherwise you wouldn't be in this segment.



EarthDog said:


> By fps targets I mean 60 Hz/fps... a 1060 and equivalents from previous gens can do that at 1080p Ultra.



You go ahead and try that on a GTX 960 with games that were recent at its time of release. Go ahead, I dare you. Or better yet, try it on a GTX 660 with games from its time of release, let's say Far Cry 3:

https://www.techspot.com/review/615-far-cry-3-performance/page5.html

Whoops. And lo and behold, the review even dedicates the majority of its testing to 'High'.


----------



## EarthDog (Jan 14, 2019)

LOL, missing the point and sniping not so relevant ones... so I'm just going to step away. I really do not have time to clarify and spell things out.

Cheers.


----------



## Vayra86 (Jan 14, 2019)

EarthDog said:


> LOL, missing the point and sniping not so relevant ones... so I'm just going to step away. I really do not have time to clarify and spell things out.
> 
> Cheers.



You're not being awfully fair now, are you?



EarthDog said:


> I suppose it can be, sure. But since getting a 144 Hz panel and pumping 144 fps of synced goodness in most titles, it would be tough to go back to 60 Hz/fps or less. The difference between my PC and my kids' 60 Hz/fps is more than apparent... as are medium and ultra settings (Fortnite, for example).
> 
> The underlying goal of most is simply to play the game as the dev intended... looking how it should. Indeed it can be hard to distinguish differences, but again, unless there are other constraints, users don't buy a PC to game on medium out of the gate; they typically do so because of other limitations like budget or too high a res for the card.





cadaveca said:


> Right, but we both know that really, Ultra isn't what the dev intended either. That's what Nvidia/Intel/AMD use to showcase their hardware features... And this used to be out in the open, and not a big deal. I mean, it's the land of things like PhysX and such...
> 
> I dunno man, I actually do kind of buy a PC to game at medium. I'm an average person, with an average budget. People with lots of cash play on high. I shouldn't be able to play "ultra"... ever, unless I have the top-end VGA of a specific brand. Wasn't that the point? When did that get lost?
> 
> ...





EarthDog said:


> Dave, if I didn't review, I'd still be shooting for Ultra settings. I would likely be doing it at the same res and Hz as well. I wouldn't be using a 2080 Ti... likely still the 1080, which allowed the titles I play to be above 100 fps anyway. Most of the gaming public is at 1080p. It doesn't take much to game at 60 Hz 1080p and Ultra; a 1060 6GB will do there. Point is, the goal for everyone is Ultra settings where possible. For some it isn't possible, or it's simply a choice for more fps.
> 
> But ultimately everyone's goal is to run Ultra if possible. I'd be floored if anyone intentionally ran under Ultra (glitches, etc. aside) if they could match/surpass their refresh rate. It's just a res/budget/game/setting limit.
> 
> Edit: I bought two PCs for my kids out of pocket for 1080p 60 Hz Ultra gaming. So while my daily driver is clearly not a common system, I know what it is like to buy them and shoot for a 1080p 60 Hz Ultra gaming goal with a budget more in line with the masses.



I'm not missing anything; I simply don't agree with your statement that _most people 'target' or 'want' Ultra_. Or that Ultra is 'how the dev intended it'... come on man. Wake up please - dave was spot on saying that Ultra is a set of inflated, heavy settings to make you feel like your fat GPU is worth having. Nor do I agree that without Ultra, you're effectively console gaming - you've even said so yourself, a number of times. Hell, I want a Ferrari too; it doesn't mean I'll ever drive one, so it's a target I can easily give up on, in favor of many other things that are perhaps much more comfortable to have, such as a cup holder.


----------



## M2B (Jan 14, 2019)

EarthDog said:


> A good example, for gaming, of overbuying on resolution and being forced to lower settings to play at the native res.



I have a GTX 1080, and besides Far Cry 5 I can't remember the last game I played on max settings.
I truly believe that if your GPU is capable of handling higher resolutions with some sacrificed graphical details, that's the way to go.
There are games where even my overclocked GTX 1080 can't maintain a solid 60 FPS on max settings at 1080p. If you always want to go for max settings, you'll always be stuck at 1080p, even with the fastest GPUs out there.
Having the crispness and sharpness of higher resolutions is more visually compelling than the sometimes invisible details that Ultra settings offer, trust me.
There is also the matter of framerates: if you want a true 144 Hz gaming experience, even with a $500 GPU it's impossible to achieve unless you're willing to dial back some settings (in most games).


----------



## eidairaman1 (Jan 14, 2019)

Vayra86 said:


> You're not being awfully fair now are you?
> 
> 
> 
> ...



This I agree with

If people want Ultra settings constantly, just get a 1080p or 2K monitor with a Vega 56.

Not everyone has deep pockets.

I got my card because the pricing was better than the X model; both come in at the same clocks, or the non-X clocks higher, despite a few CUs being hard-locked.


----------



## JRMBelgium (Jan 14, 2019)

M2B said:


> I have a GTX 1080, and besides Far Cry 5 I can't remember the last game I played on max settings.
> I truly believe that if your GPU is capable of handling higher resolutions with some sacrificed graphical details, that's the way to go.
> There are games where even my overclocked GTX 1080 can't maintain a solid 60 FPS on max settings at 1080p. If you always want to go for max settings, you'll always be stuck at 1080p, even with the fastest GPUs out there.
> Having the crispness and sharpness of higher resolutions is more visually compelling than the sometimes invisible details that Ultra settings offer, trust me.
> There is also the matter of framerates: if you want a true 144 Hz gaming experience, even with a $500 GPU it's impossible to achieve unless you're willing to dial back some settings (in most games).



A 1080 user on the TPU forums claiming that he does not always play on Ultra? You have balls, my friend. Seriously, thanks for confirming that not every high-end buyer is an Ultra gamer.


----------



## EsaT (Jan 14, 2019)

Toothless said:


> Id want the 1050ti for lack of power connector depending on the build and much less heat in an itx.


Power consumption is indeed one factor I take into consideration myself.
But funnily, those same people weren't bothered by power consumption when the energy-efficiency situation was completely the opposite, back when the HD 5870 and its generation of cards were up against Nvidia's hot-running Fermistor.
Including Fermi's craptacularly bad idle consumption...
https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/30.html



JRMBelgium said:


> Every time in AMD's history when they actually were able to compete with or even beat Nvidia, people just kept buying Nvidia. Even when benchmark manipulation, game-rendering manipulation, and false advertising (remember the 3.5 GB on the GTX 970, everyone?) get released to the press, people keep buying it. And then those same Nvidia buyers bitch about AMD not competing with Nvidia. Why should they? It's not like those buyers are going to switch. If at some point AMD beats Nvidia again, they will just wait for Nvidia prices to drop, so that they can buy Nvidia again.


Aren't you forgetting the claim that AMD has bad drivers, despite Nvidia being the one that has managed to release seriously problematic, even card-destroying, drivers more often?

If the masses applied the same requirements to Nvidia that they demand from AMD, they shouldn't be touching Nvidia cards except with this.


----------



## AusWolf (Jan 14, 2019)

eidairaman1 said:


> Not worried about it too much, mini itx systems you can have an external power connector for the 570.


It's not just the external power, but the size and heat too. The RX 570 has the same TDP as a GTX 1060, and it's pretty hard to get one in a small form factor, unlike its green competitors.


----------



## eidairaman1 (Jan 14, 2019)

AusWolf said:


> It's not just the external power, but the size and heat too. The RX 570 has the same TDP as a GTX 1060, and it's pretty hard to get one in a small form factor, unlike its green competitors.


It is so funny how you talk about "green"; electricity really isn't green.


----------



## cucker tarlson (Jan 14, 2019)

People buying a 1050 Ti over an RX 570 with a great game bundle are stupid and bad at choosing graphics cards. That's it. The RX 570 is an equivalent of the 1060 3GB, which is a card a tier higher, at the same price.
I'm sure the number of them is not very high either, just as you had people buying the Vega 64 LC over the 1080 Ti at a higher price, and those who bought the Fury X over the 980 Ti.


----------



## Deleted member 67555 (Jan 14, 2019)

Game detail importance = distance + game speed.
It really is that simple.
The further away you are from your display, the less detail matters.
If a game is designed for extremely fast-paced action, detail is less important.

There have been enough reviews over the years explaining the differences between medium details and Ultra...
I think we all know what PR bloat and bragware are.


----------



## LFaWolf (Jan 14, 2019)

My answer: for most people, the family PC that is used to pay bills, browse the Internet, do homework, and buy some stuff is a Dell or HP pre-built SFF with an i5 6400 (?) and a proprietary 250-watt PSU. After careful consideration and calculation of the power supply requirements, and given that most 1050 Tis come with a low-profile bracket, it was determined to be the best card for an upgrade, so the family PC can be used to play some basic games such as Minecraft, Fortnite, PUBG, or something similar on a 23" 1080p monitor. An RX 570 won't even fit in the SFF PC.

Not everyone has or can have a dedicated gaming PC.


----------



## Vya Domus (Jan 14, 2019)

JRMBelgium said:


> A 1080 user on the TPU forums claiming that he does not always play on Ultra? You have balls, my friend. Seriously, thanks for confirming that not every high-end buyer is an Ultra gamer.



What's an ultra gamer?


----------



## cucker tarlson (Jan 14, 2019)

JRMBelgium said:


> A 1080 user on the TPU forums claiming that he does not always play on Ultra? You have balls, my friend. Seriously, thanks for confirming that not every high-end buyer is an Ultra gamer.


It depends on the game, but in the vast majority of cases I consider lower settings at a higher framerate a much more "high-end gamer" way to play than 4K 60 fps Ultra. To me, game immersion is far more dependent on framerate than on visual quality.


----------



## JRMBelgium (Jan 14, 2019)

Vya Domus said:


> What's an ultra gamer?



Someone who plays on medium settings...



cucker tarlson said:


> It depends on the game, but in the vast majority I consider lower settings at a higher framerate a much more "high-end gamer" way to play than 4K 60 fps Ultra. To me, game immersion depends far more on framerate than on visual quality.



Personally, I prefer sharpness over everything. That's why I love games that offer resolution-scale settings. In Frostbite engine games, increasing the resolution scale increases draw distance, sharpness and tessellation. I prefer that over extremely detailed shadows and lighting effects.


----------



## Regeneration (Jan 14, 2019)

The RX 500 series launched in April 2017, but software support is still immature and the drivers are full of bugs.

It seems like the series is meant for miners rather than gamers.

I've been looking at RX 500 feedback lately and saw a lot of complaints: 100 percent GPU usage after exiting a game, system lag with multiple monitors (if it works at all), black artifacts at high GPU temperatures.


----------



## AusWolf (Jan 14, 2019)

eidairaman1 said:


> It is so funny how you talk about Green, electricity really isn't green.


I meant nvidia, not electrically green. Sorry.


----------



## eidairaman1 (Jan 14, 2019)

AusWolf said:


> I meant nvidia, not electrically green. Sorry.



Considering 560s and 570s are alternatives, there are nano cards to boot...


----------



## AusWolf (Jan 14, 2019)

LFaWolf said:


> My answer: for most people, the family PC used to pay bills, browse the Internet, do homework and buy some stuff is a Dell or HP pre-built SFF with an i5 6400 (?) and a proprietary 250-watt PSU. After careful consideration and calculation of the power supply requirements, and given that most 1050 Ti cards come with a low-profile bracket, it's the best card for an upgrade, so the family PC can play some basic games such as Minecraft, Fortnite, PUBG, or something similar on a 23" 1080p monitor. An RX 570 won't even fit in the SFF PC.
> 
> Not everyone has or can have a dedicated gaming PC.


And some people (like me) prefer ultra SFF gaming PCs over big and heavy ATX ones.


----------



## eidairaman1 (Jan 14, 2019)

AusWolf said:


> And some people (like me) prefer ultra SFF gaming PCs over big and heavy ATX ones.



I have large mitts so the room to work is welcome.


----------



## AusWolf (Jan 14, 2019)

eidairaman1 said:


> Considering 560s and 570s are alternatives, there are nano cards to boot...


The 560 is not a very good gaming card, and the only SFF 570 I know about is the Sapphire Pulse. But even if you find it in a store, its 120 W TDP is something to consider against the 1050Ti's 75 W when you go SFF (heat).


----------



## eidairaman1 (Jan 14, 2019)

AusWolf said:


> The 560 is not a very good gaming card, and the only SFF 570 I know about is the Sapphire Pulse. But even if you find it in a store, its 120 W TDP is something to consider against the 1050Ti's 75 W when you go SFF (heat).


https://www.game-debate.com/gpu/ind...=radeon-rx-560-4gb-vs-geforce-gtx-1050-ti-4gb

80 W vs 75 W isn't much of a difference in TDP, and a 2 fps difference between the two says the 560 is right on par with the 1050 Ti.


----------



## Tatty_One (Jan 14, 2019)

In most cases (I checked the 3 biggest online stores) in the UK the cheapest 1050Ti is around 15% cheaper than the cheapest 570, that still makes the 570 the better buy, but not if you don't have the dollar.


----------



## eidairaman1 (Jan 14, 2019)

Tatty_One said:


> In most cases (I checked the 3 biggest online stores) in the UK the cheapest 1050Ti is around 15% cheaper than the cheapest 570, that still makes the 570 the better buy, but not if you don't have the dollar.



Hence the 560 fills that space anyway.


----------



## cucker tarlson (Jan 14, 2019)

eidairaman1 said:


> https://www.game-debate.com/gpu/ind...=radeon-rx-560-4gb-vs-geforce-gtx-1050-ti-4gb
> 
> 80 W vs 75 W isn't much of a difference in TDP, and a 2 fps difference between the two says the 560 is right on par with the 1050 Ti.


those tests you posted are sketchy af. 1050ti can be quite a bit faster than 560


----------



## Tatty_One (Jan 14, 2019)

eidairaman1 said:


> Hence the 560 fills that space anyway.


Priced about the same here, and there are also plenty of reviews showing the 1050 Ti wins, so I don't really think there's much in it between the two.


----------



## JRMBelgium (Jan 14, 2019)

cucker tarlson said:


> those tests you posted are sketchy af. 1050ti can be quite a bit faster than 560



Don't use benchmarks that are THAT old. They in no way represent current performance in modern games with modern drivers.
In the last 6 months alone there have been performance improvements of up to 10% in certain games.
Not only that, those benchmarks are at Ultra for sub-$200 cards. Come on, people that buy a 1050 don't play on Ultra unless it's a very old game...

Seriously, benchmarks older than 12 months should get deleted from the web automatically... Nobody is going to buy a card and use 12+ month old drivers for it.
And when comparing cards, when a reviewer sees both cards cannot keep 60 fps, they should lower the settings in those games for a REALISTIC comparison.
Who's gonna play Battlefield 1 below 40 fps... come on now... And that's just the freakin' singleplayer.


----------



## cucker tarlson (Jan 14, 2019)

JRMBelgium said:


> Don't use benchmarks that are THAT old. They in no way represent current performance in modern games with modern drivers.
> In the last 6 months alone there have been performance improvements of up to 10% in certain games.
> Not only that, those benchmarks are at Ultra for sub-$200 cards. Come on, people that buy a 1050 don't play on Ultra unless it's a very old game...
> 
> ...


In the ones you posted, the 1050 Ti is 24% faster at 1080p medium and 30% faster at ultra.
It scores an average of 90 fps at medium while it can get 50 fps at ultra. Ultra is therefore a better representation of what people will be running; no one buys a 1050 Ti for a 144 Hz display. They'll want to stay close to 60.


----------



## notb (Jan 14, 2019)

JRMBelgium said:


> It's a big leap because they have the R&D budget to make it a big leap. But if consumers ( non ITX users ) keep buying cards that are slower for the same price it's a "circle of life" kind of thing. Nvidia ends up with more money again, releases new generation and keeps the "better brand" status, people buy the slower cards for same price again, nvidia ends up with more money again, etc...


You're missing the point with the power advantage. It's not just for ITX.
Pentium/i3/i5 + 1050 Ti is a frugal combination, pulling under 100 W in heavy load (an RX 570 alone peaks above 180 W).
That means the whole PC will be able to run on any Chinese 350 W PSU with a huge safety margin (even if it turns out to be more like a 250 W).
So just by comparing GPUs this might seem like a poor deal, but you're instantly saving $30+ on the PSU.
And the 1050 Ti is cool and quiet, so you can use a cheap case as well, because there's hardly any need for noise damping or good airflow. And you don't need extra case fans. Things add up.

Nvidia understands that by making such cards they're winning the whole entry-level gaming desktop business.
The irony of AMD's strategy is that they make a lot of fuss about offering a cheaper alternative, but they're actually targeting enthusiasts. A typical AMD buyer won't care about the extra 100 W power draw, because... well... because after reading a few topics about PSUs on gaming forums, he'll be 100% sure that a 500 W GOLD is the bare minimum.
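As a rough sanity check of that PSU argument (a sketch only; the ~100 W and ~180 W figures are the ones cited above, everything else is an assumption):

```python
# Rough PSU-headroom arithmetic for the two builds discussed above.
# ~100 W: whole Pentium/i3 + 1050 Ti PC under heavy load (figure cited above).
# The RX 570 alone peaks above ~180 W vs the 1050 Ti's ~75 W,
# so swapping it in adds roughly 105 W to the system total.

SYSTEM_1050TI_W = 100
SYSTEM_RX570_W = SYSTEM_1050TI_W + (180 - 75)  # ~205 W

def psu_headroom(load_w: float, psu_w: float) -> float:
    """Fraction of the PSU's rated capacity left unused at this load."""
    return 1 - load_w / psu_w

# On a nominal 350 W budget unit:
print(round(psu_headroom(SYSTEM_1050TI_W, 350), 2))  # 0.71
print(round(psu_headroom(SYSTEM_RX570_W, 350), 2))   # 0.41

# If that cheap unit really only delivers ~250 W:
print(round(psu_headroom(SYSTEM_1050TI_W, 250), 2))  # 0.6
print(round(psu_headroom(SYSTEM_RX570_W, 250), 2))   # 0.18
```

Even if the "350 W" unit really behaves like a 250 W one, the 1050 Ti build keeps a comfortable margin, while the RX 570 build starts eating into it.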


----------



## Casecutter (Jan 14, 2019)

Correct. In America you can find various RX 570 4GB deals for $130 (not even needing a rebate), while the GTX 1050 Ti starts at $160 and more than half are still priced at $180-220... IDK.

Sure, if you're unwilling to upgrade from some lowly, inefficient OEM PSU, you can just carry on wasting efficiency while slogging along with lower visual immersion... that's someone's prerogative.

Personally, I'd do the RX 570 and use the savings to upgrade to a nice 500W Gold PSU. You see efficiency across the entire system, care less and game more... paying Nvidia for supposed efficiency "up front" isn't truly getting you a lower power bill.

Sure, if you're in an SFF or ITX box, size and heat mean you start looking at a special GTX 1050 Ti; can't help that, but those are always some other reality. This thread is about B-F-B (best-for-the-buck) 1080p GAMING; ducking into other rabbit holes is off topic.


----------



## notb (Jan 14, 2019)

Casecutter said:


> Sure, if you're unwilling to upgrade from some lowly, inefficient OEM PSU, you can just carry on wasting efficiency while slogging along with lower visual immersion... that's someone's prerogative.


You see? This is exactly what I've been talking about. You get satisfaction from having a high-end PSU, because you think it's more efficient. So you're willing to pay more.
In the end, you spend relatively more on the non-computational parts (so everything other than CPU and GPU), because you want a high-end system more than optimal performance.
You don't want a "lowly OEM PSU" and I'm pretty sure you don't want lowly fans and lowly cable management and so on. 
But quite a lot of people don't think that way. They don't care what's inside the box. They want the best performance they can afford, so they want to spend as little as possible on anything that isn't adding a lot of fps.


> Personally, I'd do the RX 570 and use the savings to upgrade to a nice 500W Gold PSU. You see efficiency across the entire system, care less and game more...


How exactly is paying more for efficiency translating into the "care less and game more"? :-D
Just the fact that you're posting on this forum means you're actually caring more and gaming less. Because instead of writing your 2000 posts, reading hardware reviews and carefully choosing parts, you could have just bought an off-the-shelf PC (or a console) and play.


----------



## Casecutter (Jan 14, 2019)

notb said:


> satisfaction from having a high-end PSU, because you think it's more efficient. So you're willing to pay more.


First, I said a *GOLD* PSU... they are more efficient.

Nope. I just know there's no need to invest in a gaming card for the idea of efficiency alone and pay more for that. It's always smarter to invest in a quality backbone like the PSU, as most OEM PSUs will cause more problems when overtaxed. You build a house on a good foundation. If you use a PSU calculator you'll see that, gaming 4 hours a day over three years, there's just a few dollars' difference.
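For what it's worth, the back-of-envelope arithmetic behind that "few dollars" claim looks like this (all figures are illustrative assumptions: ~90% efficiency for Gold vs ~80% for a basic unit, a ~200 W load, and $0.12/kWh):

```python
# Electricity cost difference: Gold (~90%) vs basic (~80%) PSU efficiency,
# gaming 4 h/day for three years at an assumed ~200 W DC load.

LOAD_W = 200
HOURS = 4 * 365 * 3          # 4380 h over three years
PRICE_PER_KWH = 0.12         # assumed electricity rate, $/kWh

def wall_kwh(load_w: float, efficiency: float, hours: float = HOURS) -> float:
    """Energy drawn from the wall (kWh) to deliver a given DC load."""
    return load_w / efficiency * hours / 1000

cost_gold = wall_kwh(LOAD_W, 0.90) * PRICE_PER_KWH   # ~$116.80
cost_basic = wall_kwh(LOAD_W, 0.80) * PRICE_PER_KWH  # ~$131.40
print(round(cost_basic - cost_gold, 2))              # 14.6
```

Roughly $15 over the whole three years, i.e. a few dollars a year, so the efficiency label alone isn't going to pay back a big up-front price difference.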


----------



## Toothless (Jan 15, 2019)

EsaT said:


> Power consumption is indeed one factor I myself take into consideration.
> But funnily those same people weren't bothered by power consumption, when energy efficiency situation was completely opposite during HD5870 and its generation cards vs Nvidia's heating Fermistor.
> Including Fermi's craptacularly bad idle consumption...
> https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/30.html
> ...


I want something a bit beefier than my 750 Ti in my ITX rig. My 780 is too big and everything else takes too much power. A 1050 Ti would be perfect, but the upgrade isn't needed right now.


----------



## Mescalamba (Jan 15, 2019)

Hm, why indeed..

Well, I've owned GPUs from both camps (it should probably be an even number for both nowadays). I left AMD at BF4, because their intervention (and/or DICE being incompetent) fked that game up hard. My dear GPU had phases when it worked fine, then it worked kind of badly, then it didn't work at all (in game). Either the game was misbehaving or a new version of the drivers successfully broke what worked before.

On top of that, I had some driver-related issues with 2D software.

All that while my friend running some Nvidia card had near-zero issues the whole time. So I switched and never turned back. Much like my experiment with AMD CPUs: yeah, they're okay, if you can't buy a decent Intel platform.

Today? I guess the situation might be different. Or at least I hope it is. Still, AMD somehow looks like the "solution if you don't have enough money to buy real gear". Unsure if it's AMD's fault, or just a well-crafted perspective (by the competition).

On the upper end of the performance spectrum, I sort of can't see any reason to get anything from AMD. And maybe that also does its share to shape the perspective of people interested in the 1050 Ti. The flagship, representative portion of the market is actually kind of important. A bit like pro gaming is for games.


----------



## Shambles1980 (Jan 15, 2019)

In all honesty, I don't think the 1050 Ti is something people go out and buy new today... but possibly they do. The old low-profile 750 Tis were for SFF systems that could do some 1080p gaming; really, to let you have a media-center PC in the living room for your Netflix and stuff, while also being able to use it as a console and play things like Rocket League. Things you just couldn't really do with the iGPUs, and even then you wouldn't have a great experience in most titles.
So I could understand those...
The 1050 Ti I don't really understand, though. You can pick up a used 970 for less and it's just massively better, or you can get an AMD card for similar money which is also faster.
If you want lower power consumption and no PCIe connectors for really light gaming and media, well, the Ryzen chips have Vega in them, which is somewhere in the region of a 1030.
I genuinely don't see the need for the 1050/Ti; I don't see the need for any low-power, middling-performance card these days.
Yes, it's faster than a Ryzen chip with integrated graphics, but it's not enough better to warrant the price.

So I don't know, honestly. I don't know why anyone would buy one even if there were no RX 570. The arguments for buying one rather than a 570 are the same as for just using a Ryzen system with its iGPU,
and I just cannot see the usage case that would mean you need a dedicated GPU.

Really, a GPU should IMO start at a 780 Ti or R9 390. Less than that and you will always be disappointed, and even at that level you aren't going to max out much.


----------



## Melvis (Jan 15, 2019)

The reason I bought a GTX 1050 Ti came down to a few things: the size of the card and its power consumption, to fit in my mini LAN PC, and the fact that any AMD card back in mid-to-late 2018 cost double the price! I got my one-month-old 1050 Ti for $140 when any RX 570/580 was $250-300+ second hand, so it was a no-brainer back then. Damn you, Crypto!


----------



## kapone32 (Jan 15, 2019)

I don't understand the argument about the PCI-E power connector; any PSU worth its salt comes with at least one. My reasoning as to why people are still buying 1050 Tis vs the RX 570 comes down to advertising. I have never seen an AMD commercial on TV, but I see Nvidia commercials (especially during big sports events). So I put it down to propaganda, even though the RX 570 is a much better option and comes with 2 highly anticipated games. I will also say that the 2 games you are getting will run much better on an RX 570 vs a 1050 Ti.



Mescalamba said:


> "solution if you don't have enough money to buy real gear"



That is an interesting statement. Can you show me an Nvidia card that can compete with Tahiti in terms of longevity and refinement?


----------



## Vayra86 (Jan 15, 2019)

Mescalamba said:


> Hm, why indeed..
> 
> Well, I've owned GPUs from both camps (it should probably be an even number for both nowadays). I left AMD at BF4, because their intervention (and/or DICE being incompetent) fked that game up hard. My dear GPU had phases when it worked fine, then it worked kind of badly, then it didn't work at all (in game). Either the game was misbehaving or a new version of the drivers successfully broke what worked before.
> 
> ...



This is valuable input, because it shows a bit of the mindset going on for many prospective buyers of hardware. It shows the power of mindshare and how you only need one or two bad experiences that will be almost impossible to offset by good ones, to switch brands.

This also ties into the reason I have always said frequent driver releases and polished soft + hardware releases are so very important. AMD dropped the ball on that far too often and this is definitely part of their market share problem.


----------



## INSTG8R (Jan 15, 2019)

Vayra86 said:


> This is valuable input, because it shows a bit of the mindset going on for many prospective buyers of hardware. It shows the power of mindshare and how you only need one or two bad experiences that will be almost impossible to offset by good ones, to switch brands.
> 
> This also ties into the reason I have always said frequent driver releases and polished soft + hardware releases are so very important. AMD dropped the ball on that far too often and this is definitely part of their market share problem.


LOL, I'm totally committed... my switch was when I "upgraded" my Ti4200 128 for an FX5500. Owned it for an hour, came back with a 9600XT, the first time I was able to use AA, and I've been with them ever since.


----------



## remixedcat (Jan 15, 2019)

How is driver stability between the 2 cards?


----------



## EarthDog (Jan 15, 2019)

remixedcat said:


> How is driver stability between the 2 cards?


If anyone tells you one is better or worse, they're speaking from personal experience. In my personal experience, both have solid drivers for the most part. There is always a dud here and there from both camps. But nobody here can say with any certainty that one is better than the other.


----------



## Zubasa (Jan 15, 2019)

remixedcat said:


> How is driver stability between the 2 cards?


From my experience with drivers on both sides, they are both pretty stable.
I personally prefer the UI and the extra overlay options of the AMD driver, though.
I also find the AMD driver's control panel quicker to launch; both do have some delay when selecting tabs etc.


----------



## Mescalamba (Jan 15, 2019)

INSTG8R said:


> LOL, I'm totally committed... my switch was when I "upgraded" my Ti4200 128 for an FX5500. Owned it for an hour, came back with a 9600XT, the first time I was able to use AA, and I've been with them ever since.



Heh, yeah, the GF4 Ti was a nice fuckup. I had an R9800 back then and for quite a few years after. I didn't even think to buy a GF4; it was quite obvious that it was a near-scam piece of HW.

For me it always changes after some years: when I get disappointed with something too much, I just switch to the opposite camp. So far I'm glad that we still have something to switch to. The current situation is actually rather good, since both camps have plenty of HW that's viable (depending on what you want). If I don't mind the top end, it's pretty much down to personal preference.

One can have a fully gaming-viable AMD-based PC as much as an Intel + Nvidia one (or you can be twisted and have AMD + Nvidia).


----------



## eidairaman1 (Jan 15, 2019)

INSTG8R said:


> LOL, I'm totally committed... my switch was when I "upgraded" my Ti4200 128 for an FX5500. Owned it for an hour, came back with a 9600XT, the first time I was able to use AA, and I've been with them ever since.



Hercules 3D Prophet 2 GTS Pro 64MB- ATi 9700 Pro All in Wonder.


----------



## Shambles1980 (Jan 15, 2019)

I used AMD as my main GPU after the 8800GTS; I was AMD before that, Diamond Stealth before then, and had a couple of Voodoo cards. I used AMD from the HD 3850 to the 7870. The 8800GTS did not have overscan and underscan options in the control panel, which made me move to AMD once they had the performance I wanted. After the 7870 I went to Nvidia again (and had some R9 2xx cards at the same time), but the 780 was a really good card, and the 970, even with the RAM issues, was good.
But a 480 or 580 would be a good choice too. So I change manufacturers as and when needed. I don't get the brand loyalty thing AT ALL...


----------



## Casecutter (Jan 15, 2019)

Mescalamba said:


> I'm glad that we still have something to switch to.


True this and again... there aren't at this point bad options (cards), just bad price-points. 

I will say this... AMD minimized their discrete "entry" market presence to induce more people (and, I believe, OEMs) to swing to APUs, and that was perhaps a little nonsensical until now, with the "Raven Ridge" Ryzen/Vega parts. AMD didn't have much in APU implementation up until now, and should've maintained a good, lower-cost "upgrade" card like the "6"-moniker series (4670/5670, even the 7770) always seemed to hit. Then, around the time the R7 260 came along, AMD looked for more presence in APUs. It seemed AMD discarded that market as the prominent entry upgrade for those OEM Intel GMA boxes. They went all-in on the 7870, R9 270, RX 470 and RX 570 as their minimum B-F-B 1080p gaming offering, which in many ways was not wrong and has served them well.


----------



## kapone32 (Jan 15, 2019)

Shambles1980 said:


> I used AMD as my main GPU after the 8800GTS; I was AMD before that, Diamond Stealth before then, and had a couple of Voodoo cards. I used AMD from the HD 3850 to the 7870. The 8800GTS did not have overscan and underscan options in the control panel, which made me move to AMD once they had the performance I wanted. After the 7870 I went to Nvidia again (and had some R9 2xx cards at the same time), but the 780 was a really good card, and the 970, even with the RAM issues, was good.
> But a 480 or 580 would be a good choice too. So I change manufacturers as and when needed. I don't get the brand loyalty thing AT ALL...



The only reason I go with AMD is because I had a bad experience with GTS 450(s) in SLI; Nvidia disabled SLI on those cards way back. The next card I bought was the 6850, and then some Crossfire love. I actually buy GPU(s) based on Total War games. When Rome 2 was released I went with a 7950 and of course found another. Those cards lasted for 6 years before I bought an RX 470, but I was whelmed by the performance in Warhammer, so I bought a 480 for Crossfire. To be honest, the only benefit I saw for Polaris vs Tahiti was the reduced power draw of Polaris. I bought a 580 to replace the 470 and was happy. I did see some modern games benefit in terms of FPS with the new architecture. The Crypto boom was good for people like me, as I was able to buy 2 Vega 64s with the sale of my Polaris cards. I have said this before here too: moving from Polaris to Vega really showed me increased performance and felt like the jump from the 6850 to the 7950. I do have friends with Nvidia cards, and I notice that the display seems a little washed out vs AMD. I am not hating on Nvidia; I just have always been happy with ATI/AMD.



Casecutter said:


> True this and again... there aren't at this point bad options (cards), just bad price-points.
> 
> I will say this... AMD minimized their discrete "entry" market presence to induce more people (and, I believe, OEMs) to swing to APUs, and that was perhaps a little nonsensical until now, with the "Raven Ridge" Ryzen/Vega parts. AMD didn't have much in APU implementation up until now, and should've maintained a good, lower-cost "upgrade" card like the "6"-moniker series (4670/5670, even the 7770) always seemed to hit. Then, around the time the R7 260 came along, AMD looked for more presence in APUs. It seemed AMD discarded that market as the prominent entry upgrade for those OEM Intel GMA boxes. They went all-in on the 7870, R9 270, RX 470 and RX 570 as their minimum B-F-B 1080p gaming offering, which in many ways was not wrong and has served them well.




I bought a RX 570 for $189.99 about a month ago when the R5 2400G was on sale for $209.99


----------



## INSTG8R (Jan 15, 2019)

Mescalamba said:


> Heh, yeah, the GF4 Ti was a nice fuckup. I had an R9800 back then and for quite a few years after. I didn't even think to buy a GF4; it was quite obvious that it was a near-scam piece of HW.
> 
> For me it always changes after some years: when I get disappointed with something too much, I just switch to the opposite camp. So far I'm glad that we still have something to switch to. The current situation is actually rather good, since both camps have plenty of HW that's viable (depending on what you want). If I don't mind the top end, it's pretty much down to personal preference.
> 
> One can have a fully gaming-viable AMD-based PC as much as an Intel + Nvidia one (or you can be twisted and have AMD + Nvidia).


The Ti series was great; it was the FX 5xxx series that followed that was absolutely terrible. I've basically had most of the ATI/AMD flagships ever since, from the 9800 Pro flashed to XT (the reason I came here; I was the @eidairaman1 of the day, helping others flash theirs too), X1900XTs in Crossfire (the dongle), 2900XT, 4870, 5870, 7970, Fury and now Vega. All Sapphire except the original 9600XT, which was an ASUS back when they loaded them full of proprietary crap... but it was enough to keep me on the Red Team. I've had NV in my laptop and have recommended NV cards when it's the right choice, but I've never been compelled to switch for myself.


----------



## F7GOS (Jan 16, 2019)

I bought a 1050 Ti for testing back at the start of 2017. I thought it was overpriced then and still do now, but I bet I could still sell it and recoup 90-95% of what I paid for it... it's a tad mad.

Especially when a lot of them come with 6-pins, which a lot of folks forget, especially on the "nicer SKUs" that a lot of people seem to want.

As a slot-in for a reclaimed Dell or HP I can see how it makes sense (albeit expensive), but as the card of choice for a new build??


----------



## Vario (Jan 16, 2019)

Haven't had a problem with either brand but I do often buy what the masses buy because I figure there is strength in numbers when it comes to games receiving fixes.  Also the Nvidia product was always cheaper than the Radeon product with the last couple cards I bought when I bought them so it was an easy decision.  Now that the cryptoboom is ending the prices are competitive and I'd probably consider a Radeon.

My history: Diamond Stealth Pro, Voodoo3, Geforce 2 Pro, 9600SE (ugh), Chaintech 5900XT, X800XL, 7800GT, 9800GT, HD7850, 7970, 770, 1060.


----------



## anubis44 (Jan 16, 2019)

"Why are people still buying 1050TI's and not RX 570's?"

Because NVidia's marketing is diabolical: https://youtu.be/o_LSLPItMTY
and perhaps because some people are as stupid as a block of wood.


----------



## trog100 (Jan 16, 2019)

This thread is full of petty, stupid comments... a 1050 Ti has a pretty well-defined purpose; comparing it to a 570, which has a different purpose, is silly.

trog


----------



## Vario (Jan 16, 2019)

anubis44 said:


> "Why are people still buying 1050TI's and not RX 570's?"
> 
> Because NVidia's marketing is diabolical: https://youtu.be/o_LSLPItMTY
> and perhaps because some people are kind of stupid as a block of wood.


No, it's because they were priced unaffordably thanks to the crypto boom.
https://www.techspot.com/article/1626-gpu-pricing-q2-2018/


----------



## anubis44 (Jan 16, 2019)

Vario said:


> No, it's because they were priced unaffordably thanks to the crypto boom.
> https://www.techspot.com/article/1626-gpu-pricing-q2-2018/


You missed the 'still' part of the sentence in the original question. He wasn't asking why people 'were' buying 1050 Tis in the past; he was asking why they are STILL (i.e. even very recently, AFTER the crypto bust, with comparatively normal pricing again) buying 1050 Tis instead of RX 570s.


----------



## Vario (Jan 16, 2019)

anubis44 said:


> You missed the 'still' part of the sentence in the original question. He wasn't asking why people 'were' buying 1050 Tis in the past; he was asking why they are STILL (i.e. even very recently, AFTER the crypto bust, with comparatively normal pricing again) buying 1050 Tis instead of RX 570s.


See silentbogo's post here, which claims that Amazon determines sales rank based on both current sales and *historical sales data.*
https://www.techpowerup.com/forums/...is-and-not-rx-570s.251496/page-3#post-3975725


> The rank assignment also considers the current sales of the product and the sales history too.


----------



## NoticedByMany (Feb 14, 2019)

Well I bought mine because I was a noob and didn't even know about the RXs but now I know better and I feel cheated


----------



## eidairaman1 (Feb 14, 2019)

NoticedByMany said:


> Well I bought mine because I was a noob and didn't even know about the RXs but now I know better and I feel cheated



Sell that card


----------



## Camm (Feb 14, 2019)

I personally bought a 1050 Ti as it didn't require a PCI-E connector (which I like for a spare card for whenever I sell my main one).


----------



## ratirt (Feb 14, 2019)

The other valuable option is Q-Sync (or whatever it's called; Q-Sync is Intel's, I suppose), which helps with some applications. You move the workload onto the card to speed things up, which reduces the load on the processor.


----------



## Vycyous (Feb 14, 2019)

NoticedByMany said:


> Well I bought mine because I was a noob and didn't even know about the RXs but now I know better and I feel cheated



NoticedByMany, I'm using your post as an example. I hope that's okay. I'm not trying to call you out or insult you in any way.

This is exactly what AMD is up against; namely, marketing propaganda and therefore a manipulated public perception.

For years, AMD was getting screwed at both ends by Nvidia and Intel with their less-than-honest (and occasionally illegal) marketing and business practices (most recently with Nvidia's "GPP"). It was (and is) so bad that it completely shifted the public perception. I should know, because I was one of those so-called "Nvidiots" who mindlessly bought Nvidia (and Intel) hardware, even when AMD (or ATI) offered better performance, efficiency, and value (the triple crown). I mean, you want(ed) to have "Intel Inside" and wanted to play it "The way it's meant to be played," right?

We should be thankful that AMD is still around and offering some measure of competition against these decidedly unethical tech giants. I can only imagine what the tech landscape might look like if it had been a fair playing field. The hate you so often see for Intel and Nvidia isn't simply because they're "so far ahead" or "so much better" or  "so much more innovative" or because "haters gonna hate." Instead, it's because of how they got there. As consumers, we've crowned the cheaters as the champions.


----------



## Vycyous (Feb 14, 2019)

cucker tarlson said:


> I think you were just an idiot.



I'm fairly new to the TechPowerUp forums, but I've seen many of your posts and most of what you seem to do is insult other people and/or their choices. I'm sure that almost no one appreciates your commentary or thinks you're humorous, so maybe you should keep your thoughts to yourself unless you have something intelligent or worthwhile to say.


----------



## cucker tarlson (Feb 14, 2019)

Vycyous said:


> I'm fairly new to the TechPowerUp forums, but I've seen many of your posts and most of what you seem to do is insult other people and/or their choices. I'm sure that almost no one appreciates your commentary or thinks you're humorous, so maybe you should keep your thoughts to yourself unless you have something intelligent or worthwhile to say.


I meant it had nothing to do with Nvidia specifically; you just did no research and bought worse quality. That's why you're calling yourself stupid.


----------



## Vycyous (Feb 14, 2019)

cucker tarlson said:


> I meant it had nothing to do with Nvidia specifically; you just did no research and bought worse quality. That's why you're calling yourself stupid.



Well, aren't we all? That's a rhetorical question, because the answer is yes. We've all fallen prey to a marketing ploy or our own feelings and emotions, and anyone who says they haven't is lying.


----------



## ratirt (Feb 14, 2019)

cucker tarlson said:


> I meant it had nothing to do with nvidia specifically, you just did no research and bought worse quality. That's why you're calling yourself stupid.


I don't think he called himself stupid.
BTW I switched to AMD and didn't regret it for even a second.


----------



## londiste (Feb 14, 2019)

Vycyous said:


> I should know, because I was one of those so-called "Nvidiots" who mindlessly bought Nvidia (and Intel) hardware, even when AMD (or ATI) offered better performance, efficiency, and value (the triple crown).


What exact product did you buy and when?


----------



## cucker tarlson (Feb 14, 2019)

Vycyous said:


> Well, aren't we all? That's a rhetorical question, because the answer is yes. We've all fallen prey to a marketing ploy or our own feelings and emotions, and anyone who says they haven't is lying.


to a degree we are. but I wouldn't go as far as blaming a company's marketing practices for your own bad choices you could've easily avoided.


----------



## Vycyous (Feb 14, 2019)

londiste said:


> What exact product did you buy and when?



It was more of a broad statement. For whatever reason, I stuck with Intel and Nvidia for over 15 years across multiple product generations.



cucker tarlson said:


> to a degree we are. but I wouldn't go as far as blaming a company's marketing practices for your own bad choices you could've easily avoided.



How long have you been using, working with, or building PCs?


----------



## cucker tarlson (Feb 14, 2019)

Vycyous said:


> It was more of a broad statement. For whatever reason, I stuck with Intel and Nvidia for over 15 years across multiple product generations.
> 
> 
> 
> How long have you been using, working with, or building PCs?


around 20.


----------



## Vycyous (Feb 14, 2019)

cucker tarlson said:


> around 20.



And how many AMD/ATI products have you purchased during that time period?

Edit: I'm not really sure why I'm asking, because you could tell me anything and I wouldn't know whether you're telling the truth or lying.

The point I'm trying to make is that I am certain you have not always made the best choices, have likely tried to blame others for those choices, and have been upset over those choices. When companies employ deceptive and illegal business practices, the people who buy their products are victims, not "stupid" or "idiots" as you have put it. You have most certainly been in that position at some point in your life, so stop trying to call others out for it.


----------



## cucker tarlson (Feb 14, 2019)

Vycyous said:


> And how many AMD/ATI products have you purchased during that time period?


many.
although the most recent one was an r9 290 in 2014, so that was some time ago.


----------



## londiste (Feb 14, 2019)

Vycyous said:


> It was more of a broad statement. For whatever reason, I stuck with Intel and Nvidia for over 15 years across multiple product generations.


I thought you were talking about better performance, efficiency and/or value in specific things you bought.


----------



## Vycyous (Feb 14, 2019)

cucker tarlson said:


> many.
> although the most recent one was an r9 290 in 2014, so that was some time ago.



Well, I applaud you for buying an AMD product when they were struggling. For me, it's only been within the last few years that my conscience has stopped me from buying new Intel and Nvidia hardware and I have mostly switched to AMD CPUs and GPUs. I still buy used stuff every now and then because I just like to have a wide range of computer hardware to mess around with, I guess... it's a hobby and hobbies are rarely practical.


----------



## cucker tarlson (Feb 14, 2019)

Vycyous said:


> Well, I applaud you for buying an AMD product when they were struggling. For me, it's only been within the last few years that my conscience has stopped me from buying new Intel and Nvidia hardware and I have mostly switched to AMD CPUs and GPUs. I still buy used stuff every now and then because I just like to have a wide range of computer hardware to mess around with, I guess... it's a hobby and hobbies are rarely practical.


Conscience has nothing to do with what I buy. I did not give a crap whether they were struggling or not. Hawaii was a very competitive product against the 780, which was way overpriced back then, and so was the 7870 I bought before that. That's why I had them. Now AMD's enthusiast cards are not beating Nvidia's offerings in price/performance, so I stay away from them. Vega arrived late and priced close to the 1080 Ti; Radeon VII is a sad continuation of the story of AMD's demise in the enthusiast segment.


----------



## Vayra86 (Feb 14, 2019)

Vycyous said:


> Well, aren't we all? That's a rhetorical question, because the answer is yes. We've all fallen prey to a marketing ploy or our own feelings and emotions, and anyone who says they haven't is lying.



So whose fault is that then? AMD's? Nvidia's? The world's?

I'm struggling to find the logic here. Marketing is a given, so deal with it. That was no different ten, twenty or a hundred years ago than it is now. It also has nothing to do with Nvidia or Intel schemes or tactics. All it is, in the end, is AMD failing to market its products to you.

Think of it as a marketplace on a crowded town square. Every salesman is yelling out loud trying to drown the others around him. AMD is there too, and its salesman sits there in a dark corner with teary eyes wondering why he isn't selling.

It's always the same with this sort of discussion: who's at fault? Is it the company that is finding and creating loopholes to look like the better choice, or is it the company that complains about others doing so? I think that question isn't relevant to begin with. The real question is: who is best at keeping its business afloat? We know the answer, and that pretty much ends the discussion. Or you can keep going on about it, like AMD fans tend to do, even decades after the fact... the horse is dead, trust me.

The only thing that counts is a competitive product; look where AMD is now with CPUs. You need design wins, it is that simple, and ever since ATI was bought, those have been very scarce on the GPU side. The company has chosen a specific approach to its GPUs, its marketing, and its ecosystem / industry support (must be open source, must be industry driven, and low-risk/low-investment for AMD), and those choices have had their effects. Many of them we could have predicted AND have seen coming for years (pushing HBM on consumer parts, the increasing inefficiency of GCN, the crappy marketing approach, and now 'focusing on the midrange'), and they have all come true in their worst possible form. Even when AMD had a competitive midrange product, it didn't land with gamers but with miners, and now mining is over and it's no longer competitive (because, again, rebrand upon rebrand of Polaris). And the rest of the product stack is simply _not good enough._


----------



## Mac2580 (Feb 17, 2019)

The RX570 wasn't out yet when I bought my 1050ti. No disrespect to the hardware, but I hated the AMD Crimson software. They might have sorted it out now (I used it 2012-2017), but they lost me as a customer because of it.


----------



## Camm (Feb 17, 2019)

Mac2580 said:


> The RX570 wasn't out yet when I bought my 1050ti. No disrespect to the hardware, but I hated the AMD Crimson software. They might have sorted it out now (I used it 2012-2017), but they lost me as a customer because of it.



It's the other way around now. AMD's software is amazing to use, whereas Nvidia's slow POS makes me want to drive pins into my eyes.


----------



## GoldenX (Feb 17, 2019)

I would choose the 1050 Ti over the 570, not for performance nor for driver UI (seriously, that GeForce Windows XP-era interface is so old now), but for power consumption and proper OpenGL support; the programs I use are not good friends with the AMD driver.


----------



## Mac2580 (Feb 17, 2019)

Camm said:


> It's the other way around now. AMD's software is amazing to use, whereas Nvidia's slow POS makes me want to drive pins into my eyes.



Maybe for you. At the time I was running a Q9400 with 4GB of RAM and an R7 260X. It was always using at least 600MB of RAM after an hour, refused to be ended in Task Manager, and sometimes wouldn't open at all despite running in the background.

Probably a RAM issue on my side, because the 4GB was pretty much always at full load, but it still left a bad impression in my mind.


----------



## GoldenX (Feb 17, 2019)

Mac2580 said:


> Probably a RAM issue on my side, because the 4GB was pretty much always at full load, but it still left a bad impression in my mind.


There you go.


----------



## Zubasa (Feb 17, 2019)

Mac2580 said:


> Maybe for you. At the time I was running a Q9400 with 4GB of RAM and an R7 260X. It was always using at least 600MB of RAM after an hour, refused to be ended in Task Manager, and sometimes wouldn't open at all despite running in the background.
> 
> Probably a RAM issue on my side, because the 4GB was pretty much always at full load, but it still left a bad impression in my mind.


CCC was a completely different animal from what AMD has now.
It was slow and unresponsive, and crashes were not uncommon.
AMD rewrote the driver for Crimson; since then it loads quickly on startup and will usually restart itself if it ever crashes.
Also, the UI is a lot more responsive than CCC and many features have been added since then.


----------



## Super XP (Feb 17, 2019)

> Why are people still buying 1050TI's and not RX 570's?



Maybe because people are misinformed or getting bad advice from others.


----------



## John Naylor (Feb 17, 2019)

Some people do research before they buy, some don't. Some people do research before they buy and still make the wrong decision, because even charts like TPU's can be misleading. For example, if you look at the charts, the RX 480, 580, and even the 590 might look attractive against the 1060. But when you compare the cards both overclocked, the 1060 has the edge. The 1050 Ti was one of those cards I call a "what the" card, like the 960... when you look at the relative performance of the 960 and 970 versus the cost difference between them, I'd say "what the... heck would anyone be thinking buying the 960 when for a bit more they could have a 970?" I put the 1050 Ti in that category too.

The other thing is this... back in the day, IBM was in the laptop business and they dominated. Every year, when the new generation of laptops launched, the print mags... yes, they still existed back then... had the IBM A20 on the cover. It would be the Editor's Choice, and when you read the test results and the article you'd always agree with their conclusion. Now every exec wannabe out there who didn't yet drive his Mercedes to work, didn't live in the right zip code, and couldn't afford the A20p walked into his office with an IBM laptop. Same reason every kid cries when Mom won't spring for the Air Jordans, and Gold Medal winners appear on the Wheaties box... because no one remembers the Silver Medal winner's name.

There's really been no competition at the top since the 780 / 290X generation... the 780 Ti took the 1st tier, the 780 held the second (again, both cards overclocked). With the 9xx series, the 970 took the 3rd tier, and with 10xx, they took the 4th. So now, when the uninformed ask their peers what they use, the responses in those upper tiers, which constitute a large part of the market, are mostly going to point to the green team. And those that didn't put in the T & E to research options at their budget level will make the wrong choice.

Using this ... 

https://www.techpowerup.com/reviews/MSI/GTX_1060_Gaming_X_3_GB/26.html
https://www.techpowerup.com/reviews/Sapphire/RX_570_Pulse/30.html ....

ignoring OC ability and rebates, and assuming a $1,000 base build (excluding the GFX card):

Compared to 1060 6GB .... 

1050 Ti = 100 / 160 = 63% x 57.8 / 53.3 = 67.8
RX 570 = 100 / 113 = 88% x 74.8 / 67.8 OC = 97.6

So when we look at value for a build w/ each card

1050 Ti Build = $1170 / 67.8 = $17.26
RX 570 Build = $1150 / 97.6 = $11.78

So yes, when you put in the T & E, the RX 570 is a way, way better investment than the 1050 Ti... however, the question that also begs to be asked: why aren't they buying the 1060?

1060 3GB = 100 / 106 = 94.3% x 95.4 / 85.9 OC = 104.8
1060 3GB Build = $1195 / 104.8 = $11.40
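The dollars-per-performance-point arithmetic above can be sketched in a few lines of Python. The build prices and composite performance scores come straight from this post's estimates; the function and variable names are mine, just for illustration:

```python
# Cost-per-performance comparison using the figures estimated in the post above:
# total build cost divided by a composite relative-performance score.
builds = {
    "1050 Ti":  {"build_cost": 1170, "perf_score": 67.8},
    "RX 570":   {"build_cost": 1150, "perf_score": 97.6},
    "1060 3GB": {"build_cost": 1195, "perf_score": 104.8},
}

def dollars_per_point(build_cost, perf_score):
    """Dollars paid per unit of relative performance (lower is better)."""
    return build_cost / perf_score

# Rank the builds from best value to worst.
for name, b in sorted(builds.items(), key=lambda kv: dollars_per_point(**kv[1])):
    print(f"{name}: ${dollars_per_point(**b):.2f} per performance point")
```

Lower is better, so on these numbers the 1060 3GB edges out the RX 570, and the 1050 Ti trails both by a wide margin.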


----------



## eidairaman1 (Feb 17, 2019)

Zubasa said:


> CCC was a completely different animal to what AMD has now.
> It was slow and unresponsive, also crashes were not uncommon.
> AMD rewrited the driver to Crimson, since then it loads quick on start up and would usually restart itself if it ever crashes.
> Also the UI is a lot more responsive than CCC and many features were added since then.



Yeah ccc was a pain, the console is much better now.


----------



## trog100 (Feb 17, 2019)

a 1050ti is nice upgrade for an office machine.. it just drops in with no extra power connectors needed.. and that is the reason people dont buy a 1060 or amd card.. none of them have the just drop in convenience.. 

its not really a gaming card just a neat way of giving an office machine half decent graphics with minimal expense and pissing about..

trog


----------



## Super XP (Feb 17, 2019)

eidairaman1 said:


> Yeah ccc was a pain, the console is much better now.


The Radeon is being better utilized. That's the better budget card.


----------



## erocker (Feb 17, 2019)

trog100 said:


> a 1050ti is nice upgrade for an office machine.. it just drops in with no extra power connectors needed.. and that is the reason people dont buy a 1060 or amd card.. none of them have the just drop in convenience..
> 
> its not really a gaming card just a neat way of giving an office machine half decent graphics with minimal expense and pissing about..
> 
> trog


Integrated graphics is plenty for office machines. 

Nvidia is almost 6 times larger than AMD. As far as your normal consumer, Nvidia has that market. AMD has done very little lately to gain that market. That's why people buy Nvidia.


----------



## Deleted member 67555 (Feb 17, 2019)

Last time I was at Microcenter a salesman advised a customer that a 1050ti @$170 was better than a rx570 4gb @$130.
BS like that really doesn't help either...
I'd rather be offered the RX570 and if needed a good enough PSU @$40


----------



## cucker tarlson (Feb 17, 2019)

salesmen don't know jack, and most people don't follow gpu reviews either. you can have a 1600 and an rx570 for a great price now, but I bet shops still sell more i3+1050ti computers.


----------



## notb (Feb 17, 2019)

erocker said:


> Integrated graphics is plenty for office machines.
> 
> Nvidia is almost 6 times larger than AMD. As far as your normal consumer, Nvidia has that market. AMD has done very little lately to gain that market. That's why people buy Nvidia.


Essentially this is it.
If you're really into computers, you certainly know about a company called AMD. You know they often make cheaper alternatives in both CPU and GPU realm. They also have some nice features which can make them a better choice in your particular case.
But that's all true *if you're into computers*.*

If you're not, you buy the mainstream brands. And this basically means you buy Intel and Nvidia, which is really fine since both make very good products - usually more user-friendly than the smaller competition.

Also, it's not like the RX 570 is a better card for a typical buyer. Yes, it may have an advantage in performance, but that's about it. Noise, power consumption, heat, ease of use - Nvidia wins in everything that really matters in the mainstream market.
AMD decided to ignore this. They focus on enthusiast, DIY-oriented products. It lets them save tons of money on polishing products. And as long as they're making the profit they want, having just 10% market share in CPUs and 20% in dGPUs could be a strength rather than a weakness.

*) If someone from this community has trouble imagining this situation, I suggest thinking about how he/she buys shoes, wine, flooring, cheese, plates or washing powder. 



jmcslob said:


> Last time I was at Microcenter a salesman advised a customer that a 1050ti @$170 was better than a rx570 4gb @$130.
> BS like that really doesn't help either...


You have to consider the fact that salesmen are interested in their profit margin. Nvidia's higher prices leave more room for the retailer. They may also pay some bonuses.
That's how business works, and AMD knows this. They could simply increase the MSRP and leave the whole difference to the distribution chain. But they decided that getting better marks in performance/price reviews will impact their sales more.


> I'd rather be offered the RX570 and if needed a good enough PSU @$40


The suggestion that the average PC owner will replace a PSU just to support a more power-hungry graphics card (from a brand he doesn't know much about) is just delusional. Not happening, ever.


----------



## Shambles1980 (Feb 17, 2019)

i don't need washing powder to have high energy efficiency, make no noise or produce no heat.. all i need it to do is wash stuff well.
in fact everything you mentioned i just need it to do 1 thing well.
and if im buying a gpu im pretty damn sure what i want it to do is display gfx as fast as possible..
To show you how silly the argument is i will now say the exact same thing to you..

my igpu beats the 1050ti on noise, power consumption, heat, and ease of use; the only thing the 1050ti does better is performance.. so the igpu is better.. (hope you see how silly that sounds)


----------



## notb (Feb 17, 2019)

Shambles1980 said:


> i dont need washing powder to be low energy efficiency, make no noise or produce no heat.. i need it to wash stuff well.
> in fact everything you mentioned i just need it to do 1 thing well.
> 
> and if im buying a gpu instead of using integrated im pretty damn sure what i want it to do is display gfx as fast as possible.. Not be energy efficient and produce less noise and heat.
> ...


The reality is not black and white. A rational person always looks for the best compromise. So despite what you said, if you're a rational person, you will as well.
But everyone has their own ideal compromise, with different weights put to different aspects. We call that "needs".

You have to understand Nvidia and AMD are doing things differently.

AMD products are designed to target a very specific group of enthusiasts. They are and will be better in raw performance/price reviews, because that's what they want to do.
They always make the best performing GPU they can - the one that will shine in benchmarks.
This approach usually leads to making the most powerful GPU that AMD can make in a particular price bracket.

Nvidia products are designed to target the needs of consumers. They do extensive market research and learn what GPU they should make to sell in volume and at a high profit.
This approach does *not* lead to making the most powerful GPU Nvidia can make in a particular price bracket.  But it leads to being a successful and rich company.

Look at this forum. How many topics about stability, undervolting, overclocking, temp, noise, driver issues and - finally - bad BIOS flash topics do we have for AMD cards and how many for Nvidia?


----------



## Super XP (Feb 17, 2019)

notb said:


> Essentially this is it.
> If you're really into computers, you certainly know about a company called AMD. You know they often make cheaper alternatives in both CPU and GPU realm. They also have some nice features which can make them a better choice in your particular case.
> But that's all true *if you're into computers*.*
> 
> ...


"""AMD decided to ignore this."""

What? No they did not. Lol
They finally have a very competitive CPU that's BETTER than anything Intel has to offer. 

They have a struggling GPU based on the old GCN. They've been in the process of completely leaving GCN and moving on to something new. Designing a brand new GPU takes many years. 

So would you rather AMD sit idle and do nothing, or continue refining the old, outdated GCN until they're ready for the new GPU release? 

Well, that's what they've been doing for several years now. Many Nvidia fanboys think designing and releasing a brand new GPU is equivalent to opening up a Cracker Jack box and POOF, a brand new AMD Radeon powerful enough to knock back Nvidia. 

The world doesn't work this way. Releasing hardware takes years of hard work.


----------



## Fluffmeister (Feb 17, 2019)

Ultimately who cares? Why do people wait years for the same performance at the same price? Why buy Levi jeans instead of Lee jeans when they are all made in the same sweat shop by people getting paid peanuts. Why buy a Ferrari when you can get a Nissan GTR?

Why why why? Why did they die in Waco? I know why.


----------



## Super XP (Feb 17, 2019)

Fluffmeister said:


> Ultimately who cares? Why do people wait years for the same performance at the same price? Why buy Levi jeans instead of Lee jeans when they are all made in the same sweat shop by people getting paid peanuts. Why buy a Ferrari when you can get a Nissan GTR?
> 
> Why why why? Why did they die in Waco? I know why.


Because people want to be seen in a Ferrari as a status symbol. Quite pathetic actually LOL


----------



## notb (Feb 18, 2019)

Super XP said:


> """AMD decided to ignore this."""
> 
> What? No they did not. Lol
> They finally have a very competitive CPU that's BETTER than anything Intel has to offer.


I'm not talking about benchmarks. I just said they will win raw performance comparisons. That's the whole idea. 

I'm just trying to say that AMD could check what consumers need from time to time. It would boost revenue and maybe even improve the product lineup...
2 years ago they were selling sh*t and had 8-10% market share. Now they're selling these shiny benchmark dominators and their market share is reported to be 11-12%.
Sure, it's a huge increase for them, but they're still really tiny.

You say their CPUs are better than anything Intel has. So why aren't most people on Earth already using AMD? Any guess? 

To be honest: there's really nothing bad in being a small company. You can still be profitable and have an interesting product lineup (which AMD does for sure).
But if you're small, you'll never be able to afford a good distribution chain, good enterprise support or invest in new product lines. You're not building a strong brand. And a strong brand is what lets you sell at higher prices.

Also, if I were an AMD fanboy like some people here, I would really worry about 7nm access. Seriously, how much more can companies like Apple or Nvidia pay TSMC for a good place in a queue?


Super XP said:


> Because people want to be seen in a Ferrari as a status symbol. Quite pathetic actually LOL


Oh really? And can a PC be a status symbol?

Why did you fill the "specs" on this forum? Why do you have a PC case with a window?


----------



## trog100 (Feb 18, 2019)

i have two main desktop PCs in my house.. one i call a gaming machine with a 2080TI.. the other is mostly used for browsing and youtube watching and not by me.. it has a 1050TI in it.. 

earlier i used the wrong term.. "office machine".. it isnt its just a low power machine that can be used for light gaming if so required its also in a small form factor case..

thats is why i bought a 1050ti.. there is no amd equivalent.. 

trog


----------



## Super XP (Feb 18, 2019)

notb said:


> I'm not talking about benchmarks. I just said they will win raw performance comparisons. That's the whole idea.
> 
> I'm just trying to say that AMD could check what consumers need from time to time. It would boost revenue and maybe even improve the product lineup...
> 2 years ago they were selling sh*t and had 8-10% market share. Now they're selling these shinny benchmark dominators and their market share is reported to be 11-12%.
> ...


I noticed this way back in the AMD Athlon XP days, where owning an AMD gave the impression of you being POOR for example. And that Intel "Inside" was the only way to go.

As for the rest of your points, I actually agreed with U, just phrased it a different way. AMD being the underdog and all, and holding such title ever since AMD became AMD.
AMD just needs to try and keep ahead, and they've been doing this with innovative products. What else can they do? Intel is far too powerful. 
Oh, forgot something: when AMD gets set back, such as with the Bulldozer launch, they fall behind quite a bit and take a lot longer to bounce back. Despite Bulldozer lacking, I do admit they did great in keeping that product line alive and selling right up until Zen was released. 

I speak to people today, and when I mention AMD, they say AMD runs too hot and Intel is much faster. My point is the AMD they speak about is the AMD of 20+ years ago, when it ran too hot and was somewhat less stable than its Intel counterparts. lol


----------



## lexluthermiester (Feb 18, 2019)

Super XP said:


> I noticed this way back in the AMD Athlon XP days, where owning an AMD gave the impression of you being POOR for example. And that Intel "Inside" was the only way to go.


Maybe it was the area, but we didn't have this problem where I lived. It was all about the gaming performance. AMD and Intel traded blows on who was top dog in games. For a while, AMD was king with the Athlon XP Barton cores. Then Intel came back with the high-GHz Pentium 4 and took that crown back, but not by much, and when you OC'd the Athlons, the gap closed up. Match an Athlon with an ATI (back then) Radeon and you had a rockin' gaming rig.


----------



## eidairaman1 (Feb 18, 2019)

lexluthermiester said:


> Maybe it was the area, but we didn't have this problem where I lived. It was all about the gaming performance. AMD and Intel traded blows on who was top dog in games. For a while, AMD was king with the Athlon XP Barton cores. Then Intel came back with the high-GHz Pentium 4 and took that crown back, but not by much, and when you OC'd the Athlons, the gap closed up. Match an Athlon with an ATI (back then) Radeon and you had a rockin' gaming rig.



Still do. That's even true going with a 7970 6G on a Ryzen, an R9 280X, or an R9 290.


----------



## Countryside (Feb 18, 2019)

There has to be a really specific reason to buy a 1050 Ti, but for pure gaming the RX 570 is a no-brainer.


----------



## sepheronx (Feb 18, 2019)

On the used market, an RX 570 is almost the same price as a used RX 480 or 580, so there is little point in buying a used one.

Heck, I just checked: even new RX 570s are only about $40 cheaper than a new RX 580. So there is little point in purchasing one in Canada.


----------



## cucker tarlson (Feb 18, 2019)

Super XP said:


> They finally have a very competitive CPU that's BETTER than anything Intel has to offer.


I still think they need purely gaming oriented mid-range chips like 8400/8600k/9600k's to cover the whole spectrum. The 8400 is 900 PLN here and it's a better gaming cpu than the 1450 PLN 2700x. For 1100 you can have an 8600k and blow the 2700x out of the water if you're interested in gaming only. The difference in price is enough for you to get a high-end air cooler or a 240mm CLC and keep it cool and quiet.


----------



## notb (Feb 18, 2019)

cucker tarlson said:


> I still think they need purely gaming mid-range chips like 8400/8600k/9600k's to cover the whole spectrum.


IMO it's still a long way to go.

No Atom / ULV options (Zen CPUs start at 15W because of IF). Very weak embedded lineup (mostly still using Excavator). I doubt "IoT" is a popular topic on AMD internal meetings.
Mobile CPUs exist but aren't great and currently limited to 4 cores (Intel already offers 6C and just announced 8C mobile CPUs).
There are also gaps in the "Pro" lineup - a problem Intel doesn't have (most consumer CPUs have the essential security features).

And funnily enough, until 7nm comes along and makes higher core counts possible, AMD is behind Intel on core count in AM4 APUs. And this will not change until they make a smaller, more efficient IGP. Vega is just way too big. But if they manage to fit a GPU into the I/O module of the new chiplet design - problem solved.

Then there's the question of use scenarios rather than raw specs. Zen doesn't work well in some applications, so AMD has to either drastically improve IF or design a separate lineup.

It's actually a lot better on the GPU front. The lack of hardware Tensor solution is a bummer, but other than that (GP)GPUs do what they're meant to do. They only have an efficiency problem.


----------



## Caring1 (Feb 23, 2019)

JRMBelgium said:


> Fact:
> The RX 570 clearly wins in terms of performance....


Wrong!
Just because you call it a fact, doesn't make it so.
Power consumption is part and parcel of "performance" so explain how a card that uses much more power wins?


----------



## ArbitraryAffection (Feb 23, 2019)

Caring1 said:


> Wrong!
> Just because you call it a fact, doesn't make it so.
> Power consumption is part and parcel of "performance" so explain how a card that uses much more power wins?


What.  570 is faster than 1050ti. Unless you're underclocking the crap out of the 570, it is a fact.

570 doesn't even use that much more power, wtf. And overclocking it can be near 1060 performance.

Truth is, unless you need a slot powered card or low profile, if you buy 1050ti over 570 you are an _idiot._



cucker tarlson said:


> I still think they need purely gaming oriented mid-range chips like 8400/8600k/9600k's to cover the whole spectrum. The 8400 is 900 PLN here and it's a better gaming cpu than the 1450 PLN 2700x. For 1100 you can have an 8600k and blow the 2700x out of the water if you're interested in gaming only. The difference in price is enough for you to get a high-end air cooler or a 240mm CLC and keep it cool and quiet.


6 threads smh

Can't wait till games start using the 2700X lol


----------



## lexluthermiester (Feb 23, 2019)

Caring1 said:


> Just because you call it a fact, doesn't make it so.


However, the overwhelming amount of benchmarking and testing does make it a fact. The RX 570 outperforms the 1050 Ti handily. That is a fact. Whether he says it, W1zzard says it, or I say it, the numbers prove it.


Caring1 said:


> Power consumption is part and parcel of "performance"


No, it isn't. Power consumption is a metric that stands by itself and has nothing to do with performance. And if you think that the amount of power used by the RX 570 over the 1050 Ti is going to hurt someone's power bill, you need to go do some research.
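To put a rough number on that, here's a back-of-the-envelope sketch. The wattage gap, daily gaming hours, and electricity rate below are all illustrative assumptions, not measured figures:

```python
# Rough annual power-cost difference between two graphics cards.
# Assumed inputs (not measurements): ~75 W extra draw under load,
# 3 hours of gaming per day, $0.13 per kWh.
EXTRA_WATTS = 75
HOURS_PER_DAY = 3
PRICE_PER_KWH = 0.13

def annual_cost(extra_watts, hours_per_day, price_per_kwh):
    """Extra kilowatt-hours per year multiplied by the electricity price."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

print(f"~${annual_cost(EXTRA_WATTS, HOURS_PER_DAY, PRICE_PER_KWH):.2f} per year")
```

Under those assumptions the difference works out to roughly ten dollars a year, which is exactly the point: the gap is negligible on most power bills.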


----------



## eidairaman1 (Feb 23, 2019)

Caring1 said:


> Wrong!
> Just because you call it a fact, doesn't make it so.
> Power consumption is part and parcel of "performance" so explain how a card that uses much more power wins?



For the performance you get, look at GPU temps compared to frames: for an 8-10 degree increase in temps you get double the amount of frames.

The 570 has more going for it and is obviously the superior card.


The 1050 Ti is like an FX 5200.


----------



## cucker tarlson (Feb 23, 2019)

ArbitraryAffection said:


> Can't wait till games start using the 2700X lol



The fact is they ARE. What is happening is the CPU load is spread across all threads, which makes those high-core-count Ryzens pretty cool and power-efficient chips for gaming. They're not using as much of their resources as e.g. a hexacore i5.
Where people make a common mistake, though, is assuming equivalence between core performance and thread count. A 2600X will stay at 50% load across all threads where an 8600K will get to 70%, but the 2600X will still lose in performance numbers because of what I said - no equivalence. Games prefer a faster core. Multi-threaded games do too. More cores are only useful when you're core-count limited, and hexacores with no HT are not. Or at least the number of cases where an 8600K will be hampered by its lack of threads is overwhelmingly lower than where Ryzen will be hampered by its single-core performance/latency, even though it IS in fact using multiple threads. Otherwise, core performance and memory speed/latency are all that matter.
Buying something and then hoping the scenario where it outperforms the competition will happen is silly to me. If it ever happens and you feel you need an upgrade, replace it. Don't sacrifice current performance, at least not too much. If the 2700X had 8600K performance it'd be a much better buy than the i5, but the truth is the 2700X can't outperform the 8400 overall.

Take the most multi-threaded games there are currently. Wildlands can load all 16 threads on a 1700X, for example - I have a friend who did this test. Even if you drop the resolution to 720p, it'll not load all 16 threads anywhere near 100%. It's not like the relevance of core performance is going to vanish with the introduction of higher-core-count CPUs. To sum up, it's not that games are not using the 2700X - they are. The bottleneck is somewhere else, which is why you see it match the 8400 overall.

You see the 2700X hardly improving on the 2600X's performance in games; then you test with higher-speed/low-latency RAM and the performance just skyrockets. Latency is key too. You can see the Threadripper 2950X with its 4.4GHz turbo lose to locked 2nd-gen Ryzens, which seems weird unless you look at the latency results on these chips. 2nd-gen Ryzens improved over 1st gen here - not by much, but it already shows a little. I'm curious about the new I/O die design on 3rd gen. I hope they don't take a step forward in core count per CCX and then a step backwards by not improving latency. I want gaming-oriented CPUs, not Threadrippers brought to the mid-range segment.


----------



## ArbitraryAffection (Feb 23, 2019)

cucker tarlson said:


> the fact is they ARE.what is happening is the cpu load is spread across all threads,which makes those high core count ryzens pretty cool and power efficient chips for gaming.They're not using as much resources as e.g. a hexacore i5.
> ...



Fair enough, but it's not like the 2700X is massively slower than the 8600K; it's like 15%. If I was buying Intel right now (I'm not) I would get the 9700K, not one of the hexa i5's. If the 9600K had HT I would've praised it. People talk like Ryzen is trash for gaming but honestly it's almost as good, and has waaaay better multi-processing performance at the same price point. I'll take 15% less FPS when I'm already over 100, and 50% more multi-processing, any day. I'm still betting on it.


----------



## cucker tarlson (Feb 23, 2019)

ArbitraryAffection said:


> Fair enough but it's not like 2700X is massively slower than 8600K, it's like 15%. If I was buying Intel right (I'm not) I would get 9700K, not one of the hexa i5's. If 9600K had HT I would've praised it. People talk like Ryzen is trash for gaming but honestly it's almost as good and has waaaay better multi-procesing performance at the same price point. I'll take 15% less FPS when I'm already over 100 and 50% more multi-processing any day  I'm still betting on it.


15% is a lot; that's more than Vega 56 to Vega 64.
It's not trash. No one is saying it's trash except the AMD fanbase putting words in other people's mouths, but the balance is not the way a lot of people would like. Show me a scenario where 8c/16t can do something for a regular user that 6c/12t can't. 10-15% more gaming performance is nothing to scoff at, on the other hand. Current gen i7s are pretty badly priced, tbh. I never remember a situation where the i7-K cost a 70% premium over the i5-K; it's always been 40% here in PL. If the 2700X had the 8600K's gaming performance, I'd probably have bought it already.


----------



## ArbitraryAffection (Feb 23, 2019)

cucker tarlson said:


> 15% is a lot,that's more than Vega 56 to Vega 64.
> It's not trash,no one is saying it's trash except for amd fanbase putting words in other people's mouths.but the balance is not the way a lot of people would like.show me a scenario where a 8c/16t can do something for a regular user that 6c/12t can't. 10-15% more gaming performance is nothing to scoff at on the other hand.


The gamer in me would want the highest FPS in Warframe (a 9600K would do fine) but the _enthusiast_ in me wants *all of the cores ever*. But the skint person in me can't afford a 9900K. XD

But seriously, I'm fine with the 2700X. My FPS is 80+ in Warframe and mostly over 100. Going up to 144Hz kinda makes me think about getting a more gaming-oriented CPU to keep it closer to my max refresh rate, but at this point I will wait for Zen 2 in July~ and maybe get some faster RAM too. Saves me getting a new motherboard (if I was going Intel, that is). I still don't want to buy Intel on principle; I just feel their products aren't priced fairly. Okay, actually, come 2020 I will give Intel a clean slate. Maybe I will get an Ice Lake when their 10nm finally comes. Who knows.


----------



## Vario (Feb 23, 2019)

jmcslob said:


> Last time I was at Microcenter a salesman advised a customer that a 1050ti @$170 was better than a rx570 4gb @$130.
> BS like that really doesn't help either...
> I'd rather be offered the RX570 and if needed a good enough PSU @$40


Of course it's better... for Microcenter... to sell $40 more in revenue. That salesman should get a promotion. I don't think they even give these guys commission and he's still pushing the expensive products, what a champ!


----------



## eidairaman1 (Feb 23, 2019)

Vario said:


> Of course its better... for microcenter ... to sell $40 more in revenue.  That salesman should get a promotion, I don't think they even give these guys commission and hes still pushing the expensive products, what a champ!



That's the reason I stay informed (and why I stopped seeking advice here): if I see someone getting suckered, I will inform them (which is why I give advice).

Either way, Microcenter makes a profit off everything they sell, otherwise they wouldn't keep it on the shelf. Heck, they still make dough on IGP replacement boards.


----------



## John Naylor (Feb 23, 2019)

The numbers are the numbers. Arguing about images, phases of the moon, mindshare and other factors is fruitless. Yes, when a product dominates, it benefits from mindshare... but that just doesn't happen by itself, you have to earn it. As "educated consumers", we should be beyond that. When you pick up the Wheaties box, the person on the cover is the one who won the gold medal. And when you pick up a PC-related magazine, the cover belongs to nVidia because the press, web-based and print, likes to write about the exciting stuff because it sells more ads. The 1080 Ti took the gold, the 1080 the silver, the 1070 the bronze; the 1060 just missed a medal, but it finished ahead of everything against it. The idea of tech mags and sites is to get articles read. What they print has to do with answering the questions: a) will they send me a sample, and b) who will read it. Only one of those is a judgement call. There is one 570 review here on TPU... there are 4 on the 1050 Ti.

The 570 was the better card, but the choice between the 570 and the 1050 Ti ***today*** is like arguing about which is better, Betamax or VHS. Prices on the 1060 3 GB have dropped to a point where it makes no sense to consider either of them. It's one of those cards, like the 970, where the 960 and its competition were just left in the dust, such that the 970 sold more than all 20+ AMD 200 and 300 series cards combined. The RX 570 is $150, and no, it doesn't make sense to spend an extra $20 for the Ti ($170). From a performance standpoint, it would make sense to buy the 570 even if it cost $20 more... the problem for AMD is, it also makes sense to spend the extra $20 to get the 1060 3GB for $190. A $1000 build w/o a GFX card presents the following price / performance ratio:

1050 Ti Build = $1170 / 67.8 = $17.26
RX 570 Build = $1150 / 97.6 = $11.78
*1060 3GB Build = $1195 / 104.8 = $11.41*

So while there is a huge jump in value per dollar going from the Ti to the 570, why abandon your evaluation methods/logic and not take the 1060 3 GB? None of these cards is adequate for 1440p, and at 1080p the 1060 has the best ROI. But there are other things to think about besides performance. Let's look at some other factors:

Criteria: 1060 / 570 / 1050 Ti

Fan Noise Idle: 0 / 0 / 0 dBA
Fan Noise Load: 29 / 31 / 27 dBA
Power: 130 / 180 / 75 watts
Temps @ OC: 67 / 74 / 66 C

Anything there that might swing a decision? How about the guy with a PSU that doesn't have PCI-E cables and can't use a card that draws over 75 watts? It shouldn't be about hand-picking two cards and using a limited scenario; this is what Purch Group does on their media sites to pick the "best card under... [$1 more than the price of the product they've been paid to generate interest in]". If we look at the $150-$200 price range and pick the best card for 1080p, it's clearly the 1060 3 GB.
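The dollars-per-performance-point arithmetic above can be sketched in a few lines. This is a minimal illustration using only the build price and the performance scores quoted in this post; none of the figures are live market data.

```python
# Sketch of the price-per-performance arithmetic quoted in the post above.
# Prices and performance scores are the post's own figures, not current data.
BASE_BUILD = 1000  # $1000 build without a graphics card

cards = {
    "GTX 1050 Ti":  {"price": 170, "perf": 67.8},
    "RX 570":       {"price": 150, "perf": 97.6},
    "GTX 1060 3GB": {"price": 195, "perf": 104.8},
}

def dollars_per_point(card: dict) -> float:
    """Total build cost divided by the card's performance score."""
    return (BASE_BUILD + card["price"]) / card["perf"]

# Lower is better: dollars spent per point of performance.
for name, card in sorted(cards.items(), key=lambda kv: dollars_per_point(kv[1])):
    print(f"{name}: ${dollars_per_point(card):.2f} per performance point")
```

Note how the whole-build framing flattens the gap: the GPU is only a fraction of the total cost, which is exactly why the 1060 3GB edges out the 570 here despite its higher sticker price.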


----------



## EarthDog (Feb 23, 2019)

Vario said:


> Of course its better... for microcenter ... to sell $40 more in revenue.  That salesman should get a promotion, I don't think they even give these guys commission and hes still pushing the expensive products, what a champ!


Pretty sure they do get commission. They insist on sticking their sticker on whatever boxes you may walk away with.


----------



## Shambles1980 (Feb 23, 2019)

I generally think that people who work in places like that were once enthusiasts and then just stopped caring, and now live in a bubble of what things were like back when they were enthusiasts.
It's quite easy to do as well.
Get a job, settle down, get a family; the next thing you know, what used to be the best just isn't any more and things that used to suck are considered top of the line.

I still don't know how MSI and Gigabyte managed to get a reputation as good board manufacturers while Abit went bust.


----------



## Vario (Feb 23, 2019)

It's a tedious and somewhat sophist thread built around flawed premises, such as that: 1) the 1050 Ti was always priced the same as the 570 over its entire release cycle; 2) Amazon sales rank is real-time; 3) cryptocurrency didn't make the 570 substantially more expensive than the 1050 Ti for a long period; 4) people don't buy the 1050 Ti for Dell, HP, and other OEM desktop office boxes that have low power supply wattage and insufficient power connectors for upgrades. All in order to make the fanboy argument that Nvidia buyers are nv-idiots. When is this thread going to die?


Shambles1980 said:


> i generally think that people who work places like that were once enthusiasts and then just stopped caring and now just live in a bubble of what things were when they were enthusiasts.


That's how it is.  You just stop caring and push the products because it is your job to push the products.  Retail jobs mold the employee.


----------



## cucker tarlson (Feb 23, 2019)

Vario said:


> Its a tedious and somewhat sophist thread built around flawed premises, all in order to make the fan boy argument that nvidia buyers are nv-idiots.


they're here to inform people if they see someone getting suckered.


----------



## Vario (Feb 23, 2019)

cucker tarlson said:


> they're here to inform people if they see someone getting suckered.


I would hope no one is actually buying a 1050 Ti at $150. You can frequently get them for $100 on eBay.


----------



## notb (Feb 23, 2019)

John Naylor said:


> The numbers are the numbers.
> [cut]


While I might not agree with the use of "ROI" in this case (seriously, don't), the whole post points out an important argument... one that many people don't get.

When you have a dense product lineup of GPUs, diversified even further by many different AIB variants, you can make some GPUs excel at a particular thing and target a particular client group. And this specialization will make them relatively unattractive on pure performance/price ratio.

So yes, 1050Ti may not be the best value based on just the basic parameters, but it will be the best buy for many consumers thanks to being small, cool, quiet and powered from PCIe.
In other words: many customers won't see 1050Ti and RX570 as alternatives.

To understand just how much things like being slot-powered can be worth to customers, you only need to look at the RX 550, the best AMD GPU that pulls this off.
The card launched with an MSRP of $79, but the cheapest I could find it on Amazon is $110. This is the card we should compare the 1050 (Ti) to, not the RX 570.


----------



## eidairaman1 (Feb 23, 2019)

There are lp 560s...


----------



## cucker tarlson (Feb 23, 2019)

eidairaman1 said:


> There are lp 560s...


Which are weak cards, unlike the bigger Polaris chips. They're 20% slower than the 1050 Ti while they still require a 6-pin, and they get massacred in efficiency; the 1050 Ti has 1.75x the perf/watt of the 560.
https://www.computerbase.de/2017-06...istungsaufnahme-der-grafikkarte-youtube-video

Sorry, but 560s are just too bad in comparison, and a low price won't change that. Actually, the price is not even that far off the 1050 Ti, which is a bad buy itself.
Higher tier Polaris cards deliver very good performance despite being power guzzlers (relatively; I still think 200W is rather easy for today's coolers to deal with quietly). The RX 560 has neither good performance nor efficiency. Nor price, for that matter.


----------



## notb (Feb 23, 2019)

eidairaman1 said:


> There are lp 560s...


Sh.t, you're right. There are RX 560 cards without a power connector. But they're mostly $130-$140, so that's GTX 1050 money (for less oomph).

Are there any low-profile RX 560s besides MSI's? And that MSI costs $160. Nice. :-D


----------



## cucker tarlson (Feb 23, 2019)

Why are people still buying RX 560s and not 1050s/1050 Tis?


----------



## eidairaman1 (Feb 23, 2019)

Power guzzlers is a funny term.


----------



## cucker tarlson (Feb 23, 2019)

eidairaman1 said:


> Power guzzlers is a funny term.


English is not my 1st language, so sorry if I make anything sound weird once in a while. They do require beefy cooling solutions despite being 1080/60 cards.


----------



## lexluthermiester (Feb 24, 2019)

Vario said:


> When is this thread going to die?


No one is forcing you to read and participate. Unwatch the thread, move along.


----------



## RealNeil (Feb 24, 2019)

xkm1948 said:


> I would choose 570 over 1050Ti any time. I mean at 1080p you would want as much fps as you can possibly get for an entry level card.



And the 570 does Crossfire too.


----------



## lexluthermiester (Feb 24, 2019)

RealNeil said:


> And the 570 does Crossfire too.


Another good point!


----------



## eidairaman1 (Feb 24, 2019)

cucker tarlson said:


> English is not my 1st lang,so sorry if I make anything sound weird once in a while  they do require beefy coolings solutions despite being 1080/60 cards.



Dual-slot isn't beefy, it's reasonable. Tri-slot cooling is beefy.


----------



## lexluthermiester (Feb 24, 2019)

eidairaman1 said:


> Dual slot isnt beefy and reasonable. Trislot cooling is beefy.


Remember when the following single-slot card had a cooler that was considered beefy?
https://www.techpowerup.com/forums/threads/rare-gpus-unreleased-gpus.176929/post-3992972


----------



## eidairaman1 (Feb 24, 2019)

lexluthermiester said:


> Remember when the following single-slot card had a cooler that was considered beefy?
> https://www.techpowerup.com/forums/threads/rare-gpus-unreleased-gpus.176929/post-3992972



I put an Arctic Cooling VGA silencer on my Radeon 9700 Pro All In Wonder.

Look at My card now lol.


----------



## RealNeil (Feb 24, 2019)

My two Vega-64 cards are Beefy,.......LOL!


----------



## cucker tarlson (Feb 24, 2019)

RealNeil said:


> And the 570 does Crossfire too.





lexluthermiester said:


> Another good point!


Crossfire? On RX 570s? Really?

Both 570 CF and a 2060 cost around $350.


----------



## lexluthermiester (Feb 24, 2019)

cucker tarlson said:


> crossfire ? on rx570s ? really ?
> 
> both 570 cf and 2060 cost around $350


True. What if you already have an RX570 and a crossfire mobo? Getting another 570 would be simple.


----------



## cucker tarlson (Feb 24, 2019)

lexluthermiester said:


> True. What if you already have an RX570 and a crossfire mobo? Getting another 570 would be simple.


You sell it, lol. Are you asking me to describe the process of upgrading a graphics card in detail?
It would be stupid, not simple.
If you buy another 570 instead of a 1070 Ti/Vega/2060, you are a stupid, stupid man.


----------



## lexluthermiester (Feb 24, 2019)

cucker tarlson said:


> you sell it,lol,are you asking me to describe the process of upgrading a graphics cards in detail?
> it would be stupid,not simple.


Crossfire is arguably better than SLI and is dead simple to setup. Why would it be stupid?


----------



## cucker tarlson (Feb 24, 2019)

lexluthermiester said:


> Crossfire is arguably better than SLI and is dead simple to setup. Why would it be stupid?


Give it up; that's ridiculous when a single card can offer close to 2x the performance at the same cost and less than half the power draw.
CF sucks too. Can you link any test where it proves any more relevant than SLI, or are you going to brainlessly continue repeating what the red fanbase says?

https://www.techspot.com/review/1763-radeon-rx-590-crossfire/


Spoiler: 1.37x scaling

Spoiler: 2x $280 cards slower than a single $500 card

Spoiler: power draw

> Three years later we find that once again multi-GPU technology seems like a good idea on paper, but in practice it’s a bit of a fail. In our opinion SLI/Crossfire only makes sense for those with money to burn. For example, right now RTX 2080 Ti SLI graphics cards are about the only multi-GPU configuration that makes sense.
> 
> As for the RX 590s in Crossfire, we’d much rather have a single Vega 64 graphics card. It’s extremely rare that two 590s will provide higher frame rates than a single Vega 64, while also offering stutter-free gaming.



Quit pleasing the people who created this thread; use your own thinking, maybe.

Worse performance, stutter, higher power draw and noise... that sounds exceptionally good. Makes me wanna grab a pair of V56 myself.

And just to be clear, Nvidia offers much wider SLI support for games.
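For what it's worth, the value argument here can be put in rough numbers. A minimal sketch, assuming the ~1.37x average scaling figure from the linked TechSpot review and the earlier thread estimates that two RX 570s and one RTX 2060 both cost about $350, with the 2060 at roughly 2x a single 570; all of these are approximations, not measured data:

```python
# Back-of-envelope Crossfire value check using figures quoted in this thread.
# All numbers are rough, illustrative assumptions.
CF_SCALING = 1.37        # average frame-rate scaling for two cards in CF
PRICE_570_CF = 350       # approx. cost of two RX 570s
PRICE_2060 = 350         # approx. cost of one RTX 2060
PERF_570 = 1.0           # normalize one RX 570 to 1.0
PERF_2060 = 2.0          # ~2x a single RX 570

cf_perf_per_dollar = (PERF_570 * CF_SCALING) / PRICE_570_CF
single_perf_per_dollar = PERF_2060 / PRICE_2060

# At equal cost, the single card delivers about 1.46x the performance
# (2.0 / 1.37), before even counting stutter, power draw, and noise.
print(f"CF:   {cf_perf_per_dollar:.5f} perf/$")
print(f"2060: {single_perf_per_dollar:.5f} perf/$")
```

Under these assumptions the single card wins on pure perf/$ even before the frame-pacing and power arguments are considered.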


----------



## lexluthermiester (Feb 24, 2019)

cucker tarlson said:


> give it up,that's ridiculous when a single card can offer close to 2x the performance at the same cost and less than half power draw.


Those are some interesting numbers, and they are as irrelevant as they are non-inclusive. They also have nothing to do with the context of the upgrade scenario I referred to. If someone *already has* an RX570 and can easily add another one to their system, the upgrade path is not difficult to figure out. It's inexpensive and will add a noticeable boost to gaming performance.


cucker tarlson said:


> quit pleasing people who created this thread,use your own thinking maybe.


Thanks for the tip. Try to remind yourself I'm an NVidia owner..


cucker tarlson said:


> worse performance


Compared to what? You go from one card to two and the vast majority of games are going to get a healthy boost. I have personally seen this many times.


cucker tarlson said:


> stutter


Solved with a driver optimization and update, just like NVidia's SLI.


cucker tarlson said:


> higher power draw and noise


Of course, you're adding a second card.


cucker tarlson said:


> makes me wanna grab a pair of V56 myself.


That's not much of a point, two Vega56's will out-pace a lot of single cards at similar cost.


cucker tarlson said:


> and just to be clear,nvidia offers a much wider sli support for games.


Wider, perhaps, but not better. Crossfire support, as I understand it, is easy for devs to integrate into their games because AMD has made the tools easy to use. Crossfire is also easier to use with games that don't specifically support it, unlike SLI.


----------



## Vayra86 (Feb 24, 2019)

cucker tarlson said:


> give it up,that's ridiculous when a single card can offer close to 2x the performance at the same cost and less than half power draw.
> cf sucks too.can you link any test where it's proving any more relevant than sli or are you going to brainlessly continue repeating what the red fanbase says ?
> 
> https://www.techspot.com/review/1763-radeon-rx-590-crossfire/
> ...



Hey if people still want to go big on CF/SLI in 2019, power to them. If they fail at reading the numbers, posting more of them won't make a difference  Also, be wary when the Lex starts multi quoting every single sentence of a post, or better yet: run away fast!


----------



## cucker tarlson (Feb 24, 2019)

lexluthermiester said:


> Those are some interesting numbers and they are just as equally irrelevant as they are not all inclusive. They also have nothing to do with the context of the upgrade scenario I referred to. If someone *already has* an RX570 and can easily add another one to their system, the upgrade path is not difficult to figure out. It's inexpensive and will add a noticeable boost to gaming performance.
> 
> Thanks for the tip. Try to remind yourself I'm an NVidia owner..
> 
> ...


God damn, you are very meticulous in making sure your commentary is not read backwards, I'll give you that one.
Stutter solved, great; I guess it's no longer an issue.
And what good is wider SLI support when you can have 100% CF scaling in a few games? Play only those; another problem solved by AMD's masterful thinking.


----------



## lexluthermiester (Feb 24, 2019)

Vayra86 said:


> Also, be wary when the Lex starts multi quoting every single sentence of a post, or better yet: run away fast!


----------



## RealNeil (Feb 24, 2019)

I'm already heavily invested in SLI & Crossfire. My GPUs are already in sets or pairs. Wherever /whenever it works in a game, I'm all for it. If it is not enabled, I'll get over it. I still have decent performance when just one card works in a game. (Vega-64, 1070Ti, 1080Ti)
My original comment about Crossfire was just pointing out that AMD is still supporting dual card use on lower-priced GPUs, making it a lesser investment, should we choose to go down that path.

NVIDIA is gimping SLI on low-end (and now mid-range) cards, ensuring that SLI costs are that much higher. I see it as a money-grab.

I'm into shooters and those games tend to support SLI/CROSSFIRE more often than not. 

Tell me that it's not worth it. I don't care because I'm already there and happy with it.


----------



## FYFI13 (Feb 24, 2019)

Vayra86 said:


> Hey if people still want to go big on CF/SLI in 2019, power to them. If they fail at reading the numbers, posting more of them won't make a difference  Also, be wary when the Lex starts multi quoting every single sentence of a post, or better yet: run away fast!


I can't see his post, but i got you  Troll of trolls and AMD shill.



cucker tarlson said:


> stutter solved,great,I guess it's no longer an issue.


Depends on the title. WoW, Diablo 3, Crysis 3 and a few others still stutter. I haven't done more testing, but this dual Vega 56 setup I built didn't look great.


----------



## Redwoodz (Feb 24, 2019)

John Naylor said:


> The numbers are the numbers.  Arguing about images, phases of the moon, mindshare  and other factors is fruitless.    Yes,, when a project diminates, they benefit from mindshare ... but that just doesn't happen, you have to earn it.  As "educated consumers", we should be beynd that.  When you pick up the Wheaties box, the person on the cover is the one who won the gold medal.  And when you pick a PC related magazine, the cover belongs to nVidia because the press, web based and print, likes to write about the exciting stuff because it sells more ads.  The 1080 Ti took the Gold, the 1080 the silver, the 1070 the bronze, the 1060 just missed a medal, but it finished ahead of everything against it.  The idea of tech mags and sites is to get articles read.  What they prnt has to do with answrering the questions a)  Will they send me a sample and b) who will read in.    Only one of those is a judgement call.  There is one 570 review here on TP ... there's 4 on the 1050 Ti,
> 
> The 570 was the better card but the choice between the 570 and 1050 Ti ***today*** is like arguing about what's better Betmax or VHS.  Prices on the 1060 3 GB have dropped to a point where it makes no sense to consider either of them.  It's one of those cards kike the 970 where the 960 and it's competion were just left in the dust such that the 970 sold more than all 20+ AMD 200 and 300 series cards  combined.  The TX 570 is $150 and no it doesn't make sense to spend an extra $20 for the Ti ($170).  From a performance stabdpoint, it would make sense to buy the 570 is it was $20 more ... the problem for AMD is, it also makes sense to spend the extra $20 to get the 1060 3GB for $190  A $1000 build w/o a GFX card presents the following Performance / proce ratio:
> 
> ...



Fixed that for you. Since you spent so much time on all those metrics, it would be a shame to give people the wrong advice. 1060 3GB < RX 580 8GB


----------



## eidairaman1 (Feb 24, 2019)

FYFI13 said:


> I can't see his post, but i got you  Troll of trolls and AMD shill.
> 
> 
> Depends on tittle. Wow, Diablo 3, Crysis 3 and few others still stutter. Haven't done more testing, but this dual Vega 56 setup i built didn't look great.



Ehem he has a 2070/2080...



Redwoodz said:


> Fixed that for you since you spent so much time with all those metrics, be a shame to give people the wrong advice.   1060 3GB < RX 580 8GB


----------



## Vayra86 (Feb 24, 2019)

FYFI13 said:


> I can't see his post, but i got you  Troll of trolls and AMD shill.
> 
> 
> Depends on tittle. Wow, Diablo 3, Crysis 3 and few others still stutter. Haven't done more testing, but this dual Vega 56 setup i built didn't look great.





eidairaman1 said:


> Ehem he has a 2070/2080...



*1080. Guess that happens when you're watching the world from your neck


----------



## eidairaman1 (Feb 24, 2019)

Vayra86 said:


> *1080. Guess that happens when you're watching the world from your neck



I was talking about @lexluthermiester

He has a 2080 to be exact


----------



## fritoking (Feb 24, 2019)

I know you'd need an adapter, but is this a good deal? It's a mining card, but it should still play games, shouldn't it? $99
Sapphire Radeon RX 570 4GB GDDR5 DVI-D (UEFI) Graphics Card https://www.amazon.com/dp/B07MCDNQX2/ref=cm_sw_r_cp_apa_i_GyYCCbZZ2MGX2


----------



## lexluthermiester (Feb 25, 2019)

Vayra86 said:


> *1080. Guess that happens when you're watching the world from your neck


Had a 1080. Sold it for..


eidairaman1 said:


> I was talking about @lexluthermiester He has a 2080 to be exact


..this;
EVGA GeForce RTX 2080 BLACK EDITION
https://www.evga.com/products/specs/gpu.aspx?pn=10A3582D-8461-4B9F-825E-CCD4B0E7B151



fritoking said:


> I know youd need an adapter,  but is this a good deal? It's a mining card but shouldn't it still play games ? $99
> Sapphire Radeon RX 570 4GB GDDR5 DVI-D (UEFI) Graphics Card https://www.amazon.com/dp/B07MCDNQX2/ref=cm_sw_r_cp_apa_i_GyYCCbZZ2MGX2


That's a great price for that card. However, look at the shipping time. It states that it "Usually ships within one to two *months*". That's a hell of a wait time.
You'd be better off with something like this;
https://www.ebay.com/itm/MSI-Radeon-RX-570-ARMOR-OC-8GB-GDDR5-256-bit-Video-Card/254138972252
8GB versions are better.


----------



## Caring1 (Feb 25, 2019)

lexluthermiester said:


> However the overwhelming amount of benchmarking and testing do make it a fact. The RX570 outperforms the 1050ti handily. That is a fact. Whether he says it, W1zzard says it or I say it, the numbers prove it.
> 
> No it isn't. Power consumption is a metric that stands by itself and has nothing to do with performance. And if you think that the amount of power used by the RX570 over the 1050ti is going to hurt someones power bill, you need to go do some research.



Looks like you are the one that needs to do a lot of research.
Don't confuse performance with higher frame rates, and don't draw other people down to your level of ignorance.
If the person I quoted was simply saying one card is faster, ie has more FPS then I would agree with them, but they didn't.
Performance is not just speed.


----------



## fritoking (Feb 25, 2019)

lexluthermiester said:


> Had a 1080. Sold it for..
> 
> ..this;
> https://www.evga.com/products/specs/gpu.aspx?pn=10A3582D-8461-4B9F-825E-CCD4B0E7B151
> ...


I have an 8 GB 570; I just thought it might be a good buy for someone... didn't notice the shipping time though.


----------



## lexluthermiester (Feb 25, 2019)

Caring1 said:


> Don't confuse performance with higher frame rates


That is exactly what performance means. Frame rates are the de facto measurement of gaming performance. It's been that way for 30 years and will continue to be that way.


Caring1 said:


> and don't draw other people down to your level of ignorance.


Irony.


Caring1 said:


> If the person I quoted was simply saying one card is faster, ie has more FPS then I would agree with them, but they didn't.


When people talk about the performance of a card, in this case the difference between an RX570 and a 1050ti, they are talking about *gaming* performance in frames per second, not power efficiency. If they were talking about mining performance, then it's hashes per second. Regardless of the task being discussed, ultimately the calculations per second that a GPU can output are what matter most to people.


Caring1 said:


> Performance is not just speed.


Yes, it is.



fritoking said:


> I have an 8 GB 570. just thought it might be a good buy for someone....didnt notice the shipping time tho


That's cool. That ebay listing I linked for you is just under $125 shipped. It's a good deal. There's also this one for $130;
https://www.ebay.com/itm/MSI-Radeon-RX-570-ARMOR-OC-8GB-GDDR5-Graphics-Card-256-bit/223372624935
By way of reference, there are a few RX580's for similar prices as well;
https://www.ebay.com/itm/MSI-Radeon-RX-580-Gaming-X-8GB-256-bit-GDDR5-Graphics-Card/312500876829
$129 Shipped
https://www.ebay.com/itm/MSI-RADEON...-GDDR5-PCI-EXPRESS-X16-CROSSFIRE/233134874864
$140 shipped
Just food for thought..


----------



## Kissamies (Feb 25, 2019)

The prices are pretty funny in my favourite shop.

Cheapest RX 570 4GB: 229eur (Sapphire Pulse)
Cheapest RX 580 4GB: 175eur (Sapphire Pulse)
Cheapest RX 570 8GB: 239eur (MSI Armor)
Cheapest RX 580 8GB: 229eur (Gigabyte Aorus)


----------



## lexluthermiester (Feb 25, 2019)

Chloe Price said:


> The prices are pretty funny in my favourite shop.
> 
> Cheapest RX 570 4GB: 229eur (Sapphire Pulse)
> Cheapest RX 580 4GB: 175eur (Sapphire Pulse)
> ...


Those are weird prices for sure..


----------



## Vayra86 (Feb 25, 2019)

eidairaman1 said:


> I was talking about @lexluthermiester
> 
> He has a 2080 to be exact





lexluthermiester said:


> Had a 1080. Sold it for..
> 
> ..this;
> EVGA GeForce RTX 2080 BLACK EDITION
> ...



Reading comprehension, boys. The original line this originated from was, I believe, aimed at me: the big AMD shill with an Intel/Nvidia rig for the last decade  Scroll back up a bit when in doubt.



lexluthermiester said:


> I think @eidairaman1 knows who he was talking about and to. No, it was not aimed at or about you. But I digress..



No, you're also missing half the picture here, because you've been ignored.

*His=referring to you.
*'I got you' was in response to me, quoting my post.



FYFI13 said:


> *I can't see his post, but i got you * Troll of trolls and AMD shill.





trog100 said:


> i think this thread has somewhat lost the plot.. he he..
> 
> trog



As with guns, don't blame the thread; blame the people... I'm just trying to preserve what's left of sanity. Apparently not all that much


----------



## lexluthermiester (Feb 25, 2019)

Vayra86 said:


> *Reading comprehension, boys.* The original line this originated from was, I believe, aimed at me: the big AMD shill with an Intel/Nvidia rig for the last decade  Scroll back up a bit when in doubt.


I think @eidairaman1 knows who he was talking about and to. No, it was not aimed at or about you. But I digress..


----------



## trog100 (Feb 25, 2019)

i think this thread has somewhat lost the plot.. he he..

trog


----------



## las (Feb 25, 2019)

lexluthermiester said:


> Crossfire is arguably better than SLI and is dead simple to setup. Why would it be stupid?



Multi GPU is pretty much dead unless you're a bencher, especially with low to mid-end cards. You will have a much better experience with a better GPU. SLI/CF is asking for trouble. Even when fps is decent with multi GPU, the experience is often bad and stuttery because of frametimes and huge dips (minimum fps often sucks on CF/SLI).

Pretty much no devs care about multi GPU support in games. Almost no one uses it these days. It's up to AMD and Nvidia to patch things up with drivers. I'm never going to buy multi GPU again, that's for sure.


----------



## lexluthermiester (Feb 25, 2019)

las said:


> Multi GPU is pretty much dead unless you're a bencher.


That's a myth. I upgrade gaming rigs all the time, and because of the satisfaction guarantee that I offer in my shop, a system is not going out the door until we have shown the client how much better an additional card runs their games. If Crossfire/SLI didn't work, or was "dead", we wouldn't be promoting it as an upgrade worth doing. It does work and makes for an easy and inexpensive upgrade for people who have the right hardware.


las said:


> Pretty much no dev's care about multi GPU support in games.


Also a myth. Every developer codes support for CF/SLI.


las said:


> Almost no people use it these days.


Incorrect. It's an upgrade and even custom build setup we do 3 to 4 times a month. It's not something most people do, but enough people do it that it justifies continued support and development.


las said:


> I'm never going to buy multi GPU again, that's for sure.


Good for you.


----------



## Vayra86 (Feb 25, 2019)

lexluthermiester said:


> Also a myth. Every developer codes support for CF/SLI.



This is BS. There are many, many games without support for CF/SLI and there is no way they'll ever get it.

As for the other points: the fact that people still do it doesn't make it a good investment. Once you factor in total cost of ownership and performance over the time the cards are owned, CF/SLI is almost never beneficial. It USED to be a positive financial outcome, but today it happens far too often that two cards cost more than the single-GPU alternative, and when they don't, they are hopelessly bandwidth-starved (low/midrange) and stutter is a fact you simply can't deny.

There is good reason Nvidia banned the SLI fingers from the midrange, especially with new engines pushing much more across VRAM. Gamers and public opinion no longer accept microstutter the way they historically did. Ever since frame-pacing-gate, the stance on multi GPU has radically changed. It took some time, but that's where we're at, and I think it's positive to have only capable GPUs offer mGPU. On the other hand, the result is that support is dwindling. The new APIs add more fuel to that fire, and a DX12 future is almost certainly not welcoming to mGPU at all. That idea already died with Ashes and never got picked up again.

Maybe we should rephrase: _SLI/CF is reduced to an even smaller niche than it ever was_. And from a sensible standpoint, the only niche that should exist is that of ultimate performance - ie multiple top-end GPUs to drive the highest res and the highest detail and refresh.


----------



## las (Feb 25, 2019)

lexluthermiester said:


> That's a myth. I upgrade gaming rigs all the time, and because of the satisfaction guarantee that I offer in my shop, a system is not going out the door until we have shown the client how much better an additional card runs their games. If Crossfire/SLI didn't work, or was "dead", we wouldn't be promoting it as an upgrade worth doing. It does work and makes for an easy and inexpensive upgrade for people who have the right hardware.
> 
> Also a myth. Every developer codes support for CF/SLI.
> 
> ...



Everything I wrote is correct. Maybe you should get out of your bubble. Not a single game dev cares about multi GPU support. It's a waste of time and money for them. Less than 0.25% of PC gamers use multi GPU.

Even when it works, the experience is bad compared to a single GPU. Frametimes are all over the place. Dips, spikes and stutter.

Tons of new games do not even support multi GPU. PC barely gets focus from devs, and you think these devs care about 0.25% of PC gamers? Haha. SLI/CF is dead.


----------



## ArbitraryAffection (Feb 25, 2019)

eidairaman1 said:


> There are lp 560s...


I was after an LP 560 for a while, but one was never available at retailers here in the UK, unless I am completely blind. MSI do an LP dual-fan RX 560 with 4GB, I think, but it's from Amazon US and really expensive. I am actually looking for an LP Radeon for my server, but I personally think the 550 is too expensive at the moment. The closest thing to an LP 560 here that I have seen is actually this, which is really cool and cute and blue :3 But yeah, a bit pricey.


----------



## lexluthermiester (Feb 25, 2019)

Vayra86 said:


> This is BS. There are many, many games without support for CF/SLI and there is no way they'll ever get it.


Total poppycock. And even if that were true, games still perform better when a second GPU is added to a system. It is possible that game devs don't need to "support" it for a game to benefit from multiple GPUs.


las said:


> Everything I wrote is correct. Maybe you should get out of your bubble. Not a single game dev cares about multi GPU support. It's waste of time and money for them. Less than 0.25% of PC gamers use multi GPU.


Alrighty then, bye bye now..


----------



## ArbitraryAffection (Feb 25, 2019)

lexluthermiester said:


> Total poppycock. And even if true, games still perform better when a second GPU is added to a system. It is possible that game devs don't need to "support" it for a game to benefit from multi-GPU's.


My experience with Crossfire recently, when I got to try out two Vegas, was really bad. I wanted to run Fallout 4 at 4K VSR, and the cards could technically do it, but there was artefacting everywhere, stutter, and half the scene wasn't rendering. The colours also went all weird in Warframe when Crossfire was forced on; otherwise the game just doesn't support it. Would've been nice to run 4K VSR at 144Hz in Warframe though


----------



## Vayra86 (Feb 25, 2019)

lexluthermiester said:


> Total poppycock. And even if true, games still perform better when a second GPU is added to a system. It is possible that game devs don't need to "support" it for a game to benefit from multi-GPU's.



LOL. Yep, you really do live in an alternate lala-land then. Show some examples, I guess? Until then you're full of shit. Higher FPS =/= better performance; it's just higher FPS. When that gets accompanied by horrible stability and frame pacing, the end result is an unplayable game.

I speak from experience. I play Elder Scrolls Online, and since beta it was one of those games that lots of people tried to get SLI working on. I spent *several months* trying to get it to work _somehow_, but no frame delivery method would result in a playable game. Even when the second GPU didn't add performance, it did introduce latency and heaps of stutter. You need a working profile plus engine support or it simply will not fly, and a vast number of games simply do not support it. You can try dozens of AFR/SFR methods; none will be desirable.


----------



## las (Feb 25, 2019)

I tried Sapphire Fury Nitro CF back in 2016, and the experience was horrible overall. It's been years since I've tried SLI. It was 970 SLI, and I clearly remember all those issues in Far Cry 4 with shadow bugs and mediocre scaling. There are literally tons of posts on Steam from multi GPU users complaining on every new game release. I wonder why?

When was the last Nvidia or AMD card with two GPUs on one board?  Both have been slowly moving away from multi GPU for years now.

Why would they spend millions and millions on optimizing drivers for 0.25% of PC gamers when they could use the time/money to improve the experience for the other 99.75% instead?


----------



## Kissamies (Feb 25, 2019)

I also had 970 SLI and no problems there, even though my CPU was a cheap Pentium G4560 back then. I saw a great improvement in framerates and had none of the stuttering or other problems many people talk about.



las said:


> When was the last Nvidia or AMD card with two GPUs on one board?


IMO Radeon Pro Duo isn't THAT old..


----------



## las (Feb 25, 2019)

Chloe Price said:


> I also had 970 SLI and no problems there, even though my CPU was a cheap Pentium G4560 back then. I saw a great improvement in framerates and had none of the stuttering or other problems many people talk about.
> 
> 
> IMO Radeon Pro Duo isn't THAT old..



It's 3 years old, and it was not a gaming card.


----------



## Kissamies (Feb 25, 2019)

las said:


> It's 3 years old and it was not a gaming card


IMO it was for gaming *and* professional use.


----------



## las (Feb 25, 2019)

Chloe Price said:


> IMO it was for gaming *and* professional use.



Yeah, obviously it could be used for both. AMD says nothing about gaming on the product page, tho. The card did not sell well.

https://www.amd.com/en/products/professional-graphics/radeon-pro-duo-polaris

When was the last gaming-focused multi GPU card? 

295X2? Almost 5 years ago.


----------



## cucker tarlson (Feb 25, 2019)

trog100 said:


> i think this thread has somewhat lost the plot.


no wonder, when you've got the terribly confused joining the discussion.


----------



## Kissamies (Feb 25, 2019)

las said:


> Yeah obviously it could be used for both. AMD says nothing about gaming on the product page tho. The card did not sell well.
> 
> https://www.amd.com/en/products/professional-graphics/radeon-pro-duo-polaris
> 
> ...


Oh, I meant the liquid-cooled 2016 dual-Fiji card, not the one you linked.


----------



## cucker tarlson (Feb 25, 2019)

Chloe Price said:


> Oh, I mean the liquid cooled 2016 dual Fiji, not that what you linked.


I think you're both thinking about the same thing and got things confused, because the Radeon Pro Duo now has a successor with the same name. I think it's dual Polaris with 32 gigs.
The original Pro Duo was a flop x2.


----------



## notb (Feb 25, 2019)

RealNeil said:


> I'm already heavily invested in SLI & Crossfire. My GPUs are already in sets or pairs. Wherever /whenever it works in a game, I'm all for it. If it is not enabled, I'll get over it. I still have decent performance when just one card works in a game. (Vega-64, 1070Ti, 1080Ti)
> My original comment about Crossfire was just pointing out that AMD is still supporting dual card use on lower-priced GPUs, making it a lesser investment, should we choose to go down that path.
> 
> NVIDIA is gimping SLI on low (and now mid-range cards) ensuring that SLI costs are that much higher. I see it as a money-grab.


Honestly, I don't really get the reasoning here.
Why exactly are you into SLI/Crossfire? Is it because you like even numbers? Is it because it adds +3 to how awesome the PC is?

You say you're getting decent performance from just one card. That you'll get over it if multi GPUs aren't supported by a game.

I assume you know that in the average case, going SLI/Crossfire doesn't give enough performance gain to make it better value than a more powerful card (when one exists).
And that's before you start thinking about cooling, noise and power draw...

Nvidia is not "gimping" anything. It's just that SLI doesn't make much sense for GPUs below 2080.
And it's not free - it increases the cost of manufacturing. So why would all the 2060 or 2070 buyers have to pay for something almost none of them would use?
Moreover, SLI circuitry costs more or less the same on every card, so it's relatively more expensive with the cheaper cards.

Anyway, SLI and Crossfire are becoming obsolete. APIs are going to support multiple GPUs without additional hardware. It will be more efficient as well.


lexluthermiester said:


> Incorrect. It's an upgrade and even custom build setup we do 3 to 4 times a month. It's not something most people do, but enough people do it that it justifies continued support and development.


So give us a number. How many (in %) of your clients upgrade to multi GPU setups (not including those with flagship cards, where multiple GPUs make sense).


----------



## bogmali (Feb 25, 2019)

trog100 said:


> i think this thread has somewhat lost the plot.. he he..
> 
> trog



I have to agree with you due to:

1. Namecalling
2. Bickering back and forth 
3. Baiting

Closing shop


----------

