# Vega 56 Pulse or RTX 2060 Founders Edition/Zotac Twin?



## floop (Jan 17, 2019)

A simple question for you but very difficult for me...

I have an AOC C24G1 (1080p/144Hz with FreeSync), a Ryzen 2600, etc...

So what should I buy?
I play RPGs and non-competitive FPS... and I don't change my PC with every new piece of hardware that comes out lol


----------



## kurosagi01 (Jan 17, 2019)

On raw paper performance, the choice would be the RTX 2060; I'm not sure whether Nvidia is including BFV with the 2060 or not.
If you're thinking of going to a higher resolution, the extra 2GB on the Vega 56 will come in handy, and if you're interested in The Division 2, the Resident Evil 2 remake, and Devil May Cry 5, which are on promotion with AMD GPUs, then the Vega 56 would be a good deal.
If you can get the Vega 56 for £50 (~$60) less than the RTX 2060, my money is on the Vega 56; if you want to be more "future-proof" in features, the RTX 2060.


----------



## gamerman (Jan 17, 2019)

Well, it's easy.

The points are:

1. It's a lot faster.
2. It has much better efficiency.
3. It holds its value much better, meaning if you want to sell it after a few years and buy a new GPU, that's easy.
4. Better drivers (Nvidia).

And that GPU is the RTX 2060 FE, or any AIB version of it.

You can't go wrong.


----------



## ArbitraryAffection (Jan 17, 2019)

Tuned and overclocked, the Vega 56 is just as fast as the 2060, maybe even faster. It has 2GB of extra VRAM, and having personally owned the Sapphire Pulse, I can say it is a fantastic card that runs cool and quiet too.

If they are the same price, I'd still pick the Vega 56; the 2060 is apparently a stuttery mess in BF5 with RTX due to lack of VRAM, and going forward you'll want at least 8GB IMO. If the 56 is cheaper, then 100% get the 56. But I wouldn't pay _more_ for the 56.

Yes, power use is significantly lower on the 2060, but as I said, the Pulse has an excellent cooler, so it's not going to run hot or make a lot of noise if your case has good airflow.

And lastly, please don't listen to the "gamerman" shill above; everything he posts is an attempt to smear Radeon or Ryzen. I'm fairly certain he's either a paid shill or a massive fanboy; the 2060 is not "a lot faster". And AMD has had better drivers than Nvidia for a while now. Stability-wise they are equally good, but AMD has a lot more features: built-in performance monitoring, an overlay, a great overclocking tool, etc.



kurosagi01 said:


> Raw performance on paper the choice would be the RTX 2060, not sure if Nvidia is including BFV or not with 2060.
> If you're thinking of going higher resolution the extra 2GB from the Vega 56 will come in handy and if you're interested in Division 2, Resident Evil 2 remake and devil may cry 5 that is on promotion with AMD GPUs then the Vega 56 would be a good deal.
> If you can get the Vega56 for £50($60) less than the RTX2060 then my money is on Vega56, if you want to be more "futureproof" in features then RTX2060.


The 2060 is hardly future-proof with 6GB of VRAM; with RTX enabled in BF5, it is already running into issues.


----------



## xkm1948 (Jan 17, 2019)

RTX 2060, hands down. Faster, cooler, less power-hungry, and now with Adaptive Sync support. Just read W1zzard's review!


----------



## ArbitraryAffection (Jan 17, 2019)

xkm1948 said:


> RTX2060 hands down. Faster, cooler, less power hungry and now Adaptive Sync support. Just read W1zzard’s review!


It really isn't faster. Maybe 5%, and that's against a reference, throttling 56 (which runs at significantly lower clock speeds). An AIB 56 with an increased power limit will trade blows with the 1070 Ti (and thus the 2060) and even tie the 1080 in many games, and you're getting more VRAM. RTX is literally a joke this generation in all honesty, and the 2060 doesn't have the power -or- the VRAM to really handle it. It really comes down to price, but the 56 is still a very valid option going forward.


----------



## kurosagi01 (Jan 17, 2019)

ArbitraryAffection said:


> 2060 is hardly future proof with the 6GB of vram; with RTX enabled in BF5 it is already running into issues.


I put "future-proof" in quotes purely because it has those fancy new features like RTX, lol, but my vote is also on the Vega 56 if it's cheaper, per what I stated in my previous post.
You might even find a Vega 64 for the same price as a Vega 56 in some places; at least in the UK, the Asus Strix 56 costs the same as the 64 on Scan UK.


----------



## xkm1948 (Jan 17, 2019)

ArbitraryAffection said:


> It really isn't faster. Maybe 5%, and against a reference, throttling 56 (which runs at significantly lower clock speeds). AIB 56 with increased power limit will trade blows with 1070 Ti (and thus 2060) and even tie 1080 in many games and you're getting more VRAM. RTX is literally a joke this generation in all honestly, and the 2060 doesn't have the power -or- VRAM to really handle it. It really comes down to price but the 56 is still a very valid option going forward.



Sure, you can tune the Vega 56. But the 2060 overclocks just as well, if not better, while drawing less power.

I'd need to see the Vega 56 at around $300 or lower for it to be a worthy recommendation over the 2060.


----------



## the54thvoid (Jan 17, 2019)

The RTX 2060 is about 10% faster at 1440p but often runs neck and neck with the Vega 56.

Neither should disappoint, but the FE 2060 will be a good, solid bet with a quieter profile. The Pulse is a cheaper model of the 56, so be aware of that. Don't get the Zotac; it's probably not as quiet as the FE version (haven't seen reviews yet).

And, to counter what @ArbitraryAffection says, you can also easily overclock the FE for close to 10% extra performance. The memory will not be a deal-breaker for 1440p gaming.


----------



## NdMk2o1o (Jan 17, 2019)

Vega 56 is as low as 300 now; that would be my choice.


----------



## ne6togadno (Jan 17, 2019)

Seconding @ArbitraryAffection.
If the 56's price is 10-20 above the 2060 or less, get the 56. If the 2060 is cheaper, get the 2060.
Besides what @ArbitraryAffection already mentioned, I'd add that the 56 comes with plug-and-play FreeSync support, while FreeSync support for your monitor on the 2060 will only arrive someday via a driver update from Nvidia.


----------



## ArbitraryAffection (Jan 17, 2019)

kurosagi01 said:


> I did quote the word future proof purely because it has them fancy new 'features" like RTX for example lol, but my vote is also on Vega 56 if its cheaper by what i stated in my previous post.


No point having those fancy features if you're running into memory issues a week after launch. Trust me, I am not being a fangirl when I vouch for the Vega 56; the extra VRAM alone is worth mentioning.



the54thvoid said:


> RTX 2060 is about 10% faster at 1440p but often ties neck and neck with Vega 56.
> 
> Neither should disappoint but the FE 2060 will be a good solid bet with a quieter profile. ThePulse is a cheaper model of 56, so be aware of that. Don't get the Zotac, it's probably not as quiet as the FE version (not seen reviews yet).
> 
> And, to counter what @ArbitraryAffection says, you can also easily overclock the FE for close to 10% extra performance. The memory at 1440p gaming will not be a deal breaker.


Honestly, you will not notice the difference in FPS between these cards in most games, but you _will_ notice that VRAM limit being slammed into, hard. Also, Vega can use HBCC, which can massively improve frametimes when running over 8GB of VRAM. I've owned the Pulse and it's by no means a 'cheap' card. The cooler and power delivery are fantastic (the board uses the same high-quality components as the AMD Vega reference design). I even flashed a NITRO+ BIOS on mine when I had it and increased the power limit to 295W, and with that got 1000MHz HBM and a 1600MHz core. I think at that point it would be winning raw FPS versus a 2GHz 2060.


----------



## Final_Fighter (Jan 17, 2019)

If you know how to squeeze the extra performance out of the Vega, it can match the RTX 2060, and it's also a good combination given your current monitor. You should have a good 600W PSU before considering the Vega if you plan to overclock it. If you are just looking at raw out-of-the-box performance and don't care for RTX or FreeSync, there is the option of getting a used 1080, which can be had for cheaper on some auction sites. The RTX 2060 is a good card, but I would not buy it for RTX (too young). That said, the RTX 2060 is faster out of the box, but it does not guarantee FreeSync support. The Vega is a good match and lets you use the features you bought into with your current monitor.


----------



## ShieldHead (Jan 17, 2019)

I'd pick the Vega 56 if it's the same price or cheaper than the RTX 2060.
I'd also argue that the Vega architecture is more future-proof, and AMD's driver features and support beat Nvidia's (I had a GTX 970 before the Vega). The Sapphire Pulse is also very well made; Sapphire is by far my favorite OEM.
An overclocked Vega 56 comes close to an RTX 2070 in some games, and the power draw is not unreasonable. I've seen an article where an undervolted Vega beats a GTX 1080 in perf/W.


----------



## xkm1948 (Jan 17, 2019)

I will just leave W1zzard's words here. I trust a seasoned GPU reviewer who has worked with both Vega and the 2060 a bit more than general forum member opinions.


----------



## ShieldHead (Jan 17, 2019)

xkm1948 said:


> I will just leave W1zzard’s words here. I trust a seasoned GPU reviewer who worked on both Vega and 2060 a bit more than general forum member ideas.
> 
> View attachment 114677


In all fairness, TPU only reviewed a reference Vega 56, which throttles at 1300MHz.
Plus, in my country I can get a Sapphire Pulse for under €400, while the RTX is still inflated at a little over 400 for the cheaper models.


----------



## floop (Jan 17, 2019)

In my country the Vega 56 Pulse costs €357 including shipping, with the three-game bundle, while the 2060 costs €399 with BFV or Anthem...

I'm very undecided!


----------



## eidairaman1 (Jan 17, 2019)

I'd say the V56, from an AIB design standpoint.


----------



## cucker tarlson (Jan 17, 2019)

Neither of these cards is perfect, but I'd definitely choose the RTX 2060 for 1080p, and just as definitely go with the Vega 56 for 1440p.
Nvidia's own/Zotac twin coolers on a 180W card will perform better than, or at least as well as, Sapphire's dual-fan Pulse cooler on a 250W card; though if you decide to go with the V56 Pulse, you'll get a card that really excels at keeping that 250W power hog cool and quiet.

https://www.computerbase.de/2018-03...nahme-der-grafikkarte-rise-of-the-tomb-raider


----------



## newtekie1 (Jan 17, 2019)

ArbitraryAffection said:


> It really isn't faster. Maybe 5%, and against a reference, throttling 56 (which runs at significantly lower clock speeds). AIB 56 with increased power limit will trade blows with 1070 Ti (and thus 2060) and even tie 1080 in many games and you're getting more VRAM. RTX is literally a joke this generation in all honestly, and the 2060 doesn't have the power -or- VRAM to really handle it. It really comes down to price but the 56 is still a very valid option going forward.




I find your comments pretty damn funny, or maybe ironic.  I mean, you post calling someone else a shill, then post bullshit like this.

First, the RTX 2060 is not "maybe 5%" faster.  It's 11% faster at 1080p, which is what the OP is using.

Second, this idea that the reference card only looks bad because it throttles is totally bogus as well.  The reference Vega 56 doesn't throttle; in fact, it maxes out at about 75°C with the stock fan curve. The reference Vega 64 hits 85°C and throttles, but not the Vega 56.

Third, this idea that the RTX 2060 doesn't have enough VRAM to handle RTX is crazy.  I mean, the Vega 56 can't do RTX at all, so just the fact that the RTX 2060 can is a bonus.  On top of that, the RTX 2060 handles RTX just fine with RTX set to Low, again at 1080p, which the OP uses.  RTX Low on the RTX 2060 is more than playable at 1080p and definitely not a stuttery mess, as you put it.  In fact, even W1z says the RTX 2060 is entirely playable, maintaining over 60FPS at 1080p, even with RTX set to High.

Finally, the claim that 6GB of VRAM will be a limitation moving forward, with the Vega 56 having 2GB extra.  Yes, outwardly the extra 2GB would seem to be an advantage.  But you also have to consider the memory compression Nvidia uses, which has allowed them to get away with smaller memory amounts as well as narrower memory buses for a few generations now.  The extra 2GB on the Vega 56 is not as big an advantage as it seems once you consider that.  The fact is, 6GB won't be a limiting factor on the RTX 2060.
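For what it's worth, the basic idea behind that kind of compression can be sketched in a few lines. This is a toy delta-encoding example, not Nvidia's actual (proprietary) delta color compression; it only illustrates why such schemes save memory *bandwidth* rather than memory *capacity*, since the full-size buffer still has to be allocated in VRAM:

```python
# Toy sketch of delta encoding on a row of pixel values.
# Illustrative only; real GPU delta color compression is proprietary
# and works on tiles, but the bandwidth-vs-capacity point is the same.

def delta_encode(row):
    """Store the first value, then differences between neighbors."""
    if not row:
        return []
    return [row[0]] + [b - a for a, b in zip(row, row[1:])]

def delta_decode(encoded):
    """Reverse the encoding by accumulating the deltas."""
    out = []
    acc = 0
    for d in encoded:
        acc += d
        out.append(acc)
    return out

# Neighboring pixels are usually similar, so the deltas are small
# numbers that fit in fewer bits than the raw values.
row = [200, 201, 201, 203, 202, 202, 204, 205]
enc = delta_encode(row)
assert delta_decode(enc) == row

raw_bits = len(row) * 8              # 8 bits per raw value
delta_bits = 8 + (len(enc) - 1) * 3  # base value + 3-bit signed deltas
print(f"raw: {raw_bits} bits, delta-encoded: {delta_bits} bits")
```

Fewer bits cross the memory bus per frame, which is why compression compensates for a narrower bus; it does nothing about how much VRAM a game's assets occupy once decompressed targets must be addressable.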


----------



## cucker tarlson (Jan 17, 2019)

newtekie1 said:


> I find your comments pretty damn funny, or maybe ironic.  I mean, you post calling someone else a shill, then post bullshit like this.
> 
> First, the RTX 2060 is not "maybe 5%" faster.  It's 11% faster at 1080p, which is what the OP is using.
> 
> ...


The Sapphire Pulse is just 2% faster than the reference V56; margin-of-error stuff.
https://www.computerbase.de/2018-03/sapphire-radeon-rx-vega-56-pulse-test/2/



xkm1948 said:


> RTX2060 hands down. Faster, cooler, less power hungry and now Adaptive Sync support. Just read W1zzard’s review!


Given that the OP is using 1080p, this is pretty much an unquestionable choice.


----------



## ShieldHead (Jan 17, 2019)

I wouldn't buy an RTX card for ray tracing just yet. I agree that it's an awesome technology, but it's going to take at least another 5 years until GPUs have the horsepower to run it the way it should be run (not that tech demo that is BFV).
For now, the Vega 56 is the better choice IMO. Yes, it will consume somewhat more power (the Vega 56 Pulse has a TDP of 180W, btw), but it can be tuned to perform better than even an overclocked RTX 2060. The cooler on the Pulse is very quiet as well, probably even quieter than the Zotac's, while still being €50 cheaper and coming with 3 games.
Still, the reference Vega 56 is at its limits, while the Pulse can easily be pushed to a steady 1600MHz using just 215W (as reported by GPU-Z).

See the potential of Vega for yourselves:
https://www.hardwareluxx.de/index.p...vega-56-und-vega-64-im-undervolting-test.html
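As a crude back-of-the-envelope check on those figures (clocks per watt is only a rough proxy for performance per watt, and both numbers are this thread's estimates, not measured benchmarks):

```python
# Rough clocks-per-watt comparison using figures quoted in this thread:
# a reference V56 throttling around 1300 MHz at roughly 220 W, versus a
# tuned Pulse holding ~1600 MHz at ~215 W (GPU-Z reading). Illustrative
# estimates only; clock speed is not a direct performance measure.

ref_clock_mhz, ref_power_w = 1300, 220      # reference card, approximate
tuned_clock_mhz, tuned_power_w = 1600, 215  # undervolted/tuned Pulse

ref_eff = ref_clock_mhz / ref_power_w
tuned_eff = tuned_clock_mhz / tuned_power_w
print(f"reference: {ref_eff:.2f} MHz/W, tuned: {tuned_eff:.2f} MHz/W")

# ~23% higher clocks at essentially the same board power, which is why
# undervolting changes Vega's efficiency picture so dramatically.
clock_gain = tuned_clock_mhz / ref_clock_mhz - 1
print(f"clock gain at similar power: {clock_gain:.0%}")
```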


----------



## cucker tarlson (Jan 17, 2019)

ShieldHead said:


> I wouldn't buy a RTX for raytracing just yet. I agree that it's an awesome technology but its going to take atleast another 5 years until GPU's have the horsepower to run it the way it should. (Not that tech demo that is BFV).
> For now the vega 56 is the better choice IMO. Yes it will consume slightly more power (the Vega 56 Pulse has a tdp of 180w btw) but it can me tuned to perform better than even an overclocked rtx 2060. The cooler on the Pulse is very silent aswell, probably even quieter than the Zotac, while still being 50€ cheaper and coming with 3 games.
> Still the reference vega 56 is at its limits, while the pulse can be easily pushed to a steady 1600Mhz, using just 215W. (reported by gpu-z)
> 
> ...


The V56 Pulse has a TDP of 210W and draws 250W on the standard BIOS; see the computerbase.de link I posted a few posts above. I don't know where you're getting this 180W TDP for a Vega 56 from. The lowest power consumption you can find on a Vega 56 card is 220W, and that's on a Nitro card running the power-save BIOS.


----------



## Divide Overflow (Jan 17, 2019)

floop said:


> in my country the vega 56 pulse costs 357 eur including shipping and three games of bundle instead the 2060 costs 399 with BFV or Anthem...
> 
> i'm very undecided!


Both are very nice GPUs, and the differences between them seem negligible. I'd go with whichever is the better deal in your area.


----------



## Joss (Jan 17, 2019)

Both are good choices, but I'd go with the Vega for the exotic factor.


----------



## ShieldHead (Jan 17, 2019)

cucker tarlson said:


> v56 pulse has the tdp of 210w and draws 250w on standard bios.see the computerbase.de link I posted a few posts above.I don't know where you're getting this 180w tdp on a vega 56 from.the lowest power consumption on a vega 56 card you can find is 220w and that's on a nitro card running a power save bios.


Well, sometimes TDP doesn't reflect actual power consumption; don't forget the i9-9900K has a TDP of 95W.
The Sapphire Pulse has a 180W TDP BIOS and a low-power 165W one (same as reference). Actual power usage is around 50W more, though.
Well, it's up to the OP: one has higher power consumption but also more performance potential, 2GB more RAM, and a much better cooler and looks. The other has better performance out of the box, plus RTX and DLSS...


----------



## tvamos (Jan 17, 2019)

newtekie1 said:


> Finally, the issue of 6GB of VRAM being an issue moving forward, and the Vega56 having 2GB extra.  Yes, outwardly the extra 2GB would seem to be an advantage.  But then you also have to consider nVidia's memory compression that they use, that has allowed them to get away with lower memory amounts as well as lower memory bus widths for a few generations now.  The extra 2GB on the Vega56 is not as big of an advantage as it seems when you consider that.  The fact is the 6GB won't be a limiting factor on the RTX 2060.




compression, you say?


----------



## Fluffmeister (Jan 17, 2019)

Well, the 6GB 1060 there does have higher minimums than the 8GB competition; clearly 3GB isn't enough in that case, but he wasn't talking about 3GB, so meh.


----------



## Durvelle27 (Jan 17, 2019)

If the choice were Vega 56 or RTX 2070, I'd suggest the 2070, but given that it's about to undercut the RTX 2060, my vote goes to the Vega 56.


----------



## tvamos (Jan 17, 2019)

Fluffmeister said:


> Well, the 6GB 1060 there does have higher minimums than the 8GB competition, clearly 3Gb isn't enough in that case but he wasn't talking about 3GB so meh.


And what happens one day when a game uses and needs 6.5-7.5GB of VRAM? We were discussing future needs. Personally, I think the 2060 is too strong a chip for "only" 6GB, and if they do announce versions with 5, 4, or 3GB, it will be funny for sure.


----------



## kastriot (Jan 17, 2019)

Neither; I would buy a used 1080 Ti.


----------



## cucker tarlson (Jan 17, 2019)

kastriot said:


> Neither, i would buy used 1080 Ti


Really? For a Ryzen 2600 system at 1080p/144Hz?


----------



## kurosagi01 (Jan 17, 2019)

If you can find a 1080 Ti for €400, then by all means get the 1080 Ti.
Seeing as there is almost a €40 difference and you're interested in the 3 games, get the Vega 56. It will still last you a good 3 years or so.


----------



## notb (Jan 17, 2019)

tvamos said:


> And what happens one day when a game uses and needs 6.5 - 7.5GB vram?


LOL. And what if the next day a game needs 8.5GB? Or forces you to use RTRT? Or games become DLSS-optimized and Nvidia gets a 20% boost?

Generally speaking, Nvidia cards have less RAM, but they tend to use it a lot better. It's the same story as with raw compute power.
If you sell 10 cards, it's cheaper to add more cores/RAM and save on optimization and drivers.
If you sell 100 cards, it's cheaper to optimize and use less hardware.


----------



## bug (Jan 17, 2019)

I voted RTX, but I wouldn't get the Zotac. According to TPU's review, the fans on that thing are loud. Almost as loud as the fans on Vega56/64.
I'm looking to get one myself, but I still haven't been able to pick a manufacturer/model.


----------



## qubit (Jan 17, 2019)

NVIDIA. Easy choice.


----------



## moproblems99 (Jan 18, 2019)

Whichever has the performance you need at the price you want.  Forget about whose name is on the box.


----------



## Outback Bronze (Jan 18, 2019)

Voted RTX 2060.


----------



## ShurikN (Jan 18, 2019)

I wanted to vote for the 2060 until I saw just how good the V56 can be when undervolted.


ShieldHead said:


> See for yourselves the potential of the vega:
> https://www.hardwareluxx.de/index.p...vega-56-und-vega-64-im-undervolting-test.html



And that power draw is not bad at all.

If you are willing to dabble in undervolting, it's a no-brainer, especially when it's almost €50 cheaper and you get 3 games.


----------



## Kissamies (Jan 18, 2019)

Even though I hate Nvidia, I'd still go for RTX 2060.


----------



## flmatter (Jan 18, 2019)

If it were me, Vega. imho


----------



## biffzinker (Jan 18, 2019)

Tough choice; originally the RTX 2060 was my pick, but the Vega 56 under stock voltage is worth considering. Whichever offers the best price-to-performance.


----------



## chinmi (Jan 18, 2019)

If I'm not mistaken, RX Vega comes with 3 bonus games:

*Resident Evil™ 2
Devil May Cry™ 5
Tom Clancy’s The Division® 2*

So those should be a point of consideration too when choosing between an RTX 2060 and a Vega 56. Those 3 games combined probably cost around US$150 or more, so if the Vega is also around $30-$50 cheaper, then all combined it's almost a $200 price advantage to buy the Vega.
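Putting rough numbers on that comparison (the figures are this thread's estimates, not official pricing):

```python
# Back-of-the-envelope value comparison using the thread's rough numbers.
# Illustrative estimates only: game bundle value and price gap vary by
# region and retailer.

bundle_value = 150  # approx. combined value of RE2 + DMC5 + Division 2, USD
price_gap = 50      # Vega 56 assumed $30-$50 cheaper; take the high end

total_advantage = bundle_value + price_gap
print(f"Effective price advantage for the Vega 56: ~${total_advantage}")
```

With a $50 discount plus roughly $150 in games, the Vega comes out about $200 ahead, assuming the buyer actually wants all three bundled titles.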


----------



## biffzinker (Jan 18, 2019)

chinmi said:


> if i'm not mistaken, rx vega comes with 3 bonus games,
> 
> *Resident Evil™ 2
> Devil May Cry™ 5
> ...


Buying the RTX 2060 gets you either Anthem or Battlefield V, your pick.


----------



## ne6togadno (Jan 18, 2019)

But with the Vega $50 cheaper than the 2060, you can have all 3 games and still have $50 left for Anthem or BF5, your pick.
So for 350 you can have 4 games and a new card, or 1 game and a new card.
I'd trade the RTX gimmick for 3 more games, but it's still up to the OP to decide what he'd like.


----------



## Vya Domus (Jan 18, 2019)

The 2060 will run out of memory; that's not a prediction but a reality:

[embedded video]

Go to 8:45. It turns out that when you switch on the beloved RTX feature (one of the biggest selling points of this card), you get horrendous stutter because of insufficient memory. All that at 1080p, mind you.

I find it downright ridiculous and bizarre for a card with this level of performance to ship with just 6GB when it can already hit its memory cap. I wouldn't touch it; do yourself a favor and get a Vega 64/56 or a GTX 1080.


----------



## bug (Jan 18, 2019)

Vya Domus said:


> The 2060 will run out of memory, that's not a prediction but a reality :
> 
> 
> 
> ...


You must have included the wrong video. Instead of "horrendous stutter because of insufficient memory", this one includes a slight dip towards 40FPS and nothing about memory usage.


----------



## notb (Jan 18, 2019)

Vya Domus said:


> I find it downright ridiculous and bizarre for a card with this level of performance to ship with just 6 GB when it can already hit it's memory cap. I wouldn't touch it, do yourself a favor and get a Vega 64/56 or GTX 1080.


You're seriously suggesting buying a 1080 instead of a 2060 (similar performance, way more expensive) just because you're prejudiced against RTX, which can be switched off?


----------



## Vayra86 (Jan 18, 2019)

tvamos said:


> And what happens one day when a game uses and needs 6.5 - 7.5GB vram? And we were discussing future needs. Personally, I think 2060 is too strong chip for "only" 6GB, and if they do announce versions with 5, 4 or 3GB it will be funnyfor sure.



This. It is far too easily overlooked. If you intend to keep your GPU for more than a year or two, this should be vital in your purchase decision.

Low-VRAM cards fall off _fast_. It happens every time. We also still see the 7970 as pretty relevant (much more than one would expect given its age) due to its 3GB, which was considered a lot at its time of release. Similarly, 3GB 780 (Ti)s also survived far longer than the rest of the Kepler/refresh stack.

If you're talking about value for money, the Vega 56, even at an equal price to the 2060, is easily the safer choice. Yes, it guzzles more power. But what you're looking at is a card with consistency over longer periods of time, versus a card that is somewhat faster today (a barely noticeable FPS win) and will lose its consistency in two years' time. A 192-bit bus + 6GB of VRAM will bite you in the ass. Both these cards are very capable of 1080p/ultra or 1440p/high. There was a good reason the Pascal lineup shipped 8GB on its 1070-and-up products, and that reason hasn't changed; RTX 2xxx doesn't improve on existing compression methods. A GTX 1070 Ti or 1080 will be that much more relevant in two years' time, and an RTX 2060 will not be the card to pick up on the second-hand market compared to all of those options.

Keep in mind that we're not seeing major performance wins per generation anymore. That means you're likely to hold on to your GPU for a longer time, and that means 8GB will come in handy; it already does today, and it surely will in the future. Nvidia's delta compression has nothing to do with that: it only helps bandwidth. Even there you see edge cases where the compression is insufficient and consistency suffers. Edge cases, but I'd buy a GPU for good performance _everywhere_. That is the same logic I apply to a gaming CPU, and the reason I'd advise an Intel CPU for that purpose until Ryzen pops up with a 4.5GHz boost. If you spend this kind of money, performance should simply be optimal in every situation.

RTX and DLSS are completely irrelevant. Don't buy into tech that has no content to show for it. It was never a good idea, not with AMD, not with Nvidia.

Then the price comparison: a Vega 56 at 300 bucks is a no-brainer versus the 2060, IMO. At equal price, it all depends on your personal view of my story above. I'll just conclude by saying: been there, done that; all I'm saying is from experience.



biffzinker said:


> Buying the RTX 2060 gets you either Anthem or Battlefield V your pick.



Two EA-published, MTX-infested cesspools... I wouldn't consider that a bonus; those games will be in the budget bin within a year. And you'll be buying DLC to stay relevant in them regardless, so they aren't free at all. Those titles up against The Division 2, RE2, and DMC5... it's not even a contest. You're looking at two subscriptions versus 3 real games.


----------



## bug (Jan 18, 2019)

notb said:


> You're seriously suggesting buying 1080 instead of 2060 (similar performance, way more expensive) just because you're prejudiced against RTX, which can be switched off?


Every single time someone comes here asking "should I buy A or B?" there's someone assuming they didn't do their homework and starts suggesting C, D, E, etc.
How about we don't assume everyone (else but us) is clueless and, I don't know, ask first if they have considered or are willing to consider something besides A and B?


----------



## floop (Jan 18, 2019)

Vya Domus said:


> The 2060 will run out of memory, that's not a prediction but a reality :
> 
> 
> 
> ...




Very, very interesting video!

But I have a Ryzen 2600, not an i7...

P.S. The EVGA RTX 2060 XC Gaming, what's it like for °C and dB? XD


----------



## ne6togadno (Jan 18, 2019)

https://www.techpowerup.com/reviews/EVGA/GeForce_RTX_2060_XC_Ultra/36.html


----------



## floop (Jan 18, 2019)

ne6togadno said:


> https://www.techpowerup.com/reviews/EVGA/GeForce_RTX_2060_XC_Ultra/36.html


Ehehehe, but this is the Ultra version, not the XC Gaming...


----------



## ne6togadno (Jan 18, 2019)

With only one fan, you can bet it will be worse, despite the lower clocks on the Gaming.


----------



## os2wiz (Jan 18, 2019)

gamerman said:


> well its easy.
> 
> points are:
> 
> ...


In fact, Radeon drivers are every bit as good as Nvidia drivers. Also, Nvidia did a poor implementation of FreeSync support in their latest driver release. I have a 4K Samsung FreeSync monitor with an MSI Gaming X GTX 1080 Ti. Enabling G-Sync support for my monitor caused obvious issues that were not there previously, so I had to disable adaptive sync support; Nvidia botched it.

I had an RX Vega 56 before my Nvidia card. My FreeSync monitor worked quite well with that card. I only ditched it because I got a great price on the 1080 Ti and wanted the higher refresh rates. As soon as AMD introduces a high-end card I will switch back; I may very well buy the Radeon VII when it becomes available on February 9th.

The RX Vega 64 from Sapphire can be purchased for just under $399 now. It is a great buy, with more memory than the RTX 2060. There are no games other than Battlefield V supporting ray tracing, and that is a poor implementation with poor performance. Screw ray tracing and get the Sapphire Vega 64 on Newegg; it is a far better value. DLSS is a fake technology from Nvidia that supposedly gives you almost-4K resolution with higher frame rates, but it is not using native resolution, which leads to edge artifacts and distortions.


----------



## ShieldHead (Jan 18, 2019)

floop said:


> very very interesting this video!
> 
> but i have a ryzen 2600 not a i7...
> 
> p.s. the Evga rtx2060 xc gaming, what's it like °C and db? XD


There's no difference between an i7-8700K and your Ryzen 5 2600 at those framerates; DXR gaming is limited by the graphics card.
Are you really considering a single-fan EVGA card versus the Pulse?


----------



## bug (Jan 18, 2019)

ne6togadno said:


> with only one fan you can bet it will be worse despite lower clocks on gaming


Not really, you can't.
You could bet that Palit, using 4 heat pipes instead of 3, would run cooler than EVGA. It doesn't.


----------



## ne6togadno (Jan 18, 2019)

> Under an ambient temperature of 20°C, the RTX 2060 XC Gaming OC 6GB stays at about 30°C at idle, with the fan spinning slowly and practically inaudible. However, at full load it reaches 67°C and the fan speed climbs considerably; at around 40dB it becomes somewhat annoying. The same happens under OC, which keeps the same noise level but raises temperatures to 71°C.

(translated from the Spanish original)

https://elchapuzasinformatico.com/2019/01/evga-geforce-rtx-2060-xc-gaming-review/#equipo-de-pruebas
(right under the thermal images)

W1zzard's review of the XC Ultra reports 35dB at 70°C.


----------



## floop (Jan 18, 2019)

Wow... I found a Gigabyte Vega 64 Gaming OC at €399... same price as the 2060... what do I do? Ahah


----------



## ne6togadno (Jan 18, 2019)

Nice price; not so sure about the quality of Gigabyte's Vega 64.


----------



## floop (Jan 18, 2019)

ne6togadno said:


> nice price not so sure about quality of gigabyte's vega 64


Why?
For the BIOS? I read about a firmware update...


----------



## TheoneandonlyMrK (Jan 18, 2019)

newtekie1 said:


> I find your comments pretty damn funny, or maybe ironic.  I mean, you post calling someone else a shill, then post bullshit like this.
> 
> First, the RTX 2060 is not "maybe 5%" faster.  It's 11% faster at 1080p, which is what the OP is using.
> 
> ...


6GB is already looking breachable today, plus Nvidia sells the 2060... Get the Vega; teach Huang how not to rip off consumers via share price.
Win next generation, when they bother to compete on price.


----------



## ne6togadno (Jan 18, 2019)

floop said:


> why?
> for the bios? i read about a update of the firmware...


Nope.
It's like Gigabyte didn't put much effort into their Vega cards. Both their 56 and 64 are average or below, and it's like they are available only so that GB can say "hey, we have Vega too", or rather "we didn't want to, but we have a contract with AMD and had to buy some Vega chips". Most user reviews of GB's Vega cards are around average.

Edit: don't BIOS-flash a Vega unless you are 100% sure of what you are doing.


----------



## cucker tarlson (Jan 18, 2019)

ShieldHead said:


> There's no difference between an i7 8700k and your Ryzen 5 2600 at those framerates. DXR gaming is limited by the Graphics card.
> Are you really considering a single fan Evga card versus the Pulse?


There will be at 1080p/144Hz.


----------



## ShieldHead (Jan 18, 2019)

cucker tarlson said:


> there will be at 1080p 144hz


Good luck with raytracing at those framerates.

The cooler on the Sapphire Pulse is definitely better than the Gigabyte's; however, you should still be able to undervolt it a good bit for some extra performance.
My choice would be the Vega 64 at that price.


----------



## cucker tarlson (Jan 18, 2019)

ShieldHead said:


> Good luck with raytracing at those framerates.


What are you talking about? Where did you get the idea that the OP was referring to the Ryzen 2600 bottlenecking his DXR performance?


----------



## ShieldHead (Jan 18, 2019)

cucker tarlson said:


> what are talking about ? where did you get the idea the op was referring to ryzen 2600 bottlenecking his DXR performance?


From his response to the video. I may have got it wrong, but it doesn't matter anyway...


----------



## cucker tarlson (Jan 18, 2019)

The Vega 64 is about 10% faster than the 56, which you will sometimes notice, but it comes with a crap Gigabyte cooler on a more power-hungry card. A quality Sapphire Vega 56 is a better choice than the crappiest Vega 64 from Gigabyte. When it comes to Radeons, go with Sapphire or just don't. For 1080p, the RTX 2060 is a better choice than both Vega cards, as has already been said.


----------



## floop (Jan 18, 2019)

cucker tarlson said:


> For 1080p,rtx 2060 is a better choice than both,and it's already been said.


I'm confused!!!!


----------



## cucker tarlson (Jan 18, 2019)

floop said:


> i0m confused !!!!


Just don't get the Gigabyte V64 and you'll be fine.  What games do you play?


----------



## gamerman (Jan 18, 2019)

its real life fact that all polaris and vega gpus from nvidia are lausy junks, evry1 can see it review and test.


exmaple how is possible that 175W nvidia gpu beat over 310W amd gpu??


think!!! bcoz amd gpus are stone age old junkies,,they are slow have terrible efficiency and value is zero.

future is and stay nvidia and intels gpus.

rtx 2060 is many times better than any polaris or vega gpu, and tests show it clear. so i cant understand that any1 EVEN CAN recomended thouse.


so,if you play FHD or WHQL games buy rtx 2060 or if you play WHQL games with big monitor 29 or bigger buy rtx 2070, and buying rtx 2080 you get more frames.

buying rtx 2080 ti, you can play 4K games even 34" monitors,and you can do it 750w psu.


----------



## cucker tarlson (Jan 18, 2019)

gamerman said:


> its real life fact that all polaris and vega gpus from nvidia are lausy junks, evry1 can see it review and test.
> 
> 
> exmaple how is possible that 175W nvidia gpu beat over 310W amd gpu??
> ...


lol, I absolutely adore your comments, though I'll ask anyone, especially the OP, not to take a word of this seriously and to read it for entertainment purposes only.

We're trying to help the OP here, and he wants to spend his money wisely. The only bad choice here that I see is the V64 from Gigabyte. It's going to match RTX 2060 performance at 1080p, with a shitty cooler and a humongous power draw. The V56 Pulse is a quality card, not much slower than the 2060, and is still a good choice if the OP decides to buy it.


----------



## ShieldHead (Jan 18, 2019)

gamerman said:


> its real life fact that all polaris and vega gpus from nvidia are lausy junks, evry1 can see it review and test.
> 
> 
> exmaple how is possible that 175W nvidia gpu beat over 310W amd gpu??
> ...


wow

Well, I agree with *cucker tarlson*. If you play games like Fortnite, PUBG, GTA 5, then grab the RTX 2060. Otherwise get the Vega 56. I bet the Vega will also age better due to the extra VRAM and brute performance.


----------



## cucker tarlson (Jan 18, 2019)

ShieldHead said:


> wow
> 
> Well, I agree with *cucker tarlson*. If you play games like Fortnite, PUBG, GTA 5, then grab the RTX 2060. Otherwise Vega 56 ftw!


The computerbase.de RTX 2060 review has a section devoted to high-refresh-rate gaming:

https://www.computerbase.de/2019-01...itt_esportspiele_fortnite_und_co_im_benchmark

In those games the 2060 can be a lot faster than the Vega 56. Nvidia cards in general tend to use CPU resources better; that's why they usually outperform Radeons at 1080p.

2600 + 2060 would be a nice combination.


----------



## ShieldHead (Jan 18, 2019)

cucker tarlson said:


> computerbase.de rtx 2060 review has a section devoted to high refresh rate gaming
> 
> https://www.computerbase.de/2019-01...itt_esportspiele_fortnite_und_co_im_benchmark
> 
> in those games 2060 can be a lot faster than Vega 56.


E-sports games have always favored Nvidia, probably because the Asian and laptop markets are dominated by Nvidia.


----------



## cucker tarlson (Jan 18, 2019)

ShieldHead said:


> E-sport games always favored Nvidia, probably because the Asian market is dominated by Nvidia


It's not that they favor Nvidia; Nvidia's driver just has less CPU overhead. Take any AMD vs Nvidia comparison: at 1080p the AMD card usually loses by a larger margin and then picks up at 1440p; the best-case scenario for AMD cards is 4K. Not just because of their memory bandwidth, like many think, but because at 4K the CPU overhead is taken out of the equation entirely.


----------



## notb (Jan 18, 2019)

ShieldHead said:


> I bet the vega will also age better due to extra Vram and brute performance.


Or worse due to lack of technologies that Nvidia already implemented and AMD admitted to be working on.


floop said:


> wow... i find a Gigabyte Vega 64 Gaming OC at 399 eur... same price of the 2060... what  i do ? ahah


400 EUR including taxes is a very low price. I, at least, would be concerned.


----------



## ShieldHead (Jan 18, 2019)

cucker tarlson said:


> not favored nvidia,but their driver just have less cpu overhead.take any amd vs nvidia comparison,at 1080p amd card usually loses by a larger margin and then picks up at 1440p,best case scenario for amd cards is 4k.not jsut because of their memory bandwidth like many think,but because at 4K the cpu overhead is out of the equsion totally.


Yes, that is true; however, some CPU-heavy games (like Kingdom Come: Deliverance) perform better on Radeon.



notb said:


> Or worse due to lack of technologies that Nvidia already implemented and AMD admitted to be working on.



I doubt raytracing is a "future-proofing" technology. DLSS basically reduces quality... might as well tinker with in-game settings.

My conclusion: if you play games that heavily favor Nvidia, then get the RTX 2060. If you want a good all-around GPU with lots of potential, then get the Vega 56.


----------



## cucker tarlson (Jan 18, 2019)

Brute force has nothing to do with aging; driver support has everything to do with aging. Look how much brute force the Fury X had and how badly it aged.


----------



## ShieldHead (Jan 18, 2019)

cucker tarlson said:


> brute force has nothing to do with aging,driver support has all to do with aging.look how much brute force fury x had and how badly it aged.


The Fury X aged the same way as the Maxwell cards. The GTX 980 Ti was better from the start and had more VRAM and lots of OC potential. However, the difference between the two remained, by and large, the same.


----------



## Final_Fighter (Jan 18, 2019)

He does have a FreeSync monitor. It doesn't make much sense to buy a FreeSync monitor and then buy a graphics card hoping that Nvidia certifies your monitor at a later time, after Nvidia claimed to have already tested 400 monitors and only 12 got approved. Just because you can enable it does not mean it is actually working, and it's not worth taking a chance unless you already happen to have the hardware and the feature was enabled in a driver update. In this case he doesn't already have an RTX 2060, so the safe bet for him is the V56 with some tweaking. The performance gap after tweaking the card will be almost unnoticeable, and in three years' time he will probably hand the card off to somebody he knows and get himself a new card.

If he is just shooting for out-of-the-box FPS numbers, doesn't care much about the FreeSync function of his monitor, and is willing to take a chance and hope that it will work, go with the RTX 2060.

If he wants fully supported FreeSync and is willing to overclock and mod, he should get the Vega 56. It will be around 7%, give or take 2%, slower after he finishes overclocking and such, but he will get to use his initial investment: the monitor. Not to mention the performance gap will continue to get smaller as time goes on.
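To put that "around 7%, give or take 2%" gap in perspective at the OP's 144 Hz target, here is a quick back-of-the-envelope calculation (the 144 fps baseline is just the monitor's refresh rate; actual frame rates depend on the game):

```python
# What "about 7% slower, give or take 2%" means at a 144 fps target (illustrative).
target_fps = 144
for gap in (0.05, 0.07, 0.09):
    print(f"{gap:.0%} slower -> {target_fps * (1 - gap):.0f} fps")
# 5% slower -> 137 fps, 7% -> 134 fps, 9% -> 131 fps
```

With FreeSync smoothing things out, a difference of that size is hard to notice in practice.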


----------



## Vya Domus (Jan 18, 2019)

gamerman said:


> how is possible that 175W nvidia gpu beat over 310W amd gpu??



How is it possible that you still can't write a proper sentence in English ?


----------



## floop (Jan 18, 2019)

I want to play, stop...! lol I don't care about the competition over Hz and FPS ahah, and I don't change my hardware so easily... hence my confusion and my topic.


----------



## ShieldHead (Jan 18, 2019)

floop said:


> i want to play. stop...! lol i dont like the competition about hz and fps ahah and i dont change my hw so easily... just there is my confusion and my topic.


You will be very happy with either of the two. As *Final_Fighter* said regarding FreeSync, be on the safe side and get the Vega.
The Vega is also cheaper and comes with 3 free games; what's not to like? I doubt you will notice much difference in performance.


----------



## moproblems99 (Jan 18, 2019)

floop said:


> i want to play. stop...!



Keep in mind, most people on a forum are going to tell you what *they* would buy, not what *you* should buy.  All of the data is right in front of you to make a decision:

Vega:

- 3 free games
- might be cheaper
- guaranteed to work with your monitor (FreeSync)
- can be made more efficient with tweaking

RTX 2060:

- likely a little faster
- might be more expensive
- RTX and DLSS, if you care
- choice of a microtransaction-riddled game

The choice will ultimately come down to which games you play and how they perform on each card.  It is likely both cards will suit you just fine.  Neither one of them will be future-proof, because they are effectively mid-range and only run at their best for a gen or two before graphics requirements start to outpace them.

Unfortunately, this is where you are going to have to use that grey matter we were all blessed with.
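If it helps to apply that grey matter, here is a toy way to weigh the two lists against your own priorities. Every weight and rating below is a made-up placeholder, not data from this thread; plug in your own numbers:

```python
# Toy decision helper: score each card against your own priorities (0-10 each).
# All values are illustrative placeholders -- substitute what matters to YOU.
weights = {"price": 8, "freesync": 9, "raw_fps": 6, "rtx_features": 2, "free_games": 4}

# Rough per-card ratings (0-10), again just guesses for the sake of the example.
vega_56 = {"price": 7, "freesync": 10, "raw_fps": 7, "rtx_features": 0, "free_games": 10}
rtx_2060 = {"price": 6, "freesync": 5, "raw_fps": 8, "rtx_features": 8, "free_games": 3}

def score(card):
    """Weighted sum of the card's ratings."""
    return sum(weights[k] * card[k] for k in weights)

print("Vega 56:", score(vega_56))
print("RTX 2060:", score(rtx_2060))
```

Shift the weights (say, raw_fps up and freesync down) and the winner flips, which is exactly the point: the "right" card depends on the buyer.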

Edit: FWIW, I have the Pulse Vega 56.  Good card, but it has its limitations.


----------



## ASOT (Jan 18, 2019)

RTX 2070 or a second-hand 1080 Ti


----------



## R0H1T (Jan 18, 2019)

Has anyone suggested the Vega VII yet?


----------



## moproblems99 (Jan 18, 2019)

R0H1T said:


> Has anyone suggested the Vega VII yet?



No, because it wasn't what the OP asked.  It is likely out of the OP's budget... hence, Vega 56 or 2060.


----------



## eidairaman1 (Jan 18, 2019)

R0H1T said:


> Has anyone suggested the Vega VII yet?



Out of his range just like ASOTs suggestion...


----------



## Vario (Jan 18, 2019)

Vega 56, and sell the included games.  The proceeds from selling the games will tilt the decision towards the Vega with its additional VRAM.
The game codes might be worth as much as $25-35 each, and you have three of them to sell.

Edit: Not sure of the going price for the V56; prices are all over the place, whereas it's a safe bet the 2060 is ~$360.  Depends on which one you can get cheaper, IMO.

Edit 2: Looks like you can sell the Nvidia codes too, and people will actually buy them despite their requiring an RTX card to activate; they are worth around $25-35 as well, but you only get one code.

https://babeltechreviews.com/the-rtx-2060-review/5/  The performance is close enough that one could probably play on either and not notice a difference, but I think over time the 8GB of VRAM will prove more useful.
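The resale math is simple enough to write down. Assuming an illustrative $340 street price for the Vega 56 (the card price and per-code values below are rough examples, not quotes):

```python
# Effective card price after selling the bundled game codes (illustrative numbers).
card_price = 340.0        # example Vega 56 street price
codes = 3                 # three bundled titles
low, high = 25.0, 35.0    # rough resale value per code, per the estimate above

print(f"best case:  ${card_price - codes * high:.2f}")  # all three codes sell high
print(f"worst case: ${card_price - codes * low:.2f}")   # all three codes sell low
```

So the effective price lands somewhere around $235-265, which is the "rebate" framing in a nutshell.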


----------



## Vayra86 (Jan 18, 2019)

floop said:


> i want to play. stop...! lol i dont like the competition about hz and fps ahah and i dont change my hw so easily... just there is my confusion and my topic.



That is the way these topics always end up, unfortunately. People think your purchase will have some sort of impact on the overall GPU landscape or something. I don't know.

In my opinion you should let *price *be your guide, combined with the *quality of the cooling solution*. At equal price, and an equal cooling solution (not noisy at full load temps, and beefy enough to let the cards boost properly) it really is a toss up in performance - but the Vega option does have a VRAM advantage that is likely to pay off in the longer term. After that comparison you could look at game bundle promotions. But I wouldn't factor the latter in with cost - free games are not free money.

So, simple summary:
- Price + quality of the card/cooling
- Vega 56 / 64 / 2060 will allow you to play the same games at the same settings and nearly similar FPS

Check out reviews to find the best models of each card to look for, find the best price among those models, and win. Most of this thread consists of bickering about a 5-10% performance gap in certain situations, which is really irrelevant in regular gameplay.


----------



## John Naylor (Jan 18, 2019)

1.  The 2060 to my eyes is the proverbial no-brainer, being faster than the Vega 64 ... especially if it's one of the models that shuts the fans off at idle.

2.  I still have yet to see any instance of 6 GB not being enough at 1440p.

3.  Not sure how the Zotac Twin does, but even the FE model does better ... with the various AIB offerings landing between 118 and 123 in the TPU OC test:

https://tpucdn.com/reviews/MSI/GeForce_RTX_2060_Gaming_Z/images/relative-performance_2560-1440.png


----------



## deemon (Jan 18, 2019)

If your pants aren't on fire, you might even consider waiting for the GTX 1160 or GTX 1660 or whatever its actual final name will be. A 2060 without the DLSS/RTX crap and hopefully $100, or even $50, cheaper would make it a no-brainer.


----------



## notb (Jan 18, 2019)

ShieldHead said:


> I doubt Raytracing is a "future-proofing" technology.


I seriously doubt you understand what "raytracing" is (the way you write about it). RTX cards have been around for months. Everyone interested in GPUs should have already learned what they are about.

Anyway, AMD is working on an RTX analogue, so it might just be that in 1-2 years every new card does it and, hence, most AAA games utilize it.
And people without such tech will end up with uglier games. 


> DLSS basically reduces quality... might aswell tinker with ingame settings.


How exactly does DLSS reduce quality? 


> My conclusion is: If you play games that heavily favor nvidia then get the rtx 2060. If you want a good all around gpu with lots of potential then get the vega 56


If anything, it's Nvidia who favours games, not the other way around. They make cards that are good for gaming (and they make cards that are good for computing as well).
Why would anyone want an "all-round" GPU? It makes no sense. We buy GPUs for particular tasks.

I'm not that interested in how games are made, but it might just be that it's easier to optimize games for Nvidia GPUs. Hence, you end up with a feeling that game studios prefer that company.
Nvidia dominated GPU computing not because programmers and scientists favour the green side, but because it's just way easier and more effective.


----------



## Deleted member 67555 (Jan 18, 2019)

moproblems99 said:


> Whichever has the performance you need at the price you want.  Forget about whose name is on the box.


Thread for literally every choice.
But this is for opinions


----------



## ASOT (Jan 18, 2019)

Money-wise, the GPUs I mentioned are better than the 56 or 2060.


----------



## floop (Jan 18, 2019)

Vario said:


> Vega 56 and sell the included games.


what?
I know that it's impossible



Vario said:


> https://babeltechreviews.com/the-rtx-2060-review/5/ The performance is close enough one could probably play on either and not notice a difference, but I think over time the 8GB vram will provide more useful.


in the games that I want to play, the Vega is slower... XD


----------



## ShieldHead (Jan 18, 2019)

notb said:


> I seriously doubt you understand what "raytracing" is (the way you write about it). RTX cards have been around for months. Everyone interested in GPUs should have already learned what they are about.


What I meant is that the current GPUs that support raytracing are not powerful enough to utilize it in a way that matters. Who cares about some nice puddle reflections? I want fully raytraced lighting and shadows.




notb said:


> How exactly does DLSS reduce quality?


DLSS sounds good, albeit it's just a fancy pre-generated AA, but do you think many games will support it? RTX cards released in September, and so far only a few games have it or promise to have it.




notb said:


> If anything, it's Nvidia who favours games, not the other way around. They make cards that are good for gaming (and they make cards that are good for computing as well).
> Why would anyone want an "all-round" GPU? It makes no sense. We buy GPUs for particular tasks.
> 
> I'm not that interested in how games are made, but it might just be that it's easier to optimize games for Nvidia GPUs. Hence, you end up with a feeling that game studios prefer that company.
> Nvidia dominated GPU computing not because programmers and scientists favour the green side, but because it's just way easier and more effective.



My best guess is that some game and engine developers would rather optimize for the majority (with possible incentives behind it). We've seen lots of recent games where Radeon graphics cards perform better than their Nvidia counterparts.

My recommendation is still a very impartial one: the Vega 56 is cheaper, performs very close to the RTX 2060, is guaranteed to work with a FreeSync monitor, and comes with 3 games. That's a better list than having a GPU that can poorly render some fancy effects in a handful of titles.



floop said:


> what?
> I know that it's impossible
> 
> 
> in the games who i want to play, the Vega it's slower... XD



If that's the case then just get the RTX 2060 and be done with it. Don't dwell too long on the subject.


----------



## NdMk2o1o (Jan 18, 2019)

floop said:


> what?
> I know that it's impossible
> 
> 
> in the games who i want to play, the Vega it's slower... XD


/thread


----------



## biffzinker (Jan 18, 2019)

floop said:


> in the games who i want to play, the Vega it's slower... XD


Get the RTX 2060 FE, or this MSI RTX 2060 VENTUS 6G OC, which looks interesting:
https://www.newegg.com/Product/Product.aspx?Item=N82E16814137380


----------



## notb (Jan 18, 2019)

ShieldHead said:


> What I meant is that the current gpu's that support raytracing are not powerful enough to utilize it in a way that matters. Who cares about some nice puddle reflection? I want full raytraced lighting and shadows.


Computers can't do full ray-traced lighting. They can only approximate.
Computers can't even store precise numbers, so what exactly do you expect? 


> My best guess is that some game and engine developers rather optimize for the majority (with possible incentives behind it). We've seen lots of recent games where radeon graphic cards perform better than its nvidia counterparts.


Those are often badly ported console games. 
If you don't care about optimization, a game will likely run better on more powerful hardware, which is usually AMD.
If you do care, it may be easier to optimize for Nvidia. It's certainly easier to do computing on Nvidia.

As for Tensor and RTX cores - we're still talking about using them as Nvidia wants you to do in a gaming card, i.e. for DLSS and RTRT. But the same stuff is found in computing accelerators - doing more general tasks. After all it's just a circuit optimized for certain algebra problems. It's a matter of time until we learn how to utilize them in consumer PCs.

So while using RTRT will always mean a drop in fps (like any other image quality option), switching it off could mean that the hardware can be used for other jobs => provide a performance boost.
It's a bit too early to judge, but look what happened with the whole GPGPU phenomenon. And it all started in a similar way - GPU makers designed circuitry that was meant to improve lighting effects.


----------



## moproblems99 (Jan 18, 2019)

notb said:


> Computers can't do full ray-traced lighting. They can only approximate.
> Computers can't even store precise numbers, so what exactly do you expect?
> 
> Which are often badly ported console games.
> ...



Hey, that's all well and good, but your conversation does nothing for the OP.  This is for another thread.

Edit:



notb said:


> Computers can't even store precise numbers, so what exactly do you expect?



Also, you may want to rethink that.  Wall Street would like to have a word with you.



floop said:


> in the games who i want to play, the Vega it's slower... XD



Then you have your answer.  If it is within your budget, buy it.  Don't make this brain surgery.  It is merely a GPU.


----------



## eidairaman1 (Jan 18, 2019)

moproblems99 said:


> Hey all well and good but your conversation does nothing for the OP.  This is for another thread.
> 
> Edit:
> 
> ...



Nuclear timer to boot lol


----------



## Vario (Jan 19, 2019)

floop said:


> what?
> I know that it's impossible
> 
> 
> in the games who i want to play, the Vega it's slower... XD


You can sell the codes on ebay, type AMD game code and you can see that people do in fact buy the codes for a decent amount of money.  You can think of it as a rebate on the purchase price.


----------



## deemon (Jan 19, 2019)

Vario said:


> You can sell the codes on ebay, type AMD game code and you can see that people do in fact buy the codes for a decent amount of money.  You can think of it as a rebate on the purchase price.



Where or from whom do you actually get the "included games"? I see Vega GPUs for sale here with no included games mentioned.


----------



## biffzinker (Jan 19, 2019)

deemon said:


> Where or from who do you actually get the "included games"? I see vega gpu's for sale here, 0 included games mentioned.


Not all cards have the free games bundle. 

This one does and it's selling for $339.99
https://m.newegg.com/products/N82E16814131740


----------



## deemon (Jan 19, 2019)

biffzinker said:


> Not all cards have the free games bundle.
> 
> This one does and it's selling for $339.99
> https://m.newegg.com/products/N82E16814131740



Know any Europe sellers who does this?


----------



## biffzinker (Jan 19, 2019)

deemon said:


> Know any Europe sellers who does this?


Found this seller, I'm sure there's more than one.
https://www.overclockers.co.uk/powe...hbm2-pci-express-graphics-card-gx-193-pc.html


----------



## cucker tarlson (Jan 19, 2019)

The availability of the game bundle is incredibly spotty. Here's a full list of vendors:

https://www.amdrewards.com/terms?we...sionGUID=75a81528-7563-3270-973a-227e262453a1  -> go to "raise the game fully loaded radeon system integrator campaign" (4th link)

As an example, only two vendors in my country are covered by the program. That means I'm very limited in my choice of seller if I want to get the bundle. The cheapest Vega 56 I can get there is 1770 PLN for the dual-fan Gigabyte V56, while in other shops I can get the much better V56 Pulse for 1500 PLN.

The game bundle is quite good. The limited availability of Radeon cards has always been a problem, at least here in Poland and in many other European countries too. The terms of the game bundle are pretty outrageous as well, given that only a handful of vendors offer it; in the end you may pay as much, or even more, for a card covered by the programme as you would if you just found the best offer on the card and got the games separately. Yet people are once again fooled into thinking that AMD somehow lowered the MSRP.


----------



## notb (Jan 19, 2019)

cucker tarlson said:


> only two vendors in my country are covered by the program.that means I'm very limited to the choice of the seller if I want to get the bundle.The cheapest Vega 56 I can get there is 1770 pln for Gigabyte V56 dual fan,while in other shops I can get the much better V56 pulse at 1500pln.


3 (Komputronik, Morele, X-Kom).
And of course they raise the price. Why not? They're selling a more attractive product than the store that isn't covered by the offer.
It's the same story as with cashbacks. Some shops are included and some are not. And the price difference is usually very close to the cashback value. 

That's the whole point of running such campaigns. People always anchor on the lowest price they've seen; that's their reference point when they hear about additional bonuses.

Remember: all promotion/marketing campaigns cost money. Manufacturers / sellers organize them to earn more, not to give you stuff for free.
And if they're making more profit, it means you're paying more, right? 

It's just like phone+plan bundles. A bundle is always more expensive than buying the phone and the plan separately (even from the same provider). Imagine how much additional work this generates for the company (they have to source the phones, store them, train the sales people and so on).

But people still buy phone bundles, because finding out how much the phone costs and multiplying the monthly payment by 24 is so f..g hard.
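That multiplication really is all there is to it. With made-up figures (none of these numbers come from a real offer):

```python
# The phone-bundle arithmetic in full, with illustrative made-up figures.
months = 24
bundle_upfront, bundle_monthly = 49, 55      # "cheap phone!" bundle offer
phone_price, sim_only_monthly = 700, 25      # same phone bought outright + SIM-only plan

bundle_total = bundle_upfront + months * bundle_monthly    # 49 + 1320 = 1369
separate_total = phone_price + months * sim_only_monthly   # 700 + 600 = 1300

print("bundle premium:", bundle_total - separate_total)    # -> 69
```

Same idea as the GPU game bundles: the headline ("phone for 49!") hides the total cost of ownership.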


----------



## floop (Jan 20, 2019)

In the end, I bought a Vega 56 Pulse with the 3-game bundle. 
Now I don't know what cooler to buy for the Ryzen 2600.
Noctua U12S? 
Cooler Master ML240L RGB?


----------



## eidairaman1 (Jan 21, 2019)

floop said:


> At the end, i buyed a Vega 56 pulse with 3 games bundle.
> Now, i dont know what fan buy for the ryzen 2600.
> Noctua u12s?
> Coolermaster ml240l rgb?



Can you get Scythe or Thermalright there?


----------



## xkm1948 (Jan 21, 2019)

floop said:


> At the end, i buyed a Vega 56 pulse with 3 games bundle.
> Now, i dont know what fan buy for the ryzen 2600.
> Noctua u12s?
> Coolermaster ml240l rgb?



Default cooler works just fine.


----------



## floop (Jan 21, 2019)

eidairaman1 said:


> Can you get Scythe, Thermalright There?


Which Scythe? Or a Deepcool with RGB fans?


xkm1948 said:


> Default cooler works just fine.


yeah, but what if I want to OC?


----------



## Zubasa (Jan 21, 2019)

floop said:


> which Scythe? or Deepcool with rgb fan?
> 
> yeah but if i want to oc ?


Depends on how far you want to OC; for mild OCs the stock cooler will do okay as well.
For max OCs you might want to get something beefier, or get an AIO.

Some of those big air coolers are a pain in the rear, blocking RAM slots etc.,
and they cost almost as much as some AIO liquid coolers anyway.
For an AIO I would stay away from Cooler Master; I have experience of their pumps dying within a year or so.


----------



## floop (Jan 21, 2019)

oh ok... thanks!!!

Noctua? Horrible but great, no?


----------



## R0H1T (Jan 21, 2019)

Horrible in what sense - color, pricing? They're arguably the best when it comes to air cooling as well as fans, though pricey for sure.


----------



## floop (Jan 21, 2019)

yeah, for the color... I have an iTek Lunar 23r2 case with RGB fans.


----------



## kurosagi01 (Jan 21, 2019)

The stock cooler for Ryzen is perfectly fine for the non-X if you're only going to overclock it to run at turbo speed.
Also, congrats on the Vega 56 purchase.


----------



## medi01 (Jan 31, 2019)

gamerman said:


> 1. its alot faster
> 2. its have alot better efficiency
> 3. its keep value much better, meaning if you want sell it after few years and buy new gpu again,its easy.
> 4. better drivers reason, (nvidia)
> .



The FUD part is marked with this color, the (highly) arguable part with this one


----------



## bug (Jan 31, 2019)

medi01 said:


> FUD part is marked with this color,  (highly) arguable with this


If you think "better drivers" is FUD, you're not a Linux user. AMD improved a lot in the past few years, but their OpenCL still trails Nvidia: https://www.phoronix.com/scan.php?page=article&item=rocm-20-linux50&num=1
If all you do is game on Windows, I don't imagine there's any significant difference between the two. Outside that, there is.


----------



## Vya Domus (Jan 31, 2019)

floop said:


> which Scythe? or Deepcool with rgb fan?
> 
> yeah but if i want to oc ?



You don't need an AIO or a crazy-expensive air cooler. My 1700X OC'd to 4 GHz (albeit at low voltage) can be kept cool by a Scythe Katana 4, which isn't particularly huge or expensive. It's old, but I see it can be found for as low as $30. I would probably buy something like this these days: https://www.newegg.com/Product/Product.aspx?Item=9SIA9ZH6CU1449


----------



## floop (Jan 31, 2019)

Vya Domus said:


> You don't need an AIO or crazy expensive air coolers. My 1700X OC'd to 4 Ghz (albeit on low voltage) can be kept cool by a Scythe Katana 4 which wasn't particularly huge nor expensive. It's old but I see it can be found for as low as 30$ but I would probably buy something like this these days : https://www.newegg.com/Product/Product.aspx?Item=9SIA9ZH6CU1449


yeah, you're right!

but at the moment I'm searching for something with RGB


----------



## John Naylor (Jan 31, 2019)

floop said:


> which Scythe? or Deepcool with rgb fan?
> 
> yeah but if i want to oc ?



Don't expect much OC headroom from Ryzen..... The $45 Scythe Fuma equals or outperforms the $90 flagship air coolers from Noctua and Cryorig (and anyone else), and, at the same fan speeds / noise levels, any CLC on the market.  The $37 Scythe Mugen Max, if you can find it, is its equal but may have fit problems in your case.


----------



## medi01 (Jan 31, 2019)

bug said:


> If you think "better drivers" is FUD, you're not a Linux user


1) There was no reference to Linux made, just the classic lie.
2) When I'm working, the OS doesn't matter; when I'm gaming and using a PC, games simply don't support it. So, shrug, how common a task is "working under Linux with OpenCL"?



notb said:


> Computers can't do full ray-traced lighting. They can only approximate.


Also, real light is a complex matter, with diffraction and whatnot, so there go your soft shadows even if your ray tracing is running on an uber machine.



xkm1948 said:


> I will just leave W1zzard’s words here


His tests show Vega to be 10% behind.
But, hold on: (benchmark chart not preserved)

But what about the 1070 Ti? Hm: (chart not preserved)


----------



## bug (Jan 31, 2019)

medi01 said:


> 1) There was no reference to Linux made, just the classical lie
> 2) When I'm working, OS doesn't matter, if I'm gaming AND using PC, games simply do not support it, so, shrug, how common a task is "working under Linux with OpenCL"?



It's pretty common now that OpenCL is used in so many places. From LibreOffice to image and video processing, there are many workflows that benefit from OpenCL. Serious compute work still eschews OpenCL in favour of CUDA, though.

(To add insult to injury, Nvidia beats AMD at OpenCL despite Nvidia's lack of support for OpenCL 2.0. They do it leveraging only OpenCL 1.2.)


----------



## notb (Feb 1, 2019)

medi01 said:


> 2) When I'm working, OS doesn't matter, if I'm gaming AND using PC, games simply do not support it, so, shrug, how common a task is "working under Linux with OpenCL"?


I'd be really surprised if you weren't regularly running software that uses OpenCL. Even 7-Zip is GPU-accelerated these days.

Apart from the stuff that benefits a bit from GPUs (which you may or may not notice), there are some applications that can potentially get a huge boost: rendering, photo/video editing, simulations etc.
Most of them are written around OpenCL, but sometimes there's also a CUDA option (for example in Adobe Premiere Pro, albeit not on Linux yet).


> Also, real ilght is a complex matter, diffraction and what not and, so there go your soft shades even if your ray tracing is running on uber machine.


Light is fairly simple as long as you can live in classical physics (and that's enough for the 3D rendering we're talking about). 


bug said:


> Serious compute works still eschews OpenCL in favour of CUDA though.


Depends on your definition of "serious compute". 
If you know that software will be run on Nvidia, going with CUDA is a no-brainer. If you're building a computation environment from scratch, it usually also makes sense to go with Nvidia (yes, it's likely cheaper ;-)).
But if software has to work on both AMD and Nvidia (and something else potentially) OpenCL is perfectly fine and is often used.
For example: until not so long ago everything for Macs had to be written in OpenCL, since you can't spec them with Nvidia GPU. It's changing now thanks to eGPU.


----------



## bug (Feb 1, 2019)

notb said:


> Depends on your definition of "serious compute".
> If you know that software will be run on Nvidia, going with CUDA is a no-brainer. If you're building a computation environment from scratch, it usually also makes sense to go with Nvidia (yes, it's likely cheaper ;-)).
> But if software has to work on both AMD and Nvidia (and something else potentially) OpenCL is perfectly fine and is often used.
> For example: until not so long ago everything for Macs had to be written in OpenCL, since you can't spec them with Nvidia GPU. It's changing now thanks to eGPU.


"Serious compute" doesn't care about AMD or Nvidia. "Serious compute" is about getting results. CUDA is usually 10x as fast as AMD's OpenCL.
I'm not an advocate of closed source solutions, but when closed source can be that fast and open source can't... well...


----------



## medi01 (Feb 1, 2019)

bug said:


> CUDA is usually 10x as fast as AMD's OpenCL



Thanks for spreading FUD, citizen.
And it takes certain qualities to blow it to this magnitude.


----------



## bug (Feb 1, 2019)

medi01 said:


> Thanks for spreading FUD, citizen.
> And it takes certain qualities to blow it to this magnitude.


Unfortunately I can't source that, because apps do not sport swappable compute backends. But it is what I understood from conversations with people who tried both frameworks.


----------



## cucker tarlson (Feb 1, 2019)

tbh both Vega 56 and RTX 2060 are good choices, with good points and flaws on both sides. The RTX GPU itself is faster, more advanced and much more efficient. Vega 56 is GCN, with all the strong points and disadvantages of an architecture that's becoming very long in the tooth. Vega is equipped with 8GB of memory though, while the decision to put 6GB on a $350 card that will now occupy the GTX 1080/V64 slot is questionable. GCN had an advantage back in the early days of DX12 and Vulkan, Pascal improved on that significantly, and with the arrival of Turing I can see nothing that AMD has done over the course of the last 3-4 years that would still give them any edge over Nvidia cards. They've slept on innovation and will pay the price now that a 1920-CUDA-core Turing card can match Vega cards not only in DX11 but in DX12 and Vulkan too. The RTX card should age better than Vega.
For me the RTX 2060 is a premium 1080p card, exceptional in any game as long as you stay at this resolution. At 1440p it's mostly fine, but you're looking at borderline VRAM usage. Not saying it's not enough, but it's definitely on the verge. E.g. all recent Ubisoft games show 5-6GB VRAM usage on my card already.


----------



## bug (Feb 1, 2019)

cucker tarlson said:


> tbh both Vega 56 and RTX 2060 are good choices, with good points and flaws on both sides. The RTX GPU itself is faster, more advanced and much more efficient. Vega 56 is GCN, with all the strong points and disadvantages of an architecture that's becoming very long in the tooth. Vega is equipped with 8GB of memory though, while the decision to put 6GB on a $350 card that will now occupy the GTX 1080/V64 slot is questionable. GCN had an advantage back in the early days of DX12 and Vulkan, Pascal improved on that significantly, and with the arrival of Turing I can see nothing that AMD has done over the course of the last 3-4 years that would still give them any edge over Nvidia cards. They've slept on innovation and will pay the price now that a 1920-CUDA-core Turing card can match Vega cards not only in DX11 but in DX12 and Vulkan too. The RTX card should age better than Vega.
> For me the RTX 2060 is a premium 1080p card, exceptional in any game as long as you stay at this resolution. At 1440p it's mostly fine, but you're looking at borderline VRAM usage. Not saying it's not enough, but it's definitely on the verge. E.g. all recent Ubisoft games show 5-6GB VRAM usage on my card already.


See here: https://www.techspot.com/article/1785-nvidia-geforce-rtx-2060-vram-enough/
And they were keen enough to look at actual performance, because we all know allocated memory is not a good indicator.

Still, it's a missed opportunity: 8GB VRAM on a 256-bit interface would have made a killer product. Maybe they figured it would cut too close to the 2070. Or maybe they're saving it for a 2060 Ti.


----------



## cucker tarlson (Feb 1, 2019)

bug said:


> See here: https://www.techspot.com/article/1785-nvidia-geforce-rtx-2060-vram-enough/
> And they were keen enough to look at actual performance, because we all know allocated memory is not a good indicator
> 
> Still, it's a missed opportunity, 8GB VRAM on a 256bit interface would have made a killer product. Maybe they figured it would cut too close to 2070. Or maybe they're saving it for 2060Ti.


It's enough in the scenario they tested.
I chose the 1080 Ti over the 2080 for the VRAM myself. I use a lot of AA even on a 24" 1440p monitor and have seen my GTX 1080 run out of VRAM at the settings I use. I find 24" 1080p downright terrible for games.
If you're willing to adjust the settings if there are problems, then the 2060 with 6GB will run fine at 1440p, no doubt about that.


----------



## moproblems99 (Feb 1, 2019)

bug said:


> allocated memory is not a good indicator



allocated memory == used memory. You're going to have problems if all your memory is allocated and something else needs it.


----------



## notb (Feb 1, 2019)

bug said:


> "Serious compute" doesn't care about AMD or Nvidia. "Serious compute" is about getting results.


That is just some weird propaganda. :-D
Yes, computation is about getting results (serious or not).
But people that work with computing definitely care what hardware and software they use.


> I'm not an advocate of closed source solutions, but when closed source can be that fast and open source can't... well...


Why aren't you an advocate of closed source? Something wrong with the results? :-D



moproblems99 said:


> allocated memory == used memory. You're going to have problems if all your memory is allocated and something else needs it.


Wrong.
A lot of software allocates more memory than it needs.


----------



## bug (Feb 1, 2019)

notb said:


> Why aren't you an advocate of closed source? Something wrong with the results? :-D



I'm a software developer, I like it better when I can actually dig through the documentation and source code as opposed to paying thousands of dollars for courses teaching me how to use proprietary stuff.
I'm also an engineer, meaning I need to be pragmatic and use whatever gets the job done, so I understand the need for closed source as well. But I prefer it to be open.



moproblems99 said:


> allocated memory == used memory. You're going to have problems if all your memory is allocated and something else needs it.


Quite wrong. Memory is routinely allocated pre-emptively because frequent small allocations are too costly. Allocated and unused RAM is just swapped out. For video cards, that's much less of a problem, because how often do you start several games and let them compete for VRAM?
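To illustrate pre-emptive allocation with a quick sketch (CPython's list growth strategy is the textbook case; exact numbers vary by interpreter version):

```python
import sys

# CPython over-allocates list storage on append so that repeated
# appends are amortized O(1) instead of reallocating every time.
lst = []
sizes = []
for i in range(20):
    lst.append(i)
    sizes.append(sys.getsizeof(lst))

# The reported size only jumps at a handful of points: capacity was
# allocated ahead of need, and most appends reuse it without growing.
growth_steps = sum(1 for a, b in zip(sizes, sizes[1:]) if b > a)
print(growth_steps, "resizes for", len(lst), "appends")
```

The allocated capacity is bigger than what's actually in use most of the time, which is exactly why "allocated" and "used" aren't the same number.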


----------



## moproblems99 (Feb 1, 2019)

notb said:


> Wrong.
> A lot of software allocates more memory than it needs.



Sure, write a program that allocates all your GPU VRAM and then go play a game.  Please report the results.



bug said:


> how often do you start several games and let them compete for VRAM?



So, if your OS decides it's going to need 4GB of your VRAM because it might need it, how do you think your games will play?


----------



## kastriot (Feb 1, 2019)

This thread lost its purpose a long time ago; it's just shooting the breeze now.


----------



## bug (Feb 1, 2019)

moproblems99 said:


> So, if your OS decides it's going to need 4GB of your VRAM because it might need it, how do you think your games will play?


The OS doesn't allocate VRAM on its own. Applications and games do. What a game will usually do is look at its settings and, based on that, try to shove as many textures into the available VRAM as possible. Sometimes that means it could load textures it won't need until half an hour later; sometimes your VRAM will barely be enough for a few seconds.
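A toy sketch of that preloading behaviour (all names and sizes are hypothetical, not taken from any real engine): the loader keeps taking textures in priority order until the VRAM budget runs out, so a 6GB card simply stops earlier than an 8GB one.

```python
def preload_textures(textures, vram_budget_mb):
    """Greedily load (name, size_mb) textures in priority order
    until the VRAM budget is exhausted. Purely illustrative."""
    loaded, used = [], 0
    for name, size_mb in textures:
        if used + size_mb <= vram_budget_mb:
            loaded.append(name)
            used += size_mb
    return loaded, used

# Highest-priority assets first; sizes in MB.
queue = [("hero", 1024), ("level_near", 2048), ("level_far", 2048),
         ("cinematic", 2048), ("ui", 256)]
print(preload_textures(queue, 6144))  # 6 GB card
print(preload_textures(queue, 8192))  # 8 GB card
```

With the smaller budget the "cinematic" set gets skipped entirely, even though the game would happily cache it given more VRAM, which is the gap between what a game allocates and what it strictly needs at any moment.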


----------



## moproblems99 (Feb 1, 2019)

Not worth it.  Have a nice weekend.


----------

