# RX Vega Achieves 43 MH/s @ 130 W in Ethereum Mining



## Raevenlord (Sep 4, 2017)

AMD's RX Vega reads more like a compute card that was brought over to the consumer segment for gaming workloads than the other way around. Raja Koduri himself has said something along those lines (extrapolating a little beyond what he can actually say), and that much can be gleaned with at least a modicum of confidence from AMD's market positioning and overall compute push. In the argument between gamers and miners, Raja Koduri didn't have all that much to say, but for AMD, a sale is a sale, and it would seem that after some tweaking, RX Vega graphics cards can achieve much higher mining efficiency than their Polaris counterparts, further showing how Vega handles compute workloads much better - and more efficiently - than traditional gaming ones.





Now granted, Vega's strength in mining tasks - Ethereum in particular - stems mainly from the card's use of HBM2 memory, as well as a wide architecture with its 4096 stream processors. By setting the core clock to 1000 MHz, the HBM2 memory clock to 1100 MHz, and the power target at -24%, Reddit user S1L3N7_D3A7H was able to leverage Vega's strengths in Ethereum's PoW (Proof of Work) algorithm, achieving 43 MH/s at just 130 W of power (104 W of that for the core alone). For comparison, tweaked RX 580 graphics cards usually deliver around 30 MH/s at 75 W core power, which amounts to around 115 W power draw per card. So Vega achieves 43% more hash rate for a meager 13% increase in power consumption - a worthy trade-off if miners have ever seen one. This means Vega 64 beats RX 580 cards in single-node hash rate density: miners can pack more hashing power into a single system for a denser configuration that handily outperforms a similarly specced RX 580-based mining station. All of this was achieved without AMD's special-purpose beta mining driver, which has seen reports of graphical corruption and instability - the picture could improve for miners even further with a stable release.
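The arithmetic behind that comparison is simple enough to sketch; a minimal Python check using only the figures quoted in the thread (not independent measurements):

```python
# Quick sanity check of the efficiency figures quoted above.
# All numbers are the ones reported in the thread, not measurements of our own.

vega = {"hashrate_mhs": 43.0, "power_w": 130.0}   # reported RX Vega 64 result
rx580 = {"hashrate_mhs": 30.0, "power_w": 115.0}  # typical tweaked RX 580

def efficiency(card):
    """Mining efficiency expressed as MH/s per watt."""
    return card["hashrate_mhs"] / card["power_w"]

hash_gain = vega["hashrate_mhs"] / rx580["hashrate_mhs"] - 1.0
power_gain = vega["power_w"] / rx580["power_w"] - 1.0

print(f"Vega:   {efficiency(vega):.3f} MH/s per W")
print(f"RX 580: {efficiency(rx580):.3f} MH/s per W")
print(f"Hash rate: +{hash_gain:.0%}, power draw: +{power_gain:.0%}")
```

Which is exactly where the "43% more hash rate for 13% more power" claim comes from.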




Moreover, S1L3N7_D3A7H said he could probably achieve the same mining efficiency on a Vega 56, which isn't all that unbelievable - memory throughput is king in Ethereum mining, so HBM2 could still be leveraged on that graphics card. It seems that at least some of the initial Vega 64 stock went into miners' hands, as expected. And with this news, I think we'd be forgiven for holding on to our hats in the expectation of increased Vega stock (at the original $499 MSRP for Vega 64 and $399 for Vega 56) come October. Should the user's claims about RX Vega 56 efficiency be verified, and ceteris paribus in the mining-algorithm landscape for the foreseeable future, we may well be waiting for respectable inventory until Navi enters the scene.

*View at TechPowerUp Main Site*


----------



## Nephilim666 (Sep 4, 2017)

No real issue with this from a consumer perspective. It would be great if AMD would market their products better so that people can be better informed.

tl;dr
AMD for mining and compute
NVIDIA for gaming


----------



## RejZoR (Sep 4, 2017)

Well, at least they'll be able to clear all the inventories to miners then... And the "crappy" RX 580 will become available again for gamers wanting a new, modern graphics card for 2018...


----------



## arbiter (Sep 4, 2017)

Am I the only one that questions the claimed 130 W draw? Seems a bit off. I would like to see testing with proper meters to see what the card is really drawing.

Software isn't always right on power draw.


----------



## _Flare (Sep 4, 2017)

Investment in better Drivers *WASTED*
Investment in Mantle and Vulkan and DX12 *WASTED*
Investment in better Collaboration with Game- and Engine-Makers *WASTED*

Mining YES


----------



## RejZoR (Sep 4, 2017)

_Flare said:


> Investment in better Drivers *WASTED*
> Investment in Mantle and Vulkan and DX12 *WASTED*
> Investment in better Collaboration with Game- and Engine-Makers *WASTED*
> 
> Mining YES



This is the major problem with selling all your cards to miners. Yes, you make the sales, but you're basically throwing away years of relations with gamers, developers and game engine makers. And a graphics card maker without an ecosystem of support and developer relations is worth pretty much nothing. If they screw this up by neglecting gamers, they might as well close the consumer division and focus on mining nonsense. And when that happens, the PC gaming market will be screwed; having only NVIDIA as a graphics provider will suck enormously.

Or if they plan on using mining as sustainable income, they really need to do something about the availability of chips. I really hope Navi will pay off and they'll be able to do that. If not, it's gonna suck even more. Cards that are hard to obtain with inflated prices are nothing we ever wanted.


----------



## T4C Fantasy (Sep 4, 2017)

arbiter said:


> Am I the only one that questions the claimed 130 W draw? Seems a bit off. I would like to see testing with proper meters to see what the card is really drawing.
> 
> Software isn't always right on power draw.



I completely believe it, since you can get a Fury X to mine at 85 W.

Don't believe it? Undervolt these cards and you'll be shocked.


----------



## Chaitanya (Sep 4, 2017)

PC gamers are royally screwed: first miners snatched up cards and inflated prices, and now NVIDIA is raising the (already inflated) prices of its GPUs.
https://www.tweaktown.com/news/58995/geforce-gtx-10-series-price-increases-coming-month/index.html


----------



## R0H1T (Sep 4, 2017)

Oh this is gonna go down well, not! The miners will keep snatching them up, all bad PR goes to AMD


----------



## techy1 (Sep 4, 2017)

No surprises here - as I stated long before release, when the first "leaks" appeared and when AMD suddenly rebranded their RX "4" to RX "5" (why would AMD need to rebrand their old shyt a few months before THE Vega?) - Vega would be inferior to the GTX 1080 (a 1.5-year-old card) in all departments: performance, price/performance and power draw. AMD knew that and could not do a thing about it; the only hope for AMD was miners. And as I stated before release: if all RX 470 - RX 580 cards are sold out to miners at $400+, then Vega will be sold out at $800+. Obviously it is a pity for the gaming community - all those Vulkan vs DX12, FreeSync vs G-Sync and 4K gaming competition hopes are put on ice for a long time. It is also bad for AMD financially, because there will be a flood of cheap second-hand cards soon (mining WILL end, obviously) - and no one will buy new AMD cards even at discount prices. I see rebranded Vega attempts after that.


----------



## _Flare (Sep 4, 2017)

@RejZoR
you hit the nail on the head

GCN is stuck with so many features: "Compute OK, Graphics Processing NOT SO OK".
I've lost confidence that any future change will deliver the needed efficiency and performance.
The 1080 Ti is 35% faster at 231 W average than Vega 64 Power-Save at 214 W; what in the world would have to happen for GCN to get there?
GCN tops out at 4 geometry engines, which is forever 33.33% less than the 6 NVIDIA can build.
The only option would be to clock the frontend and backend 50% higher than NVIDIA, which is very unlikely.
Primitive Shaders can help, but "never bet on a software solution where you can use a hardware solution!"
With a declining market share in gaming cards, no studio will use Primitive Shaders just for AMD.


----------



## Deleted member 172152 (Sep 4, 2017)

AMD seems to have given up on the high-end gaming market. RX Vega is definitely a compute card meant for things like servers, not gaming - one that simply has less memory than the "real" compute cards.

Let's hope Raj gets fired and the graphics team gets an increased budget so they can make compute cards AND gaming cards!


----------



## Dave65 (Sep 4, 2017)

_Flare said:


> Investment in better Drivers *WASTED*
> Investment in Mantle and Vulkan and DX12 *WASTED*
> Investment in better Collaboration with Game- and Engine-Makers *WASTED*
> 
> Mining YES



THIS!


----------



## _Flare (Sep 4, 2017)

NVIDIA is holding back Volta because it will have tons of compute with its 7 TPCs per GPC.
The successor to the GTX 1080 could have 3584 cores with 4 GPCs.
I would bet on 5 TPCs per GPC again, but with new SMs (without Tensor Cores).
They have plenty of months for tuning now, because Vega failed so hard.
I bet GCN Navi doesn't reach the 1080 Ti either, even if Navi clocks at 2.5 GHz.


----------



## Liviu Cojocaru (Sep 4, 2017)

Bye bye AMD Radeon Group... welcome AMD Mining Group. I expect the next series of NVIDIA cards to start at $599.


----------



## idx (Sep 4, 2017)

A GPU should be more like a general-purpose chip for parallel compute tasks that can be used in any way a user needs. It is really bad what NVIDIA is doing with their stripped-down GPUs.

I really hope AMD will never follow that path. NVIDIA went so far with crippling GPUs that they swap shader files at the driver level and custom-patch drivers for specific games (isn't it supposed to be the developers' job to optimize their applications?).

I would love to see GPUs treated more like general-purpose processors, not some specific game renderer.

EDIT:

I mean, if I buy a GPU, I would like it to do the tasks that I throw at it. It is no one's business to interfere with what people do with their hardware (edit, render, mine, game, or even experiment with other stuff).


----------



## silentbogo (Sep 4, 2017)

Lol. Some random guy from Reddit claims 130 W usage based on a HWInfo screenshot, and behold - it's all over the net!
I was reading about it yesterday, and even WCFTech [!!!]... just think about it, WCFTech did a follow-up/fact-check on those claims:


WCFTech.com said:

> *[Update 6:35 PM Sunday, September 3, 2017 ]*: We finally got to do some power measurements of our own, using a power meter. Our test bed idles at 138 watts and the entire system consumes an average of about ~385 watts under load while mining. This yields a delta of 248 watts, which is obviously significantly higher than what the redditor claims to have achieved with his system.


In context: they are measuring an RX Vega 64 with a 980 mV undervolt at 1130/1100 MHz (vs 1000/1100 @ 1000 mV?).

All things considered, 43.5 MH/s is still an impressive result, but in this context it is irrelevant. That power consumption number is total fiction from a delusional kid on Reddit, and until Vega finally hits the shelves at the promised prices, no one in their right mind is going to buy it. It would still take 6+ months for a complete return on investment at MSRP, and "f^&k that" at today's fictional retail price. In terms of perf/W, a pair of undervolted GTX 1060 6GBs makes more sense and is abundant in stores worldwide.

So, once again, there is no reason for miners to hunt for Vega until the price drops to MSRP, the shelves are stocked, and/or AMD optimizes the crap out of it to run ~60+ MH/s.
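The wall-measurement argument above reduces to a simple delta plus a payback formula. A sketch, where the idle/load watts are WCFTech's quoted figures, but the $800 card price, $3/day gross revenue, and $0.10/kWh electricity rate are hypothetical placeholders of mine, not numbers from the thread:

```python
# Wall-power delta, per the WCFTech methodology quoted above:
idle_w, load_w = 138.0, 385.0
card_delta_w = load_w - idle_w  # watts attributable to the mining load
print(f"Measured load delta: {card_delta_w:.0f} W")

def days_to_roi(card_price, revenue_per_day, power_w, price_per_kwh):
    """Naive payback time in days: gross revenue minus electricity cost.
    All inputs are assumptions supplied by the caller."""
    electricity_per_day = power_w / 1000.0 * 24.0 * price_per_kwh
    net_per_day = revenue_per_day - electricity_per_day
    if net_per_day <= 0:
        return float("inf")  # the card never pays itself off
    return card_price / net_per_day

# Hypothetical example: $800 card, $3/day gross, $0.10/kWh
print(f"{days_to_roi(800, 3.0, card_delta_w, 0.10):.0f} days to break even")
```

The point being that ROI is dominated by the card's street price, which is exactly why inflated retail pricing kills the mining case.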


----------



## idx (Sep 4, 2017)

Raevenlord@ please allow me to fix this for you:

"*RX Vega Isn't for Gamers*"

could be more like:

"*RX Vega Isn't for Gaming ONLY!*"

or maybe:

"*RX Vega Isn't JUST an NVIDIA-titles renderer ONLY!*"

oh... wait! Maybe:

"*RX Vega Isn't JUST a GameWorks renderer ONLY!*"
Ahh... nvm, my English isn't that good.


----------



## Vayra86 (Sep 4, 2017)

I would prefer to see a bit more restraint from TPU news. When we start linking random Reddit claims, this becomes a shitshow faster than you can imagine. Verify, let it bake for half a day, then reconsider whether it's still news, please.

If we want to follow every Reddit claim, we can just log on to Reddit, amirite?


----------



## bug (Sep 4, 2017)

So why did AMD feature gaming performance in their marketing slides then?


----------



## idx (Sep 4, 2017)

bug said:


> So why did AMD feature gaming performance in their marketing slides then?


Maybe because gaming is still an important part of all this?


----------



## renz496 (Sep 4, 2017)

idx said:


> A GPU should be more like a general purpose chip for parallel compute tasks that can be used in any way a user needs it to be. It is really bad what Nvidia is doing with their stripped off GPUs.
> 
> I really hope AMD will never ever follow that path. Nvidia went so far with crippling GPUs to the level that they swap shader files at the driver level and custom patch drivers for specific games ( isn't that supposed to be the developers problem to optimize their applications...).
> 
> ...



You know what's so funny about this? When NVIDIA started introducing more compute-oriented stuff with Fermi, people said NVIDIA was selling useless features to those buying the GPU for gaming purposes. Now that AMD did this, and NVIDIA makes two separate architectures for gaming and compute, they say what NVIDIA is doing is bad (crippling the GPU's capabilities). So in the end, everything NVIDIA does is wrong.


----------



## Vayra86 (Sep 4, 2017)

renz496 said:


> You know what's so funny about this? When NVIDIA started introducing more compute-oriented stuff with Fermi, people said NVIDIA was selling useless features to those buying the GPU for gaming purposes. Now that AMD did this, and NVIDIA makes two separate architectures for gaming and compute, they say what NVIDIA is doing is bad (crippling the GPU's capabilities). So in the end, everything NVIDIA does is wrong.



Nah, people are just cheap and want everything (super-efficient gaming performance + all the compute they can get + solid drivers for everything) at the price of a mid-range gaming GPU, even though there has been a gaming/pro market split since forever.

Meanwhile, Vega Frontier Edition is not in their systems either, and they waited out the first Titan to jump on the 780. You get it?

These are also the people who complain about Intel not making faster CPUs while there is no competitor in the world that can make them faster.


----------



## Rahmat Sofyan (Sep 4, 2017)

Good news...

Perhaps used RX 580s and RX 480s will flood the market soon...


----------



## Recus (Sep 4, 2017)

idx said:


> Raevenlord@ please allow me to fix this for you:
> 
> "*RX Vega Isn't for Gamers*"
> 
> ...



No Vega definitely isn't for gamers. AMD used Geforce cards in their Gamescom booth.




Spoiler: pics


----------



## FordGT90Concept (Sep 4, 2017)

Radeon versus GeForce has really become more like RISC versus ASIC.  Both designs have their pros and cons.  Vega is as good at compute as Pascal is at gaming; the reverse is untrue.


----------



## Aldain (Sep 4, 2017)

What a bad title


----------



## bug (Sep 4, 2017)

FordGT90Concept said:


> Radeon versus GeForce has really become more like RISC versus ASIC.  Both designs have their pros and cons.  Vega is as good at compute as Pascal is at gaming; the reverse is untrue.


Depends on which Pascal you're looking at


----------



## FordGT90Concept (Sep 4, 2017)

GP100 is slower than Vega 64 in compute. The only way GP100 comes out on top is in memory-intensive tasks, thanks to its 4096-bit bus.


----------



## dimitrix (Sep 4, 2017)

Fake news... power consumption is almost double, near 250 W.

So you can get 50 MH/s using 2x RX 570 with less power...


----------



## _Flare (Sep 4, 2017)

https://www.nvidia.com/content/PDF/...DIA_Fermi_Compute_Architecture_Whitepaper.pdf

> Efforts to exploit the GPU for non-graphical applications have been underway since 2003. By using high-level shading languages such as DirectX, OpenGL and Cg, various data parallel algorithms have been ported to the GPU. Problems such as protein folding, stock options pricing, SQL queries, and MRI reconstruction achieved remarkable performance speedups on the GPU. These early efforts that used graphics APIs for general purpose computing were known as GPGPU programs.
> 
> While the GPGPU model demonstrated great speedups, it faced several drawbacks. First, it required the programmer to possess intimate knowledge of graphics APIs and GPU architecture. Second, problems had to be expressed in terms of vertex coordinates, textures and shader programs, greatly increasing program complexity. Third, basic programming features such as random reads and writes to memory were not supported, greatly restricting the programming model. Lastly, the lack of double precision support (until recently) meant some scientific applications could not be run on the GPU.
> 
> To address these problems, NVIDIA introduced two key technologies - the G80 unified graphics and compute architecture (first introduced in GeForce 8800®, Quadro FX 5600®, and Tesla C870® GPUs), and CUDA, a software and hardware architecture that enabled the GPU to be programmed with a variety of high level programming languages. Together, these two technologies represented a new way of using the GPU. Instead of programming dedicated graphics units with graphics APIs, the programmer could now write C programs with CUDA extensions and target a general purpose, massively parallel processor. We called this new way of GPU programming "GPU Computing" - it signified broader application support, wider programming language support, and a clear separation from the early "GPGPU" model of programming.

So GPGPU is dead, long live GPU Computing... (addition for the archives)


----------



## T4C Fantasy (Sep 4, 2017)

silentbogo said:


> Lol. Some random guy from reddit claims 130W usage based on HWInfo screenshot and behold - it's all over the net!
> I've been reading about it yesterday, and even WCFTech [!!!]... just think about it, WCFTech did a follow-up/fact checking on those claims:
> 
> In context: they are measuring RX Vega64 with 980mV undervolt at 1130/1100 MHz (vs 1000/1100 @1000mV?).
> ...



It's not hard to play with voltages to get a Vega 64 to mine at 130 W...

That's my only problem with the post... 130 W is so easy to hit if you know how to undervolt and clock.


----------



## DeathtoGnomes (Sep 4, 2017)

dimitrix said:


> Fake news... power consumption is almost double, near 250 W.
> 
> So you can get 50 MH/s using 2x RX 570 with less power...


Don't be silly, ever heard of undervolting? Get out and visit more tech and review sites! There have been claims of wattage down to under 120 W.

Beep-beep, cluebus leaving...


----------



## _Flare (Sep 4, 2017)

idx said:


> A GPU should be more like a general purpose chip for parallel compute tasks that can be used in any way a user needs it to be. It is really bad what Nvidia is doing with their stripped off GPUs.
> 
> I really hope AMD will never ever follow that path. Nvidia went so far with crippling GPUs to the level that they swap shader files at the driver level and custom patch drivers for specific games ( isn't that supposed to be the developers problem to optimize their applications...).
> 
> ...



1. GCN will never be able to come close to GeForce at the top tier.
2. A totally compute-focused GPU that can game too will never come close to a pure pixel monster that can compute too, not in games and not with an efficient approach.
3. GCN just isn't as modular as GeForce is.
Some pictures of what NVIDIA can mix and match:
PDF from "Tesla G80" to Volta GeForce:
https://www.file-upload.net/download-12692808/nvidia_SMX_GM104_GP100_GV100.pdf.html

And now remember what happened from the HD 7970 to Tonga to Fiji to Vega. Sad, huh?


----------



## ZoneDymo (Sep 4, 2017)

Are we just going to ignore that the Vega 56 is just as good as a GTX 1070, if not better, for the same money?

I mean, sure, AMD is much better in another area, but just because a card does as well or even a bit worse in some area does not mean it is suddenly worthless for that area...

Yet I get that sort of vibe from TPU as of late, and I don't know where it's coming from.
Or have fanboys successfully overhyped Vega so that the outcome could only be disappointment, so that we can now systematically set AMD aside for no real reason?


----------



## _Flare (Sep 4, 2017)

Well, as with the Fiji Nano vs. the GTX 980, a Nano Vega will show IF a sub-180 W GPU can beat the GTX 1080, too.

And Navi will only scale with MHz like the architectures before it; we will NOT see IPC gains in games.
And IF Navi uses MCMs, with doubled or quadrupled frontends, the driver alone will have to do the sync and could easily eat 4 CPU cores.


----------



## okidna (Sep 4, 2017)

Recus said:


> No Vega definitely isn't for gamers. AMD used Geforce cards in their Gamescom booth.
> 
> 
> 
> ...



Damn mining craze, even AMD can't get stock of their own cards!


----------



## silentbogo (Sep 4, 2017)

T4C Fantasy said:


> It's not hard to play with voltages to get a Vega 64 to mine at 130 W...
> 
> That's my only problem with the post... 130 W is so easy to hit if you know how to undervolt and clock.


My problem is not the 130 W figure; my problem is that the guy shows a screenshot of HWInfo displaying 103 W GPU core and 43 W VRAM, and screams 130 W mining.
And this is wrong on at least several levels:
1) 103 + 43 ≠ 130
2) Software readings are inaccurate, especially when it comes to Vega
3) The card's power consumption is a lot more than just the sum of core and VRAM power

So, while it may somehow be possible to run a Vega 64 at 130 W for mining, it is not possible in the current state to run 43.5 MH/s at 130 W...
...because science.

The most realistic approximation I can get off the top of my head is ~180 W at 1000 mV vCore with a 1 GHz GPU downclock, given that you don't attempt to overclock the HBM.
We do have some Vega owners on TPU who can probably verify my guesstimate.
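That accounting can be written out explicitly; a rough sketch, where the 85% VRM efficiency and 10 W of miscellaneous overhead are generic assumptions of mine, not measured values:

```python
# Software-reported rail power vs. a plausible whole-board estimate.
core_w = 103.0  # HWInfo "GPU core" reading from the Reddit screenshot
vram_w = 43.0   # HWInfo VRAM reading from the same screenshot

rails_w = core_w + vram_w
print(f"Reported rails alone: {rails_w:.0f} W (already above the claimed 130 W)")

# Board power also includes VRM conversion losses plus fans, controllers, I/O.
VRM_EFFICIENCY = 0.85     # generic assumption for a hot, heavily loaded VRM
MISC_OVERHEAD_W = 10.0    # assumed fan/controller/aux draw

board_w = rails_w / VRM_EFFICIENCY + MISC_OVERHEAD_W
print(f"Rough board estimate: {board_w:.0f} W")
```

Under those assumptions the estimate lands right around the ~180 W guesstimate above, which is why the 130 W headline number doesn't add up.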


----------



## Sempron Guy (Sep 4, 2017)

Our local tabloid newspaper could come up with a better title than this. Why stoop so low?


----------



## Camm (Sep 4, 2017)

People whinge about AMD abandoning gamers.

Gamers abandoned AMD a long time ago, even when its cards were faster and cheaper.

AMD's strategy will make sense in the long term: full-precision shaders are unnecessary for 90% of game tasks, so the shift to 16-bit shaders will be a boon both for compute-centric tasks and for gaming on those tasks.

Until then (if you're one of the 30% that buys a card faster than a 580), either you have other shit to do than just game, or you really like AMD tech no matter what.


----------



## qubit (Sep 4, 2017)

So Vega is a big old disappointment when it comes to gaming, but I have to hand it to AMD, they're very clever here. We must remember that their ultimate aim, like with any company, is to make a profit - to make money - and as much of it as possible, not to make customers happy, especially not the mainstream ones.

They've figured out that they can make a killing serving the mining market and not bother too much with satisfying gamers, so they've concentrated on creating great mining cards. You can even see this from Raja's comment, where Raevenlord says "in the argument between gamers and miners, Raja Koduri didn't have all that much to say, but for AMD, a sale is a sale". Quite. Sucks for us gamers, but I can't fault them for doing this as profit is the bottom line. I'd do exactly the same in their shoes. This strategy explains the use of that fast, expensive HBM on their cards, since it clearly helps with mining.

What we don't want is for NVIDIA to start doing the same, but I can see it happening...


----------



## EarthDog (Sep 4, 2017)

I thought i was on tpu, is this TMZ?


----------



## FordGT90Concept (Sep 4, 2017)

Camm said:


> People whinge about AMD abandoning gamers.
> 
> Gamers abandoned AMD a long time ago, even when its cards were faster and cheaper.


AMD did not abandon gamers and never has. Gamers always were the core of their audience and still are. Miners are extremely fickle and completely disloyal. Enterprise customers buy in large volumes, in spurts. Gamers are the only purchasers AMD can build a GPU company on, because they're reliable and relatively consistent (they respond to broad market forces rather than emotion).

As for what you speak of: AMD still commanded some 40% of the GPU market share among gamers. Technology products ebb and flow based on their technology, so AMD naturally fell behind when they were blindsided by Maxwell. Even so, AMD can price their products to still be competitive against NVIDIA, and they do, which is why they still command a healthy market share. AMD was blindsided by GP102 again. What's AMD's response? Vega 56: faster than the GTX 1070 by a large margin for about the same price. Their technology is still behind NVIDIA's, which shows in power consumption, but honestly, who cares? Most people make their purchase based on performance per dollar, not performance per watt.

AMD got knocked down again because NVIDIA outmaneuvered them again. They aren't out of the fight, not by a long shot.

Oh, and that RX Vega 56 bitch-slaps GP102 at compute. It's icing on the cake even if not particularly useful to gamers right now. There are still people happily playing new games on HD 7970 GHz cards. Radeon cards have untapped potential that is released with age. The same can't be said for GeForce. That raw compute power and architectural flexibility is the reason for that.


----------



## iO (Sep 4, 2017)

qubit said:


> So Vega is a big old disappointment when it comes to gaming, but I have to hand it to AMD, they're very clever here. We must remember that their ultimate aim, like with any company, is to make a profit - to make money - and as much of it as possible, not to make customers happy, especially not the mainstream ones.
> 
> They've figured out that they can make a killing serving the mining market and not bother too much with satisfying gamers, so they've concentrated on creating great mining cards. You can even see this from Raja's comment, where Raevenlord says "in the argument between gamers and miners, Raja Koduri didn't have all that much to say, but for AMD, a sale is a sale". Quite. Sucks for us gamers, but I can't fault them for doing this as profit is the bottom line. I'd do exactly the same in their shoes. This strategy explains the use of that fast, expensive HBM on their cards, since it clearly helps with mining.
> 
> What we don't want is for NVIDIA to start doing the same, but I can see it happening...



Sure, because they were able to predict an ASIC-resistant currency that only scales with memory bandwidth 3-5 years ago, when the development of Vega began.


----------



## qubit (Sep 4, 2017)

iO said:


> Sure, because they were able to predict an ASIC-resistant currency that only scales with memory bandwidth 3-5 years ago, when the development of Vega began.


You think you've cockily nailed my argument with a one-liner, don't you? You have not, smartypants.


----------



## vega22 (Sep 4, 2017)

Another flame/clickbait title, dude :|

So because someone took the time to develop and optimise a mining client, which shows the real power these cards have, it makes them not for gaming...


----------



## Vayra86 (Sep 4, 2017)

qubit said:


> You think you've cockily nailed my argument with a one liner don't you?  You have not, smartypants.



I'll take a stab at it then: to consider AMD (or Raja) smart enough to market this card for mining instead of gaming is way overestimating their capacity for good business. On top of that, it ISN'T good business when you release sub-top GPU performance over a year later than the competitor at the exact same price. It's stagnation, and nobody likes it. The only reason Vega is now in the picture for mining is that the ROI on NVIDIA stock (which mines more efficiently these days...) has become a lot less favorable, but mining is still profitable.

Don't attribute the mining craze to anything AMD is doing right now; really, that is complete idiocy. The only thing mining is, is Vega's saving grace, because AMD gets to sell these at a decent price and not go the way the Fury did - price drop after price drop, because nobody wants a loud AIO when the competitor has silent air. All AMD does is jump on the bandwagon out of pure necessity. There is no strategy here.


----------



## Raevenlord (Sep 4, 2017)

vega22 said:


> another flame/click bait title dude :|
> 
> so because someone took the time to develop and optimise a mining client, which shows the real power these cards have, it makes them not for gaming....



Would you say that current prices and availability are for gamers? That it's worth it to pay $800 for a Vega 64, or $600+ for a Vega 56, solely for gaming?

If you were able to purchase Vega 56 at MSRP, great. That's a pretty good gaming card, you'll be extremely satisfied - and you have objective reasons for being so.

If you purchased any Vega graphics card above MSRP, you may still feel great about your purchase, but objectively, it's almost certain that the comparable NVIDIA alternative is better in pure gaming price/performance/power terms.

Meanwhile, if you're mining, you're actually tapping into Vega's potential and strengths, which sadly, and I would love to be wrong, isn't reflected in its gaming prowess.

That's why these aren't for gamers right now. Your mileage may vary with personal opinion or your favorite manufacturer, sure. But objectively, in a technical review's price/performance/power-consumption comparison like you see here at TPU, that doesn't stand.


----------



## silentbogo (Sep 4, 2017)

Vayra86 said:


> Don't attribute a mining craze to anything AMD is doing right now, really, it is complete idiocy. The only thing mining is, is Vega's saving grace, because AMD gets to sell these at a decent price and not go the way the Fury did, price drop after price drop because nobody wants a loud AIO when the competitor has silent air. All AMD does is jump on the bandwagon out of pure necessity. There is no strategy here.



Agreed. Over the course of the Vega story, everyone at AMD tried to distance themselves from mining as far as possible, from Lisa Su stating that "mining is not taken into consideration" to Raja Koduri taking the NVIDIA approach with gaming and deep learning. Heck, even all the promotional events were centered around games, from the DOOM demo to dumping a ton of money into promoting Quake Champions and co-sponsoring QWC with both cash and hardware.
RX Vega is about as gaming as a gaming card can get; the only problem is not even performance, but the supply.

So far, out of the entire lineup of AMD's new cards, we have:
- depleted Polaris supply
- almost no Vega 56/64 supply
- very, very few Frontier Edition cards, in even fewer countries
- complete mystery about Radeon Instinct... (even the Fiji-based MI8 is not out yet, and the Vega-based MI25 was probably the first Vega to be announced)


----------



## EarthDog (Sep 4, 2017)

Raevenlord said:


> Would you say that current prices and availability are for gamers? That it's worth it to pay $800 for a Vega 64, or $600+ for a Vega 56, solely for gaming?


But an inflated price also doesn't make them NOT for gaming.

A valid argument can be made that it isn't for mining either, as the ROI at its current pricing doesn't balance out against some other cards.

See how that works... for both sides?



qubit said:


> You think you've cockily nailed my argument with a one liner don't you?  You have not, smartypants.


Instead of taking your ball and going home with a snarky comment, how about you respond with why you feel that isn't true... that's how forums should work.


----------



## renz496 (Sep 4, 2017)

FordGT90Concept said:


> GP100 is slower than Vega64 in compute. Only way GP100 comes out on top is in memory intensive tasks thanks to the 4096-bit bus.



GP100 is only slightly slower than Vega in FP32 (10 TFLOPS vs 13 TFLOPS), but as a pure compute chip, GP100 is still better, since its FP64 is rated at 1/2 of its FP32 rate while Vega's is rated at 1/16 (5 TFLOPS vs 0.8 TFLOPS). And we still have to consider how efficiently each architecture extracts its raw performance.
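The rated-throughput comparison follows directly from the FP64:FP32 ratios; a quick check using the peak figures quoted in the post above:

```python
# Peak FP64 throughput is just the FP32 rate scaled by the FP64:FP32 ratio.
def fp64_tflops(fp32_tflops, ratio):
    """Rated FP64 throughput given peak FP32 TFLOPS and the FP64:FP32 ratio."""
    return fp32_tflops * ratio

gp100_fp64 = fp64_tflops(10.0, 1 / 2)    # GP100: FP64 at 1/2 the FP32 rate
vega64_fp64 = fp64_tflops(13.0, 1 / 16)  # Vega 64: FP64 at 1/16 the FP32 rate

print(f"GP100:   {gp100_fp64:.1f} TFLOPS FP64")
print(f"Vega 64: {vega64_fp64:.2f} TFLOPS FP64")
```

Which reproduces the 5 TFLOPS vs ~0.8 TFLOPS figures: the FP32 gap is small, but the rated FP64 gap is roughly 6x.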


----------



## Vayra86 (Sep 4, 2017)

Raevenlord said:


> Would you say that current prices and availability are for gamers? That it's worth it to pay $800 for a Vega 64, or $600+ for a Vega 56, solely for gaming?
> 
> If you were able to purchase Vega 56 at MSRP, great. That's a pretty good gaming card, you'll be extremely satisfied - and you have objective reasons for being so.
> 
> ...



Still, I will side with most people here who feel titles like these are on the edge of clickbait material, or just a bit over it. I don't mind a bit of sarcasm and fun in the news posts - I really like it, actually - but it's a VERY fine line, and this one took it too far. I do still believe people are entitled to form their own opinion without being pushed from the onset towards a specific one. 

A good contrast: in earlier articles, you used the same sort of tone of voice but ended the article with a genuine question towards the opposite view. Much better that way, because it opens up the debate instead of steering it.


----------



## Raevenlord (Sep 4, 2017)

EarthDog said:


> But an inflated price also doesn't make them NOT for gaming.
> 
> A valid argument can be made that it isn't for mining either, as the ROI at its current pricing doesn't balance out against some other cards.
> 
> See how that works... for both sides?




It does work for both sides, but sale prices impact miners less than they do gamers. For a gamer, paying $800 for a Vega 64 that brings the exact same gaming experience (within 2%) as a $520 GTX 1080 is much, much worse than it is for a miner, who can recoup that money in ways other than gaming (which the gamer almost certainly never will).

Yes, it has a lower ROI than some other cards - but if it's still profitable for them, they'll do it, especially with these undervolts and overclocks that bring Vega's compute power to the table, even more so now. You also have to take pure performance-density considerations into account, since a single Vega is (arguably) more interesting than a pair of RX 580s, simply because you can get a single Vega in place of two RX 580s, thus achieving a smaller system footprint with the same - or almost equivalent - mining power.

That's why the argument works better for one side than the other, and that's why I insist it's a much better fit for miners than gamers.
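For what it's worth, the density argument can be put into numbers using the hash rates and wattages from the article (the Vega figures are the Reddit user's claims, so treat them as optimistic):

```python
# Hash-rate density: one Vega 64 vs. one tweaked RX 580, per the article's figures.
vega_mhs, vega_watts = 43.0, 130.0      # claimed undervolted Vega 64
rx580_mhs, rx580_watts = 30.0, 115.0    # typical tweaked RX 580

# Per PCIe slot and per watt, respectively:
print(f"Hash per slot: {vega_mhs / rx580_mhs:.2f}x")
print(f"Efficiency:    {(vega_mhs / vega_watts) / (rx580_mhs / rx580_watts):.2f}x")
```

Roughly 1.4x the hash rate per slot and 1.3x the MH/s per watt - if the claimed numbers hold up.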



Vayra86 said:


> I don't mind a bit of sarcasm and fun in the news posts - I really like it, actually - but it's a VERY fine line, and this one took it too far. I do still believe people are entitled to form their own opinion without being pushed from the onset towards a specific one.
> 
> A good contrast: in earlier articles, you used the same sort of tone of voice but ended the article with a genuine question towards the opposite view. Much better that way, because it opens up the debate instead of steering it.



That's an excellent point, and one I can side with. While I don't feel it's clickbait, I admit that I pushed my own view on the matter somewhat forcefully, and immediately, with that title. And while my original interpretation of the title didn't see it that way, I understand perfectly why readers might. 

As such, I will remove that excessive fat from the title, and leave this here for users to see how the change occurred.


----------



## renz496 (Sep 4, 2017)

Camm said:


> People whinge about AMD abandoning gamers.
> 
> *Gamers abandoned AMD a long time ago, even when its cards were faster and cheaper.*
> 
> ...



Is that so? If that were the case, AMD would not have gained market share when they had a 6-month lead with the 5870 (they even beat NVIDIA in market share back then). They also gained market share when they had a 3-month lead with the 7970. If AMD wants people to buy their GPUs, they need the fastest single-GPU crown for themselves. It was that simple.


----------



## EarthDog (Sep 4, 2017)

Raevenlord said:


> It does work for both sides, but sale prices impact miners less than they do gamers. For a gamer, paying $800 for a Vega 64 that brings the exact same gaming experience (within 2%) as a $520 GTX 1080 is much, much worse than it is for a miner, who can recoup that money in ways other than gaming (which the gamer almost certainly never will).
> 
> Yes, it has a lower ROI than some other cards - but if it's still profitable for them, they'll do it, especially with these undervolts and overclocks that bring Vega's compute power to the table, even more so now. You also have to take pure performance-density considerations into account, since a single Vega is (arguably) more interesting than a pair of RX 580s, simply because you can get a single Vega in place of two RX 580s, thus achieving a smaller system footprint with the same - or almost equivalent - mining power.
> 
> That's why the argument works better for one side than the other, and that's why I insist it's a much better fit for miners than gamers.



Better doesn't mean good, nor does it mean it isn't a gaming card.


----------



## cucker tarlson (Sep 4, 2017)

At least Vega is gonna have better resale value than GTX. Unless the cryptocurrency mining market plummets. Then Vega's resale value is going to go down dramatically.


----------



## vega22 (Sep 4, 2017)

Raevenlord said:


> Would you say that current prices and availability are for gamers? That it's worth it to pay $800 for a Vega 64, or $600+ for a Vega 56, solely for gaming?
> 
> If you were able to purchase Vega 56 at MSRP, great. That's a pretty good gaming card, you'll be extremely satisfied - and you have objective reasons for being so.
> 
> ...



i guess it really comes down to who you class in the "for gamers" bracket. most gamers are not spending even vega 56 (msrp) money, as the majority of gamers buy more entry-level cards. last i looked, the 1050 was the gamers' choice, as it was vastly outselling all other gpus. it's us enthusiasts who pay for the cards higher up the tree, and we're currently unable to get these cards for whatever purpose we want to use them for.


while i am not disputing the facts in the piece, i just feel the title is a large part of the reason people are already arguing in this thread. disputes which spill out across the whole forum and do not make it the friendly place it once was.


----------



## Raevenlord (Sep 4, 2017)

vega22 said:


> while i am not disputing the facts in the piece, i just feel the title is a large part of the reason people are already arguing in this thread. disputes which spill out across the whole forum and do not make it the friendly place it once was.



I agree with that, hence my removal of the offending words from the title =)

I've made my arguments here in the comment section, and I still believe that Vega currently isn't a good option for gamers/enthusiast gamers. However, the way it was conveyed wasn't the correct one, nor the one I want to have here on our site, so I prefer to take a step back instead of plowing through users' expectations.


----------



## silentbogo (Sep 4, 2017)

Raevenlord said:


> It does work for both sides, but sale prices impact miners less than they do gamers. For a gamer, paying $800 for a Vega 64 that brings the exact same gaming experience (within 2%) as a $520 GTX 1080 is much, much worse than it is for a miner, who can recoup that money in ways other than gaming (which the gamer almost certainly never will).


The price tag impacts everyone, but I don't believe it makes this in any way a non-gaming product. I can't remember a time when "overpriced" was a barrier for PC enthusiasts. People overpay by crazy amounts for everything, from constantly degenerating gaming mice to overengineered VRMs and ridiculously massive cooling solutions, from LEDs and "Limited Edition" color options to factory overclocks. Even for brand names...

In the case of miners, it's all about the price. Even if Vega is the best possible mining card on the market, it still does not make sense to opt for a 7-8 month ROI when less powerful options can turn a profit in 4-5 months, even at current inflated prices.
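The ROI math here is just card cost over net monthly profit; a tiny sketch with hypothetical numbers (the prices and profits below are placeholders for illustration, not figures from the thread):

```python
# ROI in months = card price / net monthly mining profit (revenue minus power).
def roi_months(card_price: float, monthly_profit: float) -> float:
    return card_price / monthly_profit

# Hypothetical example: the faster, pricier card can still pay itself
# off more slowly than a cheaper, slower one.
print(roi_months(700.0, 100.0))  # pricier card
print(roi_months(300.0, 70.0))   # cheaper card
```

Which is exactly the point: a higher hash rate doesn't help if the purchase price pushes payback out by months.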


----------



## EarthDog (Sep 4, 2017)

You guys need an editor who checks your articles before they publish. Too many times things have needed to be changed and recanted... I love the informed takes - a lot better than it was with other newsies in the past - but still... that title was.....


----------



## Th3pwn3r (Sep 4, 2017)

Raevenlord said:


> Would you say that current prices and availability are for gamers? That it's worth it to pay $800 for a Vega 64, or $600+ for a Vega 56, solely for gaming?
> 
> If you were able to purchase Vega 56 at MSRP, great. That's a pretty good gaming card, you'll be extremely satisfied - and you have objective reasons for being so.
> 
> ...




I'm in total agreement. However, these days you need to be a bit more specific to prevent any counter-argument, since people nitpick every little thing. Vega is not for gamers in the sense that GAMERS can CURRENTLY get a lot more performance for the money out of just about any NVIDIA card. Should Vega pricing get back to MSRP, it's a lot more appealing; like you said, the 56 is actually decent. The 64 still isn't good in my opinion, not unless the price comes down - power consumption producing heat being my main reason.


----------



## idx (Sep 4, 2017)

_Flare said:


> https://www.nvidia.com/content/PDF/...DIA_Fermi_Compute_Architecture_Whitepaper.pdf
> 
> Efforts to exploit the GPU for non-graphical applications have been underway since 2003. By using high-level shading languages such as DirectX, OpenGL and Cg, various data parallel algorithms have been ported to the GPU. Problems such as protein folding, stock options pricing, SQL queries, and MRI reconstruction achieved remarkable performance speedups on the GPU. These early efforts that used graphics APIs for general purpose computing were known as GPGPU programs.
> While the GPGPU model demonstrated great speedups, it faced several drawbacks. First, it required the programmer to possess intimate knowledge of graphics APIs and GPU architecture. Second, problems had to be expressed in terms of vertex coordinates, textures and shader programs, greatly increasing program complexity. Third, basic programming features such as random reads and writes to memory were not supported, greatly restricting the programming model. Lastly, the lack of double precision support (until recently) meant some scientific applications could not be run on the GPU.
> ...



OpenCL is a GPGPU API, and since OpenGL 4.3 there is a shader stage called the Compute Shader, so any developer can throw some general-purpose tasks at that stage of the pipeline.
Vulkan is also going to merge with OpenCL for GPGPU applications (sadly, all NVIDIA GPUs support only OpenCL 1.2, while Intel and AMD already support 2.1, with 2.2 on the way).

Khronos stated during SIGGRAPH 2017 that Vulkan is going to be not just a graphics API - Vulkan is a GPU API.


----------



## TheMailMan78 (Sep 4, 2017)

Nephilim666 said:


> No real issue with this from a consumer perspective. It would be great if AMD would market their products better so that people can be better informed.
> 
> tl;dr
> AMD for mining and compute
> NVIDIA for gaming


I would just be happy if AMD didn't lie about its pricing. I used to be an AMD/ATI fan, and I don't take kindly to liars. How can we be better informed when even the price they tell reviewers is a blatant lie?


----------



## TheoneandonlyMrK (Sep 4, 2017)

Raevenlord said:


> It does work for both sides, but sale prices impact miners less than they do gamers. For a gamer, paying $800 for a Vega 64 that brings the exact same gaming experience (within 2%) as a $520 GTX 1080 is much, much worse than for a miner, which can recoup those $ in other ways other than gaming (which the gamer almost certainly never will.)
> 
> Yes, they have lowered ROI than some other cards - but if it's still profitable for them, they'll do it, especially with these undervolts and overclocks that bring Vega's compute power to the table. even more so now; and you also have to take into account pure performance density considerations, since a single Vega is (arguably) more interesting than a pair of RX 580's, simply because you can get a single vega for 2x RX 580's, thus achieving a smaller system footprint with the same - or almost equivalent - mining power.
> 
> ...


I have a waterblocked, modded RX 480 (580-modded timings + BIOS) and a waterblocked RX Vega 64 in the same rig, with enough cooling for them and a bit more, and I cannot hit those values at that wattage. The memory downclocks if underpowered, so you have to push the slider to +20 to maintain that memory speed, even though it's undervolted and downclocked. Cards differ, but I'd suggest he's reading the wrong value, as does WCCF in a rare change from their normal hyperbole.
WCCF ran their own tests, which suggest that the Reddit user is wrong, as is the software he's using, HWiNFO64.
I tried the latest HWiNFO64 build, and the reported HBM memory watts drawn dropped from 100-165 W to never above 30 between builds.


And therein is the pertinent point: 90% of software does not work at all with Vega when it comes to tuning and monitoring, so anyone saying anything definitive about what Vega can or can't do is probably misinformed, including many reviewers and their overclocks. A power clamp is the only sure way I've seen of knowing Vega's power draw.

That's a full-stop comment: all the software lies.
All of it is unreliable, yet people are basing a lot of claims off of it - or less, word of mouth.


----------



## MrGenius (Sep 4, 2017)

Seriously!? This is 1000000000000000000000000000000000000000000000000% unsubstantiated BULLSHIT! It CANNOT be and HAS NOT been replicated or confirmed by anyone, and until it is, it SHOULD NOT be believed for half a second.

And YES, someone already tried to replicate and confirm the results - and came NOWHERE CLOSE!!!
http://wccftech.com/amd-rx-vega-64-...m-eclipsing-polaris-efficiency-factor-2x/amp/

Proving it's total BS!!!

Let me quote myself for clarity on the matter:


MrGenius said:


> Read the article carefully. They basically proved it was impossible with the amount of watts reportedly consumed.
> 
> 1 card + 248W = 43.8 MH/s
> 2 cards + 248W = 87 MH/s
> ...


https://www.techpowerup.com/forums/threads/so-long-vega.236740/


----------



## qubit (Sep 4, 2017)

EarthDog said:


> Instead of taking your ball and going home with a snarky comment, how about you respond with why you feel that isn't true... it's how forums should work.


I was just responding in kind; did you properly read what I said? If people think I "don't have an answer" (I do), then I don't really care either. Make a polite post to me and I'll happily discuss it. Snarky, and I won't bother. Simple.

I see that a friendly and reasoned reply from Vayra, I think, has disappeared, most likely because it was posted during the database transition. I never had a chance to read it properly, unfortunately, but I would have been happy to reply to it.


----------



## B-Real (Sep 4, 2017)

_Flare said:


> Nvidia is holding back Volta because it will have tons of compute with its 7 TPCs per GPC.
> The successor of the GTX 1080 could have 3584 cores with 4 GPCs.
> I would bet on 5 TPCs per GPC again, but with new SMs (without Tensor Cores).
> They have plenty of months for tuning now, because Vega failed so hard.
> I bet GCN Navi doesn't reach the 1080 Ti either, even if Navi clocks at 2.5 GHz.


What?
Vega reached the 980 Ti (1070) and even the 1080. Why are you drawing fake conclusions?


----------



## lemkeant (Sep 4, 2017)

Well, I have a Vega 64 I just installed (had it for a couple of weeks, but was waiting on my EK block). My FreeSync monitor kept me with AMD.

I tossed it in my NCase build, and the blower can barely keep up in the ITX case... FYI, in case others were curious. Anyway, my rig idles around 60-70 watts. Using the same settings this person did, I'm pulling about 255 watts and the same 42-43 MH/s. Doing the math, that puts the card around 180-190 watts. Definitely not 130 watts. 

Attaching a super quick screenshot


----------



## EarthDog (Sep 4, 2017)

qubit said:


> I was just responding in kind; did you properly read what I said? If people think I "don't have an answer" (I do), then I don't really care either. Make a polite post to me and I'll happily discuss it. Snarky, and I won't bother. Simple.
> 
> I see that a friendly and reasoned reply from Vayra, I think, has disappeared, most likely because it was posted during the database transition. I never had a chance to read it properly, unfortunately, but I would have been happy to reply to it.


lol....

Dear qubit, can you please kindly respond to the person with whatever information you have so as to clear up the misconceptions...?

Kthx.


----------



## Kissamies (Sep 4, 2017)

This toy-money craziness needs to stop. I want a card with an OK price for gaming, not a mid-range card with an insane price because toy-money miners buy up all the cards.

Luckily I have lots of Monopoly money.


----------



## Camm (Sep 4, 2017)

renz496 said:


> Is that so? If that were the case, AMD would not have gained market share when they had a 6-month lead with the 5870 (they even beat NVIDIA in market share back then). They also gained market share when they had a 3-month lead with the 7970. If AMD wants people to buy their GPUs, they need the fastest single-GPU crown for themselves. It was that simple.



Nvidia still outsold AMD that half. It's the closest AMD ever came to taking the lead in market share, but even against the 200 series, AMD couldn't outsell Nvidia. Which is sad, because the 5xxx series was awesome.


----------



## ihog6hog (Sep 4, 2017)

This is UBIQ Coin, not ETH Coin.

UBIQ Coin uses low power.


----------



## qubit (Sep 5, 2017)

EarthDog said:


> lol....
> 
> Dear qubit, can you please kindly respond to the person with whatever information you have so as to clear up the misconceptions...?
> 
> Kthx.


That's some glorious sarcasm. 

You didn't get my point though.


----------



## evernessince (Sep 5, 2017)

silentbogo said:


> Lol. Some random guy from Reddit claims 130W usage based on an HWInfo screenshot, and behold - it's all over the net!
> I've been reading about it since yesterday, and even WCCFTech [!!!]... just think about it, WCCFTech did a follow-up/fact-check on those claims:
> 
> In context: they are measuring RX Vega64 with 980mV undervolt at 1130/1100 MHz (vs 1000/1100 @1000mV?).
> ...



There must be a lot of delusional people then, because it's been sold out since launch. FYI, Vega may not be an awesome gaming card, but it is still very good at professional work and mining.

"at today's fictional retail price. In terms of perf/W - a pair of undervolted GTX1060 6G's makes more sense and is abundant in stores worldwide."

Read the article; density is important to miners. In addition, GTX 1060 6GB cards are running about $300 right now.


----------



## EarthDog (Sep 5, 2017)

qubit said:


> That's some glorious sarcasm.
> 
> You didn't get my point though.


It wasn't... and I got it. Just looking for answers instead of playing games.


----------



## thesmokingman (Sep 5, 2017)

Recus said:


> No Vega definitely isn't for gamers. AMD used Geforce cards in their Gamescom booth.
> 
> 
> 
> ...




How are they going to show A/B comparisons if they don't use a B sample? Both NV and AMD use competitor products to show A/B comparisons. OMG, this is news!


----------



## sweet (Sep 5, 2017)

thesmokingman said:


> How are they going to show A/B comparisons if they don't use a B sample? Both NV and AMD use competitor products to show A/B comparisons. OMG, this is news!


It seems your sarcasm detector is broken, or the guy you quoted is actually an absolute retard.


----------



## TheMailMan78 (Sep 5, 2017)

So many butthurt AMD employees on TPU now. Here's a pro tip: build better products, hire a better marketing team, don't lie about price to reviewers. Believe it or not, people can do math, and hedging your bets with miners is a losing bet.


----------



## TheoneandonlyMrK (Sep 5, 2017)

TheMailMan78 said:


> So many butthurt AMD employees on TPU now. Here's a pro tip: build better products, hire a better marketing team, don't lie about price to reviewers. Believe it or not, people can do math, and hedging your bets with miners is a losing bet.


Troll much? Relevance: zero.
We're on about mining efficiency bullshit in this thread, man; get with the flow. You're just sounding like a butthurt hater who woke up on the wrong side of his bed - I do it myself sometimes.


----------



## dozenfury (Sep 5, 2017)

If you could buy the 580 and Vega at retail, the 580 would be the most cost-efficient based on pure mining speed (power aside).  A 580 at $229 and 30 MH/s is about $7.60 per MH/s, vs. about $9.30 for a Vega 56 at $400 or $11.60 for a 64 at $500 (assuming the 56 could also hit 43 MH/s).  

Whether the power savings are enough to offset that is going to be different for everyone, since power rates fluctuate so much.  It's 6.5 cents per kWh in my area, but other areas are up to double that.  And it would still take quite a while to make up the higher per-MH/s card cost at 15 W less draw than a 580.

But it's all kind of moot until you can actually buy the cards at retail, since it's going to be very tough to recoup costs at gouging prices.  And there's the roll of the dice that ether prices won't crash again, that difficulty doesn't shoot through the roof more than it already has, and that ether doesn't go proof-of-stake before you earn your money back.  A year ago, proof-of-stake was expected this past summer, so we're already kind of in bonus time.
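Sketching that math out (retail prices as given in the post; the 43 MH/s for the Vega 56 is the poster's assumption):

```python
# Upfront cost per MH/s at retail, then what a 15 W saving is worth per year.
cards = {
    "RX 580":  (229.0, 30.0),   # (price $, MH/s)
    "Vega 56": (400.0, 43.0),   # assuming the 56 also hits 43 MH/s
    "Vega 64": (500.0, 43.0),
}
for name, (price, mhs) in cards.items():
    print(f"{name}: ${price / mhs:.2f} per MH/s")

# 15 W less continuous draw, billed at $0.065/kWh, running 24/7:
yearly_savings = 0.015 * 0.065 * 24 * 365   # kW * $/kWh * hours/year
print(f"~${yearly_savings:.2f} saved per year")
```

At that electricity rate the 15 W saving is worth well under $10 a year, which is why the higher per-MH/s purchase price dominates the comparison.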


----------



## TheMailMan78 (Sep 5, 2017)

theoneandonlymrk said:


> Troll much? Relevance: zero.
> We're on about mining efficiency bullshit in this thread, man; get with the flow. You're just sounding like a butthurt hater who woke up on the wrong side of his bed - I do it myself sometimes.


100% relevance. Mining is a crap industry that AMD has decided to hitch its wagon to. Mining efficiency is kind of moot considering people don't even know what they are mining. AMD is playing to this stupidity, and it shows with all the hired fanboys in this thread.


----------



## TheoneandonlyMrK (Sep 5, 2017)

TheMailMan78 said:


> 100% relevance. Mining is a crap industry that AMD has decided to hitch its wagon to. Mining efficiency is kind of moot considering people don't even know what they are mining. AMD is playing to this stupidity, and it shows with all the hired fanboys in this thread.


I disagree, but hey-ho.
I don't think AMD aimed at mining; it's a result of trying to catch up with NVIDIA's GPGPU offerings, as it's clear AI will be big in the future.
Your first comment, "So many butthurt AMD employees on TPU now. Here's a pro tip: build better products, hire a better marketing team, don't lie about price to reviewers. Believe it or not, people can do math, and hedging your bets with miners is a losing bet."

just decries AMD's PR and pricing - nothing to do with mining efficiency, IMHO.
And with the tone you show again in a thread, plus that avatar, it sure seems like you have a chip on your shoulder and are leaning fanboi-style yourself.

AMD forgot to send my paycheck for the last 15 years; where do I complain?


----------



## ERazer (Sep 5, 2017)

cucker tarlson said:


> At least Vega is gonna have better resale value than GTX. Unless the cryptocurrency mining market plummets. Then Vega's resale value is going to go down dramatically.



who would buy a used up mining card?


----------



## DeathtoGnomes (Sep 5, 2017)

TheMailMan78 said:


> 100% relevance. Mining is a crap industry that AMD has decided to hitch its wagon to. Mining efficiency is kind of moot considering people don't even know what they are mining. AMD is playing to this stupidity, and it shows with all the hired fanboys in this thread.


If AMD really is counting on mining for profits, they need to do a better job of catering to that community. Hired fanbois aside.


----------



## R-T-B (Sep 5, 2017)

TheMailMan78 said:


> 100% relevance. Mining is a crap industry that AMD has decided to hitch its wagon to. Mining efficiency is kind of moot considering people don't even know what they are mining. AMD is playing to this stupidity, and it shows with all the hired fanboys in this thread.



I've been over this with you.  What they are mining is very well established.

I really doubt AMD hires any fanboys.  They don't have the money to do so.


----------



## cucker tarlson (Sep 6, 2017)

R-T-B said:


> I really doubt AMD hires any fanboys.  They don't have the money to do so.


Didn't they split off a third division for that? The RTG?





ERazer said:


> who would buy a used up mining card?


I was thinking of miners picking up your used Vega. I've seen people sell their used R9 290s from gaming rigs for 30-40% more than they're worth during the holidays.


----------



## yotano211 (Sep 6, 2017)

This thread is so full of poor mad angry gamers.


----------



## Vayra86 (Sep 6, 2017)

yotano211 said:


> This thread is so full of poor mad angry gamers.



Doubtful; 90% already have Pascal.


----------



## TheMailMan78 (Sep 6, 2017)

R-T-B said:


> I've been over this with you.  What they are mining is very well established.
> 
> I really doubt AMD hires any fanboys.  They don't have the money to do so.


Yeah I know "algorithms". Of what nobody knows.


----------



## Vayra86 (Sep 6, 2017)

TheMailMan78 said:


> Yeah I know "algorithms". Of what nobody knows.



Hot air of uncertain value. Kinda like farting..


----------



## R-T-B (Sep 6, 2017)

TheMailMan78 said:


> Yeah I know "algorithms". Of what nobody knows.



SHA256?  Equihash?  They are known.


----------



## ERazer (Sep 6, 2017)

R-T-B said:


> SHA256?  Equihash?  They are known.


Deleted; I'll just Google it.


----------



## TheMailMan78 (Sep 6, 2017)

R-T-B said:


> SHA256?  Equihash?  They are known.


YES! And why did the NSA make SHA256? What's its purpose? (Can't wait to hear this tap dance.)


----------



## R-T-B (Sep 6, 2017)

TheMailMan78 said:


> YES! And why did the NSA make SHA256? What's its purpose? (Can't wait to hear this tap dance.)



A simple math hash? What? Are you honestly arguing it computes something? (The algorithm is well known and doesn't; its creation was part of a public contest, and it was audited.)

Its purpose is cryptographic comparisons.

There really isn't a tap dance.


----------



## DeathtoGnomes (Sep 7, 2017)

I had thought SHA256 was/is part of that Windows 3.1 encryption security thing.


----------



## Aleks (Sep 8, 2017)

Hey guys!
Just wondering what mining software you're using with the Vegas?
I can't seem to get my Claymore miner to read the Vegas at all. It works fine with the 570s and 580s (and all the NVIDIAs), but it cannot recognize the Vega cards.
Any help would be awesome.


----------



## medi01 (Sep 8, 2017)

Nephilim666 said:


> tl;dr
> AMD for mining and compute
> NVIDIA for gaming



The $200 tax on an adaptive-sync monitor, if you go with an NVIDIA card, makes your "for gaming" point moot.

The main issue with AMD is that their GPUs are nowhere to be bought at a reasonable price, not that they don't perform well.


----------



## Vya Domus (Sep 8, 2017)

Camm said:


> Gamers abandoned AMD a long time ago, even when its cards were faster and cheaper.



Actually, it was the press that started throwing crap at them; as a result, gamers strayed away from AMD.

It's sad, but the only way AMD can recover is by doing exactly the same things that we hate NVIDIA for doing. In reality, it never really mattered whether they had faster and cheaper cards, because review sites always found vague and ridiculous reasons not to recommend their cards. Just look at reviews of the 290X and RX 480.



_Flare said:


> Nvidia is holding back Volta because it will have tons of compute with its 7 TPCs per GPC.
> The successor of the GTX 1080 could have 3584 cores with 4 GPCs.
> I would bet on 5 TPCs per GPC again, but with new SMs (without Tensor Cores).
> They have plenty of months for tuning now, because Vega failed so hard.
> I bet GCN Navi doesn't reach the 1080 Ti either, even if Navi clocks at 2.5 GHz.



Sorry to say this, but you don't really know what you are talking about.


----------



## Aquinus (Sep 8, 2017)

TheMailMan78 said:


> YES! And why did the NSA make SHA256? What's its purpose? (Can't wait to hear this tap dance.)


As @R-T-B said, for the same reasons as any other hashing function. SHA256 just has a longer digest, which means more possible unique values, reducing the chance of a collision (two inputs mapping to the same output).
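A minimal illustration of both points - fixed-length digest, publicly known algorithm - using Python's standard library:

```python
import hashlib

# SHA-256 maps any input to a fixed 256-bit (64 hex character) digest.
digest = hashlib.sha256(b"hello").hexdigest()
print(len(digest) * 4)   # digest length in bits
print(digest)

# Changing a single character yields an unrelated digest (avalanche effect),
# which is part of what makes accidental collisions astronomically unlikely.
print(hashlib.sha256(b"hellp").hexdigest())
```

There is nothing secret in the computation; the "NSA made it" angle doesn't change that anyone can verify a digest themselves.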


----------



## _Flare (Sep 9, 2017)

Vya Domus said:


> Actually, it was the press that started throwing crap at them; as a result, gamers strayed away from AMD.
> 
> It's sad, but the only way AMD can recover is by doing exactly the same things that we hate NVIDIA for doing. In reality, it never really mattered whether they had faster and cheaper cards, because review sites always found vague and ridiculous reasons not to recommend their cards. Just look at reviews of the 290X and RX 480.
> 
> ...



So, do you think 4.5 GHz on your FX-6300 is worth the same as the same GHz on a Ryzen or i7-7700K?
I think yes.

Do you think energy efficiency matters when talking about gaming?
I think yes, if you don't have electricity for free.
I think yes, if you want to cool your rig easily and quietly.

Do you think there is a factor "x" in per-clock performance when comparing NVIDIA GPU cores to AMD GPU cores?
I think the upper limit is about a 1:1.25 to 1:1.33 per-clock advantage for NVIDIA, on average, in games.

This is, by the way, one big point in the efficiency discussion: just compare the wattage, performance, efficiency, etc. of the 2048-core cards - the GTX 980 with the RX 470/RX 570 - and additionally the GTX 1070 with its 1920 cores.

Maybe you'll learn a bit.


----------



## Vya Domus (Sep 9, 2017)

_Flare said:


> So, do you think 4.5 GHz on your FX-6300 is worth the same as the same GHz on a Ryzen or i7-7700K?
> I think yes.
> 
> Do you think energy efficiency matters when talking about gaming?
> ...




A block of text that has nothing to do with what I said. I have no idea what you are even arguing about, or why.

My only comment to you was, and I quote:



> Sorry to say this, but you don't really know what you are talking about.



I wasn't arguing about anything; it was just an observation.


----------



## DonCaffe (Sep 17, 2017)

*PUMP UP THE GOLD MINING POWER  !!!!*

100 MH/sec/per core is the 1st breakthrough
1GH/sec/per core is the 2nd
Vega hyper cube with 100x100x100 cores will be ultimate gold mining machine.....


----------



## yotano211 (Sep 18, 2017)

DonCaffe said:


> *PUMP UP THE GOLD MINING POWER  !!!!*
> 
> 100 MH/sec/per core is the 1st breakthrough
> 1GH/sec/per core is the 2nd
> Vega hyper cube with 100x100x100 cores will be ultimate gold mining machine.....


Go home, you're drunk.


----------

