
AMD's RX Vega Launch Prices Might be Just Smoke and Mirrors

Meanwhile, the rest of the known world gets gouged up the wazoo for the Vega 64. Prices here are in excess of 1,000 dollars ($1,149.95 NZD, or $820.03 USD, to be exact), so no way in hell will I be paying that ridiculous amount of money.
 
I just want to get a Vega card to play with; I don't even mine.
 
...gonna pick up a second-hand Nvidia 10-series from someone caught up in the magic of new products...

 
It's quite simple.

The cards rule the wavy seas of mining. Miners are buying them at insane prices. This time, unlike with Polaris, a nice share goes to AMD, because it's a new card and not a rebrand.

It's too good for miners to skip, no matter what 'the community' thinks of it. I don't like it either, but mining exists.

Benchmarks are what you'd expect from early drivers and games that are a few years old. They'll get better in time. New games will have optimizations for Vega, whatever its market share turns out to be.

In the meantime, every unit produced is sold, which probably can't be said for the 1080 (Ti).

And for the guy sweating a lot, all because of a 100 W difference: try turning on your AC while gaming. It will consume about 9.6 kWh over 8 hours, while 8 hours of gaming will save you just 0.8 kWh on the world-famous, most power-efficient architecture in the universe, ever... Taking it to an even more ridiculous level, Vega will save you heating money in winter... WINTER IS COMING!

People *really* exaggerate the relatively small power saving, making it out to be such a huge deal. Most of us have at least one AC, don't we? And most of us don't have solar panels or heat pumps, do we? Those things save power, not the GPU/TPU...
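To put rough numbers on that argument, here's a quick back-of-the-envelope sketch in Python. The 1.2 kW AC draw is an assumed figure for a typical window unit; the 100 W delta and the 8-hour session come from the post above:

```python
# Back-of-the-envelope check of the AC vs. GPU power argument above.
# Assumed: a 1.2 kW window AC unit; the 100 W card-to-card difference
# and the 8-hour session come from the post.
AC_POWER_KW = 1.2    # assumed AC draw
GPU_DELTA_KW = 0.1   # 100 W difference between two cards
HOURS = 8            # one gaming session

ac_energy = AC_POWER_KW * HOURS      # 9.6 kWh
gpu_saving = GPU_DELTA_KW * HOURS    # 0.8 kWh

print(f"AC over {HOURS} h:         {ac_energy} kWh")
print(f"GPU saving over {HOURS} h: {gpu_saving} kWh")
print(f"The AC uses {ac_energy / gpu_saving:.0f}x more energy")
```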

As much as I'm not happy with what I'm seeing from the new Vega card, being a big-time AMD fan, I must admit I really don't care about power consumption. It's not really a big deal.
 
What miner is going to buy such an expensive and power-hungry card? Not many, I reckon...
There are millions of users out there with cheap electricity. I also have solar panels on the house, and I actually sell electricity back thanks to the surplus. If not for the stupid price...
 
AMD is a joke and their graphics division is run by a joke. Vega is a disaster any way you look at it. They haven't made any tangible improvements to their graphics IP since GCN came out back in 2012. Their Chinese design team sucks and they've got nothing going for them. Jim Keller and most of the people who had worked on Zen have left the company too. AMD is a sinking ship. I wouldn't buy any of their products because the competition is better and cheaper.
That comment shows just how stupid you must be. Cheaper and better?
Come back when you know something!
 
WTF. Still nothing available, and they are already priced $100 more in the US. AMD, what the hell are you doing? You know this same shit happened with Ryzen at launch. Can you get someone with experience in logistics to handle these huge product launches? Why can't you deal with Samsung or Micron for your supplies to keep products available? Who is going to pay this kind of money for a GPU with a 1000 W PSU requirement unless they're a real fanboy (like me)?!
 
It's all the dealers' fault if the prices are always different from the launch price, because money and all that.
I'm not gonna pay more just because a dealer is hungry for money.
 
Many shops in my country ask 240€ for an RX 560 4GB, 399€ for a GTX 1060, and, the best part, 584€ for an RX 570.

There are no smoke and mirrors here.
 
Ok, so it's gonna be one of those conversations...
You go on a lot about G-Sync as if adaptive sync is the number one reason to buy a graphics card. It's not. While it's nice to have and does an amazing job at improving smoothness when there are frame drops, ultimately you want a card that can sustain those high framerates without dropping them, or hardly ever. Yes, the CPU has to keep up, the game has to be coded efficiently, etc. I know this, so don't "correct" me on it.

I've also acknowledged that FreeSync has the advantage of having no price premium, but as AMD is touting this as the main reason to buy Vega, can't you see that this is a problem? They should be touting straight-up framerate performance, not compensation for the lack of it!

I do remember when the two systems were first tested, G-Sync was found to have the edge on performance, so the premium did buy you something better and is not only an "NVIDIA tax". With the evolution of FreeSync, I'm not sure how true that holds now. Ultimately, I just wish NVIDIA would move away from the proprietary route so adaptive sync would become a de facto feature for everyone, but that's not the situation now, unfortunately, so we just have to put up with it.

That hash rate range of 32-36 MH/s I quoted to you is from TPU's review, so where are you getting that 42 MH/s from? I don't know what you mean by dual mining it, either.

The card takes up two slots, not one, as you've stated.

  • So I just deleted the first paragraph of your argument. You basically just said "AMD is bad" for 4 sentences. Yup, AMD is late to the party - thanks for stating the obvious. We are here to talk about what we can buy NOW, not what should have been. After all, the 1080 Ti should have launched in 2016.
  • Your G-Sync paragraph also falls flat. Did you read anything I said? The main problem isn't even the price; the problem is that G-Sync is objectively a worse option than FreeSync: fewer options, fewer inputs, a higher price, and worse monitors in general. Samsung and LG won't even bother to make them because of how silly they are. For the love of God, consoles are about to start using adaptive sync on budget TVs. That's how far behind Nvidia is becoming - oh, and they want $200 for this vastly inferior technology (the crux of your entire argument).
  • The MH/s figure you quoted is from a bloody gaming website that has made fun of Bitcoin for a decade. Bitcoin is worth $4,200; TPU doesn't know what they are talking about. For reference, the 580 gets 23 MH/s at stock and up to 32 when tweaked. Do the math on Vega, noob. You do not know what you are talking about.
 
It seems like Vega is far from being the miners' El Dorado that you paint it to be.

I didn't. Can you read English?

I will quote myself because apparently you can't: "it is more efficient, you can fit twice as many in the same amount of space, and most importantly: they are in stock! As long as RX 580s are above $250, RX Vegas will be hard to find."


I will say it again for those who are reading at an elementary level: the RX 580 is going for $350+ right now. That makes Vega a steal at $500-$600.
 
What is the financial return on mining at a 35 MH/s rate per card? Genuine question. How long to make back $600? How much does it cost in energy per card per month?
Answers would be nice for Bitcoin and Ethereum.

Thanks.
 
What is the financial return on mining at a 35 MH/s rate per card? Genuine question. How long to make back $600? How much does it cost in energy per card per month?
Answers would be nice for Bitcoin and Ethereum.

Thanks.


Well, I personally don't worry about energy usage. I understand that in some European countries it is a major issue, but I live in the US. I will say, though, that since I started building mining rigs again I haven't noticed any increase in my energy bill (my July bill was actually lower than June's, lol).

Second, I think the people selling their coins the second they mine them are suckers. I mine the coins I believe have promise, and then I sell them slowly as they increase in value. I guess you could argue I am more of a venture capitalist than a pure miner, but in my opinion it's silly to just blindly mine and sell. After all, if you are selling a coin the second you mine it, why are you even bothering? Clearly you don't think it is worth anything...

But I will answer your question. If I combine the profits from dual mining Ethereum and Siacoin (again, Vega is especially good at mining two coins at once), then I would make roughly $130 per month. But again, that's at today's prices. I also use this card to game, and I can sell the extra cards for $400+ in a few months. So even if mining only breaks even, I will have already made a large profit if I sell the cards!
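For anyone who wants to sanity-check those numbers, here's a rough payback sketch in Python. The $130/month dual-mining revenue is the poster's own figure; the card price, power draw, and electricity rate are illustrative assumptions, not measurements:

```python
# Rough payback sketch for the numbers in the post above.
# $130/month dual-mining revenue is the poster's figure; card price,
# power draw, and electricity rate are illustrative assumptions.
CARD_PRICE = 600          # USD, upper end of the quoted $500-$600
REVENUE_PER_MONTH = 130   # USD, from the post, at today's coin prices
POWER_KW = 0.3            # ~300 W at the wall, assumed
RATE_USD_PER_KWH = 0.12   # assumed typical US residential rate

energy_cost = POWER_KW * 24 * 30 * RATE_USD_PER_KWH  # ~$26/month
net_profit = REVENUE_PER_MONTH - energy_cost
print(f"Energy cost: ${energy_cost:.0f}/month")
print(f"Net profit:  ${net_profit:.0f}/month")
print(f"Payback on ${CARD_PRICE}: {CARD_PRICE / net_profit:.1f} months")
```

At those assumed rates the card nets roughly $100/month and pays for itself in about six months, though obviously that swings with coin prices.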
 
The title reads like AMD has much say in what price retailers really charge :/

This is part of the reason some see you as anti-AMD, Raven :|

Shops will always charge as much as they find people are willing to pay. If anything, it looks like AMD is undercharging the retail chain for their cards.
 
The main problem isn't even the price; the problem is that G-Sync is objectively a worse option than FreeSync: fewer options, fewer inputs, a higher price, and worse monitors in general. Samsung and LG won't even bother to make them because of how silly they are. For the love of God, consoles are about to start using adaptive sync on budget TVs. That's how far behind Nvidia is becoming - oh, and they want $200 for this vastly inferior technology (the crux of your entire argument)

This is some next-level nonsense. LG and Samsung both offer G-Sync solutions, and FreeSync is far from "objectively" better. You're literally manufacturing information to argue against Nvidia products. I would be amazed if the percentage of gamers that use FreeSync or G-Sync was even over 5% of the entire gaming market. They are niche products from the get-go, and using them as some proof that AMD products are innately superior is idiotic. But please, continue to cite vague sources and make up numbers (like your claim that the Vega 64 will really mine at 44 MH/s) as the crux of your argument. I'm sure the next thing you'll tell me is that the Vega 56/64's compute technology will make it faster than the 1100 series in games released in 2020, based on data you scried while communing with the ancients.
 
The title reads like AMD has much say in what price retailers really charge :/

This is part of the reason some see you as anti-AMD, Raven :|

Shops will always charge as much as they find people are willing to pay. If anything, it looks like AMD is undercharging the retail chain for their cards.

The title is based on the story (linked in the OP) that a retailer is stating AMD's pricing is not $499, it's higher. The point being: if AMD releases a card to reviewers at a suggested price which they then inflate (AMD, not the retailers), then it is smoke and mirrors and dishonest.

If, however, AMD or a retailer can show the current MSRP, then it'll all be clear. It would be easy for AMD to clarify the MSRP so the blame can fall on the retailers. But Gibbo (honest, guv', he's not a bleedin' liar?) is saying that retailers will actually lose money on the cards if they had to sell them at $499, as AMD isn't offering the rebate now.
 
What is the financial return on mining at a 35 MH/s rate per card? Genuine question. How long to make back $600? How much does it cost in energy per card per month?
Answers would be nice for Bitcoin and Ethereum.

Thanks.

If that's Ethereum, that's $130 a month minus overhead.
 
If that's Ethereum, that's $130 a month minus overhead.

300 W of usage (single-card system) 24/7 for a full month would cost approx. £30 in the UK (based on an online tool). So the first year's cost, electricity plus the Vega itself, would be about £1,000+ (at 300 W power use). After that, obviously, it'd drop to under £400/yr to run.
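For anyone who wants to reproduce that £30 figure, here's the arithmetic in a short Python sketch; the unit rate is an assumed ballpark for a 2017 UK residential tariff, not a quoted price:

```python
# Reproducing the ~£30/month figure above; the unit rate is an
# assumed ballpark for a 2017 UK residential tariff.
POWER_KW = 0.3            # 300 W single-card system, from the post
HOURS_PER_MONTH = 24 * 30
RATE_GBP_PER_KWH = 0.14   # assumed UK rate

monthly_kwh = POWER_KW * HOURS_PER_MONTH       # 216 kWh
monthly_cost = monthly_kwh * RATE_GBP_PER_KWH  # ~£30
print(f"{monthly_kwh:.0f} kWh/month -> "
      f"£{monthly_cost:.0f}/month, £{monthly_cost * 12:.0f}/year")
```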

Hmmm... checks savings account...
 
If this is true...

Then AMD, you can keep your expensive, power-hungry radiators. I am gonna keep my GTX 1080 Ti.

o_O You have a GTX 1080 Ti; of course you're going to keep it. Why would you even have thought of not keeping it in the first place? I find this statement very bizarre.

AMD is a joke and their graphics division is run by a joke. Vega is a disaster any way you look at it. They haven't made any tangible improvements to their graphics IP since GCN came out back in 2012. Their Chinese design team sucks and they've got nothing going for them. Jim Keller and most of the people who had worked on Zen have left the company too. AMD is a sinking ship. I wouldn't buy any of their products because the competition is better and cheaper.

Jim Keller isn't owned by AMD; he was hired by AMD and then went on his merry way to another company to help them with a new design, and so on. So he couldn't have "left" the company when he was never owned by AMD in the first place. He is basically a subcontractor, and a bloody legend really, lol.
 
What if... what if this is all just fine once the primitive discard accelerator and HBCC get fully utilized, and the 56 ends up on par with or faster than the 1080 on average, and the 64 close to the 1080 Ti?
 
o_O You have a GTX 1080 Ti; of course you're going to keep it. Why would you even have thought of not keeping it in the first place? I find this statement very bizarre.

It's easy to understand. I was holding off, holding off, holding off on upgrading. When Ryzen came out, I pulled the trigger and switched to that from Intel. I had wanted to see Vega, but it took sooo bloody long to come out. I did the same way back when the Fury X came out close enough to the 980 Ti. If Vega 64 had been really good, I'd have considered it, and had my financial situation been better, I might have sidegraded or (given some of the hype from a few members) upgraded from a 1080 Ti to Vega.

But alas, it's way worse than people were expecting as little as 2-3 months ago. If you go back through the threads you'll see people (myself included) expecting it to come close to the 1080 Ti (within 10%), just like the Fury X was close to the (stock) 980 Ti. Some were saying (and that includes Captain Tom) that it would beat the 1080 Ti. Unfortunately, there is a lot of damage control being done by people who have historically supported AMD on TPU, and it's very much like Orwell's 1984, where the written past can simply be abolished. Even in this thread there are mistruths from both sides that can easily be disproved (G-Sync monitors not being made by LG or Samsung, when they do make them, as an example; or HBM being useless, which it is not).

I think a rule should be made that when you outright lie about something, or take a stance that is counter to something you argued about passionately, you get a ban. Unless, of course, you have the grace to say "I was wrong". It could be called the Armstrong Rule, after Lance, who actually destroyed the careers of some of the people who accused him of doping when, all the time, he was. I'm exempt from that rule because I reserve the right to be wrong about anything at all times.
 
As has already been mentioned a hundred times, AMD doesn't have enough money to R&D two architectures: one for compute, another for gaming. That's why they are left selling compute cards as gaming cards.

NVIDIA has been developing two architectures since Kepler; their last universal architecture was Fermi, and we all remember it fondly.



WTF are you talking about? RX Vega's mining performance results have already been published everywhere. It sucks at that. A pair of GTX 1070s (vs. an RX Vega 64) will be like 40% faster, cheaper, and will consume a lot less power.

With Kepler, NVIDIA still used one architecture for all purposes. If you look at GK110 and GK104, the SM arrangement is still the same; it's just that GK104's FP64 performance was limited even more than GK110's (Fermi's GF100 and GF104 are also like this). But starting with Kepler we did see that NVIDIA was willing to make products specific to the professional lineup: GK210, for example, is exclusive to Tesla products. Maxwell is a pure FP32 architecture. Only with Pascal did NVIDIA truly make separate architectures for compute and gaming: GP100 has 64 CUDA cores per SM, while GP102 and the rest have 128 CUDA cores per SM (similar to Maxwell).
 
What if... what if this is all just fine once the primitive discard accelerator and HBCC get fully utilized, and the 56 ends up on par with or faster than the 1080 on average, and the 64 close to the 1080 Ti?

Does the primitive discard accelerator work automatically in games? If not, then not many games are going to take advantage of it. As for HBCC, the setting is turned off by default. According to AnandTech (who asked AMD for clarification), the feature should be turned off by default because it can cause a performance hit IF the game's VRAM usage does not exceed the amount of VRAM available on the card.
 