
AMD Radeon RX Vega 64 8 GB

AMD has stagnated since they gave up on actually developing something new, while Nvidia keeps pushing forward.
Imagine a die-shrunk Fiji clocked about 300 MHz higher, and it becomes obvious how little Vega really improves.

Vega is the first architecture designed when AMD had absolutely no money left. Even Fiji benefited from a respectable R&D budget; Vega did not.

Remember how the 7970 had features that allowed it to mature incredibly well, even three years after it came out? At the end of the day, Vega was designed to last Radeon at least five years. They therefore had to design an architecture that not only had substantial legs to grow into future tech (HBC, RPM, FP16, DSBR — quick packed-math sketch below the list), but that also had to be decent at compute and AI. That's a very tall order. Right now the tally is:

  • Great at compute and certain professional workloads
  • Serviceable gaming performance with accommodation for the tech future games will use.
  • Serviceable AI performance for the price. Certainly FAR better than older GCN parts, but nowhere near Volta. Probably the only use-case where I think it's fair to say that Vega is a stepping stone.
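
For anyone wondering what the RPM/FP16 entries in that list actually mean: Rapid Packed Math packs two FP16 values into each 32-bit register, so one instruction can work on both halves at once (the Vega ISA exposes this through packed ops like v_pk_add_f16). Here's a quick Python emulation of just the data layout; the values and the pack/unpack helpers are purely illustrative, not how the GPU executes it:

```python
import numpy as np

# Two FP16 operands per 32-bit register: the core idea behind
# Rapid Packed Math (2x FP16 throughput per 32-bit lane).
a = np.array([1.5, -2.25], dtype=np.float16)
b = np.array([0.5,  4.00], dtype=np.float16)

def pack2(v):
    # Pack an FP16 pair into a single uint32, low half first.
    bits = v.view(np.uint16).astype(np.uint32)
    return bits[0] | (bits[1] << 16)

def unpack2(word):
    lo = np.uint16(word & 0xFFFF)
    hi = np.uint16((word >> 16) & 0xFFFF)
    return np.array([lo, hi], dtype=np.uint16).view(np.float16)

# A real GPU would add both halves in one instruction; here we just
# emulate the round-trip to show that the pair fits in one 32-bit word.
result = unpack2(pack2(a + b))
print(result)  # [2.   1.75]
```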
 
This card is not designed to compete with the 1080 Ti. It is designed to compete with the 1080. Even a little research into AMD's statements, or W1zzard's conclusion at the end of his review, will confirm this.

It holds its own against the 1080, and then some. Power draw is slightly higher, but to the tune of around 20 W... that's really going to hurt your electric bill.

Great review, W1zzard.
 

I think it was designed to compete with Volta, and then AMD changed their mind late 2016 when it became clear that wasn't gonna happen lol.
 
I don't know. After two years, during which AMD effectively dismissed the high end because apparently that's not where the money is, AMD returns to the high end to give me... a slightly worse GTX 1080? Call me biased, but I just don't see this as an 86% card.


I think they were expecting more from process-node development than they got, and Vega is the result, coupled with trying to make a one-size-fits-all chip, much like Tonga ended up performing worse than Tahiti.

AMD has had, and still has, a serious fabrication problem; now it's more about commitments to MS and Sony, how badly they needed Zen to work (and they seem to have landed just off target with it), and mining drying up their Polaris supply. They're in a great position overall, but they keep screwing it up just enough that no one takes them seriously, clawing for market share by offering these at such low MSRPs and letting retailers gouge the public in favor of cryptominers.

For example, they could have come out and said Vega is a compute chip from the get-go, designed around large memory sizes for mining and many other tasks... but instead they keep pitching it as a great gaming card, despite making close to nothing once they've paid for the silicon, interposer, and HBM. There has to be a reason the boards are built so ruggedly, and it's not for gaming overclocks. Tile-based rendering only helped some, and if they have been polishing the drivers on final silicon for the last few months, they probably have a relatively short laundry list of bugs left, none of which will massively improve performance.

So, we are still making up excuses?
No, Vega10 is the gaming chip. The compute chip is known as Vega20 and is coming next year.

Where is the excuse? The card performs exactly where they said it would, and it's just as mediocre as I thought it would be. Same shit AMD always pulls: they could have made far more money off this release by pricing these twice as high and calling them mining cards, given how they are designed and built; instead they used poor marketing and let retailers screw buyers. Things like Rapid Packed Math can be used for shader work, but that requires more money and developer engagement than AMD will put in; it can, however, be used for compute and mining. The whole architecture is compute-oriented, with deeper pipelines and what appears to be more CPU-heavy driver interaction. The only redeeming features appear to be software things they could have given to Polaris cards, which would have made those worth more as gaming cards.
 
Where is the excuse? The card performs exactly where they said it would, and it's just as mediocre as I thought it would be.

I was referring to:
Vega is a compute chip that can also play some games.
We've heard this excuse many times the last two months, ever since people started to realize it would suck at gaming. Many have said things along the lines of "since it sucks in gaming, it was never meant for gaming". But no, Vega10 is primarily targeting gaming.

And it didn't quite land where AMD said it would. Where is the 4× performance per watt?

Same shit AMD always pulls: they could have made far more money off this release by pricing these twice as high and calling them mining cards, given how they are designed and built; instead they used poor marketing and let retailers screw buyers.
Mining has nothing to do with it, that's just another excuse.
 
On the Amazon best-selling CPU list, the Ryzen 7 1700X jumped to 2nd position (from somewhere around the top 20). Guess why.
 
I was referring to:

We've heard this excuse many times the last two months, ever since people started to realize it would suck at gaming. Many have said things along the lines of "since it sucks in gaming, it was never meant for gaming". But no, Vega10 is primarily targeting gaming.

And it didn't quite land where AMD said it would. Where is the 4× performance per watt?


Mining has nothing to do with it, that's just another excuse.


When is it not an excuse and instead a clear reflection of reality?

AMD has known for years that compute was big money, watched Nvidia, and put their money in that pot.

I put my money on AMD stock and pulled out before Vega and made money. Is that an excuse?
 
Take the 1080 Ti, put an AMD "La Vega 72" sticker on it, sell it a year later a little cheaper and a little warmer, and everyone is happy... :)
 
So, another meh graphics card from AMD that only matches the performance of the year-old GTX 1080, with lots of power draw and noise, especially irritating coil whine. I'll pass. Also, I don't know why they bother with that expensive HBM when GDDR5X does just fine.

Let's hope AMD can finally leapfrog NVIDIA sometime not too long in the future and give us some competition.

HAHA, then the power usage would be even higher!
 
From my perspective this is too little, too late from old mate AMD. There are compelling aspects to this card, like price/performance and mining, but too many drawbacks: heat, power, and being so late to bring this performance level to the playing field.

GTX 1080 owners have had this performance available to them for over 14 months now, while running cooler and using less power.

I really wanted AMD to nail this one, but I fear it just cements them as roughly 1.5 generations behind (specifically in terms of taking the top spot with a sound, efficient architecture). The way the landscape looks right now, they will not be able to compete with the outright performance of a new Volta card (considering they can't even compete with a 1080 Ti), let alone do it in the same ballpark of performance per watt. Typically the new "xx80" card is marginally faster than the previous generation's Ti, which would make a GTX 1180 faster than a 1080 Ti at lower power consumption.

They need to "Ryzen" their graphics segment: back to the drawing board, complete overhaul.

In the meantime we get some competitive prices and a shake-up in this segment, and that's better than nothing, especially for those invested in FreeSync.
 
The only reason these cards are compelling is if you happen to have a FreeSync monitor.
 
The only reason these cards are compelling is if you happen to have a FreeSync monitor.

I've actually been holding off on buying one because no AMD card could properly drive the FreeSync range at 4K. This Vega release will enable more sales than you'd expect, mate.
 
The only reason these cards are compelling is if you happen to have a FreeSync monitor.
I have a FreeSync monitor, though I also have a 1070 that's over a year old. Back then I was thinking of selling it because of the mining craze and getting myself a Vega 64, but the savings from buying a FreeSync monitor would be nullified by having to buy a new power supply. I don't want my ancient PSU running this power-hungry beast. :O
 
I am sorry, but the conclusion of this review does not make any sense. How can this card be highly recommended when the review then reads...

"Price-wise, the Radeon RX Vega 64 clocks in at $499, which is not unreasonable. It is basically priced the same as the GTX 1080, which does offer much better power/heat/noise levels."

Basically, the pricing IS PRETTY MUCH UNREASONABLE, although I understand the reason: just look at the GPU die size. It must cost an arm and a leg to produce those chips; the die is even bigger than the 1080 Ti's, and it still can't match it.

Just stay away from this card: either get a 1080 (you can even get pretty good deals on eBay, since the card is about 18 months old) or wait for Volta in a few months.
 
I wonder how long it will take before RTG starts using AI to design their chips. You know, constrain the input conditions and let it run through millions of possible scenarios in terms of performance/power consumption. In the end the design engineers will just sit there and pick whichever result looks best to them.

The only reason these cards are compelling is if you happen to have a FreeSync monitor.

Or if you need a personal space heater during winter and are too lazy to turn on the A/C.
 
This card is not designed to compete with the 1080 Ti. It is designed to compete with the 1080. Even a little research into AMD's statements, or W1zzard's conclusion at the end of his review, will confirm this.

It holds its own against the 1080, and then some. Power draw is slightly higher, but to the tune of around 20 W... that's really going to hurt your electric bill.

Great review, W1zzard.

I have always felt that the human mind is fascinating, and one thing fascinates me in particular: cognitive dissonance.

In order to assert your claim that the power draw is "slightly higher", not only did you have to compare under the most favorable conditions with the smallest differences, but you also based the calculation on consumption in Power Save mode, with its extra-low power draw. All the benchmarks have demonstrated that in Power Save mode, performance is not equal to the 1080's.

Just how much mental gymnastics do you have to do to say that power draw "is slightly higher, but to the tune of around 20 W" with a straight face?

It's pretty obvious here (https://www.techpowerup.com/reviews/AMD/Radeon_RX_Vega_64/29.html) that the power consumption is significantly higher than that. Cognitive dissonance is a fascinating thing indeed.
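
If anyone wants to sanity-check the electric-bill argument, here's a quick back-of-the-envelope. The 100 W gap, 4 hours a day of gaming, and $0.12/kWh are illustrative assumptions of mine, not figures from the review:

```python
# Rough yearly electricity cost of a GPU power gap.
# Hypothetical numbers: 100 W extra draw, 4 h gaming/day, $0.12/kWh.
extra_watts = 100
hours_per_day = 4
price_per_kwh = 0.12

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")
# -> 146 kWh/year -> $17.52/year
```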

Oh, and FYI, I am a fan of neither Team Red nor Team Green. In fact, I was hoping Vega would be better, because that would force Nvidia to offer more competitive products, which ultimately benefits consumers.
 
Dug through a whole lot of reviews. Here is what I get:

1. Most of the extra transistors were spent on improving frequency.

2. RTG sacrificed per-clock efficiency (IPC) relative to Fiji to improve frequency.

3. Everything else remained relatively the same.

4. GloFo's manufacturing simply cannot tame the power consumption at such high frequencies.


So in the end, RTG was essentially hoping to design a super-pumped-up Fiji while keeping the major components (4 async compute engines, 64 CUs, the ROPs) the same as Fiji. Unfortunately, this resulted in horrible efficiency, so RTG kept pushing the clocks higher, hoping it would at least look good in the raw performance numbers. The release of the 1080 Ti and Titan Xp crushed that hope as well. Just like Faildozer, GCN is simply too outdated for the current crop of applications, and pumping up the MHz won't save it. RTG badly needs a fresh new design.
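
Point 4 is basically just physics: dynamic power scales roughly with C·V²·f, and chasing higher clocks usually also means higher voltage, so power climbs much faster than frequency does. A rough illustration, with made-up round numbers rather than measured Vega figures:

```python
# Dynamic power scales ~ C * V^2 * f; chasing clocks costs V as well as f.
# Hypothetical round numbers for illustration only.
base_clock, base_volt = 1050, 1.00   # "Fiji-like" operating point
hi_clock,   hi_volt   = 1550, 1.20   # "Vega-like" boost point

scale = (hi_clock / base_clock) * (hi_volt / base_volt) ** 2
print(f"{hi_clock / base_clock:.2f}x clock -> {scale:.2f}x dynamic power")
# -> 1.48x clock -> 2.13x dynamic power
```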
 
And the Vega 64 is a bad overclocker. Again, the Vega 56 is the more sensible card of the two.
 
I am sorry, but the conclusion of this review does not make any sense. How can this card be highly recommended when the review then reads...

"Price-wise, the Radeon RX Vega 64 clocks in at $499, which is not unreasonable. It is basically priced the same as the GTX 1080, which does offer much better power/heat/noise levels."

I actually don't think that's fair. Depending on the games you play (and somewhat on the resolution), Vega 64 can easily average 10%+ faster than the 1080. Then factor in FreeSync, multiple use cases (mining, compute, professional programs), and the enormous amount of technology baked into the card that will undoubtedly yield some decent performance gains over time, and you have a card that I would say is the better choice if priced a tad below the 1080 (or even at the same price).

Not to mention FreeSync makes Nvidia straight-up not an option for many people. For instance, I have friends who look at the $200 "troll toll" for G-Sync and at how horrifically the 780 aged, and they just will not buy an Nvidia card for at least another generation.
 
Every generation, AMD lowers the bar. If this continues, a maxed-out Navi will only compete with a "GTX 2060".
 
@Captain_Tom, first, I was writing about the "all resolutions" graph, and yes, I wrote it the wrong way around for the HD 5870. It should be: the GTX 285 was about 17% slower (100% vs. 83%). That's correct.
 
I wonder how long it will take before RTG starts using AI to design their chips. You know, constrain the input conditions and let it run through millions of possible scenarios in terms of performance/power consumption. In the end the design engineers will just sit there and pick whichever result looks best to them.
That's not really conclusive: the Bulldozer die was laid out by an automated process, while the Zen die was laid out by hand... The input conditions are already constrained (you ask the AI for maximum efficiency, minimum latency, and maximum stable clock), but the variations in transistor layout form a huge output space, so standard AI like neural networks or genetic algorithms doesn't help much.
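
To show why the output space is the killer: searching over a couple of design knobs with a genetic algorithm is trivial, but a real layout has billions of such knobs. A toy sketch, with a completely made-up fitness function over just (clock, voltage):

```python
import random

# Toy genetic search over two design knobs (clock in MHz, voltage in V).
# The fitness function is completely made up; a real chip would need a
# full timing/power model, and real layout has billions of knobs.
def fitness(clock, volt):
    perf = clock                                  # pretend performance ~ clock
    power = volt ** 2 * clock / 1000              # pretend power ~ V^2 * f
    stable = clock <= 1000 + 600 * (volt - 0.8)   # made-up stability limit
    return perf / power if stable else 0.0

# Random initial population, then select-and-mutate for 100 generations.
pop = [(random.uniform(800, 1800), random.uniform(0.8, 1.3)) for _ in range(50)]
for _ in range(100):
    pop.sort(key=lambda g: fitness(*g), reverse=True)
    survivors = pop[:10]
    pop = survivors + [
        (c + random.gauss(0, 20), max(0.8, v + random.gauss(0, 0.02)))
        for c, v in random.choices(survivors, k=40)
    ]

best = max(pop, key=lambda g: fitness(*g))
print(f"best: {best[0]:.0f} MHz @ {best[1]:.3f} V, fitness {fitness(*best):.1f}")
```

Two knobs converge in milliseconds; the problem is that nothing about this scales to a full transistor-level layout.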
 