Thursday, August 3rd 2017

AMD RX Vega 56 Benchmarks Leaked - An (Unverified) GTX 1070 Killer

TweakTown has put forth an article wherein they claim to have received information from industry insiders regarding the upcoming Vega 56's performance. Remember that Vega 56 is the slightly cut-down version of the flagship Vega 64, carrying 56 next-generation compute units (NGCUs) instead of Vega 64's, well, 64. This means that while Vega 64 has the full complement of 4,096 Stream processors, 256 TMUs, 64 ROPs, and a 2,048-bit wide 8 GB HBM2 memory pool offering 484 GB/s of bandwidth, Vega 56 makes do with 3,584 Stream processors, 224 TMUs, 64 ROPs, the same 8 GB of HBM2 memory, and slightly lower memory bandwidth at 410 GB/s.
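As a quick sanity check on those figures: in GCN-style designs, each compute unit carries 64 stream processors and 4 texture units, and memory bandwidth follows from bus width and per-pin data rate. A minimal sketch (the per-pin HBM2 rates here are our assumption, inferred from the quoted bandwidth figures):

```python
# Back-of-the-envelope check of the quoted Vega specs.
# Assumes 64 stream processors and 4 TMUs per compute unit (GCN-style),
# plus a 2,048-bit HBM2 bus. The per-pin rates are inferred from the
# quoted 484/410 GB/s figures, not confirmed by the leak.

def vega_specs(cus, pin_rate_gbps, bus_width_bits=2048):
    stream_processors = cus * 64                        # SPs per compute unit
    tmus = cus * 4                                      # texture units per CU
    bandwidth_gbs = bus_width_bits * pin_rate_gbps / 8  # bits/s -> bytes/s
    return stream_processors, tmus, bandwidth_gbs

print(vega_specs(64, 1.89))  # (4096, 256, 483.84) -> ~484 GB/s
print(vega_specs(56, 1.60))  # (3584, 224, 409.6)  -> ~410 GB/s
```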

The Vega 56 has been announced to retail for about $399, or $499 with one of AMD's new (famous or infamous, depending on your mileage) Radeon Packs. The RX Vega 56 card was running on a system configured with an Intel Core i7-7700K @ 4.2 GHz, 16 GB of DDR4-3000 memory, and Windows 10, at 2560 x 1440 resolution.
The results in a number of popular games were as follows:

Battlefield 1 (Ultra settings): 95.4 FPS (GTX 1070: 72.2 FPS; 32% in favor of Vega 56)
Civilization 6 (Ultra settings, 4x MSAA): 85.1 FPS (GTX 1070: 72.2 FPS; 17% in favor of Vega 56)
DOOM (Ultra settings, 8x TSAA): 101.2 FPS (GTX 1070: 84.6 FPS; 20% in favor of Vega 56)
Call of Duty: Infinite Warfare (High preset): 99.9 FPS (GTX 1070: 92.1 FPS; 8% in favor of Vega 56)
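For reference, those "in favor" percentages are just the relative frame-rate deltas. A quick script to check them (the results land within a point of the article's rounded figures):

```python
# Relative performance deltas from the leaked averages (Vega 56 vs. GTX 1070).
results = {
    "Battlefield 1": (95.4, 72.2),
    "Civilization 6": (85.1, 72.2),
    "DOOM": (101.2, 84.6),
    "CoD: Infinite Warfare": (99.9, 92.1),
}
for game, (vega, gtx) in results.items():
    delta = (vega / gtx - 1) * 100
    print(f"{game}: +{delta:.1f}% in favor of Vega 56")
# Battlefield 1: +32.1%, Civilization 6: +17.9%, DOOM: +19.6%, CoD: +8.5%
```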

If these numbers ring true, NVIDIA's GTX 1070, whose average pricing stands at around $460, will have a much-reduced value proposition compared to the RX Vega 56. The AMD contender (which did arrive a year after NVIDIA's Pascal-based cards) delivers around 20% better performance (at least in the admittedly sparse games line-up) while costing around 15% less in greenbacks. Coupled with a lower cost of entry for a FreeSync monitor, and the possibility for users to get even more value out of a particular Radeon Pack they're eyeing, this could potentially be a killer deal. However, I'd recommend you wait for independent, confirmed benchmarks and reviews in controlled environments. I dare say you won't need to look much further than your favorite tech site on the internet for that, when the time comes.
Source: TweakTown

169 Comments on AMD RX Vega 56 Benchmarks Leaked - An (Unverified) GTX 1070 Killer

#126
Sasqui
Captain_TomHate to say it, but I don't think mining will ever "peter out". For sure there will be periods of boom and bust, but mining is here to stay, buddy.

And if you think about it, it was only a matter of time before some program found a way to make money off of the massive computational power modern GPUs have.
We shall see, and no... cryptocurrency is not going away; it's simply a highly speculative commodity.

And I'm not your buddy, that's just plain gay.
Posted on Reply
#128
birdie
DeathtoGnomesI'm convinced. There are too many posters in this thread being paid specifically to attack other people and hijack the discussion.

As for me, the unverified results are nice, considering AMD hasn't been in such a position in a long time. I think Nvidia will up its game with Volta, which might mean a delayed release until they prove they can retake the top dog slot without much fanfare.
I guess you need to go see a shrink ASAP. Next you're gonna say that Jensen Huang frequents these forums and slanders AMD in his spare time. :)

I still hope you were joking.
Posted on Reply
#129
efikkan
Vya Domus
efikkan- HBC is relevant for pure compute workloads, but not for gaming.
You are so wrong, but just as always you're going to ignore facts and carry on.
Then please focus on the facts instead of personal attacks.

A predictive cache algorithm can only detect linear access patterns, just like a prefetcher does in a CPU. But it can't predict accesses when there are no patterns, because there is no way to predict random accesses. That's why HBC would work fine for linear traversal of huge datasets, but it wouldn't work for game rendering, where the access patterns vary by game state, camera position, etc.
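To illustrate that point with a toy example (this is a generic stride detector, not how Vega's HBCC actually works): a linear stream extrapolates trivially, while random accesses give the predictor nothing to work with.

```python
import random

def predict_next(history):
    """Toy stride prefetcher: if the last three accesses share a constant
    stride, extrapolate the next address; otherwise give up."""
    if len(history) < 3:
        return None
    stride = history[-1] - history[-2]
    if history[-2] - history[-3] != stride:
        return None
    return history[-1] + stride

linear = [0, 64, 128, 192]                                   # streaming a large dataset
random_walk = [random.randrange(1 << 20) for _ in range(4)]  # e.g. camera-dependent lookups

print(predict_next(linear))       # 256 -- predictable, prefetching wins
print(predict_next(random_walk))  # almost certainly None -- nothing to predict
```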
Captain_TomVolta is professional-only at the moment, and I don't expect that to change till MAYBE the end of 2018. In fact I am pretty sure Nvidia confirmed the next series is another Maxwell... cough... Pascal refresh (but likely with more GDDR5X/6). It will be stronger, but lack all of these features.

The only exception I can think of is possibly a cut-down Volta sold as a Titan card for $1500 - $2000.
GV102 and GV104 are already taped out, and the first test batch will arrive soon. So unless Nvidia runs into problems like it did with Fermi, they could be released anywhere from 5 to 10 months from today.
Posted on Reply
#130
Assimilator
Captain_TomNo surprise here.

$400, and it slightly loses to the 1080 while having FAR better long-term technology. This will sell well, and Vega64 should be at least 15% stronger than this!
"Long-term technology" what is this BS you're spouting? There's no such thing as future-proofing when it comes to GPU technology - a GPU is relevant for a year, year-and-a-half, maybe even 2 years and then it's just too slow for gaming.
Posted on Reply
#131
Vya Domus
efikkanThen please focus on the facts instead of personal attacks.

A predictive cache algorithm can only detect linear access patterns, just like a prefetcher does in a CPU. But it can't predict accesses when there are no patterns, because there is no way to predict random accesses. That's why HBC would work fine for linear traversal of huge datasets, but it wouldn't work for game rendering, where the access patterns vary by game state, camera position, etc.
I literally posted a presentation from AMD themselves where they showcased HBC in action in a game. Yet you insist it isn't possible. Right, AMD knows nothing about it and you do. You know, just because you bring up all that technical stuff that hasn't got much to do with the subject, it doesn't really help you prove that you know what you are talking about.

Facts? All you do is ignore them, mate. :laugh: Just like you did in our previous discussion.

We got you, bro. AMD hasn't got a clue about what they're doing and you do.
Posted on Reply
#132
oxidized
Vya DomusI literally posted a presentation from AMD themselves where they showcased HBC in action in a game. Yet you insist it isn't possible. Right, AMD knows nothing about it and you do. You know, just because you bring up all that technical stuff that hasn't got much to do with the subject, it doesn't really help you prove that you know what you are talking about.

Facts? All you do is ignore them, mate. :laugh: Just like you did in our previous discussion.

We got you, bro. AMD hasn't got a clue about what they're doing and you do.
So basically we can't trust reviews and reviewers, but we SHOULD trust the company itself? I mean, how can you not be trolling?
Assimilator"Long-term technology" what is this BS you're spouting? There's no such thing as future-proofing when it comes to GPU technology - a GPU is relevant for a year, year-and-a-half, maybe even 2 years and then it's just too slow for gaming.
Hey, you're wrong! AMD has the alien technology nobody has but everyone wants; that's why everyone's constantly copying them and trying to stop them by paying to make them look bad!
Posted on Reply
#133
Vya Domus
oxidizedSo basically we can't trust reviews and reviewers but we SHOULD trust the company itself? I mean how can you not be trolling?
:roll: Of course it looks like trolling when you ain't got a clue about how things work and are under the impression you know better than a team of engineers.

Right, wasted enough time with you both; you are both on ignore.
Posted on Reply
#134
oxidized
Vya Domus:roll: Of course it looks like trolling when you ain't got a clue about how things work and are under the impression you know better than a team of engineers.

Right, wasted enough time with you both; you are both on ignore.
The level of your stupidity is astonishing, really; I must compliment you.
Posted on Reply
#135
the54thvoid
Super Intoxicated Moderator
I thought HBC stood for "Holy Batman, Catwoman!"
Posted on Reply
#136
Fluffmeister
the54thvoidI thought HBC stood for "Holy Batman, Catwoman!"
But seriously, buy cheap (if even possible), and then flog to miners... hmm, I'm tempted.
Posted on Reply
#137
vega22
the54thvoidIs that not worrying? All Nvidia have to do is refresh Maxwell... cough, Pascal... to stay quite far ahead at minimal cost. All AMD are doing with these compute-heavy consumer cards is fueling the mining craze. There is no disputing AMD's consumer compute ability, but:

1) It's still not enough to be the fastest gaming GPU, and;
2) It keeps the profit margins very low

Anyway, if you see my post further up, Vega 64 is a mining monster. So it's going to disappear fast on one of those jumbo jets. Not good news at all for gamers. :mad:
Nah, that bubble is bursting as we speak. By the time they are on the shelves, most miners will be dumping their older, low-power cards and using those funds to buy Vega and 4K screens for gaming :D
Posted on Reply
#138
deu
Sempron GuyI couldn't explain it right but I see tearing at 75fps in all the games I tested. Capping it at 74fps does the trick. Though I'm not sure about the explanation behind it.
Same with G-Sync, as far as I understand.
Posted on Reply
#139
Dimi
deuSame with G-Sync, as far as I understand.
I never see any tearing on my G-Sync monitor; it goes up to 165 Hz though.
Posted on Reply
#140
DeathtoGnomes
birdieI guess you need to go see a shrink ASAP. Next you're gonna say that Jensen Huang frequents these forums and slanders AMD in his spare time. :)

I still hope you were joking.
Thanks for making my point.
Posted on Reply
#141
Captain_Tom
the54thvoidIs that not worrying? All Nvidia have to do is refresh Maxwell... cough, Pascal... to stay quite far ahead at minimal cost. All AMD are doing with these compute-heavy consumer cards is fueling the mining craze. There is no disputing AMD's consumer compute ability, but:

1) It's still not enough to be the fastest gaming GPU, and;
2) It keeps the profit margins very low

Anyway, if you see my post further up, Vega 64 is a mining monster. So it's going to disappear fast on one of those jumbo jets. Not good news at all for gamers. :mad:
I never said it was good, but I wouldn't say this is "bad".

Nvidia made an excellent gaming arch with Maxwell (but it's worthless for most other things besides, well, mining, lol). Nvidia can afford to have two architectures at the same time, and AMD cannot. However, AMD is finally starting to make money again, so this will change by 2019.


Why are you so sure Vega won't pan out? It will be mediocre now, but it has FP16/HBC/RPM/etc. Consider that Far Cry 5 and many other games will be written in FP16 in order to support game consoles and AMD cards (that makes Vega 64 a 27 TFLOP card in that game, lol). Vega is the foundation of their future, and it was designed when AMD had no money left for the Radeon department. Thus, even more so than GCN did, this arch will start out slow for its time but age VERY well.
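For what it's worth, the 27 TFLOP figure checks out arithmetically, assuming the ~1.67 GHz boost clock quoted for the liquid-cooled Vega 64: Rapid Packed Math packs two FP16 operations into each FP32 lane, doubling theoretical throughput.

```python
# Theoretical Vega 64 throughput. Assumes 4,096 stream processors and a
# ~1.67 GHz boost clock (liquid-cooled model); air-cooled clocks are lower.
sps = 4096
clock_ghz = 1.67
fp32_tflops = sps * 2 * clock_ghz / 1000  # 2 FLOPs/cycle per SP (fused multiply-add)
fp16_tflops = fp32_tflops * 2             # Rapid Packed Math: 2x FP16 per FP32 lane
print(f"FP32: {fp32_tflops:.1f} TFLOPS")  # ~13.7
print(f"FP16: {fp16_tflops:.1f} TFLOPS")  # ~27.4
```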
Posted on Reply
#142
efikkan
Captain_TomWhy are you so sure Vega won't pan out? It will be mediocre now, but it has FP16/HBC/RPM/etc. Consider that Far Cry 5 and many other games will be written in FP16 in order to support game consoles and AMD cards (that makes Vega 64 a 27 TFLOP card in that game, lol). Vega is the foundation of their future, and it was designed when AMD had no money left for the Radeon department. Thus, even more so than GCN did, this arch will start out slow for its time but age VERY well.
Well, for starters, we hear this every single time: buy an underperforming AMD card now and it will be better in the future, but it never pans out. The primary reason is that your expectations are too inflated. Secondly, AMD will shift their focus to Navi in a few months.

FP16 is certainly interesting, and will gradually see more use in the future. But for the next 2-3 years there will be a limited number of games getting a little boost there, and even with this boost it still wouldn't beat a 1080 Ti. Also, keep in mind that AMD already needs to improve their scheduling, so usage of FP16 will result in even more idle resources. HBC wouldn't give it an advantage over Nvidia unless a game needs more memory than the competition can provide and the game uses intrinsics. So simply stated, even in your best-case scenario, Vega doesn't look good in comparison to Pascal.
Posted on Reply
#143
bug
Captain_TomI never said it was good, but I wouldn't say this is "bad".

Nvidia made an excellent gaming arch with Maxwell (but it's worthless for most other things besides, well, mining, lol). Nvidia can afford to have two architectures at the same time, and AMD cannot. However, AMD is finally starting to make money again, so this will change by 2019.


Why are you so sure Vega won't pan out? It will be mediocre now, but it has FP16/HBC/RPM/etc. Consider that Far Cry 5 and many other games will be written in FP16 in order to support game consoles and AMD cards (that makes Vega 64 a 27 TFLOP card in that game, lol). Vega is the foundation of their future, and it was designed when AMD had no money left for the Radeon department. Thus, even more so than GCN did, this arch will start out slow for its time but age VERY well.
Out of curiosity, what would qualify as bad in your opinion?
Posted on Reply
#144
Fluffmeister
I'm just impressed to hear Maxwell is worthless beyond gaming and mining (lol).
Posted on Reply
#145
Th3pwn3r
Assimilator"Long-term technology" what is this BS you're spouting? There's no such thing as future-proofing when it comes to GPU technology - a GPU is relevant for a year, year-and-a-half, maybe even 2 years and then it's just too slow for gaming.
By far the worst post in this thread. If you don't realize that AMD has gone against the grain, then you need to read more.
Posted on Reply
#146
S@LEM!
All in all, for such a small group like RTG, with R&D crippled by the former management, I'm just glad for what they've accomplished. They always push for new stuff no matter what, opted for an open-standards-friendly ecosystem for long-term functionality and future-proofing, they dominate the console market, and they're the god of APUs. I remember when AMD's slogan was "The Future is Fusion", back when we were fighting over quad-core CPUs. Sure, they may deliver half-baked tech sometimes, but they strike hard along the way, and I'm sure the next Vega will be even more appealing for hardcore enthusiasts, not just for pros and developers.

Can't wait for the official reviews, and God save us from the miners!
Posted on Reply
#147
xenocide
Whether or not the Vega 56 is more cost-effective than the GTX 1070 is irrelevant when you consider that the GTX 1070 has been out for about a year and the Vega 56 still isn't. Most people have already given Nvidia their money, and the prospect of a lateral move isn't exactly appealing.
Posted on Reply
#148
Frick
Fishfaced Nincompoop
@Vya Domus @oxidized I want to hear your arguments on whether HBC has a use in gaming. No videos.
vega22nah that bubble is bursting as we speak. by the time they are on the shelves most miners will be dumping their older, low power, cards and using those funds to buy vega and 4k screens for gaming :D
Bursting or simply upgrading? Aye, it will burst as mining gets harder, but not completely for a while.
Posted on Reply
#149
renz496
B-RealThen I have no idea what you checked. First there's Hardware Canucks as a written review, then YT videos like HW Unboxed's, getting 1% in favor of the 1060 where at first it was a 12% difference. Yeah, it consumes 30-40 W more at default (which HW Unboxed said shouldn't be a deal breaker), but actually, JayzTwoCents' XFX 480 video shows that under a stress test the RX 480 eats about 95 to 120 W. And with Crimson, AB, etc. you can control the more power-hungry RX 480s like the Sapphire or MSI. Also, ones like MSI later got BIOS updates that lowered their power consumption.
Some people were misled by that video from Jay, thinking that the whole card only uses that much power when running a 3D application like Fire Strike. But in reality, that was the power used by the GPU core section only, not the entire card, because there is no sensor on the card that can measure the total power being drawn. That's why reviewers like TechPowerUp and Tom's Hardware don't look at power consumption reported by MSI Afterburner (like Jay did at the time) and instead use dedicated equipment to measure the GPU's power consumption in isolation within a system. This is the PCB analysis of the exact card used by Jay (XFX RX 480 GTR):

Posted on Reply
#150
renz496
Captain_TomNo surprise here.

$400, and it slightly loses to the 1080 while having FAR better long-term technology. This will sell well, and Vega64 should be at least 15% stronger than this!
That might have been wonderful if Vega had launched at the same time as the GTX 1080/1070 back in May 2016, but right now we're already nearing the end of Pascal's life cycle. If the said features are really important to have, Nvidia will have them as well. Volta will be here in less than a year.
Posted on Reply