Thursday, August 3rd 2017
AMD RX Vega 56 Benchmarks Leaked - An (Unverified) GTX 1070 Killer
TweakTown has put forth an article wherein they claim to have received info from industry insiders regarding the upcoming Vega 56's performance. Remember that Vega 56 is the slightly cut-down version of the flagship Vega 64, packing 56 next-generation compute units (NGCUs) instead of Vega 64's, well, 64. This means that while the Vega 64 has the full complement of 4,096 Stream processors, 256 TMUs, 64 ROPs, and a 2048-bit wide 8 GB HBM2 memory pool offering 484 GB/s of bandwidth, Vega 56 makes do with 3,584 Stream processors, 224 TMUs, 64 ROPs, the same 8 GB of HBM2 memory, and slightly lower memory bandwidth at 410 GB/s.
The Vega 56 has been announced to retail for about $399, or $499 with one of AMD's new (famous or infamous, depending on your mileage) Radeon Packs. The RX Vega 56 card was running on a system configured with an Intel Core i7-7700K @ 4.2 GHz, 16 GB of DDR4-3000 RAM, and Windows 10, at a 2560 x 1440 resolution. The results in a number of popular games were as follows:
Battlefield 1 (Ultra settings): 95.4 FPS (GTX 1070: 72.2 FPS; 32% in favor of Vega 56)
Civilization 6 (Ultra settings, 4x MSAA): 85.1 FPS (GTX 1070: 72.2 FPS; 17% in favor of Vega 56)
DOOM (Ultra settings, 8x TSAA): 101.2 FPS (GTX 1070: 84.6 FPS; 20% in favor of Vega 56)
Call of Duty: Infinite Warfare (High preset): 99.9 FPS (GTX 1070: 92.1 FPS; 8% in favor of Vega 56)
If these numbers ring true, this means NVIDIA's GTX 1070, whose average pricing stands at around $460, will have a much reduced value proposition compared to the RX Vega 56. The AMD contender (which did arrive a year after NVIDIA's Pascal-based cards) delivers around 20% better performance (at least in the admittedly sparse games line-up), while costing around 15% less in greenbacks. Coupled with a lower cost of entry for a FreeSync monitor, and the possibility for users to get even more value out of a particular Radeon Pack they're eyeing, this could potentially be a killer deal. However, I'd recommend you wait for independent, confirmed benchmarks and reviews in controlled environments. I dare to suggest you won't need to look much further than your favorite tech site on the internet for that, when the time comes.
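As a back-of-the-envelope check on the value claim above, the averages can be recomputed from the leaked figures. This is only an illustrative sketch: the "average FPS per dollar" framing is my own, and the prices and FPS numbers are taken straight from the article, not independently verified.

```python
# Hedged sketch: rough value comparison using the leaked numbers above.
# Prices and FPS figures come from the article; the metric itself
# (average FPS vs. MSRP) is an illustrative assumption, not a standard.

vega56_price, gtx1070_price = 399, 460  # USD, as stated in the article

# (game, Vega 56 FPS, GTX 1070 FPS) from the leaked 1440p results
results = [
    ("Battlefield 1", 95.4, 72.2),
    ("Civilization 6", 85.1, 72.2),
    ("DOOM", 101.2, 84.6),
    ("CoD: Infinite Warfare", 99.9, 92.1),
]

vega_avg = sum(v for _, v, _ in results) / len(results)
gtx_avg = sum(g for _, _, g in results) / len(results)

perf_delta = vega_avg / gtx_avg - 1             # performance advantage
price_delta = 1 - vega56_price / gtx1070_price  # price advantage

print(f"Vega 56 is {perf_delta:.0%} faster on average")
print(f"and {price_delta:.0%} cheaper at MSRP")
```

Run as-is, this lands at roughly 19% faster and 13% cheaper, consistent with the article's rounded "around 20%" and "around 15%" figures.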
Source:
TweakTown
169 Comments on AMD RX Vega 56 Benchmarks Leaked - An (Unverified) GTX 1070 Killer
Not sure the Radeon Packs are gonna stop the onslaught, but I guess it was worth a go.
I still hope you were joking.
A predictive cache algorithm can only detect linear access patterns, just like a prefetcher does in a CPU. It can't predict accesses when there is no pattern, because there is no way to predict random accesses. That's why HBC would work fine for linear traversal of huge datasets, but it wouldn't work for game rendering, where the access patterns vary by game state, camera position, etc. GV102 and GV104 are already taped out, and the first test batch will arrive soon. So unless Nvidia runs into problems like on Fermi, they can be released anywhere from 5-10 months from today.
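The prefetcher point above can be sketched in a few lines: a stride predictor only extrapolates when recent addresses form a constant-stride (linear) sequence, and returns nothing useful for irregular accesses. The function and its parameters here are made up for illustration; they don't correspond to any real driver or hardware interface.

```python
# Illustrative sketch: stride-based prediction works for linear access
# patterns and fails for random ones. All names are hypothetical.

def predict_next(history, window=3):
    """Return the predicted next address, or None if no linear pattern."""
    if len(history) < window:
        return None
    recent = history[-window:]
    strides = [b - a for a, b in zip(recent, recent[1:])]
    if all(s == strides[0] for s in strides):  # constant stride detected
        return recent[-1] + strides[0]
    return None  # irregular pattern: nothing to prefetch

# Linear traversal (e.g. streaming a big dataset): predictable
print(predict_next([100, 104, 108, 112]))  # -> 116

# Game-like, state-dependent accesses: no constant stride
print(predict_next([512, 64, 4096, 128]))  # -> None
```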
Facts? All you do is ignore them, mate. :laugh: Just like you did in our previous discussion.
We got you bro, AMD hasn't got a clue about what they're doing and you do.
Right, wasted enough time with you both, you are both on ignore.
Nvidia made an excellent gaming arch with Maxwell (But it's worthless for most other things besides also mining lol). Nvidia can afford to have 2 architectures at the same time, and AMD cannot. However AMD is starting to finally make money again, and so this will change by 2019.
Why are you so sure Vega won't pan out? It will be mediocre now, but it has FP16/HBC/RPM/etc. Consider that Far Cry 5 and many other games will be written in FP16 in order to support game consoles and AMD cards (that makes Vega 64 a 27 TFLOP card in that game lol). Vega is the foundation of their future, and it was designed when AMD had no money left for the Radeon department. Thus even more so than GCN did, this arch will start out slow for its time but age VERY well.
FP16 is certainly interesting, and will gradually see more use in the future. But for the next 2-3 years there will be a limited number of games giving a little boost there, and even with this boost it still wouldn't beat a 1080 Ti. Also, keep in mind that AMD already needs to improve their scheduling, so usage of FP16 will result in even more idle resources. HBC wouldn't give it an advantage over Nvidia unless a game needs more memory than the competition can provide and the game uses intrinsics. So simply stated, even in your best-case scenario, Vega doesn't look good in comparison to Pascal.
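For reference, here is a quick sketch of where a "27 TFLOP" FP16 figure like the one quoted in this exchange could come from. The peak-rate formula (2 ops per FMA per lane, doubled again by Rapid Packed Math for FP16) is standard; the ~1.677 GHz boost clock is an assumption matching the liquid-cooled Vega 64, and air-cooled clocks are lower.

```python
# Peak throughput sketch: TFLOPS = 2 (FMA) x stream processors x clock,
# and Rapid Packed Math (RPM) doubles the FP16 rate.
# The clock is an assumed liquid-cooled Vega 64 boost, not a measurement.

stream_processors = 4096
boost_clock_ghz = 1.677  # assumption: liquid-cooled boost clock

fp32_tflops = 2 * stream_processors * boost_clock_ghz / 1000
fp16_tflops = 2 * fp32_tflops  # RPM: two FP16 ops per FP32 lane

print(f"FP32: {fp32_tflops:.1f} TFLOPS")
print(f"FP16: {fp16_tflops:.1f} TFLOPS")
```

Under these assumptions the FP16 peak works out to roughly 27.5 TFLOPS, which is consistent with the rounded figure in the comment above.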
can't wait for the official reviews and god save us from the miners!