Monday, December 12th 2016

AMD "Vega" Demoed in Sonoma, California

AMD's next-generation high-end graphics card, based on the "Vega" architecture, was showcased at an event in Sonoma, California, earlier this week. While the architecture is debuting with the Radeon Instinct MI25 deep-learning accelerator, the company exhibited a prototype graphics card based on the same silicon, demonstrating Vulkan API gaming.

AMD was pretty tight-lipped about the specifications of this prototype, but two details appear to have slipped out. Apparently, the chip has a floating point performance of 25 TFLOP/s (FP16), and 12.5 TFLOP/s (FP32, single-precision). On paper, this is higher than the 11 TFLOP/s (FP32) of NVIDIA TITAN X Pascal. The other important specification that emerged is that the card features 8 GB of HBM2 memory, with a memory bandwidth of 512 GB/s. This, too, is higher than the 480 GB/s of the TITAN X Pascal. It remains to be seen which market-segment AMD targets with this card.
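The quoted figures are internally consistent: FP16 at exactly twice the FP32 rate points to packed half-precision math, and 512 GB/s matches a two-stack HBM2 configuration. A back-of-the-envelope sketch, with the caveat that the 4096-shader count and ~1.53 GHz clock are assumptions from contemporary leaks, not AMD-confirmed specifications:

```python
# Back-of-the-envelope check of the quoted Vega numbers.
# Shader count and clock are assumptions from leaks at the time,
# not AMD-confirmed specifications.

shaders = 4096          # rumored stream-processor count
clock_ghz = 1.526       # clock implied by the quoted FP32 figure

# Each shader does one FMA (2 FLOPs) per cycle at FP32.
fp32_tflops = shaders * 2 * clock_ghz / 1000
# Packed math doubles throughput at FP16 (2:1 rate).
fp16_tflops = fp32_tflops * 2

# HBM2: two stacks, each with a 1024-bit interface,
# at 2.0 Gb/s per pin (1.0 GHz, double data rate).
stacks, bus_bits, gbps_per_pin = 2, 1024, 2.0
bandwidth_gbs = stacks * bus_bits * gbps_per_pin / 8

print(f"FP32: {fp32_tflops:.1f} TFLOP/s")      # FP32: 12.5 TFLOP/s
print(f"FP16: {fp16_tflops:.1f} TFLOP/s")      # FP16: 25.0 TFLOP/s
print(f"Bandwidth: {bandwidth_gbs:.0f} GB/s")  # Bandwidth: 512 GB/s
```

Under those assumptions the arithmetic lands exactly on AMD's quoted 12.5/25 TFLOP/s and 512 GB/s, which suggests the demoed chip is a full-size part rather than a cut-down one.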

This article was updated on Dec 15 to accommodate AMD's request to remove all info regarding the demo system, the shown game and its performance, which has been put under NDA retroactively.
Source: Golem.de

120 Comments on AMD "Vega" Demoed in Sonoma, California

#51
erocker
the54thvoid: I replied to this notion of yours before - Vega 11 is (according to most sites and leaks) the smaller chip, just like Polaris 10 (RX 480) and Polaris 11 (RX 470 and 460). Other sites also suggested the price will be similar to the GTX 1080. Look at AMD's past releases now.

7970 matched 680
Fury X matched 980 Ti

There is a precedent now for AMD to match the pricing structure of Nvidia.

Don't get me wrong, I want Vega to come in at DX11 equal to a 1080, but far superior at DX12 and Vulkan and be £100 cheaper.

EDIT: if Vega matches the 1080 and AMD prices it the same, AMD becomes absolutely complicit in the ridiculous pricing of graphics cards. But if Vega beats the 1080, they can sell it for MORE... not good.
They just need to get this new card out before GTX 2080/2070.
#52
AsRock
TPU addict
Vega Handles Doom 4K Ultra HD at Over 60 FPS
while looking at a wall (WTF), wow, that's just some shady picture lol. Sorry, not sold on this yet; getting the feeling the hype is going to get bigger than it should once again.



And Doom, well, meh.
#53
the54thvoid
Intoxicated Moderator
erocker: They just need to get this new card out before GTX 2080/2070.
I don't know if you jest or not, but the alleged Pascal refresh (on Samsung's fab process?) may bring those cards about sooner rather than later... Never mind any 1080 Ti interloper.
#54
cdawall
where the hell are my stars
the54thvoid: I don't know if you jest or not, but the alleged Pascal refresh (on Samsung's fab process?) may bring those cards about sooner rather than later... Never mind any 1080 Ti interloper.
If the 2070/2080 is just a refresh on an updated process, there is no guarantee it will perform any better.
#55
Xzibit
the54thvoid: The 25 TFLOP/s is double precision, the 12.5 TFLOP/s is touted as Vega's single-precision power (higher than Titan XP). I think, if my reading was correct.
25 TFLOP/s 16-bit (half precision)
12.5 TFLOP/s 32-bit

GP100
NVLink
10.6 TFLOP/s 32-bit
5.3 TFLOP/s 64-bit (double precision)
PCI-E
9.3 TFLOP/s 32-bit
4.6 TFLOP/s 64-bit (double precision)

Titan X (Pascal)
11 TFLOP/s 32-bit


#56
NDown
AsRock: while looking at a wall (WTF), wow, that's just some shady picture lol. Sorry, not sold on this yet; getting the feeling the hype is going to get bigger than it should once again.

And Doom, well, meh.

I'd say in hectic action scenes it'd be averaging around 60 to low 70s.
#57
AsRock
TPU addict
Now let's see it do The Witcher 3 in 4K.

Like I said, Doom's pretty meh.
#62
Camm
AsRock: Now let's see it do The Witcher 3 in 4K.

Like I said, Doom's pretty meh.
Doom is a well-optimised game, however, which works well for benchmarking, whilst The Witcher is a piece of shit (technology-wise) with GameWorks "enhancements".

And that's a really disappointing score, unless that's a smaller Vega chip being benchmarked or it's running under OpenGL. Titan XP pushes a good 30% faster in DOOM.
#63
EdInk
This seems to be my solution. The GTX 1070 can't quite handle 3440x1440 without compromises, and the GTX 1080 is too pricey. This card should fit nicely.
#64
AsRock
TPU addict
Camm: Doom is a well-optimised game, however, which works well for benchmarking, whilst The Witcher is a piece of shit (technology-wise) with GameWorks "enhancements".

And that's a really disappointing score, unless that's a smaller Vega chip being benchmarked or it's running under OpenGL. Titan XP pushes a good 30% faster in DOOM.
Sorry, but no, Doom looks like a piece of crap compared to the graphics in The Witcher 3.

But yes, there is that reason, as AMD have their name on it, but the game is not all that impressive graphics-wise.
#65
notb
Once again... early results on a "sweet spot" game and huge hype. Hard not to be skeptical.
At least this time it's a mainstream title, so it's actually used in most reviews. We'll be spared the "why don't you test with Ashes of the Singularity" mantra... :P
#66
Camm
AsRock: Sorry, but no, Doom looks like a piece of crap compared to the graphics in The Witcher 3.

But yes, there is that reason, as AMD have their name on it, but the game is not all that impressive graphics-wise.
I wasn't saying the Witcher wasn't prettier. I said it was piss poor for determining a GPU's performance.
#67
Jism
Doom is used to show the card's full potential in, for example, Vulkan.

It makes no sense to use DX11 games, or games whose code is not optimized for AMD hardware.

We all know Nvidia uses a different approach when rendering frames. The upside is that it's usually faster in DX11 games, but it lags behind in DX12.

Async compute is what favours AMD, and since we know the future is aimed more at putting a GPU to work the way it should, AMD is on the right side.

Game devs have less trouble coding games since consoles use the same GCN hardware. Porting is easier, and it removes the obstacle that the PS3 with its IBM Cell posed.

Bottomline: Vega is ready.
#68
awesomesauce
xkm1948: Behind view.
Tested with Vega and Zen? :P
#69
Xzibit
xkm1948: Behind view.
That's an interesting pic, because if it matches up with the MI25 shroud and heatsink (the fins look parallel), it would mean it was demoing Doom at 4K Ultra IQ while being passively cooled.
#70
dinmaster
Looking at the picture, the fins look like they cover 4+ slots; hard to see, but you can make it out. Seems like the cooling solution isn't finalized yet and they just put a CPU cooler on the video card. What is there to hide by taping over the back of the case? Get over yourselves, AMD; obviously something shady is going on and they don't want us to see what is actually happening.
#71
iO
dinmaster: Looking at the picture, the fins look like they cover 4+ slots; hard to see, but you can make it out. Seems like the cooling solution isn't finalized yet and they just put a CPU cooler on the video card. What is there to hide by taping over the back of the case? Get over yourselves, AMD; obviously something shady is going on and they don't want us to see what is actually happening.
They obviously don't want to show how far along (or not) the product development is.
And if they still use a CPU cooler, then there won't be a release anytime soon.
#72
Xzibit
iO: They obviously don't want to show how far along (or not) the product development is.
And if they still use a CPU cooler, then there won't be a release anytime soon.
I think he was joking. CPU cooler fins wouldn't be visible on the first 2-3 port covers; you'd see pipes for sure. Plus they would need a dual-tower CPU cooler design to even get the fins close enough to the case port covers.
#73
iO
Xzibit: I think he was joking. CPU cooler fins wouldn't be visible on the first 2-3 port covers; you'd see pipes for sure. Plus they would need a dual-tower CPU cooler design to even get the fins close enough to the case port covers.
It definitely is a CPU cooler. Kinda looks like a Ninja 2, although I'm not sure...
#74
cdawall
where the hell are my stars
dinmaster: Looking at the picture, the fins look like they cover 4+ slots; hard to see, but you can make it out. Seems like the cooling solution isn't finalized yet and they just put a CPU cooler on the video card. What is there to hide by taping over the back of the case? Get over yourselves, AMD; obviously something shady is going on and they don't want us to see what is actually happening.
It shouldn't be covering 4+ slots; this is the case they are using:



I mean, it definitely isn't a production cooler, but as an owner of that case I could see them having some major issues after they taped almost all of the airflow off...
#75
Xzibit
Radeon Pro pre-release



The GPU placement would have to be closer to the display outputs. Maybe a modified version of this, if those are fins, because it's almost touching the case I/O covers.
