
AMD Radeon RX Vega Preview

Compute hasn't really got anything to do with a fixed data type: FP16, FP32, FP64, integer performance, or repurposed SIMD instructions like the specialized matrix operations inside V100. It's all considered compute.
 
I believe support is coming; Wolfenstein II, for example, is mentioned as supporting RPM (Rapid Packed Math):
Wide adoption will take many years, but it will happen eventually. All current PC games use fp32 for fragment processing, while fp16 should be sufficient in most cases. As node shrinks become more scarce, fp32 will be too expensive to use for everything. Mixed precision is inevitable.
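To make the fp16-vs-fp32 point concrete, here's a quick Python sketch (mine, not from the post) that round-trips values through IEEE 754 half precision using the standard `struct` module: for typical 0-1 color values the rounding error sits well below one 8-bit quantization step, which is why fp16 is "sufficient in most cases" for fragment shading, while large values visibly lose precision.

```python
import struct

def to_fp16(x: float) -> float:
    """Round-trip a Python float through IEEE 754 half precision (binary16)."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# A typical normalized color channel: fp16's 10-bit mantissa keeps the
# rounding error well under one 8-bit quantization step (1/255 ~ 0.0039).
color = 200 / 255
print(abs(to_fp16(color) - color) < 1 / 510)  # True

# But above 2048, fp16 can no longer represent every integer:
print(to_fp16(2049.0) == 2049.0)  # False
```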
 
The price is better than I expected, and as for the bundles, well, if that's what it takes to get these into gamers' hands instead of miners', that's what they've got to do. For the AMD fan who has hung on through it all, has already moved to Ryzen, and has one of the latest monitors, it seems like a good deal.
 


That must be why PCGamesHardware did a test and a 1070 lost between 3% and 8% with HDR turned on? Nvidia has been using "optimizations" for years to improve performance at the expense of visuals. http://www.pcgameshardware.de/Ferns...itman-Shadow-Warrior-Resident-Evil-7-1219723/
http://www.pcgameshardware.de/Fernseher-Hardware-134500/Specials/HDR-Hitman-Shadow-Warrior-Resident-Evil-7-1219723/galerie/2698259/
 
That has definitely nothing to do with it.

Most games already use HDR with bloom or other forms of tone mapping; switching out the tone mapping should not carry any substantial performance penalty.
 


LOL, so a test of exactly the same settings with HDR turned on and off, measuring the performance penalty of having it on, has nothing to do with the performance penalty of it being turned on... can I have some of what you are smoking?

http://www.guru3d.com/news-story/hd...0-series-cards-limited-to-8-bit-via-hdmi.html

AMD has had 8, 10, and 12 bit available for quite a while, though you had to have fullscreen content that supported it, and it comes with no performance penalty. Even in GTA5 you could force -HDR so the 16-bit sky and other items would be forced into 24-bit or 32-bit to eliminate banding, and lo, even my puny 7970 takes no performance penalty using it, compared to an Nvidia card losing 5-10% when HDR is enabled.
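For what it's worth, the banding being described is easy to illustrate. This little sketch (hypothetical, not tied to any game) quantizes a smooth gradient at different per-channel bit depths and counts the distinct output levels: fewer levels means coarser steps, which is exactly the visible banding in a low-bit-depth sky.

```python
def quantize(x: float, bits: int) -> float:
    """Snap a [0, 1] value to the nearest representable level at a given bit depth."""
    levels = (1 << bits) - 1
    return round(x * levels) / levels

gradient = [i / 999 for i in range(1000)]  # a smooth 0..1 ramp, e.g. a sky

# Fewer bits per channel -> fewer distinct steps across the same ramp:
for bits in (5, 6, 8):
    print(bits, len({quantize(v, bits) for v in gradient}))
# 5 bits -> 32 steps, 6 -> 64, 8 -> 256
```

Dithering hides the steps by adding noise at the quantization boundary, which is cheaper than rendering at a higher bit depth in the first place.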
 
I have waited three long years to replace my 290X Crossfire setup. I'd been holding out for Vega since last year, then assumed we would see Vega in February 2017, then saw the 1080 Ti drop and thought NV must know AMD is bringing something good to the table, so I waited. Now this has landed, and it seems the LE liquid-cooled Vega 64 cannot be purchased without a bundle, with lots of heat/power for a competitor to a year-old 1080.

So, I just pulled the trigger on the Zotac 1080ti Amp Extreme Core, my first Nvidia card (not counting an MX400 spare from forever ago)... I guess @cdawall was right...I should have ordered it back in May.

Sorry AMD, I tried, but this launch has been a complete and utter failure in my opinion.

JAT
 
Your understanding of HDR support is appalling.
 

Mass Effect Andromeda's video options for HDR, for example, state perfectly clearly that it will use FP16. That's why Nvidia cards lose performance.
 

I was actually still holding out hope... looks like the 1080 Ti will pick up more sales soon.
 
And I am sure you sold those AMD cards for more than you paid for them, so you picked up that 1080 Ti and maybe some games, with a lot less power draw from one card instead of two.
 
Nah. I've absolutely babied these cards, and refuse to hand them over to miners to abuse. The cards are still flawless, and are beastly with good Crossfire profiles, but 4GB of VRAM isn't cutting it at 1440 ultrawide.

I'll probably throw them at the new Kreij memorial build when it comes around, so a gamer can enjoy them instead of a miner just beating them to death.

JAT
 
Great post, it adds so much.

I do understand HDR, and the difference between chroma, luma, and their range in an RGB colorspace due to per-channel bit depth, with HDR allowing for greater pixel depth and luma and a more complete color range.

In hardware, Nvidia has traditionally downsampled color depth on certain surfaces because it allows for faster rendering: there is a performance penalty to calculating shading on a higher-depth surface, or to calculating smooth color gradients instead of using a lower bit depth plus dither.

But..... thanks, I'm new here and don't know much.
 
Gamers Nexus did a test with the FE and it performed exactly the same at 1050 MHz or something like that. The reason the performance doesn't scale with the frequency is that there is a bottleneck somewhere else, most likely something to do with the way the drivers operate.
I've said this before: it's because of the front end. Hawaii and Fiji share the same front end and the same 4 shader engines. If the NCU doesn't fix the 4-shader-engine limitation (they say they did, but we shall see), those extra shaders can't be properly fed. They also have the same number of geometry processors, among other things like ACEs, which is why Hawaii and Fiji show only small performance differences in some games.

If Vega's front end is indeed similar, I've lost hope for its performance. History tells us that a product that is massively delayed usually isn't good.
 
You sold them for normal prices?
 
There's some discrepancy in the numbers. We have TDP, TGP, and board power all being thrown out with number sets that do not match. The direct TDP info we got from AMD is in the article, but the air-cooled RX Vega 64 supposedly has a 220 W TDP and RX Vega 56 a 165 W TDP.
AMD has stated that the TGP (Total Graphics Package) for Vega 56 is 165 W, and this includes GPU + memory + interposer, so we're looking at about a 150 W TDP; the Nano will be 150 W TGP and the 64 will be 220 W max TGP.
 

I don't really get why people think saving these cards is a big deal. To me it seems ridiculous; these aren't ultra-rare precious gems, lol. If it were me, I'd sell them and invest the money in better parts. Let the miners have the old junk; I'll buy the new stuff that works better for what I want. I would say I'd give it to a friend or relative, BUT the money gained by selling would be better invested in a BETTER card.
 
HA! What better card should I have invested in?

I'll donate them so they will still get some good use, rather than let them be beaten to death and thrown away.

And to call a pair of pristine, low-mileage PowerColor PCS+ 290Xs "junk" is laughable. These will be "precious gems" in the future if they are taken care of. But I love my tech, and try to take good care of it. Hell, my 4850s and 6870s still run just fine.

JAT
 

That's where they fail. You can put as much hardware scheduling and load balancing on the GPU as you like, but if the drivers are still limited by the CPU, because all commands are sent serially on just one thread, it will all be for nothing. 4096 shaders is huge; DX11 performance is bound to be bad.

If you use Vulkan or DX12 this is supposed to be fixed. But Vulkan's adoption rate is low, and DX12 implementations are abysmal for the most part.
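As a rough illustration of the submission model being described (a sketch in Python, not a real graphics API): under DX11 the driver serializes command submission on one thread, whereas DX12/Vulkan let each CPU thread record its own command list independently, leaving only the final queue submit single-threaded.

```python
from concurrent.futures import ThreadPoolExecutor

def record_command_list(draw_calls):
    # Each worker records its own command list with no shared lock;
    # this per-thread recording is what lets DX12/Vulkan scale across cores.
    return [f"draw({d})" for d in draw_calls]

# Split 400 draw calls across 4 recording threads:
batches = [list(range(i * 100, (i + 1) * 100)) for i in range(4)]
with ThreadPoolExecutor(max_workers=4) as pool:
    command_lists = list(pool.map(record_command_list, batches))

# The only serial step is the final submit (think ExecuteCommandLists /
# vkQueueSubmit), which is cheap compared to recording:
submitted = [cmd for cl in command_lists for cmd in cl]
print(len(submitted))  # 400
```

The point is that the expensive work (validation, state setup, building the command stream) moves off the single submission thread, which is exactly what a DX11 driver cannot do for the application.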
 

I still have a stack of my favorite cards going as far back as my original Ti 4200. Do I need the money, or do I want the nostalgia of hanging onto them? ...oh wait, I forgot, random people on the internet's opinions don't matter.
 
And when will a 100 euro card be available?
 
There is some confusion here between a single-precision floating-point frame buffer (the HDR lighting and bloom rendering that has existed since 2004) and HDR 10-bit or 12-bit display output, which requires a new display (the game also has to support both the FP frame buffer and the high-dynamic-range output).
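To illustrate that distinction with a hypothetical sketch (not engine code): the renderer accumulates lighting in a floating-point buffer, a tone-mapping operator compresses it into [0, 1), and only then is the result quantized to whatever bit depth the display path supports.

```python
def reinhard(hdr: float) -> float:
    """Simple Reinhard tone mapping: maps HDR radiance [0, inf) into [0, 1)."""
    return hdr / (1.0 + hdr)

def encode(x: float, bits: int) -> int:
    """Quantize a tone-mapped [0, 1) value to an integer display code."""
    return round(x * ((1 << bits) - 1))

# The same floating-point scene value becomes a different code depending on
# whether the display path is 8-bit (SDR panel) or 10-bit (HDR10):
v = reinhard(4.0)                   # 0.8
print(encode(v, 8), encode(v, 10))  # 204 818
```

So a game can do FP "HDR rendering" internally while still outputting 8-bit SDR, and conversely a 10-bit display pipe is useless unless the game keeps its frame buffer in floating point up to that final encode.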
 

You've lost your venom, but your response made me laugh. :toast:

On another point though, Nvidia get all this flak for not having this or that, but a lot of people miss the business point. When your cards are cheaper to make, you sell them at equivalent prices (GTX 1060 vs RX 480/580), and you use small core counts compared to the competitor, you can afford to take that hit of 3-8% at certain settings. It doesn't matter that Vega RX seems to have a billion great technologies and runs way faster than Fiji when it still looks (and I stress looks, we don't know for sure) like it only matches a GTX 1080. 4096 cores packed with insane goodness only matches a card with just 2560 cores?

Nvidia have split the gfx card from compute. GP100 is a true HPC card; GP102 is stripped out. GV100 is even more HPC- and AI-focussed, and it already exists. Nvidia intentionally keep the compute side down to cut costs, increase profits, and increase efficiency, and the cards remain faster, by enough to still come out ahead even where the lack of compute slows them.

I can't stress enough how many times I've commented that AMD/RTG's brute-force strategy (perhaps because they can't afford two designs) of cramming so many cores into their design isn't helping against Nvidia's very boring but very fast design. People say AMD are innovating, but that's because they ignore the HPC chips from Nvidia, where the real research money goes. HPC is Nvidia's core business in terms of market value, if not income. The share price is high because it's seen as the darling of AI, and it has a lot of support in the industry. The graphics cards are the bastardised offspring of that endeavour. It wasn't always that way, but I think Pascal (GP100) was the turning point in recent years. This is why the rumours suggest that although GV100 is already in production, consumers won't see it as a gfx card, because it is far too complex to use for that purpose, for now.
 