Friday, September 16th 2016

AMD Actively Promoting Vulkan Beyond GPUOpen

Vulkan, the new-generation cross-platform 3D graphics API governed by the people behind OpenGL, the Khronos Group, is gaining in relevance, with Google making it the primary 3D graphics API for Android. AMD says it is actively promoting the API. Responding to a question by TechPowerUp at the company's recent Radeon Technologies Group (RTG) first-anniversary presser, RTG chief Raja Koduri confirmed that AMD is actively working with developers to add Vulkan to their productions and to optimize them for Radeon GPUs. This, we believe, could be due to one of several strategic reasons.

First, Vulkan inherently works better on AMD's Graphics Core Next (GCN) GPU architecture, because it was largely derived from Mantle, AMD's now-defunct 3D graphics API, which brought many of the "close-to-metal" features that make game consoles performance-efficient over to the PC ecosystem. The proof of this pudding is the 2016 AAA reboot of the iconic first-person shooter "Doom," in which Radeon GPUs get significant performance boosts when switching from the default OpenGL renderer to Vulkan. These boosts aren't as pronounced on NVIDIA GPUs.
Second, and this could be a long shot, the growing popularity of Vulkan could give AMD leverage over Microsoft to steer Direct3D development toward areas AMD GPUs are inherently good at - these include asynchronous compute and tiled resources (AMD GPUs benefit due to higher memory bandwidths). AMD has been engaging aggressively with game studios working on AAA games that use DirectX 12, and thus far AMD GPUs have either gained or sustained performance better than NVIDIA GPUs when switching from DirectX 11 fallbacks to DirectX 12 renderers.

AMD has already "opened up" much of its GPU IP to game developers through its GPUOpen initiative. There, developers will find detailed technical resources on how to take advantage of not just AMD-specific GPU IP, but also some industry standards. Vulkan is among the richly differentiated resources AMD is giving away through the initiative.

Vulkan still has a long way to go before it becomes the primary API in AAA releases. For most gamers who don't tinker with advanced graphics settings, "Doom" still runs on OpenGL and "The Talos Principle" runs on Direct3D 11 by default, for example. It could be a while before games run on Vulkan out of the box, and the way its special interest group Khronos, and more importantly AMD, promote its use - not just during game development, but also in long-term support - will have a lot to do with it. A lot will also depend on whether NVIDIA, which holds about 70% of the PC discrete GPU market, supports the API. Over-customizing Vulkan would send it the way of OpenGL: too many vendor-specific extensions to keep up with were what drove game developers to Direct3D in the first place.
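As a rough illustration of what that extension fragmentation looks like from a developer's seat, here is a minimal Vulkan sketch (our illustration, not production code; instance and device setup are assumed to already exist) that lists a GPU's device extensions and flags the vendor-prefixed ones:

// Sketch: list a Vulkan device's extensions and flag vendor-specific ones
// (VK_AMD_*, VK_NV_*) -- the kind of per-vendor fragmentation that once
// drove developers from OpenGL to Direct3D.
// Assumes a VkPhysicalDevice ("gpu") was already selected.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <cstring>
#include <vector>

void listVendorExtensions(VkPhysicalDevice gpu)
{
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());

    for (const VkExtensionProperties &e : exts) {
        const bool vendorSpecific =
            std::strncmp(e.extensionName, "VK_AMD_", 7) == 0 ||
            std::strncmp(e.extensionName, "VK_NV_", 6) == 0;
        std::printf("%s%s\n", e.extensionName,
                    vendorSpecific ? "  (vendor-specific)" : "");
    }
}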

111 Comments on AMD Actively Promoting Vulkan Beyond GPUOpen

#76
phanbuey
Captain_Tom: Hence why I just bought a Fury for $310 - it beats the 1070 in TODAY's games. That's just stupid.
What on earth are you talking about... the 780 was 15-20% faster than the 7970 GHz edition, so even if you hit 1200 MHz clocks, as soon as someone OC'ed a 780 it would still soundly beat that card.

I get that you like your cards, and that's cool - but benchmarks are benchmarks; the Fury is nowhere near beating the 1070...

And it won't - in maybe 2-3 titles it will match the 1070 or beat it by very little, but it will lose in the large majority of the rest... that's what I mean by false hope. It's just not true.

And AMD markets it, and people like you believe it and go out buying $310 Furies thinking they're beating 1070s... when they're not. This is my problem with these types of PR campaigns.

#77
the54thvoid
Super Intoxicated Moderator
Captain_Tom: Uhhh, have you not looked at benchmarks for the past year? I genuinely encourage you to go read them and then come back.


Ok you back? Good.

1) The 7970 overclocks better than anything that has been released since then. My 7970 ran at 1220/1840. My brother's 7950 ran at 1120/1800, and all of my crypto-mining 7950s ran at 1100+/1800+. Those are 40% overclocks, lmao! My 7970 benches as well as a 980 in Deus Ex: MD and BF1. So drop that argument here.

2) 2-3 generations? You completely missed what I was saying. I said that within a year of the 7970's launch it was ALREADY beating the 680 by 10-20% on average. Most people keep their cards for 2-3 years in my experience.

Furthermore, just because it is 1-2 generations newer doesn't make a difference. Everyone CONSTANTLY complains about AMD's recent trend of re-branding old GPUs. I will admit that I think it is stupid too, but can you blame them? Radeon is like 1-2 times smaller than Nvidia. If they can sell the 7970 two years later and have it compete with the 970, they will, lmao. Hence why I just bought a Fury for $310 - it beats the 1070 in TODAY's games. That's just stupid.
You're overstating the 680. The 680 did enough to stay ahead of the initial 7970, but AMD re-released the 7970 as a GHz edition, which did perform on par, if not better.
Given the driver optimisations AMD performs, it gained performance over the years. The 680 didn't, because Nvidia gets its DX11 driver optimisations in place very quickly.
The question is, are you buying a card for now, or for two years down the line? You seem to have a hugely AMD-slanted bias. I've seen your posts on other forums and it's clear you're a hater; you lack balance. For all your precise arguments, like any hater, you tend to use slanted evidence or ignore standard business practice.
I own a 980 Ti and it amuses me to see it perform on par with a Fury X. Now, that's in the DX11 version of a Gaming Evolved title. Sure, in DX12 the Fury X will gain some frames, but why should I cry? I played the game at 1440p with very high settings (some maxed) at about 60 fps. It was excellent.
In Doom on Vulkan I was on far higher fps. Yes, a Fury X would have got more, but I've got a 60 Hz monitor. My gaming experiences have been great.
I bought the card a year ago. It hasn't let me down.
Going forward, I am no fan of Nvidia. I won't spend the money for a 1080, 1080 Ti or above, because Vega is only six months away (hopefully at most). If Vega has better perf/watt than Polaris and is a far bigger chip, it should match the 1080 in DX11 and it should absolutely own the bare-metal APIs. So Nvidia won't see my money again until Volta, and even then, only if it performs.

So if Vega is twice as good as a 480, I'm on that next. But the reason I have no reason to move from my 980 Ti is that it still performs very, very well. If I can play AAA Gaming Evolved titles with butter-smooth frames, I have nothing to feel cheated about. Only children get upset because someone else's card plays it faster than theirs.

Oh, and one more thing: my PowerColor LCS 7970 clocked to the 1300 MHz Catalyst maximum. My MSI version (under an EKWB block) only managed 1225.

I had more fun overclocking my original Titan using the voltage soft mod. I can't remember the clocks, but they were scary for such a card. Two of those cards got recycled on TPU.
#78
geon2k2
FordGT90Concept: I want a push to Vulkan for the simple reason that it works on other operating systems too (Linux/Mac).
I second that; a graphics API should not be tied to the platform. Nowadays, the main reason to purchase Windows at home - and the main reason there is no other serious competitor on the PC - is that so many applications are tied to Windows APIs.
#79
Relayer
nVidia is a member of Khronos too. They can add optimizations to Vulkan for their hardware as well. Of course, then it would be open source, though.
#80
_Flare
- DX12 fails to bring CPU pressure down in games
- Often no FPS gains for slower CPUs
- Sadly, most DX11-to-DX12 shifts stay focused on strong single-thread performance, like every DX before

Vulkan, descended from Mantle, does everything better on weak hardware - logically so, because the consoles use eight weak cores, so the work needs to be balanced as well as possible across ALL available hardware.



This chart is old, but I can't show it often enough:
Fact: the 290X is theoretically way stronger than a GHz-clocked GTX 780.

Feeding the GTX in DX11 takes less CPU load than feeding the 290X, resulting in higher FPS on every CPU in DX11 (you get the max performance out of it more easily).

- The gains the 290X gets in Mantle come mostly from the CPU feeding it better
- The impressive Mantle leads over Nvidia's DX11 only appear between the "lowest 4-thread CPU" and the "highest non-K i5"

"On the 2 strongest CPUs" the 290X can close the Gap to DX11 because the GPU Utilization was bad in DX11, but it can´t beat the GTX
- because DX11-Language can´t be ballanced in the GCN GPU good enough because GCN is very compute-oriented. It needs compute-Shaders etc. to get fully utilized, DX11 mostly isn´t used like that.
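To make that concrete, a minimal Vulkan sketch (illustrative only; instance and physical-device setup are assumed) of the first step of async compute - finding a queue family that supports compute but not graphics, so compute work can run alongside the graphics queue:

// Sketch: find a dedicated compute queue family -- the queue that
// "async compute" work gets submitted to alongside the graphics queue.
// Returns the family index, or -1 if the GPU/driver exposes none.
#include <vulkan/vulkan.h>
#include <vector>

int findDedicatedComputeQueue(VkPhysicalDevice gpu)
{
    uint32_t count = 0;
    vkGetPhysicalDeviceQueueFamilyProperties(gpu, &count, nullptr);
    std::vector<VkQueueFamilyProperties> families(count);
    vkGetPhysicalDeviceQueueFamilyProperties(gpu, &count, families.data());

    for (uint32_t i = 0; i < count; ++i) {
        const VkQueueFlags flags = families[i].queueFlags;
        if ((flags & VK_QUEUE_COMPUTE_BIT) && !(flags & VK_QUEUE_GRAPHICS_BIT))
            return static_cast<int>(i);  // compute-only family found
    }
    return -1;  // none exposed; fall back to the graphics queue
}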
____________________________________________________________________________________________________________

Sadly, DX12 seems not to focus on giving us more FPS with cheaper CPUs; it just makes new games on $1500+ rigs look better.

I'm a person who would be satisfied with, for example, a GTA without temporal aliasing, flickering edges, tearing, and input lag,
with a CPU+GPU budget of up to $350 ... for example, an i3-6100 with an RX 470 4GB

That could be done with DX12 or Vulkan.
#81
bug
Captain_Tom: LOL, can you read? I said IN VULKAN.
In Vulkan what? The whole architecture suddenly becomes more energy efficient?
Also, are you basing your Vulkan performance evaluation on something more than just Doom?

Edit: Mind you, while everybody was quick to point out how much more Polaris benefits from Vulkan in Doom, nobody was equally quick to measure the power consumption at the same time.
#82
efikkan
Relayer: nVidia is a member of Khronos too. They can add optimizations to Vulkan for their hardware as well. Of course, then it would be open source, though.
The APIs are not optimized for any hardware, that's up to the driver implementations.

Nvidia is not only a member; the Khronos president, Neil Trevett, is an Nvidia employee. Nvidia started OpenGL ES, and has been the major contributor to OpenGL (post-2.0), OpenCL, and also Vulkan. There is no doubt that they are dedicated to adding support for and evolving the APIs.
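To illustrate that split (a sketch, assuming a VkInstance was already created): an application issues the same Vulkan calls no matter whose GPU is installed, and whichever vendor's driver is present answers them -

// Sketch: identical API calls, different vendor drivers underneath.
// We simply ask each installed GPU who made it.
#include <vulkan/vulkan.h>
#include <cstdio>
#include <vector>

void printGpuVendors(VkInstance instance)
{
    uint32_t count = 0;
    vkEnumeratePhysicalDevices(instance, &count, nullptr);
    std::vector<VkPhysicalDevice> gpus(count);
    vkEnumeratePhysicalDevices(instance, &count, gpus.data());

    for (VkPhysicalDevice gpu : gpus) {
        VkPhysicalDeviceProperties props;
        vkGetPhysicalDeviceProperties(gpu, &props);
        // PCI vendor IDs: 0x1002 = AMD, 0x10DE = NVIDIA, 0x8086 = Intel
        std::printf("0x%04X  %s\n", props.vendorID, props.deviceName);
    }
}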
#83
Ungari
bug: How on earth is AMD's efficiency fine when the RX 480 eats as much power as the GTX 1070?
We're still talking pennies per month on the electric bill?

Every time I see the point brought up, I think of a parrot: "Power consumption, the power consumption... caw caw rawwwk!"
#84
RejZoR
People who go after three frames of difference and declare kings of the hill based on that are idiots. The fact is, if you buy a Fury X even today, you can be assured you'll enjoy all the latest games at the highest details and resolutions. You really have to be running the most demanding game in 4K with max settings to even drag the framerate down to 30 fps. What does that tell you? Reality is, graphics cards from either camp are about the same within a similar price range. It doesn't even matter what you pick; it's the tiny things that make your decision in the end. Someone who buys a new graphics card every year won't care about future-proofing. Someone who doesn't, might, because it means that particular graphics card will last longer. For some, V-Sync modes are more important, and for others it's the RGB lighting on the graphics card cover, or the sticker. If people always made rational decisions, they'd sell exactly ZERO Titan cards. And yet that's not the case. So stop bitching over a few frames per second of lead or loss; when you draw a line, they are basically the same. In the end, the emotional factor plays a larger role than actual performance.

@Ungari
People are funny. They bitch over the power consumption of graphics cards where it's like a 50 W difference. But when it comes to home appliances like fridges or tumble dryers, they don't care even about 100 kWh of difference per year. Like you said, it's literally pennies even for such massive differences; that 50 W difference is nothing. And it also doesn't reflect as dramatically in terms of thermals. It helps if it's lower, but people tend to blow this stuff way out of proportion.
#85
Ungari
rtwjunkie: I'm really curious where this almost immediate obsolescence is occurring, because my 980 Ti still performs stellarly, as does the 980 before it that is now used by my better half. Indeed, it continues to perform at its same original standard, probably better than at release, due to drivers optimizing performance. In that respect, it is no better or worse for its category than my R9 380X, which has matured in its mid-priced category.
It all depends on what games and tasks you are doing, and what you are looking to do in the near future. There are many popular low-demand games, and even new sprite games just released, but there are also those that push the limits of our cards if you want all the eye candy. Your card's VRAM can already be maxed out: shortly after the 980 Ti was released, a 6 GB HD texture pack came out for Rainbow Six Siege. This trend of games utilizing more and more VRAM is increasing.
I'm not suggesting that your card is obsolete right now, since Pascal is super-clocked Maxwell, but the lack of async compute will certainly be an issue if you ever decide to play games on these new APIs.
RejZoR: People are funny. They bitch over the power consumption of graphics cards where it's like a 50 W difference. But when it comes to home appliances like fridges or tumble dryers, they don't care even about 100 kWh of difference per year. Like you said, it's literally pennies even for such massive differences; that 50 W difference is nothing. And it also doesn't reflect as dramatically in terms of thermals. It helps if it's lower, but people tend to blow this stuff way out of proportion.
What is sad is that just because Nvidia uses it as a selling point in their advertising, tech enthusiasts actually buy into this as an important criterion in evaluating performance.
You would think that, of all people, tech junkies would know better.
#86
$ReaPeR$
RejZoR: People who go after three frames of difference and declare kings of the hill based on that are idiots. The fact is, if you buy a Fury X even today, you can be assured you'll enjoy all the latest games at the highest details and resolutions. You really have to be running the most demanding game in 4K with max settings to even drag the framerate down to 30 fps. What does that tell you? Reality is, graphics cards from either camp are about the same within a similar price range. It doesn't even matter what you pick; it's the tiny things that make your decision in the end. Someone who buys a new graphics card every year won't care about future-proofing. Someone who doesn't, might, because it means that particular graphics card will last longer. For some, V-Sync modes are more important, and for others it's the RGB lighting on the graphics card cover, or the sticker. If people always made rational decisions, they'd sell exactly ZERO Titan cards. And yet that's not the case. So stop bitching over a few frames per second of lead or loss; when you draw a line, they are basically the same. In the end, the emotional factor plays a larger role than actual performance.

@Ungari
People are funny. They bitch over the power consumption of graphics cards where it's like a 50 W difference. But when it comes to home appliances like fridges or tumble dryers, they don't care even about 100 kWh of difference per year. Like you said, it's literally pennies even for such massive differences; that 50 W difference is nothing. And it also doesn't reflect as dramatically in terms of thermals. It helps if it's lower, but people tend to blow this stuff way out of proportion.
Preach it, brother!
Ungari: What is sad is that just because Nvidia uses it as a selling point in their advertising, tech enthusiasts actually buy into this as an important criterion in evaluating performance.
You would think that, of all people, tech junkies would know better.
No, they don't, in their majority. Just like @RejZoR said, most choices are made based on "feelings," not rationality.
#87
Fluffmeister
Even sadder was when AMD created a video mocking nVidia's power consumption.
#88
TheGuruStud
bug: In Vulkan what? The whole architecture suddenly becomes more energy efficient?
Also, are you basing your Vulkan performance evaluation on something more than just Doom?

Edit: Mind you, while everybody was quick to point out how much more Polaris benefits from Vulkan in Doom, nobody was equally quick to measure the power consumption at the same time.
Bullshit. I saw the reviews. Power usage went up a few watts. Quit trying to spin it.
#89
renz496
R-T-B: I know my throat is getting hoarse from saying this, but that's simply not true. This is more evidence that no one here really understands what words/phrases like "low-level" and "close to the metal" mean.

You don't optimize hardware to a low-level API; you optimize software to the hardware exposed by a low-level API.

At this moment, people are using the parts exposed by DX12 to better optimize for AMD because, frankly, there's a lot of optimizing to do compared to their DX11 renderer. There is some valid argument that async compute IS better supported on AMD's side, but it's not valid the way you are using it, as NVIDIA also supports several things AMD doesn't:

This. That's why I wonder if those who say that Nvidia needs to build their architecture better to take advantage of DX12 really understand what going low-level is.
Captain_Tom: Anyone else remember that both AMD and Nvidia are bidding to supply the graphics in Samsung's next smartphone APUs?

By making Vulkan the standard API of Android, AMD may have just secured a massive advantage in their bidding...
It doesn't matter if AMD has a massive advantage in Vulkan. The only thing that matters is whether the hardware supports Vulkan or not. Samsung is building a phone, not the ultimate gaming machine, so whoever can give the better deal will probably win. Though what Samsung is discussing with AMD and Nvidia is mostly not about making GPUs for Samsung.
Captain_Tom: Tegra chips are absolute garbage in terms of efficiency. Their powerful chips hog 25-50 W (far more than a phone can take), and their 5 W variants fail to beat their Qualcomm/Apple competition.


AMD's efficiency is totally fine; 14 nm just isn't mature for big chips yet. Furthermore, you should look at AMD's efficiency in Vulkan. Their far-cheaper-to-produce 480 is roughly as efficient as the 1070 (like a 10% difference).
And you think AMD can hold a candle in the SoC space with GCN? You just look at the desktop parts and conclude that AMD will do fine in mobile SoCs? That is outright delusional.
#90
TheGuruStud
renz496: And you think AMD can hold a candle in the SoC space with GCN? You just look at the desktop parts and conclude that AMD will do fine in mobile SoCs? That is outright delusional.
Low clocks are different (if indeed they would be low). With proper fabbing, maybe.
#91
renz496
EarthDog: /thread.

Though it is great news that Android will use it, that doesn't mean the PC market will adopt it. I certainly hope it does; more competition never hurts. I won't hold my breath, though.
But I seriously believe most Android games will end up using only OpenGL ES 2.0, lol.
#92
Prima.Vera
Quick question.
Are the current-gen consoles also able to support Vulkan? How about the PS4 Pro?
#93
INSTG8R
Vanguard Beta Tester
Prima.Vera: Quick question.
Are the current-gen consoles also able to support Vulkan? How about the PS4 Pro?
Well, it would make sense for them to go that route, seeing as they have been using OGL anyway.
#94
efikkan
Prima.Vera: Quick question.
Are the current-gen consoles also able to support Vulkan? How about the PS4 Pro?
The PS4, Xbox One, and their updated models have full hardware support for Vulkan 1.0; it's just a matter of drivers.
#95
Jism
If DX12 and its functions are limited to W10, how does Mantle fare on an older OS? For example, does it offer all the hardware features that would normally be available in DX12 on W10?
#96
Ungari
R-T-B: I know my throat is getting hoarse from saying this, ...
Do you yell at the monitor while replying to posts?
#97
Pruny
efikkan: The PS4, Xbox One, and their updated models have full hardware support for Vulkan 1.0; it's just a matter of drivers.
I think Microsoft only allows DirectX on the Xbox.
#98
bug
Ungari: We're still talking pennies per month on the electric bill?

Every time I see the point brought up, I think of a parrot; "Power Consumption, the Power Consumption...caw caw rawwwk!" .
Well, you may say that on the desktop, but in this case we're talking mobile, where every watt counts.
#99
efikkan
Jism: If DX12 and its functions are limited to W10, how does Mantle fare on an older OS? For example, does it offer all the hardware features that would normally be available in DX12 on W10?
It has nothing to do with the features in Direct3D 12; it's just that Microsoft wants to keep it to Windows 10. Hardware features have little to do with OSes ;)
Pruny: I think Microsoft only allows DirectX on the Xbox.
Yes, but we are talking about Vulkan ;)
#100
R-T-B
Ungari: Do you yell at the monitor while replying to posts?
Only when I'm angry.