
AMD's Vega-based Cards to Reportedly Launch in May 2017 - Leak

Still waiting for the 80% difference ...
[attached charts: polaris_vs_pascal.png, polaris_vs_pascal_1.png]

These charts are not that hard to read!
 
No, they are not. But people with common sense don't cut them out and compare different tiers.
Now who's the one cherry-picking? We are comparing efficiency, not price here, and you wouldn't find two chips that are exactly the same size. If you don't like a comparison with GP104 and GP106 then you have a problem.

I think you have forgotten what we were discussing here. Some claim Vega will be more efficient than Pascal, and since Vega 10 will compete with GP104 it's a fair comparison. GP106 is 55% more efficient, GP104 ~80% more efficient, and GP102 ~85% more efficient, so AMD have their work cut out for them.
 
But against the 1060 3GB, which is the closest in performance, the difference is under 20%. What's your point?

You are comparing the least efficient Pascal card to the most efficient Polaris. Average them all and you'll get Pascal being ~50% more efficient.

EDIT: Actually did the calculations, and Pascal is 59% more efficient on average: 460, 470, and 480 vs. 1050 Ti, 1060 3GB, 1060 6GB, 1070, and 1080.
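The averaging described above can be sketched in a few lines of Python. The performance indices and power figures below are illustrative placeholders, not the actual TPU review numbers, so the printed percentage only demonstrates the method:

```python
# Sketch of the perf-per-watt averaging described above.
# The performance indices and board-power figures are illustrative
# placeholders, not the actual TPU review numbers.

def perf_per_watt(perf_index, watts):
    """Relative performance divided by average gaming power draw."""
    return perf_index / watts

# name -> (relative performance index, average gaming power in W)
polaris = {"RX 460": (35, 75), "RX 470": (60, 120), "RX 480": (70, 160)}
pascal = {"1050 Ti": (45, 60), "1060 3GB": (65, 110), "1060 6GB": (72, 115),
          "1070": (100, 150), "1080": (125, 175)}

def mean_efficiency(cards):
    """Average perf/W over a set of cards."""
    vals = [perf_per_watt(p, w) for p, w in cards.values()]
    return sum(vals) / len(vals)

advantage = mean_efficiency(pascal) / mean_efficiency(polaris) - 1
print(f"Pascal is {advantage:.0%} more efficient on average")
```

Note that averaging perf/W over whole lineups hides the tier mismatch both posters are arguing about, which is why the result swings so much with the card selection.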
 
Stop bitching over the usual crap, all the conjecture is pointless, no point getting worked up and feisty over it. There is no Vega for months.

Edited because there's no point talking performance.
 
You are comparing the least efficient Pascal card to the most efficient Polaris. Average them all and you'll get Pascal being ~50% more efficient.

EDIT: Actually did the calculations, and Pascal is 59% more efficient on average: 460, 470, and 480 vs. 1050 Ti, 1060 3GB, 1060 6GB, 1070, and 1080.

The other thing is: were these calculations done with the latest drivers? We've already seen AMD gain another 10% performance with Polaris under DX11. But let's face it, Nvidia does so well efficiency-wise because compute features have been stripped off the die since Kepler. That's half the reason even 10-series Nvidia GPUs are still not fully DX12 and Vulkan feature-level compliant, and why they gain nothing from either API.
 
The other thing is: were these calculations done with the latest drivers? We've already seen AMD gain another 10% performance with Polaris under DX11. But let's face it, Nvidia does so well efficiency-wise because compute features have been stripped off the die since Kepler. That's half the reason even 10-series Nvidia GPUs are still not fully DX12 and Vulkan feature-level compliant, and why they gain nothing from either API.

AMD is not fully DX12 compliant either. The only reason AMD performs better in DX12 and Vulkan is better use of compute and parallelism of sorts.
Too many people wave the DX12 flag as if it's magical. It's not. It brings no better graphics settings and is so far proving hard to code for.
People get pissed off that Nvidia performs so well with such a lean, efficient design, and people get pissed off because AMD is doing better.
People ought to look at things from outside the arena and see how fantastic both are for us consumers in the long run.
 
AMD is not fully DX12 compliant either. The only reason AMD performs better in DX12 and Vulkan is better use of compute and parallelism of sorts.
Too many people wave the DX12 flag as if it's magical. It's not. It brings no better graphics settings and is so far proving hard to code for.
People get pissed off that Nvidia performs so well with such a lean, efficient design, and people get pissed off because AMD is doing better.
People ought to look at things from outside the arena and see how fantastic both are for us consumers in the long run.

There is not one PC game so far that requires the D3D12 runtime; most DX12 games run with feature level 11_1 or 11_3, which are DX12 compliant. It's going to take a few more years for things to sort out with DX12, and even longer to see a "real DX12" game rather than a DX11 game with a DX12 stamp.

Hardware has to be available.
Developers have to be willing to program for it.
There has to be enough of a customer base to make it worth it.
 
There is not one PC game so far that requires the D3D12 runtime; most DX12 games run with feature level 11_1 or 11_3, which are DX12 compliant. It's going to take a few more years for things to sort out with DX12, and even longer to see a "real DX12" game rather than a DX11 game with a DX12 stamp.

Hardware has to be available.
Developers have to be willing to program for it.
There has to be enough of a customer base to make it worth it.

Yes, that's why folk need to stop saying NV isn't compliant, as if AMD is. It's post-truth.

Nobody has it nailed down and as you rightly say, it's going to take quite a while to get it right. Still a lot of folk on W7 as well (raises hand) which doesn't help the cause.
 
So a year behind the big green :)

Why are you happy about this? If AMD falls too far behind we'll be seeing an Nvidia monopoly and GPU prices will skyrocket. Just look at the price of a 1080 vs. the 980 at launch: the 1080 is a whopping $775 (3100 lei, tax included) in my country, while the 980 was about $550-600 (2000-2450 lei depending on model) soon after launch. That's a HUGE price increase of $200 for a high-end card over a period of what, two years?

This is what happens when there's no competition. Not to mention lack of competition is BORING for hardware enthusiasts like myself.

AMD is not fully DX12 compliant either. The only reason AMD performs better in DX12 and Vulkan is better use of compute and parallelism of sorts.
Too many people wave the DX12 flag as if it's magical. It's not. It brings no better graphics settings and is so far proving hard to code for.
People get pissed off that Nvidia performs so well with such a lean, efficient design, and people get pissed off because AMD is doing better.
People ought to look at things from outside the arena and see how fantastic both are for us consumers in the long run.

Agreed.

The thing is, there are huge improvements in performance using DX12/Vulkan. I've had the opportunity to test an 8GB RX 480 (Asus Strix) and a 4GB RX 470 (PowerColor Red Dragon V2), and I have to say I was impressed.
 
They've always had a gaming flagship & a separate (compute) flagship ever since the days of Fermi. That they've neutered DP on subsequent Titans is something entirely different; the original Titan had excellent DP capabilities, but every card that's followed has had DP cut down massively, & yet many call it a workstation card.

The GP102 & GP100 are different because of HBM2. Nvidia probably felt that they didn't need "next-gen" memory for their gaming flagship, or that saving a few more $ was a better idea, i.e. with GDDR5X, & a single card cannot support these two competing memory technologies.

Not really. With Fermi and Kepler, GF100/GF110/GK110 were still used in gaming cards, but with Pascal, GP100 is not used in any gaming card at all. And even though GK110 is more compute-oriented and GK104 more gaming-oriented, their SM arrangement is still the same; that is not the case for GP100 versus the other Pascal chips.

The Titan X Maxwell might not have had DP capability as excellent as the Kepler-based Titan, but its spec is exactly the same as the Quadro M6000. So if you don't need the certified drivers that come with the Quadro, the Titan X could easily replace the Quadro M6000 at a far cheaper price (~$1k vs. ~$5k). In fact, as AnandTech pointed out, some companies actually built their products using the Titan X Maxwell instead of a Quadro or Tesla. The Titan X Pascal is sold for its unlocked INT8 performance.

No one said it wasn't, but they could've gone the route of AMD & given 16/32 GB of HBM2 to the 1080 Ti/Titan, & yet they didn't, & IMO that's down a lot to costs.
The Titan is still marketed as a workstation card; I do wonder why.

The Titan X Maxwell is a bit weird, since there is no advantage to it compared to a regular GeForce card. The only advantage it really has is the 12GB of VRAM, giving it the same spec as the Quadro M6000. But I heard talk of some people seeking it out for its FP16 performance; that might be the reason why Nvidia ended up crippling FP16 in non-Tesla cards with Pascal.
 
So a year behind the big green :)
Feels good, doesn't it? I mean, paying $700 for a 314 mm² chip.

On the other hand, if you pull all the data from the Titan X review we see that both manufacturers see decreasing performance per flop as they go up the scale and it's worse for AMD:
View attachment 82981

So if we plot all of these cards as GFLOPS vs. performance and fit a basic trend line, we get that big Vega would be around 80% of Titan performance at 12 TFLOPS (this puts it even with the 1080).


View attachment 82980

So for AMD to reach 1080 Ti levels, they'll need to have improved their cards' efficiency by 10-15% for this architecture.

Given the number of changes they've talked about with this architecture, I don't think that's infeasible but it is a hurdle to overcome.
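The trend-line extrapolation described above can be sketched as follows. The (TFLOPS, performance) points are illustrative placeholders rather than the actual review data, so the projected number only demonstrates the extrapolation, not a real prediction:

```python
# Sketch of the trend-line extrapolation described above: fit relative
# performance against TFLOPS per vendor, then project a hypothetical
# 12 TFLOPS Vega. All data points are illustrative placeholders.
import numpy as np

# (TFLOPS, relative performance index) -- placeholder points
amd = np.array([(2.6, 35.0), (4.9, 60.0), (5.8, 70.0), (8.6, 90.0)])
nvidia = np.array([(2.1, 45.0), (4.4, 72.0), (6.5, 100.0),
                   (8.9, 125.0), (11.0, 150.0)])

def fit_line(points):
    """Least-squares linear fit: performance = slope * TFLOPS + intercept."""
    slope, intercept = np.polyfit(points[:, 0], points[:, 1], 1)
    return slope, intercept

a_slope, a_int = fit_line(amd)
vega_est = a_slope * 12.0 + a_int  # projected big-Vega performance index
print(f"Projected 12 TFLOPS Vega index: {vega_est:.0f}")
```

A linear fit through a handful of points is a rough tool; as the post notes, perf/TFLOP tends to fall off as chips scale up, so a straight line likely overestimates the top end.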


It's an interesting take, but you need to explain why Vega would have worse perf/tflop than Polaris (after AMD explicitly claimed more "non tflop" stuff in it).

Also, 1060 outperforming 480 in your chart is... somewhat outdated.


480 = 232 mm² (15% larger than the 1060 but 10% slower)

Well, it's interesting to mention that six months after release the 480 pulled ahead.
https://www.techpowerup.com/forums/...ver-improving-over-time-is-not-a-myth.228443/

You are comparing the least efficient Pascal card to the most efficient Polaris. Average them all and you'll get Pascal being ~50% more efficient.

EDIT: Actually did the calculations, and Pascal is 59% more efficient on average: 460, 470, and 480 vs. 1050 Ti, 1060 3GB, 1060 6GB, 1070, and 1080.

Bullshit:

upload_2017-1-25_20-42-26.png


https://www.computerbase.de/2017-01...si-gaming-test/3/#abschnitt_leistungsaufnahme

Thanks to TPU benchmarks (I have yet to find a site that claims the 480 consumes more than a 1070), in practice negligible power-consumption differences get hyperbolized into orbit.
 
It's an interesting take, but you need to explain why Vega would have worse perf/tflop than Polaris (after AMD explicitly claimed more "non tflop" stuff in it).

Also, 1060 outperforming 480 in your chart is... somewhat outdated.




Well, it's interesting to mention that six months after release the 480 pulled ahead.
https://www.techpowerup.com/forums/...ver-improving-over-time-is-not-a-myth.228443/

Exactly.

The odd arguments these people keep making are quite puzzling, no?

-Why do people keep comparing Hawaii and Fiji TFLOPS to Vega, when Polaris is much closer architecturally?

-Why do people keep exaggerating how powerful the Titan is?! As usual, people seem to be misled by the name: it's only ~45% stronger than the Fury X, depending on the game and resolution. Do people really think it's going to be hard for AMD to make a card 50% stronger than a card they made two years ago?!

-Overall, why do people continue to just assume AMD can't make powerful cards? The 5870 and 5970 practically had supremacy in the Enthusiast market for a FULL YEAR. The 7970 and then GHz edition essentially held the performance crown for an entire year until Nvidia released a larger card that cost $1000, and then the 290X crushed the Titan and 780 Ti HARD. AMD is no stranger to performance wins.
 
-Overall, why do people continue to just assume AMD can't make powerful cards? The 5870 and 5970 practically had supremacy in the Enthusiast market for a FULL YEAR. The 7970 and then GHz edition essentially held the performance crown for an entire year until Nvidia released a larger card that cost $1000, and then the 290X crushed the Titan and 780 Ti HARD. AMD is no stranger to performance wins.

Are you Donald Trump? The misuse of historical accuracy is disturbing. For reference, I simply googled "290X review" and went to the first one, a TPU review of the 290X Lightning model.

I don't see the 780 Ti being crushed HARD, as you put it. I appreciate that you prefer AMD (I've seen you on other sites lauding AMD to the moon), but your abuse of the truth is silly.

Again, I hope the Vega card is more powerful than most people are expecting. I'll be very pissed off if a card I'm stalling an entire build for doesn't match (or beat) the rumoured 1080 Ti. I'm in a win-win here: I want Vega to perform well, and I'm not a zealot like some Nvidia owners can be. But you're always a wee bit too red-pumped to be taken seriously.

perfrel_1600.gif
 
Are you Donald Trump? The misuse of historical accuracy is disturbing. For reference, I simply googled "290X review" and went to the first one, a TPU review of the 290X Lightning model.

I don't see the 780 Ti being crushed HARD, as you put it. I appreciate that you prefer AMD (I've seen you on other sites lauding AMD to the moon), but your abuse of the truth is silly.

Again, I hope the Vega card is more powerful than most people are expecting. I'll be very pissed off if a card I'm stalling an entire build for doesn't match (or beat) the rumoured 1080 Ti. I'm in a win-win here: I want Vega to perform well, and I'm not a zealot like some Nvidia owners can be. But you're always a wee bit too red-pumped to be taken seriously.

perfrel_1600.gif

HAHAHAHA, a 900p benchmark. Nice "alternative facts", shill. Or perhaps this is the ideal resolution for Nvidiots.

Here's something from 2016:

http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_1080/images/perfrel_3840_2160.png

http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_1080/images/perfrel_2560_1440.png

The 780 Ti almost losing to a 7970 GHz (which beat the pathetic 780).


To be clear, I didn't cherry-pick at all. I simply googled "290X 780 Ti 2016"; that is the first thing that popped up.
 
-Overall, why do people continue to just assume AMD can't make powerful cards? The 5870 and 5970 practically had supremacy in the Enthusiast market for a FULL YEAR. The 7970 and then GHz edition essentially held the performance crown for an entire year until Nvidia released a larger card that cost $1000, and then the 290X crushed the Titan and 780 Ti HARD. AMD is no stranger to performance wins.

You're talking release performance, right? It seems so.

a) The 7970 lost to the GTX 680 when the 680 was released, but it had the performance crown for about ~3 months before that. Later, the 7970 GHz Edition equalled the GTX 680's performance, or was a bit faster, but with far higher power consumption.

b) The 290X was faster than the GTX 780 at release and had a good fight with the GTX Titan. Nvidia then released the 780 Ti to counter the 290X/Hawaii GPUs, and the 780 Ti easily beat the 290X. Later, custom 290X cards came closer to the 780 Ti's performance, but custom 780 Ti versions were again faster.

c) The only thing you had right was the HD 5870 and 5970 being unmatched until the GTX 480 and GTX 500 series came. The GTX 480 came late and was faster, but consumed a hell of a lot of power. The GTX 580 easily won vs. the HD 5000 and HD 6000 series.

This is still talking about release performance and at most six months after. It's pretty irrelevant to compare performance now while disregarding release performance, as nobody buys a GPU to have more performance than the competitor's GPU three years later.

Also @the54thvoid is right, here are the other tables:

perfrel_1920.gif


perfrel_2560.gif


https://www.techpowerup.com/reviews/MSI/R9_290X_Lightning/24.html

That's one of the best custom 290X cards losing to a reference 780 Ti. A custom 780 Ti is even faster. The 290X simply had no chance.
 
You're talking release performance, right? It seems so.


No, I am talking about right now. Right now the 290X destroys the 780 Ti and even trades blows with the 980. Heck, even in late 2014 the 290X was curb-stomping the 780 Ti.


Sorry but I don't need to read your long fanboy rant. Continue to live in the past - All Nvidiots do by necessity.
 
No, I am talking about right now. Right now the 290X destroys the 780 Ti and even trades blows with the 980. Heck, even in late 2014 the 290X was curb-stomping the 780 Ti.


Sorry but I don't need to read your long fanboy rant. Continue to live in the past - All Nvidiots do by necessity.
Any evidence for your bold statements? Until now you're just behaving like a childish fanboy, nothing more.
 
Any evidence for your bold statements? Until now you're just behaving like a childish fanboy, nothing more.

lol scroll up cry baby. I already posted TechPowerup benches from 2016.


Oh.... He can't read. That's sad.
 
lol scroll up cry baby. I already posted TechPowerup benches from 2016.


Oh.... He can't read. That's sad.
I already stated: nobody cares about performance two or three years down the line when choosing to buy a GPU in 2014. If you're unable to understand this, I'm going to end this discussion. And I thought fanboys like this were prominent elsewhere.

Also, calling someone else a "Nvidiot" while behaving like a fanboy is really funny. It seems you don't see your own behaviour as fanboyism, but it is.
Sorry but I don't need to read your long fanboy rant.

Oh.... He can't read. That's sad.

You're a really funny person. I'd say the only fanboy here, is you.
 
I already stated, nobody cares about performance in 2-3 years when he buys a 290X in anno 2014,

Anno 2014? What are you talking about lmao! You can't read!

I posted the aggregate framerate - that includes Nvidia's broken "The Way It's Meant to not Boot" games.

Most people keep their cards for 2-3 years on average. Even at launch the 780 Ti was tied at 4K, and only won by 10% at most at 1080p (1440p depended on the game). Nobody with half a brain pays 30% more money for 0-10% more performance that will only last a year. Considering the 780 Ti came out AFTER the 290X, I would call that pretty pathetic.
 
Anno 2014? What are you talking about lmao! You can't read!

I posted the aggregate framerate - that includes Nvidia's broken "The Way It's Meant to not Boot" games.

Most people keep their cards for 2-3 years on average. Even at launch the 780 Ti was tied at 4K, and only won by 10% at most at 1080p (1440p depended on the game). Nobody with half a brain pays 30% more money for 0-10% more performance that will only last a year. Considering the 780 Ti came out AFTER the 290X, I would call that pretty pathetic.
Yeah, of course: for a fanboy like yourself, anything Nvidia does is pathetic, and you hate people that use Nvidia cards, calling them "Nvidiots", but you know nothing about them. That is pathetic behaviour.

Yes, the 290X is better now than it was years ago, but that doesn't change the fact that nobody can see the future. People paying over 500 bucks for a GPU mostly chose the 780 Ti because it was simply the all-around better GPU: power-consumption-wise, performance-wise by far (custom vs. custom), and Nvidia also simply had better drivers, at least until a few months ago. 4K didn't play the slightest role back then; it's not even really important now. At 1440p and 1080p, which I posted, even the reference 780 Ti had an easy win vs. the Lightning 290X, one of the best 290X cards, and this was many months after the 290X's release, so drivers were already a lot better.

You can say that the 290X/390X is on the same level or better now, but power consumption is still a mess (idle, average gaming, multi-monitor, Blu-ray/web). So in the end it's still not really better in my books; I'm using multiple monitors and I don't want to waste 35-40 W. I also don't want a GPU that consumes 250-300 W of power, and that's especially true for the 390X, which is even more power-hungry than the 290X. Nvidia GPUs are way more sophisticated power-gating-wise and much more flexible with core clocking, and only Vega can change that, because Polaris didn't. And yes, I hope that Vega is a success.

Comparing the 290/390X with the 980 is just laughable. The 980 is a ton more efficient. Maybe efficiency is not important for you, but for millions of users it is. It has also to do with noise, 780 Ti/980 aren't as loud as 290/390X.

But I'm still laughing about you calling me a "Nvidiot fanboy". I'm a regular poster in AMD reddit, I owned several Radeon cards and I know ~everything about AMD. If at all, I'm more likely a fanboy of AMD than Nvidia, but this doesn't change certain facts that you can't change as well.
 
Yeah, of course: for a fanboy like yourself, anything Nvidia does is pathetic, and you hate people that use Nvidia cards, calling them "Nvidiots", but you know nothing about them. That is pathetic behaviour.

Yes, the 290X is better now than it was years ago, but that doesn't change the fact that nobody can see the future. People paying over 500 bucks for a GPU mostly chose the 780 Ti because it was simply the all-around better GPU: power-consumption-wise, performance-wise by far (custom vs. custom), and Nvidia also simply had better drivers, at least until a few months ago. 4K didn't play the slightest role back then; it's not even really important now. At 1440p and 1080p, which I posted, even the reference 780 Ti had an easy win vs. the Lightning 290X, one of the best 290X cards, and this was many months after the 290X's release, so drivers were already a lot better.

You can say that the 290X/390X is on the same level or better now, but power consumption is still a mess (idle, average gaming, multi-monitor, Blu-ray/web). So in the end it's still not really better in my books; I'm using multiple monitors and I don't want to waste 35-40 W. I also don't want a GPU that consumes 250-300 W of power, and that's especially true for the 390X, which is even more power-hungry than the 290X. Nvidia GPUs are way more sophisticated power-gating-wise and much more flexible with core clocking, and only Vega can change that, because Polaris didn't. And yes, I hope that Vega is a success.

Just a moment here. Did you even read the reviews? The reference 780 Ti consumed 15 W less when gaming (muh better efficiency!!111), was 8% faster, and was 28% more expensive vs. the reference 290X. And even if the 290X was 8% slower, it still managed to push 60+ fps in most games tested here on TPU at 1080p. People only bought the 780 Ti because it was Nvidia, not because it was that much better, as you say. The only two problems with the 290X were its multi-monitor power consumption and poor reference cooling. Otherwise it was a great card! Stop making things up ...
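The arithmetic behind this post can be checked in a couple of lines, taking the figures quoted in the thread ($699 launch price for the 780 Ti vs. $549 for the 290X, with the 780 Ti ~8% faster) as given:

```python
# Price/performance check using the launch figures quoted above:
# 780 Ti at $699 vs. reference 290X at $549, 780 Ti ~8% faster.
price_780ti, price_290x = 699, 549
perf_uplift = 0.08  # ~8% faster per the review figures cited above

price_premium = price_780ti / price_290x - 1  # roughly 27-28% more expensive
perf_per_dollar_ratio = (1 + perf_uplift) / (1 + price_premium)
print(f"price premium {price_premium:.0%}, "
      f"780 Ti perf-per-dollar is {perf_per_dollar_ratio:.0%} of the 290X's")
```

A ratio below 100% means the 780 Ti delivered less performance per dollar at launch, which is the point the post is making.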
 
It's an interesting take, but you need to explain why Vega would have worse perf/tflop than Polaris (after AMD explicitly claimed more "non tflop" stuff in it).

Also, 1060 outperforming 480 in your chart is... somewhat outdated.

It's because as AMD's dies get larger and clock speeds go up, the performance per TFLOP decreases.

¯\_(ツ)_/¯
 
Yeah of course, for a fanboy like yourself, anything Nvidia does is pathetic and you hate people that use Nvidia cards

Comparing the 290/390X with the 980 is just laughable. The 980 is a ton more efficient. Maybe efficiency is not important for you, but for millions of users it is. It has also to do with noise, 780 Ti/980 aren't as loud as 290/390X.

But I'm still laughing about you calling me a "Nvidiot fanboy".


No, I have nothing inherently against Nvidia, or any company that makes a product. I don't call people "Nvidiots" because they buy Nvidia cards; I do so when I truly believe they are fanboys. And yeah, I assume most people who defend Kepler are in fact fanboys, because Kepler was a big joke if you actually know what you are talking about.


I have owned plenty of Nvidia cards (haven't owned any for a few years now, though). However, I honestly don't believe you when you say you own AMD cards, considering the continued noob arguments I keep hearing.


The 390X is noisy, huh? Um, no; they were all whisper-quiet AIB cards. Of course you probably don't know that, because you are clearly uninformed on all of these cards from top to bottom. I mean, the 290X was hot, but not loud when using its default fan settings; and again, that's for the cheap launch cards. If you had bought the plentifully available AIB cards, you would know they were very quiet. It's quite funny you bring up noise when the Titan series (and now the 1070/1080 FE) have been heavily criticized for their under-performing and noisy coolers.


Also, the efficiency argument is my favorite myth. Only with the release of Maxwell did Nvidia start to have any efficiency advantage at all, and that was only against the older-generation AMD cards. I will leave you with this:

upload_2017-1-27_15-40-20.png



^WOW! A full 5-10% more efficient (depending on the card)! Anyone who thinks that is worth mentioning is simply looking for reasons to support "their side."

Pascal was really the first generation Nvidia won efficiency in any meaningful way.
 