Thursday, July 6th 2017

AMD RX Vega Reportedly Beats GTX 1080; 5% Performance Improvement per Month

New benchmarks of an RX Vega engineering sample video card have surfaced. There have been quite a few benchmarks already for this card, which shows up under the 687F:C1 identifier. The new, GTX 1080-beating result (a Gaming X version, so a factory-overclocked card) comes courtesy of 3DMark 11, with the 687F:C1 RX Vega delivering 31,873 points in its latest appearance (versus 27,890 in its first). Since the clock speed of the 687F:C1 RX Vega has remained the same throughout this benchmark history, I think it's fair to say these gains have come purely from driver and/or firmware-level performance improvements.
The folks at Videocardz have put together an interesting chart detailing the 687F:C1 RX Vega's score history since benchmarks of it first started appearing, around three months ago. The chart shows an impressive performance trend, with AMD's high-performance GPU contender improving by roughly 15% since it was first benchmarked. That averages out to around a 5% improvement per month, which bodes well for the graphics card... at least in the long term. We have to keep in mind that this video card brings with it some pretty extensive departures from existing GPU architectures on the market, with the implementation of HBC (High Bandwidth Cache) and HBCC (High Bandwidth Cache Controller). These architectural differences naturally require large amounts of additional driver work to enable them to function to their full potential - full potential that we aren't guaranteed RX Vega GPUs will be able to deliver come launch time.
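For reference, here is a quick sanity check of those numbers as a minimal Python sketch (the roughly three-month span between the two runs is an approximation from the leak timeline):

```python
# Back-of-the-envelope check of the reported 687F:C1 score progression.
first_score = 27_890   # first recorded 3DMark 11 result
latest_score = 31_873  # latest recorded 3DMark 11 result
months = 3             # approximate time between the two runs

total_gain = latest_score / first_score - 1
# Compounded monthly rate implied by the total gain over three months
monthly_rate = (1 + total_gain) ** (1 / months) - 1

print(f"Total improvement:    {total_gain:.1%}")    # ~14.3%, i.e. roughly 15%
print(f"Implied monthly rate: {monthly_rate:.1%}")  # ~4.6%, i.e. roughly 5%/month
```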
Sources: Videocardz, 3DMark's latest 687F:C1, 3DMark's first 687F:C1

141 Comments on AMD RX Vega Reportedly Beats GTX 1080; 5% Performance Improvement per Month

#26
bug
RejZoRSo, being rational and understanding technologies without jumping to baseless conclusions makes you a fanboy now. Okay...
RejZoR... But clearly, it's not going to cost 800€...
Posted on Reply
#27
ratirt
Filip GeorgievskiAt this point I'm just thinking: is this the best VEGA they have?
And if it is priced at $600 it will be too expensive for my taste, and this is coming from an AMD GPU fan (not fanboy).
You guys do remember that the R9 FURY X in DX12 is also neck and neck with the 1080, and VEGA barely beats the 1080?

Price-wise this would not be a smart move, since you can find a FURY X as low as $350 and it is maybe 5-8% slower than VEGA at almost the same TDP (the FURY X has a 300 W+ TDP).

I hope AMD releases a better VEGA chip, since this one is not any better than the FURY X.
I'd wait for the full review when the card gets here. Hold your horses, bro :) It may be an indicator of the performance, but it is not what it actually is in a real game environment. So be patient.
RejZoRSo, being rational and understanding technologies without jumping to baseless conclusions makes you a fanboy now. Okay...
It appears that's what it is now, and the actual fanboys call themselves experts :)
Posted on Reply
#28
EarthDog
RejZoRSo, being rational and understanding technologies without jumping to baseless conclusions makes you a fanboy now. Okay...
There are people who believe the earth being flat is a rational thought... they understand technology too.

Not saying you don't get it, just stating that what one believes is rational and understands the tech doesn't mean they are right either, since it's in their head.

Forums, lol!
Posted on Reply
#29
TheinsanegamerN
RejZoROf course it needs more power. It's not yet using tiling. I'm shocked how people still don't get it. Tile-based rendering is the essence of efficiency; it's why it's been used on mobile devices forever. It's also why Maxwell/Pascal cards are so efficient.
So another magic "thing" that will save AMD then? Just like DX12, Vulkan, Mantle, Windows 10, and consoles, now we have TBR that will save AMD.

No it won't. All the tricks in the world can't make up for a half-baked arch.
R0H1TThe real benefits of Vega will not be seen until after Scorpio & the PS5 are released; it's going to be like the HD 7970 or R9 290 (i.e. 390) all over again.

Also, 1800 MHz boost clocks sound impressive but fake; they'd have to move a long way from the GCN used right up until Polaris & I don't see that happening without some major compromise, perhaps in GPGPU?
So VEGA will only perform properly once it is 3 years old and once it no longer matters from a sales perspective?
Posted on Reply
#30
phanbuey
hopefully it will push down the prices of the current batch...

Would be nice to have a Ti for less than $700 lol
Posted on Reply
#31
Gasaraki
ratirtThat's good news, right? Improvements of 5% per month. Nice. I wonder how long they can keep this going. I'm sure they'll finally hit a wall with performance gains from driver improvements.

BTW, I wonder if there will be any OC potential. If this Vega hits 1800, it might get interesting.
OMG, by the time it comes out, it'll beat the 1080 Ti by 30%! Sweet!
Posted on Reply
#32
R0H1T
ratirtIt was wishful thinking, not information. Stop picking at words and creating your own fake understanding. We're not talking about Polaris but Vega, and we don't know the potential. It probably won't hit 1800 MHz. It would be interesting if it did.
Read carefully next time, please.
We also don't know how different Vega is from Polaris: is it like GCN 1.2 (Fiji or Tonga) vs GCN 1.0, or like Polaris (GCN 2.0) vs Tahiti?

It may, but that'd be a radical departure from the GCN uarch used all the way up to Polaris. GCN is a compute-heavy platform & even if we take a better GF process node into account, it's hard to see Vega clocking that high unless it's radically different from anything before it. I wasn't adding anything to your PoV, I was just expanding on what potential Vega holds and how it could clock that high.
Posted on Reply
#33
Sandbo
FordGT90Concept3DMark11? Seriously? Why run anything other than TimeSpy and Firestrike?
Same question here; every time I see these old results I get skeptical. It just looks like they're holding something back.
Posted on Reply
#34
Octopuss
RejZoRSo, being rational and understanding technologies without jumping to baseless conclusions makes you a fanboy now. Okay...
I'm not saying you are. I'm saying it could give such an impression.

Maybe I am one too, not having had an Nvidia card in my PC since 2002...
Posted on Reply
#35
RejZoR
TheinsanegamerNSo another magic "thing" that will save AMD then? Just like DX12, Vulkan, Mantle, Windows 10, and consoles, now we have TBR that will save AMD.

No it won't. All the tricks in the world can't make up for a half-baked arch.


So VEGA will only perform properly once it is 3 years old and once it no longer matters from a sales perspective?
And you think Maxwell/Pascal achieves efficiency through, I don't know, magic pixies? Dear god. NVIDIA made a smart move to implement TBR when it actually mattered most, while everyone was struggling to shrink 28nm down to 16/14nm. It's what gave them the power advantage, because that was really the only way to make things more efficient on the given 28nm fab process. You can't just pull efficiency out of thin air. AMD simply decided not to do that and to keep working with what they had on 28nm, probably to minimize development costs (let's be honest, they were financially occupied with Ryzen). It's what made Maxwell 2 so power efficient, and once they shrunk that down to 14nm, they had an additional edge with Pascal. It's not rocket science, it's a simple understanding of GPU designs. And I mean understanding at a very high level. Anyone here should get it.

NVIDIA was throwing money into R&D and it paid off in the form of efficiency. Framebuffer compression, the TBR rasterizer - it probably cost them a lot, but it paid off. AMD used a different approach, saving costs by tweaking what they already had, and worked with a years-old design which, despite being a bit more power hungry, still delivered performance. The Hawaii core, tweaked a bit into Grenada, made the R9 390X competitive against the GTX 980. They were essentially trading blows through games. The only reason I grabbed a GTX 980 was because I was curious. I'd had Radeons for years and there was a lot of buzz around the GTX 980 being efficient and, on paper, delivering a higher DX12 support level. Which was a bit gimped by the lack of functional async, but whatever. So I said, let's give it a try. One may argue inferiority, but in my books, if it delivers performance, I frankly don't care how, be it through finesse of Maxwell 2 or through brute force of R9 390X. They both worked.
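To make the TBR point concrete, here is a toy Python sketch of the core idea (purely illustrative; the tile size and the bounding-box "rasterizer" are stand-ins, not any vendor's actual pipeline): triangles get binned per screen tile, and each tile is shaded entirely in a small on-chip buffer before a single write-back, instead of scattering reads and writes across the whole framebuffer in external DRAM.

```python
# Toy sketch of tile-based rasterization. For brevity, "rasterizing" a
# triangle just fills its bounding box.

TILE = 16  # tile edge in pixels (hypothetical)

def bbox(tri):
    xs = [v[0] for v in tri]
    ys = [v[1] for v in tri]
    return min(xs), min(ys), max(xs), max(ys)

def render_tiled(triangles, width, height):
    tiles_x, tiles_y = width // TILE, height // TILE
    # Binning pass: record which triangles touch which tile.
    bins = {(tx, ty): [] for ty in range(tiles_y) for tx in range(tiles_x)}
    for tri in triangles:
        x0, y0, x1, y1 = bbox(tri)
        for ty in range(max(y0 // TILE, 0), min(y1 // TILE, tiles_y - 1) + 1):
            for tx in range(max(x0 // TILE, 0), min(x1 // TILE, tiles_x - 1) + 1):
                bins[(tx, ty)].append(tri)

    framebuffer = {}  # stands in for external DRAM
    for (tx, ty), tris in bins.items():
        on_chip = {}  # tiny local buffer; all overdraw/blending stays here
        for tri in tris:
            x0, y0, x1, y1 = bbox(tri)
            for y in range(max(y0, ty * TILE), min(y1, ty * TILE + TILE - 1) + 1):
                for x in range(max(x0, tx * TILE), min(x1, tx * TILE + TILE - 1) + 1):
                    on_chip[(x, y)] = 1  # "shade" the pixel locally
        framebuffer.update(on_chip)  # one burst write-back per tile
    return framebuffer

fb = render_tiled([((2, 2), (30, 5), (10, 28))], 64, 64)
print(len(fb), "pixels written")
```

The efficiency win is in that last step: external memory is touched once per tile instead of once per fragment, which is why tiling saves so much power on bandwidth-limited designs.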
Posted on Reply
#36
bug
RejZoRAnd you think Maxwell/Pascal achieves efficiency through, I don't know, magic pixies? Dear god. NVIDIA made a smart move to implement TBR when it actually mattered most, while everyone was struggling to shrink 28nm down to 16/14nm. ... They both worked.
That is true, but I'd like to add a couple of things:
1. Recent AMD cards have been "trading blows" with Nvidia only if you disregard power draw. I can certainly understand if you're not concerned with it, but that doesn't mean others aren't.
2. You say AMD did what they did because their money was tied up in Ryzen. Well, they did pour money into HBM on consumer cards, and it wasn't their most inspired choice.
Posted on Reply
#37
dadyal
At a 1730 MHz clock speed it will touch the stock GTX 1080 Ti FE.
Posted on Reply
#38
Basard
Hell yeah, so in a year and a half it will be two times faster!
Posted on Reply
#39
RejZoR
I disagree on the second point. The Ryzen thing is an assumption. It's what we consumers know; what the real reason was, we'll probably never know. But if you think HBM was a missed investment, you're VERY wrong. HBC is AMD's long-term investment, and I just got word about the R9 Fury X already using it to some basic degree (not sure why no one brought that up in the past) and RX Vega/Vega FE really kicking it into full speed. The Fury X didn't have dedicated HBC silicon, but its memory controller was able to use memory beyond 4GB by utilizing RAM for non-essential stuff (without it, performance would just tank like insane). It's what allowed them to stick NAND memory on a freaking graphics card, and it's what gives Vega FE nearly unlimited memory capacity for professional users - something not even the 16GB on any NVIDIA card can satisfy.

People think AMD is just throwing tech around randomly, but if you look through the years, these were all really slow but strategic decisions. From merging CPU and GPU into APUs, to evolving all that into giving GPUs the ability to really, fully share memory with system RAM, to high-speed, high-bandwidth interconnects like Infinity Fabric. When you look at it all, it makes a lot of sense when you're trying to build an ecosystem. Consoles with AMD's guts kinda showcase that already. I mean, the Xbox One X features only GDDR5 memory, fully shared between CPU and GPU (there is no "normal" RAM, it's just insanely fast GDDR5).
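As a rough illustration of the HBC/HBCC concept described above, here is a minimal Python sketch that treats fast local memory as a page cache over a much larger backing store (the page size, capacity, and LRU eviction policy are assumptions for illustration, not AMD's actual implementation):

```python
from collections import OrderedDict

PAGE = 4096  # page size in bytes (hypothetical)

class HBCCache:
    """Toy model of an HBCC-like scheme: fast local memory (the "HBM") holds
    the hot pages; everything else lives in a larger, slower backing store
    (system RAM / NAND) and is paged in on demand, with LRU eviction."""

    def __init__(self, hbm_pages, backing_store):
        self.capacity = hbm_pages
        self.backing = backing_store   # dict: page number -> bytes
        self.hbm = OrderedDict()       # page number -> bytes, in LRU order

    def read(self, addr):
        page, offset = divmod(addr, PAGE)
        if page in self.hbm:                  # hit: data is already local
            self.hbm.move_to_end(page)
        else:                                 # miss: page in from backing store
            if len(self.hbm) >= self.capacity:
                self.hbm.popitem(last=False)  # evict least-recently-used page
            self.hbm[page] = self.backing[page]
        return self.hbm[page][offset]

# A "dataset" far larger than local memory still works; only the hot pages
# ever occupy the fast tier.
backing = {p: bytes(PAGE) for p in range(1024)}       # 4 MiB backing store
mem = HBCCache(hbm_pages=16, backing_store=backing)   # 64 KiB of "HBM"
print(mem.read(123456))
```

That is the same trick virtual memory plays for CPUs, which is why a card built this way can advertise far more addressable memory than it physically carries in HBM.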
Posted on Reply
#40
bug
HBM isn't a missed investment, but consumer HBM was. As a customer, I'm more about buying fleshed-out products rather than funding future developments. It's a quirk of mine.
Posted on Reply
#41
qubit
Overclocked quantum bit
So, AMD's latest and greatest GPU, with the big die and lots of power and heat, beats NVIDIA's year-old GPU (Founders Edition, of course) with the little die by a whopping 5%, and is significantly slower than the 1080 Ti with the big die.

Wow, take my money AMD!! :rolleyes:

Now, again, before all you AMD apologists start foaming at the mouth at me for being "anti-AMD" and an "NVIDIA fanboy", you should instead be annoyed with AMD for continuing to put out disappointing products, not the guy (me) pointing out their failings.

For the record, I would have loved Vega to leapfrog the performance of the 1080 Ti and make NVIDIA play catch up for a change. That's real competition and results in better products for us at lower prices.
Posted on Reply
#42
cdawall
where the hell are my stars
phanbueyhopefully it will push down the prices of the current batch...

Would be nice to have a Ti for less than $700 lol
They sell for $669 on sale consistently.
Posted on Reply
#43
xkm1948
AenraIt's getting tiring... you were the number one prospective buyer, it was delayed, you got bitter, you got even more bitter.
You bought a Ti, and ever since doing that, you've switched camps and gone full bashing/negativity; for months now. Is it not time to cease?

If I've got your point/emotional response to it all, I'd bet you the oldies here have as well, sooner than I did too.
...We get it. Honestly :)
Dude, so anyone whose view does not align with your hope/hype for Vega is "hater #1"?

In another thread you gave me the crown of #1 hater. So who is the actual hater here? Me or @cdawall ?

:D :D :D
Posted on Reply
#44
Shihab
RaevenlordThese architectural differences naturally require large amounts of additional driver work to enable them to function to their full potential - full potential that we aren't guaranteed RX Vega GPUs will be able to deliver come launch time.
At this stage, I wonder if anyone expects AMD to launch a card that does. "Finewine tech" 'n all. I just hope they stop doing that. Products that sell solely on the promise of what they'll be later on are stupid.
Posted on Reply
#45
cdawall
where the hell are my stars
AenraIt's getting tiring... you were the number one prospective buyer, it was delayed, you got bitter, you got even more bitter. ... We get it. Honestly :)
I have purchased AMD cards for over a decade now, mixed between them and NVIDIA. I have always given AMD the benefit of the doubt. To this day, the 79x0 series and Fermi are my two favorite series of all time. I am still a prospective buyer of RX Vega and held off on purchasing waterblocks for my 1080 Tis because of it.

That being said, if it doesn't release for another month and barely edges out a card released in spring of last year, I will not touch it. This is getting silly and the hype train is in full runaway lately. AMD is so far behind that Volta was delayed in favor of a refresh card out of NVIDIA. They are stagnating the market. That makes me mad. Also being delayed for 7 months, that also makes me mad. Ryzen not working correctly on launch leaves me zero faith in AMD as of late, which is sad. I guess we will see what comes down the pipeline, but as it sits I would have more luck herding two-year-olds than promising a solid AMD release.
xkm1948Dude, so anyone whose view does not align with your hope/hype for Vega is "hater #1"?

In another thread you gave me the crown of #1 hater. So who is the actual hater here? Me or @cdawall ?

:D :D :D
I still hate nothing other than AMD's release schedule. You know, the book of lies.

Well, that and driver crashes. After mining in the current market for a bit, I can tell you AMD's drivers make my soul hurt and have significantly colored my view of AMD lately.
Posted on Reply
#46
GhostRyder
I doubt they are going to be very competitive in all regards until they do a complete overhaul of their GPUs. That being said, I still won't completely judge the card until it's released, because jumping to conclusions is what gets you super hyped just to end up disappointed, or overly negative without any merit.

Reality is, regardless of how this card turns out, they need a whole new GPU lineup.
Posted on Reply
#47
the54thvoid
Super Intoxicated Moderator
Well, hopefully we'll get a review in a few weeks. Hopefully the AMD PR marketing review TPU did recently has put them on the review cycle for RX Vega.
Posted on Reply
#48
notb
OctopussAs much as I agree with most of your posts AND you sound like you know what you're talking about, you really sound like a die-hard AMD fanboy in the last several weeks :p
Several weeks?!
RejZoROne may argue inferiority, but in my books, if it delivers performance, I frankly don't care how, be it through finesse of Maxwell 2 or through brute force of R9 390X. They both worked.
For the last few months you've been criticizing Intel for incremental updates of an old architecture and praising AMD for the revolutionary Ryzen. Now you've totally changed fronts because of Vega.
Posted on Reply
#49
ratirt
notbSeveral weeks?!

For the last few months you've been criticizing Intel for incremental updates of an old architecture and praising AMD for the revolutionary Ryzen. Now you've totally changed fronts because of Vega.
I don't think you know what you're talking about. It's easier to steer into the skid, like you're doing with your opinion right now by dragging Ryzen into the picture. We're talking about Vega here, not Ryzen. This competition is between AMD and NVIDIA, not AMD and Intel. What's your point here? You put NV and Intel into one basket, or what is it?
By this logic you've been praising NVIDIA for several weeks as well. So you did with Intel. And??
BTW, yeah, the IPC increase of Intel's CPUs over previous generations, with Kaby Lake as the latest, is practically none. Isn't that the truth? So what's the reason to bring it up here?
Posted on Reply