Wednesday, July 19th 2017

AMD's Low-Key RX Vega Budapest Event: Vega Pitted Against GTX 1080

On the first stop of AMD's two-continent RX Vega tour (which, in fairness, counts only three locations), the company pitted its upcoming RX Vega graphics card (which we expect to be its flagship offering) against NVIDIA's GTX 1080. The event itself was pretty subdued, and there was not much to see when it comes to the RX Vega graphics card - literally. Both it and the GTX 1080 were enclosed inside PC towers, and event-goers weren't allowed to catch even a glimpse of the piece of AMD hardware that has most approximated a unicorn in recent times.

The Vega-powered system also made use of a Ryzen 7 processor, and the cards were running Battlefield 1 (or Sniper Elite 4; there's lots of discussion going on about that, but the first image below does show a first-person view) on nondescript monitors, one supporting FreeSync, the other G-Sync. The monitors' model markings were covered with cloth so that attendees couldn't tell which system was running which graphics card, though given ASUS' partnership in the event, both were (probably) of ASUS make. The resolution used was 3440 x 1440, at which the GTX 1080 should manage over 60 FPS on Ultra. Users who attended the event report that one of the systems lagged slightly in one portion of the demo, though we can't confirm which one (and I'd say that was AMD's intention.)
All in all, I have to say, this tour doesn't inspire confidence in me. This isn't the kind of "in your face" comparison we're used to seeing from companies that know they have a winning product; were the comparison largely in AMD's favor, I posit the company would be taking full advantage of it by showcasing its performance leadership. Instead, there seemed to be an inordinate amount of smoke and mirrors here, with AMD going out of its way to prevent attendees from discerning between its offering and its competitor's.
AMD reportedly told attendees that the AMD and NVIDIA systems had a $300 price difference in AMD's favor. All other hardware being equal, and accounting for AMD's stance that a FreeSync monitor tends to cost around $200 less than a comparable NVIDIA G-Sync-enabled one, that leaves around $100 in savings attributable solely to the RX Vega part of the equation. This means the RX Vega could sell in the $459-$500 bracket, if current GTX 1080 pricing is what AMD considered.
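To make that arithmetic explicit, here is a minimal sketch assuming the ~$300 system gap and ~$200 monitor gap cited above; the GTX 1080 street prices in the loop are illustrative assumptions for mid-2017, not figures from AMD:

```python
# Back-of-the-envelope check of the pricing reasoning above.
total_gap = 300        # claimed system price difference (USD)
monitor_gap = 200      # AMD's stated FreeSync-vs-G-Sync monitor gap
card_gap = total_gap - monitor_gap   # leaves ~$100 on the GPU itself

for gtx1080_price in (559, 599):     # assumed GTX 1080 street prices, mid-2017
    print(f"GTX 1080 at ${gtx1080_price} -> implied RX Vega ~${gtx1080_price - card_gap}")
```

Running this gives $459 and $499, which is where the $459-$500 bracket comes from.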
Sources: Reddit User @ Szunyogg, RX Vega Budapest Google Photos, WCCFTech

175 Comments on AMD's Low-Key RX Vega Budapest Event: Vega Pitted Against GTX 1080

#51
Footman
If all the rumors are actually true and Vega performs at or below GTX 1080 speeds, then I will admit to being a little disappointed - especially when you factor in the amount of power these cards need to run reliably at a 1600 MHz core speed, according to tests performed on the already-released Vega FE.

I will likely buy one though, as I am still looking for a faster video card to replace my RX 580 for FreeSync gaming on my 2560x1440 144 Hz monitor....
#52
bug
TheLostSwede: The latest generation of TN panels are actually not as bad as they used to be. See this review for example: www.tomshardware.com/reviews/dell-s2417dg-24-inch-165hz-g-sync-gaming-monitor,4788-5.html
Also, not all IPS panels suffer badly from the glow, but some have a weird coating that makes it worse.
Oh, I did not mean they're as bad as they used to be (FWIW, I used them for gaming for years and they did just fine). But there's no way not to see any difference from IPS.

That review is the one I've quoted from above ;)
#53
TheGuruStud
No one knows the system specs. One turd burglar on Reddit does not a story make.

C'mon, don't wcctardtech this crap.
#54
Darmok N Jalad
I'm more interested to see what sort of APU solution they can come up with using Ryzen and Vega. While AMD hasn't had much desktop success, they have managed to secure several console wins, and I suspect that won't change since they hold the keys to the strongest APU combination. I would think that a Ryzen-based APU will wind up in PS5 and whatever comes out after X1X. I am curious if they will use HBM by that time.
#55
Footman
bug: Oh, I did not mean they're as bad as they used to be (FWIW, I used them for gaming for years and they did just fine). But there's no way not to see any difference from IPS.

That review is the one I've quoted from above ;)
If you are looking for a 2560x1440 144 Hz IPS FreeSync monitor, I just purchased the new Nixeus 27 EDG and I'm very impressed. Amazon has the basic version at $449 USD, but there is a more expensive version with a better stand for $499. Newegg also has these listed, although I believe Amazon is the only reseller with the basic-stand version so far.

Just saying.
#56
HD64G
Darmok N Jalad: I'm more interested to see what sort of APU solution they can come up with using Ryzen and Vega. While AMD hasn't had much desktop success, they have managed to secure several console wins, and I suspect that won't change since they hold the keys to the strongest APU combination. I would think that a Ryzen-based APU will wind up in PS5 and whatever comes out after X1X. I am curious if they will use HBM by that time.
Vega and Ryzen will be combined through IF (as they are both compatible) to create the perfect APU (CPU & GPU on the same die, maybe?). Especially if they manage to get 2 x 1 GB of HBM into the package, we will see tremendous performance for an APU, imho. For notebooks and next-gen consoles, an AMD APU will become a must-have if executed well.
#57
Capitan Harlock
My question is, which card was used in the public presentation with Doom in 4K?
This RX Vega, or the FE edition?
#58
efikkan
chaosmassive: their CPU department starts clawing back its market pie, but its GPU part is very concerning...
The big problem for AMD is that their competitors are developing tirelessly.
In the CPU department, Ryzen is a huge step up from Bulldozer, but still >30% behind Skylake in IPC. AMD needs to keep developing, and I'm not talking about small tweaks. Intel will release its next architecture next year, so AMD needs to keep investing to keep up; relying on Zen for five years is not good news.
Regarding GPUs, AMD has been pretty stagnant since the launch of GCN - just minor tweaks while Nvidia keeps innovating.
Meanwhile, AMD has spent billions on projects like "skybridge" and K12, and on APUs, which are not profitable. AMD's research budget might be tight, but if they focused on two things instead of five, they could at least make a profit.
Hugh Mungus: With AMD's CPU money, RTG can hopefully increase its R&D budget, and if RX Vega is about as good as a 1080 now, it should at least outperform it in the long run when more optimized games are released and it gets better drivers, which would still make it a good long-term option.
The same story as always: default to waiting for "optimized" software. The final phase of the AMD product cycle.
bug: I don't think they planned to underdeliver. But sometimes crap happens, and without extra resources you can't turn things around (i.e. run another silicon revision).
Well, they might not have estimated the performance exactly, but they did know the consumption for a given clock frequency. Nvidia manages to get 200-300 MHz more while consuming less energy. This is due to the chip design, which is no accident.
#59
bug
efikkan: Well, they might not have estimated the performance exactly, but they did know the consumption for a given clock frequency. Nvidia manages to get 200-300 MHz more while consuming less energy. This is due to the chip design, which is no accident.
Design is one factor, but fab process and maturity are just as important.
I'm just saying, when designing a GPU it's not like you can set TDP and processing power in stone and then go on to achieve both. There are margins, and sometimes you end up overstepping them because of various factors.
#60
efikkan
bug: Design is one factor, but fab process and maturity are just as important.
Sure, but the process is just fine. Polaris was taped out at both TSMC and Samsung, and AMD chose Samsung. The process is stable and mature by now. The process is not responsible for Vega consuming ~300 W to compete with the GTX 1080 at 180 W; that's down to the chip design. A less efficient design (longer critical path) requires a higher voltage to sustain a given clock, which results in higher energy consumption. With the attributes of the process known, the consumption is very predictable before tapeout.
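As a rough illustration of that voltage argument (my own sketch, not efikkan's exact math), here is the standard CMOS dynamic-power relation P ≈ C·V²·f; the capacitance, voltage, and clock figures are assumptions chosen only to show the squared-voltage effect, not real Vega or Pascal data:

```python
# Illustrative sketch of the critical-path/voltage argument using the
# standard CMOS dynamic-power relation P ~ C_eff * V^2 * f.
# All figures below are assumptions for illustration only.

def dynamic_power_w(c_eff_nf, volts, freq_mhz):
    # nF * V^2 * MHz * 1e-3 yields watts (1e-9 F * 1e6 Hz = 1e-3)
    return c_eff_nf * volts ** 2 * freq_mhz * 1e-3

# A longer critical path forces a higher voltage to hold the same clock;
# power then grows with the square of that voltage.
print(dynamic_power_w(140, 1.00, 1500))  # tighter design:      ~210 W
print(dynamic_power_w(140, 1.20, 1500))  # +20% voltage needed: ~302 W
```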
#61
Alduin
I hope the top Vega model is faster than the GTX 1080 Ti; if not, we will have to see what Volta brings.
#62
cdawall
where the hell are my stars
RejZoR: GeForce FX was hot and power hungry, but it generally had OK framerates in things that weren't DX9. It sold well anyway. Fermi was hot and power hungry. Still had somewhat decent framerates as well. Still sold. But when AMD pulls the same thing, OH MAH GOD, AMD IS GARBAGE OMG OMG OMG. Like, c'mon, are people all 5 years old?
Um, the GeForce FX was openly regarded at the time as a complete failure.

The Fermi parts also just demolished AMD in performance. The only AMD card capable of keeping up with the GTX 480 was the dual-GPU 5970. The GTX 590 came out and swept the field. Fermi also just received DX12 support, and driver updates on cards that are 7 years old - something we are not seeing from AMD.

Now let's look at this AMD card: 440 W for performance equal to a 180 W GTX 1080. Even Fermi didn't have a disparity like that. In fact, Fermi consumed half that power and competed within 30 watts of the 5970.
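Taking those wattages at face value (440 W and 180 W are cdawall's claims here, not measurements), a quick sketch of the implied efficiency gap:

```python
# Performance-per-watt comparison at equal performance, using the
# wattages claimed above (forum claims, not measured figures).
vega_w, gtx1080_w = 440, 180
ratio = vega_w / gtx1080_w   # at equal performance, ppw ratio = wattage ratio
print(f"GTX 1080 would deliver {ratio:.2f}x the performance per watt")  # ~2.44x
```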
#63
S@LEM!
They could have saved themselves money and time by releasing a Fury X 2, without all the "poor Volta" hype b$hit.
This kind of improvement, at a 50%+ power increase over the competition, is nothing but an overclocked Fury X with better manufacturing yields and fancy HBM2 - not for the sake of advanced technology, but to cover up the monster power hunger of an aging architecture and keep the clocks stable with that many shading units at 375 W.

Poor Vega, you got idiotic PR hype.
Poor Raja, you had only one job.
#64
GhostRyder
efikkan: The big problem for AMD is that their competitors are developing tirelessly.
In the CPU department, Ryzen is a huge step up from Bulldozer, but still >30% behind Skylake in IPC. AMD needs to keep developing,
I don't know why you keep saying that; they are not 30% behind. But whatever...
cdawall: The Fermi parts also just demolished AMD in performance. The only AMD card capable of keeping up with the GTX 480 was the dual-GPU 5970. The GTX 590 came out and swept the field. Fermi also just received DX12 support, and driver updates on cards that are 7 years old - something we are not seeing from AMD.
I don't know if I would count that as a selling point; it's kinda tardy to the party after promising it and then not delivering for years. If anything, I think the only reason they did that is that mobile Fermi chips were still around, more so than the desktop counterparts (I can't really find people who use the desktop parts, but some still have laptops with a Fermi-based chip inside).
cdawall: Now let's look at this AMD card: 440 W for performance equal to a 180 W GTX 1080. Even Fermi didn't have a disparity like that. In fact, Fermi consumed half that power and competed within 30 watts of the 5970.
Yeah, I had a feeling... when things stay quiet, you can guess what's going to happen. The only saving grace will be price, unless this card lands squarely between a GTX 1080 and a 1080 Ti, in which case a case can at least be made for it (though the price still has to be good).
#65
ZoneDymo
cdawall: Um, the GeForce FX was openly regarded at the time as a complete failure.

The Fermi parts also just demolished AMD in performance. The only AMD card capable of keeping up with the GTX 480 was the dual-GPU 5970. The GTX 590 came out and swept the field. Fermi also just received DX12 support, and driver updates on cards that are 7 years old - something we are not seeing from AMD.

Now let's look at this AMD card: 440 W for performance equal to a 180 W GTX 1080. Even Fermi didn't have a disparity like that. In fact, Fermi consumed half that power and competed within 30 watts of the 5970.
Ermmm, I think you might have to look up some benchmarks, man; seems your memory is tainted by rose-coloured glasses.
media.bestofmicro.com/1/2/242390/original/Crysis-1920.png
1.bp.blogspot.com/_9vgJ1nwu_xA/S64R8D60rWI/AAAAAAAAClg/r4madSxcd0E/s1600/GTX480+benchmark+tests+5cn7g6.png
#66
Vayra86
Dimi: Buy the Dell S2417DG; you can get it on Amazon for $399 atm, I've seen it as low as $350. 165 Hz 1440p G-Sync monitor.

It's incredible and worth every penny. Coming from an IPS panel, I can't even tell this is a TN panel.
While totally off-topic, I gotta set this straight here.

$399 for a TN is not great value; it's an overpriced el-cheapo TN.
www.tomshardware.com/reviews/dell-s2417dg-24-inch-165hz-g-sync-gaming-monitor,4788-5.html
Look at the first picture there with the gradient bars. The top middle of the screen is blue-ish; the rest of the bars are not. Backlight bleed and bad color uniformity, plus the contrast shift of TN.
The color and black uniformity charts for that panel are among the worst of all panels in the comparison. White uniformity is good, but how often do you look at a 100% white canvas? That's right, never, and if you do, it's extremely unpleasant to the eye. The gamma of this monitor also does not stick to 2.2, which means that whatever you do, you'll crush blacks or lose bright tones.
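As a quick numeric sketch of that gamma point (my own illustration, not from the review), here is what happens to shadow detail when a panel tracks a higher gamma than the 2.2 the content was mastered for; the 2.6 figure is an assumed example:

```python
# Gamma-tracking sketch: content encoded for gamma 2.2, shown on a panel
# that decodes at an assumed 2.6. Dark steps get pushed toward black.
for signal in (0.05, 0.10, 0.50, 0.90):   # normalized input levels
    intended = signal ** 2.2              # luminance the content expects
    shown = signal ** 2.6                 # what the mis-tracking panel emits
    print(f"input {signal:.2f}: intended {intended:.4f}, shown {shown:.4f}")
# the darkest inputs land well below target -> crushed shadow detail
```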

So, it may look good to Tom's (which I find odd; the review reads like an advertorial), but in reality it's crap. That line about being unable to distinguish it from IPS is straight from the review as well. Hell, you can't even calibrate this panel to not show visible DeltaE errors.

If you then look at the comments from owners of this monitor, you can also see that it suffers from everything that belongs to budget-segment TN: bad QC, sharp edges on the plastic bezels, bad OSD buttons, etc. etc.

Credibility - 1
#67
cdawall
where the hell are my stars
ZoneDymo: Ermmm, I think you might have to look up some benchmarks, man; seems your memory is tainted by rose-coloured glasses.
media.bestofmicro.com/1/2/242390/original/Crysis-1920.png
1.bp.blogspot.com/_9vgJ1nwu_xA/S64R8D60rWI/AAAAAAAAClg/r4madSxcd0E/s1600/GTX480+benchmark+tests+5cn7g6.png
That supports what I said - at least the second one does, and really, a single-game argument...

www.techpowerup.com/reviews/Gigabyte/GeForce_GTX_480_SOC/10.html

That same 5870 doesn't look as hot once the 480 gets a factory OC.
#68
RejZoR
cdawall: Um, the GeForce FX was openly regarded at the time as a complete failure.

The Fermi parts also just demolished AMD in performance. The only AMD card capable of keeping up with the GTX 480 was the dual-GPU 5970. The GTX 590 came out and swept the field. Fermi also just received DX12 support, and driver updates on cards that are 7 years old - something we are not seeing from AMD.

Now let's look at this AMD card: 440 W for performance equal to a 180 W GTX 1080. Even Fermi didn't have a disparity like that. In fact, Fermi consumed half that power and competed within 30 watts of the 5970.
There was such a disparity with the GeForce FX in terms of heat and consumption compared to the Radeon 9000 series, and yet they kept selling despite everyone saying they were crap. You'll NEVER EVER see that with Radeon cards. As presented with endless exhibits.
#69
Vayra86
RejZoR: There was such a disparity with the GeForce FX in terms of heat and consumption compared to the Radeon 9000 series, and yet they kept selling despite everyone saying they were crap. You'll NEVER EVER see that with Radeon cards. As presented with endless exhibits.
Let's face it then: that is also entirely down to AMD's marketing compared to Nvidia's. And to this day, that difference is still visible. You'd think that after a couple of decades of experience they would wise up a bit, no?

Instead, even with Vega we get 'leaks' and 'events' that just look silly, are outright misleading, or don't give any new information, further fueling either a hype train so that the product underdelivers (Fury X) or, as with Vega right now, massive disappointment before the release is even here.

And don't even get me started on the advertorials - even here on TPU, they're quite possibly as low as you can go. Again, like I've said before, it's completely bizarre.
#70
Kyuuba
Looks like a home party.
#71
Eric3988
I don't like the secrecy here; it doesn't exactly inspire confidence. Say the card will compete with Nvidia's best and show us actual benchmarks, or just undercut them like they've been doing all these recent years. Doesn't matter to me; just keep Nvidia honest at this point, because they have been on top for too long and can get away with charging whatever they want for their higher-tier cards.
#72
Hood
When a mouse challenges the cat to a fight, he must play a game of hide and seek, hoping to wear kitty out. It's a long shot; in the end kitty almost always wins. The mouse has huge balls, though, and that's why half the spectators are rooting for him. Too bad he has to die, the little guy has a good heart, just not enough resources.
#73
Basard
S@LEM!: They could have saved themselves money and time by releasing a Fury X 2, without all the "poor Volta" hype b$hit.
This kind of improvement, at a 50%+ power increase over the competition, is nothing but an overclocked Fury X with better manufacturing yields and fancy HBM2 - not for the sake of advanced technology, but to cover up the monster power hunger of an aging architecture and keep the clocks stable with that many shading units at 375 W.

Poor Vega, you got idiotic PR hype.
Poor Raja, you had only one job.
They did release an X2 Fury... named the Pro Duo. It was a big fail.
#74
ZoneDymo
cdawall: That supports what I said - at least the second one does, and really, a single-game argument...

www.techpowerup.com/reviews/Gigabyte/GeForce_GTX_480_SOC/10.html

That same 5870 doesn't look as hot once the 480 gets a factory OC.
No, it does not... a stock HD 5870 does only 4 FPS less than a stock GTX 480, and you claimed you needed an HD 5970 to get the same performance... in fact, from there you can see the HD 5970 does a lot better.


And yeah, nothing looks as hot as the GTX 480, as the GTX 480 was extremely hot ;)
#75
EarthDog
I'll bet they will both run hot, as do a lighter and a bonfire with a yellow flame... but which has more energy behind it? :)