Wednesday, July 19th 2017

AMD's RX Vega Low Key Budapest Event: Vega Pitted Against GTX 1080

On the first stop of AMD's two-continent RX Vega tour (which, in fairness, only counts three locations), the company pitted its upcoming RX Vega graphics card (which we expect to be the flagship offering) against NVIDIA's GTX 1080. The event itself was pretty subdued, and there was not much to see when it comes to the RX Vega graphics card - literally. Both it and the GTX 1080 were enclosed inside PC towers, with event-goers not allowed to catch even a glimpse of the piece of AMD hardware that has most closely approximated a unicorn in recent times.

The Vega-powered system also made use of a Ryzen 7 processor, and the cards were running Battlefield 1 (or Sniper Elite 4; there is plenty of discussion about that, but the first image below does show a first-person view) on nondescript monitors, one supporting FreeSync, the other G-Sync. The monitors' model names were covered by cloth so that attendees couldn't tell which system was running which graphics card, though given ASUS' partnership in the event, both were probably of ASUS make. The resolution used was 3440 x 1440, which should mean over 60 FPS on the GTX 1080 at Ultra settings. Users who attended the event reported that one of the systems lagged slightly in one portion of the demo, though we can't confirm which one (and I'd say that was AMD's intention.)
All in all, I have to say this tour doesn't inspire much confidence in me. This isn't the kind of "in your face" comparison we're used to seeing from companies that know they have a winning product; were the comparison largely in AMD's favor, I posit the company would be taking full advantage of that by showcasing its performance leadership. Instead, there seemed to be an inordinate amount of smoke and mirrors here, with AMD going out of its way to prevent attendees from distinguishing between its offering and its competitor's.
AMD reportedly told attendees that the AMD and NVIDIA systems had a $300 price difference in AMD's favor. With all other hardware being equal, and accounting for AMD's stance that a FreeSync monitor tends to cost around $200 less than a comparable G-Sync-enabled one, that leaves around $100 in savings attributable solely to the RX Vega part of the equation. This means the RX Vega could sell in the $459-$500 bracket, if current GTX 1080 pricing is what AMD considered.
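The pricing estimate above is simple arithmetic; here is a back-of-the-envelope sketch of it. The GTX 1080 street prices used ($559 and $599) are assumptions chosen to illustrate the bracket, not figures from AMD.

```python
# Back-of-the-envelope version of the pricing logic above.
system_price_gap = 300   # total system price difference AMD claimed
freesync_savings = 200   # AMD's typical FreeSync-vs-G-Sync monitor delta
gpu_savings = system_price_gap - freesync_savings  # savings attributable to the GPU itself

# Assumed GTX 1080 street prices (not from AMD) that bracket the estimate:
for gtx_1080_price in (559, 599):
    implied_vega_price = gtx_1080_price - gpu_savings
    print(f"GTX 1080 at ${gtx_1080_price} -> implied RX Vega price: ${implied_vega_price}")
```

With those assumed street prices, the arithmetic lands on the $459-$499 range the article cites.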
Sources: Reddit User @ Szunyogg, RX Vega Budapest Google Photos, WCCFTech

175 Comments on AMD's RX Vega Low Key Budapest Event: Vega Pitted Against GTX 1080

#26
bug
EarthDog440w and 1080 performance... :(
Not each and every design is a winner. But in this instance it looks like we'd get better performance if we allowed Polaris to draw that much power. Oh well, the wait is almost over, we should have our answers soon.
Posted on Reply
#27
Fluffmeister
It does increasingly look like Vega is going to be a turkey.

Look forward to the reviews, when are they due?
Posted on Reply
#28
Crap Daddy
JB_GamerWell I truly believe that no-one wants that scenario - "go home AMD, you're done", not even the most green-faced Nvidia fanboys/die-hards, what then, leaving Nvidia without competition?!
Of course not. The correct phrase would be "Go home AMD, you're drunk". Sober up and come back with something good. Like Ryzen. As for competition, there's been none at the high end for a good few years now, and it's just getting worse.
Posted on Reply
#29
rtwjunkie
PC Gaming Enthusiast
buggalugsThis is the new normal. It seems most companies are releasing products ahead of reviews these days, and they're trying hard to control the message. Not just AMD; Intel is doing it too, and Nvidia did it.

The old days of companies giving review sites products to evaluate before launch are over.

It's going to be hard for AMD and Vega. I'm sure Nvidia is sitting on new cards, ready as an answer to the Vega launch. I hope AMD planned ahead.
It's not just the hardware manufacturers either. Many games are going the same route: release, then review.
Posted on Reply
#30
RejZoR
EarthDog440w and 1080 performance... :(
GeForce FX was hot and power hungry, but it generally had ok framerate in things that weren't DX9. It sold well anyway. Fermi was hot and power hungry. Still had somewhat decent framerate as well. Still sold. But when AMD pulls the same thing, OH MAH GOD, AMD IS GARBAGE OMG OMG OMG. Like, c'mon, are people all 5 years old?
Posted on Reply
#31
EarthDog
RejZoRGeForce FX was hot and power hungry, but it generally had ok framerate in things that weren't DX9. It sold well anyway. Fermi was hot and power hungry. Still had somewhat decent framerate as well. Still sold. But when AMD pulls the same thing, OH MAH GOD, AMD IS GARBAGE OMG OMG OMG. Like, c'mon, are people all 5 years old?
Lol, people complained back then too... wth are you talking about?? To that end, it actually beat out whatever it was competing against from ATI at the time, didn't it? It also wasn't 440 W worth of single GPU either: 250 W TDP vs. 375 W. This is a record for single-GPU TDP afaik, a full 50% more power for less performance (1080 Ti vs. RX Vega XTX). Put it up against the 1080, which it appears to compete with, and that becomes 180 W vs. 375 W.

Nobody is saying it's garbage; it's going to have to compete on price again, since performance right now has it on par with a 1080 and the power-to-performance ratio is abhorrent.
Posted on Reply
#32
TheLostSwede
News Editor
bugHarsh words.
Remember, AMD doesn't have infinite resources (neither does anyone else) and they covered a lot of ground with their CPUs in the past year. The GPU division not keeping up isn't totally unexpected.
You're aware that the CPUs and GPUs are developed by two entirely different teams, spread across different locations, right? And if, as you say, AMD has such limited resources, then maybe they should've spent them more wisely and not made a bunch of marketing noise about a product that looks like it'll be a dud at best. The worst thing you can do in the tech industry is overpromise and underdeliver.
Posted on Reply
#33
Alejandrodg82
The Quim ReaperIf Vega costs more than $399, they may as well not bother.

Performance isn't what will sell the card, price is.

Offering similar levels of performance for Nvidia levels of money will accomplish nothing and will only sell to the AMD die-hards, which is a tiny percentage of the GPU market.

Price is the only trick card they can play, being so late to market, and if they don't play it, well....go home AMD, you're done.
Couldn't agree more. Who is going to buy this allegedly "power hungry" card, when Volta is just around the corner? If the price isn't right, this is going to be a major flop for them. It's like they always have some caveat that pushes me away from their products.

I was about to buy a FreeSync ultrawide (CF791) and was hoping to pair it with a Vega GPU. Between the flickering issues that have surfaced with FreeSync and this... it just makes me want to stay with Nvidia and their damn expensive ecosystem, just to play it safe.
Posted on Reply
#34
bug
TheLostSwedeYou're aware that the CPU's and GPU's are developed by two entirely different teams, spread between different locations, right? And if as you say, AMD has such limited resources, then maybe they should've spent them more wisely and maybe not made a bunch of marketing noise about a product that looks like it'll be a dud at best. The worst thing you can do in the tech industry is over promise and under deliver.
I don't think they planned to underdeliver. But sometimes crap happens and without extra resources, you can't turn things around (i.e. run another silicon revision).
Posted on Reply
#35
the54thvoid
Super Intoxicated Moderator
RejZoRGeForce FX was hot and power hungry, but it generally had ok framerate in things that weren't DX9. It sold well anyway. Fermi was hot and power hungry. Still had somewhat decent framerate as well. Still sold. But when AMD pulls the same thing, OH MAH GOD, AMD IS GARBAGE OMG OMG OMG. Like, c'mon, are people all 5 years old?
Like @EarthDog has said, people did mention it back then. AMD even created PR about how hot Fermi was. Everyone mocked the GTX480 for being hot and hungry. AMD people lapped it up. Hell, that's why I went with HD5850’s.

www.guru3d.com/news-story/ati-ad-that-mocks-nvidia-fermi-spotted-on-youtube.html
Posted on Reply
#36
Anymal
RejZoRGeForce FX was hot and power hungry, but it generally had ok framerate in things that weren't DX9. It sold well anyway. Fermi was hot and power hungry. Still had somewhat decent framerate as well. Still sold. But when AMD pulls the same thing, OH MAH GOD, AMD IS GARBAGE OMG OMG OMG. Like, c'mon, are people all 5 years old?
Dang, you are the AMD Fanboy.
Posted on Reply
#37
Dimi
Alejandrodg82Couldn't agree more. Who is going to buy this allegedly "power hungry" card, when Volta is just around the corner? If the price isn't right, this is going to be a major flop for them. It's like they always have some caveat that pushes me away from their products.

I was about to buy a FreeSync ultrawide (CF791) and was hoping to pair it with a Vega GPU. Between the flickering issues that have surfaced with FreeSync and this... it just makes me want to stay with Nvidia and their damn expensive ecosystem, just to play it safe.
Buy the Dell S2417DG; you can get it on Amazon for $399 atm, and I've seen it as low as $350. It's a 165 Hz, 1440p G-Sync monitor.

It's incredible and worth every penny. Coming from an IPS panel, I can't even tell this is a TN panel.
Posted on Reply
#39
bug
the54thvoidLike @EarthDog has said, people did mention it back then. AMD even created PR about how hot Fermi was. Everyone mocked the GTX480 for being hot and hungry. AMD people lapped it up. Hell, that's why I went with HD5850’s.

www.guru3d.com/news-story/ati-ad-that-mocks-nvidia-fermi-spotted-on-youtube.html
And even so, those parts weren't offering the performance of the previous generation's ATI parts. The FX series, though, tended to suck across the board.
DimiBuy the Dell S2417DG; you can get it on Amazon for $399 atm, and I've seen it as low as $350. It's a 165 Hz, 1440p G-Sync monitor.

It's incredible and worth every penny. Coming from an IPS panel, I can't even tell this is a TN panel.
You have to be blind to not be able to tell TN Film from IPS. TN Film changes contrast as soon as you move up or down in your chair. Also, IPS glows.
Posted on Reply
#40
EarthDog
MEH, he won't respond. Too busy trying to get that foot out of his mouth he swallowed whole.
Posted on Reply
#41
Dimi
bugYou have to be blind to not be able to tell TN Film from IPS. TN Film changes contrast as soon as you move up or down in your chair. Also, IPS glows.
Look up some reviews; you'll find that more people share my opinion. I have another TN panel next to the S2417DG and it looks miles worse than my new one.

But sure, if you want to be an IPS elitist, you pay the price plus all the other major issues it brings with it. No thanks.
Posted on Reply
#42
Gasaraki
bistrocratIt is just sad... sad that we must wait 1.5 years (since the GTX 10XX launch) to get this maybe-$100-cheaper product from a competitor, so Nvidia will knock those $99 off their 1.5-year-old GPUs and won't hasten to release the next gen, because there is nothing on the market that challenges Nvidia's market share. And the fact that these 2-year cycles with barely any price cuts in between land in a time frame when 4K and 120 FPS+ gaming monitor prices are really low and affordable is really sad :(
Not really 1.5 years yet but I get your point. This is why nVidia stocks are through the roof and AMD's are in the poopers.
Posted on Reply
#43
r9
I really, really wanted AMD to do well, but RX Vega is the worst thing they or anyone else have ever released, period. A huge chip that's expensive as f#&$ to produce, not to mention expensive as f$#_ memory, and the cherry on top: that ridiculous power draw. If I were AMD, I would save myself a lot of embarrassment and never release Vega.
Posted on Reply
#44
Gasaraki
bugHarsh words.
Remember, AMD doesn't have infinite resources (neither does anyone else) and they covered a lot of ground with their CPUs in the past year. The GPU division not keeping up isn't totally unexpected.
That is just an excuse people make every time. Remember, before AMD bought them, ATi was its own company. The AMD graphics department (ATi) is separate from the CPU department; they are not sharing resources, fabs, or anything else.
Posted on Reply
#45
bug
DimiLook up some reviews, you'll find that more people share my opinion. I have another TN panel next to the S2417DG and it looks miles worse than my new one.

But sure, if you wanna be an ips elitist, you pay the price plus all the other major issues it brings with it. No thanks.
Ok, made me look.

Tom's Hardware:
A 24-inch TN panel of this quality can almost fool you into thinking you're looking at an IPS panel, until you move past 45° off-axis. There you’ll see the expected green color shift and 50-60% light falloff. From the top, detail is reduced significantly as well. However, at normal viewing distances and angles, the S2417DG is one of the better TN monitors we’ve seen. When gaming, we didn’t notice a problem, and we didn’t pine for an IPS screen.
PC Monitors:
Due to the viewing angle limitations, we will not be providing analysis of colour temperature variation using the colorimeter. The perceived variations here due to these viewing-angle related shifts can largely counteract measured deviations, even when you’re simply observing different sections of the screen from a normal viewing angle.
Neither seems to have thought they were looking at an IPS, and I couldn't find other reviews yet.
Posted on Reply
#46
noname00
Maybe AMD hoped the HBM2 price would be lower by now, and the high price of HBM2 is one of the reasons the card was delayed so much.

Between the high price of HBM2 (and of RAM in general, actually) and the high power consumption, I don't see how AMD will make a decent profit on Vega unless it's at least as fast as a 1080 Ti.
Posted on Reply
#47
dozenfury
Based on AMD's recent history and what has dribbled out, I'm assuming Vega ends up slightly faster than the 1070 but a bit slower than the 1080. That would be fair price/performance at $399, especially if there is a little headroom for improvement through driver optimizations. That price also assumes the mining price gouging starts to wind down.
Posted on Reply
#48
TheLostSwede
News Editor
bugYou have to be blind to not be able to tell TN Film from IPS. TN Film changes contrast as soon as you move up or down in your chair. Also, IPS glows.
The latest generation of TN panels is actually not as bad as they used to be. See this review, for example: www.tomshardware.com/reviews/dell-s2417dg-24-inch-165hz-g-sync-gaming-monitor,4788-5.html
Also, not all IPS panels suffer badly from the glow, but some have a weird coating that makes it worse.
Posted on Reply
#49
Unregistered
dozenfuryBased on AMD's recent history and what has dribbled out, I'm assuming the Vega ends up slightly faster than the 1070 but a bit slower than the 1080. That would be fair price/performance for $399 especially if there is a little headroom for improvement with driver optimizations. That price is also assuming the mining price gouging starts to wind down.
"1075 Ti" performance with driver updates, or probably even a "GTX 990" with its high power draw and heat output: better performance than a 1080, but also higher power draw and probably about the same FPS/watt as a 980 Ti. If games are well optimized for Vega and the drivers work properly, Vega may end up being a bit of a dark horse in the long run, but that remains to be seen. $499 would be a reasonable price tag IMO, especially if you factor in the money saved versus a G-Sync monitor: $300/€300 less (depending on region) for the same performance seems like a good deal to me, and even with a normal monitor, the $100 saving makes 1080 performance a bit more accessible. More than $500 could be a tough sell.
#50
Prince Valiant
Dimi200$ difference between G-sync & Freesync? Yeah if you buy shitty monitors from Asus & Benq maybe.
It's not like there are tons of panel options for gaming-oriented monitors. Any IPS at 144/165 Hz is still that accursed AUO panel, as far as I know.

Edit: Waaait, nevermind.
Posted on Reply