Wednesday, July 19th 2017
AMD's RX Vega Low Key Budapest Event: Vega Pitted Against GTX 1080
On the first stop of AMD's two-continent RX Vega tour (which, in truth, spans only three locations), the company pitted its upcoming RX Vega graphics card (which we expect to be their flagship offering) against NVIDIA's GTX 1080 graphics card. The event itself was pretty subdued, and there was not much to see of the RX Vega graphics card - literally. Both it and the GTX 1080 were enclosed inside PC towers, and event-goers were not allowed to catch even a glimpse of the piece of AMD hardware that has come closest to a unicorn in recent times.
The Vega-powered system also made use of a Ryzen 7 processor, and the cards were running Battlefield 1 (or Sniper Elite 4; there's some discussion going on about that, but the first image below does show a first-person view) on nondescript monitors, one supporting FreeSync, the other G-Sync. The monitors' model names were covered by cloth so that users weren't able to tell which system was running which graphics card, though given ASUS' partnership in the event, both were probably of ASUS make. The resolution used was 3440 x 1440, which should mean over 60 FPS on the GTX 1080 on Ultra settings. Users who attended the event reported that one of the systems lagged slightly in one portion of the demo, though we can't confirm which one (and I'd say that was AMD's intention).

All in all, I have to say, this tour doesn't inspire confidence in me. This isn't the kind of "in your face" comparison we're used to seeing from companies that know they have a winning product; were the comparison largely in AMD's favor, I posit the company would be taking full advantage of that by showcasing its performance leadership. There was an inordinate amount of smoke and mirrors here, with AMD going out of its way to prevent attendees from being able to tell its offering apart from its competitor's.

AMD reportedly told attendees that the AMD and NVIDIA systems had a $300 price difference in AMD's favor. All other hardware being equal, and accounting for AMD's stance that a FreeSync monitor tends to cost around $200 less than a comparable NVIDIA G-Sync-enabled one, that leaves around $100 in savings attributable solely to the RX Vega part of the equation. This means the RX Vega could sell in the $459-$500 bracket, if current pricing of the GTX 1080 is what AMD considered.
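For what it's worth, here's a back-of-the-envelope sketch of that pricing logic. The $300 and $200 figures are AMD's claims as reported above; the GTX 1080 street-price range ($559-$600) is an assumption, chosen to reflect cards selling well above the $499 MSRP at the time and to match the bracket derived above.

```python
# Back-of-the-envelope sketch of the implied RX Vega pricing.
# The GTX 1080 street-price range is an assumption; the gaps are AMD's claims.
gtx_1080_prices = (559, 600)   # assumed low/high GTX 1080 street prices (USD)
total_system_gap = 300         # AMD's claimed difference between the two systems
monitor_gap = 200              # AMD's claimed FreeSync vs. G-Sync monitor savings

gpu_gap = total_system_gap - monitor_gap           # ~$100 attributable to the GPU
implied = tuple(p - gpu_gap for p in gtx_1080_prices)
print(f"Implied RX Vega price bracket: ${implied[0]}-${implied[1]}")
# -> Implied RX Vega price bracket: $459-$500
```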
Sources:
Reddit User @ Szunyogg, RX Vega Budapest Google Photos, WCCFTech
175 Comments on AMD's RX Vega Low Key Budapest Event: Vega Pitted Against GTX 1080
You're bitching about bad PR, and in the same breath you expect AMD to say out loud that they don't have the drivers finished yet. That wouldn't be bad press then, somehow... It's no secret that AMD doesn't have the resources to pull off the "magic" shit NVIDIA does, so give them some bloody slack, geez. Everyone's pissing on AMD like their life depends on it. Tired of waiting? Buy a goddamn GTX 1080. Many have and many will. I've decided to wait, even though I was this <> close to just hitting the BUY button on an AORUS GTX 1080 Ti. I didn't wait last time with the Fury X, but I will now. RX Vega may turn out to be a good card despite the power consumption, but even if it doesn't, it'll still create some competition in the market, potentially making NVIDIA cards cheaper. If that makes them just 50€ less, then so be it. Right now is actually the worst time to be jumping on anything, with RX Vega literally around the corner. But whatever man, gotta go have a beer with the fanboys now...
Also, those 440 W. Yeah, it's a lot. But that's the water-cooled AND overclocked version, where you can place the radiator on the case exhaust, meaning it won't really affect case internal temperature. In its normal stock state it's 350 W, and for the air-cooled version, 300 W. Still not ideal, but for the right price and maybe some special features, who cares? Sure, it'll heat up the room, but so does my GTX 980, and I have to run AC anyway. So what difference does it make? In winter, you'll save on heating. This is no joke; my PC alone has heated my place for two winters now. The central heating radiator stayed closed except on the coldest days.
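To put those wattage figures in perspective, here's a rough sketch of the running-cost gap versus a GTX 1080 (NVIDIA's official ~180 W board power); the daily gaming hours and electricity price are pure assumptions.

```python
# Rough estimate of the running-cost difference between the 440 W
# watercooled/OC Vega figure and a ~180 W GTX 1080.
# Usage hours and electricity price are assumptions.
extra_watts = 440 - 180        # worst-case power gap between the two cards
hours_per_day = 3              # assumed daily gaming time
price_per_kwh = 0.20           # assumed electricity price (EUR/kWh)

extra_kwh_per_year = extra_watts / 1000 * hours_per_day * 365
extra_cost = extra_kwh_per_year * price_per_kwh
print(f"~{extra_kwh_per_year:.0f} kWh/year extra, ~{extra_cost:.0f} EUR/year")
# -> ~285 kWh/year extra, ~57 EUR/year
```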
People make way too much drama about power consumption. If it's great, excellent. If it's not, then you weigh the other benefits and tradeoffs and decide. But universally pissing on products over a certain power draw or thermals is becoming a really annoying habit of the actual fanboys, particularly from the green camp. My HD 7950 at 1.2 GHz was also a freaking furnace. But it was stupid fast, and I didn't care. That was my decision. Same with my GTX 980: I could run it ultra cool at stock, but I decided to max it all out. It's also a furnace; I could fry eggs on the backplate, it's that hot. But that's what I wanted and willingly decided on. Who are you to say what I want or don't want? The same applies to all potential buyers of the RX Vega. They're adults for the most part; we don't need you parroting how Vega's power draw is shit and horrible. We'll decide about that when it's actually released.
You could have an entire custom-loop-cooled 1080 Ti/7700K rig for less wattage than the Vega card shown. Pot, meet kettle: you have been doing the exact same thing in Vega's favor for months. The VRM design should have been the dead giveaway that this card would be power-hungry; AMD didn't invest money into its best reference VRM section for fun. These cards suck power like it's going out of style. Can you name a time when any manufacturer put its worst silicon on the most expensive cards it sells? Want to know something scary? There's a chance that is the best silicon AMD has right now.
"The Radeon™ Vega Frontier Edition graphics card is designed to simplify and accelerate game creation by providing a single GPU that is optimized for every stage of this workflow, from asset production, to playtesting, to performance optimization."
So which game features have AMD intentionally disabled on the card they call optimized for gaming? And how much time are they going to need? Another year? They've had more time than usual to polish the driver. Exactly. It would be unusual if they picked the lower binning and sold it for more…
It's not what AMD has intentionally disabled, it's what AMD hasn't implemented yet... Vega FE was released without that stuff because it was assumed it wouldn't be a game-changer for that card to lack it for now. A pure gaming card like RX Vega, however, depends entirely on those capabilities; if you don't have them ready yet, releasing it is somewhat pointless. That's why I keep bitching about idiots testing Vega FE as if it were a pure gaming card.
How much time are they gonna need? RX Vega is getting released for real at the end of this month. That much.
AMD might be starting a new trend of giving developers access to new silicon before anyone else so they have an opportunity to optimize for it before the main product launches.
We don't really know until the consumer RX Vega card launches. Frontier Edition was just weird.
Thing is, we're just speculating. AMD has never said why they released a Frontier Edition ahead of the main product. They also never said why Vega keeps getting kicked down the road. I would love to be a fly on the wall in RTG meetings on Vega to hear their reasoning for these things.
384-bit GDDR5X is faster than 2048-bit HBM2, BTW… And we do know HBM2 supplies are a problem, even for NVIDIA.
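That claim is easy to sanity-check: peak memory bandwidth is just bus width (in bytes) times per-pin data rate. A quick sketch, assuming the ~11.4 Gbps GDDR5X of a Titan Xp-class card (the 384-bit GDDR5X parts of the time) and Vega FE's reported ~1.89 Gbps HBM2:

```python
# Peak memory bandwidth = bus width (bits) / 8 * per-pin data rate (Gbps).
# Data rates below are the commonly reported figures for Titan Xp-class
# GDDR5X and Vega Frontier Edition HBM2.
def bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * gbps_per_pin

gddr5x = bandwidth_gbs(384, 11.4)   # Titan Xp-class: ~547 GB/s
hbm2   = bandwidth_gbs(2048, 1.89)  # Vega FE:        ~484 GB/s
print(f"384-bit GDDR5X: {gddr5x:.0f} GB/s, 2048-bit HBM2: {hbm2:.0f} GB/s")
```

So at Titan Xp data rates the claim holds, though at the 10 Gbps of the original Titan X (Pascal) the two come out roughly equal.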