Wednesday, July 19th 2017
AMD's RX Vega Low Key Budapest Event: Vega Pitted Against GTX 1080
On the first stop of AMD's two-continent-spanning RX Vega tour (which, at only three locations, barely qualifies as one), the company pitted its upcoming RX Vega graphics card (we expect this to be its flagship offering) against NVIDIA's GTX 1080. The event itself was pretty subdued, and there was not much to see when it comes to the RX Vega graphics card - literally. Both it and the GTX 1080 were enclosed inside PC towers, with event-goers not allowed even a glimpse of the piece of AMD hardware that has most approximated a unicorn in recent times.
The Vega-powered system also made use of a Ryzen 7 processor, and the cards were running Battlefield 1 (or Sniper Elite 4; there's a lot of discussion about that, but the first image below does show a first-person view) on nondescript monitors, one supporting FreeSync, the other G-Sync. The monitors' model names were covered by cloth so that attendees weren't able to tell which system was running which graphics card, though due to ASUS' partnership in the event, both were (probably) of ASUS make. The resolution used was 3440 x 1440, which should mean over 60 FPS on the GTX 1080 on Ultra. Users who attended the event reported that one of the systems lagged slightly in one portion of the demo, though we can't confirm which one (and I'd say that was AMD's intention.)

All in all, I have to say, this tour doesn't inspire confidence in me. This isn't the kind of "in your face" comparison we're used to seeing from companies that know they have a winning product; were the comparison largely in AMD's favor, I posit the company would be taking full advantage of that by showcasing its performance leadership. There did seem to be an inordinate amount of smoke and mirrors here, with AMD going out of its way to prevent attendees from discerning between its offering and its competitor's.

AMD reportedly told attendees that the AMD and NVIDIA systems had a $300 price difference in AMD's favor. All other hardware being equal, and accounting for AMD's stance that a FreeSync monitor tends to cost around $200 less than a comparable NVIDIA G-Sync-enabled one, that leaves around $100 in savings attributable solely to the RX Vega part of the equation. This means the RX Vega could sell in the $459-$500 bracket, if current GTX 1080 pricing is what AMD considered.
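The implied price works out as simple subtraction; here is a quick sketch of that arithmetic. The GTX 1080 street price used below is an assumption consistent with the bracket mentioned above, not a figure AMD provided:

```python
# Back-of-the-envelope math from the reported claims.
# All figures are assumptions taken from the article, not AMD-confirmed pricing.
system_price_gap = 300   # $ difference between the two systems, per attendees
monitor_price_gap = 200  # $ typical G-Sync premium over FreeSync, per AMD's stance
gtx_1080_price = 559     # $ assumed GTX 1080 street price at the time

# Savings attributable to the GPU alone, once the monitor gap is removed
gpu_price_gap = system_price_gap - monitor_price_gap

implied_vega_price = gtx_1080_price - gpu_price_gap
print(f"Implied RX Vega price: ${implied_vega_price}")  # → $459
```

A higher assumed GTX 1080 price simply shifts the implied Vega price upward by the same amount, which is where the $459-$500 bracket comes from.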
Sources:
Reddit User @ Szunyogg, RX Vega Budapest Google Photos, WCCFTech
175 Comments on AMD's RX Vega Low Key Budapest Event: Vega Pitted Against GTX 1080
I will likely buy one, though, as I am still looking for a faster video card to replace my RX 580 for FreeSync gaming on my 2560x1440 144 Hz monitor...
That review is the one I've quoted from above ;)
Cmon, don't wcctardtech this crap.
Just saying.
This RX Vega or the FT edition?
In the CPU department, Ryzen is a huge step up from Bulldozer, but still >30% behind Skylake in IPC. AMD needs to keep developing, and I'm not talking about small tweaks. Intel will release its next architecture next year, so AMD needs to keep investing to keep up; relying on Zen for five years is not good news.
Regarding GPUs, AMD has been pretty stagnant since the launch of GCN, just minor tweaks while NVIDIA keeps innovating.
Meanwhile, AMD has spent billions on projects like "Skybridge" and K12, and on APUs, which are not profitable. AMD's research budget might be tight, but if they focused on two things instead of five, they could at least make a profit. The same story as always: default to waiting for "optimized" software, the final phase of the AMD product cycle. Well, they might not have estimated the performance exactly, but they did know the power consumption for a given clock frequency. NVIDIA manages to get 200-300 MHz more while consuming less energy. That comes down to chip design, which is no accident.
I'm just saying, when designing a GPU it's not like you can set TDP and processing power in stone and then go on to achieve both. There are margins, and sometimes you end up overstepping them, because of various factors.
GTX 1080 Ti if not
We will have to see what Volta brings
The Fermi parts also just demolished AMD in performance. The only AMD card capable of keeping up with the GTX 480 was the dual-GPU 5970. The GTX 590 came out and swept the field. Fermi also still receives driver updates, including DX12 support, on cards that are 7 years old; something we are not seeing from AMD.
Now let's look at this AMD card: 440 W for performance equal to a 180 W GTX 1080. Even Fermi didn't have a disparity like that. In fact, Fermi consumed half that power and competed within 30 watts of the 5970.
This kind of improvement, at a 50%+ power increase over the competition, is nothing but an overclocked Fury X with better manufacturing yields and fancy HBM2, adopted not for the sake of advanced technology but to cover up the monstrous power hunger of an aging architecture, keeping the clocks stable with that many shading units at 375 W.
Poor Vega, you got idiotic PR hype
Poor Raja, you had only one job
media.bestofmicro.com/1/2/242390/original/Crysis-1920.png
1.bp.blogspot.com/_9vgJ1nwu_xA/S64R8D60rWI/AAAAAAAAClg/r4madSxcd0E/s1600/GTX480+benchmark+tests+5cn7g6.png
$399 for a TN panel is not great value; it's selling an overpriced el-cheapo TN.
www.tomshardware.com/reviews/dell-s2417dg-24-inch-165hz-g-sync-gaming-monitor,4788-5.html
Look at the first picture there with the gradient bars. Top middle of screen is blue-ish, the rest of the bars are not. Backlight bleed and bad color uniformity + contrast shift of TN.
Color and black uniformity charts for that panel are among the worst of all panels in the comparison. White uniformity is good, but how often do you look at a 100% white canvas? That's right, never, and if you do, it's extremely unpleasant to the eye. The gamma of this monitor also does not stick to 2.2, which means that whatever you do, you'll crush blacks or lose bright tones.
So, it may look good to Tom's (which I find odd; the review reads like an advertorial), but in reality it's crap. That line about being unable to distinguish it from IPS is straight from the review as well. Hell, you can't even calibrate this panel to not show visible DeltaE errors.
If you then look at the comments from owners of this monitor, you can also see that it suffers from everything that belongs to budget-segment TN: bad QC, sharp edges on the plastic bezels, bad OSD buttons, etc.
Credibility - 1
www.techpowerup.com/reviews/Gigabyte/GeForce_GTX_480_SOC/10.html
That same 5870 doesn't look as hot once the 480 gets a factory OC
Instead, even with Vega, all we get now are 'leaks' and 'events' that just look silly, are outright misleading, or give no new information, further fueling either a hype train so the product underdelivers (Fury X) or, as with Vega right now, massive disappointment before the release is even here.
And don't even get me started on the advertorials, even here on TPU; they're quite possibly as low as you can go. Again, like I've said before, it's completely bizarre.
And you claimed you needed an HD 5970 to get the same performance... in fact, from there you can see an HD 5970 does a lot better.
And yeah, nothing looks as hot as the GTX 480, as the GTX 480 was extremely hot ;)