
AMD Announces Radeon Vega Frontier Edition - Not for Gamers

Welp, always one step behind AMD :(

Why do people assume it's one step behind just because their release schedules don't line up as they used to and AMD's isn't first? Time-wise, sure, it's "late", but that's how it is. Product releases don't match up across companies anymore, and I don't think they'll line up any time soon. That would mean releasing Vega now and releasing Navi sometime in fall 2017, which is just not going to happen.
 
Not in-house...
Oh, "going to train 100k developers"
They are also in 50 million cars, aren't they?
"Much faster than 480" lol.

Keep clinging to that one. I played that card, and played it the entire time the FX series was their lineup. It isn't going to change how piss-poor the FX design was or how horridly it ruined their name in the market. That was on them, not Intel, not the consumers. They did a bad job and paid for it with market share.

Come on, more expensive, more power hungry, SLOWER Prescott outsold Athlon 64.

God knows what "market share" is shown in your picture, which country it covers, and whether it counts units or revenue. I'd bet on the former.
 
This statement makes no sense to me. What do you mean by this?
I mean that nvidia's claims are rather far from reality.
 
Come on, more expensive, more power hungry, SLOWER Prescott outsold Athlon 64.

Nobody's disputing that. What we are saying is that after the P4/Athlon 64 era, when AMD should have been able to capitalise, they instead shot themselves in the foot - multiple times - with the uncompetitive Bulldozer architecture.
 
Am I the only one who really likes that AMD isn't saying anything about the gaming version of "Vega"?

Why should they follow Nvidia's release cycle and make a bigger and better card than Nvidia's top-tier card? That isn't where the money is; the money is in mainstream gaming, not the enthusiast segment.

Just look at the RX 480/580: they have done well and made a lot of much-needed money for AMD. And believe me when I predict that AMD's next quarterly financials are going to be a hell of a lot better once the Ryzen sales numbers come out, with Vega to fuel the graphics department after that.

I for one don't like early leaks, fake news, call it whatever you want. The public doesn't have a right to know until release date, and I second that 100%.

For the moment, if you want the best of the best, just go green camp (GTX 1080 Ti); it's overpriced in most countries anyway.
If Vega can compete within a 5% margin at a lower price tag, I would say it's a slam dunk. Just let the top-tier cards all belong to Nvidia. Let them spend a lot of money developing a new architecture just for the bragging rights and low profit per unit, while you fill up the mainstream segment with competitive alternatives and make a much better profit per unit in the end. I really like this a lot :clap:, since AMD has been bleeding a lot of money that they need to earn back.
 
What we are saying is that after the P4/Athlon 64 era, when AMD should have been able to capitalise, they instead shot themselves in the foot - multiple times - with the uncompetitive Bulldozer architecture.

My memory tells me they struggled with fab expenses and had to sell their fabs because they were not able to properly capitalize on superior products.
Similar story with Fermi outselling the fantastic products AMD had (can't blame that one on strong-arming, though; the underdog basically always has to fight an uphill battle).

There is something about the "their own fault" narrative I simply cannot grasp.
 
My memory tells me they struggled with fab expenses and had to sell their fabs because they were not able to properly capitalize on superior products.
Similar story with Fermi outselling the fantastic products AMD had (can't blame that one on strong-arming, though; the underdog basically always has to fight an uphill battle).

There is something about the "their own fault" narrative I simply cannot grasp.

Fermi was hot and hungry, but Nvidia did something people thought wasn't possible: they dropped a full-core Fermi as the 580.
It helped stem the bleeding from a power monster and set them up well.
 
The existence of Ryzen seems to disagree with you. Its performance is the definition of "enthusiast": it's a whole 2-5% slower in games (while still maintaining over 100 FPS) and wallops Intel in creative production, with better price/performance to boot and an 8-core chip in the standard socket that doesn't suck.

That simply isn't true. When AMD was competitive, i.e. the 5000 and 7000 series, they sold quite well.

AMD could never get their driver game together when they had good hardware, and that scared away quite a few consumers. By the time they got their drivers mostly straightened out (the 400 series), their hardware was a generation behind.

Given how complicated GPUs are, it will take years for RTG to straighten out the damage that AMD did to the ATI GPU division. With Ryzen selling well and Vega finally out, R&D should finally start catching up to Nvidia.
On the 5000 series AMD did have the better product, but Nvidia still outsold them. I believe AdoredTV did a video on this.
 
Come on, more expensive, more power hungry, SLOWER Prescott outsold Athlon 64.

God knows what "market share" is shown in your picture, which country it covers, and whether it counts units or revenue. I'd bet on the former.

Market share didn't plummet until Core 2 Duo. AMD should have exceeded Intel's market share during the P4 era, but bad business practices prevented that.

As for the chart, that's the worldwide one that was posted on TPU not even two months ago... so I don't know, ask the news editors?
 
Or how people were still buying the TOTALLY inferior GeForce FX cards even though any common sense would tell you to stay away from them like a vampire from garlic. But when the roles are reversed and AMD is just mildly worse at a few specific things, it's OMG OMG AMD SUXORZ (because the only thing slightly worse is power consumption, in the case of Polaris). Sometimes you just can't understand people.
 
Or how people were still buying the TOTALLY inferior GeForce FX cards even though any common sense would tell you to stay away from them like a vampire from garlic. But when the roles are reversed and AMD is just mildly worse at a few specific things, it's OMG OMG AMD SUXORZ (because the only thing slightly worse is power consumption, in the case of Polaris). Sometimes you just can't understand people.

Slightly worse? AMD doesn't offer anything in the HEDT market, period.
 
Vega has been hyped as a gamer card for over a year (also keep in mind we're almost a year out from the 1070/1080 releases), and now, less than three weeks before E3, it's suddenly switched to being a professional/deep-learning card instead of a gaming one?

It's amazing what AMD has been able to accomplish in the David vs. Goliath competition with Intel and Nvidia. But the reality is that with their relatively tiny R&D budget there's no way they can hang with them. Marketing is cheap by comparison, and AMD is great at that. So over and over it's the same story: incredible pre-release claims that get people salivating, then varying degrees of letdown and defending AMD after release. AMD is to be commended for what they've been able to do. But I'd be kicking myself if I had been waiting a year for Vega only to learn that it's both delayed again in gaming form and a professional card first.
 
Slightly worse? AMD doesn't offer anything in the HEDT market, period.

I'm talking Polaris vs Pascal (in its respective segment, i.e. the RX 580 vs the GTX 1060). The only thing the RX 580 can be faulted for is power consumption, and even that quite frankly isn't bad at all. They aren't loud, they aren't slow. And yet everyone is taking the piss out of them almost more than anyone ever did back in the day over the GeForce FX, which was hot, loud, more power hungry, and had a terminal design flaw in the rendering pipeline.
 
Did we actually get any new useful information about Vega yesterday?
 
What are you all complaining about? This conference was not for consumers/"gamers"; it was basically for shareholders and investors. Our time is at the end of the month. Vega is a segmented product. When a company holds a press event outlining its strategy for the next 2-5 years, why do people hoping to hear about an unreleased product get upset because they didn't hear what they wanted to hear? I'm pretty sure that's what Computex is all about (us gamers). AMD did not hold any of your hands and tell you not to make a purchase.

@efikkan yes and no... nothing concrete, just things to make assumptions on again.
 
I'm talking Polaris vs Pascal (in its respective segment, i.e. the RX 580 vs the GTX 1060). The only thing the RX 580 can be faulted for is power consumption, and even that quite frankly isn't bad at all. They aren't loud, they aren't slow. And yet everyone is taking the piss out of them almost more than anyone ever did back in the day over the GeForce FX, which was hot, loud, more power hungry, and had a terminal design flaw in the rendering pipeline.

Performance of a 1060 and the power consumption of a 1070/1080. Depending on the card, they are loud under load, and even worse when you have two of them.
 
That's the exact thing I don't understand, people. It doesn't fucking matter. And it can't be loud, because if it is, then my GTX 980 is also stupendously loud, and I can hardly hear it. We all used to run 300W goddamn graphics cards and no one batted an eye. But now, all of a sudden, power consumption ranks above framerate in priority, and framerate is basically the whole reason we even bother buying this shit.
 
That's the exact thing I don't understand, people. It doesn't fucking matter. And it can't be loud, because if it is, then my GTX 980 is also stupendously loud, and I can hardly hear it. We all used to run 300W goddamn graphics cards and no one batted an eye. But now, all of a sudden, power consumption ranks above framerate in priority, and framerate is basically the whole reason we even bother buying this shit.

Power consumption has been a topic of discussion pretty much since they started adding power connectors to cards.

Remember, I ran two of these prior to my 1080 Ti. They were some of the better cards out there. They were also hot, loud, and slower than one Ti.

Now, with a BIOS mod, I have already seen over 380W of consumption from my Ti, but that's a far cry less than the 520W I saw from my pair of 480's.
 
The average Joe doesn't buy a pair of cards when most can't even afford a single one... And yes, you're all overdramatizing the power consumption. Also, the RX 480 is a 150W card (after the driver fix was applied). That's 300W for a pair, not 520W... Do I really have to drag out the fact that two cards were NEVER more efficient than a single-GPU card? It's physically impossible.
 
The average Joe doesn't buy a pair of cards when most can't even afford a single one... And yes, you're all overdramatizing the power consumption. Also, the RX 480 is a 150W card (after the driver fix was applied). That's 300W for a pair, not 520W... Do I really have to drag out the fact that two cards were NEVER more efficient than a single-GPU card? It's physically impossible.

A couple of things: the 520W wasn't dramatized, that was actual power draw, and it was enough that I had to go from my 750W Platinum Seasonic to a 1200W unit for stability.

Raja himself said that two 480's were more efficient than a 1080.
 
Let's look at it as a comparison of the Fiji chip versus the Vega chip.

They can't put it next to the Pro Duo, the last professional card released, because it is slower. Unfortunately, people haven't quite taken that into account yet. If it is slower than a pair of Fiji cards, it's slower than the 1080 Ti, let alone a Titan.
 