
AMD Works on At Least Three Radeon RX Vega SKUs, Slowest Faster than GTX 1070?

That would show a massive improvement in the process. I mean, the BEST RX 480/580s can hit 15xx MHz with an overclock on high-end air and water. Releasing a card at stock with those numbers would mean using the top 5% of 480/580 dies. This die is even larger... I cannot imagine the binning that would be required with current tech.

Isn't the shader density lower than Polaris? I mean, if that's the case, in theory the thermals should be more manageable...
 
Oh...is it suddenly ok to discuss this topic now? Really? Says who? Double standard much?

The thing was, certain individuals were going: "Hey, look at this bench, it performs worse than a 1080, Vega is going to suck donkey balls." Now it turns out there are 3 SKUs; doesn't that make the aforementioned assessment kinda premature and baseless? I don't think discussing the subject is bad, but I do think crystal-balling based on scraps of info found lying about the internets is. Now, how is this a double standard when there aren't any conclusions being made other than the possibility of multiple SKUs?
 
Isn't the shader density lower than Polaris? I mean, if that's the case, in theory the thermals should be more manageable...

I honestly don't know; I have not looked into the design of Vega much, since nothing out there is from AMD. I will look into it more on actual product release.
 
You do realize the 1070, 1080 & 1080 Ti are overpriced, right? GPUs should never cost so much. They're ridiculously overpriced. Hopefully AMD doesn't make the same mistake of overpricing Vega, regardless of how fast it is.

Are they expensive? Hell yes!

Are they over-priced? Not at all.

So long as people keep buying them at their current prices the market will allow for Nvidia to charge more and more with each new generation.

If they were not selling and stores were left with stock on the shelves and in their warehouses, THEN they would be over-priced. That's just how free markets work. If you don't like it, speak with your money and hope others follow suit.
 
Are they expensive? Hell yes!

Are they over-priced? Not at all.

So long as people keep buying them at their current prices the market will allow for Nvidia to charge more and more with each new generation.

If they were not selling and stores were left with stock on the shelves and in their warehouses, THEN they would be over-priced. That's just how free-markets work. If you don't like it, speak with your money and hope others follow suit.

Yep, it's a free market and consumers dictate what is and isn't acceptable.

The whole over-priced argument is laughable.
 
Are they expensive? Hell yes!

Are they over-priced? Not at all.

So long as people keep buying them at their current prices the market will allow for Nvidia to charge more and more with each new generation.

If they were not selling and stores were left with stock on the shelves and in their warehouses, THEN they would be over-priced. That's just how free-markets work. If you don't like it, speak with your money and hope others follow suit.

Yep, it's a free market and consumers dictate what is and isn't acceptable.

The whole over-priced argument is laughable.


Your reasoning is so flawed I don't even know where to start.
Overpricing is a common tactic. When you are the de facto only player in a particular market, you can dictate the price of your product. You can raise prices only so much, BTW, because there is a sweet spot where you extract the maximum amount of money from the consumer without him noticing. This sweet spot is waaaaaaaaaay over the "fair" price the product would have in a competitive situation.
So the product IS, in fact, overpriced.
Competition being what it is, if Vega delivers, you will see prices where they should have been.
 
Are they expensive? Hell yes!

Are they over-priced? Not at all. (???)

So long as people keep buying them at their current prices the market will allow for Nvidia to charge more and more with each new generation.

If they were not selling and stores were left with stock on the shelves and in their warehouses, THEN they would be over-priced. That's just how free-markets work. If you don't like it, speak with your money and hope others follow suit.
Not overpriced?? Dude, you either have more money than common sense or you don't really know what you are talking about...

Do you know, for example, how much the fastest video card cost in, let's say, 2009? Yeah, $350. That's because back then there was still competition going on at the high end.
Now, since nVidia owns everything, the price has just about doubled.
And you are saying it's not overpriced?? :))))))))
 
Yeah yeah if and if when and when, NV are mean blah blah and blah.
 
Prices increased even when competition existed. I remember when a top-of-the-line GeForce 2 GTS cost around 69.000 SIT (Slovenian tolars). That's an equivalent of today's ~290€. And we had ATi, Matrox and 3dfx back then. Now, top-of-the-line cards go for 700-800€. I think it's not the competition but the complexity of the chips: as they got more powerful, the cost of making them also increased. A GeForce 2 GTS looks like a toy even compared to the bottommost current model, like the Radeon RX 550 for example...
 
Not overpriced?? Dude, you either have more money than common sense or you don't really know what you are talking about...

Do you know, for example, how much the fastest video card cost in, let's say, 2009? Yeah, $350. That's because back then there was still competition going on at the high end.
Now, since nVidia owns everything, the price has just about doubled.
And you are saying it's not overpriced?? :))))))))
The GTX 285 cost $400 (~$455 today) and the GTX 295 cost $500 (~$570 today).
But those were refreshes that dropped the price of the GTX 280, which had launched at $649 (~$735 today).

Back in 2006 the 8800 GTX cost $600 (~$725 today).

So no, it's not overpriced. Get some knowledge instead of spreading misinformation.
 
Adjusting for inflation (you can find dozens of tools online), 290€ from the year 2000 equates to about 406€ today.
Keep in mind that, overall, production costs have been halved in the meantime; producing a chip in 2000 cost much more than producing one today. Memory has fluctuated, though, due to availability.
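For what it's worth, those online inflation tools are basically just compounding an average annual rate. A back-of-envelope sketch (the ~2% average eurozone inflation figure here is my own rough assumption, not an official number):

```python
# Back-of-envelope inflation adjustment: compound an assumed
# average annual rate over the elapsed years.
def adjust_for_inflation(amount, years, avg_rate=0.02):
    """Return the rough equivalent of `amount` after `years` of inflation."""
    return amount * (1 + avg_rate) ** years

# 290 EUR in the year 2000, ~2% average inflation, 17 years to 2017:
print(round(adjust_for_inflation(290, 17)))  # ~406
```

Which lands right around the 406€ figure above, so ~2% average is about what those tools are assuming.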
 
The GTX 285 cost $400 (~$455 today) and the GTX 295 cost $500 (~$570 today).
But those were refreshes that dropped the price of the GTX 280, which had launched at $649 (~$735 today).

Back in 2006 the 8800 GTX cost $600 (~$725 today).

So no, it's not overpriced. Get some knowledge instead of spreading misinformation.
Misinformation? Hardly. Do some more research.
Ever heard of the ATI HD 5870 cards? They blew any of the overpriced nVidia cards released back then out of the water:
https://www.techpowerup.com/102342/radeon-hd-5870-aggressively-priced-report
https://en.wikipedia.org/wiki/Radeon_HD_5000_Series#Desktop_products
http://www.anandtech.com/show/2841
 
The GTX 285 cost $400 (~$455 today) and the GTX 295 cost $500 (~$570 today).
But those were refreshes that dropped the price of the GTX 280, which had launched at $649 (~$735 today).

Back in 2006 the 8800 GTX cost $600 (~$725 today).

So no, it's not overpriced. Get some knowledge instead of spreading misinformation.

Indeed, found this interesting: http://hexus.net/tech/news/graphics...price-history-high-end-nvidia-gpus-tabulated/

Prices are in line, and that is without any competition for what seems like donkey's years.
 
This is what AMD released officially.

[Official AMD Vega slides, via VideoCardz: AMD-VEGA-VIDEOCARDZ-16/13/12/26.jpg]

Holy cow, didn't realize the pics were so large. There are many more, but here are a few.
 
Adjusting for inflation (you can find dozens of tools online), 290€ from the year 2000 equates to about 406€ today.
Keep in mind that, overall, production costs have been halved in the meantime; producing a chip in 2000 cost much more than producing one today. Memory has fluctuated, though, due to availability.

Still, 406€ for a top-of-the-line GPU, compared to almost twice as much today, can't be attributed only to AMD not being tough competition. And how can a chip with 400x more transistors on a smaller node be easier and cheaper to produce than a large chip with a few million transistors?
 
They are not overpriced. Nvidia's prices are in line with where they were ten years ago. It's a free market, and it only takes a simple understanding of economics to see that the market will bear whatever price people are willing to pay.

It has nothing to do with whether I have too much money I can afford to waste. Notice I don't have Nvidia's latest and greatest, and only bought an EOL 980 Ti when the 1080 was released. So it's not personal preference or a desire to waste money. But I understand economics, and can thus objectively state that if people were not buying all that Nvidia can sell at current prices, then said prices would drop.
 
They are not overpriced. Nvidia's prices are in line with where they were ten years ago. It's a free market, and it only takes a simple understanding of economics to see that the market will bear whatever price people are willing to pay.

It has nothing to do with whether I have too much money I can afford to waste. Notice I don't have Nvidia's latest and greatest, and only bought an EOL 980 Ti when the 1080 was released. So it's not personal preference or a desire to waste money. But I understand economics, and can thus objectively state that if people were not buying all that Nvidia can sell at current prices, then said prices would drop.

This is just plain wrong.

https://en.wikipedia.org/wiki/GeForce_600_series

Nvidia's top-end 600 series card was the 680, for only $500. Nvidia then created two new GPU tiers in the Ti cards (first in the 700 series) and the Titan cards, and charged significantly more for their full die than in previous generations. You are straight up paying much more than you used to.

So if you are following GPU pricing, it's plain to see that Nvidia is milking its customers due to lack of competition.
 
Nvidia's top end 600 series card was the 680 for only $500.
Except... you know nothing about NVIDIA's chip structure. The 580 was the last 80-series card to use the top-level chip. NVIDIA uses x04 and x06, for example, on their mid-level and lower chips. All their high-end chips end in x00 or x10. GK104 in the 680... yeah, NOT top end.

That 680 was NVIDIA's MID-LEVEL Kepler chip. So, $500 almost 4 years ago... yep, about right where it is now.
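The naming scheme described above can be summed up in a few lines. The tier labels are my own shorthand, but the chip-to-card mapping matches the Fermi/Kepler era being discussed:

```python
# NVIDIA chip codes of the Fermi/Kepler era: a lower numeric suffix
# means a bigger die (x00/x10 = flagship, x04 = upper mid-range).
chip_tiers = {
    "GF110": ("GTX 580", "big die / flagship"),
    "GK104": ("GTX 680", "upper mid-range die, sold as the flagship card"),
    "GK110": ("GTX 780 Ti / Titan", "big die, released later in the cycle"),
}

for chip, (card, tier) in chip_tiers.items():
    print(f"{chip}: {card} -- {tier}")
```

So the $500 GTX 680 sat on the same class of silicon (x04) as the $250-ish GTX 560 Ti's GF114 did a generation earlier, which is the whole point of the argument.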
 
This is just plain wrong.

https://en.wikipedia.org/wiki/GeForce_600_series

Nvidia's top-end 600 series card was the 680, for only $500. Nvidia then created two new GPU tiers in the Ti cards (first in the 700 series) and the Titan cards, and charged significantly more for their full die than in previous generations. You are straight up paying much more than you used to.

So if you are following GPU pricing, it's plain to see that Nvidia is milking its customers due to lack of competition.

The 680 wasn't top end. It was mid-range. Most people that followed GPUs knew that, including W1zzard.
From his 680 review:

"Technically GK104, as its name reveals is an upper mid-range GPU, not a pure high-end part. Following NVIDIA's naming convention such a chip would be called GK100. This subtle difference makes the GeForce GTX 680 even more impressive. Technically we'd have to compare it to GTX 560 Ti, not GTX 580.."

https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_680/32.html

They got away with the $500 price because the 680 was a bit faster than the 7970 at $550, which was AMD's high-end card at the time.

They didn't create a new tier with the 780 Ti. That was simply the Kepler high end, like the 580 was the Fermi high end. They just didn't have much motivation to release it, due to the lack of competition, and the Kepler Titan was "selling like hotcakes" according to Mr. Huang.
 
For further enlightenment, to add to the learned @64K TPU review quote:

http://www.anandtech.com/show/5699/nvidia-geforce-gtx-680-review

"What you won’t find today however – and in a significant departure from NVIDIA’s previous launches – is Big Kepler. Since the days of the G80, NVIDIA has always produced a large 500mm2+ GPU to serve both as a flagship GPU for their consumer lines and the fundamental GPU for their Quadro and Tesla lines, and have always launched with that big GPU first. At 294mm2 GK104 is not Big Kepler, and while NVIDIA doesn’t comment on unannounced products, somewhere in the bowels of NVIDIA Big Kepler certainly lives, waiting for its day in the sun. As such this is the first NVIDIA launch where we’re not in a position to talk about the ramifications for Tesla or Quadro, or really for that matter what NVIDIA’s peak performance for this generation might be."

Underlining and bolding by me, for emphasis.
 
"What you won’t find today however – and in a significant departure from NVIDIA’s previous launches – is Big Kepler. Since the days of the G80, NVIDIA has always produced a large 500mm2+ GPU to serve both as a flagship GPU for their consumer lines and the fundamental GPU for their Quadro and Tesla lines, and have always launched with that big GPU first. At 294mm2 GK104 is not Big Kepler, and while NVIDIA doesn’t comment on unannounced products, somewhere in the bowels of NVIDIA Big Kepler certainly lives, waiting for its day in the sun. As such this is the first NVIDIA launch where we’re not in a position to talk about the ramifications for Tesla or Quadro, or really for that matter what NVIDIA’s peak performance for this generation might be."
Yes, Kepler was the first architecture where they were unable to mass-produce enough of the "big chips" early on. You probably remember that the "GTX 680" was actually renamed last minute; it was supposed to be called "GTX 670 Ti", with the planned "GTX 680" using the big chip. I remember the box of my GTX 680 had stickers on every side, covering the last-minute name change. The Kepler refresh actually used the same stepping of GK104, but with a revised GK110 to serve as the high end.
 
I thought it was the other way around.

I believe "deeper" in that context means more instructions, "more steps," and each step will now be narrower.
 
Yeah, each step is smaller but there are more of them. Because the steps are smaller, fewer transistors have to flip per step, which means clockspeeds can be higher. But when there are more steps, a problem (like a cache miss) can waste more clock cycles while the pipeline fills back in (a frame time spike). Also, with smaller steps, less of the processor has to be active carrying them out, which translates to lower power consumption. GCN to date has had short pipelines, and so did Kepler: to carry out the simplest of tasks, huge swaths of the graphics core had to be activated. Maxwell changed to long pipelines for NVIDIA, and Vega will do the same for AMD.

Remember, long pipelines require higher clockspeeds to match the performance of short pipelines at lower clockspeeds, because there are more steps (each step requiring a clock to complete) to render a frame.
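That tradeoff is easy to see in a toy model. The numbers below (10 ns of total logic, 0.1 ns of latch overhead per stage, 2% stall rate) are made up for illustration, not real GPU figures; the point is only that deeper pipelines shorten the cycle time but make each stall waste roughly one pipeline's worth of cycles:

```python
# Toy pipeline model: fixed total logic is split into `depth` stages.
# More stages -> shorter cycle (higher clock), but each stall (e.g. a
# cache miss) wastes roughly `depth` cycles refilling the pipe.
def time_per_op(depth, total_logic_ns=10.0, latch_ns=0.1, stall_rate=0.02):
    cycle_ns = total_logic_ns / depth + latch_ns   # per-stage latch cost
    stall_cycles = stall_rate * depth              # expected refill cost per op
    return cycle_ns * (1 + stall_cycles)

for depth in (5, 10, 20, 40):
    print(f"depth {depth:2d}: {time_per_op(depth):.2f} ns/op")
```

With these numbers, average time per operation keeps dropping as the pipeline deepens, but the gains shrink because the stall penalty grows with depth; push depth far enough and it turns around. That's the balance both Maxwell and (per the rumors discussed here) Vega are playing with.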
 