Monday, March 20th 2017

AMD's Upcoming RX 500 Rebrands to use LPP Process - Higher Clocks, Lower Power

AMD's upcoming RX 500 series of graphics cards is not going to set the world on fire with its feature set. Essentially rebrands of the mainstream Polaris GPUs used in the current-generation RX 400 series, these cards have recently seen a slight delay in their time to market, which is now set at April 18th.

While architecture-level adjustments to improve this new series' performance seem to be off the table, AMD is apparently looking to take advantage of manufacturing maturity and process improvements. The original Polaris 11 and Polaris 10 chips were manufactured on the Low Power Early (LPE) process, which balances availability, yields, and time-to-market against performance and power. New reports peg the new dies as carrying the Polaris 21 and Polaris 20 monikers, and they will feature higher clocks on account of the newer Low Power Plus (LPP) process.
As for the higher clocks, these apparently do little more than bridge the gap between the RX 480's reference and custom boards. The RX 580 will reportedly carry a 1340 MHz clock (74 MHz more than the reference RX 480), with the RX 570 carrying a much less significant 38 MHz increase over its RX 470 counterpart. The Radeon RX 560 will apparently make do with a clock speed of 1287 MHz.
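As a quick sanity check on those numbers, here is a minimal sketch of the implied deltas, assuming the commonly listed reference boost clocks of 1266 MHz (RX 480), 1206 MHz (RX 470), and 1200 MHz (RX 460); only the 1340 MHz, +38 MHz, and 1287 MHz figures come from the report itself:

```python
# Rough arithmetic behind the reported clock bumps.
# The reference boost clocks below are assumed baselines, not from the report.
reference_boost = {"RX 480": 1266, "RX 470": 1206, "RX 460": 1200}  # MHz

reported = {
    "RX 580": 1340,                             # reported outright
    "RX 570": reference_boost["RX 470"] + 38,   # reported as +38 MHz over the RX 470
    "RX 560": 1287,                             # reported outright
}

for new, old in (("RX 580", "RX 480"), ("RX 570", "RX 470"), ("RX 560", "RX 460")):
    delta = reported[new] - reference_boost[old]
    print(f"{new}: {reported[new]} MHz ({delta:+d} MHz vs. reference {old})")
```

Those deltas work out to roughly a 3-7% clock bump, about what a process tweak, rather than an architectural change, would be expected to deliver.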

These clock improvements only go so far as to allow AMD to claim a measure of increased performance compared to its previous-generation, same-architecture, one-year-old graphics cards. Vega is the only product from the company that will offer some semblance of originality. A shame AMD didn't bring some of Vega's refinements to its mainstream graphics cards.
Source: BenchLife

62 Comments on AMD's Upcoming RX 500 Rebrands to use LPP Process - Higher Clocks, Lower Power

#51
Kanan
Tech Enthusiast & Gamer
cdawall: Well, to be fair, it shows them within 1% of each other at all resolutions across all games, DX11 and DX12... That is using reference cards as well, and I am curious which settings W1z is using for testing the 480, since they have two options now with the whole power fiasco.
Hardly a fiasco; only people reading these websites even knew about it. Everyone else simply enjoyed the GPU, and it worked perfectly normally, too. We're talking about a few fucking watts here. The media is to blame for making a "fiasco" out of it, but for me it never was one. What a joke, really.

And the RX 480 is faster than the 1060 if you compare non-throttling custom cards to custom GTX 1060s. Reference AMD cards are just bad; the exceptions prove the rule.
Posted on Reply
#52
medi01
danbert2000: performance summary
danbert2000: 1060 is generally faster at anything DX11
Bullshit.

That's why you actually shouldn't use TPU benchmarks. (Show me the improvement listed below on the TPU site.)

DX11
On release (July 2016), the GTX 1060 was around 12% and 8% better than the RX 480 at 1080p and 1440p. Now (Dec 2016), the GTX 1060 is 2% and 0% better than the RX 480 at 1080p and 1440p.

DX12
On release (July 2016), the GTX 1060 was around 3% and 4% worse than the RX 480 at 1080p and 1440p. Now (Dec 2016), the GTX 1060 is 6% and 6% worse than the RX 480 at 1080p and 1440p.
www.hardwarecanucks.com/forum/hardware-canucks-reviews/73945-gtx-1060-vs-rx-480-updated-review.html
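A back-of-the-envelope check of what those 1080p DX11 numbers imply, assuming the GTX 1060's absolute performance stayed roughly constant between the two test dates (an assumption on my part, not something the review states):

```python
# Implied RX 480 driver-side gain, assuming the GTX 1060's absolute performance
# was roughly constant between July and December 2016.
# Inputs are the HardwareCanucks deltas quoted above (1080p, DX11).
gap_july = 1.12  # GTX 1060 ~12% ahead of the RX 480 in July
gap_dec = 1.02   # GTX 1060 ~2% ahead in December

rx480_gain = gap_july / gap_dec - 1
print(f"Implied RX 480 improvement: {rx480_gain:.1%}")  # ~9.8%
```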


So no, not really.

And all that is ignoring the huge adaptive-sync (G-Sync) premium on NVIDIA-compatible monitors.
Posted on Reply
#53
TheMailMan78
Big Member
medi01: Bullshit.

That's why you actually shouldn't use TPU benchmarks. (Show me the improvement listed below on the TPU site.)

DX11
On release (July 2016), the GTX 1060 was around 12% and 8% better than the RX 480 at 1080p and 1440p. Now (Dec 2016), the GTX 1060 is 2% and 0% better than the RX 480 at 1080p and 1440p.

DX12
On release (July 2016), the GTX 1060 was around 3% and 4% worse than the RX 480 at 1080p and 1440p. Now (Dec 2016), the GTX 1060 is 6% and 6% worse than the RX 480 at 1080p and 1440p.
www.hardwarecanucks.com/forum/hardware-canucks-reviews/73945-gtx-1060-vs-rx-480-updated-review.html


So no, not really.

And all that is ignoring the huge adaptive-sync (G-Sync) premium on NVIDIA-compatible monitors.
lol hardwarecanucks? Really man? Dude those guys are a known joke around here. I can tell you're new to this.
Posted on Reply
#54
cdawall
where the hell are my stars
Kanan: Hardly a fiasco; only people reading these websites even knew about it. Everyone else simply enjoyed the GPU, and it worked perfectly normally, too. We're talking about a few fucking watts here. The media is to blame for making a "fiasco" out of it, but for me it never was one. What a joke, really.

And the RX 480 is faster than the 1060 if you compare non-throttling custom cards to custom GTX 1060s. Reference AMD cards are just bad; the exceptions prove the rule.
Overdrawing the PCIe slot by more than a couple of watts, on a card that shouldn't have drawn more than 95 W to begin with, would be a fiasco to me as a company. That's called not delivering.
Posted on Reply
#55
eidairaman1
The Exiled Airman
cdawall: Overdrawing the PCIe slot by more than a couple of watts, on a card that shouldn't have drawn more than 95 W to begin with, would be a fiasco to me as a company. That's called not delivering.
Known as not staying within the power specs of the PCI-SIG.
Posted on Reply
#56
Kanan
Tech Enthusiast & Gamer
cdawall: Overdrawing the PCIe slot by more than a couple of watts, on a card that shouldn't have drawn more than 95 W to begin with, would be a fiasco to me as a company. That's called not delivering.
You're exaggerating. It got fixed later anyway, so who cares.

95 W is wrong anyway; the RX 480 drew a few watts more than 150 W (the card's TDP), and the RX 470 and 460 did their jobs as intended.
Posted on Reply
#57
cdawall
where the hell are my stars
Kanan: You're exaggerating. It got fixed later anyway, so who cares.

95 W is wrong anyway; the RX 480 drew a few watts more than 150 W (the card's TDP), and the RX 470 and 460 did their jobs as intended.
I'm not. Go look at everything Raja said and what was released for the embedded market. The 480 was never meant to be a 150 W GPU. The crap process at freaking GloFo sucked AGAIN.

It also still isn't fixed; the band-aid fix still pulls the absolute maximum of the spec. That is no good for CrossFire setups or older boards.
Posted on Reply
#58
Kanan
Tech Enthusiast & Gamer
cdawall: I'm not. Go look at everything Raja said and what was released for the embedded market. The 480 was never meant to be a 150 W GPU. The crap process at freaking GloFo sucked AGAIN.

It also still isn't fixed; the band-aid fix still pulls the absolute maximum of the spec. That is no good for CrossFire setups or older boards.
I don't care what Raja said; a lot of people make mistakes, that's called being "human". The RX 480 was tuned to be as fast as possible to be more competitive on the market; that's why they upped the TDP to over 150 W.

It pulls the max of the spec? Last time I checked, the "maximum of the spec" wasn't all that much. Also, it's _not_ pulling the maximum of the spec; it's pulling about 10-15 W less after the driver update. So it's fine for everything; otherwise, give me hard facts: examples of systems that blew up AFTER the fix.

I also read that pulling *more than spec* is no problem on modern mainboards (as long as it's only 5-10 W), meaning mainboards not older than 8 to 10 years. You're really making something out of nothing.
Posted on Reply
#59
Frick
Fishfaced Nincompoop
rtwjunkie: Seriously? Did you check his GPU in system specs? :rolleyes:

And since when are we not allowed to make comments about a brand without being labeled a fanboy? Some people can actually see things objectively, you know!
Everything on the internet is objective, but most objective of them all are opinions. This is a fact.
Posted on Reply
#60
cdawall
where the hell are my stars
Kanan: I don't care what Raja said; a lot of people make mistakes, that's called being "human". The RX 480 was tuned to be as fast as possible to be more competitive on the market; that's why they upped the TDP to over 150 W.

It pulls the max of the spec? Last time I checked, the "maximum of the spec" wasn't all that much. Also, it's _not_ pulling the maximum of the spec; it's pulling about 10-15 W less after the driver update. So it's fine for everything; otherwise, give me hard facts: examples of systems that blew up AFTER the fix.

I also read that pulling *more than spec* is no problem on modern mainboards (as long as it's only 5-10 W), meaning mainboards not older than 8 to 10 years. You're really making something out of nothing.
That wasn't a mistake; he was quoting what should have been released. What we ended up with instead was a typical "oh shit" product released well past its efficiency curve.

The good news is they have already released an RX 480 embedded solution with a 95 W TDP for the entire board, including the memory, and at full desktop GPU clocks. That again adds fuel to this fire. Yet again, AMD was horribly let down by GloFo and was forced to release something that is nowhere near what they bragged about in the press. This is bad for EVERYONE.

Also, some food for thought for you. Would you consider the Gigabyte X99M-Gaming 5 to be a modern motherboard? Two different boards with two different brands' sets of 480s, and neither could run CrossFire at stock settings stably. Mind you, these were in two completely different machines with different power supplies, processors, RAM, SSDs, etc. Same exact overdraw issue. Luckily, modern boards just shut off instead of catching on fire like the old ones did. This problem was with the drivers released in 2017, so it is post power fix. The problem still exists; the driver fix was nothing more than a band-aid. Remember, they were drawing 6.7 A off the 12 V rail, which exceeds the 5.5 A spec. After 16.7.1 they draw 5.6 A off the motherboard's 12 V rail, which... drum roll please... STILL EXCEEDS SPEC.

The card design literally looks like it should draw 95 W; the cooler is barely adequate at 150 W, but absolutely excellent at 95 W. With the power plan of the reference card, that puts 47 W on the motherboard slot and 47 W on the PCIe 6-pin. Both of those numbers are normal and well within spec. This yet again points towards a card that should have been, and was designed to be, 95 W, but GloFo failed to deliver yields that efficient, and we got what we got.
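For reference, a small sketch of the arithmetic being argued here; the 5.5 A (66 W) slot 12 V limit and the 75 W 6-pin limit are the commonly cited PCI-SIG figures, while the 6.7 A / 5.6 A currents and the 47 W + 47 W split come from the post above:

```python
# Commonly cited PCI-SIG limits: the slot's 12 V rail is rated for 5.5 A (66 W),
# the slot as a whole for ~75 W, and a single 6-pin connector for 75 W.
SLOT_12V_AMP_LIMIT = 5.5
SLOT_12V_WATT_LIMIT = 5.5 * 12   # 66 W
SIX_PIN_WATT_LIMIT = 75

def slot_12v_watts(amps: float) -> float:
    """Convert a measured 12 V slot current into watts."""
    return amps * 12.0

for label, amps in (("pre-fix", 6.7), ("post-16.7.1", 5.6)):
    watts = slot_12v_watts(amps)
    status = "exceeds" if amps > SLOT_12V_AMP_LIMIT else "within"
    print(f"{label}: {amps} A -> {watts:.1f} W ({status} the 5.5 A / 66 W slot limit)")

# The 95 W scenario described above: 47 W from the slot + 47 W from the 6-pin,
# both comfortably inside their respective limits.
print(47 <= SLOT_12V_WATT_LIMIT and 47 <= SIX_PIN_WATT_LIMIT)  # True
```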

Take your freaking blinders off.
Posted on Reply
#61
Kanan
Tech Enthusiast & Gamer
cdawall: That wasn't a mistake; he was quoting what should have been released. What we ended up with instead was a typical "oh shit" product released well past its efficiency curve.

The good news is they have already released an RX 480 embedded solution with a 95 W TDP for the entire board, including the memory, and at full desktop GPU clocks. That again adds fuel to this fire. Yet again, AMD was horribly let down by GloFo and was forced to release something that is nowhere near what they bragged about in the press. This is bad for EVERYONE.

Also, some food for thought for you. Would you consider the Gigabyte X99M-Gaming 5 to be a modern motherboard? Two different boards with two different brands' sets of 480s, and neither could run CrossFire at stock settings stably. Mind you, these were in two completely different machines with different power supplies, processors, RAM, SSDs, etc. Same exact overdraw issue. Luckily, modern boards just shut off instead of catching on fire like the old ones did. This problem was with the drivers released in 2017, so it is post power fix. The problem still exists; the driver fix was nothing more than a band-aid. Remember, they were drawing 6.7 A off the 12 V rail, which exceeds the 5.5 A spec. After 16.7.1 they draw 5.6 A off the motherboard's 12 V rail, which... drum roll please... STILL EXCEEDS SPEC.

The card design literally looks like it should draw 95 W; the cooler is barely adequate at 150 W, but absolutely excellent at 95 W. With the power plan of the reference card, that puts 47 W on the motherboard slot and 47 W on the PCIe 6-pin. Both of those numbers are normal and well within spec. This yet again points towards a card that should have been, and was designed to be, 95 W, but GloFo failed to deliver yields that efficient, and we got what we got.

Take your freaking blinders off.
Which blinders? I wanted an explanation and I got one. :P Thanks.

Still, the GPU works for most users, you have to admit that.
Posted on Reply
#62
cdawall
where the hell are my stars
Kanan: Which blinders? I wanted an explanation and I got one. :p Thanks.

Still, the GPU works for most users, you have to admit that.
I have two.
Posted on Reply