
AMD Radeon RX 5500 Marketing Sheets Reveal a bit More About the Card

Not trolling, just trying to figure out, based on the specs, how to tell whether this card (or any other) is lower- or top-tier performance.
Does that improvement with lower specs come from the change to the RDNA architecture, the change to 7 nm, or both?

So if you can't compare CUs, TFLOPS, or TDPs, there's no way to compare based on the specs, so how can we know whether it's an entry-level or an enthusiast card?

Well, you can make some predictions from current Turing-to-Navi performance parity:
RTX 2070 Super (9,620.48 GFLOPS) = 1.088 × RX 5700 XT (9,661.44 GFLOPS); RTX 2070 (8,580.10 GFLOPS) = 1.12 × RX 5700 (7,704.58 GFLOPS)

So, assuming the RX 5500 has the same actual gaming clocks as the GTX 1660/1660 Super, the coefficient from TFLOPS alone will be something like 1.09. But the RX 5500 (224 GB/s) has more memory bandwidth than the 1660 (192 GB/s) and less than the 1660 Super (336 GB/s), so one cannot really get the coefficient without equalizing memory bandwidth first. @W1zzard could run 1080p tests with a vanilla GDDR5 GTX 1660 with its memory overclocked to 2,333 MHz, giving 4 × 2.333 GHz × 192 bit / 8 = 224 GB/s to match the RX 5500's memory bandwidth. But that would be too much work to prove very little.
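To make the arithmetic explicit, here is a minimal Python sketch of the two calculations above: GDDR5 bandwidth from the command clock and bus width, and the Turing-to-Navi GFLOPS coefficient. The clock, bus, and GFLOPS values are the ones quoted in this post, not independently verified figures.

```python
# GDDR5 bandwidth: effective data rate x bus width / 8 bits per byte.
# GDDR5's effective rate is 4x the command clock, hence the 4x multiplier.
def gddr5_bandwidth_gbs(command_clock_ghz: float, bus_width_bits: int) -> float:
    return 4 * command_clock_ghz * bus_width_bits / 8

print(gddr5_bandwidth_gbs(2.0, 192))    # stock GTX 1660: 192.0 GB/s
print(gddr5_bandwidth_gbs(2.333, 192))  # OC to 2.333 GHz: ~224 GB/s, matching the RX 5500

# Turing-to-Navi coefficient: the GFLOPS one architecture needs to match the other,
# using this post's estimated actual-gaming-clock GFLOPS figures.
print(round(8580.10 / 7704.58, 2))      # RTX 2070 vs RX 5700: ~1.11 (the post's ~1.12)
```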
 
Not trolling, just trying to figure out, based on the specs, how to tell whether this card (or any other) is lower- or top-tier performance.
Does that improvement with lower specs come from the change to the RDNA architecture, the change to 7 nm, or both?


So if you can't compare CUs, TFLOPS, or TDPs, there's no way to compare based on the specs, so how can we know whether it's an entry-level or an enthusiast card?
Price, name, and market segment are all good indicators, not to mention that you can compare it to the already launched and widely reviewed RX 5700 and 5700 XT, which are also RDNA. Other than that, wait for reviews. Spec sheets are a poor tool for discerning the position of a newly launched architecture unless you know very specifically the limitations of such a comparison.

Just the fact that this is called 5500 should tell you that it's a midrange card significantly below the 5700. That AMD are comparing it to the GTX 1650 is another rather clear indicator.
 
The comments on page 1 about AMD price gouging make me sad.

Some of you have very short memories, because the 2070 FE launched at $600, with partner cards typically sitting at $520-550 depending on how serious the build quality and cooler design were.

Along comes AMD with a 2070-killer at $400, a $120-200 discount, and there's a whole load of whining about it being overpriced. Without AMD's competition, there would never have been a SUPER relaunch with the deep discounts Nvidia had to make.

There's no pleasing some people.... :rolleyes:
 
Not bad, honestly!
But its success will depend on the price tag. If it comes in at $150-160, I would say that's a fair price considering what we can actually get the RX 570/580 for now, but I'm pretty sure they will price it higher. :/

The downside is literally those 4 GB of VRAM. It might be enough for most games, but next-gen consoles will surely drive games to use much more memory and higher-resolution textures; 4 GB might go from acceptable to bad in one or two years.
I think something in between this RX 5500 and the RX 5700, priced between $200 and $300, would also be a very desirable product.
 
It's actually pretty simple; developing and deploying a flagship card costs more than it would make. It's happened in the past, and it will happen again, where they have the faster card and still sell fewer units. It's not worth wasting their cash reserves on that 3% of the market when they can put that money into developing cards that fit the average buyer.

For what it's worth, this strategy saved their asses in the past: the so-called "small-die" era of TeraScale, where they made extremely competitive mid-range cards in massive quantities for the average consumer instead of focusing their efforts on flagships. The reduced unit cost also allowed them to pack newer tech into their cards, advance API feature support, and leverage OEMs with better pricing.
You're acting as if AMD didn't try to shoot for the stars. They tried and left lots of us a bit disappointed after the long wait and hype.
 
If B550 motherboards and the RX 5500 XT come out in ITX versions, I have a plan to build a cheap but great 1080p machine in a very tiny ITX case.
 
The benchmarks' 43% lead over the 1650 says that the RX 5500 will be a bit above the RX 590, about 6% behind the 1660. If that holds true at ~$160-170, that would be awesome.
You should wait for the RX 5500's reviews. AMD didn't tell the truth; they exaggerated their GPU's performance.
 
You should wait for the RX 5500's reviews. AMD didn't tell the truth; they exaggerated their GPU's performance.

Team green did the same for RTX, so what's your point?

Where are your sys specs?
 
Team green did the same for RTX, so what's your point?

Where are your sys specs?
My PC specs are an i7-4790 and an RTX 2060. I have used these graphics cards: GTX 1050 Ti, R7 370, R7 240, but I'm not saying AMD is bad. I think they must improve their GPU software. AMD GPUs have many issues, such as Wattman, and optimization for old games (6-9 years old) is not good. For example, I have one more card, an HD 7770, and I tested it in AC Unity: it gets 10 FPS at 720p Low (I used driver 19.11.3). The HD 7770 is not in my PC; it is my friend's card, but he gave it to me.



If you look at the endnotes of AMD's RX 5500 benchmarks, you will see they chose the best system for their own hardware, while on Nvidia's side the system is worse than AMD's. Because of this, customers don't know the truth.
 
If you look at the endnotes of AMD's RX 5500 benchmarks, you will see they chose the best system for their own hardware, while on Nvidia's side the system is worse than AMD's. Because of this, customers don't know the truth.
What are you talking about? The endnotes for this leak clearly state that tests were run on "similarly configured systems" with the same CPU (R7 3800X), same amount of memory, same OS version, and up-to-date drivers.

Not that easy to read, but the bottom text is rather unequivocal.
guide_5.jpg
 
What are you talking about? The endnotes for this leak clearly state that tests were run on "similarly configured systems" with the same CPU (R7 3800X), same amount of memory, same OS version, and up-to-date drivers.

This thread is old enough to have devolved into fanboy FUD already. Just ignore it and /thread already ;)
 
@Chrispy_ @Valantar
Please get back to reality.
rx-5500-performance.jpg

Notebookcheck.com
Ekran Alıntısı.PNG


I'm not a fanboy. I think Navi 1 is useless because Navi 2 is coming, and it releases in Q2 2020 with ray tracing. Navi 1 does not make sense to me because it only has a one-year lifespan. For example, say you have an RX 5700 XT and are happy for now, but in future games (Cyberpunk, GTA 6, Elder Scrolls 6) your hardware does not allow ray tracing, and it cost $400. Maybe I can't play at high settings with ray tracing, but I will at least experience RT.
I prefer to wait for Ampere. Look back at Pascal vs. Maxwell; we will see the same with Ampere vs. Turing.
 
@Chrispy_ @Valantar
Please get back to reality.
View attachment 138412
Notebookcheck.com
View attachment 138413

I'm not a fanboy. I think Navi 1 is useless because Navi 2 is coming, and it releases in Q2 2020 with ray tracing. Navi 1 does not make sense to me because it only has a one-year lifespan. For example, say you have an RX 5700 XT and are happy for now, but in future games (Cyberpunk, GTA 6, Elder Scrolls 6) your hardware does not allow ray tracing, and it cost $400. Maybe I can't play at high settings with ray tracing, but I will at least experience RT.
I prefer to wait for Ampere. Look back at Pascal vs. Maxwell; we will see the same with Ampere vs. Turing.
1) RT in the market segment where the RX 5500 exists is utterly meaningless, as the hardware wouldn't be capable of performing adequately for it to be useful. Other than that, this is a valid point, and ultimately what has made me decide on waiting for an RDNA2 card rather than going for a 5700XT. This still doesn't mean that the 5700XT is useless - it still performs well for its price and power draw, and RT isn't going to be a must for a couple of years yet (at least). My reason for holding off is that I tend to keep GPUs for 4-5 years - if I upgraded more often, that point would be moot.

2) The numbers you're quoting from NotebookCheck are grossly misleading without context. The RX 5500M "average" is based off a single pre-production laptop with a relatively slow 35 W R7 3750H CPU, while the GTX 1650 average is based off a wide variety of more mature designs, most of them with faster and higher-power (45 W) Intel i7 CPUs. Where they're getting the "average" for the 5300M from is worth questioning, as they only posted some preliminary 3DMark results for it a few days back, where it soundly beats both an i5-9300H+1650 and an i7-10710U+1650MQ. Of course 3DMark is hardly representative of gaming performance, but it's all we've got for now.

3) Why are you referring to Videocardz' repackaging of AMD's PR benchmark numbers rather than their own slides?
 
I'll answer your points one by one with my opinion:
1. I use GPUs for 2-3 years because of the resale price. I bought my RTX 2060 in January 2019.
2. The site uses average framerates; the GTX 1650 was paired with the R5 3550H, R7 3750H, i5-9300H, and i7-9750H. The RX 5500M's drivers are very bad; AMD knew the drivers were bad, but they still presented their GPUs as good.
If we look at the specifications:
The laptop GTX 1650 has 1024 cores and the desktop GTX 1650 has 896, so the laptop GTX 1650 is 3.2 TFLOPS while the desktop GTX 1650 is 3 TFLOPS. The RX 5500M is 13% slower than the RX 5500, and the RX 5500 is 26% faster than the GTX 1650 in TPU benchmarks.
Ekran Alıntısı.PNG

If we calculate from these, the RX 5500M is about 5% faster than the laptop GTX 1650, NOT 30% (average ~5%; see the worked numbers after this post).

3. Videocardz uses AMD's benchmarks.
666719-radeon-rx-5500-1650-comparison.png
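For clarity, here is that percentage chain worked out in the same Python style as earlier in the thread. The 26% and 13% figures are this post's own numbers, and using the 3.2/3.0 TFLOPS ratio as the laptop-to-desktop GTX 1650 gap is an assumption, since real clocks vary with power limits.

```python
# All inputs are the figures quoted in the post above.
rx5500_vs_1650_desktop = 1.26          # RX 5500 is 26% faster than desktop GTX 1650 (TPU)
rx5500m_vs_rx5500 = 1 / 1.13           # RX 5500M is 13% slower than RX 5500
gtx1650_laptop_vs_desktop = 3.2 / 3.0  # assumed to scale with paper TFLOPS

ratio = rx5500_vs_1650_desktop * rx5500m_vs_rx5500 / gtx1650_laptop_vs_desktop
print(f"RX 5500M vs laptop GTX 1650: {(ratio - 1) * 100:+.1f}%")  # ~+4.5%, i.e. roughly 5%
```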
 
I'll answer your points one by one with my opinion:
1. I use GPUs for 2-3 years because of the resale price. I bought my RTX 2060 in January 2019.
2. The site uses average framerates; the GTX 1650 was paired with the R5 3550H, R7 3750H, i5-9300H, and i7-9750H. The RX 5500M's drivers are very bad; AMD knew the drivers were bad, but they still presented their GPUs as good.
If we look at the specifications:
The laptop GTX 1650 has 1024 cores and the desktop GTX 1650 has 896, so the laptop GTX 1650 is 3.2 TFLOPS while the desktop GTX 1650 is 3 TFLOPS. The RX 5500M is 13% slower than the RX 5500, and the RX 5500 is 26% faster than the GTX 1650 in TPU benchmarks.
View attachment 138440
If we calculate from these, the RX 5500M is about 5% faster than the laptop GTX 1650, NOT 30% (average ~5%).

3. Videocardz uses AMD's benchmarks.
View attachment 138442
Again: NBC's numbers are based off, as you say, an average of all tested laptops with the GPU in question. They have one laptop tested with the RX 5500M - and a pre-production model at that, with AFAIK unoptimized drivers, as the GPU wasn't available yet at that time - with a Ryzen 7 3750H, which is a relatively middling gaming CPU. They have tested dozens of GTX 1650 models, which on the other hand are often equipped with much faster CPUs - and certainly few of them have slower CPUs than the R7 3750H (some might have U-series Intel, some might have R5 3550Hs, but both of those are rare), meaning that the faster CPUs themselves will pull the average up, skewing the comparison significantly. Beyond that, comparing a single data point to an average of dozens of others is ... problematic.

Comparing TFLOPS across desktop and mobile Nvidia cards is problematic, as the TFLOPS spec is normally calculated at either the base clock or the boost clock, but the boost clock is power- and thermal-dependent and thus varies wildly due to GPU Boost 3.0. A desktop card will boost higher and for longer than a laptop card, even with identical on-paper specs, not to mention that power and thermal budgets for Nvidia's current mobile GPUs are OEM-dependent and can vary by stunningly high amounts (there are quite a few RTX 2060 laptops out there that are no faster than the same chassis with a GTX 1660 in it - and it's all down to power and cooling).
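To illustrate why the paper TFLOPS figures discussed above line up as they do, here is the standard peak-FP32 formula (two FMA ops per shader per clock) as a small sketch. The boost clocks below are Nvidia's rated figures; under GPU Boost the sustained clock, and therefore the real throughput, depends on power and cooling.

```python
# Peak FP32 TFLOPS = 2 (FMA) x shader count x clock in GHz / 1000.
def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    return 2 * shaders * clock_ghz / 1000

print(round(fp32_tflops(896, 1.665), 2))   # desktop GTX 1650 at rated boost: ~2.98
print(round(fp32_tflops(1024, 1.560), 2))  # laptop GTX 1650 at rated boost: ~3.19
print(round(fp32_tflops(1024, 1.200), 2))  # same chip thermally limited to 1.2 GHz: ~2.46
```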

And yes, I know Videocardz was using AMD's benchmarks, I was just pointing out that it's rather odd of you to not use the original source material but rather a third-party representation of the data.

But still - shouldn't we be waiting for actual reviews of actually released products rather than arguing over speculation and theoretical numbers?
 
Wait until the NDA lifts and real review data comes out; pre-production, leaked engineering-sample results on a pre-production driver aren't exactly 'reality'.

I'm only concerned about the 5300M anyway, in the desperate hope that AMD can bring some competition to the thin/light sub-15.6" laptop market. Currently there's the MX250 at 15-25W and the overpriced 1650 Max-Q at 35W and up. AMD have nothing to offer there at the moment.

For the rest of this year, and probably until stocks dry up, the RX 570 is still king of the hill. It's $120/£99/€99 before tax, which is insanely good value, and that's before special offers and game bundles are taken into consideration.
 