
PSA: GPU-Z shows PCI-Express x16 for Radeon RX 6500 XT / Navi 24. It really is x4

I’d like to see this tested on PCIe Gen 3 and Gen 2 motherboards to see how performance is impacted compared to Gen 4.
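For context, here's a back-of-the-envelope sketch of the theoretical one-direction bandwidth of an x4 link across generations (real-world throughput is lower; these are just the spec-sheet maxima after encoding overhead):

```python
# Theoretical PCIe bandwidth for a given link width, per generation.
# Per-lane transfer rates are in GT/s; Gen 2 uses 8b/10b encoding
# (80% efficiency), Gen 3/4 use 128b/130b (~98.5%).

RATES_GT_S = {"gen2": 5.0, "gen3": 8.0, "gen4": 16.0}
ENCODING_EFF = {"gen2": 8 / 10, "gen3": 128 / 130, "gen4": 128 / 130}

def link_bandwidth_gb_s(gen: str, lanes: int) -> float:
    """Theoretical one-direction bandwidth in GB/s for a PCIe link."""
    return RATES_GT_S[gen] * ENCODING_EFF[gen] * lanes / 8  # bits -> bytes

for gen in ("gen2", "gen3", "gen4"):
    print(f"{gen} x4: {link_bandwidth_gb_s(gen, 4):.2f} GB/s")
# gen2 x4: 2.00 GB/s, gen3 x4: ~3.94 GB/s, gen4 x4: ~7.88 GB/s
```

So an x4 card on a Gen 3 board gets half the bus bandwidth it has on Gen 4, and a quarter on Gen 2, which is why the scaling tests matter so much here.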

I wonder if these cards are going to be mainly for OEMs?
 
Looking at some of the benchmarks of the 5500 XT at x4, I really did not expect performance to take as big a hit as it does, even on PCIe 4.0 in some titles.
For pete's sake, stop drawing comparisons using a product with a completely different architecture.
 
4 GB is not enough to make a card a viable mining card.
OK, didn't know that, thx :) Well, I guess AMD shot themselves in the foot then :)
 
I find it very disturbing that both AMD and Nvidia release GPUs without sending them to the press first.

In both cases, it was known beforehand that the reviews would be negative, but still. Just don't make a crappy product then! lol

But I am a bit more concerned about AMD doing this in a price range that most people can afford, where they expect to ship a ton of cards. On the Nvidia side, the cards are so overpriced that only a limited number of people can afford them, and it's probably just a paper launch anyway.
 
Radeon RX 6500 XT = Lisa Su spitting in value-seeking DIY GPU buyers' faces.

Testing done with the RX 5500 XT makes all that data invalid on the basis of different architectures.
 
What a turd of a product.

Performance is bad, VRAM capacity is bad, the PCIe link speed is comically gimped, pricing is bad...

I mean, I'm sure some people will buy it, but I am amazed AMD would even put their name on this. The tarnish to their reputation cannot be worth the benefit.

This product is so bad it almost warrants a new brand altogether. Like a "Geo Metro 3D Accelerator Board" or "Pandemic Graphics"
 
Infinity Cache and very fast VRAM should alleviate this bottleneck for most games, IMHO, on PCIe 3.0 vs. 4.0. And although 4 GB of VRAM isn't enough for ultra settings in the newest games, it will prevent miners from using it, so...
 
Testing done with the RX 5500 XT makes all that data invalid on the basis of different architectures.
Sure, but do you think it will make that much of a difference? AMD still has worse memory compression than Nvidia, so their cards are more likely to rely on system RAM.

Look at how bad AMD cards do with 16 GB ram on Battlefield 2042
compared to when they have 32 GB available:

When lower-VRAM AMD cards run out of system RAM or bandwidth to that resource, they take a dump in many cases.
 
"We love gamers" - Lisa Su

Edit: in before some rabid fanboy attacks me; Jensen Huang is no different, so chill.

The fruit doesn't fall far from the tree. The niece is no different than the uncle.. :kookoo: :rolleyes:
 
The fruit doesn't fall far from the tree. The niece is no different than the uncle.. :kookoo: :rolleyes:
They're not family. It's been clarified a number of times.
 
They're not family. It's been clarified a number of times.

No.
[attached screenshot of the article]
 
Lisa Su herself said they're not related.
Article is in Korean, btw.

Sure, but do you think it will make that much of a difference? AMD still has worse memory compression than Nvidia, so their cards are more likely to rely on system RAM.

Look at how bad AMD cards do with 16 GB ram on Battlefield 2042
compared to when they have 32 GB available:

When lower-VRAM AMD cards run out of system RAM or bandwidth to that resource, they take a dump in many cases.

Fair point, though considering the market these cards are targeting (that is, if they even manage to exist outside of paper launches), I don't think it will be that bad.

Though, picking out a dying game seems a bit odd.
 
Testing done with the RX 5500 XT makes all that data invalid on the basis of different architectures.
I've been led to believe the reason HUB tested the 5500 XT like this is that they have been testing a 6500 XT but are under NDA until the embargo lifts. That video is a sidestep of the NDA, with an unspoken "this is what we're seeing on the 6500 XT."
 
So HWU did a video today testing the 5500XT at PCIe 3.0 x4.

It's ugly!

That puny 16MB of infinity cache is going to be doing some heavy lifting for sure.
 
I've been led to believe the reason HUB tested the 5500 XT like this is that they have been testing a 6500 XT but are under NDA until the embargo lifts. That video is a sidestep of the NDA, with an unspoken "this is what we're seeing on the 6500 XT."
Until the 6500 XT explicitly shows up in multiple reviews, I consider that testing to be invalid.
 
For pete's sake, stop drawing comparisons using a product with a completely different architecture.
I think people are missing the point: it'll have 16 MB of L3 cache so Quake 3 Arena can run at 800 x 600 entirely within the L3 cache itself, along with 4 GB of VRAM to install the game on.
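For what it's worth, the arithmetic behind the joke roughly checks out; a quick sketch (assuming a simple 32-bit color buffer, ignoring depth buffers and textures):

```python
# Size of an 800x600, 32-bit (4 bytes/pixel) framebuffer vs. 16 MiB of L3.
width, height, bytes_per_pixel = 800, 600, 4
framebuffer_mib = width * height * bytes_per_pixel / 2**20

cache_mib = 16  # Navi 24's Infinity Cache
print(f"{framebuffer_mib:.2f} MiB framebuffer vs {cache_mib} MiB L3")
# ~1.83 MiB, so the color buffer alone fits with room to spare
```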

So HWU did a video today testing the 5500XT at PCIe 3.0 x4.

It's ugly!

That puny 16MB of infinity cache is going to be doing some heavy lifting for sure.
2 MB L2 vs. 1 MB L2 + 16 MB L3, along with 4 Gbps faster memory. It's the VRAM capacity that worries me more with the 6500 XT, though it's cut down in some other areas relative to the 5500 XT. If they are naming it 6500 XT, it's expected to be faster than that card, however; that's more than reasonable to assume. I have the impression it'll sit between the 5500 XT and GTX 980 in performance, but with lower power draw and newer features.
 
Testing done with the RX 5500 XT makes all that data invalid on the basis of different architectures.
I think Steve from HUB already has the test results for the RX 6500 XT and cannot publish them yet!
So why would Steve post the PCIe test results for the RX 5500 XT if they did not reflect the RX 6500 XT results as well?
 
I think Steve from HUB already has the test results for the RX 6500 XT and cannot publish them yet!
So why would Steve post the PCIe test results for the RX 5500 XT if they did not reflect the RX 6500 XT results as well?
Not sure if Steve is trying to get a message through without breaking the NDA, or just setting up a "before" PCIe scaling article so he can compare the last gen against the 6500XT when the NDA lifts tomorrow.

Perhaps the 6500XT doesn't suffer the same fate as the 5500XT because of RDNA2's infinity cache. At 16MB it's hard to see how much good it can do, but I'm not going to place bets for or against it.

Either way, less than 24 hours left to wait.
 
I think Steve from HUB already has the test results for the RX 6500 XT and cannot publish them yet!
So why would Steve post the PCIe test results for the RX 5500 XT if they did not reflect the RX 6500 XT results as well?

Clickbait. Double the clicks with double the videos.
 
I don't speak Korean :D
And she's probably lying, the usual business practice.. :kookoo: :rolleyes:
Then again, by the same logic the claimed relationship with Nvidia's CEO is discredited too; after all, TechTimes is just another business.

:peace:
 
Clickbait. Double the clicks with double the videos.
Yeah right, that's clickbait, but doing reviews on 20 different SKUs of the 2080 Ti or 1080 Ti, as TPU and many others have, is not?

HUB's video revealed some important information, as it was assumed that the lower-end AMD cards would not be as affected by PCIe scaling as the higher-end cards.

I am thinking that is why the 5500 XT was limited to x8 in the first place. It allowed for more market segmentation, as the 4 GB card would have fallen behind the 8 GB card in only two or three games had it been running PCIe 4.0 x16, or even PCIe 3.0 x16.
 