Tuesday, January 18th 2022

PSA: GPU-Z shows PCI-Express x16 for Radeon RX 6500 XT / Navi 24. It really is x4

AMD announced the Radeon RX 6500 XT and RX 6400 at CES just a few days ago. These new entry-level cards debut the company's first 6 nm GPU, codenamed "Navi 24"—the smallest chip in the RDNA2 family. At roughly 100 mm², Navi 24 is barely the size of a motherboard chipset. The chip features only a 64-bit wide GDDR6 memory interface, needing just two memory chips to reach its 4 GB capacity. While AMD has been fairly quiet about it, people quickly found out that the Navi 24 GPU only uses a PCI-Express 4.0 x4 host interface. While the physical connector is x16, there are only enough signal traces for x4.
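To put the narrower link into numbers, here is a rough back-of-the-envelope calculation (a sketch using nominal per-lane signaling rates and 128b/130b encoding only, ignoring the rest of the protocol overhead):

    # Approximate one-directional PCIe bandwidth from nominal line rates (128b/130b encoding)
    transfer_rates = {"PCIe 3.0": 8, "PCIe 4.0": 16}  # GT/s per lane
    for gen, gt_per_s in transfer_rates.items():
        gbytes_per_lane = gt_per_s * 128 / 130 / 8  # GT/s -> GB/s per lane
        print(f"{gen}: x4 = {gbytes_per_lane * 4:.1f} GB/s, x16 = {gbytes_per_lane * 16:.1f} GB/s")

That works out to roughly 7.9 GB/s per direction for PCIe 4.0 x4, dropping to about 3.9 GB/s if the card sits in a PCIe 3.0 slot, versus the roughly 31.5 GB/s a full 4.0 x16 link would provide.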

Even the latest public version of GPU-Z, 2.43.0, misreports the bus interface as PCIe x16 4.0, which will certainly cause confusion among reviewers who trust GPU-Z to report the correct specs and speeds for their articles. Maybe that's the reason why AMD has decided to not send us a sample this time—a first in 15 years.

Update Jan 20th: GPU-Z 2.44.0 has been released, which properly reports the PCIe bus configuration of the RX 6500 XT.


The underlying technical reason for this misreporting is that for a few generations now, AMD has designed its GPUs with a PCI-Express bridge inside, which makes things much more flexible and helps to separate the IP blocks. The bridge distributes the transferred data to the various subdevices, like the graphics core and the HD Audio interface, as displayed in the screenshot above. Internally, the GPU core always operates at x16; only the link between the GPU's integrated bridge and the motherboard runs at x4.

Since the current version of GPU-Z does not know that the running GPU is Navi 24, it asks the graphics core for its link speed and width, and the core happily reports "PCIe x16 4.0" instead of "PCIe x4 4.0", which is of course correct from the perspective of the graphics core. The problem is that a bottleneck exists upstream that operates at only x4. For supported GPUs, GPU-Z is aware of such a topology and will check the upstream devices for bottlenecks, but this capability has to be added on a case-by-case basis. The same applies to the reported PCIe speed: internally the GPU always operates at PCIe 4.0, so on older Intel systems that don't support PCIe 4.0, the link would still be shown as 4.0 even though the motherboard connection runs at PCIe 3.0 or older.
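GPU-Z is a Windows tool and its internals aren't public, but the idea of checking the upstream devices is easy to illustrate on Linux, where the kernel exposes each device's negotiated link in sysfs. A minimal sketch (the device address below is a made-up example, not anything specific to Navi 24):

    import os

    def upstream_links(bdf="0000:03:00.0"):
        """Walk from a PCI device up through its parent bridges, reporting each negotiated link."""
        path = os.path.realpath(f"/sys/bus/pci/devices/{bdf}")
        links = []
        while os.path.exists(os.path.join(path, "current_link_width")):
            with open(os.path.join(path, "current_link_width")) as f:
                width = f.read().strip()
            with open(os.path.join(path, "current_link_speed")) as f:
                speed = f.read().strip()
            links.append((os.path.basename(path), f"x{width}", speed))
            path = os.path.dirname(path)  # step up to the parent bridge
        return links

    # The narrowest or slowest entry in this chain is the real bottleneck,
    # no matter what the GPU endpoint itself reports.
    for device, width, speed in upstream_links():
        print(device, width, speed)

Whichever link in that chain reports the smallest width or lowest speed is what actually constrains the card, which is the kind of upstream check described above.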

We plan to correct this with an update to GPU-Z shortly.

57 Comments on PSA: GPU-Z shows PCI-Express x16 for Radeon RX 6500 XT / Navi 24. It really is x4

#26
windwhirl
KainXS: Looking at some of the benches on the 5500XT at 4X, I really did not expect the performance to get kicked as much as it does even on 4.0 on some titles.
For pete's sake, stop drawing comparisons using a product with a completely different architecture.
Posted on Reply
#27
JalleR
kapone32: 4GB is not enough to make a card a viable mining card.
OK, didn't know that, thx :) Well, I guess AMD shot themselves in the foot then :)
Posted on Reply
#28
murak
Logged in just to say thank you for the PSA!
Posted on Reply
#29
Punkenjoy
I find it very disturbing that both AMD and Nvidia release GPUs without sending them to the press first.

In both cases, it was known beforehand that the reviews would be negative, but still. Just don't make a crappy product then! lol

But I am a bit more concerned about AMD doing this in a price range that most people can afford, where they expect to ship a shit ton of cards. On the Nvidia side, the cards are so overpriced that only a limited number of people can afford them, and it's probably just a paper launch anyway.
Posted on Reply
#30
cellar door
W1zzard: why AMD has decided to not send us a sample this time—a first in 15 years.
Isn't this an AIB-only launch? Maybe they are simply expecting an AIB partner to send one to TPU. Did they specifically give you a "NO"?
Posted on Reply
#31
RedelZaVedno
Radeon RX 6500 XT = Lisa Su spitting in value seeking DIY GPU buyer's face:

Posted on Reply
#32
windwhirl
RedelZaVedno: Radeon RX 6500 XT = Lisa Su spitting in value seeking DIY GPU buyer's face:

Testing done with the RX 5500 XT makes all that data invalid on the basis of different architectures.
Posted on Reply
#33
phanbuey
What a turd of a product.

Performance is bad, VRAM capacity is bad, the PCIe link speed is comically gimped via the bridge, pricing is bad...

I mean, I'm sure people will probably buy it, but I am amazed AMD would even put their name on this. The tarnish to their reputation cannot be worth the benefit.

This product is so bad it almost warrants a new brand altogether. Like a "Geo Metro 3D Accelerator Board" or "Pandemic Graphics".
Posted on Reply
#34
HD64G
IC and very fast VRAM should alleviate this bottleneck for most games on PCIe 3.0 vs 4.0, imho. And although 4GB of VRAM isn't enough for ultra settings in the newest games, it will prevent miners from using it, so...
Posted on Reply
#35
Nihilus
windwhirl: Testing done with the RX 5500 XT makes all that data invalid on the basis of different architectures.
Sure, but do you think it will make that much of a difference? AMD still has worse compression than Nvidia, so their cards are more likely to rely on system RAM.

Look at how badly AMD cards do with 16 GB of RAM in Battlefield 2042:
www.tomshardware.com/news/battlefield-2042-pc-performance-benchmarks-settings
compared to when they have 32 GB available:
www.techspot.com/article/2364-battlefield-2042-benchmarks/

When lower-VRAM AMD cards run out of system RAM, or of bandwidth to that resource, they take a dump in many cases.
Posted on Reply
#36
ARF
z1n0x"We love gamers" - Lisa Su

edit: In before some rabid fanboy attack me, Jensen Huang is no different, so chill.
The fruit doesn't fall far from the tree. The niece is no different than the uncle.. :kookoo: :rolleyes:
Posted on Reply
#37
windwhirl
ARF: The fruit doesn't fall far from the tree. The niece is no different than the uncle.. :kookoo: :rolleyes:
They're not family. It's been clarified a number of times.
Posted on Reply
#38
ARF
windwhirl: They're not family. It's been clarified a number of times.
No.
Posted on Reply
#39
windwhirl
ARF: No.
Lisa Su herself said they're not related.
web.archive.org/web/20190110110649/http://weeklybiz.chosun.com/site/data/html_dir/2018/08/31/2018083101687.html
Article is in Korean, btw.
Nihilus: Sure, but do you think it will make that much of a difference? AMD still has worse compression than Nvidia, so their cards are more likely to rely on system RAM.

Look at how badly AMD cards do with 16 GB of RAM in Battlefield 2042:
www.tomshardware.com/news/battlefield-2042-pc-performance-benchmarks-settings
compared to when they have 32 GB available:
www.techspot.com/article/2364-battlefield-2042-benchmarks/

When lower-VRAM AMD cards run out of system RAM, or of bandwidth to that resource, they take a dump in many cases.
Fair point, though considering the market these cards are targeting (that is, if they even manage to exist outside of paper launches), I don't think it will be that bad.

Though, picking out a dying game seems a bit odd.
Posted on Reply
#40
T_Zel
windwhirl: Testing done with the RX 5500 XT makes all that data invalid on the basis of different architectures.
I've been led to believe the reason HUB tested the 5500 XT like this is that they have been testing a 6500 XT but are under NDA until the embargo lifts. That video is a sidestep of the NDA, and there is an unspoken "this is what we're seeing on the 6500 XT".
Posted on Reply
#41
Chrispy_
So HWU did a video today testing the 5500XT at PCIe 3.0 x4.

It's ugly!

That puny 16MB of infinity cache is going to be doing some heavy lifting for sure.
Posted on Reply
#42
windwhirl
T_Zel: I've been led to believe the reason HUB tested the 5500 XT like this is that they have been testing a 6500 XT but are under NDA until the embargo lifts. That video is a sidestep of the NDA, and there is an unspoken "this is what we're seeing on the 6500 XT".
Until the 6500 XT explicitly shows up in multiple reviews, I consider that testing to be invalid.
Posted on Reply
#43
InVasMani
windwhirl: For pete's sake, stop drawing comparisons using a product with a completely different architecture.
I think people are missing the point: it'll have 16MB of L3 cache for Quake 3 Arena to run at 800 x 600 within the L3 cache itself, along with 4GB of VRAM to install the game on.
Chrispy_: So HWU did a video today testing the 5500XT at PCIe 3.0 x4.

It's ugly!

That puny 16MB of infinity cache is going to be doing some heavy lifting for sure.
2MB L2 vs 1MB L2 + 16MB L3, along with 4 Gbps faster memory. It's the VRAM capacity that worries me more with the 6500 XT, though it's cut down in some other areas relative to the 5500 XT. If they are naming it the 6500 XT, it's expected to be faster than that card, however. That's more than reasonable to assume. I have the impression it'll sit between the 5500 XT and GTX 980 in performance, but with lower power draw and newer features.
Posted on Reply
#44
lesovers
windwhirl: Testing done with the RX 5500 XT makes all that data invalid on the basis of different architectures.
I think Steve from HUB already has the test results for the RX 6500 XT and cannot publish them yet!
So why would Steve post the PCIe test results for the RX 5500 XT if they did not reflect the RX 6500 XT results as well?
Posted on Reply
#45
Chrispy_
lesovers: I think Steve from HUB already has the test results for the RX 6500 XT and cannot publish them yet!
So why would Steve post the PCIe test results for the RX 5500 XT if they did not reflect the RX 6500 XT results as well?
Not sure if Steve is trying to get a message through without breaking the NDA, or just setting up a "before" PCIe scaling article so he can compare the last gen against the 6500XT when the NDA lifts tomorrow.

Perhaps the 6500XT doesn't suffer the same fate as the 5500XT because of RDNA2's infinity cache. At 16MB it's hard to see how much good it can do, but I'm not going to place bets for or against it.

Either way, less than 24 hours left to wait.
Posted on Reply
#46
windwhirl
lesovers: I think Steve from HUB already has the test results for the RX 6500 XT and cannot publish them yet!
So why would Steve post the PCIe test results for the RX 5500 XT if they did not reflect the RX 6500 XT results as well?
Clickbait. Double the clicks with double the videos.
Posted on Reply
#48
windwhirl
ARF: I don't speak Korean :D
And she probably lies - the usual business practice.. :kookoo: :rolleyes:
On the same grounds, the relationship with Nvidia's CEO is discredited. After all, TechTimes is another business.

:peace:
Posted on Reply
#49
Nihilus
windwhirl: Clickbait. Double the clicks with double the videos.
Yeah right, that's clickbait, but doing reviews on 20 different SKUs of the 2080 Ti or 1080 Ti, as TPU and many others have, is not.

HUB's video revealed some important information, as it had been assumed that the lower-end AMD cards would not be as affected by PCIe scaling as the higher-end cards.

I am thinking that is why the 5500 XT was limited to x8 in the first place. It allowed for more market segmentation, as the 4GB card would have fallen short of the 8GB card in only 2 or 3 games had it been running PCIe 4.0 x16 or even PCIe 3.0 x16.
Posted on Reply
#50
windwhirl
Nihilus: Yeah right, that's clickbait, but doing reviews on 20 different SKUs of the 2080 Ti or 1080 Ti, as TPU and many others have, is not.
It's clickbait when you're using numbers from one card and applying them to a completely different one that's not even on the same architecture.

Heck, I could even consider it false advertising.

Oh, and if he's effectively passing off 6500 XT numbers as the 5500 XT's, and if there's an effective NDA over the RX 6500 XT, that's also a breach of NDA.
Posted on Reply