
AMD Radeon RX 7800 XT

Back to the technicalities... (if you're not interested, skip this post)

The review says the Navi 32 GPU has 4 shader arrays. That brings us to 960 shaders per array, or 15 CUs per array. Considering that CUs come in pairs with RDNA 3, this doesn't compute.

However, according to an article on wccftech.com, the GPU has 3 shader arrays, which means 20 CUs, or 1280 shaders, per array.
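If anyone wants to check the arithmetic, here's a quick sketch (assuming 64 stream processors per CU, the usual way RDNA shaders are counted):

```python
# CU/shader arithmetic for the RX 7800 XT (Navi 32), assuming
# 64 stream processors per CU and CUs paired into WGPs in RDNA 3.
TOTAL_SHADERS = 3840
SHADERS_PER_CU = 64

total_cus = TOTAL_SHADERS // SHADERS_PER_CU   # 60 CUs

for arrays in (3, 4):
    cus_per_array = total_cus / arrays
    shaders_per_array = cus_per_array * SHADERS_PER_CU
    pairs_evenly = cus_per_array % 2 == 0
    print(f"{arrays} arrays: {cus_per_array:g} CUs/array "
          f"({shaders_per_array:g} shaders), pairs evenly: {pairs_evenly}")
```

With 3 arrays you get 20 CUs (1280 shaders) per array and even pairing; with 4 arrays you get 15 CUs (960 shaders) per array, leaving one CU unpaired.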

@W1zzard - Not that it matters much, I just thought I'd bring this to your attention. Great review anyway, as always! :)
That's a good point.
One way it could be true is if each shader array actually has 16 CUs in it and AMD is disabling some for yield reasons.
I've already questioned the 60 CU count in prior posts leading up to launch; the obvious number to fit the RDNA3 family was 64 CUs...
 
Unless AMD followed the trend started with RDNA 2, where some chips have 8 CUs per array (Navi 23 and 24), and some have 10 (Navi 21 and 22).

Every article says the 7800 XT is a maxed out chip, so I believe this is the case, and the "4 shader arrays" in the TPU review is a mistake / typo.
 
The 4 shader arrays could still be true. One of the CUs may simply be unpaired. There's a precedent: the 290X had 4 shader arrays with 11 CUs per array. In GCN, 4 CUs shared two caches: a scalar cache and the instruction cache. For the 290X, this meant that each shader array, in addition to the usual 4-CU groups, had another group consisting of 3 CUs that shared resources.
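A quick sanity check of that 290X layout (44 CUs, 4 arrays, cache-sharing groups of at most 4 CUs):

```python
# Hawaii (R9 290X): 44 CUs across 4 shader arrays -> 11 CUs per array.
# In GCN, up to 4 CUs share a scalar cache and an instruction cache,
# so an 11-CU array splits into groups of 4 + 4 + 3.
total_cus, arrays, group_size = 44, 4, 4

cus_per_array = total_cus // arrays            # 11
groups = [group_size] * (cus_per_array // group_size)
if cus_per_array % group_size:
    groups.append(cus_per_array % group_size)  # the odd 3-CU group

print(cus_per_array, groups)   # 11 [4, 4, 3]
```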

My bad, I meant $674, but I was quoting the 7700 XT.
The cheapest 7800 XT I've seen was one for $670 at Amazon, but it's no longer in stock.
 
Looking around, the only ones available are one from a third-party seller on Newegg for $899 and an XFX card for $708 on Amazon. I have not checked Memory Express yet.
 
Canada Computers has a few of the Sapphire Pulse and Nitro left, but they are in store only.
 
I am not really interested in one other than out of curiosity, and maybe to upgrade my daughter's GPU from a 5600 XT. I am liking the 7700 XT for her. Her panel is a 60 Hz panel with first-generation FreeSync, so that should be enough for Sonic Frontiers and Lego Builders, her two current favourite games.

This is probably the best GPU in terms of price right now

 
That's not impossible, but only the TPU review says there's 4 shader arrays. I linked an article on wccftech.com that says there's 3, which I find more plausible considering all the other numbers on the chip.
 
Getting numbers for the L1 cache size might settle it. With 3 shader engines, it's likely to be 1.5 MB and 4 shader engines should give us 2 MB.
 
All I can find on it is "256 KB per array".
 
That's the size of a single L1 cache. Each L1 is 256 KB, as opposed to 128 KB in RDNA 1 and 2. In the 7900 XTX, each shader engine has 2 L1 caches (one per array) for a total of 3 MB (6 × 2 × 256 KB / 1024).
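The totals work out as a one-liner, assuming 256 KB per shader array and two arrays per shader engine (the RDNA 3 layout discussed above):

```python
# Total L1 cache, assuming 256 KB per shader array and 2 shader
# arrays per shader engine (RDNA 3 layout).
L1_PER_ARRAY_KB = 256
ARRAYS_PER_ENGINE = 2

def total_l1_mb(shader_engines: int) -> float:
    return shader_engines * ARRAYS_PER_ENGINE * L1_PER_ARRAY_KB / 1024

print(total_l1_mb(3))   # 1.5 -> a 3-engine Navi 32
print(total_l1_mb(4))   # 2.0 -> a 4-engine layout
print(total_l1_mb(6))   # 3.0 -> 7900 XTX (Navi 31)
```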
 
The cost/frame just isn't the most decisive factor for everyone. If you live in a country with high electricity costs, the "good deal" of a 7800 XT quickly becomes a not-so-good deal. ;) Plus the extra heat, noise and missing features. I have also seen the first complaints about horrendous coil whine on the Asus model. But sure, comparing the 7800 XT to the RTX 4070 in cost/frame, the 7800 XT is the clear winner.


I am, however, not convinced by either of those two cards. The performance uplift over the previous gen just isn't there. Both cards are already struggling to maintain 60 FPS in modern games at UWQHD/WQHD resolutions. Personally, I would save up for a 4080, or maybe wait for the 4080 Ti refresh. If the price came down to €900 for the 4080, that would be a region I could justify spending in. Also, cards below the RTX 4080 and 7900 XT aren't in for a glorious future at UWQHD/WQHD and above, performance-wise. Just fine for 1080p.
If electricity cost is a concern for you, you should not be buying $500 GPUs, period.
 
Guys, let's not argue about electricity costs, please. ;)

If you buy a GPU that eats 100 W more in games, and you play 3 hours per day on average, and pay £0.3 per kWh (like I do), then your monthly bill will increase by £2.74 compared to the less hungry GPU.

Happy? :)
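For anyone who wants to plug in their own numbers, that figure follows from a one-line calculation (assuming an average month of 365/12 days):

```python
# Extra monthly electricity cost of a GPU that draws 100 W more,
# at 3 hours of gaming per day.
def extra_monthly_cost(extra_watts, hours_per_day, price_per_kwh):
    days_per_month = 365 / 12                    # average month length
    kwh = extra_watts / 1000 * hours_per_day * days_per_month
    return kwh * price_per_kwh

print(round(extra_monthly_cost(100, 3, 0.30), 2))   # 2.74 (at GBP 0.30/kWh)
print(round(extra_monthly_cost(100, 3, 0.60), 2))   # 5.48 (at 60 cents/kWh)
```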
 

I think it's over 60 cents in Germany, but for me it would have to be over 75 cents for me to care.
 
Then it's 5.48 EUR per month. Wow! :)
 
Well if you have no life and game 8 hours a day then maybe it starts to make sense....
 
Then it's 14.61 EUR per month. If you game 8 hours a day, then I guess you have no job, so no money to pay your bills, but no money to buy a 500 USD graphics card either, so no problem, I guess. :laugh:
 
Maybe your parents are buying it for you and you are trying to be considerate lmao

Here buy me the more expensive gpu it will save you in like 4 years....
 
That was Nvidia's marketing strategy with the 4060, btw, trying to convince you to upgrade your 3060 :roll:
 
Maybe it's better to stop. We're getting into an extremely niche case here. :roll:
 
I mean if that is important to someone it's fine... For me it would have to be the cherry on top of other things I like better about a gpu.
 
You mean the "buy more to save more" bulls***? That'll be an evergreen classic! :roll:
 
My favorite is when he announces something with a terrible price and nobody claps; the look on his face is golden.
 
No, I mean this: you'll pay €300 now to save €130 over the next 4 years.

[attached image: 1694200632789.png]
was on their 4060 promo :banghead:
 