
AMD Radeon RX 7800 XT

No, I mean this. You'll pay €300 now to save €130 over the next 4 years.



was on their 4060 promo :banghead:

I remember that now. Made me roll my eyes pretty hard. Almost went blind...

I feel bad for someone making that switch and having to keep the gpu for 4 years...
 
That'll be an evergreen classic!

 
Ah yes! Like anybody cares about saving 100 bucks on power in 4 years, lol! :roll:
But think of all you can do with the money you save! You can buy, uuhh, groceries, or 2 AAA games (of the non-$70 variety).

They did the reverse of a streaming service advertisement: instead of advertising how little it costs per month, they advertise how much you'd save in 4 years, because even they know it's not worth it, so they have to be predatory.

But hey, I can buy one more chocolate bar per month! Hell, maybe even a beer to go along with it! Endless possibilities!
 
Ah yes! Like anybody cares about saving 100 bucks on power in 4 years, lol! :roll:
But think of all you can do with the money you save! You can buy, uuhh, groceries, or 2 AAA games (of the non-$70 variety).

They did the reverse of a streaming service advertisement: instead of advertising how little it costs per month, they advertise how much you'd save in 4 years, because even they know it's not worth it, so they have to be predatory.

But hey, I can buy one more chocolate bar per month! Hell, maybe even a beer to go along with it! Endless possibilities!
I think you are both missing the point. You are not SAVING anything. You have to first pay €300 over your 3060 in order to get back €130 over a span of years. You are still down €170. That is, of course, assuming you live in Germany, where the kWh is expensive.
 
I think you are both missing the point. You are not SAVING anything. You have to first pay €300 over your 3060 in order to get back €130 over a span of years. You are still down €170. That is, of course, assuming you live in Germany, where the kWh is expensive.
I'm mostly taking the piss. I'm very well aware that the "power savings" argument is absolute baloney: you still lose in the end, because you never recoup your costs, AND you're stuck with a dogshit GPU for years to justify the money you spent. And with how games are running, I think we can all agree that the last GPU we'd want to be stuck with is a 4060 8 GB.
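For anyone who wants to check the arithmetic, here's a quick sketch of the break-even calculation (the wattage, usage, and electricity-price figures below are illustrative assumptions, not measured values):

```python
# Rough break-even check for the "upgrade to save on power" pitch.
# All figures are illustrative assumptions, not measured values.

extra_cost_eur = 300.0   # price premium of the newer card
power_saved_w = 100.0    # assumed average power saving while gaming
hours_per_day = 2.0      # assumed daily gaming time
price_per_kwh = 0.45     # assumed German electricity price (EUR/kWh)
years = 4

kwh_saved = power_saved_w / 1000 * hours_per_day * 365 * years
savings_eur = kwh_saved * price_per_kwh
net_eur = savings_eur - extra_cost_eur

print(f"Energy saved: {kwh_saved:.0f} kWh")
print(f"Money saved:  {savings_eur:.0f} EUR")
print(f"Net result:   {net_eur:+.0f} EUR")  # negative = you're still down
```

With these assumptions you save roughly €130 over four years, in line with the promo's claim, but you're still about €170 in the red after the €300 premium.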
 
Meanwhile in Russia:

RX 6800 XT: $350 (used) $1000 (new)
RX 6900 XT: $400 (used) N/A (new)
RX 6950 XT: $550 (used) $1000 (new)
RX 7700 XT: $630
RX 7800 XT: $770

What could be more absurd than this?
 
Bingo. The 4080 is overpriced by about 200$ and competes with the 4090, as anyone (!) who can afford a 4080 can just as well afford a 4090 - but anyone who CAN'T can maybe afford the XTX, which sits about 200$ below the 4080. A much better price-to-performance card.

The same could be said about nearly every other Nvidia card:

4070 Ti is overpriced by 200$ and should be 600$; it's a 192-bit card with just 12 GB, and its performance can't hold a candle to the 7900 XT, which it competes with at the same 800$.

4070 is overpriced by 100-200$.

4060 Ti is overpriced by about 100$ - and at 300$ no one would complain that it only has 8 GB VRAM.

4060 is overpriced by about 50-100$.

At AMD, only the 7700 XT is currently strictly overpriced, but I think this is intentional to drive more people to the 7800 XT. The 7800 XT is nearly sold out at my shop, whereas the 7700 XT is completely ignored - a stark contrast. It's obvious the price is too high and will be reduced later. 400-450$ is okay.

Yup, Nvidia's whole product stack is turbofucked for this generation. I think AMD is just trying to move the rest of the 6000 series which is why they've priced the 7700 XT this way. I'm thinking we'll see some deep price reductions around the holidays for the 7700 XT when most of the 6000 series has sold out. The 6700 XT can be had for around $330 USD and 6750 XT for $400 USD right now.
 
But think of all you can do with the money you save! You can buy, uuhh, groceries, or 2 AAA games (of the non-$70 variety).

They did the reverse of a streaming service advertisement: instead of advertising how little it costs per month, they advertise how much you'd save in 4 years, because even they know it's not worth it, so they have to be predatory.

But hey, I can buy one more chocolate bar per month! Hell, maybe even a beer to go along with it! Endless possibilities!
No you can't because you spent that money and more on upgrading your GPU. :laugh:

I think you are both missing the point. You are not SAVING anything. You have to first pay €300 over your 3060 in order to get back €130 over a span of years. You are still down €170. That is, of course, assuming you live in Germany, where the kWh is expensive.
Exactly. :)
 
Back to the technicalities... (if you're not interested, skip this post)

The review says the Navi 32 GPU has 4 shader arrays. With 3,840 shaders in total, that brings us to 960 shaders, or 15 CUs, per array. Considering that CUs come in pairs with RDNA 3, this doesn't compute.

However, according to an article on wccftech.com, the GPU has 3 shader arrays, which means 20 CUs, that is, 1,280 shaders, per array.

@W1zzard - Not that it matters much, I just thought I'd bring this to your attention. Great review anyway, as always!

That's a good point.
One way it could be true is if each shader array actually has 16 CUs and AMD is disabling some for yield reasons.
I've already questioned the 60 CU count in prior posts leading up to launch; the obvious number to fit the RDNA 3 family was 64 CUs...

Unless AMD followed the trend started with RDNA 2, where some chips have 8 CUs per array (Navi 23 and 24) and some have 10 (Navi 21 and 22).

Every article says the 7800 XT is a maxed out chip, so I believe this is the case, and the "4 shader arrays" in the TPU review is a mistake / typo.

The fully enabled Navi 32 die features 4,096 shaders across 64 Compute Units (CUs) paired with 64MB of L3 cache and a 256-bit memory controller. Additionally, it packs 256 Texture Units and 128 Raster Output Units alongside 64 RT Cores (one per CU).
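Since RDNA 3 exposes 64 stream processors per CU, the competing configurations can be sanity-checked with simple arithmetic. A sketch of the numbers under discussion, not a confirmed die layout:

```python
# Sanity-check the shader counts under discussion.
# RDNA 3 exposes 64 stream processors (SPs) per Compute Unit (CU).
SP_PER_CU = 64

shaders_7800xt = 3840
cus = shaders_7800xt // SP_PER_CU
print(cus)             # 60 CUs

# 4 shader arrays -> 15 CUs per array: an odd number, which clashes with
# RDNA 3 grouping CUs in pairs (WGPs), since 15 isn't divisible by 2.
print(cus / 4)         # 15.0

# 3 shader arrays -> 20 CUs (10 WGPs) per array: divides evenly.
print(cus / 3)         # 20.0

# A fully enabled 64-CU die, as some articles claim, would instead have:
print(64 * SP_PER_CU)  # 4096 shaders
```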
 
what drugs did you use? [...] my relapsed dude! [...] You literally have no clue even how fans operate [...]
I'm an enthusiast of both acoustics and PC cooling, actually. As I mentioned earlier, I do have a fanless idle setup. >30 W is just too much for a relatively small GPU radiator in a closed (no mesh) case without airflow; due to its orientation (the backplate blocking convection), a standard GPU radiator is terrible at passive cooling. However, just because I dared to list 3 severe driver-related issues I had with my AMD GPU, I'm suddenly your enemy and must be discredited and offended. Thanks to TPU for the ignore option.
 
No, they support both 32 and 64 since RDNA (1). GCN could only do 64, while RDNA can do both 32 and 64, hence it's a more flexible and better arch.
Wrong. In an RDNA 3 CU, the first 64 stream processors support both w32 and w64, while the second 64 stream processors only support w32.

RDNA 2 CU and RDNA 3 CU have the same texture unit count.

AIDA64 shows the RX 7900 XTX at half of its TFLOPS potential.

RDNA 3's dual-issue compute is extremely limited, so RX 7900 GPUs are unlikely to exceed 36 TFLOPS in practice. That makes the nominally 61-TFLOPS RX 7900 GPUs typically perform like 36-TFLOPS cards, which explains the inconsistent compute performance gains of the 7900 XTX.

Try again.
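To illustrate where the headline figure comes from: peak FP32 throughput is stream processors × 2 FLOPs (an FMA) × clock, and RDNA 3's dual-issue nominally doubles that. A rough sketch (the boost clock here is an approximation):

```python
# Rough peak-FP32 math for the RX 7900 XTX (boost clock is an approximation).
shaders = 6144     # stream processors on the RX 7900 XTX
clock_ghz = 2.5    # approximate boost clock

# An FMA counts as 2 FLOPs per SP per cycle.
single_issue_tflops = shaders * 2 * clock_ghz / 1000
# RDNA 3 can dual-issue a second FP32 operation per cycle, doubling the
# theoretical peak -- but only when the compiler finds an eligible pair.
dual_issue_tflops = 2 * single_issue_tflops

print(f"{single_issue_tflops:.1f} TFLOPS single-issue")  # 30.7
print(f"{dual_issue_tflops:.1f} TFLOPS dual-issue")      # 61.4
```

The 61-TFLOPS number only materializes when FP32 instructions can actually be paired for dual issue; when they can't, throughput falls back toward the single-issue figure, which is in the same ballpark as the ~36 TFLOPS practical ceiling mentioned above.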
 
The fully enabled Navi 32 die features 4,096 shaders across 64 Compute Units (CUs) paired with 64MB of L3 cache and a 256-bit memory controller. Additionally, it packs 256 Texture Units and 128 Raster Output Units alongside 64 RT Cores (one per CU).
Now that's the third different kind of info. :confused:

I still find the 3 × 20 CU layout more plausible, considering AMD isn't in the habit of holding back fully enabled chips at first launch. The small price difference between the 7700 XT and the 7800 XT suggests that yields are good, so they don't even have to.

Where are the die shots and block diagrams?
 
Now that's the third different kind of info. :confused:

I still find the 3 × 20 CU layout more plausible, considering AMD isn't in the habit of holding back fully enabled chips at first launch. The small price difference between the 7700 XT and the 7800 XT suggests that yields are good, so they don't even have to.

AMD keeps the higher bins and full chips for other products - their usual priority is to use them in the mobile segment, so maybe you should look for the 64 CU silicon in something like RX 7800M XT or something specific for Apple.

Where are the die shots and block diagrams?
 
AMD keeps the higher bins and full chips for other products - their usual priority is to use them in the mobile segment, so maybe you should look for the 64 CU silicon in something like RX 7800M XT or something specific for Apple.
I think it's 3 arrays with 20 CUs each, so the 7800 XT is actually a maxed out chip. Please excuse the quality of my drawing - these are the best quality images I could find to work with.


Edit: Then, you get the 54 CUs for the 7700 XT by disabling one CU pair per array.
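The harvesting scheme proposed above checks out numerically. A quick sketch of this theory, not a confirmed layout:

```python
# Proposed Navi 32 layout: 3 shader arrays x 20 CUs = 60 CUs (RX 7800 XT).
arrays = 3
cus_per_array = 20
full_cus = arrays * cus_per_array
print(full_cus)  # 60

# Disabling one CU pair (one WGP, i.e. 2 CUs) per array yields the 7700 XT:
cut_cus = full_cus - arrays * 2
print(cut_cus)   # 54
```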
 
Edit: Then, you get the 54 CUs for the 7700 XT by disabling one CU pair per array.
Lisa, please disable another pair so it'll be 48 CUs, then lower the clock by a couple hundred MHz and sell it under the name 7600 XT. For $350.
 
I think it's 3 arrays

3 Shader Engines, each with 6 Shader Arrays - but how that adds up to 3,840, no idea. Maybe 6 arrays in total.
 
3 Shader Engines, each with 6 Shader Arrays - but how that adds up to 3,840, no idea. Maybe 6 arrays in total.
Looking into it, I meant to say "shader engines", not "shader arrays". I admit, I get confused by these terms. :ohwell:

I just took a Navi 31 block diagram, and edited it in Paint for scaling purposes. It seems to fit my theory on Navi 32:


Edit: Corrected numbers in the description.
 
Once upon a time they also cost as much as some second-hand cars, so there's that as well. So while comparing current-gen to previous-gen prices is fair, you're also conveniently ignoring just how inflated those prices were for so long - not to mention AMD, and especially Nvidia, lowered them that much only to dump excess stock after the crypto bust! Heck, if that bubble hadn't popped, Nvidia would still be charging you one kidney's worth of money for that 3080. We'll talk when the 7xxx series goes on regular sale 3-6 months down the line.
I'm not conveniently forgetting anything. I'm comparing the current non-silicon-shortage prices with the pre-silicon-shortage prices. If you think everything's just fine, that's not my problem. Everything isn't fine, and this sentiment is shared by top YouTube reviewers.

By all means though, don't listen to me. I have nothing to lose from that.

One would think - emphasis on think - because it seems the people who are drawing these parallels do not use their brains.
I completely agree with you but there's no shortage of people who don't seem to use their brain. Most people know as much about video cards as we do about home appliances. We know the brands and we know the brands' reputations, but that's about it.
The 4080 despite being a "replacement" for the 3080 is a whopping 500 dollars more than the 3080 was at launch. So it's a replacement only if you ignore that astronomical 70% increase in price.
Oh, I know about nVidia's pricing, and that's why I won't touch GeForce cards. The thing is, a replacement card is supposed to be at least 25% faster without a price increase. The performance rival of the RX 6800 XT is the RTX 4070, which means that, this generation, Radeons have dropped a tier in price. This is because AMD had the insane idea that there should be three level-9 halo cards. This kind of logic is worthy of Monty Python.

Meanwhile the 7800XT is 500 dollars vs 650 dollars that the 6800XT was at launch, so it's actually cheaper.
Yes, but it's three years later so it should be cheaper.
So, if one were to think, they'd realize they have to look at the prices and conclude that neither of these cards is a replacement for its older counterpart.



The RX 7900 XTX should be, at most, the RX 7900 XT. The RX 7900 XT should be, at most, the RX 7800 XT, etc.

All AMD has done is make their level-8 appear to be only competitive with nVidia's level-7 card. It's not a good look.

Nvidia released a $1200 card they call the 4080.
Yeah, but they kept the same performance configuration. AMD didn't have to follow suit and could easily have made the RX 7800 XT compete with the RTX 4080 performance-wise for $700 or less. They just decided to screw people a little less than nVidia but still make a cash grab. They fudged the part numbers to make it appear justifiable.
AMD released a $500 card they call the 7800 XT.
And it shouldn't be called that. It's not a level-8 card; it's a level-7 card at best.
They chose to do different things with their completely arbitrary names.
The names are not, nor have they ever been, arbitrary. From the long-ago days of ATi, Radeon numbers have always followed a specific nomenclature:

Level 9 - Halo-class (RX 7900 XTX, R9 290X, HD 7970, HD 6970, HD 5970, HD 2900 XT, X1900 XTX)
Level 8 - Enthusiast-class (RX 6800 XT, R9 280X, HD 7870, HD 6870, HD 4870, HD 3870, X1800 XT)
Level 7 - High-End Gaming (RX 6700 XT, RX 5700 XT, R9 270, HD 7770, HD 6770, HD 5770, HD 4770)
Level 6 - Mainstream Gaming (RX 6600, RX 5600 XT, R7 260, HD 7670, HD 6670, HD 5670, HD 4670, HD 3670)
Level 5 - Entry-Level Gaming (RX 6500 XT, RX 5500 XT, R7 250X, HD 7470, HD 6550, HD 5550, HD 4550, HD 3570)
Level 4 - Low-Power Gaming / Glorified Video Adapter (RX 6400, R7 240, HD 7450, HD 6450, HD 5450, HD 4450, HD 3450)
Level 3 and Below - Integrated Graphics (HD 3300, HD 3200, HD 2100)

When are people going to realise that I never make things up?

NVIDIA GPUs support FreeSync DP monitors.


Model numbers are just marketing labels. The ASIC model number is the real model number.

Like any competitor, AMD competes against the competition, not against itself.
Read my reply to Lew Zealand above.
 
Yeah, but they kept the same performance configuration. AMD didn't have to follow suit and could easily have made the RX 7800 XT compete with the RTX 4080 performance-wise for $700 or less. They just decided to screw people a little less than nVidia but still make a cash grab. They fudged the part numbers to make it appear justifiable.

And it shouldn't be called that. It's not a level-8 card; it's a level-7 card at best.

The names are not, nor have they ever been, arbitrary. From the long-ago days of ATi, Radeon numbers have always followed a specific nomenclature:

Level 9 - Halo-class (RX 7900 XTX, R9 290X, HD 7970, HD 6970, HD 5970, HD 2900 XT, X1900 XTX)
Level 8 - Enthusiast-class (RX 6800 XT, R9 280X, HD 7870, HD 6870, HD 4870, HD 3870, X1800 XT)
Level 7 - High-End Gaming (RX 6700 XT, RX 5700 XT, R9 270, HD 7770, HD 6770, HD 5770, HD 4770)
Level 6 - Mainstream Gaming (RX 6600, RX 5600 XT, R7 260, HD 7670, HD 6670, HD 5670, HD 4670, HD 3670)
Level 5 - Entry-Level Gaming (RX 6500 XT, RX 5500 XT, R7 250X, HD 7470, HD 6550, HD 5550, HD 4550, HD 3570)
Level 4 - Low-Power Gaming / Glorified Video Adapter (RX 6400, R7 240, HD 7450, HD 6450, HD 5450, HD 4450, HD 3450)
Level 3 and Below - Integrated Graphics (HD 3300, HD 3200, HD 2100)

When are people going to realise that I never make things up?

I like numbering schemes, too. But AMD's numbering scheme is occasionally or even usually consistent. Note that occasionally or even usually consistent is in fact: not consistent.

RX 590
RX 580
RX 570
RX 560

Odd that those are missing from your seemingly complete list. Where's Vega 56 or 64? Is the Radeon VII something or other?

AMD deviates from your imagined strict numbering system whenever it pleases them, has done so recently, and chose to do it again. A few people who like their numbers a certain way care, I guess, but the rest of us care only about:

What does my money buy?

That's it. If I prefer to buy a Great Aunt Gertrude GPU because it gets me 144 FPS in BF 2042 and costs me $350, then I'm buying that GAG GPU if it's the cheapest. Name is irrelevant.
 
I like numbering schemes, too. But AMD's numbering scheme is occasionally or even usually consistent. Note that occasionally or even usually consistent is in fact: not consistent.

RX 590
RX 580
RX 570
RX 560

Odd that those are missing from your seemingly complete list. Where's Vega 56 or 64? Is the Radeon VII something or other?

AMD deviates from your imagined strict numbering system whenever it pleases them, has done so recently, and chose to do it again. A few people who like their numbers a certain way care, I guess, but the rest of us care only about:

What does my money buy?

That's it. If I prefer to buy a Great Aunt Gertrude GPU because it gets me 144 FPS in BF 2042 and costs me $350, then I'm buying that GAG GPU if it's the cheapest. Name is irrelevant.
I couldn't agree more. Although, I admit, it's kind of nice having a 7800 as a CPU and GPU as well. :D
 
That's it. If I prefer to buy a Great Aunt Gertrude GPU because it gets me 144 FPS in BF 2042 and costs me $350, then I'm buying that GAG GPU if it's the cheapest. Name is irrelevant.
Yes, my Genuine Intel® 0000 @ 1.80 GHz (ES), better known as QVYE, couldn't agree more. Bought it for what was supposed to be a current-gen i3 price and almost got a current-gen i7 (LGA1700 didn't exist when I bought it). Value means more than names.

a replacement card is supposed to be at least 25% faster without a price increase.
RTX 3080 Ti was $1200 at launch.
RTX 4080 was $1200 at launch. Which is actually a tad cheaper considering inflation, but whatever.
Can you say the RTX 3080 Ti is more than 80% as fast as a 4080? I can't. But that dozen hunnit is still a bit of burglary, since I'm comparing to COVID-and-mining-boom-inflated Ampere pricing.
The RX 7800 XT costs the same dollar amount as the RX 6700 XT's MSRP, and you know what? The 6700 XT is destroyed by the 7800 XT.
When are people going to realise that I never make things up?
When you go back a couple billion years and make it impossible for humankind to exist. People will always doubt you, whether you're right or not.
 
Technically speaking...name-y-wise, my old HD 7950 is higher and therefore wins :p
You should be using it with a 7950X3D for maximum 7950-yness. :rockout:
 