
RDNA4 Prediction Time!!!

No, and the 6800 XT's MSRP was $649. Why are we rushing to the defense of a flat-out lie here… Again, Nvidia can't even meet that standard.
Isn't everyone mad at Nvidia for not doing that? Like, really, how much faster do you think a $649 GPU should be compared to the 6800 XT?
 
Best price/performance for people who play esports/live-service titles. If AMD fixed their Reflex alternative and managed to convince someone like Riot or Blizzard to bake it in, they should easily be able to market it as such. 5% slower than the 4080 in raster and 15% slower in RT. $649. The 7900 XTX quietly continues at the top for ROCm developers' home rigs.
 
Isn't everyone mad at Nvidia for not doing that? Like, really, how much faster do you think a $649 GPU should be compared to the 6800 XT?

So one company get’s a free pass and the other doesn’t.

From what I see, people are unhappy about the price of GPUs, and as the market leader, Nvidia has done everything in their power to slide pricing up and overall value down, especially with the 4000 series. AMD undercut them but followed suit in offering little to no value increase in the most important market segments: entry-level and mid-range.

On top of that, 50% generational performance increases are long dead without higher costs, higher power, and massive dies. I personally find raster and RT performance around 4080/7900 XTX levels at ~$600 to be decent, considering that the large majority of people playing games are running them at 1080p/1440p and, if we're being honest, have little to no need for that sort of graphics horsepower.

Everyone who's asked me to build or price something for them in an effort to get ahead of potential tariff changes has dropped their jaw at GPU pricing and finds it hard to stomach $500-600 for a 7800 or 4070. I would never recommend a 4060/4060 Ti at this point, same for a 7600/7600 XT; they're all horrendous value at the end of their life cycle.
 
So one company get’s a free pass and the other doesn’t.
What does that even mean? What company gets a free pass, and why does it matter? AMD is not going to offer good gen-on-gen performance because some other company gets a free pass? What kind of argument is that, man?
 
You stated AMD has to provide a 200% uplift from the 6800 XT. That is mathematically faster than a 4090. The 3070 → 5070 won't provide that same uplift, yet for AMD you find that "reasonable", and you deflected when I brought up the comparison in generational improvements when it comes to Nvidia. Why the free pass? Why the exhausting effort to thread-crap any AMD GPU rumor thread? Your bias is exhausting.
 
As someone too old for some of this crap, let me suggest how the companies market this.

5700 XT = 6600 XT = 7500 XT... or, each generation, the previous tier's performance becomes the current one's.
3080 = 4070 = 5060... once you factor out the frame-gen shenanigans and overlook the occasional hits with VRAM at higher resolutions.

Likewise, the AMD story is about 20-30% improvement per generation, varying wildly depending on what testing is performed.
Nvidia, on the other hand, likes to slip in frame-gen... but the raw numbers are generally improvements that are consistent across the board... and add up to 40-50%.


Mathematically, this leaves AMD behind... because a net 30% might be 50% in lighting, 100% in reflections, and 5% in multi-rendering... which is why their gains are so hard to actually measure as a percentage. AMD messes with the architecture a lot... and the results are likewise scattered. That said, their raw raster is usually on par with Nvidia's generational improvements... which only means they aren't bad until you start enabling ray tracing and see why that feature absolutely curb-stomps AMD's products.


What is the point then? Well, if I can get 3080-level performance, I probably have enough raster performance to run QHD/UHD with all of the bells and whistles at a reasonable rate. In fact, I probably have enough to run two QHD monitors and still have 120+ Hz refresh... which means anything above that is performance you'd need to make ray tracing a thing. If I can be delivered a card that is more than capable of running that today, with enough VRAM to make up for lazy developers and huge textures, then anything above that performance is literally wasted potential. AMD seems to be setting this goal with the 9070, and Nvidia seems to be setting it with the 5070. My prediction is that they'll fight for the first reasonably priced "high performance" slot since the start of the pandemic, and the winner will finally be the consumer. Whichever is less powerful will be forced to reprice itself, and it'll do so because neither AMD nor Nvidia is capable of charging a huge premium that would let these cards sit around unsold.

The net-net of this is that we are going to see the first genuinely well-priced and well-featured video cards since the beginning of the pandemic... and a single GPU will cost as much as an entire console. We'll then see Intel come in with sloppy seconds to mop up the value users... because they can sustain lower-margin, higher-volume sales to establish their brand. That'll hopefully finally kill off the 3060 market... because it's time to see that die. It's almost as bad as the 1660s, which basically coasted through the pandemic and crypto boom as the only real option. Good lord, it's been a hard 5 years in the GPU space.
 
You stated AMD has to provide a 200% uplift from the 6800 XT. That is mathematically faster than a 4090. The 3070 → 5070 won't provide that same uplift, yet for AMD you find that "reasonable", and you deflected when I brought up the comparison in generational improvements when it comes to Nvidia. Why the free pass? Why the exhausting effort to thread-crap any AMD GPU rumor thread? Your bias is exhausting.
What did I deflect, and why are you even bringing up Nvidia? Nvidia has stagnated for 2 generations now and sells software.

You still haven't told me how a 50% generational uplift is crazy; all you are coming back with is "but but Nvidia".
 
What did I deflect, and why are you even bringing up Nvidia? Nvidia has stagnated for 2 generations now and sells software.

You still haven't told me how a 50% generational uplift is crazy; all you are coming back with is "but but Nvidia".

Because neither company has offered a 50% generational improvement in almost 10 years, or several generations. It is absolutely not the norm and hasn't been for some time. Competition is brought up to prove that neither is offering a 50% generational improvement, but for some reason AMD has to, according to your delusion.
 
Because neither company has offered a 50% generational improvement in almost 10 years, or several generations. It is absolutely not the norm and hasn't been for some time. Competition is brought up to prove that neither is offering a 50% generational improvement, but for some reason AMD has to, according to your delusion.
How did you figure that they haven't done that for 10 years? Off the top of my head: 2080 → 3080, 3090 → 4090, 2070 → 3070, and I can keep going on and on.
 
No, and the 6800 XT's MSRP was $649
I clearly stated it was the 6800's MSRP, not the XT's. Why are people so keen on reading things wrong?
Again Nvidia can’t even meet that standard.
They can. Easily. 4090 is more than 50% better than 3090. It's just they don't want to do that and don't even have to (AMD didn't compete seriously enough, now they don't compete at all).
You can't expect 50% generational increases and selectively give one company a free pass on failing to meet that criterion, then enforce it as the norm for another company.
However, you can be mad at one company for being greedy and another for being lame at the same time. You defend the abusers here.
 
I clearly stated it was the 6800's MSRP, not the XT's. Why are people so keen on reading things wrong?

They can. Easily. 4090 is more than 50% better than 3090. It's just they don't want to do that and don't even have to (AMD didn't compete seriously enough, now they don't compete at all).

However, you can be mad at one company for being greedy and another for being lame at the same time. You defend the abusers here.
8nm vs 5nm, 3x improvement in density. Yeah, I think that in this "small margin", Nvidia was able to achieve a 50% improvement in performance.

Nvidia isn't going to work magic without significant advances in the manufacturing process that allow it to do so. Even the advances in AI presented are achieved by native FP4 support, not by massive architectural improvements.
 
8nm vs 5nm, 3x improvement in density. Yeah, I think that in this "small margin", Nvidia was able to achieve a 50% improvement in performance.
They've got infinite money. No one should care about the "how" here. That old Samsung 8 nm node wasn't even half that bad; it's just that nVidia didn't want to set the bar too high. It would've undercut their further sales of the Ada series, which they planned to cut down even more than it ended up being (remember the "4080 12 GB"?). They underdeliver on purpose.

We had +50 to 60% more bang for our buck for the better part of the last decade (GTX 670 = 175% of GTX 470; GTX 970 = 160% of GTX 670; GTX 1070 = 150% of GTX 970; RTX 2070 = the first exception, but it brought RT and DLSS; RTX 3070 = 155% of RTX 2070...). Now we have +0%. AT BEST. And it's definitely not because poor nVidia are cornered and can't do stuff. No. It's us customers who are cornered.
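
Chaining those ratios together shows the scale of it; a minimal sketch in Python, taking the figures above at face value (not re-benchmarked):

```python
# Compound the gen-on-gen perf-per-dollar ratios quoted above
# (the post's own figures, taken at face value, not re-benchmarked).
uplifts = [
    ("GTX 470 -> GTX 670", 1.75),
    ("GTX 670 -> GTX 970", 1.60),
    ("GTX 970 -> GTX 1070", 1.50),
    ("GTX 1070 -> RTX 2070", 1.00),  # the first exception, per the post
    ("RTX 2070 -> RTX 3070", 1.55),
]

total = 1.0
for step, ratio in uplifts:
    total *= ratio
    print(f"{step}: x{ratio:.2f} (cumulative x{total:.2f})")
# Ends around x6.5 at the same price tier -- the kind of compounding
# that +0% per generation can never produce.
```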
 
They've got infinite money. No one should care about the "how" here. That old Samsung 8 nm node wasn't even half that bad; it's just that nVidia didn't want to set the bar too high. It would've undercut their further sales of the Ada series, which they planned to cut down even more than it ended up being (remember the "4080 12 GB"?). They underdeliver on purpose.

We had +50 to 60% more bang for our buck for the better part of the last decade. Now we have +0%. AT BEST. And it's definitely not because poor nVidia are cornered and can't do stuff. No. It's us customers who are cornered.
I think people have gotten so used to getting shafted that they think 50% gen-on-gen is out of this world. Crazy.
 
They've got infinite money. No one should care about the "how" here. That old Samsung 8 nm node wasn't even half that bad; it's just that nVidia didn't want to set the bar too high. It would've undercut their further sales of the Ada series, which they planned to cut down even more than it ended up being (remember the "4080 12 GB"?). They underdeliver on purpose.

We had +50 to 60% more bang for our buck for the better part of the last decade (GTX 670 = 175% of GTX 470; GTX 970 = 160% of GTX 670; GTX 1070 = 150% of GTX 970; RTX 2070 = the first exception, but it brought RT and DLSS; RTX 3070 = 155% of RTX 2070...). Now we have +0%. AT BEST. And it's definitely not because poor nVidia are cornered and can't do stuff. No. It's us customers who are cornered.
Nobody has infinite money, and simply burning capital doesn't work magic in this segment; look at how much Intel spent and where it is now.

It's true, Nvidia could have lowered margins a little last generation; the gaming segment was in the green and very profitable for the company, and when your market share is that high, it's easy to reduce profit margins without taking a heavy hit. But that's not the logic of a publicly traded company with stratospheric expectations.

Now, to expect a +50% jump from Blackwell, which is essentially still 5nm, is ridiculous. If even B100/B200 (Jensen's money machine) brought limited improvements in raw computing power, it's because there was zero possibility of doing better.
 
If we look at current rumors, the RX 9070 XT (256-bit, 16GB) should be around 30% to 40% faster than the RX 7800 XT (256-bit, 16GB) in raster and even better in RT (which is a real gen-to-gen boost, and you can't be mad at that).

That already shows a far better performance increase than the RTX 5070 (192-bit, 12GB) offers over the RTX 4070 (192-bit, 12GB) (poor performance per dollar, as always over the last 6 years).

The RTX 20 and RTX 40 series are my all-time least favorite GPU generations (what a shit show). We'll see if the RTX 50 series continues it.

My favorite generation was the GTX 900 series, followed by the GTX 1000 series.
 
I clearly stated it was the 6800's MSRP, not the XT's. Why are people so keen on reading things wrong?

They can. Easily. 4090 is more than 50% better than 3090. It's just they don't want to do that and don't even have to (AMD didn't compete seriously enough, now they don't compete at all).

However, you can be mad at one company for being greedy and another for being lame at the same time. You defend the abusers here.

The 6800 XT was what was being referenced, not the 6800. Why are people so keen on reading things wrong? This is what fevgatos used as his benchmark; you decided to change it for who knows what reason.

The 4090 FE is roughly ~39% faster than the 3090 FE. Lower-tier cards since the 1000 series, including AMD products, have even smaller performance lifts.

Pricing is ridiculous for both vendors, and Nvidia sets the bar as market leader. I'm merely pointing out that his original statement of "the 9070 XT should be 200% of a 6800 XT in performance" at ~$650 is a pipe dream no matter who is selling you the card. But for some reason that should be a normal expectation? The reality is no one has done that for years.
 
I think people have gotten so used to getting shafted that they think 50% gen-on-gen is out of this world. Crazy.

People don't mind getting screwed as long as it's the company they like that's doing the screwing…
 
look at how much Intel spent and where it is now.
After ten years of being uncontested, any monopolist grows too comfortable and thus immobile. In 2017, when AMD suddenly released CPUs worth looking at after a decade of one mess-up on top of another, Intel were caught way off guard. They had been too stubborn to realise their approach was no good, and the shots had already been fired.

AMD could've pulled a Ryzen card in the dGPU market, too, yet they don't do that. Not sure if it's a conspiracy (the more likely scenario IMHO) or just a skill issue (though according to Occam's razor, that's the more likely reason).

What I'm pointing at is that Intel's money was not being spent in the most optimal way. nVidia are creating their own market by advancing in AI, RT, CUDA, etc. way before AMD learn what these things are all about. However, they could provide even better products: more raw performance, more VRAM, more everything. They just don't, because what's the point? The 4070 will outsell the 7800 XT hard any day, even if it's more expensive, just because it's an nVidia GPU.
Now, to expect a +50% jump from Blackwell, which is essentially still 5nm, is ridiculous.
I never expected that; I knew from the start it was gonna be Ada on 'roids at the very most. Now that enough information has "leaked", I can safely say that the 5090 is just a glorified 4090 Ti with about a +40% edge; the 5080 is just a 4080 Ti with about a 15 to 20% advantage over its predecessor; the 5070 Ti is something to fill the gap between the 4070 Ti Super and the 4080; and the 5070 is essentially a 4070 Super with better VRAM bandwidth. And, of course, with much faster AI (because that's where the quid is at) and multi-frame generation to make it look like they did something for the gamers.
The 6800 XT was what was being referenced, not the 6800. Why are people so keen on reading things wrong? This is what fevgatos used as his benchmark; you decided to change it for who knows what reason.
Let me explain like you're five.

They uttered, "200% of the 6800 XT."
They also uttered, "50% generational improvement."
I recognised $650 as a very likely MSRP for the 9070 XT, so I looked at what that is in 2020's money.
It ended up almost matching the 6800 non-XT.
Then I took the 6800 non-XT times 1.5², which is 225% of the 6800 non-XT, or roughly 200% of the 6800 XT.

That way, a 50% generational uplift really does make for a 650-dollar GPU that doubles the performance of the 6800 XT.
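
Spelled out as a quick sketch, assuming the inflation factor implied by the two MSRPs (not an official CPI figure) and a rough ~1.12x raster gap between the 6800 XT and the 6800:

```python
# Sketch of the arithmetic above. Known launch MSRPs:
# RX 6800 = $579, RX 6800 XT = $649 (November 2020).
msrp_6800 = 579
target_price = 650  # the rumoured 9070 XT price

# Treating $650 today as the 6800's $579 in 2020 money implies
# an inflation factor of ~1.12 (the post's assumption, not CPI).
implied_inflation = target_price / msrp_6800
print(f"implied inflation factor: {implied_inflation:.2f}")

# Two compounded +50% generations (RDNA2 -> RDNA3 -> RDNA4):
two_gens = 1.50 ** 2
print(f"two gens of +50%: x{two_gens:.2f} the RX 6800")  # x2.25

# Rough assumption: the 6800 XT is ~1.12x the 6800 in raster.
xt_over_nonxt = 1.12
print(f"vs the 6800 XT: x{two_gens / xt_over_nonxt:.2f}")  # ~x2.0
# -> the "200% of a 6800 XT for $650" figure in the post above.
```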
 
The 6800 XT was what was being referenced, not the 6800. Why are people so keen on reading things wrong? This is what fevgatos used as his benchmark; you decided to change it for who knows what reason.

The 4090 FE is roughly ~39% faster than the 3090 FE. Lower-tier cards since the 1000 series, including AMD products, have even smaller performance lifts.

Pricing is ridiculous for both vendors, and Nvidia sets the bar as market leader. I'm merely pointing out that his original statement of "the 9070 XT should be 200% of a 6800 XT in performance" at ~$650 is a pipe dream no matter who is selling you the card. But for some reason that should be a normal expectation? The reality is no one has done that for years.
The 3090 is less than 60% of the 4090 in performance, according to TPU's review. So, nowhere near your 39% figure. Don't tell me you are using 1080p, where the 4090 is severely CPU-limited.

I already gave you plenty of recent examples with a 50% gen-on-gen performance increase. I don't know why you are doubling down on this... 50% should be trivial for a company that's not trying to shaft us. Actually, I'd argue 50% is them trying to shaft us already.
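
For what it's worth, the two phrasings being traded here aren't interchangeable, which is where some of the 39%-vs-71% confusion comes from; a quick sketch of the conversion (the helper names are mine, purely for illustration):

```python
# Converting between "old card is X% of the new one" and
# "new card is Y% faster" -- the two phrasings in this argument.

def faster_from_share(share: float) -> float:
    """If the old card scores `share` (0-1) of the new card,
    the new card is this fraction faster than the old one."""
    return 1.0 / share - 1.0

def share_from_faster(faster: float) -> float:
    """If the new card is `faster` (0.39 = 39%) faster,
    the old card scores this fraction of the new one."""
    return 1.0 / (1.0 + faster)

# "3090 is less than 60% of the 4090" => 4090 is >66% faster:
print(f"{faster_from_share(0.60):.0%}")  # 67%

# "4090 is ~39% faster" => 3090 is ~72% of the 4090:
print(f"{share_from_faster(0.39):.0%}")  # 72%

# 72% vs <60% are genuinely different underlying ratios -- typically
# a CPU-limited 1080p result vs a 4K one, not a rounding quibble.
```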

Let me explain like you're five.

They uttered, "200% of the 6800 XT."
They also uttered, "50% generational improvement."
I recognised $650 as a very likely MSRP for the 9070 XT, so I looked at what that is in 2020's money.
It ended up almost matching the 6800 non-XT.
Then I took the 6800 non-XT times 1.5², which is 225% of the 6800 non-XT, or roughly 200% of the 6800 XT.

That way, a 50% generational uplift really does make for a 650-dollar GPU that doubles the performance of the 6800 XT.
Yep, it's called the compound effect.
 
After ten years of being uncontested, any monopolist grows too comfortable and thus immobile. In 2017, when AMD suddenly released CPUs worth looking at after a decade of one mess-up on top of another, Intel were caught way off guard. They had been too stubborn to realise their approach was no good, and the shots had already been fired.

AMD could've pulled a Ryzen card in the dGPU market, too, yet they don't do that. Not sure if it's a conspiracy (the more likely scenario IMHO) or just a skill issue (though according to Occam's razor, that's the more likely reason).

What I'm pointing at is that Intel's money was not being spent in the most optimal way. nVidia are creating their own market by advancing in AI, RT, CUDA, etc. way before AMD learn what these things are all about. However, they could provide even better products: more raw performance, more VRAM, more everything. They just don't, because what's the point? The 4070 will outsell the 7800 XT hard any day, even if it's more expensive, just because it's an nVidia GPU.

I never expected that; I knew from the start it was gonna be Ada on 'roids at the very most. Now that enough information has "leaked", I can safely say that the 5090 is just a glorified 4090 Ti with about a +40% edge; the 5080 is just a 4080 Ti with about a 15 to 20% advantage over its predecessor; the 5070 Ti is something to fill the gap between the 4070 Ti Super and the 4080; and the 5070 is essentially a 4070 Super with better VRAM bandwidth. And, of course, with much faster AI (because that's where the quid is at) and multi-frame generation to make it look like they did something for the gamers.

Let me explain like you're five.

They uttered, "200% of the 6800 XT."
They also uttered, "50% generational improvement."
I recognised $650 as a very likely MSRP for the 9070 XT, so I looked at what that is in 2020's money.
It ended up almost matching the 6800 non-XT.
Then I took the 6800 non-XT times 1.5², which is 225% of the 6800 non-XT, or roughly 200% of the 6800 XT.

That way, a 50% generational uplift really does make for a 650-dollar GPU that doubles the performance of the 6800 XT.

No one was claiming a thing about inflation or what that money buys you today. You're the one moving the goalposts here, injecting yourself into a discussion of whether or not a 200% performance jump is rational. It's not; no one has offered that, and no one is offering it. You insist on people having a difficult time reading, but start blathering on about a card and costs that had nothing to do with what was being discussed in the first place, after interjecting on his behalf.

It's clear he can't provide any rational evidence as to why the generational performance increase must be 50% gen over gen, when it's a fact that hasn't been the case for almost 10 years. Have you not listened to Jensen repeatedly claim Moore's law is dead for the past few years?

I guess I'm misremembering the 3000 series improvements due to them never being available at sane prices, which made comparisons pointless. Regardless, generational improvements have continued to shrink, and it won't be getting better; 50% is not something to expect.
 
The 3090 is less than 60% of the 4090 in performance, according to TPU's review. So, nowhere near your 39% figure. Don't tell me you are using 1080p, where the 4090 is severely CPU-limited.

I already gave you plenty of recent examples with a 50% gen-on-gen performance increase. I don't know why you are doubling down on this... 50% should be trivial for a company that's not trying to shaft us. Actually, I'd argue 50% is them trying to shaft us already.


Yep, it's called the compound effect.

I'm not so optimistic about 50%.

I'd say 30% better price-to-performance gen on gen is about right. I'm less bothered by actual performance gains as long as I'm getting 30% more for my money.

So the 9070 should offer at least 60% better price-to-performance vs. whatever 6000-series card it's priced most similarly to, MSRP-wise.

The fact that AMD fans might finally get decent image quality with upscaling would be the cherry on top.

I don't want 50% more performance if it costs me 70% more money, like the 4080 gave us, for example.
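
As a rough sketch of that yardstick (the 4080 figures below are this post's approximations, not exact benchmark numbers):

```python
# Price-to-performance, per the yardstick above. perf and price
# are ratios vs the predecessor (1.0 = unchanged).

def value_gain(perf: float, price: float) -> float:
    """Gen-on-gen change in performance per dollar."""
    return perf / price - 1.0

# 30% better value per gen, compounded over two gens
# (6000 series -> 7000 series -> 9070):
print(f"two gens at +30%: {1.30 ** 2 - 1:.0%}")  # 69%, clearing
                                                 # the "at least 60%" bar

# The 4080 case: ~50% more performance for ~70% more money:
print(f"4080-style gen: {value_gain(1.50, 1.70):.0%}")  # -12%
# i.e. perf per dollar actually went down despite the raw gain.
```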
 
If AMD managed to achieve 40% on the same node, there must be something seriously wrong with the 7800 XT. 20% is more likely.

4090 is 71% faster than 3090.

 
20% is more likely.
No, that looks like Nvidia's path. The RX 9070 XT should be around 30-40% faster in raster and even more in RT.
 
No, that looks like Nvidia's path. The RX 9070 XT should be around 30-40% faster.

People can talk about percentages all they want; this product will live or die based on what AMD prices it at. If they think they can get away with pricing like Nvidia, it will die. If they've learned from the 7900 XT and 7700 XT, it's got a chance to be a decent product.
 
You insist on people having a difficult time reading
I still do because you clearly misunderstand me.

Their claims:
1. "I want +50% every gen."
2. "I want a 650 dollar GPU that's two times faster than a 6800 XT."

I only calculated how and why these statements do not contradict each other. I have never said a word about +50% or whatever being rational or not. And if you ask my opinion (I know you don't), no, a +50% generational uplift is boring. Once they pull 80%+, I'll be excited.
 