
The future of RDNA on Desktop.

Lol, you have yet to show an example of non-upscaled RT running OK on even a 4090, mate. These cards are solidly in unobtainium land. Upscaling is still a crutch; it needs vendor-, version-, and game-specific support.


The point is a 4090 won't be a 4090 on 3nm. It will be a 6080 (and likely faster, so 1440p->4K up-scaling becomes more consistent). Again, I think the 'whatever they call the 5080 replacement with 18GB' will be a 1440p card.
Because again, the 5080 is 10752sp @ 2640MHz. 9216sp @ 3780MHz is ~22% faster in RT/raster, and 18GB of RAM is 12.5% more than 16GB. How far away is a 5080 from 60fps?
What happens to the native framerate if you turn on FG? Oh, that's right... nVIDIA literally hides it from you for exactly this reason.
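If you want to check that arithmetic yourself, here's the back-of-the-envelope version in Python (the 9216sp @ 3780MHz config is the hypothetical 3nm part I'm describing above, not an announced product):
Code:
# Rough shader-throughput proxy: shader count x boost clock (MHz).
# 9216sp @ 3780MHz is the hypothetical 3nm successor, not an announced card.
rtx_5080 = 10752 * 2640
next_gen = 9216 * 3780

print(f"relative throughput: {next_gen / rtx_5080:.3f}")  # ~1.227 -> ~22% faster
print(f"18GB vs 16GB VRAM:   {18 / 16 - 1:.3f}")          # 0.125 -> 12.5% more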

Again, the 9070 XT / 128-bit cards will then be 1080p cards. No up-scaling (in a situation like this; yes, more demanding situations exist, and hence why higher-end cards exist).

I don't get how other people don't see this. I think it's clear as day.
Upscaling has never been a reasonable thing on a top of the line GPU. If a 5080 isn't suitable for RT at 1440p, like you said, then you just proved my point: we're far from RT being usable in games.
I just do not agree, and it really goes to show that a lot of people have not used RT and/or up-scaling. It's very normal. Asking for 4K RT is absurd (this gen). Pick a game and look at even a 5090.
Also, 1440p->4K up-scaling looks pretty good, even with FSR. Now 960p->1440p will look good too (it has always been okay with DLSS, and now will be with FSR4, I think), which again is the point of the 9070 XT.
1440p isn't good enough (in my view, especially with any longevity and/or using more features like FG) on the 5080 because it doesn't have to be... yet. It could be, but it isn't, because there is no competition.
Now, up-scaling is even important for 1080p->4K, which is *literally the point of DLSS4*. Even for a 5080 (because of the situation above).
'5070 is a 4090' exists because 1080p up-scaling image quality has improved to the point they think they can compare it (along with adding FG) to a 4090 running native 4K (in raster). That is the point of that claim.
Up-scaling is super important. Consoles have used (and continue to use) DRS. This is really no different from that. Asking for consistent native frames at high res using RT is just not realistic for most budgets.
This is the whooollleee point of why they're improving up-scaling. RT will exist, in some cases in a mandatory way. You will use up-scaling, and you will prefer that it looks OK. Or you will not play those games.
Or you will spend a fortune on a GPU. Or you will lower other settings (conceivably quite a bit as time goes on). That's just reality.
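For anyone who wants to see why those render-resolution tiers matter, here's the raw pixel math (a rough sketch; it assumes shading cost scales with pixels rendered, which is only approximately true):
Code:
# Render resolution vs output resolution: fraction of output pixels actually shaded.
res = {
    "960p":  1706 * 960,    # ~2/3 scale of 1440p (Quality-mode input)
    "1080p": 1920 * 1080,
    "1440p": 2560 * 1440,
    "4K":    3840 * 2160,
}

for src, dst in [("1440p", "4K"), ("960p", "1440p"), ("1080p", "4K")]:
    frac = res[src] / res[dst]
    print(f"{src} -> {dst}: renders ~{frac:.0%} of the output pixels")

# 1440p -> 4K   : ~44% (Quality at 4K)
# 960p  -> 1440p: ~44% (Quality at 1440p)
# 1080p -> 4K   : ~25% (Performance at 4K, i.e. the '5070 is a 4090' case)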

Look, I get that some people still get hung up on things like "but GDDR7" and such. Guys, a 5080 FE needs about 22Gbps RAM to run at stock (to saturate compute at 2640MHz). Do you know why it ships at the speeds it does?
Think of what AMD is putting out, and where that RAM clocks. That is what you do (when you can): you put out the slowest thing you can and still win; nothing more. Give nothing away that you could sell as an upgrade.
Especially when, as I'm showing above, the difference can be tangible. Save it for next gen and sell it then. nVIDIA truly could give you 24GB and 3.23GHz clocks. They didn't... but they could.
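If you want to sanity-check the bandwidth side, it's simple arithmetic (bus widths and per-pin rates are the published specs; the "22Gbps needed" figure is my estimate from the compute argument above, not an nVIDIA number):
Code:
# GDDR bandwidth = per-pin rate (Gbps) x bus width (bits) / 8 bits-per-byte -> GB/s
def bandwidth_gbs(gbps_per_pin: float, bus_bits: int) -> float:
    return gbps_per_pin * bus_bits / 8

print(bandwidth_gbs(30, 256))  # 5080 FE as shipped (30 Gbps GDDR7): 960 GB/s
print(bandwidth_gbs(22, 256))  # ~22 Gbps: 704 GB/s, roughly what 2640 MHz of compute needs (my estimate)
print(bandwidth_gbs(20, 256))  # 9070 XT (20 Gbps GDDR6): 640 GB/s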

This will bear out when people overclock 9070 XTs (somehow landing just short of a stock 5080 FE, or similar to a 5070 Ti OC) and/or when a 24Gbps card shows up.
Somehow magically similar in many circumstances, especially at 3.47GHz or higher.
Because it's really, honestly, just MATH. It's not opinion. It's MATH. Yes, some units/ways of doing things differ; yes, excess bandwidth helps some (perhaps ~6% at stock in this case?), but so do the extra ROPs on N48.
I don't have the math on the ROPs; I'm sure it differs by resolution, and I've never looked into it. But compute is similar; I don't know how much the TMUs help RT (yet). But the main point remains.
Compute is compute (in which 10752 @ 2640 = 8192 @ 3465). Bandwidth is bandwidth. Buffer is buffer. It's all solvable. None of this is magic, but they will try to sell you bullshit, which I am not.
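Here is that "compute is compute" line as literal math (the 9070 XT clocks are hypothetical overclocks, not measured results):
Code:
# 'Compute is compute': shader count x clock, nothing more.
def throughput(shaders: int, mhz: int) -> int:
    return shaders * mhz

rtx_5080_stock = throughput(10752, 2640)
n48_oc         = throughput(8192, 3465)   # hypothetical 9070 XT overclock

print(n48_oc / rtx_5080_stock)                   # 1.0 -> identical on paper
print(throughput(8192, 3470) / rtx_5080_stock)   # ~1.001 at the ~3.47 GHz case above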


SMH.

Sometimes I just want to dip until Rubin comes out. We'll just see...won't we. Period not question mark.
 
The future of RDNA4 is UDNA, which means AMD will unify its AI-accelerator architecture with the gaming one. AMD will no longer focus on gamers; they will focus more on AI, as Nvidia did. After all, it's the money that matters, and the money is in "AI". Except for consoles: AMD will try to remain present in consoles and APUs. Let's be honest, dGPUs are nothing more than a burden now for both Nvidia and AMD. Since no one gives a damn about Intel's AI accelerators and server CPUs anymore, Intel might end up as the only dGPU manufacturer left. Or they can wrap it up and focus everything on the foundry business.
 

eidairaman1

The Exiled Airman
Thanks, I'll have to watch it when I have a moment to absorb it all.

People who haven't been around for as long as some of us don't remember some of the weird crap nVIDIA has been caught doing.

Randomly cut L2 cache, some but not all RAM doubled over a bus, the whole GTX 970 situation, disabling whole clusters instead of the separate pieces of them (as sold), which *can* impact performance... etc., etc.

I'm fairly certain the ways nVIDIA cuts costs haven't changed... I'm fairly certain they just got better at hiding it. *This* is what I keep trying to explain. They obfuscate, and when they can't, they lie.

When they get caught in their lies they say "it isn't a big deal" (Huang is very good at this), when it IS a big deal! Their ability to gaslight is incredible. I'm sure Steve has touched upon that at some point.

I'm not on some tirade or trying to fanboy...It just boggles my mind how much they have gotten away with, and now they get away with even more (because many people don't understand how GPUs work).

I'm not perfect in explaining it all, and I don't get *everything* right all the time, but I certainly can see a lot of things they have done and are doing that most don't appear to notice/understand.

Thank goodness for GPU-Z and how it works, or this would've been able to slip by as well.
How many never use GPU-Z, though?
Just as, how many don't understand how they segment/limit/obsolete products in a way that is ridiculously unfair to consumers?
I try to explain it, but I don't know how to do it without coming across as partisan while still getting people to understand. This is why I get frustrated. I'm not cheerleading, but what they do HAS to stop.
And it won't stop unless people understand all this stuff. How to get it across, I honestly don't know!

edit: I was writing that as you were (some people don't remember that stuff, and that isn't even all of it). This is true, but it's still connected to the shader cluster AFAIK? Perhaps I am mistaken.
The 1060 3GB vs 6GB: same exact chip, but the shaders were nerfed between the two.

This just really isn't true. Look at a 4090. It will typically upscale most games from 1440p to 4K, which is a very reasonable thing. Games like Wukong it will run at 1440p (yes, upscaled).
Yes, other cards are *purposely* positioned so the stack doesn't line up well. Like how a 5080 isn't suitable for 1440p (60fps mins). The 9070 XT likely will be, for 1080p. Then they may try to say the 'XTX' is good enough for 1440p averages.
Which is exactly what the 5080 does, for a crap-load more money.
Especially when you add features like FG and/or the DLSS4 transformer model, which costs ~5-7% more performance than DLSS3 (look at W1zzard's DLSS 3 vs 4 benches if you don't believe me on that one, btw), and again, this is by design.
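To put that ~5-7% in concrete terms, here's a quick sketch (the 60fps baseline is just an illustrative number, not a benchmark):
Code:
# What a 5-7% heavier upscaling pass costs in frame rate, all else equal.
# The 60 fps baseline is purely illustrative.
base_fps = 60.0
for overhead in (0.05, 0.07):
    frame_time_ms = (1000 / base_fps) * (1 + overhead)
    print(f"{overhead:.0%} overhead: {1000 / frame_time_ms:.1f} fps")
# 5% -> ~57.1 fps, 7% -> ~56.1 fps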

That's kinda what I've been saying! :p

The 9070 XT is likely about this, with a cheaper card than nVIDIA has (which, again, is the point of what nVIDIA does). You can build a card towards a spec; you don't have to go high/low, and you can stay consistent.
Whereas the 5070 is a POS (that needs to upscale regardless), or instead you need at least a $750 card to get 1080p 60fps mins.

Again, I know there are different situations; when I say 1080p60 mins, in Wukong that means upscaling, while in Spider-Man it doesn't. I think Spider-Man is the more realistic benchmark for many reasons.
Including the fact that Sony makes the PlayStation, and that game will almost certainly be ported with settings from the PC at some ratio similar to a then-available 3nm card on the market hitting 60fps mins.

In Spider-Man 3, who's to say that won't be the default? It probably WILL BE. That's all I'm trying to say. All of that makes sense, doesn't it?

I think nVIDIA (and AMD) will hit the nail on the head next-gen, with (hopefully) better pricing. I don't think they'll be able to dance around it any further. If you take the clocks/specs I've suggested, this bears out.
Not happening with green, as you've seen now since RTX's inception.
 