
The future of RDNA on Desktop.

Joined
May 13, 2008
Messages
1,063 (0.17/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
Lol, you have yet to show an example of non-upscaled RT running OK on even a 4090. Mate. These cards are solidly in unobtainium land. Upscaling is still a crutch; it requires vendor-, version-, and game-specific support.


The point is a 4090 won't be a 4090 on 3nm. It will be a 6080 (and likely faster, so 1440p->4k up-scaling is more consistent). Again, I think the 'whatever they call the 5080 replacement w/ 18GB' will be 1440p.
Because again, a 5080 is 10752sp @ 2640mhz. 9216sp @ 3780mhz is ~22% faster in RT/raster, and 18GB of RAM is 12.5% more than 16GB (quick sketch below). How far is a 5080 away from a 60-series card?
What if you turn on FG (native framerate)? OH, that's right...nVIDIA literally HIDES IT FROM YOU FOR EXACTLY THIS REASON.
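
A quick sanity check on that math (a sketch; I'm assuming raster/RT throughput scales roughly linearly with shaders × clock, which is the premise of the comparison):

```python
# Hypothetical configs from above: shader count x core clock (MHz).
# Assumes performance scales ~linearly with shaders x clock (a simplification).
gb203_5080 = 10752 * 2640     # 5080-class part
next_gen_3nm = 9216 * 3780    # speculated 3nm successor

print(f"speedup: {next_gen_3nm / gb203_5080 - 1:.1%}")    # ~22.7% faster
print(f"VRAM:    {18 / 16 - 1:.1%} more (18GB vs 16GB)")  # 12.5%
```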

Again, the 9070 xt / 128-bit cards will then be 1080p cards. No up-scaling (in a situation like this; yes, more demanding situations exist, and hence why higher-end cards exist).

I don't get how other people don't see this. I think it's clear as day.
Upscaling has never been a reasonable thing on a top of the line GPU. If a 5080 isn't suitable for RT at 1440p, like you said, then you just proved my point: we're far from RT being usable in games.
I just do not agree, and it really goes to show that a lot of people have not used RT and/or up-scaling. It's very normal. Asking for 4k RT is absurd (this gen). Pick a game and look at even a 5090.
Also, 1440p->4k up-scaling looks pretty good, even with FSR. Now 960p->1440p will look good too (it has always been okay with DLSS, and now will be with FSR4, I think), which, again, is the point of the 9070 xt.
1440p isn't good enough (in my view, especially with any longevity and/or using more features like FG) on a 5080 because it doesn't have to be...yet. It could be, but it isn't, because there is no competition.
Now, up-scaling is even important for 1080p->4k, which is *literally the point of DLSS4*. Even for a 5080 (because of the situation above).
'5070 is a 4090' is because 1080p up-scaling IQ has improved to the point they think they can compare it (along with adding FG) to a 4090 running native 4k (in raster). That is the point of that.
Up-scaling is super important. On consoles, they have (and continue to use) DRS. This is no different than that, really. Asking for consistent native frames at high res using RT is just not realistic for most budgets.
This is the whooollleee point of why they're improving up-scaling. RT will exist, in some cases in a mandatory way. You will use up-scaling, and you will prefer it looks ok. OR, you will not play those games.
OR, you will spend a fortune on a GPU. Or you will lower other settings (conceivably quite a bit as time goes on). That's just reality.

Look, I get that some people still get hung up on things like "but gddr7" and such. GUYS, a 5080 FE needs 22gbps ram to run at stock (to saturate compute at 2640mhz; bandwidth math below). Do you know why they chose those speeds?
Think of what AMD is putting out, and where that ram clocks. That is what you do (when you can). You put out the slowest thing you can and still win; nothing more. Give nothing away that you can sell as an upgrade.
Especially, as I'm showing you above, when it can be tangible. Save it for next-gen and sell it then. nVIDIA truly could give you 24GB and 3.23ghz clocks. They didn't...but they could.
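
For reference, the bandwidth arithmetic behind that (a sketch; the 22gbps 'needed at stock' figure is my claim above, while the 256-bit buses and rated memory speeds are public specs):

```python
def bandwidth_gb_s(gbps_per_pin: float, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate x bus width / 8 bits per byte."""
    return gbps_per_pin * bus_bits / 8

print(bandwidth_gb_s(22, 256))  # 704 GB/s -- what a stock 5080 'needs' per the claim above
print(bandwidth_gb_s(30, 256))  # 960 GB/s -- what the 5080's GDDR7 is actually rated for
print(bandwidth_gb_s(20, 256))  # 640 GB/s -- a 9070 XT-class GDDR6 setup, for comparison
```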

This will bear out when people overclock 9070 XTs (somehow often landing just short of a stock 5080 FE, or similar to a 5070 Ti OC) and/or when there is a 24gbps card.
Somehow magically similar in many circumstances, especially at 3.47ghz or higher.
Because it's really, honestly, just MATH. It's not opinion. It's MATH. Yes, some units/ways of doing things differ; yes, excess bw helps some (perhaps ~6% at stock in this case?), but so do the extra ROPs on N48.
I don't have the math on the ROPs; I'm sure it differs by resolution. I've never looked into it. But the compute is similar; I don't know how much the TMUs help RT (yet). But the main point remains.
Compute is compute (in which 10752 @ 2640 = 8192 @ 3465; see the sketch below). Bandwidth is bandwidth. Buffer is buffer. It's all solvable. None of this is magic, but they will try to sell you bullshit, which I am not.
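
Here's the 'compute is compute' equivalence worked out (a minimal sketch using the standard peak-FP32 formula, shaders × 2 ops/clock × clock):

```python
def fp32_tflops(shaders: int, mhz: float) -> float:
    """Peak FP32 TFLOPS: each shader retires 2 ops per clock (FMA)."""
    return shaders * 2 * mhz * 1e6 / 1e12

print(fp32_tflops(10752, 2640))  # 5080-class config:          ~56.8 TF
print(fp32_tflops(8192, 3465))   # 9070 XT-class @ 3465mhz OC: ~56.8 TF -- identical
```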


SMH.

Sometimes I just want to dip until Rubin comes out. We'll just see...won't we. Period not question mark.
 
Joined
Jul 24, 2024
Messages
464 (2.00/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
The future of RDNA4 is UDNA, which means AMD will unify its AI-accelerator architecture with its gaming one. AMD will no longer focus on gamers; they will focus more on AI, as Nvidia did. After all, it's the money that matters, and the money is in "AI". Except for consoles. AMD will try to remain present in consoles and APUs. Let's be honest: dGPUs are nothing more than a burden now for both Nvidia and AMD. Since no one gives a damn about Intel's AI accelerators and server CPUs anymore, they might end up as the only dGPU manufacturer. Or they can wrap it up and focus everything on the foundry business.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
44,011 (6.81/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Thanks, I'll have to watch it when I have a moment to absorb it all.

People that haven't been around for as long as some of us don't remember some of the weird crap nVIDIA has been caught doing.

Randomly cut L2, some but not all RAM doubled over a bus, the whole 970 situation, disabling whole clusters instead of separate pieces of them (as sold) that *can* impact performance...etc etc etc etc.

I'm fairly certain the ways nVIDIA cuts costs haven't changed...I'm fairly certain they just got better at hiding it. *This* is what I keep trying to explain. They obfuscate, and when they can't, they lie.

When they get caught in their lies they say "it isn't a big deal" (Huang is very good at this), when it IS a big deal! Their ability to gaslight is incredible. I'm sure Steve has touched upon that at some point.

I'm not on some tirade or trying to fanboy...It just boggles my mind how much they have gotten away with, and now they get away with even more (because many people don't understand how GPUs work).

I'm not perfect in explaining it all, and I don't get *everything* right all the time, but I certainly can see a lot of things they have done and are doing that most don't appear to notice/understand.

Thank goodness for GPU-Z and how it works, or this would've been able to slip by as well.
How many never use GPU-Z, though?
Just as how many don't understand how they segment/limit/obsolete products in a way ridiculously unfair to the consumers?
I try to explain it, but I don't know how to do it while coming across as impartial and getting people to understand. This is why I get frustrated. I'm not cheerleading, but what they do HAS to stop.
And it won't unless people understand all this stuff. How to get it across, I really honestly don't know!

edit: I was writing that as you were (some people don't remember that stuff, and that isn't even all of it). This is true, but it's still connected to the shader cluster AFAIK? Perhaps I am mistaken.
The 1060 3GB vs 6GB: the same exact chip, but shaders were nerfed between the two.

This just really isn't true. Look at a 4090. It will typically upscale most games from 1440p to 4k, which is a very reasonable thing. It will run games like Wukong at 1440p (yes, upscaled).
Yes, other cards are *purposely* positioned so they don't line up well. Like how a 5080 isn't suitable for 1440p (60fps mins). The 9070 xt likely will be for 1080p. Then they may try to say an 'xtx' is good enough for 1440p averages.
Which is exactly what the 5080 does, for a crap-load more money.
Especially when you add features like FG and/or DLSS, which requires ~5-7% more performance (look at W1zzard's DLSS3 vs. 4 benches if you don't believe me on that one, btw); and again, this is by design.

That's kinda what I've been saying! :p

The 9070 xt is likely about this, with a cheaper card than nVIDIA has (which, again, is the point of what nVIDIA does): you can build one towards a spec; you don't have to go high/low, and can stay consistent.
Where the 5070 is a POS (that needs to upscale regardless), or instead you need at least a $750 card to get 1080p 60fps mins.

Again, I know there are different situations; when I say 1080p60 mins, in Wukong that means upscaling. In Spiderman it doesn't. I think Spiderman is the more realistic notion, for many reasons.
Including the fact that they make the PlayStation, and that game will almost certainly be ported with settings from the PC at some ratio similar to a then-available 3nm card on the market with 60fps mins.

In Spider-Man 3, who's to say that won't be the default? It probably WILL BE. That's all I'm trying to say. All of that makes sense, doesn't it?

I think nVIDIA (and AMD) will hit the nail on the head next-gen, with (hopefully) better pricing. I don't think they'll be able to dance around it any further. If you take the clocks/specs I've suggested, this bears out.
Not happening with green as you've seen now since rtx inception.
 
Joined
Sep 7, 2017
Messages
22 (0.01/day)
What I find amazing is how do you disable 8 ROPs in a 16 ROP cluster? Doesn't make any sense.
It does make sense if it's traced to a failure in the die certification process, which would mean Jensen is off by 99.5% in his estimate of the number of chips affected. It will be interesting to watch the number of cards pushed out the door by next Tuesday morning for each vendor.

The future of RDNA4 is UDNA, which means AMD will unify its AI-accelerator architecture with its gaming one. AMD will no longer focus on gamers; they will focus more on AI, as Nvidia did. After all, it's the money that matters, and the money is in "AI". Except for consoles. AMD will try to remain present in consoles and APUs. Let's be honest: dGPUs are nothing more than a burden now for both Nvidia and AMD. Since no one gives a damn about Intel's AI accelerators and server CPUs anymore, they might end up as the only dGPU manufacturer. Or they can wrap it up and focus everything on the foundry business.
1) Last I heard, Intel's Falcon Shores project is dead.

2) Regarding AMD's future path with GPUs: AMD has stated it will be a multi-chiplet design. To many people this evokes an image of two dies that handle two different kinds of workloads: graphics applications, which we associate with Radeon products and gaming, and compute workloads, which we associate with CAD (Computer-Aided Design) and with CDNA. With the advent of chiplets and Infinity Fabric, questions were asked about whether this technology could be applied to game cards. Infinity Fabric, as good as it is, introduces latency issues that disrupt the video.

But what if the problem is attacked a different way? That is, create a GPU with two chiplets that are pseudo-independent of each other. These chiplets function pretty much on their own, but they have the ability to borrow resources from their partner chiplet. In the case of the CDNA chiplet, everything functions straightforwardly until the workload is complete, at which point it is shipped to the rasterization section for final processing. This would eliminate the need for a rasterization section on the CDNA chiplet. Turning to the gaming workload, the data is applied directly to the input of the gaming chiplet, but the gaming chiplet has access to the memory on the CDNA chiplet, so the off-board VRAM we see on Radeon cards could be eliminated. Maybe we could even revisit the issue of using HBM on GPU cards?

The final advantage of this arrangement is the creation of one unified card that could be used for either application and priced for either market. I'm talking about $2999 for the design-engineering model and two variants of the gaming model depending on how much VRAM the user wants: $599 for the low-end model or $999 for the high-end model. Control the whole process using manufacturer-issued key codes to upgrade the card. Use the same process to upgrade a gaming card to a workstation card.
 
Joined
Oct 19, 2022
Messages
375 (0.43/day)
Location
Los Angeles, CA
Processor AMD Ryzen 7 9800X3D (+PBO 5.4GHz)
Motherboard MSI MPG X870E Carbon Wifi
Cooling ARCTIC Liquid Freezer II 280 A-RGB
Memory 2x32GB (64GB) G.Skill Trident Z Royal @ 6200MHz 1:1 (30-38-38-30)
Video Card(s) MSI GeForce RTX 4090 SUPRIM Liquid X
Storage Crucial T705 4TB (PCIe 5.0) w/ Heatsink + Samsung 990 PRO 2TB (PCIe 4.0) w/ Heatsink
Display(s) AORUS FO32U2P 4K QD-OLED 240Hz (DP 2.1 UHBR20 80Gbps)
Case CoolerMaster H500M (Mesh)
Audio Device(s) AKG N90Q w/ AudioQuest DragonFly Red (USB DAC)
Power Supply Seasonic Prime TX-1600 Noctua Edition (1600W 80Plus Titanium) ATX 3.1 & PCIe 5.1
Mouse Logitech G PRO X SUPERLIGHT
Keyboard Razer BlackWidow V3 Pro
Software Windows 10 64-bit
This guy gets it. :laugh:

I think it'll run 1080p fine for a while, until 18GB becomes a standard (just like 16GB is becoming over 12GB now), but I think 1440p is going to require some concessions (especially if you use features like FG, etc).
It's all relative. Some people (like me) will say 1080p, others will say they can make it work fine for 1440p (until more games are built towards 192-bit/18GB configs and/or built toward the PS6).
I'm not going to argue, because as I've said countless times...you can make ANYTHING work and different things are acceptable to different people.
I try to be cautious when explaining things (to the best of my ability/understanding at any given moment), which is why I trend towards worst-case scenarios and minimum 60fps frame rates at high settings.

Yeah look at the 3090, it was the high-end of Ampere (with the Ti) and has 24GB VRAM, but it's already struggling on modern games without DLSS...and it doesn't have FG either. So a 9070 XT with 16GB won't really last 2-3 generations without lowering settings a lot... PC Gaming has become a joke, sadly :(

Which modern games?

I'm currently playing Kingdom Come Deliverance 2 at 1440 UW high with no FSR on a 6750 XT. A 9070 XT will allow me to max it out.

Space Marine 2 is also fine with reduced settings (on my 6750 XT), only that it doesn't look as good.

The last big title I played before that was Alan Wake 2, which ran acceptably as long as I stayed away from RT.

Hellblade 2 is doable with FSR - only that I don't like upscaling.

Avatar FoP runs like shit, I'll give you that.

Anyway, my backlog is huge, I could complete half of them on my 6750 XT no problem. :)


Also price. GDDR7 is expensive, the benefits wouldn't necessarily have made it worth it.

KCD2, Space Marine 2, AW2 and Hellblade II are actually all pretty well optimized games lol. But some newer games (mostly the ones on UE5) run poorly and have a lot of traversal stutters...
 
Joined
Jan 14, 2019
Messages
15,037 (6.68/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Kolink Citadel Mesh
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
KCD2, Space Marine 2, AW2 and Hellblade II are actually all pretty well optimized games lol. But some newer games (mostly the ones on UE5) run poorly and have a lot of traversal stutters...
Well, no one said that my plans can't go south. But I'll keep to it for now, and we'll see. :)
 
Joined
Sep 7, 2017
Messages
22 (0.01/day)
Yeah look at the 3090, it was the high-end of Ampere (with the Ti) and has 24GB VRAM, but it's already struggling on modern games without DLSS...and it doesn't have FG either. So a 9070 XT with 16GB won't really last 2-3 generations without lowering settings a lot... PC Gaming has become a joke, sadly :(
The Game Developers share some responsibility for that development.
 
Joined
Dec 9, 2024
Messages
258 (2.72/day)
Location
Missouri
System Name The
Processor Ryzen 7 5800X
Motherboard ASUS PRIME B550-PLUS AC-HES
Cooling Thermalright Peerless Assassin 120 SE
Memory Silicon Power 32GB (2 x 16GB) DDR4 3200
Video Card(s) RTX 2080S FE | 1060 3GB & 1050Ti 4GB In Storage
Display(s) Gigabyte G27Q (1440p / 170hz DP)
Case SAMA SV01
Power Supply Firehazard in the making
Mouse Corsair Nightsword
Keyboard Steelseries Apex Pro
Yeah look at the 3090, it was the high-end of Ampere (with the Ti) and has 24GB VRAM, but it's already struggling on modern games without DLSS...and it doesn't have FG either. So a 9070 XT with 16GB won't really last 2-3 generations without lowering settings a lot... PC Gaming has become a joke, sadly :(
To be completely fair, that was a flagship product. Don't know what people were expecting. But the rate of generational uplift NVIDIA provides is decreasing; the 40 series (pre-Super) and 50 series are good examples of that, whereas the gen uplift on AMD is inconsistent rather than in noticeable decline.

The next generation of GPUs will probably have either a super big generational uplift akin to what NVIDIA used to pump out, or very little. And that's just the GPU side of things; AAA developers woefully don't optimize their games till after launch 99% of the time, it seems.
 
Joined
Sep 7, 2017
Messages
22 (0.01/day)
To be completely fair, that was a flagship product. Don't know what people were expecting. But the rate of generational uplift NVIDIA provides is decreasing; the 40 series (pre-Super) and 50 series are good examples of that, whereas the gen uplift on AMD is inconsistent rather than in noticeable decline.

The next generation of GPUs will probably have either a super big generational uplift akin to what NVIDIA used to pump out, or very little. And that's just the GPU side of things; AAA developers woefully don't optimize their games till after launch 99% of the time, it seems.
Look at the reports out about the Nvidia 5060. There are reports stating there will be 2 variants: one with 16 GB and a second sporting only 8 GB. Yes, it runs at lower performance numbers, but 8 GB? Really? I think AMD should get at least 2 generations out of the 9070 XT, with 3 generations more likely.

The question is what AMD will do if sales of these new cards remain very strong with no signs of the pace slacking off, posing a genuine threat of exhausting an intended 2-year supply of silicon in 12 to 15 months. What happens then???
 
Joined
Jun 19, 2024
Messages
554 (2.07/day)
System Name XPS, Lenovo and HP Laptops, HP Xeon Mobile Workstation, HP Servers, Dell Desktops
Processor Everything from Turion to 13900kf
Motherboard MSI - they own the OEM market
Cooling Air on laptops, lots of air on servers, AIO on desktops
Memory I think one of the laptops is 2GB, to 64GB on gamer, to 128GB on ZFS Filer
Video Card(s) A pile up to my knee, with a RTX 4090 teetering on top
Storage Rust in the closet, solid state everywhere else
Display(s) Laptop crap, LG UltraGear of various vintages
Case OEM and a 42U rack
Audio Device(s) Headphones
Power Supply Whole home UPS w/Generac Standby Generator
Software ZFS, UniFi Network Application, Entra, AWS IoT Core, Splunk
Benchmark Scores 1.21 GigaBungholioMarks
Joined
Aug 20, 2007
Messages
21,925 (3.42/day)
Location
Olympia, WA
System Name Pioneer
Processor Ryzen 9 9950X
Motherboard MSI MAG X670E Tomahawk Wifi
Cooling Noctua NH-D15 + A whole lotta Sunon, Phanteks and Corsair Maglev blower fans...
Memory 128GB (4x 32GB) G.Skill Flare X5 @ DDR5-4000(Running 1:1:1 w/FCLK)
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 5800X Optane 800GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs, 1x 2TB Seagate Exos 3.5"
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64
The Game Developers share some responsibility for that development.
The big engine developers are far more to blame. We game devs generally only work with the big two and we can't change the starting entry costs much. Not economical to do anything else these days, anyways.
 
Joined
Sep 7, 2017
Messages
22 (0.01/day)
Wut? UDNA is 2027.
That's the time frame I'm familiar with...if it's on time. That's why I say: why not a 9070 series refresh with GDDR7 and a GPU manufactured on a hotter production node (N4X instead of N4P)? I bet we'd get a healthy performance bump out of that, and it would afford UDNA a release buffer if it falls behind.

On a side note, it's Thursday afternoon here on the US West Coast, 6 March, and the social media chatter says the 9070s are posting as sold out at e-tailers. The big retailers are still showing lots of stock.

Here is a video that might be of interest to this discussion. The video is a moldy oldie, but it talks about some interesting stuff.

Advance to the 6:15 mark. https://youtu.be/NKHNbRUukN8?si=6_-2LHi9w0EM64oc
 
Joined
Mar 7, 2023
Messages
1,036 (1.40/day)
Processor 14700KF/12100
Motherboard Gigabyte B760 Aorus Elite Ax DDR5
Cooling ARCTIC Liquid Freezer II 240 + P12 Max Fans
Memory 32GB Kingston Fury Beast @ 6000
Video Card(s) Asus Tuf 4090 24GB
Storage 4TB sn850x, 2TB sn850x, 2TB Netac Nv7000 + 2TB p5 plus, 4TB MX500 * 2 = 18TB. Plus dvd burner.
Display(s) Dell 23.5" 1440P IPS panel
Case Lian Li LANCOOL II MESH Performance Mid-Tower
Audio Device(s) Logitech Z623
Power Supply Gigabyte ud850gm pg5
Keyboard msi gk30
Nah we're not allowed to have high end cards anymore.

AMD is trying to focus on its strengths: better value lower down the stack. I don't think it's a bad strategy per se, though there is something to be said for halo cards creating more mindshare.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
44,011 (6.81/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
AMD is trying to focus on its strengths: better value lower down the stack. I don't think it's a bad strategy per se, though there is something to be said for halo cards creating more mindshare.
Hivemind
 
Joined
Dec 25, 2020
Messages
7,896 (5.13/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Sony MDR-V7 connected through Apple USB-C
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
Halo cards push the envelope and set a standard of performance and features. A GPU like the RTX 5090 represents the state of the art, the very best in graphics processing today, and the stuff that will come to pass in the coming years. A performance-segment model is just a "taster" of that. The problem is that as GPU complexity rose, so did cost. You could buy both brands' flashiest full-chip models in the $600 range 15 years ago; through worst-case-scenario inflation, that's $880 USD in today's money. A 5090 will set you back 2K at the absolute minimum, realistically speaking 3 to 4K depending on where you live and how bad the taxes are. Minimum.

So that status quo is changing slowly, especially since games no longer follow hardware. Production costs are sky high, and only increase as higher audiovisual fidelity and general complexity are expected due to the introduction of more advanced console hardware and even better PC hardware. So, the result is that most games will look the same whether you have an RTX 5090 or a now 5-year-old 6700 XT (which is an incredibly lame graphics card compared to the former), the difference being how fast you can run them. And that's really bad for business, considering what GPUs are costing these days.
 
Joined
May 13, 2008
Messages
1,063 (0.17/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
AMD is trying to focus on its strengths: better value lower down the stack. I don't think it's a bad strategy per se, though there is something to be said for halo cards creating more mindshare.

See, you think halo? I think >100TF. Actually, I think literally twice whatever 1080p60 (9070 xt) is. Which is conveniently slightly over 100TF for one company.
Absolutely not for marketing purposes, and absolutely not to sell a part that could potentially do 100TF but not quite that thing, which is ~105TF (the difference between DLSS3/4).

Given that's ~3200mhz (~3050mhz RT?) on a 9070 xt...~12288sp @ ~4300mhz raster (~4100mhz RT if raster is still a 1.05x ratio; but could be 1:1)? A little lower, I'm rounding; the sketch below works out the exact numbers.
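
Running those speculative configs through the same peak-FP32 formula (all shader counts and clocks here are my guesses from above, not announced specs):

```python
def fp32_tflops(shaders: int, mhz: float) -> float:
    """Peak FP32 TFLOPS: shaders x 2 ops/clock x clock."""
    return shaders * 2 * mhz * 1e6 / 1e12

print(fp32_tflops(8192, 3200))   # 9070 xt at ~3.2ghz:           ~52.4 TF (the 1080p60 tier)
print(fp32_tflops(12288, 4300))  # speculated 12288sp @ 4300mhz: ~105.7 TF (~2x, the ">100TF")
print(fp32_tflops(18432, 3780))  # speculated 18432sp halo:      ~139.3 TF
```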

Hmm...HDL on 3nm is ~3780-4050mhz...but HPL is 4050-4510. AMD went HPL for both N31/N48 (up to ~3.2[3]ghz, ~3460-3700), nVIDIA HDL for 5nm/4nm (~2.93, up to ~3.4ghz). Wonder what they will do on 3nm?

Wanna hear the funny part? That would be 12288sp/256-bit (w/40gbps memory; slight oc) on 3nm. What do you think nVIDIA will do?
Remember, nVIDIA's arch (current cache structure) would be equalized at 3780mhz/36gbps. nVIDIA likes to have its choice of suppliers to leverage against each other, and Micron only makes up to 36gbps...which (if like their GDDR5/6X) likely can't clock to 40gbps. nVIDIA's RT is similar perf to AMD's at the comparative raster clock, so...it would need >~40gbps.
Think how incredibly fucking convenient that could be, a couple different ways. Genius, right? It really sort of is. Except for all the wrong reasons. I don't know if they are doing this, but this is what they do. Once you notice it, you have to admire it...because it truly is using every aspect of the market to fuck over consumers however humanly possible.


Just a thing to think about for next-gen. Surely nVIDIA wouldn't make cards limited to right below that threshold on the SKU most people could conceivably afford, in order to upsell the halo, right? RIGHT?

RIGHT?!

If it can do it, they surely won't update DLSS (or something else) to make sure it can't, right? RIGHT? (*Man pointing to temple meme*)

Potentially the Halo could be 18432sp. Yeah....I don't need that; either side.

I'm good with RT, I'll buy PT when that becomes 'mainstream' in exactly never (ok, not never...but in a very long time; PS7?).

I don't even care if it's performant on a halo, personally, but somebody has to build the ecosystem. This will always be true.

Have fun with that, y'all. I'm sure nVIDIA will outdate every card at every resolution as often as possible every step of the way for the next 8-10 years.

...And the RT (now PT) argument starts alllllllllll overrrrrrrrrrr agaiiiiiiiiiiiiiinnnnnnnnnnn.

Halo cards push the envelope and set a standard of performance and features. A GPU like the RTX 5090 represents the state of the art, the very best in graphics processing today, and the stuff that will come to pass in the coming years. A performance-segment model is just a "taster" of that. The problem is that as GPU complexity rose, so did cost. You could buy both brands' flashiest full-chip models in the $600 range 15 years ago; through worst-case-scenario inflation, that's $880 USD in today's money. A 5090 will set you back 2K at the absolute minimum, realistically speaking 3 to 4K depending on where you live and how bad the taxes are. Minimum.

So that status quo is changing slowly, especially since games no longer follow hardware. Production costs are sky high, and only increase as higher audiovisual fidelity and general complexity are expected due to the introduction of more advanced console hardware and even better PC hardware. So, the result is that most games will look the same whether you have an RTX 5090 or a now 5-year-old 6700 XT (which is an incredibly lame graphics card compared to the former), the difference being how fast you can run them. And that's really bad for business, considering what GPUs are costing these days.
The 5090 is a stop-gap for people with too much money. I am ofc being sarcastic, and I respect it if people can afford one, but it still is not a tier. By tier, I mean games following the hardware.

Vis-a-vis, you are wrong.

I will always think buying things not on a tier is not very intelligent (for most people; some have reasons). It's cool if you can afford the best, and cool they can make it, but for most people they make zero sense.

A 4090-class/256-bit 3nm card is a tier: 1440p/4k up-scale. Going from 1440p to 4k is a very obvious amount that will clearly be the next real halo (notice how it's ~18432sp/~3780mhz if you scale it?).

This is with RT, which will become standard/a requirement in some, perhaps many titles; a feature, not running it faster.

Vis-a-vis, you are wrong (and I'm not saying I like it either, but it is how these businesses and industries survive).

I will buy the best version of the former. So will many, I think, given 1440p->4k up-scaling quality and the cost/perf of doing 4k native...which, for almost all scenarios, just doesn't make any sense.

Some will buy lower, like a 9070 xt that will do 1080p native. Some will buy a tier in-between (for up-scaling 1080p to 4k mostly; some 1440p native, I think?). I haven't looked at/tested every combination.

And no, you don't have to abide by my guidelines; run any combination of settings that works for your budget. I simply think most people like this standardized experience, and it's how good cards are generally crafted.
 
Joined
Dec 25, 2020
Messages
7,896 (5.13/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Sony MDR-V7 connected through Apple USB-C
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
The 5090 is a stop-gap for people with too much money. I am ofc being sarcastic, and I respect it if people can afford one, but it still is not a tier. By tier, I mean games following the hardware.

Vis-a-vis, you are wrong.

I will always think buying things not on a tier is not very intelligent. It's cool if you can afford the best, and cool they can make it, but for most people they make zero sense.

A 4090-class/256-bit 3nm card is a tier: 1440p/4k up-scale. Going from 1440p to 4k is a very obvious amount that will clearly be the next real halo.

This is with RT, which will become standard/a requirement in some, perhaps many titles; a feature, not running it faster.

Vis-a-vis, you are wrong.

I will buy the best version of the former. So will many, I think, given 1440p->4k up-scaling quality and the cost/perf of doing 4k native...which, for almost all scenarios, just doesn't make any sense.

Since there is no graphics card faster, nor more feature-complete, than the RTX 5090, I'm right by default on the statement you've outlined. It wins in raster, it wins in RT, it wins in encoding, it has more memory, it has faster memory, it's... simply unequaled and unmatched by anything on the market right now. So, yes, it is in a tier all its own, and it is the only option for people who want "the very best."



Something better is always around the corner, although, if there is one valuable lesson for vendors to learn from this launch season... I think they need a breather. Not release new hardware for some time. Just eliminate design kinks and improve the software to the very best it can be.
 
Joined
May 13, 2008
Messages
1,063 (0.17/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
Since there is no graphics card faster, nor more feature-complete, than the RTX 5090, I'm right by default on the statement you've outlined. It wins in raster, it wins in RT, it wins in encoding, it has more memory, it has faster memory, it's... simply unequaled and unmatched by anything on the market right now. So, yes, it is in a tier all its own, and it is the only option for people who want "the very best."


Something better is always around the corner, although, if there is one valuable lesson for vendors to learn from this launch season... I think they need a breather. Not release new hardware for some time. Just eliminate design kinks and improve the software to the very best it can be.

You simply do not understand. You kind of sound like the iPhone meme.

You're simply wrong. A fast card can still be a bad card. Some people understand this, some do not.

A 5080 is a fast card, but it cannot maintain a high-end 1440p RT experience at 60fps. You can run it, but it will not maintain the industry standard and how many games are optimally intended to be experienced.

Will they replace it with one? Yes. Will they try to sell it to you as many ways as possible before they give you that card? Probably...if you're nVIDIA. AMD will likely do it once. Then make it again, cheaper.

Like they did with the 9070 xt for 1080p. But they will do it also for 1440p. And 4k. The 5090 is in-between...and you will need to upgrade your 5080/5090 for 1440p or 4k to maintain a similar experience.

Do you understand?

This is not a war of brands, friend. This I don't believe you understand. It's about common-sense, practicality, and longevity. Cards on tiers age gracefully down to the next tier.
(nVIDIA tries to make this not happen, but I think you don't understand and I am done explaining it).

When nVIDIA makes a good card (like the 4090), I champion it. It helped push the industry forward, without a doubt. So does the 9070 xt (for its price and the AMD ecosystem). The 5090 is not that.

The 5090 is a trinket; a novelty of what can be done, but not good enough to be the next 4090. It is fast, without doubt. It does *not* move the industry forward. That claim is incorrect.
 
Joined
Dec 25, 2020
Messages
7,896 (5.13/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Sony MDR-V7 connected through Apple USB-C
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
You simply do not understand. You kind of sound like the iPhone meme.

You're simply wrong. A fast card can still be a bad card. Some people understand this, some do not.

A 5080 is a fast card, but it cannot maintain a high-end 1440p RT experience at 60fps. You can run it, but it will not maintain the industry standard and how many games are optimally intended to be experienced.

Will they replace it with one? Yes. Will they try to sell it to you as many ways as possible before they give you that card? Probably...if you're nVIDIA. AMD will likely do it once. Then make it again, cheaper.

Like they did with the 9070 xt for 1080p. But they will do it also for 1440p. And 4k. The 5090 is in-between...and you will need to upgrade your 5080/5090 for 1440p or 4k to maintain a similar experience.

Do you understand?

This is not a war of brands, friend. This I don't believe you understand. It's about common-sense, practicality, and longevity. Cards on tiers age gracefully down to the next tier.
(nVIDIA tries to make this not happen, but I think you don't understand and I am done explaining it).

When nVIDIA makes a good card (like the 4090), I champion it. It helped push the industry forward, without a doubt. So does the 9070 xt (for its price and the AMD ecosystem). The 5090 is not that.

The 5090 is a trinket; a novelty of what can be done. It is fast, without doubt. It does *not* move the industry forward. That claim is incorrect.

I get the feeling there's a language barrier going on here. You don't seem to be getting what I'm saying. I'm not saying it's problem-free. The launch has certainly been botched, and many aspects of it have been less than ideal, ranging from supply to early driver bugs.

But to call Blackwell a bad architecture because of these issues is a huge stretch. If you say RDNA 4 is good while calling Blackwell crap, it's very much a war of brands, and you've shown which side you've picked. It's an Ada refresh. It's painfully obvious this is to Ada what TeraScale 3 (Cayman) was to TeraScale 2 (Evergreen). It's not supposed to be a "GCN" (a eureka moment). Yet it introduced many firsts and brought quite a few engineering improvements nonetheless. Neither was RDNA 4, for that matter; it wasn't designed to be the next big thing for AMD, especially since it's unable to overcome its own predecessor, and its predecessor's competitor, in raw performance figures. Yet it would be both absurd and extremely unfair to call it "worthless trash" because its highest-spec product can't lay a smackdown on a 3-year-old RTX 4080.

The practicality argument doesn't fly when it's a product that's quite literally designed to be "extreme", halo products have always been less than practical and aren't really made for John Doe, but rather for people who know what they are buying. And you're forgetting the RTX 5090 literally has the most generous and extreme specs of any GPU right now. It's got twice as much computing power, twice as much memory, twice as much bandwidth, twice as much everything. Short of veering into conspiracy theories about drivers designed to intentionally reduce a product's performance level, it is also easily the GPU that will hold its own the very best against the next generation, whenever that comes.

Sure, you're well within your right to think it's a ludicrous product. No one would blame you for it, after all, a 600W behemoth isn't for everyone regardless of how powerful it may be. But it's not a poor product in the slightest.
 
Joined
Jan 14, 2019
Messages
15,037 (6.68/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Kolink Citadel Mesh
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
Halo cards push the envelope and set a standard of performance and features. A GPU like the RTX 5090 represents the state of the art, the very best in graphics processing today, and the stuff that will come to happen in the coming years.
No, the 5090 isn't the best you can have. It's the most you can have. Am I the only one who sees a big gaping canyon between the meaning of these two words?

Just like the Bugatti Chiron isn't the best car money can buy. It's a shit commuter, it's unusable off road, there's no space for your shopping in it, etc. Sure, it's the fastest car available, but that alone doesn't make it the best.

You simply do not understand. You kind of sound like the iPhone meme.

You're simply wrong. A fast card can still be a bad card. Some people understand this, some do not.

A 5080 is a fast card, but it cannot maintain a high-end 1440p RT experience at 60fps. You can run it, but it will not maintain the industry standard and how many games are optimally intended to be experienced.

Will they replace it with one? Yes. Will they try to sell it to you as many ways as possible before they give you that card? Probably...if you're nVIDIA. AMD will likely do it once. Then make it again, cheaper.

Like they did with the 9070 xt for 1080p. But they will do it also for 1440p. And 4k. The 5090 is in-between...and you will need to upgrade your 5080/5090 for 1440p or 4k to maintain a similar experience.

Do you understand?

This is not a war of brands, friend. This I don't believe you understand. It's about common-sense, practicality, and longevity. Cards on tiers age gracefully down to the next tier.
(nVIDIA tries to make this not happen, but I think you don't understand and I am done explaining it).

When nVIDIA makes a good card (like the 4090), I champion it. It helped push the industry forward, without a doubt. So does the 9070 xt (for its price and the AMD ecosystem). The 5090 is not that.

The 5090 is a trinket; a novelty of what can be done. It is fast, without doubt. It does *not* move the industry forward. That claim is incorrect.
Ah, I see you made my point in a different way, thanks.
 
Joined
Dec 25, 2020
Messages
7,896 (5.13/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Sony MDR-V7 connected through Apple USB-C
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
No, the 5090 isn't the best you can have. It's the most you can have. Am I the only one who sees a big gaping canyon between the meaning of these two words?

Just like the Bugatti Chiron isn't the best car money can buy. It's a shit commuter, it's unusable off road, there's no space for your shopping in it, etc. Sure, it's the fastest car available, but that alone doesn't make it the best.


Ah, I see you made my point in a different way, thanks.

Which comes down to semantics: if it's the most you can have due to whatever conditions (market, technology limitations, the vendor's choice not to release anything better, etc.), it's also the best you can have. I'm sure they have better products in their labs, or in business segments. Same goes for AMD; I'm fairly sure they must have at least a few "9090 XTX" engineering samples in their labs that never got anywhere for whatever reason.

The Bugatti analogy doesn't apply to a card like a 5090 because, well, you can read the review. It'll do everything that any other GPU below it can do, and also what the others can't. But we've gone off topic here ;)
 
Joined
Jan 14, 2019
Messages
15,037 (6.68/day)
Location
Midlands, UK
System Name My second and third PCs are Intel + Nvidia
Processor AMD Ryzen 7 7800X3D
Motherboard MSi Pro B650M-A Wifi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance EXPO DDR5-6000 CL36
Video Card(s) PowerColor Reaper Radeon RX 9070 XT
Storage 2 TB Corsair MP600 GS, 4 TB Seagate Barracuda
Display(s) Dell S3422DWG 34" 1440 UW 144 Hz
Case Kolink Citadel Mesh
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply 750 W Seasonic Prime GX
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE Plasma
Which comes down to semantics: if it's the most you can have due to whatever conditions (market, technology limitations, the vendor's choice not to release anything better, etc.), it's also the best you can have. I'm sure they have better products in their labs, or in business segments. Same goes for AMD; I'm fairly sure they must have at least a few "9090 XTX" engineering samples in their labs that never got anywhere for whatever reason.

The Bugatti analogy doesn't apply to a card like a 5090 because, well, you can read the review. It'll do everything that any other GPU below it can do, and also what the others can't. But we've gone off topic here ;)
Tell that "it's the best" to a 1080p gamer with a 600 W power supply looking for a $400 card. No, it's not the best. The term "best" implies that it's the best suited for doing the job, which depends on a lot of things. See, this is the Nvidia marketing right there. Making people believe that their product is the best no matter what, which is dumb.

On the AMD analogy, which of their products is the best? The 9070 XT which has better RT and FSR 4, or the 7900 XTX which is faster and has more VRAM? (The correct answer, imo, is, as almost always: it depends)
 
Joined
Dec 25, 2020
Messages
7,896 (5.13/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Sony MDR-V7 connected through Apple USB-C
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
Tell that "it's the best" to a 1080p gamer with a 600 W power supply looking for a $400 card. No, it's not the best. The term "best" implies that it's the best suited for doing the job, which depends on a lot of things. See, this is the Nvidia marketing right there. Making people believe that their product is the best no matter what, which is dumb.

On the AMD analogy, which of their products is the best? The 9070 XT which has better RT and FSR 4, or the 7900 XTX which is faster and has more VRAM?

I mean, of course. But that's also a matter of common sense, a 1080p gamer with a 600W power supply doesn't have the budget or the need for a product of that price. I'll go a step beyond... that's probably the kind of gamer that the RX 9060 and 5060 "vanilla" versions are gonna target. That, indeed, would be the equivalent of installing a Ferrari engine onto a VW bug.

If your computer is of a higher-end variety and can support a 7900 XTX, that is probably still what you should buy. At least for now, although I expect many of the RX 9070 XT's improvements to be relevant enough to make it more desirable; for example, the new Radeon Image Sharpening 2.0 feature they announced last week will not be available on RDNA 3, and the driver release notes state it is exclusive to RDNA 4. I know the technical reason why, although I'm not sure this has been divulged publicly anywhere just yet, so I'll write you a rain check on that one.
 
Joined
May 13, 2008
Messages
1,063 (0.17/day)
System Name HTPC whhaaaat?
Processor 2600k @ 4500mhz
Motherboard Asus Maximus IV gene-z gen3
Cooling Noctua NH-C14
Memory Gskill Ripjaw 2x4gb
Video Card(s) EVGA 1080 FTW @ 2037/11016
Storage 2x512GB MX100/1x Agility 3 128gb ssds, Seagate 3TB HDD
Display(s) Vizio P 65'' 4k tv
Case Lian Li pc-c50b
Audio Device(s) Denon 3311
Power Supply Corsair 620HX
Which goes down into semantics, if it's the most you can have due to whatever conditions (market, technology limitations, choice of the vendor not to release anything better, etc.) it's also the best you can have. I'm sure that they have better products in their labs, or business segments. Same goes to AMD, I'm fairly sure they must have at least a few "9090 XTX" engineering samples in their labs that never got anywhere for whatever reason.

The Bugatti allegory won't apply to a card like a 5090 because, well, you can read the review. It'll do everything that any other GPU below it can do, and also what the others can't. But we've gone off topic here ;)

Ok, so if you had a 5090, would you be happy running native 4k with dips into the 40s FPS? This is what I'm asking. For $3000 or whatever, would you be happy with that? Because that is the *purpose* of this card.
A 4090 will do pretty much everything below that at 60fps minimums (with an OC). You'll have to make concessions as 256-bit cards get faster and games are built toward them, but acceptable concessions.

'Cause I'll be happy with one that can upscale 1440p->4k and keep 60fps for $1000. I'd prefer it was about $3.50, but it probably will be ~$750-$1000.

DF gets pissed when the Cyberpunk marketplace dips to 59fps (I'm joking, but you know, only kind of).

I know different people have different levels of what's acceptable, I know you can tune settings; maybe you want to be the guy running 4x frame gen on a 15fps native 4k screen that has input lag up the ass.

It's your call. I just try to think about people keeping a very smooth experience w/o hiccups and that you can depend on. If a new tier of games comes out, maybe you have to drop one tier on some cards.

This is why I keep calling 9070 xt a 1080p card. Because that is the future it will live in. You can run higher, but it will for a very long time at least do that thing even in a very nice looking game.

Maybe it goes from 1440p native to 960p->1440p upscaled in some instances, etc, but in those situations (where tiers descend) it usually is very much something like that; still on those tiers.

Hence, imo, you want to be on one of those tiers, or you will always be below them. This is especially true when things like minimum frame-buffer requirements change, as happens when new consoles ship.

Yes, the 5090 has 32GB of ram; perhaps like a console (and it could absorb any instance where it has to match the shared memory subsystem with its own framebuffer), but in that instance the compute will not scale.

It truly is not about being fast. It's about making sense. A card can be fast and make sense. It can be fast, make sense, and be expensive. 5090...well...

...Hold on, gotta heat up some Meatloaf. BRB.
 