
Next Gen GPUs will be even more expensive

las

Joined
Nov 14, 2012
Messages
1,693 (0.38/day)
System Name Meh
Processor 7800X3D
Motherboard MSI X670E Tomahawk
Cooling Thermalright Phantom Spirit
Memory 32GB G.Skill @ 6000/CL30
Video Card(s) Gainward RTX 4090 Phantom / Undervolt + OC
Storage Samsung 990 Pro 2TB + WD SN850X 1TB + 64TB NAS/Server
Display(s) 27" 1440p IPS @ 360 Hz + 32" 4K/UHD QD-OLED @ 240 Hz + 77" 4K/UHD QD-OLED @ 144 Hz VRR
Case Fractal Design North XL
Audio Device(s) FiiO DAC
Power Supply Corsair RM1000x / Native 12VHPWR
Mouse Logitech G Pro Wireless Superlight + Razer Deathadder V3 Pro
Keyboard Corsair K60 Pro / MX Low Profile Speed
Software Windows 10 Pro x64
This isn't a measure of power efficiency. What it is, is that you now have a 200 W GPU achieving roughly what a 320 W GPU of the previous generation did.
Not really. The 4070 performs like the 3080 overall in raster, but it wins in most new games, has 2 GB more VRAM, and offers Frame Gen plus DLSS 3 support on top, which lets it massively beat the 3080/6800 XT from last gen at just 200 watts, meaning cool and quiet operation, down from 350-400 watts. The Radeon 6800/6900 series also had massive power spikes, which were fixed with the 7000 series. AMD still struggles with multi-monitor and video-playback power usage, drawing around 2-3 times as much as Nvidia.

Also, the 4070 SUPER exists at $599 MSRP, where the 3080 was $699 MSRP but closer to $1,000+ in reality because of the GPU mining craze.

The best way to compare GPUs is, and will always be, performance per watt. Ada has far better performance per watt than Ampere and the Radeon 6000 series, and beats Radeon 7000 too, with full support for DLSS 3 and FG.
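To put rough numbers on that (a sketch only; the FPS and wattage figures are illustrative assumptions based on the rough parity discussed above, not measured benchmark data):

```python
# Rough perf-per-watt comparison. The FPS and power figures are
# illustrative assumptions (similar raster performance, typical board
# power), not measured benchmark data.
cards = {
    "RTX 3080 (Ampere)": {"avg_fps": 100, "watts": 320},
    "RTX 4070 (Ada)":    {"avg_fps": 100, "watts": 200},
}

for name, c in cards.items():
    print(f"{name}: {c['avg_fps'] / c['watts']:.3f} FPS per watt")

# At equal FPS, 320 W vs 200 W works out to a 320/200 = 1.6x
# perf/watt advantage for the newer card.
```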

AMD probably won't be able to deliver 4090 performance until 2026+ with RDNA5; that's how far behind AMD is. I've had my 4090 since Sep 2022... bought it for 1,500 dollars, sold my 3090 for 1,000 dollars, and used a temporary 6800 XT for a few months, which is why I know exactly how far behind AMD is on drivers/features. A wonky experience, with drivers missing for new games at release, is the norm for AMD.

AMD is cheaper for a reason, and still doesn't sell.

When you factor in the much lower resale value and higher power usage (idle, video, multi-monitor + gaming), it is simply not worth it for 90% of people to even consider an AMD GPU, and AMD is at around 10% dGPU market share now. Let's see if RDNA4 will help them regain some. They will need very aggressive pricing to do that, and FSR needs to be improved.

They will even go back to monolithic with RDNA4, showing that MCM failed with the 7000 series.
 
Joined
May 10, 2023
Messages
352 (0.59/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
And this is also why the 5090 is rumoured to get up to 25,000 cores, which is roughly 50% more than the 4090

The 5090 is probably going to be a beast, maybe MCM, maybe not; however, power draw will be high and the price will be too.
5090 with 25k cores? You wish. That's the high-end GB202, which won't appear in the consumer market, similar to how the top AD102 never did.

The more likely scenario is something around 20k cores, which is about a 25% jump from the 4090, coupled with a probably somewhat larger memory bus and much faster GDDR7.
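For reference, the arithmetic behind those figures, taking the 4090's 16,384 CUDA cores as the known baseline (the Blackwell counts are rumours from this thread, not confirmed specs):

```python
# Core-count arithmetic. 16,384 is the 4090's known CUDA core count;
# the Blackwell figures are rumours, not confirmed specifications.
cores_4090 = 16_384

rumoured_full_gb202 = 25_000   # the "up to 25k" rumour
rumoured_5090 = 20_480         # ~25% more than the 4090

print(f"25k cores:   +{rumoured_full_gb202 / cores_4090 - 1:.0%} vs 4090")  # ~+53%
print(f"20.5k cores: +{rumoured_5090 / cores_4090 - 1:.0%} vs 4090")        # +25%
```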

I also doubt it'll be an MCM design.
Nvidia changed almost nothing in the architecture with Ada, and if this continues with Blackwell, then all advancements (or the lack of them) will likely go hand in hand with node shrinks.
They did change some pretty nice stuff... for machine learning.
Same goes for Blackwell. Nvidia's improvements have mostly been focused on where the money is.
Nvidia's focus is on AI and enterprise, and they saw with the 3000 series that they could easily beat AMD in the gaming market using a cheap process
That's a great example, since GA100 was on TSMC 7nm, while the rest of the chips were on Samsung's crappy node.

it is simply not worth it for 90% of people to even consider an AMD GPU, and AMD is at around 10% dGPU market share now. Let's see if RDNA4 will help them regain some.
AMD is not even trying that hard, as others already said. Their main focus is on CPUs, and the remaining GPU work is meant for either mobile (like laptops) or their enterprise stuff.
 
Joined
Sep 3, 2019
Messages
3,578 (1.85/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 200W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2605), upgraded from Win10 to Win11 on Jan 2024
AMD is not even trying that hard, as others already said. Their main focus is on CPUs, and the remaining GPU work is meant for either mobile (like laptops) or their enterprise stuff.
Has anyone heard rumors of AMD switching (yet again) to UDNA (RDNA+CDNA) after a couple of series? (Probably after RDNA5)
If it's true... jeez, make up your minds!!
 
Joined
Sep 10, 2018
Messages
6,965 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Has anyone heard rumors of AMD switching (yet again) to UDNA (RDNA+CDNA) after a couple of series? (Probably after RDNA5)
If it's true... jeez, make up your minds!!

I think people are tired of them flip-flopping every couple of generations. Unlike the Ryzen/Epyc side, their GPU division doesn't seem to know what it's doing. Hopefully this unification helps, but it didn't seem to lead to better products during the GCN/Vega generations.

I'm honestly reading it as: we don't care about the desktop GPU market and would rather just focus on the AI market, cutting down whatever doesn't sell for the desktop market.

Hopefully I'm wrong, but I'm not holding my breath that they'll get their shit together.
 
Joined
Jan 20, 2019
Messages
1,592 (0.74/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
Has anyone heard rumors of AMD switching (yet again) to UDNA (RDNA+CDNA) after a couple of series? (Probably after RDNA5)
If it's true... jeez, make up your minds!!

If Nvidia is successfully managing a single architecture which serves both gaming and DC/AI, and is reaping the rewards in cost efficiency, streamlined software, fewer development headaches, etc., it does seem the more sensible long-term strategy. Dropping specialized architectures will have trade-offs, but perhaps AMD needs an all-in-one solution for a more efficient ecosystem to compete more capably with Nvidia at the highest level. As the saying goes: "if you can't beat 'em, join 'em".
 
Joined
Aug 23, 2017
Messages
113 (0.04/day)
System Name DELL 3630
Processor I7 8700K
Memory 32 gig
Video Card(s) 4070
Placing lighting, RT or otherwise, isn't as simple as just placing the light. You still have to decide the light type (spot, directional, point, etc.), light color, and the many other light properties. Certain lights will disperse photons in a specific direction with a very sharp cutoff, while others will disperse them around the lighting fixture. More realistic light propagation simulation à la RT doesn't change these requirements. You are still going to want to adjust these parameters regardless of whether you are using RT or rasterized lighting. In addition, some devs may want to bake lighting when fully dynamic lighting isn't required, and other times they will need to use different types of reflections (aside from SSR, UE5 supports cheaper reflection captures) because having a ton of RT reflections isn't remotely feasible. Devs are going to use a gamut of reflection types according to their requirements, including RT, SSR, and reflection captures (planar, box, etc.).

You may have misunderstood what they said: implementing RT lighting over rasterized lighting might be easy because you've already done most of the work setting the correct lighting properties, but implementing RT lighting does not absolve you of doing that work in the first place. In addition, you may still have to go through and adjust lighting parameters for ideal visibility under RT. I've noticed in many RT-enabled games the devs don't do this, and often text or objects that are supposed to be visible are harder to see. If the dev you talked to really did think that you can simply place a generic light source throughout the entire game without editing its properties, then I very much doubt they were much of a dev to begin with.
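To make that concrete, here's a hypothetical, engine-agnostic sketch (all names invented for the example, not any real engine's API) of the properties a dev still has to author per light, RT or not:

```python
# Hypothetical, engine-agnostic light description; every field must be
# authored by the artist whether the renderer resolves the light with RT
# or with rasterized/baked techniques. Names are invented for illustration.
from dataclasses import dataclass
from enum import Enum, auto

class LightType(Enum):
    POINT = auto()        # disperses photons around the fixture
    SPOT = auto()         # specific direction with a sharp cutoff cone
    DIRECTIONAL = auto()  # parallel rays, e.g. sunlight

@dataclass
class Light:
    type: LightType
    color: tuple                  # linear RGB, e.g. (1.0, 0.9, 0.8)
    intensity: float              # brightness in whatever units the engine uses
    cone_angle_deg: float = 45.0  # only meaningful for SPOT lights
    baked: bool = False           # bake when fully dynamic lighting isn't needed

# Turning on RT changes how this light is resolved, not what must be set:
desk_lamp = Light(LightType.SPOT, color=(1.0, 0.9, 0.8),
                  intensity=1700.0, cone_angle_deg=30.0)
```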



Star Wars Outlaws having RT on by default doesn't mean the game doesn't use rasterization (it does, hence why you get OK performance on cards like the 1080 Ti). It also uses hybrid RT, so a portion of the work is done on traditional shader cores.

I'm sure we'll move to RT-only at some point, but people have been saying "once mid-tier hardware is capable of it" since the 2000 series (heck, there were even some bold individuals claiming mid-tier 2000 series cards were already there), and here we are: mid-tier GPUs now costing $700 and requiring upscaling just to run modern games with RT enabled at a decent frame rate. The progress has been less than impressive.

It's not even the first time I've heard someone say "once mid-tier hardware is capable of RT we'll get RT-only games". That was 6 years ago now and we are still waiting. I'd MUCH rather have ray-traced sound: much less resource-intensive and extremely beneficial. Even better, dynamically generate sounds based on how sound propagation works. Every pot should make a different sound based on the weapon hitting it, where it hits it, etc. Literally anything other than graphics for once. Decent AI, for example, sure would be nice. Modern AAA games are as wide as an ocean and as shallow as a puddle.
Outlaws uses software RT at all times. There is no crappy ambient occlusion in this game.
 
Joined
May 10, 2023
Messages
352 (0.59/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
I'm honestly reading it as: we don't care about the desktop GPU market and would rather just focus on the AI market, cutting down whatever doesn't sell for the desktop market.
The current ISA split is awful, while the actual feature difference is minimal.
Focusing on a single ISA is a much better approach; just look at Nvidia.

That doesn't mean that the halo enterprise products need to have the exact same hardware as the other chips; just see how the x100 chips are wildly different from the rest of the stack, yet the software stack is still a single piece. The CUDA code that I write for my 3090 will work just fine on an A100/H100, something that's not possible with AMD's current offerings.
Heck, apart from their MI200/250/300 chips, the rest of their stack has awful ROCm support.
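To illustrate the single-stack point, a minimal sketch using CuPy (a real Python layer over CUDA); the workload itself is arbitrary, the point is that nothing in it is chip-specific:

```python
# Minimal sketch of the "one software stack" point: this code is
# identical whether the NVIDIA GPU underneath is a consumer 3090 or a
# datacenter A100/H100; CUDA hides the very different silicon.
import cupy as cp

a = cp.random.rand(4096, 4096, dtype=cp.float32)
b = cp.random.rand(4096, 4096, dtype=cp.float32)
c = a @ b                  # same call on any CUDA-capable GPU
print(float(c.sum()))      # no per-chip code path required
```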

But yeah, this move is totally meant to make enterprise stuff easier. Their CDNA products are also quite few, so it's an entire software stack for just 1 or 2 SKUs, while all the other RDNA SKUs have shit support and people who don't have an MI300 can't properly validate their code.
If Nvidia is successfully managing a single architecture which serves both gaming and DC/AI, and is reaping the rewards in cost efficiency, streamlined software, fewer development headaches, etc., it does seem the more sensible long-term strategy. Dropping specialized architectures will have trade-offs, but perhaps AMD needs an all-in-one solution for a more efficient ecosystem to compete more capably with Nvidia at the highest level. As the saying goes: "if you can't beat 'em, join 'em".
Adding to this, and as I said above: while Nvidia has a single high-level arch, the actual chips can be wildly different, but the software stack remains the same, something that is really lacking on AMD's side.
 
Joined
Sep 3, 2019
Messages
3,578 (1.85/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 200W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2605), upgraded from Win10 to Win11 on Jan 2024
If Nvidia is successfully managing a single architecture which serves both gaming and DC/AI, and is reaping the rewards in cost efficiency, streamlined software, fewer development headaches, etc., it does seem the more sensible long-term strategy. Dropping specialized architectures will have trade-offs, but perhaps AMD needs an all-in-one solution for a more efficient ecosystem to compete more capably with Nvidia at the highest level. As the saying goes: "if you can't beat 'em, join 'em".
Thing is, AMD had a unified arch and split it in two after Vega (and Radeon VII?)

I guess it was a bet that didn't pay off. Also guessing that nVidia foresaw the future better, like 5+ years ago.

Let's hope they do it right this time
 
Joined
May 10, 2023
Messages
352 (0.59/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
Thing is, AMD had a unified arch and split it in two after Vega (and Radeon VII?)
Yes, CDNA is pretty similar to the GCN found in Vega. But each CDNA gen (3 so far) only had 1-4 SKUs.
I guess it was a bet that didn't pay off. Also guessing that nVidia foresaw the future better, like 5+ years ago.
Nvidia has been doing this for over 15 years now, and has been shipping a different high-end chip under the same software stack since Pascal (P100 in 2016).
They really played the long game to get where they are now.
 
Joined
Jan 20, 2019
Messages
1,592 (0.74/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
Adding to this, and as I said above: while Nvidia has a single high-level arch, the actual chips can be wildly different, but the software stack remains the same, something that is really lacking on AMD's side.

Won't be an easy transition, and as always the biggest fear is at the software/code level. AMD will have to ensure the existing software/dev tools remain compatible whilst furthering the new arch's optimizations, toolchain updates, APIs, application support, etc. Can AMD pull it off? Why not! Will it be a well-polished early release? I wouldn't hold my breath. I hope AMD gets it right the first time, and that early performance issues, driver/software incompatibilities, bugs/glitches, power management issues, etc. are quickly resolved in a respectable time frame. If not, they will have shot themselves in the foot with no band-aid big or fast enough to mask the madness.

Thing is, AMD had a unified arch and split it in two after Vega (and Radeon VII?)

I guess it was a bet that didn't pay off. Also guessing that nVidia foresaw the future better, like 5+ years ago.

Let's hope they do it right this time

Yep, a punt which defied the obvious... a unified arch was always going to help efficiently solve problems caused by having different/incompatible hardware designs. With ML/DL+AI driving big changes, and multi-die GPUs potentially seeing broader mainstream adoption, you've got a whole host of additional challenges to contend with. So it's a good time to switch up and simplify, tackling everything from a single front (UDNA). NVIDIA's been at it for some time, hence lots of ground to cover for AMD. We can only hope they adapt quickly without long pitstops or support going MIA.
 
Joined
May 10, 2023
Messages
352 (0.59/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
Won't be an easy transition, and as always the biggest fear is at the software/code level. AMD will have to ensure the existing software/dev tools remain compatible whilst furthering the new arch's optimizations, toolchain updates, APIs, application support, etc. Can AMD pull it off? Why not! Will it be a well-polished early release? I wouldn't hold my breath. I hope AMD gets it right the first time, and that early performance issues, driver/software incompatibilities, bugs/glitches, power management issues, etc. are quickly resolved in a respectable time frame. If not, they will have shot themselves in the foot with no band-aid big or fast enough to mask the madness.
I believe the new ISA will be closer to either RDNA or CDNA, while the other will just be killed off. This would make the transition smoother.

But that's just me making wild guesses.
 
Joined
Jan 20, 2019
Messages
1,592 (0.74/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
I believe the new ISA will be closer to either RDNA or CDNA, while the other will just be killed off. This would make the transition smoother.

But that's just me making wild guesses.

That makes sense.

Or, if feasible, perhaps they will build off one of their existing architectures and integrate key features from the other. It simplifies the conversion, carries over much of the required software/driver compatibility, etc. Maybe not the best fit, given potential limitations going forward and the complex nature of unifying features from both archs.

Actually, toss that; I hope it's a fresh take with a ground-up custom design... a clean, well-polished instruction set.
 
Joined
Aug 31, 2024
Messages
24 (0.21/day)
System Name Linux Desktop
Processor ryzen 7 5800x
Motherboard msi meg x570s ace max
Cooling nzxt kraken x63
Memory gskill ddr4 3600 32gigs
Video Card(s) radeon 6800 xt
Storage corsair mp600 m2 2800 2tb
Display(s) dark matter ultrawide 3440x1440
Case bequiet
Mouse pulsar xlite v3 es
Keyboard sp-111
Software OpenSUSE Tumbleweed
I’m glad that AMD is going that route.

Let's be honest: you and everyone else complaining about the "lack of competition" really want AMD to force Ngreedia to cut prices just so all of you can buy a cheaper Ngreedia GPU.

You never had the slightest intention of giving your money to AMD.

Uhhhhhhh what the hell are you blabbering about?

I currently have a 6800 XT and love it. I just want them to keep making high-end GPUs so I can buy them.
 
Joined
Sep 3, 2019
Messages
3,578 (1.85/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 200W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2605), upgraded from Win10 to Win11 on Jan 2024
Uhhhhhhh what the hell are you blabbering about?

I currently have a 6800 XT and love it. I just want them to keep making high-end GPUs so I can buy them.
We all want that, and I'll include nVidia buyers too.
But in order to do this, AMD needs a much better architecture, built from the ground up.
So at this point, with what they have, they can't compete with nVidia at the high end.
It's plain and simple.

That's why there are rumors that they are building a unified (RDNA+CDNA) architecture for later (introduced maybe after RDNA5).
So for at least a couple of gens, don't expect big flagship GPUs from AMD.

For marketing and image reasons, they are saying now that they need to gain market share first, from the entry to mid level, where the supposedly bigger share is.
But in reality they don't have anything viable to show at the high end. They could... but it would be expensive and would end up much inferior to the 5080/5090.
And there is no point in allocating fab wafers to such a product. Wafer allocation goes to other parts, and they are keeping GPU dies small for now.
 
Joined
Jan 14, 2019
Messages
12,572 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
For marketing and image reasons, they are saying now that they need to gain market share first, from the entry to mid level, where the supposedly bigger share is.
But in reality they don't have anything viable to show at the high end. They could... but it would be expensive and would end up much inferior to the 5080/5090.
And there is no point in allocating fab wafers to such a product. Wafer allocation goes to other parts, and they are keeping GPU dies small for now.
What they're saying and what you marked as "reality" are both realities, imo. Two sides of the same coin, kind of.
They don't have anything to compete with in the high end, therefore they're focusing their efforts on areas where they can score some easy wins due to bigger sales numbers, smaller dies, cheaper R&D, etc.
 
Joined
Sep 3, 2019
Messages
3,578 (1.85/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 200W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2605), upgraded from Win10 to Win11 on Jan 2024
What they're saying and what you marked as "reality" are both realities, imo. Two sides of the same coin, kind of.
They don't have anything to compete with in the high end, therefore they're focusing their efforts on areas where they can score some easy wins due to bigger sales numbers, smaller dies, cheaper R&D, etc.
Yes, that's better; I was actually thinking the same for the last 5 minutes after posting it.
One does not eliminate the other
 
Joined
Sep 10, 2018
Messages
6,965 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Yes, that's better; I was actually thinking the same for the last 5 minutes after posting it.
One does not eliminate the other

Going after the midrange hasn't helped them, though; they've tried it multiple times in the past.

RDNA2 really helped with the perception that they can compete and make a good product; the problem is they never stick with it.

I'm just pessimistic they can compete at all with the Radeon portion of their business. Even when they have a clearly better product, people still buy Nvidia instead because of brand perception, and that won't change until they are clearly as good or better for multiple generations in a row.
 
Joined
Sep 3, 2019
Messages
3,578 (1.85/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 200W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2605), upgraded from Win10 to Win11 on Jan 2024
Going after the midrange hasn't helped them, though; they've tried it multiple times in the past.

RDNA2 really helped with the perception that they can compete and make a good product; the problem is they never stick with it.

I'm just pessimistic they can compete at all with the Radeon portion of their business. Even when they have a clearly better product, people still buy Nvidia instead because of brand perception, and that won't change until they are clearly as good or better for multiple generations in a row.
The "Yes that's better" was aiming to the better explanation @AusWolf stated about what AMD is doing.

I can agree with you, but they don't have anything else right now, really.
Yes, they need to be consistent for several gens, just like they did with Zen1~Zen3.

If the "new" stuff comes after RDNA5 and it's the Zen1-style starting point, then imagine a few more gens before they're up for serious competition.
It's grim... for now
 
Joined
Sep 10, 2018
Messages
6,965 (3.03/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
The "Yes that's better" was aiming to the better explanation @AusWolf stated about what AMD is doing.

I can agree with you, but they don't have anything else right now, really.
Yes, they need to be consistent for several gens, just like they did with Zen1~Zen3.

If the "new" stuff comes after RDNA5 and it's the Zen1-style starting point, then imagine a few more gens before they're up for serious competition.
It's grim... for now


I'm still less optimistic. Zen worked because they went after market share, and it paid off big time for them. I'd argue the 8700K and 9900K were better than the 1700X/2700X, but it didn't matter: AMD priced their products significantly better, especially over time.

The RTX 4000 series left a titanic-sized door open for them to offer way better products from a price-to-performance perspective, and they just did what they always do: offer a slightly worse product for slightly less money. For some, depending on what's important, significantly worse.

From everything I'm hearing, Zen5 sales are almost non-existent, and even with much higher RAM/mobo prices, Zen4 was way more successful at launch. It just shows how much they're sputtering in the desktop space on both fronts.

We need the AMD Ryzen team of 2017/18, not the current one, honestly. They've forgotten what got them to where they are.

I hope I'm wrong, because overall AMD is one of the great stories of the last decade.
 
Joined
Feb 1, 2019
Messages
3,667 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
Yeah, Ada got the better optical flow thingy for FG, but I don't see any other advancement, honestly. Enabling RT results in the same performance drop as it did on Ampere or Turing. The efficiency and clock speed gains are due to the node shrink, not the architecture.

This is the difference between the Nvidia and AMD mindset. If Nvidia says they've got some cool new stuff, everybody praises them for it without checking out how (or if) it works in real life first. If AMD says the same, everybody mocks them for not delivering 20% better everything than Nvidia for a 50% lower price.
Probably irrelevant for most people, but they did upgrade the encoder/decoder hardware. My 3080 couldn't do 1440p or above on YouTube without stuttering; monitoring the video engine in GPU-Z at the fastest possible polling showed that on every stutter the engine was saturated. On my 4080 those issues are gone, so I can hardware-decode YouTube now, lol.

Not yet tested encoding, but they now officially support HEVC, so I expect that's a big improvement. That might be appreciated by a fair number of people, as NVENC on the 3000 series was pretty bad: it ended up with massive files compared to software encoding at anything even remotely comparable quality.

Will probably start gaming regularly again (including recording) by the end of this month; I slow down my gaming in summer as the heat puts me off too much.
 
Joined
May 10, 2023
Messages
352 (0.59/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
Probably irrelevant for most people, but they did upgrade the encoder/decoder hardware. My 3080 couldn't do 1440p or above on YouTube without stuttering; monitoring the video engine in GPU-Z at the fastest possible polling showed that on every stutter the engine was saturated. On my 4080 those issues are gone, so I can hardware-decode YouTube now, lol.
They did not. The decoder in both the 3000 and 4000 series is the exact same 5th-gen NVDEC. Your issues may have been due to some other weird software thing in your system.
Not yet tested encoding, but they now officially support HEVC, so I expect that's a big improvement.
For the encoder, it's pretty similar to the Ampere one, with the addition of AV1 encoding support. The 4080 and 4090 also have two encoders now instead of just a single one.
HEVC was already supported in Ampere. The quality is the same in Ada.
That might be appreciated by a fair number of people, as NVENC on the 3000 series was pretty bad: it ended up with massive files compared to software encoding at anything even remotely comparable quality.
HW encoding will always give you larger files, since it's a tradeoff of speed vs. quality vs. file size.
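Back-of-envelope math on that tradeoff (the bitrates are invented for the example; the relationship, not the exact numbers, is the point):

```python
# Illustrative file-size math for HW vs SW encoding. The bitrates are
# assumptions: a fast HW encoder typically needs a higher bitrate than a
# slow software preset to reach comparable quality.
def file_size_gb(bitrate_mbps: float, minutes: float) -> float:
    return bitrate_mbps * minutes * 60 / 8 / 1000  # Mbit/s -> gigabytes

minutes = 60
print(f"software @ 8 Mbps:  {file_size_gb(8.0, minutes):.1f} GB")   # 3.6 GB
print(f"NVENC    @ 12 Mbps: {file_size_gb(12.0, minutes):.1f} GB")  # 5.4 GB
```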
 
Joined
Feb 1, 2019
Messages
3,667 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
They did not. The decoder in both the 3000 and 4000 series is the exact same 5th-gen NVDEC. Your issues may have been due to some other weird software thing in your system.

For the encoder, it's pretty similar to the Ampere one, with the addition of AV1 encoding support. The 4080 and 4090 also have two encoders now instead of just a single one.
HEVC was already supported in Ampere. The quality is the same in Ada.

HW encoding will always give you larger files, since it's a tradeoff of speed vs. quality vs. file size.

The only things that changed were the drivers and the GPU swap.

HEVC I could only use in OBS with an unofficial plugin, and it was overloading the chip quite badly. I would get frame drops when it was overloaded.


I am probably thinking of AV1 as the newly supported codec for encoding, but regardless, I could only use HEVC unofficially in OBS.

This page does support what you're saying, that it's 5th gen on both generations, but at the end of the day I can only report what I observe: I couldn't run hardware decode at high resolutions and 60 fps on my 3080 and now I can, and when I monitored it in GPU-Z the video engine was hitting high utilisation briefly when it stuttered. Willing to accept it may have been a driver problem.
 
Joined
May 10, 2023
Messages
352 (0.59/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
HEVC isn't supported on Ampere either; they added it as a new thing for the 4000 series. I could only use it in OBS with an unofficial plugin, and it was overloading the chip quite badly. I would get frame drops when it was overloaded.
It is. HEVC encoding has been a thing since Pascal (1000 series).
Take a look here:

Also in the Ada whitepaper, on pages 24-25:
Ada GPUs take streaming and video content to the next level, incorporating support for AV1 video encoding in the Ada eighth generation dedicated hardware encoder (known as NVENC). Prior generation Ampere GPUs supported AV1 decoding, but not encoding. Ada’s AV1 encoder is 40% more efficient than the H.264 encoder used in GeForce RTX 30 Series GPUs. AV1 will enable users who are streaming at 1080p today to increase their stream resolution to 1440p while running at the same bitrate and quality, or for users with 1080p displays, streams will look similar to 1440p, providing better quality.
NVIDIA collaborated with OBS Studio to add AV1 — on top of the recently released HEVC and HDR support — within an upcoming software release, expected later this year. OBS is also optimizing encoding pipelines to reduce overhead by 35% for all NVIDIA GPUs. The new release will additionally feature updated NVIDIA Broadcast effects, including noise and room echo removal, as well as improvements to virtual background.
We’ve also worked with Discord to enable end-to-end livestreams with AV1. In an update releasing later this year, Discord will enable its users to use AV1 to dramatically improve screen sharing, be it for game play, schoolwork, or hangouts with friends.
To further aid encoding performance, GeForce RTX 4090 and RTX 4080 are equipped with dual NVENC encoders. This enables video encoding at 8K/60 for professional video editing or four 4K/60. (Game streaming services can also take advantage of this to enable more simultaneous sessions, for instance.) Blackmagic Design’s DaVinci Resolve, the popular Voukoder plugin for Adobe Premiere Pro, and Jianying — the top video editing app in China — are all enabling AV1 support, as well as a dual encoder through encode presets. Dual encoder and AV1 availability for these apps will be available in October. NVIDIA is also working with the popular video-effects app Notch to enable AV1, as well as Topaz to enable support for AV1 and the dual encoders.
In addition to NVENC, Ada GPUs also include the fifth-generation hardware decoder that was first launched with Ampere (known as NVDEC). NVDEC supports hardware-accelerated video decoding of MPEG-2, VC-1, H.264 (AVCHD), H.265 (HEVC), VP8, VP9, and the AV1 video formats. 8K/60 decoding is also fully supported

The issue you may have faced is that HEVC support had not been added to OBS at the time, and the plugin you used was not that good. Or you were just hitting the encoder's limit; the 4090 improved on that by having two encoders that can be used simultaneously.
I am probably thinking of AV1 as the newly supported codec for encoding, but regardless, I could only use HEVC unofficially in OBS.
It now has official support, and has had it for quite a few years for recording. For streaming, AFAIK only YouTube accepts it.

but at the end of the day I can only report what I observe: I couldn't run hardware decode at high resolutions and 60 fps on my 3080 and now I can, and when I monitored it in GPU-Z the video engine was hitting high utilisation briefly when it stuttered.
Sure, and I do believe you, but it may have been a problem with your system or driver for some unknown reason. The whole 3000 series has the same decode capability as the 4000 series at the end of the day.
Maybe your driver reinstall solved the issue, go figure.
 
Joined
Oct 15, 2011
Messages
2,479 (0.51/day)
Location
Springfield, Vermont
System Name KHR-1
Processor Ryzen 9 5900X
Motherboard ASRock B550 PG Velocita (UEFI-BIOS P3.40)
Memory 32 GB G.Skill RipJawsV F4-3200C16D-32GVR
Video Card(s) Sparkle Titan Arc A770 16 GB
Storage Western Digital Black SN850 1 TB NVMe SSD
Display(s) Alienware AW3423DWF OLED-ASRock PG27Q15R2A (backup)
Case Corsair 275R
Audio Device(s) Technics SA-EX140 receiver with Polk VT60 speakers
Power Supply eVGA Supernova G3 750W
Mouse Logitech G Pro (Hero)
Software Windows 11 Pro x64 23H2
Or they have to repeat what they did a decade ago with Maxwell: same node, much more bang per watt compared to Kepler.
Maxwell was what Kepler should have been!

Nvidia can raise prices all it wants; many cards are already priced at what the market will bear.

People were already extending how long they keep their cards to help accommodate the increased pricing, but $1,000 for a 5070 Ti with a tiny memory bus and little VRAM doesn't sound anywhere remotely appealing.
Sounds more like the '21 "video-card-mageddon"!

Outlaws uses software RT at all times.
But does it easily make the CPU requirement a Ryzen 9 5900X or equivalent?
 
Joined
May 29, 2017
Messages
383 (0.14/day)
Location
Latvia
Processor AMD Ryzen™ 7 5700X
Motherboard ASRock B450M Pro4-F R2.0
Cooling Arctic Freezer A35
Memory Lexar Thor 32GB 3733Mhz CL16
Video Card(s) PURE AMD Radeon™ RX 7800 XT 16GB
Storage Lexar NM790 2TB + Lexar NS100 2TB
Display(s) HP X34 UltraWide IPS 165Hz
Case Zalman i3 Neo + Arctic P12
Audio Device(s) Airpulse A100 + Edifier T5
Power Supply Sharkoon Rebel P20 750W
Mouse Cooler Master MM730
Keyboard Krux Atax PRO Gateron Yellow
Software Windows 11 Pro
Let's hope that the RX 8800 XT will cost the same as the RX 7800 XT. The RX 7800 XT is a good, decent card, but a bit boring because of its 6800 XT-ish performance level. If the RX 8800 XT manages to be at least 25-40% faster at the same price, it's already a win. Personally I'm not interested in Nvidia because it will be overpriced.
 