
GPU upgrade - buy 7900xtx or 4080 Super now or wait for next gen release in 2025?

Status: Not open for further replies.
Joined
Sep 3, 2019
Messages
3,598 (1.85/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 200W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2605), upgraded from Win10 to Win11 on Jan 2024
And why not? The 4080 uses less power. It generates less heat. If you opt to overclock either, you'll see the 7900 XTX in the 450-500W realm; 4080s won't even allow you to go that far, even on the high-end variants. Feel free to prove me otherwise (you can't).
4080: 320W
7900XTX: 355W

That’s 35W difference or +11%
I guess this amount of extra heat is a big deal today.

Yes there are AIB 7900XTX cards up to 500W but soon you realize that past 380W the return is next to nothing.

Every AIB 4080 is fixed on 320W? No OC versions exist?

My card can be set from 314W up to 467W but I keep it at 375W

The issue with the 7900XTX is that it is literally 7% faster for $400 more than the 7900XT where I live.
I think it's more than 7%, and let's not forget that most people buying a 7900XT/XTX are not running games at 1080p - at least 1440p and UW/4K.

Here prices now are…
7800XT from 500+€
4070 from 560+€
4070S from 630+€
7900GRE from 680+€
7900XT from 700+€
4070Ti from 800+€
4070Ti S from 870+€
7900XTX from 900+€
4080S from 1100+€
4080 from 1300+€

I guess that every market region has its own unique quirks.
 
Joined
Dec 25, 2020
Messages
7,061 (4.83/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
I'm going to point out the obvious here: hotfixes don't count as a plus towards driver support. You are in essence patting Nvidia on the back because they fixed something they screwed up in the first place. Are you counting the hotfix for the Nvidia Discord bug as equal to a full AMD driver release? That's beyond hypocritical: you are always criticizing AMD for their bugs, but here you are counting Nvidia bugs and hotfixes as a plus towards driver support.

Similar could be said of studio drivers. Studio drivers are identical to the game ready drivers. They are merely more tested than the game ready drivers, are on a longer update cadence, and their release typically coincides with the release of creative app updates. It could more readily be argued that it's a downside that Nvidia has two separate packages for studio and game drivers, as users must choose which they want to install, compared to if Nvidia were to just periodically release an LTS (long term support) version of their game ready driver (in fact they should just stop calling their main driver "game ready"; a lot of non-gamers use this driver in different fields). Again you are trying to present something that's actually a limitation as a positive, as if Nvidia is doing some massive amount of work on top of their regular driver updates to get studio drivers out the door, when it's actually the opposite. I do think they are nice to have, but it's entirely misleading to pretend they are equal to the work that goes into a full driver release.

All but 1 AMD driver in the last 2 years has been WHQL, and you can see that right in the link I provided. Not that I really put much stock in Microsoft's ability to test drivers thoroughly, but you brought it up.

You used stalker 2 as an example of AMD not having timely driver support, but in fact AMD cards already perform as expected in that title: https://www.techpowerup.com/review/stalker-2-fps-performance-benchmark/5.html

[Attached: Stalker 2 relative performance chart from the TPU review]


So why exactly would AMD need to release a driver for this game?

From what I gathered, you are trying to just throw raw numbers behind your argument that Nvidia has superior driver support without thinking about the reasoning behind those numbers. Having more hotfix drivers is NOT a good thing and carries other implications, having studio drivers was not the flex you thought it was, and AMD not needing a driver for stalker 2 because it already performs well is not a negative but a plus.

I think even if we ignored all the above, just looking at the number of driver releases on the AMD side you can see your original statement in regards to consistent and timely driver support was simply false. AMD's driver releases have been pretty consistent for years. Sometimes they miss a month and often when that happens you can see they release two drivers in the following month, typically one early and one later in the month. Again it's 1-2 releases per month with the average being 1. I'm not sure I'd want to increase that cadence either, quality over quantity.

You do realize that validated ISV drivers are pretty much the sole reason enterprise GPUs have existed up until now, right? I don't know how you came up with the conclusion that having these on gaming cards, extensive support, access to an entire, maintained ecosystem and not having to wait for the next release cycle for known issues that have already been fixed are negatives, but you do you

Regarding performance: I must disagree: 10 fps below the 4080 in a UE5 game that runs on software Lumen is nowhere even close to the expected performance level for a 7900 XTX, unless there have been massive strides in UE5 performance on the Nvidia side since these games were released:





4080: 320W
7900XTX: 355W

That’s 35W difference or +11%
I guess this amount of extra heat is a big deal today.

Yes there are AIB 7900XTX cards up to 500W but soon you realize that past 380W the return is next to nothing.

Every AIB 4080 is fixed on 320W? No OC versions exist?

While many focus solely on the usefulness of upscaling to achieve otherwise impossible performance figures from a card, usage of DLSS often allows you another road: to slash GPU power without sacrificing much in image quality, which leads to a quieter, cooler and more efficient machine. You can often have games running at ~180 W at 4K/120 on a 4080 using DLSS Quality with Frame Generation enabled. FSR can (and in my opinion, should) be used to the same effect - but it helps that DLSS is generally ahead in IQ comparison tests.

My personal 4080 is a ROG Strix OC and one of the craziest models out there - 320 W is the base TGP limit, extensible to 420 W - and I've never been able to get this card to push more than 350 W even at 3 GHz on air, with the VRAM clocked up to 4080S speed.
 
Joined
Jul 13, 2016
Messages
3,351 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
You do realize that validated ISV drivers are pretty much the sole reason enterprise GPUs have existed up until now, right? I don't know how you came up with the conclusion that having these on gaming cards, extensive support, access to an entire, maintained ecosystem and not having to wait for the next release cycle for known issues that have already been fixed are negatives, but you do you

Except only Nvidia's Quadro drivers and cards are certified by independent software vendors (ISVs): https://www.nvidia.com/en-us/design-visualization/software/isv-certifications/

Did you happen to get this answer from Google AI or ChatGPT without double checking the facts? I asked them both this question and they state that Studio drivers are ISV certified, but ironically they link to the Quadro page I provided above. When I asked ChatGPT to take another look, it revised its answer to my answer, in that only Quadro drivers are ISV certified, as it could not find any evidence that Studio drivers are ISV certified.

Rule number 1 of using AI: always double check its output.


Regarding performance: I must disagree: 10 fps below the 4080 in an UE5 game that runs on software lumen is nowhere even close to the expected performance level for a 7900 XTX, unless there have been massive strides in UE5 performance in the Nvidia side since these games were released:

First, I never stated only in UE5 games or only those with software Lumen. I just want to specify that, as it appears you may be trying to move some goalposts to use in your next comment as a gotcha.

I'm going to give you a list of recent titles that were benched by TPU that demonstrate that the 7900 XTX's performance in Stalker 2 is within the normal range:


[Attached: relative performance charts from several recent TPU game benchmarks]



The 7900 XTX tends to run at or around 4080 performance. It'll only beat it in raster heavy games, as in the example you provided. Thus, as I stated before, it's performing within expected margins in stalker 2. You could make the argument that it's a bit weak, but it's not even close to an outlier. I could easily make the argument that your example is more of an outlier, at least when we are only considering AAA releases over the current year.

Considering that you are trying to prove that AMD is being lazy by not releasing a driver for Stalker 2, you need to provide significant evidence that Stalker 2 is well outside normal performance margins. Something the data simply doesn't support.
 

Joined
Dec 25, 2020
Messages
7,061 (4.83/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Except only Nvidia's Quadro drivers and cards are certified by independent software vendors (ISVs): https://www.nvidia.com/en-us/design-visualization/software/isv-certifications/

Did you happen to get this answer from Google AI or ChatGPT without double checking the facts? I asked them both this question and they state that Studio drivers are ISV certified, but ironically they link to the Quadro page I provided above. When I asked ChatGPT to take another look, it revised its answer to my answer, in that only Quadro drivers are ISV certified, as it could not find any evidence that Studio drivers are ISV certified.

Rule number 1 of using AI: always double check its output.




First, I never stated only in UE5 games or only those with software Lumen. I just want to specify that, as it appears you may be trying to move some goalposts to use in your next comment as a gotcha.

I'm going to give you a list of recent titles that were benched by TPU that demonstrate that the 7900 XTX's performance in Stalker 2 is within the normal range:




The 7900 XTX tends to run at or around 4080 performance. It'll only beat it in raster heavy games, as in the example you provided. Thus, as I stated before, it's performing within expected margins in stalker 2. You could make the argument that it's a bit weak, but it's not even close to an outlier. I could easily make the argument that your example is more of an outlier, at least when we are only considering AAA releases over the current year.

Considering that you are trying to prove that AMD is being lazy by not releasing a driver for Stalker 2, you need to provide significant evidence that Stalker 2 is well outside normal performance margins. Something the data simply doesn't support.

I was specifically bringing up this apparent performance regression in UE5 (Wukong and now Stalker 2 are really not measuring up to the earlier UE5 games), but you seem to have found an even worse pattern: it seems that performance in these newer games has fallen off a cliff? The XTX was never much of a competitor to the 4090, but it surely didn't look that bad next to the 4080 either... how is that a justification for any of the issues brought up? It's actually losing to the 3090 Ti in that many games? That's... terrible, even worse than I thought.

You accuse me of using ChatGPT (something I do not do and will never do), but here's something I thought you'd like to know: the Quadro branding was retired with Turing, half a decade ago. And I'd like to know what AMD offers as an alternative to Omniverse, RTX Studio, Canvas, or even their LLM solution, all of which are fully available at no extra cost to GeForce customers today. In fact, let me take a few steps back: where's ROCm for consumer cards?

And I didn't say AMD was lazy. I said that their product does not measure up to the competition. I know for a fact they aren't lazy. But there comes a time I want results instead of promises; talk of resetting the development roadmap and coming up with a reunified architecture isn't even a secret anymore. Look, writing a GPU driver is a complex thing. It's not easy. However, it's high time you people start demanding that AMD do what multi-billion-dollar companies their size ought to do; relying on the eternal goodwill of a few boomers who have fond memories of their vintage ATi cards on tech forums and the one-off Linux user is not a sound business strategy - as their latest earnings report clearly shows.
 
Joined
Jul 31, 2024
Messages
469 (3.15/day)
Love to hear your thoughts.

I think it's a very bad time to buy a graphics card right now.

For me, the current graphics cards have already been on the market too long; I suspect a new generation is coming. I personally believe that buying a graphics card at release is best when it is a high-end card, like an Nvidia 4080 (in the first few weeks after release, when the price has calmed down).

Advice: Buy if you need a card / Buy if you see a decent offer / Wait if you can wait (which most likely means another 18 months)

Just a bit off topic, as I saw Linux mentioned above me:
Regarding Linux and AMD: the AMD graphics cards suck. AMD does not force me to recompile the kernel module - Nvidia does. But same as Nvidia, there is no proper support for advanced features like undervolting, frequency control or fan curve control. I checked it several times in the past with the current linux-firmware and current kernel sources.
 
Joined
Dec 25, 2020
Messages
7,061 (4.83/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
I think it's a very bad time to buy a graphics card right now.

For me, the current graphics cards have already been on the market too long; I suspect a new generation is coming. I personally believe that buying a graphics card at release is best when it is a high-end card, like an Nvidia 4080 (in the first few weeks after release, when the price has calmed down).

Advice: Buy if you need a card / Buy if you see a decent offer / Wait if you can wait (which most likely means another 18 months)

Just a bit off topic, as I saw Linux mentioned above me:
Regarding Linux and AMD: the AMD graphics cards suck. AMD does not force me to recompile the kernel module - Nvidia does. But same as Nvidia, there is no proper support for advanced features like undervolting, frequency control or fan curve control. I checked it several times in the past with the current linux-firmware and current kernel sources.

I believe you have to set up the coolbits thing for the hardware controls to work


GreenWithEnvy for GUI control similar to Afterburner on Windows


It's been a while since I last tried it. That said, I never got any of my NV cards working well with Linux, especially when they're the current generation, while I did have my previous AMD cards working well there.
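
Out of curiosity, here is a minimal sketch of what those tools drive under the hood (assuming the nvidia-ml-py / pynvml package and the proprietary driver; purely illustrative): NVML handles monitoring and power limits even without X, while clock offsets and fan curves additionally need the Coolbits flag on X11 (e.g. nvidia-xconfig --cool-bits=28), which is roughly what GreenWithEnvy builds on.

Code:
# Minimal sketch using pynvml (nvidia-ml-py); assumes the proprietary NVIDIA
# driver is loaded. Changing the power limit requires root.
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU, assumed index 0

temp = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000        # NVML reports milliwatts
limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000
print(f"{temp} C, {power_w:.0f} W of {limit_w:.0f} W limit")

# Lower the power limit to ~250 W (value in milliwatts); the driver clamps it
# to the card's allowed min/max range.
pynvml.nvmlDeviceSetPowerManagementLimit(gpu, 250_000)

pynvml.nvmlShutdown()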
 
Joined
Jul 13, 2016
Messages
3,351 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
I was specifically bringing up this apparent performance regression in UE5 (Wukong and now Stalker 2 are really not measuring up to the earlier UE5 games), but you seem to have found an even worse pattern: it seems that performance in these newer games has fallen off a cliff? The XTX was never much of a competitor to the 4090, but it surely didn't look that bad next to the 4080 either... how is that a justification for any of the issues brought up? It's actually losing to the 3090 Ti in that many games? That's... terrible, even worse than I thought.

The cost of RT effects has increased and this disproportionately impacts AMD. This is why you see the performance gap between the chart you provided, which is of a more raster heavy game, and the charts I provided.

Raster heavy game? 7900 XTX tends to lead, sometimes by a decent margin.
Software Lumen / Light RT? 7900 XTX tends to be around or slightly behind 4080
Heavy RT? 7900 XTX falls to around 4070 Ti performance.

And yes, it's bad. I mentioned this market trend in a review a while back, but it stands to further weaken an already weak AMD GPU division.

You accuse me of using ChatGPT (something I do not do and will never do), but here's something I thought you'd like to know:

That was not an accusation but a genuine question, given the odd coincidence.

The Quadro branding was retired with Turing, half a decade ago. And I'd like to know what AMD offers as an alternative to Omniverse, RTX Studio, Canvas, or even their LLM solution, all of which are fully available at no extra cost to GeForce customers today. In fact, let me take a few steps back: where's ROCm for consumer cards?

This has zero relevance to driver release consistency, which is the subject matter of contention. I already stated that I agree Nvidia has the superior feature set so it's pointless to bring up.

And I didn't say AMD was lazy. I said that their product does not measure up to the competition.

Yes and of which we already agreed on except for two points. You stated driver consistency was poor and I demonstrated otherwise.

I know for a fact they aren't lazy. But there comes a time I want results instead of promises; talk of resetting the development roadmap and coming up with a reunified architecture isn't even a secret anymore. Look, writing a GPU driver is a complex thing. It's not easy. However, it's high time you people start demanding that AMD do what multi-billion-dollar companies their size ought to do; relying on the eternal goodwill of a few boomers who have fond memories of their vintage ATi cards on tech forums and the one-off Linux user is not a sound business strategy - as their latest earnings report clearly shows.

This is nice boilerplate marketing speak, but nothing of value is stated. Mind you, you are calling them lazy here. That's the obvious implication when you say they aren't doing enough or do nothing but talk and promises. It's a roundabout way to say it without having to address anything that's actually being discussed. It's like when anyone says "no offense but" - my eyes roll when the platitudes come out.

My question is, how does this relate to your statement that AMD has poor driver consistency and your argument that AMD needs to release a driver for stalker 2?
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,711 (6.69/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
The cost of RT effects has increased and this disproportionately impacts AMD. This is why you see the performance gap between the chart you provided, which is of a more raster heavy game, and the charts I provided.

Raster heavy game? 7900 XTX tends to lead, sometimes by a decent margin.
Software Lumen / Light RT? 7900 XTX tends to be around or slightly behind 4080
Heavy RT? 7900 XTX falls to around 4070 Ti performance.

And yes, it's bad. I mentioned this market trend in a review a while back, but it stands to further weaken an already weak AMD GPU division.



That was not an accusation but a genuine question, given the odd coincidence.



This has zero relevance to driver release consistency, which is the subject matter of contention. I already stated that I agree Nvidia has the superior feature set so it's pointless to bring up.



Yes and of which we already agreed on except for two points. You stated driver consistency was poor and I demonstrated otherwise.



This is nice boilerplate marketing speak, but nothing of value is stated. Mind you, you are calling them lazy here. That's the obvious implication when you say they aren't doing enough or do nothing but talk and promises. It's a roundabout way to say it without having to address anything that's actually being discussed. It's like when anyone says "no offense but" - my eyes roll when the platitudes come out.

My question is, how does this relate to your statement that AMD has poor driver consistency and your argument that AMD needs to release a driver for stalker 2?

This should prove a point: yes, it's GCN, but when unnecessary proprietary cruft is removed, the truth comes out.

And having to use AI to enhance graphics is laziness, and a PC should be able to go off-grid/air-gapped and have the same graphics capability as if it was on the net. AI is bandwidth-hogging.
 
Joined
Mar 29, 2014
Messages
496 (0.13/day)
I wouldn't count on any price drop on 4090s. In fact, you can expect shortages on all NV top-tier cards for the next couple of years at least. Also, I wouldn't expect much uplift in price/performance for the 5090. Gaming is not what these cards are made for anymore.
 
Joined
Dec 25, 2020
Messages
7,061 (4.83/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
This should prove a point: yes, it's GCN, but when unnecessary proprietary cruft is removed, the truth comes out.

And having to use AI to enhance graphics is laziness, and a PC should be able to go off-grid/air-gapped and have the same graphics capability as if it was on the net. AI is bandwidth-hogging.

Internet access is not required to use AI upscalers and frame generators such as DLSS, FSR 2/3 or XeSS; you can run these completely offline with no loss of functionality.

NVIDIA and Intel GPUs have something called matrix multiplication units, and it is in this portion of the hardware that the "AI calculations" that make these upscalers possible take place. AMD is the only vendor which does not have them, which is the true reason why FSR is "vendor-agnostic": it's just shader-level code and thus will run on anything that can run the graphics API in question. Matrix multiplication functionality is not new or even remotely exclusive technology; dedicated tensor processing was introduced with the Volta architecture in 2017, and Intel introduced it with its Arc-branded graphics (Alchemist/A770).
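
To make the distinction concrete, a tiny sketch (assuming PyTorch and a CUDA-capable card; purely illustrative): an FP16 matrix multiply like this is what the upscalers' inference ultimately boils down to, and it gets dispatched to the tensor/matrix units on hardware that has them, while on RDNA 3 the equivalent work runs through WMMA instructions on the shader ALUs.

Code:
# Sketch, assuming PyTorch on a CUDA-capable GPU: the core operation behind
# "AI" upscaler inference is just large half-precision matrix multiplies.
import torch

a = torch.randn(4096, 4096, dtype=torch.float16, device="cuda")
b = torch.randn(4096, 4096, dtype=torch.float16, device="cuda")
c = a @ b  # FP16 matmul; runs on tensor cores where available, shader ALUs otherwise
print(c.shape)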

To AMD's credit, while Navi 31 remains the only current high-end hardware which does not have dedicated tensor processing, they did introduce some very useful extensions for processing this type of workload with RDNA 3, although these still take the GPU's shader resources to execute. This is why something like FSR 3 is viable with high performance on this hardware. Thankfully, since AMD open sourced their implementation, it's quite easy for you to see how it works:


I wouldn't count on any price drop on 4090's. In fact you can expect shortages on all NV top tier cards for the next couple of years atleast. Also I wouldn't expect much uplift in price/performance for the 5090. Gaming is not what these cards are made for anymore.

I'm fairly sure it's gonna be 70-100% faster than the 4090, so 2-2.5x faster than the 4080 or 7900 XTX. Price/performance or not, 4K gamers are gonna buy them as fast as they can make those. Can't say I'm gonna resist. If they don't meet this target, then the 5090 might very well be a pointless product from the get go unless you mean to buy them for LLMs or something.

My question is, how does this relate to your statement that AMD has poor driver consistency and your argument that AMD needs to release a driver for stalker 2?

It's simple, man. AMD tends to implement only the strictly required API features. Nothing novel or off the beaten path. Even with the advent of the PAL-based UMDs, it seems they're endlessly stuck in a rut. New features take forever to be implemented, and when they are, they often don't work. Then they spend the lifetime of a generation fixing things, often in games that are at that point years old. Just 24.10.1 was targeting fixes for crashes and stability issues on games like DayZ and Doom Eternal, with open problems on Space Marine 2, known VR crashes, etc.


Rome wasn't built in a day, but if they were laying only a brick each day, the Colosseum wouldn't be finished to this day...
 
Joined
Jun 2, 2017
Messages
9,388 (3.40/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Internet access is not required to use AI upscalers and frame generators such as DLSS, FSR 2/3 or XeSS; you can run these completely offline with no loss of functionality.

NVIDIA and Intel GPUs have something called matrix multiplication units, and it is in this portion of the hardware that the "AI calculations" that make these upscalers possible take place. AMD is the only vendor which does not have them, which is the true reason why FSR is "vendor-agnostic": it's just shader-level code and thus will run on anything that can run the graphics API in question. Matrix multiplication functionality is not new or even remotely exclusive technology; dedicated tensor processing was introduced with the Volta architecture in 2017, and Intel introduced it with its Arc-branded graphics (Alchemist/A770).

To AMD's credit, while Navi 31 remains the only current high-end hardware which does not have dedicated tensor processing, they did introduce some very useful extensions for processing this type of workload with RDNA 3, although these still take the GPU's shader resources to execute. This is why something like FSR 3 is viable with high performance on this hardware. Thankfully, since AMD open sourced their implementation, it's quite easy for you to see how it works:




I'm fairly sure it's gonna be 70-100% faster than the 4090, so 2-2.5x faster than the 4080 or 7900 XTX. Price/performance or not, 4K gamers are gonna buy them as fast as they can make those. Can't say I'm gonna resist. If they don't meet this target, then the 5090 might very well be a pointless product from the get go unless you mean to buy them for LLMs or something.



It's simple, man. AMD tends to implement only the strictly required API features. Nothing novel or off the beaten path. Even with the advent of the PAL-based UMDs, it seems they're endlessly stuck in a rut. New features take forever to be implemented, and when they are, they often don't work. Then they spend the lifetime of a generation fixing things, often in games that are at that point years old. Just 24.10.1 was targeting fixes for crashes and stability issues on games like DayZ and Doom Eternal, with open problems on Space Marine 2, known VR crashes, etc.


Rome wasn't built in a day, but if they were laying only a brick each day, the Colosseum wouldn't be finished to this day...
Do you ever stop? Every time someone proves you wrong you move the Goalposts with some other narrative. What does the Nvidia data say I wonder?

Fixed Gaming Bugs

  • DSR/DLDSR custom resolutions may not appear in certain games [4839770]
  • [Call of Duty MWIII] filename change preventing users from using GFE Freestyle Filters [4927183]

Fixed General Bugs


  • [Bluestacks/Corsair iCUE] May display higher than normal CPU usage [4895184][4893446]
  • When "Shader Cache size" is set to "disabled" cache files may still be created [4895217]
 
Joined
Dec 25, 2020
Messages
7,061 (4.83/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Do you ever stop? Every time someone proves you wrong you move the Goalposts with some other narrative. What does the Nvidia data say I wonder?

Fixed Gaming Bugs

  • DSR/DLDSR custom resolutions may not appear in certain games [4839770]
  • [Call of Duty MWIII] filename change preventing users from using GFE Freestyle Filters [4927183]

Fixed General Bugs


  • [Bluestacks/Corsair iCUE] May display higher than normal CPU usage [4895184][4893446]
  • When "Shader Cache size" is set to "disabled" cache files may still be created [4895217]

I don't stop because, while I am offending your feelings, I am not lying, nor did I ever say that Nvidia has no bugs. It's just that they tend to be on the insignificant side, more often than not.

And before you rush to your knightly defense of AMD, read what you post (and the replies to my own post, mind you - I wasn't "proven wrong"; if anything, there's a point or two of disagreement there, with it mostly being a cordial back and forth), perhaps. There was never a bug in question regarding MW3 (this is not the 2011 game they are talking about - it's the recent reboot): Activision developers changed a file's name in their game, which then required Nvidia to release an update so the software continued to work. Perfectly reasonable, and such changes are to be fully expected in an always-online, GaaS title. Or do I need to remind you of the recent CS2 VAC incident?

And wow, what a grave showstopper of a problem that DSR resolutions hadn't shown up in a couple of games, because clearly every piece of software out there works the same way and requests resolution lists in the exact same way... also another great whopper of a major bug: shader cache files being generated but not used with the cache off (even though, in most cases, you shouldn't ever turn it off). Wow! High severity issues.

:rolleyes:
 
Joined
Jul 13, 2016
Messages
3,351 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
This should prove a point, yes it's GCN but when unnecessary proprietary cruft is removed the truth comes out.

And having to use AI to enhance graphics is laziness, and a pc should be able to go offgrid/air gapped and have the same graphics capability as if it was on the net. AI is bandwitch hogging.

Unfortunately, proprietary junk has become the norm now and Nvidia dictates the implementation of so much in games. It not only makes it harder for AMD to compete but it also severely locks out any other competitors (Intel and a potential ARM competitor, for example). There's just way too much software lock-in nowadays. AMD could come back in the CPU market because those CPUs could utilize all the features of existing software. How in the world does anyone do that in the GPU market though? It's nearly impossible given the decades of proprietary features Nvidia has baked into the ecosystem.

Internet access is not required to use AI upscalers and frame generators such as DLSS, FSR 2/3 or XeSS; you can run these completely offline with no loss of functionality.

Correct. The only instance you'd need the internet for AI is something like Microsoft's AI or the one Skyrim mod that lets you talk to NPCs. I really, really hope we don't start seeing games that do require an internet connection for this.

To AMD's credit, while Navi 31 remains the only current high-end hardware which does not have dedicated tensor processing, they did introduce some very useful extensions for processing this type of workload with RDNA 3

You are probably thinking of WMMA, which increases matrix multiplication throughput 2x (at most) over the prior gen. Not as good as dedicated units, hence the 7900 XTX gets about half the performance of the 4090 in most AI tasks.

It's simple, man. AMD tends to implement only the strictly required API features. Nothing novel or off the beaten path. Even with the advent of the PAL-based UMDs, it seems they're endlessly stuck in a rut. New features take forever to be implemented, and when they are, they often don't work. Then they spend the lifetime of a generation fixing things, often in games that are at that point years old. Just 24.10.1 was targeting fixes for crashes and stability issues on games like DayZ and Doom Eternal, with open problems on Space Marine 2, known VR crashes, etc.

Strictly speaking, in regards to the last 8-10 years I agree with you. If we are talking old ATI / AMD, both AMD and Nvidia used to take turns releasing exciting new features.

And yeah, AMD has been stuck in a rut. Their GPU division doesn't make a lot of sales or money, which in turn makes devs unlikely to spend many resources on optimizing for them and makes it unlikely that AMD will invest a significant amount of money closing that gap when the returns aren't there. I mean, just look at Intel: they made huge strides in the driver department but it hasn't paid off for them sales-wise. Intel cards can't use any of the proprietary software features in any game because they haven't been in the dGPU market until recently. There are literally decades of inertia behind Nvidia in terms of both mindshare and software.
 
Joined
Dec 25, 2020
Messages
7,061 (4.83/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Unfortunately, proprietary junk has become the norm now and Nvidia dictates the implementation of so much in games. It not only makes it harder for AMD to compete but it also severely locks out any other competitors (Intel and a potential ARM competitor, for example). There's just way too much software lock-in nowadays. AMD could come back in the CPU market because those CPUs could utilize all the features of existing software. How in the world does anyone do that in the GPU market though? It's nearly impossible given the decades of proprietary features Nvidia has baked into the ecosystem.

Correct. The only instance you'd need the internet for AI is something like Microsoft's AI or the one Skyrim mod that lets you talk to NPCs. I really, really hope we don't start seeing games that do require an internet connection for this.

You are probably thinking of WMMA, which increases matrix multiplication throughput 2x (at most) over the prior gen. Not as good as dedicated units, hence the 7900 XTX gets about half the performance of the 4090 in most AI tasks.

Strictly speaking, in regards to the last 8-10 years I agree with you. If we are talking old ATI / AMD, both AMD and Nvidia used to take turns releasing exciting new features.

And yeah, AMD has been stuck in a rut. Their GPU division doesn't make a lot of sales or money, which in turn makes devs unlikely to spend many resources on optimizing for them and makes it unlikely that AMD will invest a significant amount of money closing that gap when the returns aren't there. I mean, just look at Intel: they made huge strides in the driver department but it hasn't paid off for them sales-wise. Intel cards can't use any of the proprietary software features in any game because they haven't been in the dGPU market until recently. There are literally decades of inertia behind Nvidia in terms of both mindshare and software.

And there you have it, we are in full agreement! :toast:
 
Joined
Jun 2, 2017
Messages
9,388 (3.40/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
I don't stop because, while I am offending your feelings, I am not lying, nor did I ever say that Nvidia has no bugs. It's just that they tend to be on the insignificant side, more often than not.

And before you rush to your knightly defense of AMD, read what you post (and the replies to my own post, mind you - I wasn't "proven wrong"; if anything, there's a point or two of disagreement there, with it mostly being a cordial back and forth), perhaps. There was never a bug in question regarding MW3 (this is not the 2011 game they are talking about - it's the recent reboot): Activision developers changed a file's name in their game, which then required Nvidia to release an update so the software continued to work. Perfectly reasonable, and such changes are to be fully expected in an always-online, GaaS title. Or do I need to remind you of the recent CS2 VAC incident?

And wow, what a grave showstopper of a problem that DSR resolutions hadn't shown up in a couple of games, because clearly every piece of software out there works the same way and requests resolution lists in the exact same way... also another great whopper of a major bug: shader cache files being generated but not used with the cache off (even though, in most cases, you shouldn't ever turn it off). Wow! High severity issues.

:rolleyes:
Knightly? Or are you shouting out your bias? You act like nothing AMD does is appreciated. Are you not the one that has said that your experience with Linux is not that good for you? Meanwhile AMD being an advocate of Open source means those handhelds are super smooth. It is obvious that even though they are the same GPU cores, the narrative handles APUs much differently. Someone posts a video about the RX 580 vs the 1060 and you respond with upscaling tech that the 1060 does not even support. It is almost insane how the narrative tries to make AMD cards useless even though gaming benchmarks on this very site belie that very narrative. I don't blame you though. I have been investigating for myself and have posted evidence of how the narrative is applied; you can find it in the chat comments of some big YouTubers like Robeytech, PC World, MSI Gaming and Hardware Unboxed. You will go on though, as this narrative will have people pay 5k where I live for the 5090. Don't worry, you can expect tariffs to add another 25% to that too. I am of the firm opinion that unless money is no issue, spending over 1k on a GPU is daft, but the kicker is that already today the 7900XTX is in some scenarios 25% the cost of the 4090 brand new. Is it worth that much to have tech when the next generation of cards is going to have some feature that makes your 5k brick obsolete? Does the narrative even mention the 3090?

Look at how you got triggered because I posted Nvidia bugs from their own site.
 
Joined
Jul 31, 2024
Messages
469 (3.15/day)
Are you not the one that has said that your experience with Linux is not that good for you? Meanwhile AMD being an advocate of Open source means those handhelds are super smooth.

No! Please do not write nonsense.

why?

AMD open source support is not worth mentioning. There is no GUI for creating fan curves, undervolting, or underclocking. High memory clock bug - and loud fans with the 6800 card - both in Windows 11 Pro and GNU Gentoo Linux. The topic of why AMD has bad software on my current Windows 11 Pro or my GNU Gentoo Linux is endless.

Go ahead and tell us: how long have you been using the Linux kernel with GNU userspace? How many distros have you used?
I still have my first installer, Slackware 96. Just for explanation, that stands for 1996.

Before you write, I expect from you a current screenshot of your GNU Linux setup in the screenshot thread in this forum, plus validation that it's you, like time + date + your forum name. (When you check my posts you will see several i3wm.org screenshots of my current setup, plus one in the mentioned screenshot thread.)

This is all with my current GNU Gentoo Linux, which dates back to 2006 (my forums.gentoo.org account, which is basically the installation date).
I used an Nvidia graphics card for several months again in 2023 - an MSI GTX 960 4GB, second hand - to determine the current driver status, after having sold my ASUS G75VW gaming laptop 2 or 3 years before.
Besides the mentioned, sold ASUS G75VW laptop, I had several, also duplicate, AMD processors: 5800X -> 3700X downgrade -> another freshly purchased 5800X -> second-hand 3100X (as I was selling the mainboard and 5800X); a mainboard with faulty Intel WiFi; the 5800X never did boost -> current Ryzen 7600X. (I think I did not buy better because four different 5800X processors on three different mainboards never boosted as I expected. The 7600X is cheap garbage, so less expected functionality or less promised functionality from AMD does not really annoy me that much. I compile a lot - I see whether the marketing is bullshit or not.)
GPU: ASRock Challenger 6600XT 8GB D -> Nvidia GTX 960 -> nothing (Ryzen 7600X graphics) -> MSI Radeon 6800 Z Trio -> current PowerColor 7800XT Hellhound (a good one in regards to being very silent).
In my Socket AM4 days I had 3 different mainboards to try, plus several different CPU cooler solutions.

Edit: Mainboards - a very nice topic. AMD mainboards with B550 or X670: I blame AMD for it - Nuvoton does not release specs for the chip. You can barely read out sensors on any mainboards, or barely any laptops, in GNU Linux. The topic of Linux + AMD + bad support is endless. And that is just looking at my hardware; there is so much hardware available on the market.

Edit: compiler support - the next big topic - the optimisations suck for AMD processors, especially the Ryzen 5800X and the Ryzen 7600X at the point in time I owned those - AMD does nothing - really nothing.
 

Ruru

S.T.A.R.S.
Joined
Dec 16, 2012
Messages
13,040 (2.97/day)
Location
Jyväskylä, Finland
System Name 4K-gaming / console
Processor AMD Ryzen 7 5800X / Intel Core i7-6700K
Motherboard Asus ROG Crosshair VII Hero / Asus Z170-K
Cooling Alphacool Eisbaer 360 / Alphacool Eisbaer 240
Memory 32GB DDR4-3466 / 16GB DDR4-3000
Video Card(s) Asus RTX 3080 TUF OC / Powercolor RX 6700 XT
Storage 3.3TB of SSDs / several small SSDs
Display(s) Acer 27" 4K120 IPS + Lenovo 32" 4K60 IPS
Case Corsair 4000D AF White / DeepCool CC560 WH
Audio Device(s) Sony WH-CN720N
Power Supply EVGA G2 750W / Fractal ION Gold 550W
Mouse Logitech MX518 / Logitech G400s
Keyboard Roccat Vulcan 121 AIMO / NOS C450 Mini Pro
VR HMD Oculus Rift CV1
Software Windows 11 Pro / Windows 11 Pro
Benchmark Scores They run Crysis
-Is there enough benefit with the upcoming PCIe 5.0 GPUs?
I highly doubt it. With a 4090, you practically can't tell whether you are running it @ 3.0 x8 or @ 4.0 x16.
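
For a rough sense of the bandwidth gap involved (approximate figures, assuming ~0.985 GB/s per lane for PCIe 3.0 and a doubling each generation):

Code:
# Approximate per-direction PCIe bandwidth in GB/s
def pcie_gbs(gen, lanes):
    return 0.985 * (2 ** (gen - 3)) * lanes

print(pcie_gbs(3, 8))   # ~7.9 GB/s  (3.0 x8)
print(pcie_gbs(4, 16))  # ~31.5 GB/s (4.0 x16)
print(pcie_gbs(5, 16))  # ~63 GB/s   (5.0 x16)

If a 4090 barely notices the 4x jump from 3.0 x8 to 4.0 x16, another doubling to 5.0 x16 is unlikely to matter for gaming.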

 
Joined
Nov 13, 2024
Messages
88 (2.00/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5950X
Motherboard ASRock B550 Phantom Gaming 4
Cooling Peerless Assassin 120 SE
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVME & 2 x 4 TB SATA SSD in Raid 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
I'm fairly sure it's gonna be 70-100% faster than the 4090, so 2-2.5x faster than the 4080 or 7900 XTX. Price/performance or not, 4K gamers are gonna buy them as fast as they can make those. Can't say I'm gonna resist. If they don't meet this target, then the 5090 might very well be a pointless product from the get go unless you mean to buy them for LLMs or something.
Wait... 70-100% faster? Isn't it the same node and "only" 150 W more (a 33% increase in power)?

Therefore an increase in frames per watt (because of the architecture) of around 27%~50%?
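
Quick sanity check of that arithmetic, using the rumoured board powers (assumed figures, not measurements):

Code:
# 4090 at ~450 W vs a rumoured ~600 W for the 5090, with +70% to +100% performance
power_gain = 600 / 450  # ~1.33x power
for perf_gain in (1.70, 2.00):
    print(f"{(perf_gain / power_gain - 1) * 100:.1f}% better performance per watt")
# -> 27.5% and 50.0%, i.e. the 27%~50% range above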

(I am sure I am missing/wrong about something here...)

Also that GPU will probably cost around 2700 $ ~ 3000 $ if it lives up to your expectations...
 
Joined
Jun 19, 2008
Messages
396 (0.07/day)
Location
Chiefs Kingdom
System Name Never Had a Name
Processor i7-8700K
Motherboard ASUS ROG Maximus XI Hero Z390
Cooling Corsair H110i Pro
Memory EVGA SuperSC 16GB 2 x 8GB DDR4-3000 PC4-24000
Video Card(s) GIGABYTE RTX 2080 Ti WindForce OC - GV-N208TWF3OC-11GC (140 power target fw: 90.02.0B.40.6B)
Storage Samsung 960 EVO 500GB MLC V-NAND PCIe Gen 3 x4 NVMe M.2 2280
Display(s) Asus PG278QR 27" / Asus MG248Q 24"
Case Cooler Master HAF 922
Power Supply EVGA SuperNOVA 1000P2
Mouse Corsiar M65 Pro
Keyboard Corsair K95
Software Windows 10 64bit
Benchmark Scores http://www.speedtest.net/result/6097610737.png
I wouldn't count on any price drop on 4090s. In fact, you can expect shortages on all NV top-tier cards for the next couple of years at least. Also, I wouldn't expect much uplift in price/performance for the 5090. Gaming is not what these cards are made for anymore.
I am with you on those thoughts regarding pricing. I am right on the fence between a 4080S and a 5080; the 90s are just out of control and I won't spend that kind of money.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,711 (6.69/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
Unfortunately, proprietary junk has become the norm now and Nvidia dictates the implementation of so much in games. It not only makes it harder for AMD to compete but it also severely locks out any other competitors (Intel and a potential ARM competitor, for example). There's just way too much software lock-in nowadays. AMD could come back in the CPU market because those CPUs could utilize all the features of existing software. How in the world does anyone do that in the GPU market though? It's nearly impossible given the decades of proprietary features Nvidia has baked into the ecosystem.

Correct. The only instance you'd need the internet for AI is something like Microsoft's AI or the one Skyrim mod that lets you talk to NPCs. I really, really hope we don't start seeing games that do require an internet connection for this.

You are probably thinking of WMMA, which increases matrix multiplication throughput 2x (at most) over the prior gen. Not as good as dedicated units, hence the 7900 XTX gets about half the performance of the 4090 in most AI tasks.

Strictly speaking, in regards to the last 8-10 years I agree with you. If we are talking old ATI / AMD, both AMD and Nvidia used to take turns releasing exciting new features.

And yeah, AMD has been stuck in a rut. Their GPU division doesn't make a lot of sales or money, which in turn makes devs unlikely to spend many resources on optimizing for them and makes it unlikely that AMD will invest a significant amount of money closing that gap when the returns aren't there. I mean, just look at Intel: they made huge strides in the driver department but it hasn't paid off for them sales-wise. Intel cards can't use any of the proprietary software features in any game because they haven't been in the dGPU market until recently. There are literally decades of inertia behind Nvidia in terms of both mindshare and software.
Intel was in dGPUs a long time ago with the i740, and it was a flop; then Larrabee was a failure to launch, and it seems the iGPU legacy still haunts them. But anyway, I got off the NV plantation a long time ago, especially after they had no Vista support for the nForce 2.
 
Joined
Dec 25, 2020
Messages
7,061 (4.83/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Wait... 70-100% faster? Isn't it the same node and "only" 150 W more (a 33% increase in power)?

Therefore an increase in frames per watt (because of the architecture) of around 27%~50%?

(I am sure I am missing/wrong about something here...)

Also that GPU will probably cost around 2700 $ ~ 3000 $ if it lives up to your expectations...

The GB202 is said to be exceptionally large. We'll have to wait and see, but I am expecting those figures. Remember, AD102 in the 4090 is cut down (128 of 144 SMs, with 72 of 96 MB of L2 enabled) and it uses the slowest variant of GDDR6X used in Ada generation cards, while GB202 is rumored to be 512-bit GDDR7...
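
A quick back-of-envelope on that memory rumour (the GDDR7 per-pin speed below is an assumption, not a confirmed spec):

Code:
# Peak memory bandwidth = bus width (bits) / 8 * data rate per pin (Gbps)
def bandwidth_gbs(bus_bits, gbps):
    return bus_bits / 8 * gbps

print(bandwidth_gbs(384, 21))  # 4090: 384-bit GDDR6X @ 21 Gbps -> 1008 GB/s
print(bandwidth_gbs(512, 28))  # rumoured GB202: 512-bit GDDR7 @ 28 Gbps -> 1792 GB/s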

It's all speculation of course, but that has been consistently the target; with all improvements accumulated and some of the rumors aligned, I certainly see it happening. Where you are absolutely right is that it probably will be very expensive.

We are going to know for sure as soon as January, though. W1zz is already preparing his benchmark suite for it (I saw the thread asking what games we want benched), so expect news Soon(TM).
 
Joined
Nov 13, 2024
Messages
88 (2.00/day)
System Name le fish au chocolat
Processor AMD Ryzen 7 5950X
Motherboard ASRock B550 Phantom Gaming 4
Cooling Peerless Assassin 120 SE
Memory 2x 16GB (32 GB) G.Skill RipJaws V DDR4-3600 DIMM CL16-19-19-39
Video Card(s) NVIDIA GeForce RTX 3080, 10 GB GDDR6X (ASUS TUF)
Storage 2 x 1 TB NVME & 2 x 4 TB SATA SSD in Raid 0
Display(s) MSI Optix MAG274QRF-QD
Power Supply 750 Watt EVGA SuperNOVA G5
The GB202 is said to be exceptionally large. We'll have to wait and see, but I am expecting those figures. Remember that AD102 in the 4090 is cut down (128 of 144 SMs, with 72 of 96 MB of L2 enabled), and it uses the slowest variant of GDDR6X used in Ada-generation cards, while GB202 is rumored to be 512-bit GDDR7...

It's all speculation of course, but that has consistently been the target. With all the improvements combined and some of the rumors aligning, I certainly see it happening. Where you are absolutely right is that it will probably be very expensive.

We are going to know as soon as January for sure, though. W1zz is already preparing his benchmark suite for it (I saw the thread asking what games we want benched), so expect news Soon(TM).
That is ludicrous... my whole prebuilt PC in the chip-crisis era was cheaper than that card will be at launch (if it really is $2,700 or higher).

I know that thing will curb-stomp my 3080 10GB, but...

Well, at least these GPUs hold their value like iPhones (which is bad for those who don't have a lot of disposable income).


Anyway, maybe you can get a good deal on a used 4080S/4090 card when Blackwell comes out?
(I thought eBay GPUs in my area were more expensive than new, which would be ridiculous, but I looked it up again and I was wrong: used 4090s on eBay, for example, are about €100-150 less expensive.)

(Edit: Also, thanks for explaining what I missed/got wrong.)
 
Joined
Jun 2, 2017
Messages
9,388 (3.40/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
No! Please do not write nonsense.

why?

AMD's open source support is not worth mentioning. There is no GUI for creating fan curves, undervolting, or underclocking. There is the high memory clock bug, and loud fans with the 6800 card, both in Windows 11 Pro and in GNU Gentoo Linux. The topic of why AMD has bad software on my current Windows 11 Pro or my GNU Gentoo Linux is endless.

Go ahead and tell us: how long have you been using the Linux kernel with GNU userspace? How many distros have you used?
I still have my first installer, Slackware 96. Just for explanation, that stands for 1996.

Before you write, I expect from you a current screenshot of your GNU/Linux setup in the screenshot thread in this forum, plus validation that it's you, such as time + date + your forum name. (When you check my posts you will see several i3wm.org screenshots of my current setup, plus one in the mentioned screenshot thread.)

This is always with my current GNU Gentoo Linux install, which dates back to 2006 (my forums.gentoo.org account is basically the installation date).
I used an Nvidia graphics card again for several months in 2023 - a second-hand MSI GTX 960 4GB - to determine the current driver status. That was 2 or 3 years after I had sold my ASUS G75VW gaming laptop.
After the mentioned ASUS G75VW laptop was sold, I had several AMD processors, some of them duplicates: 5800X -> downgrade to 3700X -> another freshly purchased 5800X -> second-hand 3100X (as I was selling the mainboard and the 5800X; the mainboard had faulty Intel WiFi and the 5800X never boosted) -> current Ryzen 7600X. (I think I did not buy anything better because four different 5800X processors on three different mainboards never boosted as I expected. The 7600X is cheap garbage, so less expected or less promised functionality from AMD does not really annoy me that much. I compile a lot, so I can see whether the marketing is bullshit or not.)
GPUs: ASRock Challenger 6600XT 8GB D -> Nvidia GTX 960 -> nothing (Ryzen 7600X integrated graphics) -> MSI Radeon 6800 Z Trio -> current PowerColor 7800XT Hellhound (a good one in regards to being very silent).
In my Socket AM4 days I had three different mainboards to try, plus several different CPU cooler solutions.

Edit: Mainboards are another big topic. AMD mainboards with B550 or X670: I blame AMD for the fact that Nuvoton does not release specs for the chip, so you can barely read out sensors on any mainboards, or on barely any laptops, in GNU/Linux. The topic of Linux + AMD + bad support is endless, and that is just looking at my hardware; there is so much hardware available on the market.

Edit: Compiler support is the next big topic. The optimisations suck for AMD processors, especially the Ryzen 5800X and the Ryzen 7600X at the time I owned them, and AMD does nothing - really nothing.
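
(For context on the "no GUI" complaint above: on Linux the amdgpu driver exposes fan and power controls through sysfs rather than through a bundled control panel. Below is a minimal sketch of the manual route; the card number and hwmon index are assumptions that vary per system, and writing these files requires root.)

[CODE]
# Minimal sketch: manual fan duty and power cap on the Linux amdgpu driver
# via sysfs (no GUI involved). Paths assume card0; the hwmon index varies,
# so it is discovered with glob. Run as root.
import glob

def find_hwmon(card: str = "card0") -> str:
    """Locate the hwmon directory belonging to the given GPU."""
    matches = glob.glob(f"/sys/class/drm/{card}/device/hwmon/hwmon*")
    if not matches:
        raise FileNotFoundError(f"no hwmon directory found for {card}")
    return matches[0]

def set_fan_pwm(hwmon: str, duty: int) -> None:
    """Switch fan control to manual (pwm1_enable = 1) and set duty 0-255."""
    with open(f"{hwmon}/pwm1_enable", "w") as f:
        f.write("1")                              # 1 = manual, 2 = automatic
    with open(f"{hwmon}/pwm1", "w") as f:
        f.write(str(max(0, min(255, duty))))

def set_power_cap(hwmon: str, watts: int) -> None:
    """Set the board power limit; power1_cap is expressed in microwatts."""
    with open(f"{hwmon}/power1_cap", "w") as f:
        f.write(str(watts * 1_000_000))

if __name__ == "__main__":
    hwmon = find_hwmon()
    set_fan_pwm(hwmon, 128)     # ~50% fan duty
    set_power_cap(hwmon, 250)   # cap the board at 250 W
[/CODE]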
Do you know what AMD Software is? I can see you have also been triggered. I could write a laundry list of Intel problems, but all I will say is that none of those CPUs lost performance as the silicon melted from too much voltage - but that would be juvenile.

What does the Steam Deck run on? Is it not a Linux based distro? Is that OS not designed to run on AMD hardware?

Well I use my AM4 chips for CPU mining and they are doing just fine.

[Attachment: Screenshot 2024-11-28 105215.png]
 
Joined
Dec 25, 2020
Messages
7,061 (4.83/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
That is ludicrous... my whole prebuilt PC in the chip-crisis era was cheaper than that card will be at launch (if it really is $2,700 or higher).

I know that thing will curb-stomp my 3080 10GB, but...

Well, at least these GPUs hold their value like iPhones (which is bad for those who don't have a lot of disposable income).


Anyway, maybe you can get a good deal on a used 4080S/4090 card when Blackwell comes out?
(eBay GPUs in my area are more expensive than new, it's ridiculous.)

I'm personally holding out hope it'll cost more or less the same as the 4090 does today on average (that means up to $1,800-2,000 for the basic models); I just hope it doesn't exceed that value, as I can't afford anything above that.

Do you know what AMD Software is? I can see you have also been triggered. I could write a laundry list of Intel problems, but all I will say is that none of those CPUs lost performance as the silicon melted from too much voltage - but that would be juvenile.

What does the Steam Deck run on? Is it not a Linux based distro? Is that OS not designed to run on AMD hardware?

Well I use my AM4 chips for CPU mining and they are doing just fine.

View attachment 373674

You haven't even read his post... Jesus, man. You try to say your defense is justified and not white-knighting, and then you pull that one. Slow down. We get it, you have an inflamed, passionate love for AMD, but fighting everyone you see speak negatively of their products on forums isn't going to keep them afloat; demanding change and improvement will. If not, they deserve to close up shop.

If no one in that multi-billion-dollar international conglomerate has ever stopped to sit down, look at what makes its competition successful, and at least try to follow in the same vein, then perhaps they actually deserve to.
 
Joined
Jun 2, 2017
Messages
9,388 (3.40/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
I'm personally holding out hope it'll cost more or less the same as the 4090 does today on average (that means up to $1,800-2,000 for the basic models); I just hope it doesn't exceed that value, as I can't afford anything above that.



You haven't even read his post... Jesus, man. You try to say your defense is justified and not white-knighting, and then you pull that one. Slow down. We get it, you have an inflamed, passionate love for AMD, but fighting everyone you see speak negatively of their products on forums isn't going to keep them afloat; demanding change and improvement will. If not, they deserve to close up shop.

If no one in that multi-billion-dollar international conglomerate has ever stopped to sit down, look at what makes its competition successful, and at least try to follow in the same vein, then perhaps they actually deserve to.
"AMD open source support is not worth mentioning. There is no gui for creating fan curves, undervolting, underclocking. High clock memory bug - and loud fans with the 6800 card - both in windows 11 pro and gnu gentoo linux. This topic is endless why amd has bad software in my current windows 11 pro or my gnu gentoo linux."
 