
AMD Falling Behind: Radeon dGPUs Absent from Steam's Top 20

Joined
Nov 27, 2023
Messages
2,402 (6.42/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (24H2)
Anyone who has ever worked in Photoshop can quickly tell you that PPI is a laughable metric and that raw pixel count is the one and only metric that matters for image quality.
Okay, this is just a ridiculously maximalist statement at this point. Static images and actual screen quality in daily use aren't the same thing. There is more to it than raw pixel count, and you are arguing for something that nobody brought up. Do you actually, unironically think that there won't be a point in desktop screens where resolution increases no longer lead to perceivable improvements in image quality and the performance hit just won't be worth it? Hint: there absolutely will come such a time.
I know everyone likes clowning on Apple (for some reason), but their Retina concept isn’t actually just a marketing meme and there was thought put behind it. And most serious researchers, even those noting some imperfections with it, tend to agree on the principle.

Well, if your goal is just to make it appear better, then we might as well be using a 14" 1080p monitor at a distance of 2 meters; it will be so tiny you won't see any mishaps.
Yes, it’s called a laptop screen. And yes, the WHOLE GOAL IS IMPROVING PERCEIVED DISPLAY QUALITY, that’s the whole point, not just jacking off to numbers.

Edit: Also, make a discussion or something guys, I just noticed the thread we are in and this is a derail if I ever saw one.
 
Joined
Jan 14, 2019
Messages
12,361 (5.74/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
@fevgatos
You are technically correct (the best kind, I suppose), but what Aus is driving at is that at sizes where hypothetically both 1440p and 4K hit a ppi that is high enough for the task, like 150+ at typical monitor distance or 300+ for mobile use, the PERCEIVED image quality will be very close, if not indistinguishable. I certainly can't reliably tell the difference between, say, a recent iPhone at 460 ppi and a new Galaxy Ultra at 500+, not in any meaningful way.
Exactly. If you upgrade your 1440p monitor with 300 ppi to a bigger 4K one with the same 300 ppi, your picture in gaming (not in Photoshop) will look the same... Just bigger.

When I upgraded from a 24" 1080p monitor to a 34" 1440p ultrawide, my ppi, and therefore my image quality, stayed roughly the same. I just got more FOV and desktop area.
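For anyone who wants to sanity-check the numbers, here's a quick sketch of the diagonal-ppi formula we're both implicitly using (the sizes are the ones mentioned above; treat it as illustrative, not gospel):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'24" 1080p:           {ppi(1920, 1080, 24):.0f} ppi')  # ~92
print(f'34" 1440p ultrawide: {ppi(3440, 1440, 34):.0f} ppi')  # ~110
print(f'27" 4K:              {ppi(3840, 2160, 27):.0f} ppi')  # ~163
```

So the 24"-to-34" upgrade really does land in roughly the same ballpark, which is why the image quality felt unchanged.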

Well, if your goal is just to make it appear better, then we might as well be using a 14" 1080p monitor at a distance of 2 meters; it will be so tiny you won't see any mishaps.
Yeah, might as well if you don't mind the size.

The Photoshop thing was just an example to demonstrate the concept. Set it to 1:1 pixel view and just draw a horizontal line with each pixel having a unique color. On a 1080p monitor you can only get 1920 different colors; on a 4K screen you can get double that. This is literally what image detail is: how many unique colors you can get.
It was a bad example, an apples to oranges comparison. What you're talking about has nothing to do with what I'm talking about.

It's not different in games. What do you think happens in a game when you drop resolution from 4K to 1080p? What do you think happens to the image? There were 8.3m pixels at 4K; now we are down to 2m. Do you think those extra 6.3m pixels were doing nothing there?
Don't forget you're dropping resolution on the same screen size, which sends your effective ppi through the floor. That's why it'll look like crap.

Of course you have less detail on a smaller resolution screen, but if it's a smaller size, you won't notice it.
 
Last edited:
Joined
Jun 14, 2020
Messages
3,504 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Okay, this is just a ridiculously maximalist statement at this point. Static images and actual screen quality in daily use aren't the same thing. There is more to it than raw pixel count, and you are arguing for something that nobody brought up. Do you actually, unironically think that there won't be a point in desktop screens where resolution increases no longer lead to perceivable improvements in image quality and the performance hit just won't be worth it? Hint: there absolutely will come such a time.
I know everyone likes clowning on Apple (for some reason), but their Retina concept isn’t actually just a marketing meme and there was thought put behind it. And most serious researchers, even those noting some imperfections with it, tend to agree on the principle.


Yes, it’s called a laptop screen. And yes, the WHOLE GOAL IS IMPROVING PERCEIVED DISPLAY QUALITY, that’s the whole point, not just jacking off to numbers.

Edit: Also, make a discussion or something guys, I just noticed the thread we are in and this is a derail if I ever saw one.
We are not anywhere near that point for desktop monitors, though. 16K resolution at 42" might get to that point of diminishing returns.
 
Joined
Nov 27, 2023
Messages
2,402 (6.42/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (24H2)
We are not anywhere near that point for desktop monitors, though. 16K resolution at 42" might get to that point of diminishing returns.
Wrong. 8K at 27" will already do the trick. At a normal desktop seating distance, anything beyond 300 ppi would be much of a muchness for anyone with normal or near-normal vision, whether natural or corrected.
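To put a rough number on "normal desktop seating distance," here's a pixels-per-degree sketch. The 60 cm viewing distance and the usual 1-arcminute (~60 ppd) 20/20 acuity threshold are my assumptions, not anything from this thread:

```python
import math

def ppd(ppi: float, distance_cm: float) -> float:
    """Pixels per degree of visual angle at a given viewing distance."""
    distance_in = distance_cm / 2.54
    # One degree of visual angle spans 2*d*tan(0.5 deg) inches at distance d.
    return 2 * distance_in * math.tan(math.radians(0.5)) * ppi

for label, density in [('27" 4K  (~163 ppi)', 163), ('27" 8K  (~326 ppi)', 326)]:
    print(f"{label} at 60 cm: {ppd(density, 60):.0f} ppd")
# 4K lands near the ~60 ppd 20/20 threshold; 8K is more than double it.
```

By that yardstick, 27" 4K already sits around the acuity limit at arm's length, and 8K overshoots it comfortably.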

But, as always, you are picking a really weird hill to die on, so I am out before we derail this any further.
 
Joined
Jun 14, 2020
Messages
3,504 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Don't forget you're dropping resolution on the same screen size, which sends your effective ppi through the floor. That's why it'll look like crap.
Forget the monitor; you have two monitors side by side, a 1080p and a 4K. What happens to those 6.3m pixels?
 
Joined
Jan 14, 2019
Messages
12,361 (5.74/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Forget the monitor; you have two monitors side by side, a 1080p and a 4K. What happens to those 6.3m pixels?
Are PPI and viewing distance the same? If so, then I don't care.

4K can give you a much bigger screen area, or a much better image quality (ppi), or a little bit of both. That's what happens with the 6.3m pixels.

It's not the pixels alone that give you a better image, but your perception of them, which is highly dependent on your ppi and viewing distance (not to mention your eyesight).

Edit: I have a question too. You have one image. You open it in Photoshop on your 4K screen and zoom in on it. Then, you open it on your phone and don't zoom in on it. Where will it look better?
 
Joined
Jun 14, 2020
Messages
3,504 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Are PPI and viewing distance the same? If so, then I don't care.

4K can give you a much bigger screen area, or a much better image quality (ppi), or a little bit of both. That's what happens with the 6.3m pixels.

It's not the pixels alone that give you a better image, but your perception of them, which is highly dependent on your ppi and viewing distance (not to mention your eyesight).

Edit: I have a question too. You have one image. You open it in Photoshop on your 4K screen and zoom in on it. Then, you open it on your phone and don't zoom in on it. Where will it look better?
The question isn't whether you care; the question is what were those 6.3m pixels displaying, if not extra details?

Edit: I have a question too. You have one image. You open it in Photoshop on your 4K screen and zoom in on it. Then, you open it on your phone and don't zoom in on it. Where will it look better?
Depends on the resolution of the phone, doesn't it?
 
Joined
Jan 14, 2019
Messages
12,361 (5.74/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
The question isn't whether you care; the question is what were those 6.3m pixels displaying, if not extra details?
No. The question is, would you notice those 6.3m extra pixels?

Depends on the resolution of the phone, doesn't it?
No. It depends on the PPI of the phone compared to your monitor.

Edit: Okay, let's make it simple. Here are two pictures. Lean back in your chair and tell me: which one looks better to you? (Hint: they're the exact same picture.)

[Attachments: 1.png, 1.png]
 
Last edited:
Joined
Jun 14, 2020
Messages
3,504 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Currently on my phone: the top one looks worse; there is a lot of lost detail, especially on the edges. I'll try on my monitor in the afternoon.
 
Joined
Jan 14, 2019
Messages
12,361 (5.74/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Currently on my phone: the top one looks worse; there is a lot of lost detail, especially on the edges. I'll try on my monitor in the afternoon.
Wait, I had to edit my post because my image editor didn't save properly. Look again (I'm not sure if you can see what I mean on your phone, though - I can on mine).

I don't mind if you PM me your observations. We've probably derailed this thread long enough.
 
Joined
Jul 24, 2024
Messages
251 (1.87/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
4K DLSS Q looks disgustingly better than 1440p native. Period.
It's not about the PPI; ppi is irrelevant.
You clearly don't seem to understand how human vision relates to a change in a display's PPI. It's true that a 24" 1440p display has the same number of pixels as a 28" 1440p one, yet your eye will see a different picture: you can easily stop using a more sophisticated AA method on that 24" because aliasing won't be as noticeable as on the bigger display.

Take a screenshot while playing at a lower resolution, then display that same screenshot on a bigger monitor of the same resolution. If your vision really is okay, as you said, you should be able to notice a difference. I once had a discussion with a Mac laptop guy. He bought that 13" laptop with a shedload of pixels to work with photos: "Retina is like it's made for this, it's the best you can get." And yet professional video and photo editors use much larger EIZO displays. Why? Because on that 13" 2048x1536 Retina screen they can't see sh*t with their own eyes; they can't see what their filters and other applied effects do.

While more PPI at the same physical display size increases that display's resolution, it reduces the eye's ability to perceive those details individually. This, of course, differs from person to person, the same way the audible frequency range differs from person to person, and this eye or ear "resolution" degrades with age. I hate it when the AC/DC adapter near my bed makes that constant high-pitched tone while it's charging a phone; my girlfriend can't hear it, and she's four years younger.

When a video, image, or sound recording gets downscaled, a portion of the information is permanently destroyed. You can't re-create that information (unless it's stored in some other form); you can only guess, or use methods that improve the guessing accuracy (e.g. interpolation). That's what DLSS/FSR/XeSS do. Of course, upscaling algorithms are getting better and better, but you still can't re-create the missing information. You can generate something similar, but it will never be the same as the original, meaning it will never be what the authors intended it to be. That's why I call everything but native fake.
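A toy illustration of that point (just a sketch with a made-up signal, not any particular upscaler): throw away 3 of every 4 samples of a signal, interpolate the gaps back, and the high-frequency content is gone for good.

```python
import numpy as np

t = np.linspace(0, 1, 8000, endpoint=False)
# A signal with both low- and high-frequency content.
original = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 1800 * t)

down = original[::4]                   # "downscale": keep every 4th sample
restored = np.interp(t, t[::4], down)  # "upscale": interpolate the gaps

rms = np.sqrt(np.mean((original - restored) ** 2))
print(f"RMS error after the round trip: {rms:.3f}")  # clearly nonzero
# The 1800 Hz component exceeds the reduced Nyquist limit (1000 Hz after
# decimation), so no interpolation can bring it back; it can only guess.
```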

Maybe try taking a 24-bit FLAC recording with rich tonal variety, especially in the lower frequencies, converting it to 320 kbps MP3, then converting it back to 24-bit FLAC, and compare the two using $200 headphones and a standard DAC. You will notice that the lower tones in particular sound poor or are even missing. When you downscale 24-bit FLAC to MP3 at roughly 7-8 times less bitrate, you lose a lot of information that you can't recreate by upscaling back to the original resolution. Even with so-called "AI", various filters, etc., it won't reach the quality of the original sound. It may end up as a similar-sized file as the original recording, but it won't be the same; it will be something different from what the author made it to be.
 
Last edited:

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
13,072 (2.39/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
Stop with the derail. Pretty sure there was an actual monitor post made recently - go find it, go there.

OTs will just get deleted, and posters reply-banned.
 
Joined
Nov 13, 2023
Messages
23 (0.06/day)
Processor Intel i7-10700
Cooling Noctua
Video Card(s) Nvidia Gigabyte 4070 Ti Windforce OC
Display(s) Msi 170 Hz 1440p
Case Phantek A400
Power Supply Seasonic 850w Platinum
I'm not surprised at all by this news. AMD keeps having problems with Radeon every generation: high idle power consumption, stuttering and low utilization issues in old games (like 15-20 year old games), less frequent driver updates for older products, and the list goes on.

At the same time, it's really a shame that Nvidia is allowed to raise their prices as much as they want. We need competition, but AMD and Intel aren't capable of competing... the GPU market is SO bad right now.
 
Joined
Jun 2, 2017
Messages
9,251 (3.37/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
So who is the end user? Enthusiasts? Or regular people who just want to run games?
There is the rub. In every scenario AMD is cheaper as well. You have people commenting on how middle-of-the-road a card like the 7900XT is, without realizing that it is at the top of every chart that matters to gamers. There is even a poll to prove it.

I'm not surprised at all by this news. AMD keeps having problems with Radeon every generation: high idle power consumption, stuttering and low utilization issues in old games (like 15-20 year old games), less frequent driver updates for older products, and the list goes on.

At the same time, it's really a shame that Nvidia is allowed to raise their prices as much as they want. We need competition, but AMD and Intel aren't capable of competing... the GPU market is SO bad right now.
The News vs the truth.

1. High idle power? I have no idea what you mean. Do you mean more than 10 watts? 20 watts? I would also ask you to show me a GPU that has better idle power than the 6500XT.
2. Give me a game; I have plenty of old ones too. Do you mean DOS, or Steam titles like Kingdoms of Amalur or Sleeping Dogs? Maybe Just Cause 2? I have not seen what you describe. Maybe Praetorians?
3. Less frequent updates for older products? Yeah, it kind of blows that Vega is no longer getting driver support. Except my 5600G has the latest driver.

The truth is that this narrative ignores the fact that China was openly buying as many 4090/4080 GPUs as they could, and Nvidia was allowed to count that in their numbers. It is like when TW: Three Kingdoms launched on Steam: it instantly became the most popular TW title in terms of sales, but TWWH is the real driver of the Total War economy. Then you combine that with the tech media all using 4090s, to the point where you will see comments on TPU like "you can't game at 4K unless you have a 4090/4080" and "the 7900XTX/7900XT are not as good at RT, so they are not worth the money." Then you look at prices and realize that sales are down across the board, with motherboards and GPUs priced to the moon. Even storage, as volatile as it has been, has seen nothing like the price gouging that Nvidia started. As an example, if the 7900XT were $450 US, there would be no reason to buy anything else. Where I live, that is about the cost of a 4060 Ti. Is a 4060 Ti better than a 7900XT at anything? Before you answer that, read some GPU reviews on TPU and focus on where the 7900XT sits on the gaming charts.

These modern reviews also do not use the CPU-intensive games that make your PC cry, like Cities: Skylines 2 or Factorio. CS2 is the first game where, at 4K, I had to turn on HYPR-RX once my population started reaching 1 million. Try that at 4K high and you will clearly see the separation between CPUs in cores and clock speed. In fact, most games at 4K high on these modern systems are a great way to gauge CPU performance. It let me know that a 7800X3D is not as fast as a 7900X3D in Cities: Skylines 2 at 4K, and neither will it produce as many FPS in racing games like AMS2. Reviews use 4K Ultra, or whatever the highest setting the game allows, and that all but ensures that the GPU does all the work, as the frame buffer will always be on when you allow all the candy.

If you want pure raster, AMD is a great choice.
 
Joined
Nov 13, 2023
Messages
23 (0.06/day)
Processor Intel i7-10700
Cooling Noctua
Video Card(s) Nvidia Gigabyte 4070 Ti Windforce OC
Display(s) Msi 170 Hz 1440p
Case Phantek A400
Power Supply Seasonic 850w Platinum
There is the rub. In every scenario AMD is cheaper as well. You have people commenting on how middle-of-the-road a card like the 7900XT is, without realizing that it is at the top of every chart that matters to gamers. There is even a poll to prove it.


The News vs the truth.

1. High idle power? I have no idea what you mean. Do you mean more than 10 watts? 20 watts? I would also ask you to show me a GPU that has better idle power than the 6500XT.
2. Give me a game; I have plenty of old ones too. Do you mean DOS, or Steam titles like Kingdoms of Amalur or Sleeping Dogs? Maybe Just Cause 2? I have not seen what you describe. Maybe Praetorians?
3. Less frequent updates for older products? Yeah, it kind of blows that Vega is no longer getting driver support. Except my 5600G has the latest driver.

The truth is that this narrative ignores the fact that China was openly buying as many 4090/4080 GPUs as they could, and Nvidia was allowed to count that in their numbers. It is like when TW: Three Kingdoms launched on Steam: it instantly became the most popular TW title in terms of sales, but TWWH is the real driver of the Total War economy. Then you combine that with the tech media all using 4090s, to the point where you will see comments on TPU like "you can't game at 4K unless you have a 4090/4080" and "the 7900XTX/7900XT are not as good at RT, so they are not worth the money." Then you look at prices and realize that sales are down across the board, with motherboards and GPUs priced to the moon. Even storage, as volatile as it has been, has seen nothing like the price gouging that Nvidia started. As an example, if the 7900XT were $450 US, there would be no reason to buy anything else. Where I live, that is about the cost of a 4060 Ti. Is a 4060 Ti better than a 7900XT at anything? Before you answer that, read some GPU reviews on TPU and focus on where the 7900XT sits on the gaming charts.

These modern reviews also do not use the CPU-intensive games that make your PC cry, like Cities: Skylines 2 or Factorio. CS2 is the first game where, at 4K, I had to turn on HYPR-RX once my population started reaching 1 million. Try that at 4K high and you will clearly see the separation between CPUs in cores and clock speed. In fact, most games at 4K high on these modern systems are a great way to gauge CPU performance. It let me know that a 7800X3D is not as fast as a 7900X3D in Cities: Skylines 2 at 4K, and neither will it produce as many FPS in racing games like AMS2. Reviews use 4K Ultra, or whatever the highest setting the game allows, and that all but ensures that the GPU does all the work, as the frame buffer will always be on when you allow all the candy.

If you want pure raster, AMD is a great choice.
1) Read the TechPowerUp reviews about the high idle power. I think it has been fixed now, but it was still there, even on previous RDNA generations.

2) Unfortunately I haven't seen many examples of games where this happened. I read about it at least a year ago, but I've seen people complaining about it at least 3-4 times, like here https://www.overclock.net/threads/rx-6000-cards-are-disgustingly-bad-for-old-games.1805753/ and https://www.reddit.com/r/buildapc/comments/127kaqo and https://www.reddit.com/r/AMDHelp/comments/1bqmkbj
"Nvidia spends lots and lots on game-specific optimizations throughout the years that AMD didn't catch up on, and right now it makes no financial sense for AMD to work on the old stuff. Plus, higher market share of discrete Nvidia means many lesser-known games only optimized for Nvidia instead of AMD. Overall you're more likely to have a worse experience playing older/less popular games with AMD compared to Nvidia."

3) Not even talking about Vega: when RX 7000 released, many people were angry because updates for RX 6000 slowed down so much... Also unrelated, but Destiny 2 had awful performance issues on AMD in 2020 when the new DLC dropped, and AMD took MONTHS to fix the abysmally low performance. I had two friends with a 5700 XT who complained so much about this...

Also, Nvidia has Dynamic Super Resolution, which allows me to play older games at 5120x2880 instead of 1440p (and when I get a 4K monitor I can play older games in 8K). AMD's Virtual Super Resolution can go above 4K, but they don't even advertise it... in fact, until now I thought it couldn't go above 4K at all...

"As an example if the 7900XT was $450 US there would be no reason to buy anything else" I agree but that will never happen
 
Joined
Jun 2, 2017
Messages
9,251 (3.37/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
1) Read the TechPowerUp reviews about the high idle power. I think it has been fixed now, but it was still there, even on previous RDNA generations.

2) Unfortunately I haven't seen many examples of games where this happened. I read about it at least a year ago, but I've seen people complaining about it at least 3-4 times, like here https://www.overclock.net/threads/rx-6000-cards-are-disgustingly-bad-for-old-games.1805753/ and https://www.reddit.com/r/buildapc/comments/127kaqo and https://www.reddit.com/r/AMDHelp/comments/1bqmkbj
"Nvidia spends lots and lots on game-specific optimizations throughout the years that AMD didn't catch up on, and right now it makes no financial sense for AMD to work on the old stuff. Plus, higher market share of discrete Nvidia means many lesser-known games only optimized for Nvidia instead of AMD. Overall you're more likely to have a worse experience playing older/less popular games with AMD compared to Nvidia."

3) Not even talking about Vega: when RX 7000 released, many people were angry because updates for RX 6000 slowed down so much... Also unrelated, but Destiny 2 had awful performance issues on AMD in 2020 when the new DLC dropped, and AMD took MONTHS to fix the abysmally low performance. I had two friends with a 5700 XT who complained so much about this...

Also, Nvidia has Dynamic Super Resolution, which allows me to play older games at 5120x2880 instead of 1440p (and when I get a 4K monitor I can play older games in 8K). AMD's Virtual Super Resolution can go above 4K, but they don't even advertise it... in fact, until now I thought it couldn't go above 4K at all...

"As an example if the 7900XT was $450 US there would be no reason to buy anything else" I agree but that will never happen
1. Wait, you just said it is a nothing-burger.

2. Anecdotal, from three people, but I am not going to argue. I had a 5600 XT, got a 6800 XT at launch, and it has been golden; the same thing happened when I got a 7900XT. When you think about it, older games have fewer features, so the raw performance of modern PCs should be a huge advantage, not a disadvantage. Take TW: Rome and see how fast a modern PC is; the max resolution is 1080p. I seriously have no idea where you get that. Do you mean Hairworks in The Witcher? Maybe you mean PhysX? That died as soon as Nvidia locked you out: if the program detected an AMD card, it would not work. I could give you the features in CP2077, but to be honest, the raw raster performance and what 4K looks like on a modern PC are fine for my eyes and everyone else's in my circle. This is anecdotal as well, but my friends with 3080s are not as happy as those with 6800 XTs.

3. Yep, those 3 months to add the 7000 series to the universal stack were so long, and had such a negative effect on performance, that the entire community wet their panties while the other 90% of users did not even know anything had happened. We can both use individual games to critique both AMD and Nvidia performance; Hogwarts and Starfield come to mind.

4. AMD is not Intel, and they have also added lots of things like FreeSync and FSR (as much as it is derided, it is universal), but even before that, AMD cards have always been cheaper than Nvidia's.

AMD's software is that good, and I know that AMD has advertised it, but again, you raised an issue that is a nothing-burger.

I am not going to deny that most AMD cards do not sell as well as Nvidia cards, but when the entire narrative is against you, and people who have not used AMD cards strongly opine on them, it shows. I will give you an example. How many of the main YouTube channels have done a deep dive on AMD's software? How many tech media sites have done a deep dive on AMD's software? How many even know what AMD's software is today if they only use Nvidia cards in their videos? That is sad, because AMD's software today can easily give you whatever you want. I have been playing a lot of Cities: Skylines 2 lately, and my population on my latest build is over 800,000, with 200 km of train tracks, 35 subway stops, and 6 interchanges. A third of the city is office buildings, so there is plenty of traffic from outside. Playing at 4K tanked my FPS from the 50s and 60s to the low 20s to high 30s. Well, I went into the AMD software, clicked on the icon for CS2, and HYPR-RX was instantly activated. Went back into Cities: Skylines 2 and now we are in the 60s and back in FreeSync range for butter-smooth gameplay. That means fast vehicles on the highways and fast-moving foot traffic at train and subway stations. AMD are actually trying, and the fine wine is in full effect. Just look at how many people are still happy with their 6800 XT. If I were playing at 1440p, I would be happy with a 6800 XT too, but I am an enthusiast (not a negative for those on 1440p), and 1440p was my QNIX monitor from like 14 years ago. I just happen to like AMD, as they align more with my thought process on what the PC should be. No AMD user paid for them to implement FreeSync, and the whole world has benefited, with TVs now coming with FreeSync (VRR) technology for the masses to enjoy.
 
Joined
Sep 17, 2014
Messages
22,499 (6.03/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Last edited:
Joined
Apr 2, 2011
Messages
2,820 (0.56/day)
So...after 9 pages this seems like a discussion about AMD versus Nvidia. I don't buy it.

What I do buy is that AMD is changing their position in the market, and Steam is not a great indicator of their future. Steam doesn't track PlayStation or Xbox consoles. It also skips the Switch...but that's an Nvidia product. I...have to acknowledge that the Switch has outsold the PlayStation...but if you look at overall gaming PC sales versus console sales, the point of contention is still that consoles outsell gaming PCs. That's...a lot of money.

I see AMD succeeding on the CPU side, and thriving with their APUs...and acknowledge that their next generation of video cards will not be high-end. It seems like they are going the Nintendo route, where what they put out is not the best. It is not pushing new features. It is pushing value and profitability for a chunk of the market that they think they can milk, as long as they actually pay it attention. It's the same logic that UserBenchmark has been called out on in their ever-amusing war against everything AMD: they have to acknowledge that AMD wins in some things, but follow it up by insulting their consumers and AMD's marketing team. They always cite that Intel is better...even when the numbers disagree. AMD evaluated the GPU race and has stepped back from the PC enthusiast market to carve out their niche in consoles...which, based on financials, seems to have been a good step.

Before anyone calls it out: Nvidia made a better move by selling their hardware as AI development tools. No questions there. I just look forward to the next two generations as Nvidia funnels all of their development there...and the eventual collapse of the power-hungry, glorified LLMs that make up current AI models. AMD will have consumers in the $300 window, but if the 4060 is what Nvidia is willing to do for that consumer, hopefully there will be a reckoning.
 
Joined
Mar 17, 2017
Messages
5 (0.00/day)
I think the point was that the $100 range of GPUs has been replaced with a gaping hole in recent years, which AMD could fill if they wanted to, but they don't for some wild reason.
The expectations for GPUs have risen. They're being asked to do more than they used to, including driving larger and higher resolution monitors. Plus there has been inflation since the days of $100 gaming GPUs. I don't think those are ever coming back. In addition, we now have competent integrated GPUs; those have taken over the niche that $100 cards used to occupy.
 
Joined
Jul 24, 2024
Messages
251 (1.87/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
The expectations for GPUs have risen. They're being asked to do more than they used to, including driving larger and higher resolution monitors. Plus there has been inflation since the days of $100 gaming GPUs. I don't think those are ever coming back. In addition, we now have competent integrated GPUs; those have taken over the niche that $100 cards used to occupy.
Today's GPUs are too complicated; they are too versatile. On one hand it's good to be versatile; on the other, you can't focus on everything really well (you lack the resources). Today the GPU abbreviation stands more for General Processing Unit than Graphics Processing Unit. I don't see a reason to have an NPU included in both the GPU and the CPU. It's okay to have an iGPU and NPU in a laptop SoC, because space is scarce and putting them together is convenient, but having an iGPU and/or NPU in a desktop CPU is a waste of silicon in my opinion: a waste of space, a waste of bandwidth. More cores, larger caches, or better I/O capabilities would be much more useful. I mean, seriously, is there anyone who really buys an Intel desktop CPU intending to run it with that shitty iGPU? You can buy a GPU with much higher so-called "AI" performance than that tiny little NPU in a desktop CPU. Or you can buy a PCIe-based "AI" accelerator add-in card, in case you don't want to use your GPU's NPU capabilities or your GPU doesn't support "AI".
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
I mean, seriously, is there anyone who really buys an Intel desktop CPU intending to run it with that shitty iGPU?
Literally every IT department in the world, which together buy the vast majority of CPUs produced.
 
Joined
Jun 19, 2020
Messages
102 (0.06/day)
The Steam HW Survey is NOT a statistic.
It lacks many statistical requisites: sampling methodology (sample size, how the sample is chosen), deviations, probability of error, etc.

Thus, its data cannot be taken seriously. After all: "I only believe in statistics that I doctored myself." (attributed to Joseph Goebbels)
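For what it's worth, the pure sampling-error part is easy to compute. With hypothetical numbers (a 1% share and 100,000 random responses, both made up for illustration), a simple random sample that size would be very precise, which is exactly why the undisclosed selection methodology, not the sample size, is the real objection:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Half-width of a ~95% confidence interval for a sampled proportion."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical: a GPU at 1% share in a sample of 100,000 responses.
p, n = 0.01, 100_000
print(f"{p:.1%} +/- {margin_of_error(p, n):.3%}")  # ~1.0% +/- 0.062%
# Tiny sampling error, IF the sample were random. With an opt-in,
# undisclosed selection process, no such guarantee applies.
```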
 
Joined
Jul 24, 2024
Messages
251 (1.87/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
Literally every IT department in the world, which together buy the vast majority of CPUs produced.
True, serves me right. I should have been more specific about my question being related to TPU.
 