
Do you want AMD to make better low-end graphics cards?

Do you want AMD to make better low-end graphics cards?


  • Total voters
    84
Joined
Jul 29, 2022
Messages
525 (0.60/day)
The RX 6600 is indeed good; the problem is that after 6 years and 3 generations, for the same price bracket, you get little to no improvement. You only see improvements if you buy the more expensive tiers. Imagine if the GeForce GTX 970 and RTX 3050 had the same price and/or performance. That's what comparing Polaris to the 6600 looks like: 6 years and almost no increase in performance per dollar (and, in fact, a reduction in hardware features).

Plus, you can get second-hand cards ridiculously cheap. I can get a Vega 56 with Samsung HBM2 (an incredibly good overclocker, which makes it deceptively powerful) for something like $170. With a bit of tweaking, it performs on the level of a 6600 XT, which costs twice as much new (and about 50% more even second-hand).
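To make the perf-per-dollar point concrete, here's a trivial sketch. All numbers are illustrative placeholders, not TPU chart data or real prices; plug in whatever figures you trust:

```python
def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Performance index points per dollar spent."""
    return relative_perf / price_usd

# Hypothetical perf index and prices, purely for illustration
cards = {
    "RX 580 (Polaris)": (100, 230),
    "RX 6600":          (185, 250),
}

for name, (perf, price) in cards.items():
    print(f"{name}: {perf_per_dollar(perf, price):.2f} perf/$")
```

If the numbers barely move between the two rows after six years, that's the stagnation being complained about.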
 
Joined
Aug 21, 2015
Messages
1,751 (0.51/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marathon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
Plus, you can get second-hand cards ridiculously cheap. I can get a Vega 56 with Samsung HBM2 (an incredibly good overclocker, which makes it deceptively powerful) for something like $170. With a bit of tweaking, it performs on the level of a 6600 XT, which costs twice as much new (and about 50% more even second-hand).

Even taking into consideration that the TPU performance chart contains calculated estimates, I find that hard to believe; tweaks don't generally net you ~50%. Your observation on pricing, though, is definitely true; the 6600 XT should certainly be under $300.

[attachment: 1667954907306.png — TPU relative performance chart]


Now, Vega 64 vs. the vanilla 6600, that's more plausible:

[attachment: 1667954963474.png — TPU relative performance chart]
 
Joined
Jan 14, 2019
Messages
12,548 (5.79/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I voted yes, although I own a 6400 and also a 6500 XT and I'm not unhappy with either. The 6400 is a nice little HTPC / casual gaming card, and the ASUS Tuf 6500 XT is the quietest dual-fan card I've ever had. I recently bought a 6750 XT, but I'll probably reuse either the 6400 or the 6500 XT in my reconfigured HTPC. The only thing I would have changed on these is the launch price.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,245 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12s
Benchmark Scores 4k120 OLED Gsync bliss
Guys, if you think the focus of the thread should be Nvidia, make a post about their low-end cards. This one's about AMD.
 
Joined
Jun 2, 2017
Messages
9,354 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
I voted yes, although I own a 6400 and also a 6500 XT and I'm not unhappy with either. The 6400 is a nice little HTPC / casual gaming card, and the ASUS Tuf 6500 XT is the quietest dual-fan card I've ever had. I recently bought a 6750 XT, but I'll probably reuse either the 6400 or the 6500 XT in my reconfigured HTPC. The only thing I would have changed on these is the launch price.
It is amazing how different the disposition of owners of the budget 6400/6500 XT cards is compared to those who rely solely on reviews to make their argument. I have said this before, but when I took my 6500 XT apart (thanks, Gigabyte, for skimping on paste), I was blown away by how small the chip was. Even the memory chips are about double the size of the GPU. I have also said before that the budget segment will probably become an AMD/Intel APU, most likely AMD, as that chip could easily fit in the die of an AM5 chip. A 65 W TDP on the CPU with a 200 W limit on the package could mean at least 100 W for the GPU. Cooling it would be a bother, but you would actually have a VRM comparable to the monsters on modern boards.
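The power-budget math above is simple subtraction; a quick sketch with the post's example numbers (the 200 W package limit is the poster's hypothetical, not a real AM5 spec):

```python
def gpu_headroom(package_limit_w: float, cpu_tdp_w: float) -> float:
    """Power left for an integrated GPU once the CPU's TDP is carved out."""
    return package_limit_w - cpu_tdp_w

# Hypothetical 200 W package limit minus a 65 W CPU TDP
print(gpu_headroom(200, 65))  # 135 W of headroom, comfortably "at least 100 W"
```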
 
Joined
Jan 14, 2019
Messages
12,548 (5.79/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
It is amazing how different the disposition of owners of the budget 6400/6500 XT cards is compared to those who rely solely on reviews to make their argument. I have said this before, but when I took my 6500 XT apart (thanks, Gigabyte, for skimping on paste), I was blown away by how small the chip was. Even the memory chips are about double the size of the GPU. I have also said before that the budget segment will probably become an AMD/Intel APU, most likely AMD, as that chip could easily fit in the die of an AM5 chip. A 65 W TDP on the CPU with a 200 W limit on the package could mean at least 100 W for the GPU. Cooling it would be a bother, but you would actually have a VRM comparable to the monsters on modern boards.
Well, reviewers (especially on YouTube) rely on readers/viewers to keep going, so most of them resort to clickbait to achieve what they need. Nobody watches your video if you title it "6500 XT review - it's okay" (although I would).

Another thing is that monitor resolutions keep going up together with GPU sizes, but people who don't plan on going with the flow and are still happy with 1080p (like myself) don't need the latest and greatest shiny tech.

This is why I've changed my motto of "read reviews before buying" to "only first-hand experience can paint the full picture".
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,245 (1.28/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12s
Benchmark Scores 4k120 OLED Gsync bliss
This is why I've changed my motto of "read reviews before buying" to "only first-hand experience can paint the full picture".
I certainly think this applies strongly to more subjective things like DLSS/FSR/FG image quality. I do get sick of hearing outright negativity about those from people who haven't actually experienced them, but I'd argue that a card's raster performance is a lot easier to judge from reviews alone. It's just that some reviewers are more thorough than others, and some go out of their way to find weaknesses, etc.

So I agree somewhat, but not fully: far more can be ascertained from a review comparing like-for-like performance against other video cards, and the proof is much more in the numbers than in subjective impressions for this kind of thing, imo. That said, I've certainly heard a lot of non-3080 buyers tearing strips off it for only having 10 GB of VRAM, but as an owner I've never once hit the limit, and I game at 4K, yet other people keep telling me how bad it is. Experience certainly seals the deal.
 
Joined
Jun 2, 2017
Messages
9,354 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Well, reviewers (especially on YouTube) rely on readers/viewers to keep going, so most of them resort to clickbait to achieve what they need. Nobody watches your video if you title it "6500 XT review - it's okay" (although I would).

Another thing is that monitor resolutions keep going up together with GPU sizes, but people who don't plan on going with the flow and are still happy with 1080p (like myself) don't need the latest and greatest shiny tech.

This is why I've changed my motto of "read reviews before buying" to "only first-hand experience can paint the full picture".
This is the beauty of the PC space: we are all the same, yet so different too. I agree with you on reviewers, but it does have a silver lining. During the height of Covid, InWin launched an AIO, the SR series, which came in 240 and 360 variants.

There are 5 reviews of it on TPU, and none of them are in English. There are also 10 or so videos in English on the cooler on YouTube, and 2 of them are reviews. Since none of the big media outlets covered it (I assume no samples were sent out), the cooler launched with no fanfare. Within about a week of launch, the 360 version was going for $119.99, at a time when Asetek-based coolers like the Corsair 360 AIOs were over $200 for the same thing. The InWin has a bigger cold plate, a huge head with 2 pumps, and ARGB, with one of the easiest installs ever. I have used 5 of them in builds, and I have a 240 cooling a 5900X with pretty impressive temps. The last 360 I bought was $109.99.

Where I disagree is display technology. 4K is a dream on a desktop, especially at bigger sizes; the pixel density is crazy. If you read comic books, think of the difference between an omnibus you could buy today and the original comic. Animated content, some games, reading text, and watching videos in windowed mode are all impressive. OLED and VA, even though they are different, both look very beautiful in 4K if (in the case of VA) they have 400+ nits of brightness. A 4K 144 Hz panel is the cat's meow, and FreeSync/VRR is a must; the beauty of it is a range of 47–144 Hz. It does suck, though, that you have to spend to enjoy it.

I certainly think this applies strongly to more subjective things like DLSS/FSR/FG image quality, I do get sick of hearing outright negativity for those from people who've not actually experienced it, but I'd argue that a cards rast performance is a lot easier to judge from reviews alone. Just some reviewers are more thorough than others, and some go out of their way to find weaknesses etc.

So I agree somewhat but not fully, far more can be ascertained from a review comparing like for like performance against other video cards, the proof is much more in the numbers than it is subjective for this type of thing imo. But, I certainly have heard from a lot of non-3080 buyers tearing strips of it for only having 10GB VRAM, but as an owner I've never once hit the limit, and I game at 4k, yet other people keep telling me how bad it is. Experience certainly seals the deal.
One of the things I do is read and watch reviews, but also read user reviews on the retail site for the same product. I had a 2-page "conversation" with a user the other day about the 5900X/5800X3D and 5600X, and even though I have used all of those CPUs heavily, he still "knew" more than me about how they were different and why they should be avoided vs. Intel. It's kind of sad, because that sort of thing used to be reserved for Reddit, but I find there are a few users on TPU who are "electronics engineers" because they can read about and watch a video on the product.
 
Joined
Sep 6, 2013
Messages
3,391 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500
Motherboard X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2)
Cooling Aigo ICE 400SE / Segotep T4 / Noctua U12S
Memory Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200
Video Card(s) ASRock RX 6600 + GT 710 (PhysX) / Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Comparing Creative and Nvidia -sigh-. An apologist, joy.
At no point in NV's history were they ever remotely in danger of becoming obsolete, with or without the switch to greedy, margin-based sales practices. They have always been healthy as a hog in every business sense. Saying they were ever in danger of becoming the next Creative is hilarious. Creative died because they couldn't compete with high-quality integrated audio, and you're saying that was the road ngreedia was going down... sure. Terrible analogy.
Integrated graphics always have been and, generally speaking, still are trash. I wonder why AMD and Intel are scrambling to fill the gap with decent onboard options after, what, 20 years of Intel's junk? There's a massive river of cash flow just waiting to be tapped. Anyone who owns, or is planning on buying, a halfway decent PC wants it paired with quality graphics. They want every aspect of their PC experience to be satisfying, and integrated rarely fills that role.
Comparing anything to an MX is just sad, btw. Not sure why you felt that was relevant? Advancements are naturally going to outperform old tech.
Integrated didn't cause anything. It's simply what's being used to try to fill the hole that's been left at the very bottom of the market. There's another level of performance that's being ignored. That's where the aforementioned 6600M type of value and performance fits in perfectly, but both AMD and NV are too busy hyping to take advantage. It's a big-ass niche called value: the forgotten market that NV and AMD are trying to squeeze out of existence.
You come to that conclusion, that Nvidia was never in any danger, by looking at what Nvidia has become: a company with a market capitalization bigger than Intel's and AMD's combined. And while capitalization doesn't tell the whole truth about a company's real size, it does say something about its evolution and future. So you judge based on the results up to today. But things could be different today if, for example, Intel had never abandoned its efforts to build discrete GPUs, or if they had been more open-minded when Jensen was starting to transform that simple graphics processor into a powerful compute unit that could make server processors look pathetic in specific workloads.

Let's not continue this, because you obviously can't see the whole picture here, and you also have zero understanding of the market. But I do suggest we wait and see how Nvidia deals with Intel in the next 5-10 years. If AMD keeps getting bigger, even slowly, and doesn't collapse, and Intel gets more competitive in GPUs, let's see to whom Nvidia will be selling its GPUs, and at what profit margins. That's the reason they wanted to buy out ARM: because they see a possibility that scares them.

But let's wait and see.
 
Joined
Jun 2, 2017
Messages
9,354 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
You come to that conclusion, that Nvidia was never in any danger, by looking at what Nvidia has become: a company with a market capitalization bigger than Intel's and AMD's combined. And while capitalization doesn't tell the whole truth about a company's real size, it does say something about its evolution and future. So you judge based on the results up to today. But things could be different today if, for example, Intel had never abandoned its efforts to build discrete GPUs, or if they had been more open-minded when Jensen was starting to transform that simple graphics processor into a powerful compute unit that could make server processors look pathetic in specific workloads.

Let's not continue this, because you obviously can't see the whole picture here, and you also have zero understanding of the market. But I do suggest we wait and see how Nvidia deals with Intel in the next 5-10 years. If AMD keeps getting bigger, even slowly, and doesn't collapse, and Intel gets more competitive in GPUs, let's see to whom Nvidia will be selling its GPUs, and at what profit margins. That's the reason they wanted to buy out ARM: because they see a possibility that scares them.

But let's wait and see.
I would love to be a fly on the wall when HP and Lenovo have meetings with Intel and AMD about how much they can save by not using Nvidia. Specifically for this thread, there could be a day when AMD dominates the laptop market, even at the high end, due to efficiency, with Intel a close 2nd.
 
Joined
Sep 6, 2013
Messages
3,391 (0.82/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500
Motherboard X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2)
Cooling Aigo ICE 400SE / Segotep T4 / Noctua U12S
Memory Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200
Video Card(s) ASRock RX 6600 + GT 710 (PhysX) / Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
I would love to be a fly on the wall when HP and Lenovo have meetings with Intel and AMD about how much they can save by not using Nvidia. Specifically for this thread, there could be a day when AMD dominates the laptop market, even at the high end, due to efficiency, with Intel a close 2nd.
I have to admit that Nvidia keeps surprising me with their resilience and innovation, especially lately, meaning the last 4-5 years. But I still believe that if AMD keeps getting stronger and Intel starts making cards that are good for more than the mid-range market, Nvidia will start having a problem selling its cards. Users will still want them and OEMs will still have to buy them, but it will probably be more difficult for Nvidia to negotiate high prices. Both Intel and AMD will be selling CPUs, chipsets, and GPUs as a package, at a discount that will be difficult for Nvidia to follow without dropping its profit margins. Maybe increasing prices in the DIY market is a way to somewhat balance that.
 
Joined
Jun 20, 2022
Messages
302 (0.33/day)
Location
Germany
System Name Galaxy Tab S8+
Processor Snapdragon 8 gen 1 SOC
Cooling passive
Memory 8 GB
Storage 256 GB + 512 GB SD
Display(s) 2.800 x 1.752 Super AMOLED
Power Supply 10.090 mAh
Software Android 12
Apart from performance increases with a new generation, I would like all GPU (and CPU) developers to make better chips (across all budgets) that increase performance without increasing power, and I'd rather see new products with equal performance but reduced power consumption. Energy efficiency (which can be achieved by increasing power, as we can see with the latest products) is only important in data centers, where every bit of efficiency counts. For the - to use a German expression - Ottonormalverbraucher (the average consumer), it is not really important at the end of the day. I would point to a different thread on this forum, "GPU launches no longer exciting", where some people came to the conclusion - and I fully support that argument - that quality improvements in games are not really worth the price (performance hits) you have to pay. I really would like to see improving RT performance - it looks like it's starting to be a thing - but rasterization performance is already at a point where most people don't need higher frame rates (at least not at the cost of power consumption).

We are living in a world where every saved watt counts, not every gained FPS. If you want your children to have a worthwhile environment to live in, you should take this into account.
 
Joined
Jun 2, 2017
Messages
9,354 (3.39/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
I have to admit that Nvidia keeps surprising me with their resilience and innovation, especially lately, meaning the last 4-5 years. But I still believe that if AMD keeps getting stronger and Intel starts making cards that are good for more than the mid-range market, Nvidia will start having a problem selling its cards. Users will still want them and OEMs will still have to buy them, but it will probably be more difficult for Nvidia to negotiate high prices. Both Intel and AMD will be selling CPUs, chipsets, and GPUs as a package, at a discount that will be difficult for Nvidia to follow without dropping its profit margins. Maybe increasing prices in the DIY market is a way to somewhat balance that.
There is much more to say about the future, but it's not relevant here and now. That doesn't take away from the fact that AMD has a chance to do with the budget market exactly what they did with the consoles. Both Microsoft and Sony would probably have used Nvidia if cost had not been the deciding factor, and that has turned out to be great for the software development of games and drivers (most of the time).
 
Joined
Jan 14, 2019
Messages
12,548 (5.79/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I certainly think this applies strongly to more subjective things like DLSS/FSR/FG image quality, I do get sick of hearing outright negativity for those from people who've not actually experienced it, but I'd argue that a cards rast performance is a lot easier to judge from reviews alone. Just some reviewers are more thorough than others, and some go out of their way to find weaknesses etc.

So I agree somewhat but not fully, far more can be ascertained from a review comparing like for like performance against other video cards, the proof is much more in the numbers than it is subjective for this type of thing imo. But, I certainly have heard from a lot of non-3080 buyers tearing strips of it for only having 10GB VRAM, but as an owner I've never once hit the limit, and I game at 4k, yet other people keep telling me how bad it is. Experience certainly seals the deal.
It's not just that. Reviewers generally shat on Intel's Comet Lake and Rocket Lake for their high power consumption and low-to-no IPC increase compared to Coffee Lake. They only forgot to add that the high power consumption is nowhere to be seen in everyday use, that they're super efficient at idle, and that they're super easy to cool because of the large 14 nm central die, not to mention the PL switches, which make them just as configurable as AMD's PPT. I had to buy an 11700 to find all this out. In fact, it was much easier to cool than the overhyped R5 3600 (I had one of those too). They (especially Steve at HUB) also called Intel out because not every B560 motherboard they tested could push unlimited power through the CPU, which is a ridiculous expectation to begin with.

Edit: they also forgot to add (to the topic of IPC increase) that people don't usually upgrade with every generation, so a Coffee Lake comparison isn't what they should have judged Comet / Rocket Lake by.

Reviewers shit on the Radeon 6400 / 6500 XT duo as well, although you only hear positives from owners (myself included). Actually, reviewers tend to crap on everything that doesn't produce 5% more FPS than the competition, and for a lower price, even though no one can see a 5% difference in a game, especially when you only have one graphics card with nothing to compare it to.

This is the beauty of the PC space: we are all the same, yet so different too. I agree with you on reviewers, but it does have a silver lining. During the height of Covid, InWin launched an AIO, the SR series, which came in 240 and 360 variants.

There are 5 reviews of it on TPU, and none of them are in English. There are also 10 or so videos in English on the cooler on YouTube, and 2 of them are reviews. Since none of the big media outlets covered it (I assume no samples were sent out), the cooler launched with no fanfare. Within about a week of launch, the 360 version was going for $119.99, at a time when Asetek-based coolers like the Corsair 360 AIOs were over $200 for the same thing. The InWin has a bigger cold plate, a huge head with 2 pumps, and ARGB, with one of the easiest installs ever. I have used 5 of them in builds, and I have a 240 cooling a 5900X with pretty impressive temps. The last 360 I bought was $109.99.

Where I disagree is display technology. 4K is a dream on a desktop, especially at bigger sizes; the pixel density is crazy. If you read comic books, think of the difference between an omnibus you could buy today and the original comic. Animated content, some games, reading text, and watching videos in windowed mode are all impressive. OLED and VA, even though they are different, both look very beautiful in 4K if (in the case of VA) they have 400+ nits of brightness. A 4K 144 Hz panel is the cat's meow, and FreeSync/VRR is a must; the beauty of it is a range of 47–144 Hz. It does suck, though, that you have to spend to enjoy it.


One of the things I do is read and watch reviews, but also read user reviews on the retail site for the same product. I had a 2-page "conversation" with a user the other day about the 5900X/5800X3D and 5600X, and even though I have used all of those CPUs heavily, he still "knew" more than me about how they were different and why they should be avoided vs. Intel. It's kind of sad, because that sort of thing used to be reserved for Reddit, but I find there are a few users on TPU who are "electronics engineers" because they can read about and watch a video on the product.
That's another thing: most reviewers only test what companies send to them. That's why I love smaller YouTube channels like RandomGamingHD, and that's why I love TPU as well, to be fair. They go out of their way to get samples from not-so-hyped sources.

As for 4K, I would probably agree with you if I ever sat down in front of a 4K monitor... but I don't want to. :laugh: Higher resolutions need faster graphics cards, and I don't have infinite money, unfortunately. :ohwell: I do have a 55" 4K TV though, but I sit far enough from it not to notice any difference between 1080p and 4K content.
 
Joined
Feb 22, 2022
Messages
610 (0.59/day)
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Custom Watercooling
Memory G.Skill Trident Z Royal 2x16GB
Video Card(s) MSi RTX 3080ti Suprim X
Storage 2TB Corsair MP600 PRO Hydro X
Display(s) Samsung G7 27" x2
Audio Device(s) Sound Blaster ZxR
Power Supply Be Quiet! Dark Power Pro 12 1500W
Mouse Logitech G903
Keyboard Steelseries Apex Pro
I would like some lower tier graphics cards with proper media decoders and enough pcie lanes to not get bottlenecked by older pcie standards.
 
Joined
Sep 11, 2019
Messages
275 (0.14/day)
There is much more to say about the future, but it's not relevant here and now. That doesn't take away from the fact that AMD has a chance to do with the budget market exactly what they did with the consoles. Both Microsoft and Sony would probably have used Nvidia if cost had not been the deciding factor, and that has turned out to be great for the software development of games and drivers (most of the time).
Not using Nvidia for the Xbox One and PlayStation 4 onward had nothing to do with cost. After both companies decided to leave the Power architecture behind, they evaluated ARM, MIPS, and x86. After testing prototypes, both concluded that an x86 SoC was the way to go for a number of reasons. That left Nvidia out. AMD had built an entire product division for the purpose of working on both projects, so that was a huge plus as well.

You are correct, of course, that MS axed Nvidia primarily because of cost back in the OG Xbox days.

And I agree that it isn't us owners of the 6400 and 6500 who complain about them. We knew exactly what we were buying, and the cards do exactly what's expected. The missing media decode is a non-factor for my use case. My 5600 in Eco mode with a 4.65 GHz boost can easily handle all media duties. It is heavily bottlenecked by a 6400 in gaming, so it has cycles to spare for other tasks.
 
Joined
Jan 14, 2019
Messages
12,548 (5.79/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
And I agree that it isn't us owners of the 6400 and 6500 who complain about them. We knew exactly what we were buying, and the cards do exactly what is expected. The missing media decode is a non-factor for my use case. My 5600 in Eco mode with a 4.65 GHz boost can easily handle all media duties; when gaming it's heavily bottlenecked by the 6400 anyway, so it has cycles to spare for other tasks.
That's another thing reviews blew up for no reason and confused people to no end: the 6400 / 6500 XT are not missing video decode. ;) They're only missing AV1 decode, which is still only used in a handful of cases, like some select 4K/8K YouTube streams and, I've heard, Netflix.

What they're truly missing is video encode, which you only need if you record gameplay, which (in my opinion) isn't something you should do with a low-end graphics card anyway.
 
Joined
Sep 11, 2019
Messages
275 (0.14/day)
That's another thing reviews blew up for no reason and confused people to no end: the 6400 / 6500 XT are not missing video decode. ;) They're only missing AV1 decode, which is still only used in a handful of cases, like some select 4K/8K YouTube streams and, I've heard, Netflix.

What they're truly missing is video encode, which you only need if you record gameplay, which (in my opinion) isn't something you should do with a low-end graphics card anyway.
You can record with Windows, Afterburner, or OBS, which can stream too, so even that is doable on a modern CPU like a 5600. I agree, though, that none of it matters much for a low-end card.
 
Joined
Aug 21, 2015
Messages
1,751 (0.51/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marthon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
That's another thing reviews blew up for no reason and confused people to no end: the 6400 / 6500 XT are not missing video decode. ;) They're only missing AV1 decode, which is still only used in a handful of cases, like some select 4K/8K YouTube streams and, I've heard, Netflix.

What they're truly missing is video encode, which you only need if you record gameplay, which (in my opinion) isn't something you should do with a low-end graphics card anyway.

People slammed the 6500 XT for being no better, or arguably worse, than the card it replaced while asking $30 more at MSRP. The Navi 24 XT chip it uses even seems to be held back by the card's own memory bus, and then there's the performance drop on PCIe 3.0 due to the x4 interface. Working well for your and others' use cases doesn't make it a good card.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.61/day)
Location
Ex-usa | slava the trolls
It is amazing how different the disposition of owners of the budget 6400/6500 XT cards is versus those who rely solely on reviews to make their argument. I have said this before, but when I took my 6500 XT apart (thanks, Gigabyte, for skimping on paste) I was blown away by how small the chip was. Even the memory chips are something like double the size of the GPU. I have also said before that the budget segment will probably become an AMD/Intel APU, most likely AMD, as that chip could easily fit in the die of an AM5 part. A 65 W TDP on the CPU but a 200 W limit on the socket could mean at least 100 W for the GPU. Cooling it would be a bother, but you would actually have a VRM comparable with the monsters on modern boards.

APUs have low graphics performance, limited severely by the memory interface and its throughput.
Imagine OEMs fitting single-channel memory, as they so often do, and how bad the integrated graphics performance would be then.
If the budget segment becomes APUs, then this segment will be dead.
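A rough back-of-envelope comparison of why the memory interface is the ceiling here. The figures are illustrative examples (DDR4-3200 system memory vs. an RX 6600's GDDR6), not measurements:

```python
# Peak memory bandwidth sketch: desktop APU vs. a budget discrete card

def ddr_bandwidth_gbs(mt_per_s, channels, bus_bits_per_channel=64):
    """Peak bandwidth in GB/s: transfers/s * bus width in bytes."""
    return mt_per_s * channels * bus_bits_per_channel / 8 / 1000

dual_channel = ddr_bandwidth_gbs(3200, channels=2)    # typical APU setup: 51.2 GB/s
single_channel = ddr_bandwidth_gbs(3200, channels=1)  # OEM single-channel: 25.6 GB/s
rx6600_gddr6 = 14 * 128 / 8                           # 14 Gbps on a 128-bit bus: 224.0 GB/s

print(dual_channel, single_channel, rx6600_gddr6)
```

Even the dual-channel case has less than a quarter of the bandwidth a cheap discrete card feeds its GPU, and the single-channel OEM case halves that again.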
 
Joined
Aug 21, 2015
Messages
1,751 (0.51/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marthon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
If the budget segment becomes APUs, then this segment will be dead.

It's on life support already. Death may be inevitable. And I will mourn.
 
Joined
Sep 11, 2019
Messages
275 (0.14/day)
People slammed the 6500 XT for being no better, or arguably worse, than the card it replaced while asking $30 more at MSRP. The Navi 24 XT chip it uses even seems to be held back by the card's own memory bus, and then there's the performance drop on PCIe 3.0 due to the x4 interface. Working well for your and others' use cases doesn't make it a good card.
Good is relative. What is completely missing from the analysis is that they were better than the other bad cards they launched against; they did not exist in a vacuum. The fact that the 1050 Ti was still being made and sold for more than either RX card is shameful. The price of the 1650 during the pandemic was laughable. The 1030 selling for $150 was deplorable.

We keep circling back to historical data that means precisely dick to the market when these cards were released.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.61/day)
Location
Ex-usa | slava the trolls
It's on life support already. Death may be inevitable. And I will mourn.

Honestly, I'd prefer they stop making APUs altogether, since I don't need them and wouldn't feel their absence.
 
Joined
Dec 31, 2020
Messages
999 (0.69/day)
Processor E5-4627 v4
Motherboard VEINEDA X99
Memory 32 GB
Video Card(s) 2080 Ti
Storage NE-512
Display(s) G27Q
Case DAOTECH X9
Power Supply SF450
I'm all for an RX 6800 16 GB that delivers 50% of the 4090's performance as the low end. Yes, absolutely. I mean 3840 stream processors single-issue, or 3072 dual-issue (let's say dual issue is as effective as hyperthreading), and 512 GB/s of bandwidth however they achieve it, say 128-bit GDDR7 with 256 MB of L3.
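For what it's worth, the numbers line up: a 128-bit bus at a GDDR7-class per-pin speed (32 Gbps assumed here) gives exactly that bandwidth:

```python
# Peak bandwidth = bus width (bits) * per-pin speed (Gbps) / 8 bits-per-byte
bus_bits = 128
gbps_per_pin = 32  # assumed GDDR7 per-pin data rate
bandwidth_gbs = bus_bits * gbps_per_pin / 8
print(bandwidth_gbs)  # 512.0 GB/s
```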
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.61/day)
Location
Ex-usa | slava the trolls
I have a task for AMD :)
Please make the new Radeon RX 7400 130% faster than the old RX 6400:

[attached relative-performance chart]


A shrunk Navi 23 (237 mm², 132 W TDP) could become a Navi 34 at 118 mm² and 66 W TDP.
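The arithmetic behind the wish, assuming an idealized shrink that halves both area and power (real node transitions rarely scale this cleanly):

```python
# Idealized full-node shrink of Navi 23 -> hypothetical "Navi 34"
navi23_area_mm2, navi23_tdp_w = 237, 132
navi34_area_mm2 = navi23_area_mm2 / 2  # 118.5 mm^2
navi34_tdp_w = navi23_tdp_w / 2        # 66.0 W

# "130% faster than the RX 6400" means 2.3x its performance
target_vs_rx6400 = 1 + 1.30

print(navi34_area_mm2, navi34_tdp_w, target_vs_rx6400)
```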
 