# ASUS Radeon RX 6500 XT TUF Gaming



## W1zzard (Jan 19, 2022)

The new AMD Radeon RX 6500 XT we review today is the world's first graphics card using a 6 nanometer GPU. With just 100 W during gaming, the RX 6500 XT is very gentle in its power requirements, but it is also held back by its 4 GB VRAM size and PCI-Express 4.0 x4 interface.

*Show full review*


----------



## olstyle (Jan 19, 2022)

I think those raytracing numbers in Control and Doom need a visual check. +1% sounds like it was not actually enabled.


----------



## kruk (Jan 19, 2022)

As a buyer in this segment, I wanted RX 6500 XT to be a proper RX 5500 XT successor:
- sub 90W power consumption
- same encoding/decoding options as previous gen
- 6 GB VRAM
- 96-bit bus with 16 MB Infinity Cache
- PCI-E 8x
- sub $200 MSRP (that is basically the only thing that was delivered)

The thing that released is more of an RX 550 successor and it should really be named RX 6300 XT (also, the efficiency is horrible). I guess I'm waiting for next-gen and/or hoping Navi 23 gets a cut down version ...


----------



## Tetras (Jan 19, 2022)

Wow, the 6600 is way more efficient. I was worried such high clock speeds would mean excessive voltage, but is it the voltage that's responsible?


----------



## Fatalfury (Jan 19, 2022)

The RX 6500 XT might be the go-to card for entry-level prebuilts from Dell, Acer, etc., but only if they price it right!
Maybe it works for esports titles and entry-level PC gamers (1080p medium, 75 Hz FreeSync gaming), but I wouldn't pay $1 more than MSRP.
Maybe it's worth waiting 3-6 months until it actually sells at MSRP (fingers crossed) with enough stock.


----------



## ExcuseMeWtf (Jan 19, 2022)

Would be okay at MSRP. But let's not dream too much.


----------



## Lionheart (Jan 19, 2022)

Jeez this makes the RX 6600 & XT look great!


----------



## trsttte (Jan 19, 2022)

kruk said:


> As a buyer in this segment, I wanted RX 6500 XT to be a proper RX 5500 XT successor:
> - sub 90W power consumption
> - same encoding/decoding options as previous gen
> - 6 GB VRAM
> ...



The RX 5500 XT had a 128-bit bus and 4 or 8 GB VRAM, and launched for 169 USD. They just needed to do the same with an architecture refresh, or even just push more 5500 XTs out the door (haven't looked at the numbers yet, but it seems like that would be better at this point).


----------



## Agent_D (Jan 19, 2022)

trsttte said:


> The RX5500 XT had a 128bit bus and 4 or 8gb vram, and launched for 169 USD. They just needed to the same with an architecture refresh or even just push more 5500XT out the door (haven't looked at the numbers yet but seems like it would be better at this point)


Around 2% higher average relative performance with around 10-15% less power than the 5500 XT. The 5500 XT looks like it would still be the better card when comparing direct retail pricing.


----------



## Turmania (Jan 19, 2022)

If it were a <75 W GPU, I would give it a thumbs up, current prices notwithstanding. But overall I believe it is a waste of sand.


----------



## trsttte (Jan 19, 2022)

Agent_D said:


> Around 2% higher average relative performance with around 10-15% less power than 5500xt. 5500xt looks like it would still be the better card when comparing direct retail pricing.



It wasn't worth the engineering time for anyone involved in designing it. I doubt even the die space savings on the newer process are worth the cost of designing a new card, let alone the cost of the new process itself.

Waiting for the Gamers Nexus "Waste of sand" award


----------



## TheinsanegamerN (Jan 19, 2022)

ExcuseMeWtf said:


> Would be okay at MSRP. But let's not dream too much.


Except it wouldn't. This card is often slower than the RX 570, a $150 GPU from 2017.

It is almost always slower than the 4 GB RX 480, a $200 GPU from 2016 riding on the old 14 nm node.

Both of those cards had better decoder and encoder options plus 8 GB VRAM options, and were available over half a decade ago.

The 6500 XT, at $200, is an utter ripoff.


----------



## Selaya (Jan 19, 2022)

4x4 is fine, they said
it totally won't bottleneck the card, they said

what's next, the sun rising from the west, they say?


----------



## TheinsanegamerN (Jan 19, 2022)

trsttte said:


> The RX5500 XT had a 128bit bus and 4 or 8gb vram, and launched for 169 USD. They just needed to the same with an architecture refresh or even just push more 5500XT out the door (haven't looked at the numbers yet but seems like it would be better at this point)


Hell, they could have just given us the full 5500 XT (the 5500 XT had 14 of 16 CUs enabled) with 8 GB of VRAM on the 6 nm node. But no, AMD needs to milk every penny it can from its bizarrely loyal consumer base with this oxygen thief of a GPU.


----------



## foxhntr (Jan 19, 2022)

Gamers Nexus' The Disappointment PC 2022 begins.


----------



## TheinsanegamerN (Jan 19, 2022)

kruk said:


> As a buyer in this segment, I wanted RX 6500 XT to be a proper RX 5500 XT successor:
> - sub 90W power consumption
> - same encoding/decoding options as previous gen
> - 6 GB VRAM
> ...


Here's what gets me: this 6500 XT is often slower than a 570, though not always by much. By extension it's MAYBE 5-10% faster than an RX 560.

The RX 560 was a 75 W card from 2017. I have one. How on earth, with that clock speed, the 6 nm node, and 18 Gbps GDDR6 memory, did AMD manage to engineer a card that draws 100 watts yet is barely any faster than a 5-year-old 14 nm card with 5 Gbps GDDR5 memory? I mean, jesus, there are wastes of space, and then there is actual TALENT at making something utterly horrible.


----------



## catulitechup (Jan 19, 2022)

$200 US... what a trash card.

There are better options on the used market; don't give a buck to these greedy, fucked-up corporations.


----------



## Tetras (Jan 19, 2022)

TheinsanegamerN said:


> Here's what gets me: this 6500xt is often slower then a 570, though not always by much. By extension it's MAYBE 5-10% faster then a RX 560.
> 
> The RX 560 was a 75 watt card from 2017. I have one. How on earth, with that clock speed, 6nm node, and 18 Gbps GDDR6 memory, did AMD manage to engineer a card that draws 100 watts yet is barely any faster then a 5 year old 14nm card with 5 Gbps GDDR5 memory? I mean jesus there's wastes of space and then there is actual TALENT at making something utterly horrible.



Polaris was very efficient when undervolted; pushed beyond 1100-1150 MHz it tended to lose efficiency quite dramatically (on stock volts, anyhow), and 1300-1350 MHz was pretty bad on the stock curve. So I wonder if the 6400 will be _a lot_ more efficient than the 6500, but if the clocks are substantially lower, it's also going to end up slower. Very puzzling; perhaps AMD just cut the die down too much.


----------



## beautyless (Jan 19, 2022)

Only one model, at $299, is available at Newegg. The other options at $199 are sold out.


----------



## dj-electric (Jan 19, 2022)

TheinsanegamerN said:


> The RX 560 was a 75 watt card from 2017. I have one. How on earth, with that clock speed, 6nm node, and 18 Gbps GDDR6 memory, did AMD manage to engineer a card that draws 100 watts yet is barely any faster then a 5 year old 14nm card with 5 Gbps GDDR5 memory? I mean jesus there's wastes of space and then there is actual TALENT at making something utterly horrible.


The technically correct answer is the aggressive core clock speeds squeezed out of this poor, thin GPU complex.
The casually correct answer is that RTG is now at rock bottom with product launch ethics.

The RX 6500 XT has truly proven that the saying "There are no bad products, just bad pricing" can still somehow be wrong, even for a graphics card in 2022. Amazing.


----------



## catulitechup (Jan 19, 2022)

3 fans for this crap: 4 GB on a 64-bit bus.

This is the new face of a scam.


----------



## Tetras (Jan 19, 2022)

beautyless said:


> Only one model at 299$ available at Newegg. Other options with 199$ sold out.



Sold out here too. If people aren't buying these for mining, then who is? If you have a 570/1060, it's not even (or barely) an upgrade.


----------



## catulitechup (Jan 19, 2022)

Tetras said:


> Sold out here too. If people aren't buying these for mining, then who is?  If you have a 570/1060 it's not even (or barely) an upgrade.



Scalpers, but this time they'll need a lot of luck trying to sell this utter garbage for more than MSRP.


----------



## Forza.Milan (Jan 19, 2022)

Everyone at AMD who designed this should be put in jail; it's a total crime...


----------



## Vecix6 (Jan 19, 2022)

The intro page says 64 ROPs... but this card has only 32 ROPs, at least per the specs on AMD's website. It has 64 TMUs, not ROPs.


----------



## trsttte (Jan 19, 2022)

dj-electric said:


> The RX 6500 XT card truly has proven that the saying "There are no bad products, just bad pricing" can still somehow be wrong, even for a graphics card in 2022. Amazing.



I think it's the opposite: had this card cost a casual $70 to $100, it could have been great and finally a replacement for the terrible GT 1030, GT 730, and RX 550 on the low end. They could even have reduced the clocks a bit to keep it under the 75 W PCIe power envelope and not require any power connectors. You'd then also actually get the same performance for less money, as Moore's law suggests.

As it is, it would have been better to release nothing.


----------



## catulitechup (Jan 19, 2022)

Another pearl from AMD: they said 4 GB of VRAM wouldn't be enough, but now they've removed the post:

> AMD removes its blog post claiming '4GB VRAM is not enough for games', right before introducing RX6500XT graphics - VideoCardz.com
>
> AMD quietly hides its "4GB is not enough for today's games" blog post. KitGuru noticed that AMD is suddenly not considering 4GB a problem for games, as the company surprisingly removed its 2020 blog entry covering this issue. Next-gen games will require more VRAM for textures, especially...
> ...


----------



## ModEl4 (Jan 19, 2022)

Insane. If you put it in a PCI-Express 3.0 system, it has nearly the same performance as the $169 570, which means roughly 13% worse price/performance than a card launched nearly 5 years ago. I can't wait for the SRPs of the upcoming 3D V-Cache enabled Ryzens...


----------



## Sandbo (Jan 19, 2022)

I still remember paying $249 for my RX 480 in 2016.
This thing is maybe 10% faster than that and asks $350, in 2022.

I call that an insult.


----------



## catulitechup (Jan 19, 2022)

Continuing with AMD's scamming day:

another piece of trash... for $230 US

And let me guess: they compare this scam of an incomplete GPU with the T600, but the T600 has encode capabilities and a full PCIe 3.0 x16 interface.

> NVIDIA T600 Specs
> NVIDIA TU117, 1335 MHz, 640 Cores, 40 TMUs, 32 ROPs, 4096 MB GDDR6, 1250 MHz, 128 bit
> www.techpowerup.com


----------



## OC-Ghost (Jan 19, 2022)

Yikes! I wonder how many years until we see budget cards launching at under $150 MSRP, or if those days are gone.


----------



## TheinsanegamerN (Jan 19, 2022)

OC-Ghost said:


> Yikes! Wonder how many years until we see budget cards launching at under 150 msrp or if those days are gone.


With inflation the way it is, I don't see that market coming back short of a 2008-style crash. Polaris was a fluke of a design, created by a company desperate for market share.


----------



## defaultluser (Jan 19, 2022)

You know, when you are slower than my ancient GTX 1060 6GB, you're doing it wrong!

The 5500 XT was a faster card than this nightmare. As was the 1650 Super.

*I no longer have to worry about competitive multi-core GPU scaling in drivers when this card is guaranteed to be GPU-limited on all systems - which makes the 3050 OFFICIALLY UNOPPOSED in this price bracket!*

So how much longer do we have to wait for the 3050 review reveal date?


----------



## W1zzard (Jan 19, 2022)

olstyle said:


> I think those raytracing numbers in Control and Doom need a visual check. +1% sound like it was not actually enabled.


Indeed, the charts have been updated. RT can't be enabled in those two titles, probably because of the 4 GB VRAM.



trsttte said:


> The RX5500 XT had a 128bit bus and 4 or 8gb vram, and launched for 169 USD. They just needed to the same with an architecture refresh or even just push more 5500XT out the door (haven't looked at the numbers yet but seems like it would be better at this point)





defaultluser said:


> The 5500 XT was a faster card than this


Indeed .. added RX 5500 XT numbers .. the card is faster than RX 6500 XT



Vecix6 said:


> Intro page says 64 ROPs...


Fixed


----------



## ModEl4 (Jan 19, 2022)

If I remember correctly, when asked in an interview about the supply/demand situation, Lisa Su said there will be 20 new fabs in the next 2 years (she didn't comment on the geometries, so we don't know what percentage will be cutting edge). So unless we have a diplomatic episode and a (staged...) supply mandate regarding Taiwan/China, or whatever they devise, we should see relief at the end of 2023, maybe...


----------



## oxrufiioxo (Jan 19, 2022)

What a trash fire of a GPU launch...


----------



## iO (Jan 19, 2022)

But hey, at least they're generous and give you two bottlenecks instead of just one. /s

I wouldn't be surprised if it was supposed to be OEM-only, but they then decided to release it at retail because even the worst crap sells nowadays.


----------



## Charcharo (Jan 19, 2022)

The R9 390 (GCN 2, Hawaii/Grenada) has now crushed Kepler, most of Maxwell, some of Pascal, cut-down Polaris, Fiji, *AND* RDNA 2. LMAO.


----------



## ShurikN (Jan 19, 2022)

I didn't think a card worse than the vanilla 1650 could be made and sold, but when you least expect it, AMD delivers.


----------



## TheinsanegamerN (Jan 19, 2022)

W1zzard said:


> Indeed .. added RX 5500 XT numbers .. the card is faster than RX 6500 XT


The 5500 XT is faster than the more expensive 6500 XT. Says it all, really.

Makes me really wish we had gotten that full-fat 5500 XT with 8 GB of VRAM as a budget card...


----------



## ModEl4 (Jan 19, 2022)

W1zzard said:


> Indeed, the charts have been updated. RT can't be enabled in those two titles. Probably because 4 GB VRAM
> 
> 
> 
> ...


Are you sure it is the 4 GB version? In your other 5500 XT reviews, the 4 GB version was always slower than the 1650 Super; maybe some of the tested games that are different influenced the results?


----------



## WhoDecidedThat (Jan 19, 2022)

TheinsanegamerN said:


> Here's what gets me: this 6500xt is often slower then a 570, though not always by much. By extension it's MAYBE 5-10% faster then a RX 560.
> 
> The RX 560 was a 75 watt card from 2017. I have one. How on earth, with that clock speed, 6nm node, and 18 Gbps GDDR6 memory, did AMD manage to engineer a card that draws 100 watts yet is barely any faster then a 5 year old 14nm card with 5 Gbps GDDR5 memory? I mean jesus there's wastes of space and then there is actual TALENT at making something utterly horrible.





It runs at 2900 MHz @ 1.2 V at stock. This is like running Nvidia's Ampere at 2100-2200 MHz, or like running a CPU at 5.5 GHz. It has been clocked way beyond what was needed. I reckon this GPU will have decent power efficiency at 1.0 V around 2500 MHz (in laptops).
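For what it's worth, the back-of-the-envelope physics supports this: dynamic power scales roughly with V²·f, so dropping from 1.2 V / 2900 MHz to 1.0 V / 2500 MHz would cut the core's dynamic power to about 60% of stock. A minimal sketch (the operating points are the ones from this post, not AMD figures):

```python
# Rough dynamic-power scaling sketch (P ~ C * V^2 * f), illustrative only.
# The 2.9 GHz / 1.2 V stock point and the hypothetical 1.0 V / 2.5 GHz
# point come from the post above, not from any official AMD specification.

def relative_dynamic_power(v_new, f_new, v_ref, f_ref):
    """Dynamic power at the new operating point, relative to the reference."""
    return (v_new / v_ref) ** 2 * (f_new / f_ref)

ratio = relative_dynamic_power(v_new=1.0, f_new=2500, v_ref=1.2, f_ref=2900)
print(f"~{ratio:.0%} of stock dynamic power")  # roughly 60%
```

Static leakage and board power don't scale this way, so treat the number as an upper bound on the savings, not a measurement.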


----------



## Xuper (Jan 19, 2022)

This thing should be called the RX 6400 XT; the gap between the 6500 and 6600 is huge.


----------



## Zubasa (Jan 19, 2022)

Ah yes, 2022 is off to a great start: both AMD and Nvidia have submitted their entries for GN's "Waste of Sand" award.
Right now it seems AMD is in the lead for said award.


----------



## TheinsanegamerN (Jan 19, 2022)

Xuper said:


> This thing should be called RX 6400 XT , gap between 6500 and 6600 is huge


It really should have been an 896-core chip with this 64-bit bus, clocked lower to fit in a 50 W TDP, and the 6500 XT should have been a 1280-core chip with at minimum a 96-bit bus and 6 GB.


blanarahul said:


> View attachment 233117
> It runs at 2900 MHz @ 1.2 volt at stock. This is like running Nvidia's Ampere at 2100-2200 MHz or like running a CPU at 5.5 GHz. This has been clocked way beyond what was needed. I reckon this GPU will have decent power efficiency at 1.0 volt around 2500 MHz (in laptops).


Jesus. We're back to the days of clocking the snot out of Hawaii to match the 970 again...


----------



## Taraquin (Jan 19, 2022)

The 6500 XT is a piece of garbage! If they had at least used PCIe 4.0 x8, that would have been something, but this is just bad: 1650 Super performance for a higher price, with worse energy consumption...


----------



## defaultluser (Jan 19, 2022)

The last time I paid over $200 for a 32-ROP card was my trusty GTX 960!

*But even NVIDIA wasn't stupid enough to castrate that old thing with such a skimpy memory bus (in addition to tiny VRAM)*

I would rather pay an extra $50 in virtual bucks to buy a real upgrade from my HTPC's 960 (the 3050) - *when you take into account that I'm still running a PCIe 3.0 motherboard, this card would be slower than a 1060!*


----------



## r9 (Jan 19, 2022)

What a piece of garbage.
Not worth $200, let alone $350, but it's okay, as the PC market is a place where people have more money than common sense.


----------



## phanbuey (Jan 19, 2022)

The good thing is, if there were any doubts that AMD is just like Intel and nVgreedia, this release definitely dispelled them.


----------



## WhoDecidedThat (Jan 19, 2022)

| | Radeon RX 5500 XT | Radeon RX 6500 XT |
| --- | --- | --- |
| Transistor count | 6.4 billion | 5.4 billion |
| Memory bandwidth | 224 GB/s | 144 GB/s |
| Bus interface | PCIe 4.0 x8 | PCIe 4.0 x4 |
| Pixel fill rate | 59 GPixel/s | 90 GPixel/s |
| Texture fill rate | 162 GTexel/s | 180 GTexel/s |
| Shader processing power | 5.2 TFLOPs | 5.8 TFLOPs |

They built a halfway decent chip, then f**king gimped its memory bandwidth and bus interface. Great job, AMD. This could have been so much better with at least PCIe 4.0 x8, if not additional memory and memory bandwidth.
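Those throughput figures fall straight out of unit counts and clock speed, which is how you can sanity-check them. A quick sketch, assuming approximate boost clocks of 1845 MHz (5500 XT) and 2815 MHz (6500 XT); the clocks are assumptions, not official figures:

```python
# Theoretical GPU throughput from unit counts and clock speed.
# Clock figures are approximate boost clocks (an assumption).

def throughput(clock_mhz, rops, tmus, shaders):
    ghz = clock_mhz / 1000
    return {
        "pixel_fill_gpix_s": rops * ghz,           # ROPs x clock
        "texture_fill_gtex_s": tmus * ghz,         # TMUs x clock
        "fp32_tflops": shaders * 2 * ghz / 1000,   # 2 FLOPs/shader/clock (FMA)
    }

rx5500xt = throughput(1845, rops=32, tmus=88, shaders=1408)  # ~59 GPix/s, ~162 GTex/s, ~5.2 TFLOPs
rx6500xt = throughput(2815, rops=32, tmus=64, shaders=1024)  # ~90 GPix/s, ~180 GTex/s, ~5.8 TFLOPs
```

Note how the 6500 XT's higher paper numbers come entirely from the clock: it has fewer TMUs and shaders, and the same 32 ROPs.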


----------



## z1n0x (Jan 19, 2022)

AMD needs to issue a public apology for this "Insult Edition" GPU!

edit: And a substantial price cut!


----------



## Quicks (Jan 19, 2022)

https://tpucdn.com/review/asus-radeon-rx-6500-xt-tuf-gaming/images/gpuz-overclocking.gif

This is showing that the 6500 XT only has 16 ROPs.


----------



## Chrispy_ (Jan 19, 2022)

What a turd. The 3050 is going to wipe the floor with this, and we already know how it performs because the same silicon has been in the laptop market for several months.

Unfortunately, this bodes extremely badly for the RDNA2 graphics built into the 6000-series laptop APUs, as it demonstrates that RDNA scales down horribly to lower bandwidths and performance levels.

The only silver lining is that Vega IGPs are also turds, so perhaps it will still be an improvement.

*EDIT: I've learned new info since yesterday morning:*

> ASUS Radeon RX 6500 XT TUF Gaming
>
> Power efficient? :wtf: It has a worse performance-to-power ratio than the 1650 Super, which is based on a 12 nm chip. This one is 6 nm. It sips 100 watts for nothing, barely surpassing the base 1650, which doesn't even need a power connector! I can de-tune my 2070 to 125 W, and it will still run...
>
> www.techpowerup.com


----------



## oxrufiioxo (Jan 19, 2022)

Man, this card is listed for 360 USD on Newegg and still sold out... What a time to be alive.

> ASUS TUF Gaming Radeon RX 6500 XT Video Card TUF-RX6500XT-O4G-GAMING - Newegg.com
> www.newegg.com


----------



## Selaya (Jan 19, 2022)

phanbuey said:


> The good thing is if there were any doubts that AMD is just like Intel and nVgreedia, then this release definitely dispelled them.


tbf this just seems like an extreme case of the stupid; in general, AMD's Radeon department has been plagued with ineptitude as of late (dGPUs that serve more than just a display output - looking at you, GT 1030 - should have x16, period), unlike NV (aka ngreedia) or Intel (aka pay-extra-for-OC), which usually exhibit an extreme case of avarice (in different flavors) but seldom outright ineptitude (RKL notwithstanding).


----------



## medi01 (Jan 19, 2022)

It starts at 350 euros in Germany.
So, tell me how "too expensive" even covers it, for the crap it is.


----------



## Turmania (Jan 19, 2022)

At least don't promote it as the world's first 6 nm GPU. What a waste of sand.


----------



## Quicks (Jan 19, 2022)

2022 hall of fame of the worst GPUs ever made: #1


----------



## TheinsanegamerN (Jan 19, 2022)

Selaya said:


> tbf this just seems like an extreme case of the stupid; in general amd's radeon department's been plagued w/ ineptitude as of late (dgpus that serve more than just a displayout -looking at you, gt1030- should have x16, period), unlike nv (aka ngreedia) or intel (aka pay-extra-for-OC) which usually exhibit an extreme case of avarice (in different flavors), but seldom outright ineptitude (RKL notwithstanding)


AMD has been plagued with "the stupid" since their GPU division was ATi. Remember them leaving GPU driver development to OEMs with Ryzen APUs? The driver bug that cost GCN significant performance due to VRAM allocation? Evergreen being good, so AMD rebranded it, only to get blindsided by Thermi 2.0: Actually Good Edition, then abandoning Evergreen years before Nvidia dropped Fermi? The atrocity that was the 2900 XT. The 5600 XT debacle. The 5500 XT x8 debacle. The hot, loud, late, expensive Vega. Rebrandeon. Frame time pacing. Fury X. Clock rates dropping. Black screens. Flickering. The list goes on.

Polaris and the 5700 XT were flukes.


----------



## catulitechup (Jan 19, 2022)

This sea of shit that is the RX 6500 XT has a simple solution...
...an RX 6505 XT with an added 8 GB model, encode capabilities, a real PCIe interface, and a lower price.


----------



## AnotherReader (Jan 19, 2022)

This is a disappointing release. I had expected it to be as fast as an RX 580 at 1080p on average. However, in some games, such as F1 2021 and Watch Dogs: Legion, it falls behind even the RX 570. I think the 4 lanes of PCIe are a bigger deal than the 4 GB of VRAM. Also, the performance relative to the RX 6600 is all over the place.



| Game | Release Year | RX 570 | RX 6500 XT | RX 6600 | 6500 XT / 6600 | 6500 XT / RX 570 |
| --- | --- | --- | --- | --- | --- | --- |
| DOOM Eternal | 2020 | 72.5 | 75.1 | 181.1 | 41% | 104% |
| Watch Dogs: Legion | 2020 | 28.2 | 27.2 | 60.4 | 45% | 96% |
| F1 2021 | 2021 | 96.5 | 86.7 | 181.9 | 48% | 90% |
| CyberPunk 2077 | 2020 | 24.7 | 26.5 | 52.1 | 51% | 107% |
| Deathloop | 2021 | 38.9 | 40.8 | 78.7 | 52% | 105% |
| Control | 2019 | 35.5 | 39.6 | 75.1 | 53% | 112% |
| The Witcher 3 | 2015 | 54.3 | 64.1 | 106.2 | 60% | 118% |
| Far Cry 6 | 2021 | 45.3 | 58.3 | 87.9 | 66% | 129% |
| Far Cry 5 | 2018 | 64.3 | 82.6 | 117.3 | 70% | 128% |
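The ratio columns are just the average-fps figures divided out, so they are easy to reproduce. A minimal sketch over a few of the rows above:

```python
# Recompute the ratio columns from the average-fps numbers in the table.
fps = {  # game: (RX 570, RX 6500 XT, RX 6600) average fps
    "DOOM Eternal":       (72.5, 75.1, 181.1),
    "Watch Dogs: Legion": (28.2, 27.2, 60.4),
    "Far Cry 6":          (45.3, 58.3, 87.9),
}

for game, (rx570, rx6500xt, rx6600) in fps.items():
    print(f"{game}: {rx6500xt / rx6600:.0%} of a 6600, "
          f"{rx6500xt / rx570:.0%} of a 570")
```

The spread (41% to 70% of a 6600) is what makes the scaling look so erratic.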


----------



## WhoDecidedThat (Jan 19, 2022)

To those saying this is a laptop GPU remodeled for desktop: in laptops, the RX 6500M will have to send a frame buffer back to the CPU 30+ times a second, which will further strain the already very limited bus bandwidth. This is bad for laptops too.


----------



## AnotherReader (Jan 19, 2022)

TheinsanegamerN said:


> Here's what gets me: this 6500xt is often slower then a 570, though not always by much. By extension it's MAYBE 5-10% faster then a RX 560.
> 
> The RX 560 was a 75 watt card from 2017. I have one. How on earth, with that clock speed, 6nm node, and 18 Gbps GDDR6 memory, did AMD manage to engineer a card that draws 100 watts yet is barely any faster then a 5 year old 14nm card with 5 Gbps GDDR5 memory? I mean jesus there's wastes of space and then there is actual TALENT at making something utterly horrible.


I think you're forgetting that the RX 570 is much faster than the RX 560. This is from TechPowerUp's review of the Sapphire RX 570 Pulse:

The RX 560 is comparable to the GTX 1050, which means that the _RX 570 is nearly twice as fast_.



Chrispy_ said:


> What a turd. The 3050 is going to wipe the floor with this, and we know how that performs already because of the same silicon already being in the laptop market for several months.
> 
> Unfortunately this bodes extremely badly for RDNA2 graphics built into the 6000-series laptop APUs, as it demonstrates that RDNA scales down horribly to lower bandwidths and performance levels.
> 
> The only silver lining is that Vega IGPs are also turds, so perhaps it will still be an improvement.



I was expecting the 3050 to dominate this even before the review. I suspect AMD expects the 3050 to be much more expensive due to miners, so they didn't price the 6500 XT according to its performance.


----------



## Metroid (Jan 19, 2022)

Power consumption is just too high. WTH, AMD? I think this is the worst silicon I have ever seen.


----------



## MachineLearning (Jan 19, 2022)

I'm seriously wondering how this happened. The 1650 Super is 12 nm Turing, capped at 100 W. This card uses about the same power, delivers about the same performance, and will probably sell for a similar real-world price, but it is on a newer architecture and a smaller node. AMD bungled this horribly; whoever within RTG thought this was a good idea should be given permanent unpaid leave, imo. This is technological regression.

And handicapped overclocking on top of that? The memory bus and PCIe bandwidth hold it back anyway; I honestly think this could run at 6 GHz core and still not improve by more than 5%. Capping the sliders is a slap in the face, especially when BIOS modding is near impossible on Navi, afaik.


----------



## AnotherReader (Jan 19, 2022)

MachineLearning said:


> I'm seriously wondering how this happened. 1650 Super is 12nm, Turing. Capped at 100W. This card uses about the same power, about the same performance, probably similar real-world price. But this is on a newer architecture, smaller node. AMD bungled this horribly, whoever within RTG thought this was a good idea should be given permanent unpaid leave imo. This is technological regression.


They clocked it way beyond its sweet spot to match the RX 580 and 1650 Super.

Oh, and in Canada it's priced even more absurdly. This was the cheapest model sold by Newegg; it is equivalent to 240 USD before shipping.


----------



## mechtech (Jan 19, 2022)

W1zz, I may have missed it in your extensive review, but did you confirm which hardware video codecs the 6500 XT actually supports for encode/decode?

And thanks for the rx570/580.


----------



## Rob94hawk (Jan 19, 2022)

It's so bad not even the miners & scalpers will touch it.


----------



## Nihilus (Jan 19, 2022)

kruk said:


> As a buyer in this segment, I wanted RX 6500 XT to be a proper RX 5500 XT successor:
> - sub 90W power consumption
> - same encoding/decoding options as previous gen
> - 6 GB VRAM
> ...


100% this. Had they used a 96-bit bus and a few more CUs, they could have run much lower clocks while still getting better performance (1660 Ti level) and MUCH better efficiency. Not to mention 6 GB of VRAM, which would probably avoid the PCIe bottleneck even if they still needed to use 4 PCIe lanes.

This was a terrible debut for the N6 node, as the chip is being clocked at insane levels, making the node appear inefficient.



catulitechup said:


> sea of shit of rx 6500 xt have a simple solution....................
> ....................................................................................................
> ...........rx 6505 xt added 8gb model, encode capabilities, real pci-e and lower price


That's not happening.  Using a pathetic 64 bit bus eliminated any chance of an 8GB version.


----------



## usiname (Jan 19, 2022)

MachineLearning said:


> I'm seriously wondering how this happened. 1650 Super is 12nm, Turing. Capped at 100W. This card uses about the same power, about the same performance, probably similar real-world price. But this is on a newer architecture, smaller node. AMD bungled this horribly, whoever within RTG thought this was a good idea should be given permanent unpaid leave imo. This is technological regression.
> 
> And handicapped overclocking on top of that? The memory bus and PCIe bandwidth hold it back anyway, I honestly think this could run at 6GHz core and still not even improve more than 5%. Capping the sliders is a slap in the face especially when BIOS modding is near impossible on Navi afaik.


The RX 6600 has the best performance/watt, and the RX 6500 XT looks so stupid next to it. I expected more with such a high TDP, but the VRAM and the PCIe link make this card terrible.


----------



## W1zzard (Jan 19, 2022)

Quicks said:


> https://tpucdn.com/review/asus-radeon-rx-6500-xt-tuf-gaming/images/gpuz-overclocking.gif
> 
> 
> 
> ...


Navi 24 isn't supported correctly yet in GPU-Z


----------



## Marshal_90 (Jan 19, 2022)

It's understandable that AMD fans are mad, but from a business standpoint they had no choice but to launch this GPU with limited RAM and bandwidth.

It's a difficult time for manufacturers. AMD had to make these cuts to keep the price low. Honestly though, it's not that bad. Obviously it's not meant for anything higher than 1080p.
And if we compare this one with the RX 580 or even the RX 590, the performance is actually impressive with half their shaders.

I believe AMD is doing a good job during this entire chip shortage thing, and we will see a strong comeback maybe in 2H 2022 or the beginning of 2023!


----------



## ixi (Jan 19, 2022)

OK, I was expecting the power of an RX 580 or more...


----------



## mechtech (Jan 19, 2022)

kruk said:


> As a buyer in this segment, I wanted RX 6500 XT to be a proper RX 5500 XT successor:
> - sub 90W power consumption
> - same encoding/decoding options as previous gen
> - 6 GB VRAM
> ...


IMHO, anything less than a 128-bit memory bus on today's graphics cards is silly; 128-bit should be the minimum nowadays. GDDR6 is fast, so let it breathe, even if the GPU can't fully use it. A 64-bit memory bus on a GPU is like a 32-bit OS: yeah, it works, but it's 2022. Why?
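The bandwidth gap is easy to quantify: peak bandwidth is just the bus width in bytes times the effective per-pin data rate. A minimal sketch, using the 18 Gbps GDDR6 of the 6500 XT and the 14 Gbps GDDR6 of the 5500 XT (the 96-bit configuration is hypothetical):

```python
# Peak memory bandwidth from bus width and effective data rate.
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(64, 18))   # RX 6500 XT:  64-bit @ 18 Gbps -> 144.0 GB/s
print(bandwidth_gb_s(128, 14))  # RX 5500 XT: 128-bit @ 14 Gbps -> 224.0 GB/s
print(bandwidth_gb_s(96, 18))   # hypothetical 96-bit card      -> 216.0 GB/s
```

So even with faster memory chips, the 64-bit bus leaves the 6500 XT with roughly a third less bandwidth than its predecessor.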


----------



## TheUn4seen (Jan 19, 2022)

This is rubbish beyond words. Well, the word "scam" comes to mind.


----------



## Blaazen (Jan 19, 2022)

IMO the bad energy efficiency is caused by clocks that are too high. I don't care much about gaming performance with this kind of card; I'd rather see more video outputs and AV1 codec support.


----------



## trsttte (Jan 19, 2022)

catulitechup said:


> Continuing with amd scamming day
> 
> another piece trash................. for 230us
> 
> ...



Let's not bring the Radeon Pro into this discussion; workstation cards are always more expensive. Not saying it's great, especially for AMD, which usually doesn't segment the two markets nearly as aggressively as Nvidia, and it's still bad, but just having that blue workstation colour makes it more expensive.



Chrispy_ said:


> What a turd. The 3050 is going to wipe the floor with this, and we know how that performs already because of the same silicon already being in the laptop market for several months.
> 
> Unfortunately this bodes extremely badly for RDNA2 graphics built into the 6000-series laptop APUs, as it demonstrates that RDNA scales down horribly to lower bandwidths and performance levels.
> 
> The only silver lining is that Vega IGPs are also turds, so perhaps it will still be an improvement.



This was extremely overclocked, so it doesn't say much about how the APUs will perform. I'd think it actually bodes pretty well for the APUs, given they have similar compute unit counts.



Marshal_90 said:


> It's understandable that AMD fans are mad, but from a business stand point they had no choice but to launch this GPU with limited RAM and bandwidth.
> 
> It's a difficult time for manufacturers. AMD had to do these limitation to keep the price low. Honestly though, it's not that bad. Obviously it's not meant to be for anything higher than 1080p.
> Now if we compare this one with RX 580 or even RX 590 the performance is actually impressive even with half of their shaders.
> ...



They had a choice. Bringing a new product to market costs an immense amount of money just in setting up production and certifications, along with all the design and engineering that went into it. They could have just re-branded the 5500 XT as a 6500 XT, like they've done in the past, saved all the engineering and certification costs, and put some of that dough towards absorbing any supply chain hurdles. They could even have raised its initial price point from $169 to $199 because of inflation on everything, and it would have been much better for everyone involved (including us!)


----------



## birdie (Jan 19, 2022)

Scalpers who have bought the cards up will most likely feel sorry about their "investment".

I sure hope no one will touch those eBay listings and ultimately the cards will be returned and/or sold at MSRP, which is still too high for this product. Considering its price and features, it should have been called RX 6400 XT or even RX 6300 XT and sold at $130 at most.

@W1zzard 

Is it too much to ask to

1) Remove all the cards above RTX 3060 Ti (there's just no point showing them)
2) Add performance in PCI-E 3.0 mode

in the charts at https://www.techpowerup.com/review/asus-radeon-rx-6500-xt-tuf-gaming/31.html

In the future, no one will check the PCIe scaling article, but these graphs will surely be cited over and over again, and PCIe 3.0 performance is crucially missing from them.

Thank you


----------



## W1zzard (Jan 19, 2022)

birdie said:


> to the charts at


My summary charts are always the whole market. The individual game tests usually are cut off at 75% and 125%, but that's kinda difficult with the 6500 XT being so slow, and people want to see how it does in relation to other interesting cards. I found the deltas to Vega 64 quite surprising


----------



## birdie (Jan 19, 2022)

W1zzard said:


> My summary charts are always the whole market. The individual game tests usually are cut off at 75% and 125%, but that's kinda difficult with the 6500 XT being so slow, and people want to see how it does in relation to other interesting cards. I found the deltas to Vega 64 quite surprising



Still, PCIe 3.0 performance would be nice to see. The charts are already insanely tall, so adding yet another line won't hurt. The vast majority of people considering this card don't have PCIe 4.0 motherboards.
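For context, a rough sketch of the theoretical link bandwidths involved (per-lane figures are the usual published numbers after encoding overhead; this is illustrative arithmetic, not benchmark data):

```python
# Theoretical PCIe link bandwidth per direction, in GB/s after
# encoding overhead (8b/10b for Gen 2, 128b/130b for Gen 3/4).
PER_LANE_GBS = {2.0: 0.5, 3.0: 0.985, 4.0: 1.969}

def link_bandwidth(gen, lanes):
    """Approximate usable bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBS[gen] * lanes

print(round(link_bandwidth(4.0, 4), 1))  # ~7.9 GB/s: the 6500 XT's best case
print(round(link_bandwidth(3.0, 4), 1))  # ~3.9 GB/s: the same card in a Gen 3 board
```

Dropping the x4 link into a Gen 3 slot halves an already narrow pipe, which is why the scaling question matters so much for this card in particular.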


----------



## neatfeatguy (Jan 19, 2022)

I don't know about you guys, but that 2060 12GB card looks like a fricking steal compared to this.



birdie said:


> Still PCIe 3.0 perf would be nice to see. The charts are already insanely high, so adding yet another line won't hurt. Absolute most people who are considering this card don't have PCIe 4.0 motherboards.




AMD Radeon RX 6500 XT PCI-Express Scaling (www.techpowerup.com)

The AMD Radeon RX 6500 XT comes with only a narrow PCI-Express x4 interface. In this article, we took a closer look at how performance is affected when running at PCI-Express 3.0; also included is a full set of data for the academically interesting setting of PCI-Express 2.0.


----------



## JalleR (Jan 19, 2022)

birdie said:


> Still PCIe 3.0 perf would be nice to see. The charts are already insanely high, so adding yet another line won't hurt. Absolute most people who are considering this card don't have PCIe 4.0 motherboards.




AMD Radeon RX 6500 XT PCI-Express Scaling (www.techpowerup.com)

The AMD Radeon RX 6500 XT comes with only a narrow PCI-Express x4 interface. In this article, we took a closer look at how performance is affected when running at PCI-Express 3.0; also included is a full set of data for the academically interesting setting of PCI-Express 2.0.



----------



## Dr. Dro (Jan 19, 2022)

My disappointment is immeasurable

Price would make or break this, and at 350... I can only hope that local pricing here comes in below the GTX 1650's, or this will rot in stock.


----------



## Testsubject01 (Jan 19, 2022)

ExcuseMeWtf said:


> Would be okay at MSRP. But let's not dream too much.



Just no! It's a mobile chip on a desktop board, with cut-down bandwidth AND very limited VRAM on PCI-E 4, and even more limited on PCI-E 3. And it's missing features every low-end card in the last decade supported. AMD should be scolded for releasing a product in 2022 that in some titles gets outperformed by their previously released models (from as far back as 2017!) in the $200 bracket (RX 590, RX 5500 XT).


----------



## gasolina (Jan 19, 2022)

Wow, just wow. They put every piece of utter trash into a graphics card and it generates profit from the bottom of the trash can. These days anything will sell like hot cakes, and since Intel can't make GPUs yet, it makes sense that this trend will continue for at least two more years.


----------



## yeeeeman (Jan 19, 2022)

Tetras said:


> Wow, the 6600 is way more efficient, I was worried such high clock speeds would mean excessive voltage, but is it the voltage that's responsible?


No, frequency also plays a role in power consumption.
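Roughly speaking, dynamic power scales as C·V²·f, and higher clocks usually demand higher voltage as well, so power climbs much faster than the frequency alone. A toy sketch with made-up numbers (not measured RX 6500 XT values):

```python
# Dynamic (switching) power of CMOS logic scales roughly as P ~ C * V^2 * f.
# Raising the clock usually also requires raising the voltage, so power
# grows much faster than linearly with clock speed. Illustrative only.

def dynamic_power(c_farads, volts, hertz):
    """Approximate switching power in watts."""
    return c_farads * volts**2 * hertz

base = dynamic_power(1e-9, 1.00, 2.0e9)  # baseline: 2.0 GHz at 1.00 V
oc   = dynamic_power(1e-9, 1.15, 2.8e9)  # 2.8 GHz needs ~1.15 V here

print(round(oc / base, 2))  # 1.85: a +40% clock costs ~85% more power
```

That superlinear relationship is why a chip clocked near its limit looks so much worse in efficiency charts than the same silicon run conservatively.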


----------



## Dr. Dro (Jan 19, 2022)

It arrived at 2300 BRL (421 USD at the current exchange rates) in Brazil's largest tech e-commerce, KaBuM!, and all manufacturers' cards seem to be priced the same. Only ASUS DUAL and no TUF available, presumably the TUF would be even pricier...

It's priced higher than the GeForce GTX 1650's most expensive variants... I'm not sure, chief...


----------



## yeeeeman (Jan 19, 2022)

RDNA 2's strategy was a big cache plus very high frequency to make up for fewer CUs => smaller dies => more money. This worked to some extent on the higher-end cards because the Infinity Cache had a decent size of 32 MB or more, but here 16 MB cannot hide the fact that this GPU is gimped in every single aspect: the bus width is super small, the PCIe lanes are very few, and the CU count is barely higher than an APU's. The cache and close-to-3 GHz frequency cannot make up for all of that. I just don't understand how AMD decided to launch it as a 6500 XT product when they clearly saw it performing this badly.


----------



## WhoDecidedThat (Jan 19, 2022)

I read somewhere that GDDR6 has become quite expensive ($12 per GB), so an additional 2-4 GB of VRAM might have been too much for the price. Even so, while the 64-bit memory bus is somewhat tolerable, having only a PCIe x4 interface is simply unjustifiable. Go f**k yourself, AMD.


----------



## mb194dc (Jan 19, 2022)

Ugh, awful card. Not a great time for getting decent PC hardware at a reasonable cost, that's for sure.


----------



## 80-watt Hamster (Jan 19, 2022)

I try not to be this cynical, but this product looks more and more like the answer to the question, "What can we shove out the door quickly with components we have on hand that will make it look like we're trying to serve the low end of the market?"  Heck, maybe even laptop manufacturers are wondering why AMD's trying to sell them a chip with a memory bus that can't even handle its own shader capability.


----------



## birdie (Jan 19, 2022)

neatfeatguy said:


> AMD Radeon RX 6500 XT PCI-Express Scaling
> 
> 
> The AMD Radeon RX 6500 XT comes with only a narrow PCI-Express x4 interface. In this article, we took a closer look at how performance is affected when running at PCI-Express 3.0; also included is a full set of data for the academically interesting setting of PCI-Express 2.0.
> ...





JalleR said:


> AMD Radeon RX 6500 XT PCI-Express Scaling
> 
> 
> The AMD Radeon RX 6500 XT comes with only a narrow PCI-Express x4 interface. In this article, we took a closer look at how performance is affected when running at PCI-Express 3.0; also included is a full set of data for the academically interesting setting of PCI-Express 2.0.
> ...





birdie said:


> *In the future no one will check the PCIe scaling article but these graphs will surely be cited over and over again and PCIe 3.0 performance is crucially missing.*



When replying to posts make sure you've read everything.


----------



## Sithaer (Jan 19, 2022)

Yikes, this makes my August 2021 upgrade from an RX 570 4GB _(which I'd used since 2018)_ to a GTX 1070 for ~$400 feel like a good deal.
Especially since I'm still using a PCIe 3.0 mobo.


----------



## ShurikN (Jan 19, 2022)

80-watt Hamster said:


> I try not to be this cynical, but this product looks more and more like the answer to the question, "What can we shove out the door quickly with components we have on hand that will make it look like we're trying to serve the low end of the market?"


Not cynical, you've hit the nail on the head.


----------



## Mistral (Jan 19, 2022)

So, it's essentially a somewhat mining-resistant RX 580, as long as you're at 1080p on PCIe 4. Otherwise an RX 570, and depending on the game an RX 560...

Slots perfectly well into the current market, doesn't it...?


----------



## 80-watt Hamster (Jan 19, 2022)

Mistral said:


> So, it's essentially a somewhat mining-resistant RX580, as long as you are 1080p on PCIe4. Otherwise, an RX570 and depending on the game an RX560...
> 
> Slots perfectly well into the current market, doesn't it...?



Anything slots perfectly well into the current market as long as it supports DX11.


----------



## neatfeatguy (Jan 19, 2022)

birdie said:


> When replying to posts make sure you've read everything.



I did read everything. I suppose from how you worded the comment I quoted, I didn't understand that you were asking for the comparison directly in the review, which it now looks like you were. I took your comment as simply wanting to see a PCIe 3.0 comparison, so I linked the PCIe scaling review.

However you word your posts, they can be read differently from how you want them to come across.


----------



## Outback Bronze (Jan 19, 2022)

Sorry AMD,

Piss poor release.


----------



## Valantar (Jan 19, 2022)

Damn, that performance is a real bummer. It's actually slower than the 5500 XT? That's inexcusable, even for what is clearly a die shrunk as much as possible. And clearly AMD knew this - they obviously both run software simulations as well as hardware tests (easily done with a custom bios on a test card with a larger die) before committing to the silicon design. This just makes no sense to me.

I was really hoping this would be a return to form for AMD at the $200 mark and a worthy upgrade for the RX 570 in my secondary system, but clearly that isn't what we got.


----------



## Ravenas (Jan 19, 2022)

Wow https://www.bestbuy.com/site/xfx-sp...g-graphics-card-black/6495087.p?skuId=6495087


----------



## lightning70 (Jan 19, 2022)

It's a big piece of garbage with PCI-E 4.0 x4 bandwidth and 4 GB of memory. I think it goes straight onto the trash list. AMD is really that bad and expensive.


----------



## Pepamami (Jan 19, 2022)

the card should be named RX 6400, not 6500


----------



## Dr. Dro (Jan 19, 2022)

Pepamami said:


> the card should be named RX 6400, not 6500



But an RX 6400 exists... and it's an even worse, cut down version of this

It can always go below rock bottom...


----------



## Pepamami (Jan 19, 2022)

Dr. Dro said:


> But an RX 6400 exists... and it's an even worse, cut down version of this
> 
> It can always go below rock bottom...



Since GPUs cost 3-4x normal prices, products like the RX 6500 may exist; I just don't like the naming, not the GPU itself.
I would be fine with the RX 6400 XT name.


----------



## Assimilator (Jan 19, 2022)

Garbage.

Garbage.

_Garbage._

Complete and utter trash. And board partners are putting triple-slot coolers on this unflushed turd? Why???

The saddest thing is that with the current market conditions, this offensively bad "product" will probably sell like hotcakes simply on price.

There is some wine that will never age finely no matter how many fanboys wish for it, and this card is that wine. Or should I say piss... like what AMD is taking here. Well, at least there's a silver lining... I won't be buying an AMD GPU out of principle for a long time.


----------



## Pepamami (Jan 19, 2022)

Assimilator said:


> The saddest thing is that with the current market conditions, this offensively bad "product" will probably sell like hotcakes simply on price.
> 
> There is some wine that will never age finely no matter how many fanboys wish for it, and this card is that wine. Or should I say piss... like what AMD is taking here. Well, at least there's a silver lining... I won't be buying an AMD GPU out of principle for a long time.



It's supposed to be as trashy as possible, with all the bottlenecks, so they can make more cards.
Low-budget cards existed before, and low-budget cards moved into the mid-range not because "AMD is bad" but because of today's GPU market situation overall.


----------



## wolf (Jan 19, 2022)

This is absolutely shameful and abysmal, I can hardly believe they even launched it.

I'll remember the 6500XT when I hear that "AMD actually cares about gamers"


----------



## catulitechup (Jan 19, 2022)

Ravenas said:


> Wow https://www.bestbuy.com/site/xfx-sp...g-graphics-card-black/6495087.p?skuId=6495087

save 80 bucks....................


----------



## Turmania (Jan 20, 2022)

We're all laughing, and rightly so; it's a waste of sand. But believe me, this will sell, and sell a lot. When an average Joe goes to a shop to buy a PC, he'll see a $1,000 card and this one, and he'll choose this for $250 or $300 because it's actually available on the shelves.


----------



## Mussels (Jan 20, 2022)

AU pricing is out.

I gave away an ASUS ROG Strix RX 580 8GB for Christmas; you're telling me it was worth $600, AMD?


----------



## ModEl4 (Jan 20, 2022)

It just hit me: besides the RX 6500 XT, there's also the RX 6400 (OEM). The RX 6500 XT is clocked 28% higher in game clocks and 21% higher in boost clocks, and it has 33% more CUs, which leads to a 1.7X difference (game clocks) or 1.62X (boost clocks) in shading and texel output. But let's say, for argument's sake, that the RX 6400 is only 20% slower than the RX 6500 XT: even in that optimistic scenario it will be slower than a GTX 1650 GDDR6 on a PCI-Express 3 system and probably only match it on a PCI-Express 4 system. So a regression in the sub-75 W GPU category as well, despite 6 nm vs 12 nm. Such a complete failure...
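The throughput arithmetic above can be sketched as follows (same uplift figures as quoted; the boost result lands at 1.61x, within rounding of the 1.62X cited):

```python
# Shading/texel throughput scales with clock * CU count, so the ratio
# between two configurations is (1 + clock_uplift) * (1 + cu_uplift).

def throughput_ratio(clock_uplift, cu_uplift):
    return (1 + clock_uplift) * (1 + cu_uplift)

game  = throughput_ratio(0.28, 0.33)  # game clocks: ~1.70x
boost = throughput_ratio(0.21, 0.33)  # boost clocks: ~1.61x

print(round(game, 2), round(boost, 2))
```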



			https://tpucdn.com/review/gigabyte-geforce-gtx-1650-oc-gddr6/images/relative-performance_1920-1080.png


----------



## z1n0x (Jan 20, 2022)

wolf said:


> This is absolutely shameful and abysmal, I can hardly believe they even launched it.
> 
> I'll remember the 6500XT when I hear that "*AMD actually cares about gamers*"


"We love gamers shareholders" - Lisa Su


----------



## Tetras (Jan 20, 2022)

Turmania said:


> We are all laughing and rightly so, it is a waste of sand. But believe me, this will sell, and sell a lot. when an avg. joe goes to a shop to buy a pc. he will see 1000 usd card and this and he will choose this for 300 or 250 and it will be on shelves available.



The saddest part is that some may buy it even though they have a 1060 or rx 570, not realising it's barely any faster.


----------



## z1n0x (Jan 20, 2022)

If AMD weren't blinded by profit margins and RTG weren't run by idiots, they would take the "L" on this one and do a big price cut in an attempt to save some face.

Exchanging reputation/brand perception/mindshare for short-term profits is a bad idea.


----------



## GoldenX (Jan 20, 2022)

We're still in January and we have the worst product of 2022 already.
Can't wait to see the 3050 at 600 USD.


----------



## Pepamami (Jan 20, 2022)

z1n0x said:


> Exchanging reputation/brand perception/mindshare for short term profits is bad idea.


Yes, they should have done another $600+ card instead.
But I bet that if we check "performance" against "real price in stores", the RX 6500 will look like a "decent" product.


----------



## Fluffmeister (Jan 20, 2022)

Superb product, I hope all the AMD die-hards buy it.

They are the underdogs after all, hell... a borderline multibillion-dollar charity case. I feel so sorry for them; you all should, too.


----------



## AusWolf (Jan 20, 2022)

Now I understand why NVIDIA cut raytracing support from the 1650 and other TU117 cards.

This would be a good HTPC card if it didn't need a power cable and if it had AV1 acceleration. In its current form, though, it's rubbish.


----------



## mechtech (Jan 20, 2022)

Man, it's unanimous across the web.

Glad I'm not the guy who approved making this card. Then again, they'll probably sell every single one they make.


----------



## wolf (Jan 20, 2022)

GoldenX said:


> We're still in January and we have the worst product of 2022 already.
> Can't wait to see the 3050 at 600 USD.


At least that's shaping up to be a better actual product: 8 GB VRAM, a non-gimped PCIe interface, no stripped encoding features.

I've said in the past that "there are no bad products, just bad prices", and that might well apply to the desktop 3050, but this 6500 XT *is* a bad product *and* a bad price. It's _insulting_.


----------



## ReallyBigMistake (Jan 20, 2022)

Watch Dogs: Legion uses the Disrupt engine, not Dunia. Dunia is only used in Far Cry games, with Far Cry 3-6 using Dunia 2. Dunia is a fork of CryEngine 1.


----------



## Jism (Jan 20, 2022)

Mussels said:


> Au pricing is out
> 
> I gave away an Asus Rog Strix RX580 8GB for christmas, you're telling me it was worth $600, AMD?



Lmao... I have to laugh so hard at prices being jacked up into $600 territory, slapping a three-fan cooler on a card and thinking that makes it big or worth the money.

$600 used to net you the fastest card of a generation. These days not even $4,000 guarantees the same.

I think all manufacturers went the Apple route: charge premium prices for products.

It's just insane. I would never buy from a scalper either; I'd sooner just re-paste and clean my current video card.


----------



## AusWolf (Jan 20, 2022)

wolf said:


> At least that's shaping up to be better actual product. 8gb VRAM, non gimped pcie interface, no stripped encoding / features.
> 
> Ive said in the past "there are no bad products, just bad prices", and that might well apply to the desktop 3050, but this 6500XT *is* a bad product, *and* a bad price, it's _insulting_.


Sadly true. 1650 Super-level performance with 1650 Super-level power consumption... which would be fine at 12 nm, not at 6. Where are the advantages of the new production node?

With gimped video en-/decoding and useless raytracing, this is a $200 card from 2016, not 2022. AMD should be ashamed.


----------



## Ravenas (Jan 20, 2022)

I'm really surprised to see such bashing on this forum. It's a power-efficient card that performs well at 1080p and costs $200. Relatively speaking, if this card weren't on the market, you would still see 5500 XT prices at $500 or worse. I sold my 5700 XT for $900 in 2020 after using it for a year and a half, twice the $450 I paid for it.

I am happy to see it on the market.


----------



## AusWolf (Jan 20, 2022)

Ravenas said:


> I'm really surprised to see such bashing on this forum. It's a power efficient card that performs well at 1080p, and costs $200. Relatively speaking, if the card wasn't on the market, you would still see 5500 XT prices at $500 and the world's worst, I sold my 5700 XT for $900 in early 2020.
> 
> I am happy to see it on the market.


Power efficient?  It has a worse performance-to-power ratio than the 1650 Super, which is based on a 12 nm chip; this one is 6 nm. It sips 100 Watts for nothing, barely surpassing the base 1650, which doesn't even need a power connector! I can de-tune my 2070 to 125 W and it will still run miles around this thing overclocked.


----------



## Ravenas (Jan 20, 2022)

AusWolf said:


> Power efficient?  It has worse performance to power ratio than the 1650 Super, which is based on a 12 nm chip. This one is 6 nm. It sips 100 Watts for nothing, barely surpassing the base 1650 which doesn't even need a power connector! I can de-tune my 2070 to 125 W, and it will still run miles around this thing overclocked.



Its power consumption is extremely low, it plays most games at 1080p well, and it costs only $200. It's not a bad card. Expecting more for $200 is pretty much just fantasy at this point.


----------



## Mistral (Jan 20, 2022)

Did AMD mean to launch this card with RSR always enabled and the software team was just behind schedule?


----------



## InVasMani (Jan 20, 2022)

AMD just needs to make an APU that can CrossFire with these and suddenly they would actually sell. Apparently they're sold out anyway at scalped prices, but to whom?!


----------



## phanbuey (Jan 20, 2022)

InVasMani said:


> AMD just needs to make a APU that can crossfire with these and suddenly they would actually sell, but apparently they are sold out anyway at scalped prices, but like to who!!?



I hope it's just a bunch of scalpers who get stuck with them... maybe this is the long game for amd... "The way we are going to get GPUS to gamers is to release a card no sane gamer will buy at MSRP but scalpers will buy up instantly, then stick them with millions of dollars of worthless GPUs LEL"

"And we are going to do it right before the 7000 series launch just so those guys are sitting on a ton of inventory when the new cards drop."


----------



## seth1911 (Jan 20, 2022)

Quicks said:


> 2022 hall of fame worst GPU's ever made. #1


No, it's the Radeon Pro W 6300M CAD card with its 32-bit interface.


----------



## wolf (Jan 20, 2022)

Ravenas said:


> It's a power efficient card





Ravenas said:


> It’s power consumption is extremely low


Power efficiency and low consumption are not the same thing. A card could use only 50 Watts, but if it performed only 25% as well as this card, it would be half as efficient. Just because consumption is 'low' doesn't mean the card is efficient, and with the 6500 XT that seems to be exactly the case.
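A quick sketch of the distinction, using the hypothetical 50 W / 25%-performance numbers from above:

```python
# Efficiency is performance per watt, not just low draw.
# Numbers below are the hypothetical example, not measured results.

def perf_per_watt(relative_perf, watts):
    return relative_perf / watts

reference = perf_per_watt(1.00, 100)  # e.g. 100% performance at 100 W
low_draw  = perf_per_watt(0.25, 50)   # draws half the power...

print(low_draw / reference)  # 0.5: ...but is only half as efficient
```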


AusWolf said:


> Power efficient?  It has worse performance to power ratio than the 1650 Super


Spot on. 

I keep looking for a way, any way, that this card can redeem itself or claw back some positives, and I'm just not seeing any.


----------



## Ravenas (Jan 20, 2022)

wolf said:


> Power efficiency and low consumption are not the same thing. A card could use only 50 Watts, but if it only performed 25% as good as this card it would be half as efficient. Just because it's 'low' doesn't mean it's good, and with the 6500XT that seems to be exactly the case.
> 
> Spot on.
> 
> I keep looking for a way, any way, that this card can redeem itself or claw back some positives, and I'm just not seeing any.



I did misspeak; I was referring to power consumption, and responded that way in my follow-up. The card is priced accordingly and delivers at that price: a $200 1080p gaming card with relatively small power draw. Again, it's not a bad card for the price point.

People expecting significantly more for $200 these days are just in la-la land. The 1650 has been going for ~$299 on eBay; if it's SO much better, go buy them.


----------



## wolf (Jan 20, 2022)

Ravenas said:


> Again it’s not a bad card for the price point.


Hard disagree, when equal or more performance was on offer for less money 2-3-4-5 years ago, even a $199 MSRP for this card is an insult, let alone the street price.


Ravenas said:


> The 1650 has been on eBay priced at ~ $299, if it is SO much better go buy them.


Some definitely will.


Ravenas said:


> I did misspeak, I was referring to power consumption, and responded that way in my follow up.


Fair and no worries.

But for real, while everything in this market is overpriced relative to the performance you get, I'd much rather pay more and actually get a much better product. Even the 6600/XT are a major step up in performance at a similar price-to-performance ratio, not to mention far less gimped.


----------



## InVasMani (Jan 20, 2022)

"It's barely a flesh wound" best describes all the ways this card has been cut down at the expense of performance.


----------



## W1zzard (Jan 20, 2022)

ReallyBigMistake said:


> Watch Dogs Legion uses the Disrupt Engine and not Dunia. Dunia is only used on Far Cry games with Far Cry 3-6 using Dunia 2. Dunia is a fork of CryEngine 1


Indeed, and nobody noticed that mistake for like a year. Fixed in 34 reviews


----------



## InVasMani (Jan 20, 2022)

The fact that a GTX 285 has more memory bandwidth is rather cringe.
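The back-of-envelope numbers bear this out (using the commonly published specs: 512-bit GDDR3 at ~2.48 Gbps effective for the GTX 285, 64-bit GDDR6 at 18 Gbps for the 6500 XT; Infinity Cache not counted):

```python
# Peak memory bandwidth = (bus width in bits / 8) * effective data rate (Gbps).

def mem_bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

gtx_285   = mem_bandwidth_gbs(512, 2.484)  # 2009's GTX 285
rx_6500xt = mem_bandwidth_gbs(64, 18.0)    # 2022's RX 6500 XT

print(round(gtx_285), round(rx_6500xt))  # 159 vs 144 GB/s
```

A 13-year-old card out-muscling a new one on raw bandwidth, even before PCIe limits enter the picture.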


----------



## AusWolf (Jan 20, 2022)

Ravenas said:


> It’s power consumption is extremely low, it plays most games at 1080P well, and it costs only $200. It’s not a bad card. Expecting more for $200 is pretty much just fantasy thought.


Only 200 bucks? Really? Its main competitor, the 1650 Super, launched at $159 in 2019! So for 40 bucks more, you get a three-year-old entry-level card? It doesn't look like a good value proposition to me. As for its power consumption, it's not low enough to drop the power connector, despite being based on the soooo advanced 6 nm process, so for extremely small form factor PCs it's rubbish. It doesn't have Navi 2's video decode engine, so it's rubbish for HTPCs as well. It's already been established that it's too expensive for the gaming performance it offers. What is it good for, then?



wolf said:


> I keep looking for a way, any way, that this card can redeem itself or claw back some positives, and I'm just not seeing any.


Same here.


----------



## Turmania (Jan 20, 2022)

On average the RX 6600 consumes 15 W more but performs twice as fast, so 70 fps becomes around 130. And to think this is built on the world's first 6 nm GPU.


----------



## Pepamami (Jan 20, 2022)

AusWolf said:


> Its main competitor, the 1650 Super was launched for 159 in 2019


It's $250-400 in 2021-2022. In 2019 this new "abomination" would have cost like $80-100 max; AMD literally made it out of dirt.
This card is not an AMD problem, it's the new symbol of the current GPU market drama.


----------



## mama (Jan 20, 2022)

Who is this card for? If AMD meant it as a low-end card, why cripple it on PCIe Gen 3?


----------



## wolf (Jan 20, 2022)

Pepamami said:


> This card is not amd problem, its a new symbol of "current GPU market drama"


This card is absolutely AMD's problem. They could have gotten away with a much less negative launch if it had PCIe x8 and didn't strip the encoding capabilities (yes, I realise why this had to be the case for Navi 24 specifically). OR they could have called it the Radeon 6300/6400, or maybe 6500 LE, and charged an MSRP of $119-149, which would affect the street price.


----------



## AusWolf (Jan 20, 2022)

Pepamami said:


> its 250$-400$ in 2021-2022. In 2019 that new "abomination" would have cost like 80-100$ max, amd literaly made it out of dirt.
> This card is not amd problem, its a new symbol of "current GPU market drama"


The 1650 Super is currently priced by scalpers; the 6500 XT is priced by AMD and its partners. So yes, it's absolutely AMD's problem.
They're not simply going with the flow of "GPU market drama", they're outright profiteering from it.

I've just checked: Sapphire's Pulse is out at Overclockers UK for £225, that's 306 US dollars. The ASUS TUF in this review goes for £329 = 448 USD. These are retail prices, not scalped ones. The 1650 goes for £150-200 on eBay, and that's a GPU that doesn't cry for mama when used in a PCIe 3.0 (let alone 2.0) motherboard.


----------



## seth1911 (Jan 20, 2022)

The fact that a GT 710 has an 8-lane connection...


----------



## catulitechup (Jan 20, 2022)

wolf said:


> This card is absolutely AMD'S problem, and they could have gotten away with a much less negative launch if it had pcie 8x and didn't strip the encoding capabilities (yes, I realise why this had to be this way for navi 24 specifically), OR, called it the Radeon 6300/6400 or maybe 6500 LE and charged an msrp of $119-149 which would affect the street price.



This utter garbage doesn't deserve money, because many old GPUs selling for around $70 (Microcenter), like the GT 730 (and I think the 192-shader, GK208-based GT 710* too), have NVENC; older Radeons from the HD 77xx onwards have VCE as well, and various APU models also have encode capability.

* https://www.techpowerup.com/gpu-specs/geforce-gt-710.c3027
Curiously, NVIDIA removed Kepler GPUs from its list of NVENC-capable cards even though those GPUs are still on the market. This article from Elgato has more information about NVENC support on various Kepler GPUs:


Which NVIDIA graphic cards do support NVENC technology? (help.elgato.com)

NVENC is a technology used by NVIDIA that handles video hardware encoding. Many NVIDIA GPUs support this technology, among others some GeForce GPUs used in desktop and mobile computers. In order to ...





@W1zzard maybe it would be good to add encode and decode support to the GPU database for more information


----------



## ARF (Jan 20, 2022)

wolf said:


> This card is absolutely AMD'S problem, and they could have gotten away with a much less negative launch if it had pcie 8x and didn't strip the encoding capabilities (yes, I realise why this had to be this way for navi 24 specifically), OR, called it the Radeon 6300/6400 or maybe 6500 LE and charged an msrp of $119-149 which would affect the street price.



Or simply Radeon R3 6050 LE..


----------



## Ravenas (Jan 20, 2022)

wolf said:


> Hard disagree, when equal or more performance was on offer for less money 2-3-4-5 years ago, even a $199 MSRP for this card is an insult, let alone the street price.
> 
> Some definitely will.
> 
> ...



I find it comical that people are still using pricing from 3-5 years ago as a baseline for what they should be paying for a GPU today. It's absurd. You can spend the next two years complaining about pricing, or you can move on and just play consoles.

I think $200 for a card that can play most games at 1080p is not a bad proposition, especially for someone who isn't looking for anything more than that. Why would I buy a 6800 XT if I don't need it? You're complaining about pricing, but at the same time you're recommending spending extra dollars just to get what you consider the better-engineered product, for a use case I may not even have.


----------



## Chrispy_ (Jan 20, 2022)

trsttte said:


> it doesn't correlate with what the APU will perform. I'd think it actually bodes pretty well for the APUs given they have similar compute unit counts.


I read yesterday that Navi 24 was never planned as a desktop dGPU, which explains the lack of HW encode/decode features and is the primary reason for the limitation of two display outputs.

It was originally intended as a mobile dGPU to boost Ryzen 6000-series APUs, so the removed HW encode/decode blocks are not included because they already exist in the APU and would be a superfluous waste of silicon on the dGPU. The limited memory bus width is also because it was only ever going to be given access to slow main memory over a 64-bit bus. The PCIe x4 interconnect is because that's all it needed to talk directly to the APU, and wider buses drain power and the extra traces take up _extremely valuable_ PCB real estate in a mobile part.

So, this is a laptop dGPU hastily cobbled together into a desktop card and clocked to the moon. *It doesn't tell us anything useful about scaling like I originally thought,* because it's completely incomparable to the desktop RX6000-series with its GDDR6 and fully-featured silicon.


----------



## catulitechup (Jan 20, 2022)

Chrispy_ said:


> I read yesterday that the limited display outputs and lack of HW encode/decode features are because Navi24 was never planned as a desktop dGPU. This is the primary reason for the limitation of two display outputs.
> 
> It was originally intended as a mobile dGPU to boost Ryzen 6000-series APUs so the removed HW encode/decode blocks are not included because they already exist in the APU and would be a superfluous waste of silicon on the dGPU. The limited memory bus width is also because it was only ever going to be given access to main memory. The x4 bus is because that's all it needed to talk directly to the APU, and wider buses drain power and the extra traces take up extremely valuable PCB real estate in a mobile part.
> 
> So, this is a laptop dGPU hastily cobbled together into a desktop card and clocked to the moon. *It doesn't tell us anything useful about scaling like I originally thought,* because it's completely incomparable to the desktop RX6000-series with its GDDR6 and fully-featured silicon.



This is an irresponsible amount of mediocrity in a new GPU.


----------



## Tetras (Jan 20, 2022)

Ravenas said:


> I find it comical that people are still talking about pricing that was 3-5 years ago, and using that as a baseline for what they should be paying for a GPU today. It’s absurd. You can spend the last two years complaining about pricing, or you can move on and just play consoles.
> 
> I think $200 for a card that can play most games at 1080P is not a bad proposition, especially for someone who is not looking for anything more than that. Why would I buy a 6800 XT if I don't need it? You're complaining about pricing, but on the other hand you're recommending spending extra dollars just to justify what you think is the most well-engineered option for an environment you may not need.



You have a different opinion from practically every tech reviewer on this card. Technology is supposed to progress, and this can't beat cards released half a decade ago for the same price. People have been waiting a long time for reasonably priced upgrades, and to be confronted with this, it's just insulting how much they crippled it. It can't even beat the preceding model (5500 XT).

I agree that it's nice to have _something_ at this price point, but wolf is right, it's bad value for money, especially on PCI-E 3.0. A 6600 or 6600 XT on launch day was much better value than this, and it was a meaningful upgrade for many people; the architecture is actually an improvement on the market too (fps per watt), unlike with the 6500 XT. It's not new that the midrange card is better value than the budget one(s), it's been like that for years, but the performance level here is a big regression (especially when it's realistically going to end up at $300-$400). We'll see if the 3050 is better, but AMD gave it a low bar to beat.


----------



## trsttte (Jan 20, 2022)

Chrispy_ said:


> I read yesterday that the limited display outputs and lack of HW encode/decode features are because Navi24 was never planned as a desktop dGPU. This is the primary reason for the limitation of two display outputs.
> 
> It was originally intended as a mobile dGPU to boost Ryzen 6000-series APUs, so the HW encode/decode blocks were removed because they already exist in the APU and would be a superfluous waste of silicon on the dGPU. The limited memory bus width is also because it was only ever going to be given access to slow main memory over a 64-bit bus. The PCIe x4 interconnect is because that's all it needed to talk directly to the APU, and wider buses drain power and the extra traces take up _extremely valuable_ PCB real estate in a mobile part.
> 
> So, this is a laptop dGPU hastily cobbled together into a desktop card and clocked to the moon. *It doesn't tell us anything useful about scaling like I originally thought,* because it's completely incomparable to the desktop RX6000-series with its GDDR6 and fully-featured silicon.



I agree; what I meant, and what I was responding to, is that given the similar number of compute units, the Ryzen 6000 APUs should perform quite well and give quite an uplift compared to previous generations.

I don't think there's any info on die size yet for Ryzen 6000, but going by the 5000-series APUs, which had a 180mm2 die, and the 4000 series, which used 156mm2, and doing some napkin math with the Ryzen chiplet size of ~75mm2 and the Navi 24 size of ~100mm2, we'll be looking at pretty damn competent APUs. We can also see that any Ryzen 6000 laptop that ships with a Navi 24 dGPU will be a waste of silicon and power, because it won't offer that much more than what the APU already has (unless we see the return of some crossfire scheme).
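The napkin math above can be sketched quickly. A minimal sanity check, where all die-size figures are the post's estimates rather than official AMD numbers:

```python
# Rough die-area comparison (mm^2). All figures are the poster's
# estimates, not official AMD specifications.

ryzen_chiplet_mm2 = 75   # ~Zen 3 CCD, per the post
navi24_mm2 = 100         # Navi 24, per the post
apu_5000_mm2 = 180       # Cezanne (Ryzen 5000 APU), per the post
apu_4000_mm2 = 156       # Renoir (Ryzen 4000 APU), per the post

# A CPU chiplet plus a Navi 24-class graphics block lands in the same
# ballpark as existing monolithic APU dies.
naive_sum = ryzen_chiplet_mm2 + navi24_mm2
print(f"CCD + Navi 24-class graphics: ~{naive_sum} mm^2")  # ~175 mm^2
print(f"Ryzen 5000 APU: {apu_5000_mm2} mm^2, Ryzen 4000 APU: {apu_4000_mm2} mm^2")
```

In other words, a die budget similar to past APUs could plausibly fit both a full CPU chiplet's worth of cores and Navi 24-class graphics, which is the basis for the "pretty competent APUs" claim.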


----------



## Tetras (Jan 20, 2022)

trsttte said:


> I agree; what I meant, and what I was responding to, is that given the similar number of compute units, the Ryzen 6000 APUs should perform quite well and give quite an uplift compared to previous generations.
> 
> I don't think there's any info on die size yet for Ryzen 6000, but going by the 5000-series APUs, which had a 180mm2 die, and the 4000 series, which used 156mm2, and doing some napkin math with the Ryzen chiplet size of ~75mm2 and the Navi 24 size of ~100mm2, we'll be looking at pretty damn competent APUs. We can also see that any Ryzen 6000 laptop that ships with a Navi 24 dGPU will be a waste of silicon and power, because it won't offer that much more than what the APU already has (unless we see the return of some crossfire scheme).



Getting near to RX 570 performance in an APU would be quite impressive, but judging by how far they have to push navi24's clocks to achieve this, it seems very unlikely. With fast memory, I think the 5700G is somewhere between the 550 and 560, so with much lower clocks to fit the power envelope, I'm not even sure the APU will be much faster.


----------



## Chrispy_ (Jan 20, 2022)

catulitechup said:


> This is an irresponsible amount of mediocrity in a new GPU.


Well, it's more "an incomplete GPU never designed for use as a standalone solution", turned into a standalone solution because of greed/market desperation.

Knowing now what it really is, it explains the design choices that seemed odd and unnecessarily dumb for a desktop GPU.

What it doesn't excuse is the price. The MSRP of $199 is insulting and inappropriate, even in the current market. $350 street prices are so ridiculous that you're genuinely better off getting a $175 GTX 970 on eBay.



trsttte said:


> I agree; what I meant, and what I was responding to, is that given the similar number of compute units, the Ryzen 6000 APUs should perform quite well and give quite an uplift compared to previous generations.
> 
> I don't think there's any info on die size yet for Ryzen 6000, but going by the 5000-series APUs, which had a 180mm2 die, and the 4000 series, which used 156mm2, and doing some napkin math with the Ryzen chiplet size of ~75mm2 and the Navi 24 size of ~100mm2, we'll be looking at pretty damn competent APUs. We can also see that any Ryzen 6000 laptop that ships with a Navi 24 dGPU will be a waste of silicon and power, because it won't offer that much more than what the APU already has (unless we see the return of some crossfire scheme).


The APU will most likely be power-limited to a cTDP of 28-45W, sharing that budget with the CPU cores and memory controller. So even if it has the full 12 RDNA2 CUs, it will clock pretty conservatively, given that the graphics portion of the chip will rarely get the opportunity to use more than 50W under boost, and closer to 30W sustained.

I wouldn't call Navi24 a waste of silicon. In a laptop it's designed to just be extra RDNA2 CUs that the APU has access to, with additional VRAM and additional cooling hardware because it's isolated silicon. It's highly unlikely that the Navi24 silicon will be boosting to 2.8GHz; it's more likely to run somewhere between 1600MHz and 2200MHz, if I had to guess. If it's being paired with 28-45W APUs, the extra Navi24 die will also likely be tuned to consume a similar TDP, perhaps in the region of 35-65W.

There will be a kind of crossfire scheme coming back into play for these, I can't remember which AMD presentation I saw it in - either the RDNA2 slides or Ryzen 6000 slides. It was tied in with a revamp of the dynamic power juggling systems that are only possible with an APU and GPU from the same generation and where AMD controls the entire platform (so APU, dGPU, chipset, motherboard implementation).

Nobody outside AMD has seen it active yet, but that's what AMD's goal is with their active-bridge patent that uses Infinity Cache to string multiple pieces of RDNA2 silicon together over a narrower bus without huge penalties. I have low expectations of the technology but remain hopeful that it's not the same driver/compatibility/performance disappointment that SLI/Crossfire were. This multi-die approach has worked really well for AMD with Ryzen and Epyc; hopefully they can work the same magic with their GPUs.



Tetras said:


> Getting near to RX 570 performance in an APU would be quite impressive, but judging by how far they have to push navi24's clocks to achieve this, it seems very unlikely. With fast memory, I think the 5700G is somewhere between the 550 and 560, so with much lower clocks to fit the power envelope, I'm not even sure the APU will be much faster.


I don't think the 12 CUs in the APU will ever have the power budget to stretch their legs, but if we can get even close to a regular GTX 1650, or even just beat a 1050 Ti, that's a pretty significant step up from what we currently have with Vega 8.


----------



## seth1911 (Jan 20, 2022)

AMD hasn't cared about retail buyers since Ryzen 2.
No Renoir, no 5300G, no B450 with Ryzen 5xxx, no A320 with Ryzen 5xxx, and so on.


AMD is in some ways worse than Intel.
AMD can get away with that for now, but if they ever stumble in the future like they did with Bulldozer, they will have a huge problem, because low-budget consumers are now mostly with Intel, on CPUs like the i3-10100F or i5-10400F.
(The Alder Lake i3-12100F is insane.)


----------



## Chrispy_ (Jan 20, 2022)

seth1911 said:


> AMD hasn't cared about retail buyers since Ryzen 2.
> No Renoir, no 5300G, no B450 with Ryzen 5xxx, no A320 with Ryzen 5xxx, and so on.
> 
> 
> ...


Ignoring the low-end segment to maximise profits from limited TSMC allocation is the smartest short-term strategy.

The problem AMD has is that it can no longer be considered "short-term", and the longer-term implication of that strategy is a loss of customers. Every Intel B560 or B660 motherboard sold in the entry-level market is a socket for which AMD has no compatible upgrades to sell later.


----------



## catulitechup (Jan 20, 2022)

mechtech said:


> man it's unanimous across the web
> 
> glad I'm not the guy that approved making this card.  then again, probably sell every single one they make


----------



## InVasMani (Jan 20, 2022)

Bad as the Navi 24 chip is, three of these together in CrossFire with good scaling would be close in performance to the top-end cards, and that's over only an x12 link; throw in a fourth and it could potentially match or beat them in an MCM design. There is actually a possibility that a chip like this is part of AMD's longer-term intentions.

If they can make each one operate over an x4 PCIe link within an x16 slot and work well in tandem, they'd really have something. There is definitely room to improve this type of chip design in the future to fix its weaker and less sufficient aspects. I'm not saying that's what AMD is doing or trying to do here, but the theoretical possibilities aren't that bad either.

A very rough estimate is that the TDP would be about 321 W to 428 W, or a bit less, for a 3-way or 4-way MCM chip built from these, and AMD could very much improve that over time; in such a design, periodically updating to faster GDDR memory would also provide a nice lift. There is a good bit of room for hybrid designs as well: a 2-way MCM with a pair of these plus a pair of x4 NVMe drives would be good, with an option to link two such cards together.

The Rembrandt thing is a bit interesting, and what if they made an APU like Rembrandt on a discrete card!? A bit of 3D-stacked cache, an APU with a Navi 24 chip and a CCX CPU cluster, and who knows, maybe an NVMe slot or two, all on a PCIe x16 card!? A pair in CrossFire would be neat. Even at x4, they would still be quite interesting on a board with several x16 or x4 slots to combine them in.

The discrete card as it stands certainly sucks, but the Navi 24 chip itself is eyebrow-raising in how it might be applied elsewhere, especially with some refinement. It could be the start of some new approaches that aren't so bad. If the chip had been designed from the start to pair with other Navi 24 chips, it might turn out to be a better chip than we give it credit for, just not immediately, and probably not in this particular product.
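For what it's worth, the rough TDP range quoted above is consistent with simple linear scaling of a single card's board power. A quick sketch, where the ~107 W per-die figure is inferred from the range itself rather than taken from any official spec:

```python
# Sanity check of the 3-way/4-way TDP estimate above, assuming each
# Navi 24 die contributes ~107 W (the per-die figure implied by the
# quoted 321-428 W range; not an official number).

per_die_w = 107

for dies in (3, 4):
    print(f"{dies}-way MCM: ~{per_die_w * dies} W")  # 321 W and 428 W
```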


----------



## Shatun_Bear (Jan 20, 2022)

I wouldn't pay £50 for one of these.


----------



## trsttte (Jan 20, 2022)

InVasMani said:


> Bad as the Navi 24 chip is, three of these together in CrossFire with good scaling would be close in performance to the top-end cards, and that's over only an x12 link; throw in a fourth and it could potentially match or beat them in an MCM design. There is actually a possibility that a chip like this is part of AMD's longer-term intentions.
> 
> If they can make each one operate over an x4 PCIe link within an x16 slot and work well in tandem, they'd really have something. There is definitely room to improve this type of chip design in the future to fix its weaker and less sufficient aspects. I'm not saying that's what AMD is doing or trying to do here, but the theoretical possibilities aren't that bad either.



Just no.

Crossfire was always at a disadvantage to SLI because there was no direct communication between the two GPUs, and both technologies failed anyway because any communication that needs to go through the PCIe slot is far slower than the GPU accessing VRAM. This has not changed, especially when we're talking about only four PCIe 4.0 lanes, which is the same as the eight PCIe 3.0 lanes you'd have in the old days (bifurcating the x16 slot to run two GPUs); the difference is that VRAM is now even faster than it was before.

The laptop scenario has some enticing possibilities but will probably suffer from the same problem: PCIe communication will still be much slower than VRAM access, so for real-time workloads (gaming, for example) you'll be plagued with stuttering and frame drops.

MCM will certainly work wonders, but it is in no way comparable to Crossfire, because the GPU has full access to the video memory. What kind of bottlenecks emerge from the architecture remains to be seen; for example, each MCM part might not have direct access to the entire memory bus, like on the first Ryzen chips where the L3 cache was divided per CCX, plus whatever other issues may appear. But it's a different design case.
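The bandwidth gap driving this argument can be put in rough numbers. A sketch using the usual ~1 GB/s-per-PCIe-3.0-lane rule of thumb (doubling per generation) and the RX 6500 XT's 64-bit, 18 Gbps GDDR6:

```python
# Rough bandwidth comparison: a narrow PCIe link vs. local VRAM.
# Uses ~1 GB/s per PCIe 3.0 lane as a rule of thumb, doubling each
# generation; the VRAM figure is 64-bit GDDR6 at 18 Gbps.

def pcie_gbps(gen: int, lanes: int) -> float:
    """Approximate usable PCIe bandwidth in GB/s."""
    return lanes * 2.0 ** (gen - 3)

vram_gbps = 18 * 64 / 8  # 18 Gbps/pin * 64 pins / 8 bits per byte

print(f"PCIe 4.0 x4 : ~{pcie_gbps(4, 4):.0f} GB/s")  # same as PCIe 3.0 x8
print(f"PCIe 3.0 x8 : ~{pcie_gbps(3, 8):.0f} GB/s")
print(f"GDDR6 64-bit: ~{vram_gbps:.0f} GB/s")
```

Even this entry-level card's VRAM is roughly 18x faster than the link any inter-GPU traffic would have to cross, which is the core of the stuttering argument above.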


----------



## tussinman (Jan 20, 2022)

Shatun_Bear said:


> I wouldn't pay £50 for one of these.


You know things are real bad when even Shatun_Bear of all people is dissing an AMD product


----------



## AdmiralThrawn (Jan 20, 2022)

What the hell are they doing? Intel blows them out of the water, and the only response they have in the market is a GPU that performs worse than a 570 and 580 in most games. Terrible marketing and absolutely no innovation. The 3050 is a much better card and only 50 dollars more, give or take. They could do much better.


----------



## catulitechup (Jan 20, 2022)

AdmiralThrawn said:


> The 3050 is a much better card and* only 50 dollars more *give or take


----------



## Fluffmeister (Jan 21, 2022)

tussinman said:


> You know things are real bad when even Shatun_Bear of all people is dissing an AMD product



I hear ya, I felt a great disturbance in the Force, as if millions of voices suddenly cried out in terror and were suddenly silenced. I fear something terrible has happened.


----------



## AusWolf (Jan 21, 2022)

Ravenas said:


> I find it comical that people are still talking about pricing that was 3-5 years ago, and using that as a baseline for what they should be paying for a GPU today. It’s absurd. You can spend the last two years complaining about pricing, or you can move on and just play consoles.
> 
> I think $200 for a card that can play most games at 1080P is not a bad proposition, especially for someone who is not looking for anything more than that. Why would I buy a 6800 XT if I don't need it? You're complaining about pricing, but on the other hand you're recommending spending extra dollars just to justify what you think is the most well-engineered option for an environment you may not need.


I don't know about others, but I'm not complaining about paying $200 for an entry-level card. I'm complaining about paying $200 for an entry-level card that shits itself in a PCI-e 3.0 (and older) slot, misses modern video acceleration features and draws nearly 100 Watts, when 3-4 year old cards offer similar performance with no power connector needed.



Chrispy_ said:


> I read yesterday that the limited display outputs and lack of HW encode/decode features are because Navi24 was never planned as a desktop dGPU. This is the primary reason for the limitation of two display outputs.
> 
> It was originally intended as a mobile dGPU to boost Ryzen 6000-series APUs, so the HW encode/decode blocks were removed because they already exist in the APU and would be a superfluous waste of silicon on the dGPU. The limited memory bus width is also because it was only ever going to be given access to slow main memory over a 64-bit bus. The PCIe x4 interconnect is because that's all it needed to talk directly to the APU, and wider buses drain power and the extra traces take up _extremely valuable_ PCB real estate in a mobile part.
> 
> So, this is a laptop dGPU hastily cobbled together into a desktop card and clocked to the moon. *It doesn't tell us anything useful about scaling like I originally thought,* because it's completely incomparable to the desktop RX6000-series with its GDDR6 and fully-featured silicon.


That's fine, but it doesn't make the final product any less shit.


----------



## Chrispy_ (Jan 21, 2022)

AusWolf said:


> That's fine, but it doesn't make the final product any less shit.


Correct.

In the pre-ETH GPU market, it would be a tough sell at $149. Chances are good the street price would have been discounted by 10-20% until it was able to justify its existence over old 5500 XT stock (MSRP $169 in 2019) and superior used options like a GTX 1060.

Sadly, the normal GPU market is the ETH GPU market. Gamers get table scraps if they're lucky, and the 6500XT is a pretty meagre table scrap of unwanted offal.


----------



## Ravenas (Jan 21, 2022)

AusWolf said:


> I don't know about others, but I'm not complaining about paying $200 for an entry-level card. I'm complaining about paying $200 for an entry-level card that shits itself in a PCI-e 3.0 (and older) slot, misses modern video acceleration features and draws nearly 100 Watts, when 3-4 year old cards offer similar performance with no power connector needed.



3-4 year old cards have been bringing $350 to $400 on eBay up until now, with somewhat less performance and slightly higher power consumption. Those cards are unavailable except on eBay or as B-stock. $200 is a market correction; let's see if these cards remain available 2 to 3 weeks after the launch rush dies down. I find it odd the community is beating up on what is by definition an entry-level card.


----------



## Nihilus (Jan 21, 2022)

Ravenas said:


> People expecting significantly more for $200 in this day and age are just in La La land. The 1650 has been on eBay priced at ~$299; if it is SO much better, go buy them.




Why?  Seriously, why?  "Expecting more this day and age" is such a weak cop out answer.  

Continue to use mining and human malware as excuses for GPU manufacturers all you want. However, there was NOTHING forcing AMD into making this such a nerfed card.


----------



## Ravenas (Jan 21, 2022)

Nihilus said:


> Why?  Seriously, why?  "Expecting more this day and age" is such a weak cop out answer.
> 
> Continue to use mining and human malware as excuses for GPU manufacturers all you want. However, there was NOTHING forcing AMD into making this such a nerfed card.



A nerfed card? The 6500 has a target audience, and that audience has a price point of $200 MSRP. The cards launched on Best Buy at that price. Depending on the demand, they may or may not be on the shelves 2 weeks from now.

If I were looking for an entry level PC, with capability of 1080P gaming, this would be one of my choices. If you are complaining about the performance of this card, you probably aren't the target audience for it anyhow.

Pricing of items increases over time. Supply and demand play in. However, as I recall, Nvidia was the only maker of enthusiast-grade GPUs on the market until AMD's 6900 XT. The Titan was priced accordingly, and so are current-generation cards. If you are looking to buy them second hand, you will certainly be paying elevated pricing.


----------



## AusWolf (Jan 21, 2022)

Ravenas said:


> 3-4 year old cards have been bringing $350 to $400 on eBay up until now, with somewhat less performance and slightly higher power consumption. Those cards are unavailable except on eBay or as B-stock. $200 is a market correction; let's see if these cards remain available 2 to 3 weeks after the launch rush dies down. I find it odd the community is beating up on what is by definition an entry-level card.


That would be a fair point if you could stick the 6500 XT into a pci-e 3.0 or 2.0 port the same way you can those old cards.

As for power consumption, being on par with 3-4 year old cards at similar performance is OK, but I wouldn't call it an achievement of the new 6 nm process. Nvidia delivers the same power-to-performance ratio on 12 nm. A GPU of this class should not need a power connector in 2022.



Ravenas said:


> A nerfed card? The 6500 has a target audience, and that audience has a price point of $200 MSRP. The cards launched on Best Buy at that price. Depending on the demand, they may or may not be on the shelves 2 weeks from now.
> 
> If I were looking for an entry level PC, with capability of 1080P gaming, this would be one of my choices. If you are complaining about the performance of this card, you probably aren't the target audience for it anyhow.
> 
> Pricing of items increases over time. Supply and demand play in. However, as I recall, Nvidia was the only maker of enthusiast-grade GPUs on the market until AMD's 6900 XT. The Titan was priced accordingly, and so are current-generation cards. If you are looking to buy them second hand, you will certainly be paying elevated pricing.


AMD targeted scalped eBay prices; that's the problem. Graphics cards have been set to reasonable MSRPs so far, but with the 6500 XT, AMD decided to "scalp" their own product. If $200 were the final eBay price (and not the MSRP), nobody would complain.


----------



## Ravenas (Jan 21, 2022)

AusWolf said:


> That would be a fair point if you could stick the 6500 XT into a pci-e 3.0 or 2.0 port the same way you can those old cards.
> 
> As for power consumption, being on par with 3-4 year old cards at similar performance is OK, but I wouldn't call it an achievement of the new 6 nm process. Nvidia delivers the same power-to-performance ratio on 12 nm. A GPU of this class should not need a power connector in 2022.
> 
> ...



Scalped eBay prices are $350-400 for 1650 or 570/590 pre 6500 XT (they are probably on a downward trend now). AMD 6500 XT MSRP is $200. I don't see AMD targeting any scalpers as a reference for pricing. In my mind, the card is worth more than $100 and not worth more than $200. $200 graphics budget is a nice price point for an entry level PC with a dedicated GPU.

Again, the card is not (exceptionally) good, but it is not bad either. It meets a need for an entry level 1080P dedicated GPU. The community bashing this piece of technology is somewhat trendy or "bandwagon" to me. 

As a younger man, in 2012 I purchased an FX 8350. I can vaguely recall the amount of criticism that CPU received. I used it for a medium range gaming rig from 2012 to 2018. The key point was that compared to the Intel juggernaut of the time, it met a price point of $189. That CPU received plenty of criticism here and elsewhere, but for me, it was the perfect performance for the price. To each their own budget/need for gaming happiness.


----------



## AusWolf (Jan 21, 2022)

Ravenas said:


> Scalped eBay prices are $350-400 for 1650 or 570/590 pre 6500 XT (they are probably on a downward trend now). AMD 6500 XT MSRP is $200. I don't see AMD targeting any scalpers as a reference for pricing. In my mind, the card is worth more than $100 and not worth more than $200. $200 graphics budget is a nice price point for an entry level PC with a dedicated GPU.


Here in the UK, the 6500 XT starts around £220-250. You can have a 1650 for £200 on ebay. Then you can stick it into a pci-e 3.0 or 2.0 slot and be happy with it. With the 6500 XT, you need 4.0 which means AMD B550, X570 or Intel B560, Z590, B660 or Z690. The Intel 600 platforms are expensive, and the cheapest CPU you can get for the others to do pci-e 4.0 is either the Core i5-11400 or the Ryzen 5 5600X. Either way, it isn't cheap. If you just want a graphics card, you can pretty much forget about the 6500 XT. For $200 MSRP, it won't even perform like a $100 card in your old system.


----------



## Ravenas (Jan 21, 2022)

AusWolf said:


> Here in the UK, the 6500 XT starts around £220-250. You can have a 1650 for £200 on ebay. Then you can stick it into a pci-e 3.0 or 2.0 slot and be happy with it. With the 6500 XT, you need 4.0 which means AMD B550, X570 or Intel B560, Z590, B660 or Z690. The Intel 600 platforms are expensive, and the cheapest CPU you can get for the others to do pci-e 4.0 is either the Core i5-11400 or the Ryzen 5 5600X. Either way, it isn't cheap. If you just want a graphics card, you can pretty much forget about the 6500 XT. For $200 MSRP, it won't even perform like a $100 card in your old system.



I disagree, B550 boards are cheap. Best Buy has the 6500 XT at $200 MSRP. The range on Newegg is $199 to $269.

1650 used $289 https://www.ebay.com/itm/224801724151?epid=4035949541&hash=item34573a16f7:g:kqsAAOSwiWJh6tqk

1650 new $350 https://www.ebay.com/itm/185225818437?epid=8035384595&_trkparms=ispr=1&hash=item2b2051b145:g:qzcAAOSwIblhw64p&amdata=enc:AQAGAAACkPYe5NmHp%2B2JMhMi7yxGiTJkPrKr5t53CooMSQt2orsS7UXID%2BRPOSNsnm8kYPghtCBIQ9r00zrT5VqpN2KQKysYAcfjWzSW%2B21AIx8PQgVzB2oHgQmiCgDPlt%2BnMxuk3iAY8xN95t%2Bqck%2FWbO2mB8Z0NZANc91wFl7iimeO6OZL6dyFY1HiDLQ3LZgWFLq%2Bm7HeHB91gvkV4KXJU6avKICXuyrMckYMOKSgpz8rYoR0pt4zMCs1olPVcV8ln%2BLayU4xDgnUPmzchCNBCG7wCYg7pB5w%2Bd%2Bzfs1B84FnN01ejaL497CXjZsT8eYW%2Ba8CokknzvKWejMxNnwHCLGpLR8ahA4u7EOowmKLM4Z5o2bG18exYcTiDV0t8g%2BaQ5Wn5jaEZoljh7dVwjs7sBDFFOELMZlxMkTj7FIjhFTXq%2Bk3%2BedPBwWoAYG8ByX81HIoMJcDFxy6gjwzhTJe9rAr97aZ1zIhA3Hngo7nWLPdXmJ7ralRKlhWP17%2Bz2zoiJYlr8sqptOk3DopteQid%2BD39ZarC%2BzU0KZ83JL%2B0hpx%2F5bhMFq6uHCvuiPtV%2BN2uBKNWsHB9uNX8fyyulSz2%2Fjt7jEGdNd7zraaJ%2B%2FexG%2Fu1KwOz9b2yOxCbK6fLSAEhPd7oy6kLfg%2BlIy%2FySjV2ngI3IJD5bmMOTLdKzFzl1fs2s2oCCMYH9Gdy7hdW%2BROUezopNJZdaXgyqFWxnpRJuniD4gu%2Fwp0eHy%2FdztilWsBxFAy21xH2frjHRpMONbwhOrGTm6PYANZZUk3s9Y3YHg7Tj68S%2Fi9MVFQuhGfgO5KLjez4LAbAKEj%2B%2FLsaVTZRwJ7OZHRsVOmfMgOYLscE1u5WSwQSmQlh4Esp9l2RxWtxQnG|clp:2334524|tkp:BFBMguKG8c9f

1650 used $229 https://www.ebay.com/itm/304306850669?epid=5040500813&hash=item46da1a0b6d:g:tS8AAOSwjGBh34Xe

570 used $240 https://www.ebay.com/itm/165294873370?epid=6031752144&hash=item267c579b1a:g:R0UAAOSw2eFh6yYj


----------



## trsttte (Jan 22, 2022)

The 5500 XT 4GB launched at $169; the 8GB quickly followed at $200. Both perform better than the now-$200 6500 XT.

The 6500 XT is an "awesome" product only for whoever has no other choice. It might be the best thing you can actually get, but there's no reason anyone should be happy about it.


----------



## Ravenas (Jan 22, 2022)

trsttte said:


> The 5500 XT 4GB launched at $169; the 8GB quickly followed at $200. Both perform better than the now-$200 6500 XT.
> 
> The 6500 XT is an "awesome" product only for whoever has no other choice. It might be the best thing you can actually get, but there's no reason anyone should be happy about it.



Supply and demand. Inflation. All factors which have increased computer part pricing. I have stated multiple times that I purchased a Sapphire 5700 XT Nitro for $470 new, and sold it a little more than a year ago for $900 on eBay.

Pricing is not 100% resistant to all the factors associated with scalping, supply and demand, and inflation. Those who want to game with good frames today at 4K ultra settings have to make a conscious decision to say I'm all in because pricing is obviously high right now.

What I have stated is that if my mission was to get an entry-level PC for gaming at 1080P, I would have absolutely no problem with the 6500 XT being one of my options. The wave of criticism in that regard is unwarranted. One alternative could just be settling for console gaming, and consoles are also affected by all of the previously mentioned factors.


----------



## AusWolf (Jan 22, 2022)

Ravenas said:


> I disagree, B550 boards are cheap. Best Buy has the 6500 XT at $200 MSRP. The range on Newegg is $199 to $269.
> 
> 1650 used $289 https://www.ebay.com/itm/224801724151?epid=4035949541&hash=item34573a16f7:g:kqsAAOSwiWJh6tqk
> 
> ...


B550 is cheap, but the cheapest CPU you can currently buy that allows you to use PCI-e 4.0 is the 5600X, which brings the price up quite a bit. If you need a temporary solution, you're better off with a 5600G and its integrated graphics until you can buy a proper x16 card. Or even better, buy Intel B560 and a 10th-gen Core i3. With the money you save, you may be able to afford a 6600 or 3060. There are many better options than the 6500 XT, and that's what I mean: it's not a bad card, but being on PCI-e x4 limits it so much that basically every other option is a better alternative.


----------



## Ravenas (Jan 22, 2022)

AusWolf said:


> B550 is cheap, but the cheapest CPU you can currently buy that allows you to use PCI-e 4.0 is the 5600X, which brings the price up quite a bit. If you need a temporary solution, you're better off with a 5600G and its integrated graphics until you can buy a proper x16 card. Or even better, buy Intel B560 and a 10th-gen Core i3. With the money you save, you may be able to afford a 6600 or 3060. There are many better options than the 6500 XT, and that's what I mean: it's not a bad card, but being on PCI-e x4 limits it so much that basically every other option is a better alternative.



I hear where you are coming from; however, APU technology is not quite there. The 5600G will get you about ~26 average FPS across a 1080P game suite. Conversely, you are looking at an average of ~64 FPS across the same suite with the 6500 XT. An APU is not really even a temporary solution, because you would be incapable of achieving an average of even 30 FPS with it.

If you are looking at getting into PC gaming, and are ok with absolute entry level 1080P performance with fluid FPS accordingly, you have to start at entry level pricing. This is just a consequence of taking the plunge.


----------



## Tetras (Jan 22, 2022)

Ravenas said:


> Pricing is not 100% resistant to all the factors associated with scalping, supply and demand, and inflation. Those who want to game with good frames today at 4K ultra settings have to make a conscious decision to say I'm all in because pricing is obviously high right now.
> 
> What I have stated is that if my mission was to get an entry-level PC for gaming at 1080P, I would have absolutely no problem with the 6500 XT being one of my options. The wave of criticism in that regard is unwarranted. One alternative could just be settling for console gaming, and consoles are also affected by all of the previously mentioned factors.



No-one is saying that pricing isn't high right now, but we're talking about _5-year-old levels of performance_; even with higher demand and inflationary factors, that's just ridiculous.

As an example: 5 years ago the top CPUs were the Ryzen 7 1700/X and the Intel i7-8700/K, both priced at approximately $350. For around $350 now, you can get an i7-12700 non-K or Ryzen 7 5800X, which blow them away; their raw computing performance has doubled. Meanwhile, in the world of graphics cards, AMD doesn't even care if it's faster.

Is the 6500 XT an acceptable option for entry-level gaming? Yeah, I suppose. But, did it need to be so limited and lose features historically present at this price point? No. Everyone involved in making graphics cards is making record profits right now. A turd is a turd, even if it's the best turd we've got. They didn't even have to do much to make it a lot more usable, as suggested above, a 96-bit bus with 6GB memory would have helped a lot. A few more CUs would have also pushed it comfortably ahead of the RX 5500.

It's barely fit for purpose as-is, except for a budget gaming build that has to be new and can't get anything else. With the RX 6600, it wasn't easy for AMD to justify releasing a midrange card intended for 1080p heading into 2022, but at least it can hold 60 fps at high/max details and is usually playable at 1440p. I think it was GN that made the point that the RX 6500 XT is almost a 720p card nowadays (or it soon will be), and 720p monitors are obsolete! And since they deleted the HTPC features, it'll have artificially limited usefulness even when it's retired. It would have been really freaking helpful for old systems if AV1 becomes mandatory for video content in a few years, but nope, AMD says "%@! you" for trying that one, here's a paperweight.

Most reviewers are using historical products as a reference, since they've been around a while, so sure, like GN said, you CAN ignore this and only look at the market right now, but why should you? Should we continue to pay the same price for an arguably worse-performing product, with fewer features, 5 years later? A lot of gamers have done exactly what you suggested and abandoned PC gaming for consoles.


----------



## Ravenas (Jan 22, 2022)

Tetras said:


> No-one is saying that pricing isn't high right now, but we're talking about _5 years old levels of performance_, even if there's more demand and inflationary factors that's just ridiculous.
> 
> As an example: 5 years ago the top CPUs were the Ryzen 7 1700/X and the Intel i7 8700/K, both priced at approximately $350. For around $350 now, you can get an i7-12700 non-K or Ryzen 7 5800X, which blow them away, their raw computing performance has doubled. Meanwhile, in the world of graphics cards, AMD don't even care if it's faster.
> 
> ...



In regard to future usability being the crux, well, that is inevitable with any investment in GPU technology. My 6900 XT soon won't be able to achieve 60 FPS in most 4K games. However, growth in FSR will drive lower-range GPU technology for AMD. When FSR can be implemented in any game without the need for developer input, the technology will provide an uplift to all aging GPUs.

Regardless, you touched on exactly the point. A budget gaming build GPU.


----------



## AusWolf (Jan 22, 2022)

Ravenas said:


> I hear where you are coming from, however, APU technology is not quite there. The 5600G will garner you about ~26 average FPS across 1080P game suite. Conversely, you are looking at an average of ~64 FPS across 1080P game suite with the 6500 XT. It's not really a temporary solution, because you would be incapable of even achieving an average of 30 FPS with an APU.
> 
> If you are looking at getting into PC gaming, and are ok with absolute entry level 1080P performance with fluid FPS accordingly, you have to start at entry level pricing. This is just a consequence of taking the plunge.


I don't mean to buy an APU and stay with it, but it's fine to play at reduced details until you can buy something better. If you're a new system builder, a 10th gen Core i3 with a 6600 or 3060 would cost you about the same as any PCI-e gen 4 capable CPU with the 6500 XT, but gives you tons more gaming power. If you have an old system, PCI-e 3.0 or 2.0 will hugely bottleneck the 6500 XT. If you want an HTPC card, the GT 1030 is a much better value: it's cheaper, eats a lot less power, is available in LP form factor and has relatively OK video decode (though it doesn't have VP9 either). The 6500 XT is limited by so many factors that, while not terrible in its own use case, it has basically no target audience. It's an OK entry-level gaming card if you have a PCI-e 4.0 capable system, which I'm sure most people shopping in this range don't. Its ridiculous MSRP is just icing on the cake.
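As a back-of-the-envelope illustration of why the x4 link hurts so much on older platforms, here are the theoretical per-direction link bandwidths (standard per-lane rates after encoding overhead; a rough sketch, not measurements from this card, and real-world throughput is lower):

```python
# Rough theoretical per-direction PCIe bandwidth for a given link width.
# Per-lane rates in GB/s after encoding overhead:
#   gen2: 5 GT/s with 8b/10b    -> ~0.5 GB/s per lane
#   gen3: 8 GT/s with 128b/130b -> ~0.985 GB/s per lane
#   gen4: 16 GT/s with 128b/130b -> ~1.969 GB/s per lane
PER_LANE_GBPS = {"gen2": 0.500, "gen3": 0.985, "gen4": 1.969}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Theoretical per-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

for gen in ("gen2", "gen3", "gen4"):
    print(f"{gen} x4: {link_bandwidth(gen, 4):.1f} GB/s")
```

On a gen 3 slot the card gets half the bandwidth it was tuned for, and on gen 2 only a quarter, which lines up with how hard its performance drops on older boards.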

Edit: my question still stands: why does AMD's new 6 nm card with 2 RAM chips use as much power as nvidia's 3-4 year-old 12 nm cards of the same performance range with 4 RAM chips? My theory is that the 64-bit bus and PCI-e x4 interface bottleneck the card so much that AMD had to clock it through the roof for performance to be on par with said 3-4 year-old nvidia cards.


----------



## Ravenas (Jan 22, 2022)

AusWolf said:


> entry-level gaming card



The target audience. It's available. It's OK. The vast majority are on entry-level machines with 1080p resolution.


----------



## AusWolf (Jan 22, 2022)

Ravenas said:


> The target audience. It's available. It's OK. The vast majority are on entry level machines, with 1080P resolution.


No. The target audience is gamers with entry-level pci-e 4.0 capable systems. Systems that basically do not exist.


----------



## Tetras (Jan 22, 2022)

AusWolf said:


> Edit: my question still stands: why does AMD's new 6 nm card with 2 RAM chips use as much power as nvidia's 3-4 year-old 12 nm cards of the same performance range with 4 RAM chips? My theory is that the 64 bit bus and pci-e x4 interface bottlenecks the card so much that AMD had to clock it through the roof for performance to be on par with said 3-4 year-old nvidia cards.



It must be the clocks, since it uses less power on the lower-clocked Sapphire Pulse in TPU's review. But the weird thing is, the voltage appears to be very similar (on the efficiency & clock speed charts), and I didn't think frequency alone could explain that scale of difference, unless the lower clocks allow it to operate for longer at lower voltages, which isn't easy for me to read off the chart. It could be that Sapphire just uses a less aggressive boost behaviour to maintain them, but I'm not sure if AIBs have that much leeway.
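To put toy numbers on the clocks-vs-voltage question: dynamic switching power scales roughly as C·V²·f, so frequency alone is only a linear factor, but sustaining higher clocks usually drags voltage up too, and voltage enters squared. A sketch with made-up values (not measured from either card):

```python
def dynamic_power(cap: float, volts: float, freq_mhz: float) -> float:
    """Approximate dynamic switching power: P ~ C * V^2 * f (arbitrary units)."""
    return cap * volts ** 2 * freq_mhz

# Hypothetical lower-clocked card vs. one pushed ~12% higher with ~10% more voltage.
base = dynamic_power(1.0, 1.0, 2600)
hot = dynamic_power(1.0, 1.1, 2900)
print(f"power ratio: {hot / base:.2f}")  # a modest clock+voltage bump compounds to ~1.35x
```

So if the Pulse sustains slightly lower clocks at slightly lower voltage for more of the time, a sizeable power gap is plausible even when the peak voltages on the charts look similar.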


----------



## TheoneandonlyMrK (Jan 22, 2022)

Tetras said:


> No-one is saying that pricing isn't high right now, but we're talking about _5 years old levels of performance_, even if there's more demand and inflationary factors that's just ridiculous.
> 
> As an example: 5 years ago the top CPUs were the Ryzen 7 1700/X and the Intel i7 8700/K, both priced at approximately $350. For around $350 now, you can get an i7-12700 non-K or Ryzen 7 5800X, which blow them away, their raw computing performance has doubled. Meanwhile, in the world of graphics cards, AMD don't even care if it's faster.
> 
> ...


Until this came out, even building an entry-level gaming PC was not possible at all in the UK. I did some, all APUs, and this is 60% better than the GPU in a 5600G.

LTT and MLD had it right: it's not for you, it's just useful for new builds with no other choices.

And it couldn't be made at this price without the compromises that became necessary due to its triple use as a desktop dGPU, a laptop integrated GPU, and possibly a core component of a future APU (possibly).

Again, I wouldn't buy it for myself, but no option at all isn't better.


----------



## AusWolf (Jan 22, 2022)

TheoneandonlyMrK said:


> Until this came out even making a entry level gaming pc was not possible at all in the UK, I did some, all APUs and this is 60% better than the GPU in a 5600G.


Huh? Overclockers UK is literally stocked full of graphics cards, including 1050 Tis and 1650s. They even have a few 1660s and 1660 Supers, although they cost a bit more than what I'd pay for them.



TheoneandonlyMrK said:


> LTT and MLD had it right, it's not for you , it's just useful to new builds with no other choices.


You need to buy a Core i5-11400 or Ryzen 5 5600X to be able to properly use it. For that total system cost, you'd be better off buying a Core i3-10100 and a 6600 or 3060. If you've gone with the 10100 already and have only around £200-300 left for a graphics card, a 1650 is a much better choice, as the 6500 XT wouldn't do you any good on PCI-e 3.0.



TheoneandonlyMrK said:


> And it couldn't be made at this price without the compromises that became necessary due to its triple use as a Dgpu laptop integrated and possibly a core component of future Apu (possibly)


That's fine, but it compromises on far too many levels to be considered a viable alternative _to anything else_ as a desktop dGPU.


----------



## trsttte (Jan 22, 2022)

Ravenas said:


> Supply and demand. Inflation. All factors which have increased computer part pricing. I have stated multiple times that I purchased a Sapphire 5700 XT Nitro for $470 new, and sold it a little more than a year ago for $900 on eBay.
> 
> Pricing is not 100% resistant to all the factors associated with scalping, supply and demand, and inflation. Those who want to game with good frames today at 4K ultra settings have to make a conscious decision to say I'm all in because pricing is obviously high right now.



The thing is, we're talking about launch prices set by AMD. eBay/retailers are selling them for over $300/€300 (US tariffs aligned the prices a bit).

Now, who should profit from the current GPU market: AMD, retailers, or scalpers? AMD is the one putting in the work, so they decided, fuck it, they'll be the ones eating the pie. Fine, but again, we have no reason to be happy about it or celebrate it.


----------



## TheoneandonlyMrK (Jan 22, 2022)

AusWolf said:


> Huh? Overclockers UK is literally stocked full of graphics cards, including 1050 Tis and 1650s. They even have a few 1660s and 1660 Supers, although they cost a bit more than what I'd pay for them.
> 
> 
> You need to buy a Core i5-11400 or Ryzen 5 5600X to be able to properly use it. For that total system cost, you'd be better off buying a Core i3-10100 and a 6600 or 3060. If you've gone with the 10100 already, and have only around 2-300 cash left for a graphics card, a 1650 is a much better choice, as the 6500 XT wouldn't do you any good on PCI-e 3.0.
> ...


"Huh? Overclockers UK is literally stocked full of graphics cards, including 1050 Tis and 1650s. They even have a few 1660s and 1660 Supers, although they cost a bit more than what I'd pay for them."

They were not before Christmas, and a 1050 is £219 minimum now, so nah, not great, but fair enough, roughly equal-ish; fair point.

"You need to buy a Core i5-11400 or Ryzen 5 5600X to be able to properly use it. For that total system cost, you'd be better off buying a Core i3-10100 and a 6600 or 3060. If you've gone with the 10100 already, and have only around 2-300 cash left for a graphics card, a 1650 is a much better choice, as the 6500 XT wouldn't do you any good on PCI-e 3.0."

Fair enough. I didn't suggest everyone should buy one, though, or that I would always buy one in every scenario. But I don't get your pricing either: the board stays the same (the cheapest available, similar to DDR4) in any of these hypothetical systems, so where am I getting 2/300 quid from swapping a 12400 to a 10100, or even the 5600G to the 10100? The G was £269.

"That's fine, but it compromises on far too many levels to be considered a viable alternative _to anything else_ as a desktop dGPU."

You're allowed your opinion and I'm allowed mine; it isn't that bad, IMHO.


----------



## Ravenas (Jan 22, 2022)

AusWolf said:


> No. The target audience is gamers with entry-level pci-e 4.0 capable systems. Systems that basically do not exist.



Untrue. You are now just stating an opinion that entry-level hardware doesn't exist. I can build a system, including monitor and keyboard, for ~$1100. That's called entry level.

Fractal Design Core 1000 Black Micro ATX Mini Tower Computer Case - Newegg.com
ASUS PRIME B550M-A (WI-FI) AM4 Micro ATX AMD Motherboard - Newegg.com
PowerColor Fighter Radeon RX 6500 XT Video Card AXRX 6500XT 4GBD6-DH/OC - Newegg.com
AMD Ryzen 5 5600X 6-Core 3.7 GHz AM4 CPU Processor - Newegg.com
Crucial P2 500GB 3D NAND NVMe PCIe M.2 SSD - Newegg.com
LG 24BK400H-B 23.5" Full HD 1920 x 1080 75 Hz FreeSync (AMD Adaptive Sync) Monitor - Newegg.com
Logitech MK270 Wireless Keyboard and Mouse Combo 920-004536 - USB 2.0 RF Wireless Ergonomic Keyboard & Mouse - Newegg.com
CORSAIR Vengeance RGB Pro 16GB (2 x 8GB) DRAM DDR4 3000 Desktop Memory - Newegg.com
EVGA 650 BQ 110-BQ-0650-V1 80+ BRONZE 650W Semi Modular Includes FREE Power On Self Tester Power Supply - Newegg.com



trsttte said:


> The thing is, we're talkin about launch prices set by AMD. ebay/retailers are selling them for over 300$/€ (US tarifs aligned prices a bit).
> 
> Now who should profiteer from the current market for gpus, amd, retailers or scalpers? AMD is the one putting in the work so they decided fuck it they'll be the ones eating the pie. Fine, but again, we have no reasons to be happy about or celebrate it.



You can't worry about scalpers; they have always been here. It's only a problem when supply is lower than demand.


----------



## AusWolf (Jan 22, 2022)

TheoneandonlyMrK said:


> They were not before christmass and a 1050 is 219 minimum now so nah not great but fair enough equal ish but fair point.


The 6500 XT wasn't available before Christmas, either, so... 



TheoneandonlyMrK said:


> Fair enough I didn't suggest everyone should buy one though or that I would now always buy one in every scenario but i dont get your pricing either the board stays the same as cheapest available similar to ddr4 in any of these hypothetical systems, but where am i getting 2/300 quid from swapping a 12400 to a 10100 or even the 5600G to the 10100 the G was 269£


What I mean is, as a new builder, you need to fork out for some kind of middle-class part to get PCI-e gen 4.

B550 motherboards are cheap, but you need a 5600X at least. At £270, it's nearly 200 quid more expensive than a Core i3-10105F.
You can get a Core i3-12100F for £120, but a good B660 board will set you back almost £200, which is nearly double what a similar B560 would cost you.
Again, I'm not saying that the 6500 XT is a bad card purely because of its performance. I'm saying it's a bad card because the compromises AMD made with it limit it to a use case / buyer base that doesn't exist.



Ravenas said:


> Untrue. You are now just stating an opinion that entry level hardware doesn't exist. I can build a system, including monitor and keyboard, for ~ $1100. That's called entry level.


I said entry-level PCI-e 4.0 capable systems don't exist.

As I mentioned above, instead of a B550 + 5600X combo, you'd be better off with a B560 + i3-10100 and a 6600 or 3060.


----------



## Ravenas (Jan 22, 2022)

AusWolf said:


> I said entry-level PCI-e 4.0 capable systems don't exist.



They exist, I just quoted you one. $1100



AusWolf said:


> As I mentioned above, instead of a B550 + 5600X combo, you'd be better off with a B560 + i3-10100 and a 6600 or 3060.



Pick any combination of components you consider better at entry level; it's still entry level. There is no arguing around that. There is a market for it, and it's catered to.


----------



## Selaya (Jan 22, 2022)

b550 5600x and entrylevel, that's like a fucking oxymoron

as auswolf's already said, you're _far better off_ saving like $200 on board/cpu w/ the 10100; instead you should put the savings into a better gpu instead


----------



## Ravenas (Jan 22, 2022)

Selaya said:


> b550 5600x and entrylevel, that's like a fucking oxymoron
> 
> as auswolf's already said, you're _far better off_ saving like $200 on board/cpu w/ the 10100; instead you should put the savings into a better gpu instead



Most B550 are entry-level mobos; X570 is for enthusiasts. I don't really care which components you consider better for entry level, it's beside the point. Getting an entire computer setup for ~$1100 which can play most games at 1080p with an average of 60 FPS is an entry-level system.


----------



## AusWolf (Jan 22, 2022)

Ravenas said:


> They exist, I just quoted you one. $1100


That's not an entry-level system. Swap the motherboard and CPU for the Asus Prime B560-Plus and the Core i3-10100F and you have budget for a Radeon RX 6600 with $40 saved.


----------



## Ravenas (Jan 22, 2022)

AusWolf said:


> That's not an entry-level system. Swap the motherboard and CPU for the Asus Prime B560-Plus and the Core i3-10100F and you have budget for a Radeon RX 6600.



And the 6500 XT is still an entry level card priced at entry level pricing.


----------



## AusWolf (Jan 22, 2022)

Ravenas said:


> And the 6500 XT is still an entry level card priced at entry level pricing.


And the 5600X is a mid-level CPU with mid-level pricing. You'd be crazy to use it with a 6500 XT when you can have FAR BETTER gaming experience for the same money.


----------



## Ravenas (Jan 22, 2022)

AusWolf said:


> And the 5600X is a mid-level CPU with mid-level pricing. You'd be crazy to use it with a 6500 XT when you can have FAR BETTER gaming experience for the same money.



I gave you an example; I never required you to purchase the 5600X. Purchase a cheaper Intel CPU with a cheaper Intel board. Go with the 6500 XT and you still have entry-level performance at 1080p. If you wish to increase your performance, great, go with the 6600 you mentioned; you just pay more money.


----------



## AusWolf (Jan 22, 2022)

Ravenas said:


> Purchase a cheaper Intel with a cheaper Intel board. Go with the 6500 XT, and you still have entry level performance at 1080P.


No. You will have crap performance, because the 6500 XT cries home to mama when it's limited to PCI-e gen 3.


----------



## Selaya (Jan 22, 2022)

Ravenas said:


> I gave you an example, I never required you to purchase the 5600X. Purchase a cheaper Intel with a cheaper Intel board. Go with the 6500 XT, and you still have entry level performance at 1080P. IF you wish to increase your performance, great go with the 6600 you mentioned, you still pay more money.





AusWolf said:


> That's not an entry-level system. Swap the motherboard and CPU for the Asus Prime B560-Plus and the Core i3-10100F and you have budget for a Radeon RX 6600 with $40 saved.





AusWolf said:


> That's not an entry-level system. Swap the motherboard and CPU for the Asus Prime B560-Plus and the Core i3-10100F and you have budget for a Radeon RX 6600 *with $40 saved*.





AusWolf said:


> That's not an entry-level system. Swap the motherboard and CPU for the Asus Prime B560-Plus and the Core i3-10100F and you have budget for a Radeon RX 6600 *with $40 saved*.


monkaS


----------



## Ravenas (Jan 22, 2022)

AusWolf said:


> No. You will have crap performance, because the 6500 XT cries home to mama when it's limited to PCI-e gen 3.
> 
> View attachment 233517View attachment 233518View attachment 233519



Which is exactly why I recommend the 5600 

I actually forgot about the 5600G, Amazon.com: AMD Ryzen 5 5600G 6-Core 12-Thread Unlocked Desktop Processor with Radeon Graphics : Electronics . I like it a little more for the entry level build.

Would rather use this motherboard as well... GIGABYTE B450 AORUS PRO WIFI AMD Motherboard - Newegg.com


----------



## AusWolf (Jan 22, 2022)

Ravenas said:


> Which is exactly why I recommend the 5600


You mean the Radeon 5600? Straw man argument. Nice try.



Ravenas said:


> I actually forgot about the 5600G, Amazon.com: AMD Ryzen 5 5600G 6-Core 12-Thread Unlocked Desktop Processor with Radeon Graphics : Electronics . I like it a little more for the entry level build.


It's only cents/pennies cheaper than the 5600X. Besides, it can only do PCI-e gen 3. It's a good thing to buy if you don't want a dGPU right away, but you're happy to wait and save for later. Totally irrelevant CPU to be paired with a 6500 XT.



Ravenas said:


> Would rather use this motherboard as well... GIGABYTE B450 AORUS PRO WIFI AMD Motherboard - Newegg.com


Again, it only has PCI-e gen 3.


----------



## Ravenas (Jan 22, 2022)

AusWolf said:


> You mean the Radeon 5600? Straw man argument. Nice try.
> 
> 
> It's only cents/pennies cheaper than the 5600X. Besides, it can only do PCI-e gen 3. It's a good thing to buy if you don't want a dGPU right away, but you're happy to wait and save for later. Totally irrelevant product to be paired with a 6500 XT.
> ...



Ah, you're right, I haven't looked at a lot of these APUs offered by AMD. The Gigabyte board didn't save money on the PCI-e config... My bad, I didn't research enough on both of those. I'm responding and playing a game at the same time.  I still like my original build with the card.

I don't disagree there are downsides to it, particularly with PCIe 3.0. However, it is placed well as an entry-level card.


----------



## Selaya (Jan 22, 2022)

b550-5600x-6500xt is asinine, since b560-10100f-6600 is _both cheaper and better_


----------



## Ravenas (Jan 22, 2022)

Sure, but


Selaya said:


> b550-5600x-6500xt is asinine, since b560-10100f-6600 is _both cheaper and better_



Haven't really looked much into 11th gen chips beyond 12900k, but I like this combo too:

Intel Core i5-11400 - Core i5 11th Gen Rocket Lake 6-Core 2.6 GHz LGA 1200 65W Intel UHD Graphics 730 Desktop Processor - BX8070811400 - Newegg.com
MSI B560M PRO-E LGA 1200 Micro ATX Intel Motherboard - Newegg.com

Drops the build price down to $963.79


----------



## TheoneandonlyMrK (Jan 22, 2022)

AusWolf said:


> No. You will have crap performance, because the 6500 XT cries home to mama when it's limited to PCI-e gen 3.
> 
> View attachment 233517View attachment 233518View attachment 233519


Yes but no, because in his example you would have PCIe 4 on B560 with your 12100, not the 10100. But you've made your mind up, that's clear.


----------



## AusWolf (Jan 22, 2022)

Ravenas said:


> Sure, but
> 
> 
> Haven't really looked much into 11th gen chips beyond 12900k, but I like this combo too:
> ...


The 11400 is a good CPU for the price, but again, it's nearly $200. A core i3 and a better graphics card (like an RX 6600) is a lot better bang for the buck.



TheoneandonlyMrK said:


> Yes but no because in his example you would have pciex 4 on b560 with your 12100 not 10100 but you made your mind up that's clear.


B*5*60 is 10/11th gen. You need a Core i5-11400 with it at least to have PCI-e 4.0. The 12100 is fine, but you need B*6*60 with it, which is considerably more expensive than B560 if you're looking at a good quality board.


----------



## TheoneandonlyMrK (Jan 22, 2022)

AusWolf said:


> The 11400 is a good CPU for the price, but again, it's nearly $200. A core i3 and a better graphics card (like an RX 6600) is a lot better bang for the buck.
> 
> 
> B*5*60 is 10/11th gen. You need a Core i5-11400 with it at least to have PCI-e 4.0. The 12100 is fine, but you need B*6*60 with it, which is considerably more expensive than B560 if you're looking at a good quality board.


Slip-up by me, fair enough; it is Saturday morning, I'm not at my sharpest ATM.

I wouldn't go i3 10/11 gen personally but I get your points.

I would always try for better balance and a pc that's not scrap in two  to three years on the whole.

I don't think those i3's will age well.

Although I thought those were dual cores with HT; a quad with HT ain't bad really, now I've looked.

As you know, the market moves with the times.


----------



## AusWolf (Jan 22, 2022)

TheoneandonlyMrK said:


> Slip up by me fair enough ,it is Saturday morning ,I'm not at my sharpest ATM.


Happens. I've been awake for nearly 20 hours after a night shift and a daily Mass Effect run myself. 



TheoneandonlyMrK said:


> I wouldn't go i3 10/11 gen personally but I get your points.
> 
> I would always try for better balance and a pc that's not scrap in two  to three years on the whole.


Me too, though the platform is still upgradeable to an i5 or i7 later (11th gen i9 is a waste of money).



TheoneandonlyMrK said:


> I don't think those i3's will age well.


I agree, though I think the 6500 XT will age even worse due to its many limitations.


----------



## trsttte (Jan 22, 2022)

Ravenas said:


> Most B550 are entry level mobos. X570 are for enthusiast. I don't really care what components you consider better for entry level, it's beside the point. Getting an entire computer setup for ~$1100 which can play most games at 1080P with average of 60 FPS is an entry level system.



That's super elitist and completely wrong. $1100 is not entry level at all. Currently, since there are no cheap GPUs, it's not that far off, but that's not a normal thing, nor will it stay that way (hopefully, at least).


----------



## Ravenas (Jan 22, 2022)

trsttte said:


> That's super elitist and completely wrong. 1100$ is not entry level at all. Currently, since there are no cheap gpus, it's not that far off but that's not a normal thing nor will it stay that way (hopefully at least).



$200 is a cheap GPU in my opinion; nothing elitist or wrong about it. Pricing between Intel and AMD ranges from $960-$1000 for an entry-level gaming PC. You could debate which card is better for entry level, but that's just not the point (although I think the $200 price point fits this GPU well). The package quoted includes a 1080p monitor and the minimal accessories needed. The fact is, it's the current state of entry level.

You are welcome to go console, in which case you're basically paying for a $500 walled-garden APU all in one box, and you will still have the investment in your TV.


----------



## AusWolf (Jan 22, 2022)

Ravenas said:


> $200 is a cheap GPU in my opinion, nothing elitist about it or wrong. Pricing between Intel and AMD ranges from $960-$1000, for an entry level gaming PC. You could debate about which card is better for entry level, it's just not the point (although I think the $200 price point fits well for this GPU). The package quoted includes 1080P monitor and minimal accessories needed. The fact is, it's the current state of entry level.
> 
> You are welcome to go console, in which case you're basically paying for a $500 garden wall APU all in one box, and you will still have the investment in your TV.


What's cheap and what isn't is down to personal circumstances. Some people can afford two 3090s in SLI without problem.

I agree that different classes have moved up a lot in price, and $200 is practically classed as a cheap GPU nowadays. What I'm still trying to keep in mind, though, is that GPUs moving price classes due to supply chain issues doesn't change who can afford what. If Random Joe could only buy a $150 graphics card in 2015, he can probably only afford the same price range now. Only now he's getting a lot less for his money in terms of value. It's sad, but if you think of it this way, the low end is practically moving out of the PC world entirely.

Edit: Another thing is that game technologies don't seem to be evolving as quickly as they used to. A modern mid-range graphics card can easily last you 5-6 years, while just a decade ago, you had to buy a new one every 2-3 years, even if you went for the top model. So instead of spending $200 every 2 years, you're spending $600 every 6 years. With this logic, GPUs (per yearly cost) haven't got more expensive at all. It's just that AMD, nvidia (and the scalpers of course) are making a lot more money than they used to.


----------



## Ravenas (Jan 22, 2022)

AusWolf said:


> Edit: Another thing is that game technologies don't seem to be evolving as quickly as they used to. A modern mid-range graphics card can easily last you 5-6 years, while just a decade ago, you had to buy a new one every 2-3 years, even if you went for the top model. So instead of spending $200 every 2 years, you're spending $600 every 6 years. With this logic, GPUs (per yearly cost) haven't got more expensive at all. It's just that AMD, nvidia (and the scalpers of course) are making a lot more money than they used to.



This is the only generation in at least ~15 years AMD has had entry, mid, high, and enthusiast GPUs. Nvidia has had that for at least the last 3 generations.


----------



## trsttte (Jan 22, 2022)

Ravenas said:


> $200 is a cheap GPU in my opinion, nothing elitist about it or wrong. Pricing between Intel and AMD ranges from $960-$1000, for an entry level gaming PC. You could debate about which card is better for entry level, it's just not the point (although I think the $200 price point fits well for this GPU). The package quoted includes 1080P monitor and minimal accessories needed. The fact is, it's the current state of entry level.
> 
> You are welcome to go console, in which case you're basically paying for a $500 garden wall APU all in one box, and you will still have the investment in your TV.



You don't know how to spec a proper budget system. Again, $1100/$1000/$960 is not budget; you can have a decent entry-level gaming system for much less. I gave the benefit of the doubt before because current prices are crazy on everything, but since you insisted on the ~$1000, here you go.

https://pcpartpicker.com/list/bRpKH2   (the 6500 XT is not yet on PCPartPicker, so just imagine the 5500 XT is one). $630 in the current market, and with further savings still possible (cheaper PSU, cheaper SSD, cheaper case, older board/CPU, etc.)

$200 is only a cheap GPU in this crazy stupid market. I'll repeat what I said before: the 5500 XT 8GB launched for $200. In the middle of 2020, *before* the mining apocalypse, you could buy an RX 580 8GB for €160 new, a 1050 Ti for less than €150, and many other options for decent prices.


----------



## Ravenas (Jan 23, 2022)

trsttte said:


> You don't know how to spec a proper budget system. Again, 1100$/1000$/960$ is not budget, you can have a decent entry level gaming system for much less. I gave the benefit of the doubt before because current prices are crazy on everything, but since you insisted on the ~1000$ here you go.
> 
> https://pcpartpicker.com/list/bRpKH2   (6500xt is not yet on pcpartpicker so just imagine the 5500xt is one). 630$ in the current market and with further savings still possible (cheaper psu, cheaper ssd, cheaper case, older board/cpu, etc)
> 
> 200$ is only a cheap gpu on this crazy stupid market. I'll repeat what I said before, the 5500xt 8gb launched for 200$. In the middle of 2020 *before* the mining apocalipse you could buy a RX 580 8gb for 160€ new. A 1050ti for less than 150€, many other options for decent prices.



Easy, killer. You're throwing out accusations and assumptions left and right. Where is your monitor? Where are your keyboard and mouse? A 12100F? Those really don't exist; where is the stock?

The 5500 XT and 5700 XT were both midrange cards; AMD did not have an entry-level, high-end, or enthusiast card at that time.

It was a combination of mining and covid.


----------



## arni-gx (Jan 23, 2022)

i think asus should have released the radeon rx 6500 xt with 4gb or 8gb vram on a 128bit bus......


----------



## AusWolf (Jan 23, 2022)

Ravenas said:


> This is the only generation in at least ~15 years AMD has had entry, mid, high, and enthusiast GPUs. Nvidia has had that for at least the last 3 generations.


Well, kind of, no. The last entry-level nvidia card released is the GT 1030, which is 2 generations old (3 if you count the 16-series as a separate generation). AMD had the RX 550.


----------



## Cutechri (Jan 24, 2022)

I can't believe this thing was even released. The RX 6000 series looked like such a fantastic generation, with AMD finally trading blows with Nvidia at the high end, and then there's this cash grab.


----------



## Mussels (Jan 25, 2022)

Why does ASUS think this model is worth double the cheapest competing card?


----------



## Assimilator (Jan 25, 2022)

Cutechri said:


> I can't believe this thing was even released. The RX 6000 series looked like such a fantastic generation from AMD finally trading blows with Nvidia at the high end, and then there's this cashgrab.


It's not that it's a cash-grab (all cards released over the past 2 years have been), it's that it's such an obviously offensive cash-grab. At least the other cards that have been released can somewhat justify their existence in terms of performance and/or features; this is literally half of an APU that's been deliberately re-engineered into a desktop card purely to take advantage of the lack of supply in the channel. It's consuming manufacturing capacity that could be used for those other, better cards and, in the same breath, contributing to said lack of supply. It's AMD taking advantage of the manufacturing capacity shortage to shaft their customers, who are already tired of being shafted. It's the epitome of tone-deaf anti-consumer exploitation, and it makes it very obvious that there is no such thing as "good guys" when it comes to companies and making profits.



Mussels said:


> Why does ASUS think this model is worth double the cheapest competing card?


Because ASUS is managed by alcoholic simians who continue to labour under the impression that their brand name somehow makes their products worth double. This is not helped by the legions of morons who continue to buy ASUS products because they believe said management.


----------



## Cutechri (Jan 25, 2022)

Personally, I don't buy ASUS products because of the brand but because they haven't failed me so far; can't say the same about Gigabyte.


----------



## Assimilator (Jan 26, 2022)

Cutechri said:


> Personally I don't buy ASUS products because of the brand but because they haven't failed me so far, can't say the same about Gigabyte


I've had good and bad experiences with Gigabyte, but overall - especially lately - their boards seem to be pretty solid. (PSUs on the other hand, yikes...)


----------



## Cutechri (Jan 26, 2022)

Their boards are exactly what I'll avoid from now on. I bought a B550 Aorus Pro; it died after I changed one DRAM timing. The replacement died randomly while I was idling, with a corrupted BIOS. The third replacement was DOA. At that point I just gave in and bought a ROG Strix B550-F, the best board I've owned so far.


----------

