Thank you! I'm of the opinion that if a reply doesn't solve the problem, there's no point in making a reply.
Yeah, it sucks, but it's better than having to do a platform upgrade. Just make sure that you run DDU before installing or updating the Radeon drivers. You should also check that your chipset drivers are up to date. All AM4 chipsets use the same driver package regardless of whether you use Windows 10 or Windows 11. The current version is 5.05.16.529 <- click this link to download it directly from amd.com.
The best you could do would be the RX 6700 if you want more than 8GB:
XFX Radeon RX 6700 Speedster SWFT 10GB: $280
However, I wouldn't recommend it, because the RX 6700 is only about 7% faster than the RX 6600 but costs $100 more. Sure, it has 2 extra GB of VRAM, but that's not worth $100 either. There's also the fact that the R5-5600 can't hit 165FPS in most games (I still don't get how it managed almost 250FPS in SOTTR, a game that is notoriously hard on CPUs), so paying more for a card that can would be a crap-tonne of money for almost no benefit over the RX 6600. On top of that, the RX 6700 draws a crap-tonne more juice, which would call your PSU's capabilities into question as well. The extra $100 would be better spent later on a future upgrade.
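To put some rough numbers on the value argument: here's a quick cost-per-frame sketch using the figures from this thread ($180 for the RX 6600, $280 for the RX 6700, ~7% uplift). The 100FPS baseline is an arbitrary assumption just to make the arithmetic concrete, so treat this as a back-of-the-envelope sketch, not a benchmark:

```python
# Rough value comparison between the two cards. Prices and the ~7%
# uplift come from this thread; the 100 FPS baseline is arbitrary.

def cost_per_fps(price_usd: float, avg_fps: float) -> float:
    """Dollars paid per average frame-per-second."""
    return price_usd / avg_fps

rx6600 = cost_per_fps(180, 100)   # RX 6600 at an assumed 100 FPS
rx6700 = cost_per_fps(280, 107)   # RX 6700 ~7% faster at 107 FPS

print(f"RX 6600: ${rx6600:.2f}/FPS")
print(f"RX 6700: ${rx6700:.2f}/FPS")
print(f"The RX 6700 costs {rx6700 / rx6600 - 1:.0%} more per frame")
```

Whatever baseline you pick, the ratio comes out the same: the RX 6700 costs roughly 45% more per frame than the RX 6600.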
For e-sports titles, well, the RX 6600 gets over 500FPS in CS:GO with the R5-5600:
I did the same thing. Then if/when you upgrade your CPU to an R7-5800X3D, you'll be in good stead for a very long time.
What are you talking about? FHD is 1080p and is mostly immune to the 8GB problem. There are a couple of games that do have issues at 1080p ultra but not if you just turn some settings down or use lower-res textures.
If Gmr_Chick is already mostly happy with a GTX 1660 Super, then she's going to be absolutely thrilled with an RX 6600.
As nVidia fanboys have loved to point out, you can always turn settings down so that your VRAM doesn't max out and overflow. They're 100% right about that but also completely missing the point (something that fanboys of all kinds are all too good at). That very pertinent point is that, for the price that nVidia was demanding for 8GB cards like the RTX 3060 Ti, RTX 3070, RTX 3070 Ti and RTX 4060, the end user shouldn't have to turn settings down with a brand-new card! However, if you were just buying an RX 6600 for only $180, knowing what it can and can't do, it wouldn't be nearly as bitter a pill to swallow, would it?
Gmr_Chick has stated, quite clearly, that she games on a 1080p 165Hz high-refresh monitor. That means she can't raise the resolution into the VRAM danger zones of 1440p and 2160p, so 99.9% of the time, 8GB will be fine for her purposes.
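For anyone wondering why resolution matters so much for VRAM, here's a toy calculation of how render-target size scales with pixel count. Render targets are only one part of VRAM usage (textures usually dominate), so this is just a sketch of the scaling, not a real VRAM estimate:

```python
# Toy illustration of why higher resolutions push VRAM harder: every
# render target a game allocates scales with pixel count. Real games
# use many such targets plus textures, so this is only a lower bound.

def target_bytes(width: int, height: int, bytes_per_pixel: int = 4) -> int:
    """Size of one RGBA8 (4 bytes/pixel) render target."""
    return width * height * bytes_per_pixel

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "2160p": (3840, 2160)}.items():
    mib = target_bytes(w, h) / 2**20
    print(f"{name}: {mib:.1f} MiB per RGBA8 target")
```

2160p has exactly 4x the pixels of 1080p, so every resolution-dependent buffer (and typically the texture detail chosen to match) grows accordingly. That's why 8GB runs out at 1440p/2160p long before it does at 1080p.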
You mentioned "AMD's Marketing", but the public's perception isn't because of AMD marketing. Sasa Marinkovic isn't even close to being that smart. The real reason is that Steve Walton of Hardware Unboxed/Techspot (one of the most respected benchmarkers in the world, BTW), who has absolutely nothing to do with AMD's marketing department, first discovered the problem in Hogwarts Legacy and posted a video about it that went viral. What really annoyed nVidia fanboys about his video was the fact that RT was also crippled by having only 8GB of VRAM, which only made their bad choices even worse. Now the "but-but-but Ray Tracing!" reason for buying a GeForce card has collapsed like a house of cards in the wind. You like to talk about marketing, but it was nVidia's marketing about RT that was the most successful at hoodwinking people.
Here's the video from "AMD Marketing" as you like to call it:
Since then, other games have had issues like
The Last of Us: Part 1:
So Steve decided to try a comparison between an 8GB GeForce RTX 3070 and a 16GB Radeon RX 6800.
The RTX 3070 got mopped, and badly:
Steve was very careful to say that 8GB was no longer enough for high-end gaming, which doesn't include 1080p.
Jedi Survivor and Resident Evil Remastered have also demonstrated issues with 8GB of VRAM, mostly at 1440p or higher but sometimes at 1080p (although only at high or ultra settings). Anyone who has been around for more than 10 years knew that this would happen, because the same thing had already happened with 1GB, 2GB, 3GB, 4GB and 6GB; it was only a matter of time. However,
AMD's marketing does suck, there's no question about that. I'm honestly shocked that Lisa Su hasn't canned that moron named Sasa Marinkovic for the crap that he has pulled. However, they had nothing to do with the idea that 8GB is a problem. The truth is that 8GB isn't a problem for someone who games at 1080p. The problem is that nVidia was putting 8GB in mid-range 1440p cards and only 10GB in their high-end RTX 3080. I remember being shocked that the RTX 3080 only had 10GB, and I was glad that the RX 6800 XT had 16GB because I knew that it would be cheaper than the RTX 3080 despite having the same performance. Sure enough, it was, and during the mining crisis it ended up being about $1,000 USD cheaper. I way overpaid for mine because I wanted a reference model, but I still got it for about $700 less than a card in a store would have cost me (a non-reference card at that!).
This whole problem was engineered by nVidia, not by AMD.