
Do you want AMD to make better low-end graphics cards?

Well, the best prices we've seen so far were back during the 2008-2009 recession - the high-end Radeon HD 4890 could be had for only $195, which is still the best price ever for a high-end GPU.
Yep, and then a few years later the 7950 was $249. People really do forget that before the 1080 Ti there was no such thing as a $1,000 GPU for the masses. Even the Vega 64 was supposed to launch at a certain price, but mining killed that. ymhdis is right: Polaris didn't just sell well, it influenced mind share positively for AMD because it was inexpensive but good. It was a double-edged sword, though, because by the time AMD responded with the 5000 series, those GPUs drew half the power of Vega but were really no faster, so the mind-share influencers threw shade on them. Then Nvidia launched the 2080 Ti and every single media outlet made it into a unicorn, which justified the price to many. By the time the first 6000 series card was released it was faster and cheaper across the stack than the 2080 Ti, but ray tracing and DLSS came into focus, and the word "rasterization" replaced "gaming" even though there weren't even 10 games with those features at the time. I do feel that AMD's biggest mistake was following Nvidia in killing multi-GPU, though, as doing it at the driver level was genius.
 
I think it really might be as simple as plain ol' market correction. One of the reasons these meager base cards were introduced was to get something into the hands of people who needed a GPU just to game at all. Keep in mind that most of the pre-7000 Ryzen lineup lacked the ability to run a display, so AMD was certainly feeling the desperation to get out a GPU that miners and scalpers weren't going to gobble up.

Now that GPU mining has dried up, we're left with a large volume of used and unsold new inventory. It's telling that the next-gen GPUs launched with only a handful of high-end cards. It's going to be a while before things readjust, I'm afraid. Right now you can get a used Polaris card for under $100 and a 5600 XT for under $150. I kinda wonder if we're looking at a rebadge job coming up, or if we're really just going to see limited lineup launches and generational overlap as the new normal.
 
So it's too much because it needs a 6-pin and pulls 100 watts? Even a 450 W PSU is fine with any CPU and a 6500 XT. The mistake AMD made was calling it the 6500 XT. It should just have been a 6500 with 4 GB, and the 6500 XT should have had 8 GB - but then it would have been able to mine crypto, so it's either/or.
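For rough context, here's a back-of-the-envelope headroom check (a sketch with illustrative wattages I'm assuming, not measured figures; actual CPU and board power vary by model and load):

```python
# Rough PSU headroom check for a 6500 XT build.
# All wattages are illustrative assumptions, not measurements.
gpu_w = 107   # RX 6500 XT board power, roughly
cpu_w = 105   # a 105 W TDP desktop CPU at full load
rest_w = 50   # motherboard, RAM, drives, fans (generous allowance)

total_w = gpu_w + cpu_w + rest_w
psu_w = 450
print(f"Estimated load: {total_w} W of {psu_w} W "
      f"({total_w / psu_w:.0%} of PSU capacity)")
# -> Estimated load: 262 W of 450 W (58% of PSU capacity)
```

Even with pessimistic numbers the build sits well under the 450 W mark, which is the point about the 6-pin not being a real problem.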

The best budget card is the 6600M, and you can even get some without the 6-pin, so slot-only PCIe power is an option if you want it - you'll lose some performance that way, but it is still the best price/performance card right now, period.

You have to acknowledge, though, that the last 2 1/2 years were unlike any other period in the history of the PC.


If you want a Bykski block you can pay $250 from a US company or pay $100 for the same block from AliExpress. I know what you mean, but is it any different from Walmart, Amazon Marketplace, or Newegg third-party sellers? I once ordered a 1920X CPU from Walmart; it took 4 months to arrive and what showed up was plastic pearls. These sellers would all be from China, since China is where most laptops are manufactured and so could pull that off, and China has few rules when it comes to corporate law. I doubt AMD wants the 6600M to become a budget card sold worldwide, as it would indeed make the 6400/6500 XT useless, as some are saying.
No, no, it isn't the slow shipping or the oddball company names (by themselves), it's the paranoid assumptions people make about electronics built in China. Silly? Yes, but that's largely why the 6600M won't catch on here. It won't slow me down one bit; I have no problem saving 50-70% on gear from AliExpress (sorry ppcs). I hear it's a worldwide market, after all, lol.
 
One must always remember that even a high-end card costs the company around one tenth of the price at which it is sold (or 1/5 if you count R&D), so there is massive thievery going on as it is; to add insult to injury, they offer POS tiny cards with 64-bit memory buses that are useless. It all started with the GeForce FX 5200...
Of course I'm all for what the title says.
 
It gave you the GTX 1630. Enjoy.

You can enjoy not just the GTX 1630, but also the RTX 4090 and RTX 4080 16GB. Unfortunately the RTX 4080 12GB was canceled. 3 out of 4.

Nvidia started moving prices up after AMD introduced FM2. They probably started seeing where things were going when AMD announced FM1. So, 10 years ago they were giving us the first Titan - a professional, gaming, anyway semi-professional card - that was Nvidia's first real try at educating consumers to accept $999 as a legitimate price point for a gaming card. Sorry, semi-pro card. Nvidia of course didn't move all of its high-end models to the $1,000 price point, but it did start differentiating itself from the sub-$100 market. AMD's inability to compete most of the time made Nvidia's life easier, and the latest mining craze, combined with huge demand and scalping thanks also to the pandemic, gave Nvidia the chance to totally abandon the sub-$200 market and strengthen its image as a premium brand. People asking for Nvidia graphics cards under $250 today only have the option of an older architecture lacking many features, in the form of the 1600 series, and people not wanting to pay more than $100 can enjoy the GT 1030 (or a GT 730, a GT 710, and why not, a marvelous G210). With the latest GTX 1630, Nvidia made it clear: if you want a 3D card from them, get ready to pay over $200, maybe over $300.
Nvidia dropped the ball because it was a business decision. It wasn't some mistake they made.

It makes sense when the problem is not really the low-end market - the low-end market is just the excuse.
You completely missed the point. Ngreedia dropped the ball in that, as a business whose sole reason for existence is making money for its shareholders, they hung the most lucrative segment of said business out to dry by choosing to repeat the same mistake again and again, year after year: making no attempt to reinvigorate a market segment that was stagnating and neglected, and missing opportunity after opportunity to cash in. They chose margins over customers by trying to force everyone to spend more per card instead of building the cards consumers wanted. This thread speaks to that clearly enough, I'd say.
 
Comparing across generations is hardly fair;
I do see your point, nonetheless.

We shall see what the low end of Radeon 7000 (and nVidia 4000) offers, but as-is I am fairly impressed:
My 4 GB 6500 XT reliably benches faster than my 8 GB RX 580.
That's a current-generation "one step above entry level" card meeting or beating a 2-3 generation old "high-end" card. That's acceptable, IMHO.
 
You completely missed the point. Ngreedia dropped the ball in that, as a business whose sole reason for existence is making money for its shareholders, they hung the most lucrative segment of said business out to dry by choosing to repeat the same mistake again and again, year after year: making no attempt to reinvigorate a market segment that was stagnating and neglected, and missing opportunity after opportunity to cash in. They chose margins over customers by trying to force everyone to spend more per card instead of building the cards consumers wanted. This thread speaks to that clearly enough, I'd say.
No, I haven't. Nvidia changed strategy to avoid becoming a future Creative. Creative was the number-one brand that came to mind in the late 90s when someone was out buying a sound card, and they are now a company most people totally ignore - don't even know exists - because onboard audio became more than good enough. Unlike with audio, Nvidia can't be the cheap onboard graphics option, because Intel and AMD already offer integrated graphics, and Nvidia doesn't have an x86 license to build its own platform. So the only way to avoid fading into nothingness once integrated video becomes (more than) good enough for the majority of consumers was to innovate, differentiate through proprietary features, and become a premium brand. They did that, but it obviously wasn't enough. So they used their strengths - marketing, brand recognition, the loyalty of their fans and customers - to move prices up. And they continue moving prices up. AMD will happily follow, to improve its profit margins, because it needs money to fight on two fronts, and Intel will also follow, because let's not forget that Intel is also a premium brand.
The low-end market was the fat cow 15 years ago. Then AMD came up with its AM1, FM1, and FM2 platforms, and Intel integrated graphics into almost all of its CPUs - with laughable 3D performance but very good media capabilities - instead of into some of its chipsets as before (usually on microATX motherboards), so the necessity of a discrete graphics card just went away. CPUs became multicore and could play back almost any media or online video, even with software instead of hardware decoding, so there wasn't really enough of a market left for little graphics cards. So AMD and Nvidia stopped producing new low-end products, and anything out there that is not a GT 1030 or an RX 550 is just ancient garbage - but still good enough to send a picture to the monitor. Damn, I am even using an HD 5670 in what I call my "media PC". I have the worst card in that system and it's more than enough for that system and what it does.
That low-end market will never become Nvidia's again. Just look at laptops: the Ryzen 6000 series made Nvidia's MX line look pathetic. An integrated solution made the cheap Nvidia series DOA. Nvidia still sells MX cards, but in most cases the question is "why?" - they are unnecessary in most of those systems.
Nvidia is pushing prices higher to make up for the loss of that low-end market. And fortunately for them, while I hate Huang's business practices, he does know how to make each future GPU even more important and more valuable than before. Without him, Nvidia would have been bought for peanuts, probably by Intel or Samsung, years ago.
 
Yes. But I also want them to revert to the old control panel or at least redesign the current one.

And please, please stop adding X letters to make the cards look better, it's dumb. Next xRXx x8900x xXx_XT_xXx.
 
Yes. But I also want them to revert to the old control panel or at least redesign the current one.

And please, please stop adding X letters to make the cards look better, it's dumb. Next xRXx x8900x xXx_XT_xXx.
What don't you like about it?
 
Yes. But I also want them to revert to the old control panel or at least redesign the current one.

And please, please stop adding X letters to make the cards look better, it's dumb. Next xRXx x8900x xXx_XT_xXx.

I see the XTX as a brand revival; unless memory fails me, the moniker hadn't been used in a retail name since the X1K series, on the X1950 XTX.

They've used old names to draw excitement from long-time fans before, such as the R9 Fury, which borrowed its name from the ATI Rage Fury, the hottest gaming graphics board of 1999. 250 nm of raw power, that.

I just haven't seen them reuse the Maxx branding yet, but I'm sure that will happen around the time they develop the first dual-GCD model, which probably won't be until at least RDNA 4.
 
What don't you like about it?

Because there is a problem with the tongue and pronunciation.
You have to say ei-em-di-reidion-seven-thousand-nine-hundred-eks-ti-eks...

Wouldn't it be better if AMD invented a new model numbering scheme? AMD R4K U1 (simply, and shorter: ei-em-di-ar-four-kei-iu-uan) - the new name for the RX 7900 XTX.

Sony does this with its smartphones: Sony Xperia 10 I, II, III, etc.; Xperia 5 I, II, III, etc.; Xperia 1 I, II, III, IV, etc.
 
Because there is a problem with the tongue and pronunciation.
You have to say ei-em-di-reidion-seven-thousand-nine-hundred-eks-ti-eks...

Wouldn't it be better if AMD invented a new model numbering scheme? AMD R4K U1 (simply, and shorter: ei-em-di-ar-four-kei-iu-uan) - the new name for the RX 7900 XTX.

Sony does this with its smartphones: Sony Xperia 10 I, II, III, etc.; Xperia 5 I, II, III, etc.; Xperia 1 I, II, III, IV, etc.
I should have clarified but what I meant was the Control Panel.
 
I should have clarified but what I meant was the Control Panel.

Meh, that's even worse. The settings tab is simply a continuation of Windows' own basic window, which is not a good UI for graphics settings.
 
I'm going to be controversial, I think, but regarding AMD, the mid-to-low-range cards got lost once their uArch was kept well behind the high end, IMO (around the HD 7xxx era).
And that was a predictable path for APUs, but not necessarily for discrete cards.
Sometimes I figure the unusual approach of taking mobile parts and just rebranding them as discrete cards would work better in a few cases.

VLIW5 was used into oblivion, and by the time it was shelved, GCN2 was being used solely for APUs while GCN4 was already a thing.
From then onward there was no GCN4 product that could, to my understanding, yield an interesting cut-down chip that respects a 75 W TDP with acceptable performance.
Maybe Polaris 30 could have yielded a configuration that was not the 560 (P21) - which was disappointing - but rather something closer to a 570, though that would only have happened in late 2017 or early 2018.
Right now, I strongly suspect that a 7 nm GCN5 die, in the form of a cut-down Vega 20, could in theory make a good 75 W card, but as said earlier, it must not be cost-efficient to manufacture.
 
It would be better if they made something on 12 nm with the maximum theoretical (or close to it?) transistor density, but RDNA 3 with the full media package and connectivity options.
The low end has to have more features, while the ultra-high end gets the raw, brutal performance. That's how it was in the past.
 
If it's TSMC, it could even be 10 nm, but looking at RDNA 2 as it is configured, the maximum theoretical transistor density would still end up less performant than the RX 6400, which is 6 nm and already a 12 CU config eating up 50 W (or thereabouts). GCN5, albeit older, could be squeezed further to reach what you're writing about, IMO.
Now, a card like this one I doubt we will ever see again (updated to modern standards, or with video capture ICs that offload that work from the GPU):
[attached image]
 
It would be better if they made something on 12 nm with the maximum theoretical (or close to it?) transistor density, but RDNA 3 with the full media package and connectivity options.
The low end has to have more features, while the ultra-high end gets the raw, brutal performance. That's how it was in the past.
I was thinking basically the same thing: RDNA 3 or Ada on a cheaper process node. Limit it to whatever you can fit into 75 watts. Oh, and don't neuter it with a x4 PCIe bus - make sure older machines can fully utilize it too. Just a decent entry-level card that doesn't cost more than $200.
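To put rough numbers on the x4 complaint, here's a quick sketch using the nominal per-lane PCIe rates (the standard figures after encoding overhead; the "older machines" case assumes a PCIe 3.0 board):

```python
# Nominal PCIe throughput per lane, in GB/s, after encoding overhead.
per_lane = {"PCIe 3.0": 0.985, "PCIe 4.0": 1.969}

for gen, rate in per_lane.items():
    for lanes in (4, 16):
        print(f"{gen} x{lanes}: {rate * lanes:.1f} GB/s")

# A x4 card gets ~7.9 GB/s in a PCIe 4.0 slot but only ~3.9 GB/s in a
# PCIe 3.0 slot - half the link, which is exactly where older machines suffer.
```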
 
Meh, that's even worse. The settings tab is simply a continuation of Windows' own basic window, which is not a good UI for graphics settings.
If you're saying that, then you haven't installed the Adrenalin software. I don't see any correlation between the Windows video settings and Adrenalin. You might be talking about installing the driver without the software, though I don't know why anyone would do that. The GPU settings in Adrenalin are more refined than Afterburner's.
 
If you're saying that, then you haven't installed the Adrenalin software. I don't see any correlation between the Windows video settings and Adrenalin. You might be talking about installing the driver without the software, though I don't know why anyone would do that. The GPU settings in Adrenalin are more refined than Afterburner's.
It's way better than this GeForce Experience garbage I have to deal with now. I really miss Adrenalin.
 
It's way better than this GeForce Experience garbage I have to deal with now. I really miss Adrenalin.
Yeah, I have a 3060 (laptop) and was very underwhelmed with the GUI. It really looks no different than when I had a GTS 450. At least with Adrenalin you can see the updates, like being able to OC the CPU.
 
What don't you like about it?
I should have clarified but what I meant was the Control Panel.
It's a mess, with too much bloat you can't skip installing - well, you can, but then you can't make use of the good features, like tuning the clocks and voltages.

And the "PRO" control panel is basically the same but in blue. AMD has to realise not only gamers buy their graphics cards.
 
No, I haven't. Nvidia changed strategy to avoid becoming a future Creative. Creative was the number-one brand that came to mind in the late 90s when someone was out buying a sound card, and they are now a company most people totally ignore - don't even know exists - because onboard audio became more than good enough. Unlike with audio, Nvidia can't be the cheap onboard graphics option, because Intel and AMD already offer integrated graphics, and Nvidia doesn't have an x86 license to build its own platform. So the only way to avoid fading into nothingness once integrated video becomes (more than) good enough for the majority of consumers was to innovate, differentiate through proprietary features, and become a premium brand. They did that, but it obviously wasn't enough. So they used their strengths - marketing, brand recognition, the loyalty of their fans and customers - to move prices up. And they continue moving prices up. AMD will happily follow, to improve its profit margins, because it needs money to fight on two fronts, and Intel will also follow, because let's not forget that Intel is also a premium brand.
The low-end market was the fat cow 15 years ago. Then AMD came up with its AM1, FM1, and FM2 platforms, and Intel integrated graphics into almost all of its CPUs - with laughable 3D performance but very good media capabilities - instead of into some of its chipsets as before (usually on microATX motherboards), so the necessity of a discrete graphics card just went away. CPUs became multicore and could play back almost any media or online video, even with software instead of hardware decoding, so there wasn't really enough of a market left for little graphics cards. So AMD and Nvidia stopped producing new low-end products, and anything out there that is not a GT 1030 or an RX 550 is just ancient garbage - but still good enough to send a picture to the monitor. Damn, I am even using an HD 5670 in what I call my "media PC". I have the worst card in that system and it's more than enough for that system and what it does.
That low-end market will never become Nvidia's again. Just look at laptops: the Ryzen 6000 series made Nvidia's MX line look pathetic. An integrated solution made the cheap Nvidia series DOA. Nvidia still sells MX cards, but in most cases the question is "why?" - they are unnecessary in most of those systems.
Nvidia is pushing prices higher to make up for the loss of that low-end market. And fortunately for them, while I hate Huang's business practices, he does know how to make each future GPU even more important and more valuable than before. Without him, Nvidia would have been bought for peanuts, probably by Intel or Samsung, years ago.
Comparing Creative and Nvidia - sigh. An apologist, joy.
At no point in NV's history were they ever remotely in danger of becoming obsolete, with or without the switch to greedy, margin-based sales practices. They have always been healthy as a hog in every business sense. Saying they were ever in danger of becoming the next Creative is hilarious. Creative died because they couldn't compete with high-quality integrated audio, and you're saying that's the road Ngreedia was going down... sure. Terrible analogy.
Integrated graphics always have been, and generally speaking still are, trash. I wonder why AMD and Intel are scrambling to fill the gap with decent onboard options after what, 20 years of Intel's junk? There's a massive river of cash flow just waiting to be tapped. Anyone who owns, or is planning to buy, a halfway decent PC wants it paired with quality graphics. They want every aspect of their PC experience to be satisfying, and integrated rarely fills that role.
Comparing anything to an MX is just sad, btw. Not sure why you felt that was relevant? Advancements are naturally going to outperform old tech.
Integrated didn't cause anything. It's simply what's being used to try to fill the hole that's been left at the very bottom of the market. There's another level of performance that's being ignored - that's where the aforementioned 6600M type of value and performance fits in perfectly. But both AMD and NV are too busy hyping to take advantage of it. It's a big-ass niche called value: the forgotten market that NV and AMD are trying to squeeze out of existence.
 
I voted yes, but really the APUs are good enough for light gaming. Might as well get a midrange card if you really need more than that. I'm interested to see what the next gen APUs with DDR5 can do with the extra bandwidth.
 
I was thinking basically the same thing: RDNA 3 or Ada on a cheaper process node. Limit it to whatever you can fit into 75 watts. Oh, and don't neuter it with a x4 PCIe bus - make sure older machines can fully utilize it too. Just a decent entry-level card that doesn't cost more than $200.
Like I mentioned before, the RX 6600 is that card if you have something better than a 250 W PSU.
 
Yes, if they're priced right. Not everyone needs ray-traced 8K realistic nudes at 240 fps. Most of my friends play at 1080p and are staying there for a long while still, and don't need anything special.
I am considering getting a 6600 XT. Next-gen cards look expensive. For 1080p this should be fine. 2.6x better performance for under $275. I could do worse.
 