Tuesday, June 22nd 2021
Ten Years in, AMD to End Support for Radeon HD 7000, R200, R300 and Fury GCN Graphics Cards
AMD is ending support for the Radeon HD 7000 series, R200 series, R300 series, and R9 Fury series graphics cards, which are based on the oldest versions of the Graphics Core Next (GCN) architecture. The HD 7000 series debuted in 2011 and the R9 200 series in 2013, with the R9 300 series being essentially a rebadge; the R9 Fury series joined the ranks in 2015. This makes Radeon Software 21.5.2 the final driver release for these graphics cards, giving AMD the opportunity to make a clean break in its drivers from the RX 400 series "Polaris" onward. The conclusion of driver support means that upcoming driver releases, including the 21.6.1 drivers released today, lack support for GPUs older than the RX 400 series. Should AMD encounter glaring security flaws in its drivers, it can still release special driver updates in the future.
Sources:
AMD, VideoCardz
99 Comments on Ten Years in, AMD to End Support for Radeon HD 7000, R200, R300 and Fury GCN Graphics Cards
As for the drivers, it's been a few years since I installed them - the card is 9 years old, so there's no benefit to having the freshest drivers anymore, but I'd run the risk of losing OC capabilities (freq. cap, etc.).
I myself had an HD6950 for the longest time, but had to replace it due to it also no longer being supported and giving graphical errors in The Division.
Another thing: a bunch of Athlon and A-series APU IGPs are also dropped.
I like the cooler design of those cards, too.
I've also got the legendary GeForce 8800 GTX and 8800 Ultra. Epic cards.
At least that's how I take it.
What's the point? You can use an older driver if you want to use the card.
You people are unbelievable. You want new technologies, features, and new fast cards that can run those at a decent FPS (144 Hz monitors, 144 FPS and up), and yet you want the driver to support 6-year-old cards as if they could run all those features and new games (a freakin' shitload of games have new engines with so many features and improvements).
I'm not worried that my card will be EoL in 3 years' time. I'm really not worried about it. But if a company wants to push boundaries with new tech and new features in new games, which are getting more demanding, you can't cling to an old card's architecture and expect miracles from it in handling these new games and technologies.
You have RT now. Do you think that in two years' time a 2060 (even with DLSS 2.0 on) will run the newest game at 1080p with RT on, with its 6 GB of VRAM? If it does, how many FPS do you think you will get?
So what's the point of supporting a 2060 four years from now, if that card won't be able to run any modern game at a decent 60 FPS at least? It costs the company a lot of time and money to support something that won't be able to run the newest games anyway, so why bother?
You don't have to agree, but that's the way progress and pushing the boundaries of graphics technologies in games is going to go, whether you like it or not. You can't rely on old architectures and continued support to make them run faster, because that's not what progress and/or advancement is.
Beyond that it's untenable: no support for new tech, no enhanced features, etc. Past a point you're just wasting power unnecessarily by using outdated old tech.
And if you think any company is going to miss people who buy once a decade when they go to the competition, you are confused.
Calm the butt hurt, this is the way, and it has been the way a long time.
It just means that in the future some game might communicate poorly with the card, and while a driver could fix it, that won't happen since support for it has been dropped.
Again, as in my first post: hence why I had to replace my HD6950.