| Processor | 12th Gen i7-12700KF Stock |
|---|---|
| Motherboard | MSI B660M Pro M-A WIFI |
| Memory | 4x16GB DDR4 3200 CL16 @ 3466 CL16 |
| Video Card(s) | NVIDIA RTX 4070 SUPER |
| Storage | 4x HDDs, 2x SATA SSDs, 2x NVMe SSDs |
| Display(s) | 1920x1080 180Hz AOC, 1920x1080 60Hz BenQ |
That's why the best time to buy a GPU is two years after a console launch, then keep said GPU for 6+ years.
This has been true for a while. Two years after the original Xbox came out, we got the Radeon 9800 Pro/XT, which worked well for years afterwards. Two years after the PS3 we got the GeForce 9800 series/Radeon HD 3000 series, again working for most of those consoles' lifecycles. Two years after the PS4 we got the GTX 900 series. And two years after the PS5 we are getting Ada/RDNA 3.
Agreed with your 1st statement.
But with regard to the GeForce 9800/ATI Radeon HD 3000 series, those DX10 cards had really poor longevity. DX10 (launched with Vista in 2006) was a transitional phase that most game devs ignored: they kept making DX9 games, with some shipping optional DX10 executables, and no one made DX10-exclusive titles that didn't also have a DX9 fallback renderer. The industry then moved rapidly to DX11 only three years later, in 2009, when Windows 7 released, and once that happened, those DX10-only cards became obsolete quickly.
DX11 would become the longest-running API and is STILL used in games to this day, over 14 years after its introduction, so those first-gen DX11 cards, the Radeon HD 5800 and Nvidia GTX 400 series, arguably had the longest lifespan of them all in terms of feature-set compatibility.
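To make the fallback-renderer/feature-set point concrete: part of why DX11 lasted so long is that it introduced feature levels, so one D3D11 code path can target 11-, 10-, and 9-class GPUs. Below is a minimal, hypothetical C++ sketch of the standard D3D11CreateDevice feature-level negotiation; it isn't from any particular game, just an illustration of how an engine asks the runtime for the best level the installed card supports.

```cpp
// Sketch only: create a D3D11 device, letting the runtime pick the highest
// feature level the GPU supports, falling back toward DX10/DX9-class levels.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    // Preference order: full 11_0 first, then the 10.x / 9.x levels that let
    // the same D3D11 code path run on older cards.
    const D3D_FEATURE_LEVEL wanted[] = {
        D3D_FEATURE_LEVEL_11_0,
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3,
    };

    ID3D11Device*        device  = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL    got     = {};

    HRESULT hr = D3D11CreateDevice(
        nullptr,                    // default adapter
        D3D_DRIVER_TYPE_HARDWARE,   // real GPU, not WARP/reference
        nullptr, 0,
        wanted,
        static_cast<UINT>(sizeof(wanted) / sizeof(wanted[0])),
        D3D11_SDK_VERSION,
        &device, &got, &context);

    if (SUCCEEDED(hr))
        std::printf("D3D11 device created at feature level 0x%04x\n", (unsigned)got);
    else
        std::printf("No D3D11-capable device found (hr=0x%08lx)\n", (unsigned long)hr);

    if (context) context->Release();
    if (device)  device->Release();
    return 0;
}
```

A DX10-only card like the GeForce 9800 tops out at feature level 10_0 here, while a first-gen DX11 card negotiates 11_0 and keeps riding the same API as newer GPUs.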