Actually there is.
A 1280x1024 or a 1366x768 screen can be powered by a mid-to-high-end graphics card for up to 200 EUR, and it will last for literally years; you can play games at maximum settings with no problem.
With a 1920x1080 screen, you need a 300+ EUR card, and you might already have problems in newer games, which will run at lower fps. Yes, even 1080p will eventually reach the same point as the two resolutions above, but it will take some more time.
I'm telling you this from personal experience. I have a 1280x1024 screen, and most will argue it's too low-res and too old. But I like it. The size doesn't bother me; it just works, and I can play EVERYTHING with my HD 6950 at maximum settings. Most people were scared of Far Cry, the Crysis series, and, most recently, Battlefield 3. I wasn't. I knew it would run easily at Ultra settings, and it did. With this screen I can simply watch everyone rushing for the HD 7970 and GTX 680 and, well, laugh. And I'll see whether there will even be any need for an HD 8970 or GTX 780...
So, the first rule of cheap gaming: get a moderate-resolution screen and you'll get through some high-quality gaming much, much cheaper. At the moment those resolutions are 1366x768 and 1280x1024, wide and boxed format, whichever you prefer. The biggest problem is that everyone rushes for cheap 1080p screens and then complains about how slow their games are...
I would argue that playing at 1280x1024 with an HD 6950 simply isn't making full use of your card right now. As a comparison, I used an HD 6950 to upgrade my brother's aging Q6600-based system so he could run Eyefinity in SWTOR with 3x 23" panels. Despite his system being a bit dated, he's still able to play on high settings and manage 50-60 fps in demanding areas. So that single GPU is pushing 3x 1080p panels, almost 5x as many pixels as your single 1280x1024 display. That just shows how much of your GPU is sitting idle right now.
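The "almost 5x" figure checks out; here's the quick pixel math, assuming the 23" panels are standard 1920x1080:

```python
# Pixel counts for the two setups discussed above.
triple_1080p = 3 * 1920 * 1080   # 3x 23" Eyefinity panels
single_sxga = 1280 * 1024        # single 1280x1024 display

print(triple_1080p)                           # 6220800
print(single_sxga)                            # 1310720
print(round(triple_1080p / single_sxga, 2))   # 4.75 -> "almost 5x"
```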
You do have a valid and legitimate point that the card will last you several years and remain able to play some of your games at high settings. But remember that the Ultra settings in some games rely on GPU features introduced in the latest version of DirectX, Direct3D, or OpenGL, which your card will not support, so no, you will not always be able to run Ultra settings in the latest games every year despite running a lower resolution. Some games today (I believe Shogun 2 is among them) require DX11 mode for the Ultra detail settings and tessellation, whereas the High detail modes are limited to DX10. So someone doing what you have done, still running a card from a few years ago, may have a very powerful DX10 card that runs Shogun 2 smoothly at 1280x1024, but they won't be getting the full detail experience because they don't have DX11 support.
Which is why, perhaps, the better option for lower-resolution gaming would be to buy a cheaper mid-range card every year or so. That way you get the latest tech every time: you take advantage of the newest process node (lower power, lower heat, lower noise) and you get the latest features (DirectX, OpenGL, the latest media acceleration, etc.).
But honestly, resolution IS king. More pixels means more clarity: you see more detail, you're not running as much texture compression, and you don't have to rely on a lot of post-processing or AA/AF to improve image quality. At 2560x1440 (or 4360x1440, depending on whether I want to run my triple panels in business mode or fun mode) I really don't even need AA enabled, because you can hardly tell the difference.
TheLostSwede is correct: at 1080p you DO NOT need a $400 GPU; pretty much anything in the $100 range will do. A GTX 460 will plow through any game at 1080p on high settings.
At this point, the lowest-resolution device I own is my 1080p TV. My iPad runs 2048x1536, and my PC runs 4360x1440 (17" portrait, 27" landscape, 17" portrait).
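For what it's worth, the 4360 desktop width works out if the side 17" panels are 1440x900 run in portrait; that native resolution is my assumption, since the post doesn't state it:

```python
# Combined desktop width: 17" portrait + 27" landscape + 17" portrait.
# Assumes the 17" panels are 1440x900 (900 px wide when rotated).
side = 900      # 1440x900 panel rotated to portrait
center = 2560   # 27" 2560x1440 panel in landscape
print(side + center + side)  # 4360
```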