Non-gamers might need more than IGPs: if you want smooth HD playback you need at least a 4350, and a 4550 will be better.
Go watch some real 1080p movies and you're gonna love the stutter on the IGP.
You don't need a graphics card to play H264 1080p movies! In what year do you live? 2007?
I own an Acer Ferrari One. It has an Athlon Neo L310 (old 65nm dual core A64 @
1.2GHz) and a HD3200 IGP, underclocked to 380MHz, which uses the old UVD 1 engine.
It plays every 1080p movie I throw at it, with no stutter whatsoever.
My HTPC, with a desktop HD3200 IGP, played Blu-Rays with no stuttering at all.
Even the single core Atom is capable of flawless 1080p playback with the ION IGP.
Nowadays, a discrete card is useless for most non-gamers. They (HD3200/4200 and Geforce 8300/9300) have everything needed for a full computing experience: more than sufficient 3D acceleration for OS gimmicks, complete video acceleration, dual-monitor output, HDMI sound passthrough, etc.
That's why dirt-cheap low-end graphics cards are gradually ceasing to exist. There's no market for them in new PCs.
This card is more of a general-purpose card that can play games, but "playing" doesn't mean it's gonna be smooth.
Sure, you can play a game at 30 FPS, but it just won't be a decent experience.
Anything below a 5750 doesn't really cut it.
Do you have the slightest idea of the average PC gamer rig?
First of all, most PCs nowadays are notebooks, not desktops. The best GPU you find in mid-range laptops right now is the HD4650 with DDR3. Before that was out and popular (half a year ago), it was the 32sp 9600M GT.
You buy a high-end laptop and you get a Geforce 260M GTX or a 280M GTX. Those are G92b chips corresponding to an underclocked 9800GT and 9800GTX.
This card should perform around the level of a 9800GT. This means
the HD5670 will be a lot faster than most PC gaming rigs when it comes out.
And even on desktops: pay attention to the next hardware survey from Steam and you'll see how far behind most people are on
DX10. Or pay a visit to the GameSpot forums. You'll find many
REAL gamers who buy new games every week and play them all on outdated hardware. They only change their hardware when the current one refuses to run recent games.
Developers know this too. They won't marginalize 95% of their customers in a few years.
Call of Duty 4 is a joke. Why buy a new DX11 card to play DX9 games that are more than 2 years old by now?
Go get Dirt2 and see how this card holds up with games "3 years later".
The pricing will be the key, and we have yet to see how this card performs in real games.
BTW I can play some shitty old games on my IGP, so do I now have a "gamer IGP"?
First off, you seem to have some issues with older games. Just because a game is over a year old, you call it "shitty old" and a "joke"? Now that's a joke in itself.
COD4 is a "joke" that still sells hundreds of copies worldwide and is used for hundreds of gaming tournaments. I'm replaying KOTOR, with graphics settings maxed out, on my subnotebook. It's not a shitty game. It's one of the best games I've ever played.
How will this card perform in 3 years? Easy: as well as an HD2600 XT performs now, which is damn fine if you ask me. Sure, you can only crank the settings up to medium or low, but it'll play every game and still provide a satisfactory gaming experience.