Older drivers on the 4850? Perhaps they beefed up a couple of items. Who cares about lower resolutions? That's like saying a Pinto beat a Supra in a 2 ft race. It doesn't really matter unless you drive the Pinto. But you're still going home with the ugly chick.
No, wait, it has to be more bandwidth... that's it. But it still needs more.
If you read the resolutions ****AND EYE CANDY**** levels used, you will see it fails at eye candy, which is exactly what the extra shaders are for elsewhere.
• 1024 x 768, no anti-aliasing. This is a standard resolution without demanding display settings.
• 1280 x 1024, 2x anti-aliasing. Common resolution for most smaller flatscreens today (17" - 19"). A bit of eye candy turned on in the drivers.
• 1680 x 1050, 4x anti-aliasing. Most common widescreen resolution on larger displays (19" - 22"). Very good looking driver graphics settings.
• 1920 x 1200, 4x anti-aliasing. Typical widescreen resolution for large displays (22" - 26"). Very good looking driver graphics settings.
• 2560 x 1600, 4x anti-aliasing. Highest possible resolution for commonly available displays (30"). Very good looking driver graphics settings.
I think he means adding a red line to the bottom of each card's individual bar if it's not playable. Still, though -- what counts as "playable"?
I remember playing Halo at 20fps with my Radeon 9200SE back in the day. It was definitely "playable," it just looked like shit!
That being said -- the power draw on this card is absolutely stunning. 40nm chips sure sip the juice.
And yes, put a red line at the bottom of the bars for anything above 1680x1050 on performance per watt and per dollar... and for the games themselves that fall below 30 FPS. That shows the card will not perform well enough at those levels to maintain smooth gameplay.
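The "red line" idea boils down to a simple threshold check on each benchmark result. Here's a minimal sketch, assuming a 30 FPS cutoff (which, as noted above, is debatable -- Halo at 20 FPS was "playable" too). The function name and the benchmark numbers are made up for illustration:

```python
# Hypothetical example: flag benchmark results that fall below a
# "playable" FPS threshold, so a chart script could mark those
# bars with a red line.
PLAYABLE_FPS = 30

def flag_unplayable(results, threshold=PLAYABLE_FPS):
    """Return the (resolution, fps) results below the threshold."""
    return [(res, fps) for res, fps in results if fps < threshold]

# Made-up numbers, just to show the shape of the data:
results = [("1280x1024", 58.2), ("1680x1050", 41.7),
           ("1920x1200", 27.9), ("2560x1600", 16.4)]

print(flag_unplayable(results))
# the 1920x1200 and 2560x1600 entries fall below 30 FPS here
```

A review site's charting code would then draw the red marker on exactly the bars this returns, instead of leaving readers to eyeball which configs dip below smooth gameplay.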