It looks to be worse than a 5770, or am I wrong?
You're wrong. Don't look so narrow-mindedly at raw specs.
I see 80 GB/s of bandwidth on my 5770... yet I do wonder about the downgrade. I DOUBT the higher clock and improved architecture will permit much of an increase over the old cards, considering the shader loss. [...] This card sounds like a fail.

It's not all about raw memory bandwidth. You compute with a CPU/GPU, not the memory buffers.

Don't you?
You can't compare kiwis vs. mangos by color and shape if you're doing a quality comparison. These are different architectures, and there's a huge improvement between the HD 4870 and the HD 5700 series, where the latter achieved almost the same results on a half-width memory bus. Do you remember?
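The half-width-bus point above boils down to simple arithmetic: peak bandwidth is bus width (in bytes) times the effective memory data rate. A minimal sketch, assuming the reference-card figures (a 256-bit bus at 3600 MT/s effective for the HD 4870, a 128-bit bus at 4800 MT/s for the HD 5770 — check exact SKU specs):

```python
def bandwidth_gbps(bus_width_bits, effective_rate_mts):
    """Peak memory bandwidth in GB/s: bytes per transfer * transfers per second."""
    return bus_width_bits / 8 * effective_rate_mts * 1e6 / 1e9

# Assumed reference-card numbers, not from the thread itself:
print(bandwidth_gbps(256, 3600))  # HD 4870, 256-bit bus -> 115.2 GB/s
print(bandwidth_gbps(128, 4800))  # HD 5770, 128-bit bus ->  76.8 GB/s
```

So the 5770 gives up roughly a third of the 4870's peak bandwidth on half the bus width, yet lands close to it in games, which is the poster's point about bandwidth alone not deciding performance.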
I agree it's a fail when you look at its HIGH PRICE, but there should be huge room for price drops on these cards. Even at 75 USD for the HD 7750 and 90 USD for the HD 7770 long before EOL, DAMN would make a considerable profit. (And they're MSRP'd at 110 USD and 160 USD respectively.)
It's nice to know that two and a half months later the HD 7770 is priced at a more reasonable 140 USD, but that's still at least 30 USD too much. I hope that in the next two and a half months they'll cut it to a normal price of 110 USD.
This will have 640 GCN cores, while the 6770 and 5770 used the old VLIW5, which also had 10 CUs (except VLIW5 has 80 stream processors per CU, totaling 800, while VLIW4 and GCN have 64 per CU yet perform better than VLIW5's 80).
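The shader counts above work out as follows (just the post's numbers spelled out, nothing assumed beyond them):

```python
compute_units = 10     # both the HD 5770/6770 and the HD 7770 have 10 CUs
vliw5_sp_per_cu = 80   # VLIW5: 80 stream processors per CU
gcn_sp_per_cu = 64     # GCN (and VLIW4): 64 stream processors per CU

print(compute_units * vliw5_sp_per_cu)  # 800 shaders on the older cards
print(compute_units * gcn_sp_per_cu)    # 640 shaders on the new one
```

That's the apparent "shader loss": same CU count, fewer but better-utilized ALUs per CU.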
A different compute approach has nothing to do with raw memory bandwidth during gameplay, but improved texture compression does.
GCN should prove itself better only in easier code optimization than the pretty WaferIncognita VLIW5, even for ATi, which developed it, and most of their engineers have since been exported, with benefits, from the new DAMN conglomerate. So ATi's VLIW5 and code optimization don't play well with each other in the same universe. I still believe VLIW5 is the far better approach; it just needed far more time to be implemented properly. They had a crap start with its first implementation in R600, and ever since they've been doing bugfixes... successfully, I might add, in the HD 3800/HD 4800/HD 5800 series.
And then they skipped back to VLIW4 for the HD 6900 series, so that might be another reason why they returned to X900 branding for GPUs based on their high-end chips... preparing for a new NON-BUG-BOTHERED GPU implementation... GCN (a fancy name for 15-year-old SIMD).