Stream processors!? Forget them! How many ROPs?
ROP count x GPU Clock = Real power.
Used car dealer: Yeah, this 1988 Ford Escort has 225,790 miles on it, and the transmission is near death, and most of the undercarriage has rusted away. But it has a leather interior! (leather interior = stream processors)
Not true. Also, the SP = leather interior comparison is just silly; SPs aren't decorative parts by any means.
ROPs (x clock) set the maximum pixel fillrate, while the SPs determine the maximum transformed, textured, and shaded pixel fillrate. Somewhere in between are the TMUs, which are, by far, more important than ROPs. Just take a look at these examples: 6800 Ultra vs. 7800 GT (6400 MP/s both, texel fillrate 6400 vs. 8000 MT/s), or 7900 GS (7200 MP/s, 9000 MT/s) vs. 7900 GT (7200 MP/s, 10800 MT/s). Even the 7600 GT (4480 MP/s, 6720 MT/s) beats the 6800 Ultra in newer or more complex games.
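If you want to reproduce those numbers yourself, here's a quick Python sketch of the ROPs x clock and TMUs x clock arithmetic. The ROP/TMU counts and core clocks in it are my assumptions for those cards (check the specs yourself), but they give back the MP/s and MT/s figures quoted above.

```python
# Pixel fillrate = ROPs x core clock, texel fillrate = TMUs x core clock.
# The ROP/TMU counts and clocks below are my assumptions for these cards;
# the resulting MP/s and MT/s figures match the numbers quoted above.

cards = {
    #  name        (ROPs, TMUs, core clock in MHz)
    "6800 Ultra": (16, 16, 400),
    "7800 GT":    (16, 20, 400),
    "7900 GS":    (16, 20, 450),
    "7900 GT":    (16, 24, 450),
    "7600 GT":    (8, 12, 560),
}

for name, (rops, tmus, clock) in cards.items():
    pixel_fill = rops * clock   # MP/s (megapixels per second)
    texel_fill = tmus * clock   # MT/s (megatexels per second)
    print(f"{name}: {pixel_fill} MP/s, {texel_fill} MT/s")
```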
SPs are also important, as I'll try to explain with this example. Let's say you want all the objects in your game to use a shader that does the following:
-a color texture (diffuse)
-bump mapping
-specular highlights (so some spots of the surface shine more than others)
-some kind of transparency
To achieve these 4 effects you need to perform at least 4 operations per pixel, since rendering works by altering the pixel color itself. This is done by the SPs.
You also need the TMUs to map the required textures, but usually (until now) the same texture is used for more than one effect. So a hypothetical SP/TMU/ROP balance for this case could be 4/2/1 (sketched out in the example below).
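If it helps to picture that 4/2/1 split, here's a toy per-pixel "shader" written in plain Python. It's obviously nothing like real GPU shader code, and the lighting math is made up just for illustration; the point is only to count the work: roughly 4 shading operations (SP work), 2 texture fetches (TMU work), 1 final pixel write (ROP work).

```python
# Toy per-pixel shader in Python, just to count the work done per pixel.
# The math is deliberately simplified; the point is the ratio:
# ~4 shading ops (SP work), 2 texture fetches (TMU work), 1 pixel write (ROP work).

def sample(texture, u, v):
    """TMU work: fetch a texel (a 'texture' here is just a 2D list of tuples)."""
    return texture[v][u]

def shade_pixel(u, v, color_tex, bump_tex, light=(0.0, 0.0, 1.0)):
    # 2 texture fetches -- each sample is reused for more than one effect
    r, g, b, a = sample(color_tex, u, v)     # diffuse color + alpha
    nx, ny, nz = sample(bump_tex, u, v)      # bump/normal vector

    # 4 per-pixel shading operations (SP work)
    n_dot_l   = max(0.0, nx*light[0] + ny*light[1] + nz*light[2])   # 1) bump lighting term
    lit       = (r * n_dot_l, g * n_dot_l, b * n_dot_l)             # 2) apply it to the diffuse color
    spec      = n_dot_l ** 8                                        # 3) specular highlight
    out_color = (lit[0] + spec, lit[1] + spec, lit[2] + spec, a)    # 4) add specular, keep transparency

    # 1 final write/blend of the pixel (ROP work)
    return out_color

# tiny 1x1 "textures" just to show it runs
color_tex = [[(0.8, 0.2, 0.2, 0.5)]]
bump_tex  = [[(0.0, 0.0, 1.0)]]
print(shade_pixel(0, 0, color_tex, bump_tex))
```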
Basically you can only take pixel fillrate as a measurement of power when you play at really high resolutions, above 1920x1200, and at the same time don't use AA/AF, because nowadays that is also done (to some extent) at the shader level, so the SPs become the bottleneck again.
AddSub, I think I get your point though. If you mean you don't want this card to have 8 or 12 ROPs, I must agree with you. But if it has 16 (4x4, which is my bet) at 600 MHz, that makes 9600 MP/s, which is more than enough for a 64 SP mid-range card. Compare to the GTS (10,000) and the Ultra (14,688). To get 1920x1200 with 4xAA/16xAF at 60 Hz, this is what you need:
1920 x 1200 x 60 x 4 x 16 = 8,847,360,000 => 8,847 MP/s (I was never 100% sure of this math, though)
Something that pretty much every high-end card has had since the GF 7 series. But you don't get 60 frames in games with that config, why? (Rhetorical)
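For anyone who wants to check that math themselves, here's the same calculation as a quick script. It keeps my assumption (which, again, I'm not 100% sure about) that the AA sample count and the AF factor both multiply the raw pixel requirement:

```python
# Rough required pixel fillrate for a given resolution / refresh / AA / AF setting.
# Assumption (same as my math above, not guaranteed to be right): AA samples and
# the AF factor both multiply the raw pixel throughput needed.

def required_fillrate_mps(width, height, fps, aa_samples, af_factor):
    pixels_per_second = width * height * fps * aa_samples * af_factor
    return pixels_per_second / 1_000_000   # megapixels per second

# 1920x1200 @ 60 Hz with 4xAA / 16xAF
print(required_fillrate_mps(1920, 1200, 60, 4, 16))   # ~8847 MP/s

# Theoretical fillrate of the hypothetical 16-ROP card at 600 MHz, for comparison
print(16 * 600)   # 9600 MP/s
```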
Sorry for the rant. I'm not trying to lecture anyone, just to inform those who don't know much about this. I may be wrong in some of my statements too. Feel free to reply.
Bye, Dark