Dunno, the X1900 range was still kicking your dear Nvidia's arse till the 8800GTS/GTX came out, so your bias is bullshit.
And bullshit: HDR+AA works great on my X1900XT/XTX card in Far Cry, Oblivion, and a slew of other games. Hell, my 8800GT buggers up if I try to run it with AA+HDR. Nvidia's drivers have bugs they refuse to spend the time to fix; look up the crash bug under Server 2003 and x64 XP. It's been a KNOWN BUG since those OSes came out, but Nvidia hasn't fixed it, because they're too busy getting a 2fps boost in Crysis and a few extra 3DMarks so they can be king of the bench/review sites.
As to the R600/RV670: I have seen a few reviews indicating that in NATIVE DX10 games/benches the R600 pulls ahead of the Nvidia cards even with AA enabled. This is because ATI designed the chips for DX10 and to DX10.x specs, so the AA is shader based. Nvidia's 8800 line, on the other hand, is NOT NATIVE DX10; it's actually a DX9 card with DX10 shader support, and the 8800/9800 use dedicated hardware for AA, as was needed for DX9 performance.
ATI's mistake here was not adding the hardware AA units, AND believing that Vista would take off, that every gamer would move to it, and that every game maker would move to DX10 because all the gamers had moved to Vista.
Vista flopped, and since it flopped, so did the DX10-native design of the R600/RV670 range of cards. Well, not really a flop for the 3800 cards, since they are still selling very well.
Oh, and HDR+AA has little to no performance impact on the X1900 range of cards; you can add HDR to AA or AA to HDR with barely any extra hit. Google some reviews.
As to the Rage 128: the hardware was great, the drivers for 2K sucked, but the 98 drivers were decent and gave the TNT/TNT2 line of cards a run for their money, mostly because the Rage 128 was native 32-bit, so running 32-bit mode didn't have the impact it had on Nvidia's cards. I had BOTH lines; the worst was the Rage 128 MAXX edition, due to the drivers never maturing for the dual-chip card.
Nvidia has had its flops too. Look at the FX line: they all sucked for DX9 stuff, despite that being their main selling point. Just crap, utter crap. Again, I know from personal experience.
Each company has screwed up. Examples of late-to-market or under-supplied cards:
ATI:
Radeon VE: cheap design with no hardware T&L, but they never implied it had it.
X800: available, BUT in short supply at first.
X1800: very late to market, but at least around here it could be had at MSRP with ease.
2900: late to market, used too much power, ran hot, poor DX9 AA performance, and poor AVIVO decoding support.
38*0: same poor AA performance as above; the other problems were fixed.
3870X2: a little late to market, too big a hit when you crank the AA up.
Nvidia:
GeForce 1 SDR/DDR: late to market by 2-3 weeks, VERY short supply; it took me camping out at CompUSA to get one.
GeForce2 GTS/Ultra: see above. They were a couple of weeks late actually getting cards into stores, and then the supply was FAR too low for the demand.
GeForce4 4400/4800: damn near impossible to find those two models; they were more marketing BS than a product you could actually get hold of.
GeForce 5/FX: marketed on time, ran fine in OLD games, but as soon as you moved to DX9 games they fell on their faces due to VERY poor design. Many FX-line cards also ran VERY hot; the 5800 Ultra was the worst design I have seen pre-8800GT stock cooler.
7800: in short supply, at least around here; you had to order one and wait your turn as they came in.
7900/7950: see above.
Nvidia's cards tend to come into stock more often but in smaller numbers, at least around here, so if a store runs out of Nvidia cards you have to camp it to get one. ATI cards tend to come into stock in reasonable numbers, in my experience, and that's been true since the 9600 days. The X800 line was in short supply if you wanted an XT/XT PE card, but the Pro VIVO was available aplenty, and most of them flashed into XT PE cards with no problem at all!
The FX line was horrible. I had a few of them; the 5700 was the only one that didn't totally tank in DX9, and even its performance was at best BLAH compared to the ATI cards in the same price range.
As to Shader Model 3 vs. Shader Model 2: as you would know if you weren't a fanboy, the X800/850 can do HDR. Look at Half-Life 2: Lost Coast and the two episodes; they use SM2 HDR and it looks damn good. (Lost Coast's was a bit meh, but it was just an early tech demo.)
http://en.wikipedia.org/wiki/High_dynamic_range_rendering
Read up:
Graphics cards which support HDRR
This is a list of graphics cards that may or can support HDRR. It is implied that because the minimum requirement for HDR rendering is Shader Model 2.0 (or in this case DirectX 9), any graphics card that supports Shader Model 2.0 can do HDR rendering. However, HDRR may greatly impact the performance of the software using it; refer to your software's recommended specifications in order to find specifications for acceptable performance.
SM3/FP16 HDR requires X1K or 6-series Nvidia cards, BUT on the 6 series the performance hit of HDR is big enough that you have to lower the res to play, and NO pre-8800 card on the Nvidia side can do HDR+AA, whereas the X1800/X1900 and any X1K card can do HDR+AA with little additional performance hit when you combine them.
http://www.firingsquad.com/hardware/hdr_aa_ati_radeon_x1k/
Conclusion
Up to this point, most gamers have pretty much come to accept that you can’t combine HDR with AA. A lot of this is because hardware capable of taking advantage of both features just hasn’t existed until just recently, but another significant factor which can’t be understated is the performance hit that’s traditionally come from enabling both features. After all, we all saw the huge performance hit that came from turning on HDR with Far Cry a few years ago, and more recently we saw it again in Oblivion with HDR, where today’s latest high-end cards ran with frame rates that were even slower than 4xAA once HDR was enabled. Based on all this evidence, who would have thought you could combine the two and still get pretty good performance? Certainly not us.
Until today that is.
Adding 2xAA/8xAF to Far Cry running with HDR had very little effect on the Radeon cards relatively speaking. At 1600x1200 the Radeon X1900 XTX’s performance drops by just 2 fps, or a little over 4%, while the X1800 XT 512MB sees an even slimmer 2% drop off. Even the slower Radeon X1800 GTO and X1900 GT cards see only slight declines once AA is added on top of HDR in Far Cry.
Under the greater demands of Oblivion, the margins are definitely greater, but we still saw manageable frame rates; adding AA to HDR actually comes free at 1024x768 for all cards except the Radeon X1800 GTO, and keep in mind that we could easily turn down the graphics settings a little for even better performance. In our outdoors testing the Radeon X1900 XTX saw a performance dropoff of nearly 30% while the Radeon X1800 XT took a performance hit of 21% once HDR+AA was enabled. Similarly, the Radeon X1900 GT took a greater hit than the GTO.
This is probably because the GPU on the older R520 cards is already pretty bottlenecked once HDR is running in Oblivion, once AA is added, the GPU can’t bottom out much further. In the case of the R580-based cards, they’re not quite as overtaxed with just HDR running, so once HDR+AA is enabled you see performance decline significantly.
The differences between running with HDR and HDR+AA aren’t quite as significant in foliage testing simply because the foliage area is more stressful on the graphics card than the outdoors area.
Looking over the results, HDR+AA is certainly a lot more feasible on ATI’s Radeon X1K cards than we initially thought. Getting playable frame rates with HDR+AA shouldn’t be too hard as long as you keep the eye candy in check, and with older games like Far Cry you should be able to turn it all on while also running HDR+AA without any problems.
Now if we can just see what kind of performance we can expect from Crysis and Unreal Tournament 2007 once HDR+AA is turned on. Unfortunately we’ll all have to wait a little longer to see those results…
As we all know, Oblivion, like Crysis, is poorly optimized (and I'm being kind).
But even with the performance hit, at least it works and is playable. The 7950 and lower CAN'T do AA + FP16 HDR, it's just not possible; they can only do the same HDR the 9700/X800/X850 cards can do combined with AA, because that's fp12 based (used in the Half-Life games).
Blah, I went on too long, but hey, you spew bullshit, I gotta counter it.