@W1zzard: any chance you could show this card's version of this chart?
http://tpucdn.com/reviews/AMD/R9_290X/images/analysis_quiet.gif
I'd like a better sense of how much this card throttles, if at all.
Has anybody ever done this test or review with Ultra Low Power State (ULPS) set to 0 in the registry? Personally, I think the down-throttling comes from the card not needing to push higher core frequencies or GPU load to do the same amount of work.
In CrossFireX, in a lot of current games, the first GPU won't push full GPU load or core frequency because it doesn't need to in order to finish drawing frames. In other, more intensive games, both GPUs will push 100% load at full stock frequencies, topping out at 95 °C.
Disabling ULPS would probably settle whether people are making a big deal out of these throttling anomalies for nothing, or whether there's merit behind the claims. I'm leaning towards the possibility that it's just NVIDIA consumers blowing something simple and insignificant out of proportion...
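If anyone wants to try this themselves, the usual approach is flipping the EnableUlps registry value to 0 under the display-adapter class keys. Here's a rough sketch of that with Python's winreg module — Windows-only, needs admin rights, and the key path is my assumption of where AMD's driver puts it, so double-check on your own system and back up the registry first:

```python
# Rough sketch: find EnableUlps under the display-adapter class keys and set it to 0.
# Run elevated, Windows-only; back up the registry before touching it.
import winreg

DISPLAY_CLASS = r"SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, DISPLAY_CLASS) as class_key:
    index = 0
    while True:
        try:
            sub_name = winreg.EnumKey(class_key, index)  # "0000", "0001", ...
        except OSError:
            break  # no more adapter subkeys
        index += 1
        try:
            sub_key = winreg.OpenKey(class_key, sub_name, 0, winreg.KEY_ALL_ACCESS)
        except OSError:
            continue  # skip entries we can't open (e.g. "Properties")
        with sub_key:
            try:
                winreg.QueryValueEx(sub_key, "EnableUlps")
            except FileNotFoundError:
                continue  # this adapter entry has no ULPS value
            winreg.SetValueEx(sub_key, "EnableUlps", 0, winreg.REG_DWORD, 0)
            print(f"Set EnableUlps=0 under ...\\{sub_name}")
```

A reboot (or at least a driver restart) afterwards is what actually makes the change take effect.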
About the GTX 780 Ti: nice card, but in some ways I feel as if NVIDIA kicked its consumers in the balls, the ones who bought the GTX Titan at $1,069 to $1,099 with only part of its 2,880 CUDA cores enabled, or the K6000 at a whopping price tag of almost $4,000, with the difference coming down to 6 GB of VRAM, 64-bit floating-point precision, and the small additions that come with workstation cards.
The R9 290X still has the higher maximum power consumption at over 300 W, but the GTX 780 Ti was only about 10 W apart at full load. Max temps are less than 10 °C apart as well: the R9 290X hits 95 °C at full load, the GTX 780 Ti about 89 °C... I only took a glimpse at the numbers. In games optimized for AMD there's only a 2 to 7 FPS difference, and in games optimized for NVIDIA, a 10 to 20 FPS difference. In theory, the GTX 780 Ti is supposed to be about a 15% performance increase, but the GTX Titan seems to inch closer to it at resolutions above 1600p.

For the $699.99 price, I'm thinking more like $749.99 on Newegg, because they need to make a profit, is probably what you're looking to pay on the first day of release. I remember when the GTX 680 first hit the shelves, Newegg jacked the price up from $599.99 to $699.99, or somewhere in between those figures... Also worth taking into account: with that price tag, NVIDIA users are getting full DX11.2 support. Yeah... If you look at the nitty gritty, NVIDIA consumers aren't getting a whole lot more back. Just a GTX 780 refresh with the full Titan core count and some additional perks, like the frame time variance graphs for the GTX 780 Ti: the curve band looks tighter, and the minimum extreme is a little lower than the R9 290X in a single-card setup. This is something we should have seen back with the GTX 680, 690, Titan, and 780... Lower frame times equate to higher FPS, and a narrow frame time band equates to less deviation or stalling.

AMD is now better at multi-GPU setups, because scaling on the new PCIe-based CrossFireX is roughly 1.8x to 2.0x the FPS. NVIDIA is now, again, the better single-card/single-GPU solution. It seems like AMD and NVIDIA are playing musical chairs, switching between these two factors.
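To put the frame time point into numbers, here's a quick sketch in plain Python. The frame time samples and FPS figures are made up for illustration, not taken from the review; the point is just that FPS is 1000 divided by the frame time in milliseconds, and a narrower band means less stutter:

```python
# Rough sketch: frame times (ms) -> FPS, plus a simple spread measure.
# The sample numbers below are invented for illustration, not review data.
import statistics

def fps(frame_time_ms: float) -> float:
    """Convert a single frame time in milliseconds to frames per second."""
    return 1000.0 / frame_time_ms

# Hypothetical frame time samples (ms) for two cards over the same run.
card_a = [16.2, 16.8, 17.1, 16.5, 16.9]   # tighter band, smoother feel
card_b = [15.9, 19.5, 16.1, 21.0, 16.4]   # similar average, wider band

for name, times in (("card A", card_a), ("card B", card_b)):
    avg = statistics.mean(times)
    spread = max(times) - min(times)
    print(f"{name}: avg {avg:.1f} ms ({fps(avg):.1f} FPS), band {spread:.1f} ms")

# CrossFire scaling: two-GPU FPS divided by single-GPU FPS.
single_gpu_fps, dual_gpu_fps = 45.0, 85.0   # made-up example
print(f"scaling factor: {dual_gpu_fps / single_gpu_fps:.2f}x")
```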
One other thing: the GTX 780 Ti OCs past 1100 MHz. The ASUS ROG ARES II has a turbo clock of 1100 MHz as a dual-GPU solution, and it can actually OC past 1200 MHz core / 1750 MHz memory with an ASIC quality of 71%... I expected more from the GTX 780 Ti. I'll be expecting more from the R9 290X with a better cooling solution, going past the 1250 to 1300 MHz mark with a higher power envelope than its competitors.
@Btarunr and @W1zzard,
You should provide a GPU-Z screenshot of your testing setup. The reason: you don't really state it in your write-ups, but a lot of readers assume you're testing on PCIe 3.0 x16, and some other sites don't. There won't be a difference between PCIe 3.0 x16 and PCIe 2.0 x16 except for the bandwidth, but for your readers I think you should add the screenshot to show which graphics card is in the test and which PCIe interface it's running on. Just something minor to consider. The only site that posts a GPU-Z screenshot with their benches is Legitreviews.com.
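For what it's worth, the bandwidth part is easy to put into numbers. A quick back-of-the-envelope sketch using the published PCIe transfer rates and line-encoding overheads (the function name is just mine):

```python
# Back-of-the-envelope PCIe x16 bandwidth: transfer rate (GT/s) per lane,
# reduced by the line-encoding overhead, times 16 lanes.
def x16_bandwidth_gb_s(gt_per_s: float, encoding_efficiency: float) -> float:
    per_lane_gb_s = gt_per_s * encoding_efficiency / 8.0   # bits -> bytes
    return per_lane_gb_s * 16

pcie2 = x16_bandwidth_gb_s(5.0, 8 / 10)      # PCIe 2.0: 5 GT/s, 8b/10b encoding
pcie3 = x16_bandwidth_gb_s(8.0, 128 / 130)   # PCIe 3.0: 8 GT/s, 128b/130b encoding
print(f"PCIe 2.0 x16: ~{pcie2:.1f} GB/s, PCIe 3.0 x16: ~{pcie3:.1f} GB/s")
# roughly 8 GB/s vs 15.8 GB/s per direction
```

So the gap is roughly double on paper, even if it rarely shows up in game benchmarks; the screenshot would simply confirm which interface the card actually ran on.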