Right now the 8800 GTX might be about twice as fast as an X1950 XTX, but was it at launch? No. In some games it was close, but in most, no.
I had both and it wasn't, and overclocking then killed the GTX's lead.
It was twice as fast at launch, but you needed a Core 2 to unleash all of its potential. Here's a pre-release review (it got even better after launch with new drivers):
http://techreport.com/articles.x/11211/14
At 1600x1200 with 4xAA, a common resolution at the time, it was twice as fast in Oblivion, GRAW and Quake 4, the most demanding games of that era, and it was 80% faster in HL2: Ep1 and 50% faster in FEAR. A very different picture from what we see with the HD 5870. The gap was even more notable once you factored in price: the $400 8800 GTS was significantly faster than the $500 X1950 XTX. The slightly cheaper XT wasn't a match in perf/price either, nor were Nvidia's 79xx cards. Nothing could touch the 8800 in price/performance while offering the same level of performance at the time. Now, the GTX 275, HD 4890, GTX 260 and HD 4870 all destroy the HD 58xx cards in perf/price. The 8800 competed with $300+ cards; the HD 58xx has to compete with even $150 cards. There's no contest between the 8800 and the HD 58xx in that regard.
EDIT: It's also important to note that the 8800 review used a stock-clocked Core 2, so there was still room for improvement, while Wizzard uses a heavily overclocked i7 and you can't find anything faster right now, nor will you for quite some time.
Well, I wouldn't say he's baseless at all. While his ranting style can get wearisome, he's usually (but not always) quite accurate.
For example, he did us all a service by exposing the bumpgate mess that nvidia was trying to hide and blame on everyone else - he wrote many, many articles on it and made sure nvidia couldn't sweep the problem under the carpet until everyone forgot about it.
That article on The Inquirer where he exposed the dodgy video chips by cutting up a brand spanking new Apple notebook and looking at the chip with an electron microscope was an awesome bit of journalism - someone had to dig deep into their pockets to buy the laptop to cut up and hire the microscope. I haven't seen an article like that anywhere else.
nvidia had totally denied this problem on the Macs until he exposed the defect, and nvidia as liars. Kudos Charlie.
Yeah, and that's the only thing he has ever been right about, and even then it wasn't happening the way he said it was: he found something that was genuinely wrong and then made up the rest, as he usually does, portraying something that wasn't true at all as if it were the end of the world. He and every person who believes him bring up that "article" to claim he's right most of the time. One "article" doesn't make you right "almost always"; it makes you right once.
Some of his contributions:
- less than 30% yields for GT200 --- FALSE
- GT200 will be late --- FALSE
- Nvidia can't make GTX cards below $300 --- FALSE
- GT200 will probably be slower than RV770 --- FALSE
- Nvidia can't make dual GT200 card even at 55nm --- FALSE
Wait, I think the tape is finished, turn it over.
- less than 20% yields for GT300 --- FALSE
- 2% yields for GT300 --- FALSE
- GT300 will be a flop because it's MIMD --- ?
- GT300 will not be able to compete with Evergreen --- ?