Do you guys know why the GTX295 sucks in 3DMark06?
It's just... so hard to believe...
I'm not trying to be persistent, but have you run a single card on XP yet? And the 9800GX2 didn't act this way...
I got 23k in 3DMark06 with my 8800 SLI, and a similar setup with a 9800GX2 gets a similar score.
HT doesn't work properly with multi-card setups on Windows Vista...
I noticed a huge difference with SLI and HT on/off in Vista, but not in XP.
Hey, it's always good to ask.
I had a 9800GX2 paired with an E8500 OCed to around 4.5GHz, and it took some tweaking to break 20k...
Also, the single cards do pretty well in 3D06, and so does SLI; the thing with all these benches is that the difference between 25k and 28k
is a huge gap to tweak yourself over with these cards.
I'll do a few GTX280 runs to see how close to 28k I can get with SLI, then tri-SLI.
I know someone on EVGA who rocked 27k with (2) 280s in SLI, so it's by far not impossible.
But the 295 just doesn't rock 3D06 the way you'd think it would. I think all of us who own them grew to hate 3D06 after running it and seeing the somewhat rough results.
"HT doesn't work properly with multi-card setups on Windows Vista... I noticed a huge difference with SLI and HT on/off in Vista, but not in XP."
I found I could clock higher with HT off, which makes sense to me; not sure if that means HT doesn't work properly or if it's just more stable rolling with fewer cores to manage.
It's also a trick CP told me that worked on Super Pi to get a few milliseconds lower...
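If anyone wants to try the fewer-cores idea without flipping HT off in the BIOS, here's a rough sketch of my own (not CP's exact trick) that pins a benchmark process to one logical CPU per physical core using Python's psutil. The even-index core mapping and the super_pi.exe path are assumptions and will vary by system, so treat it as an experiment, not a guaranteed fix.

```python
# Sketch: restrict a benchmark to one logical CPU per physical core so the
# Hyper-Threading siblings sit idle, roughly mimicking "HT off" in software.
# Assumes psutil is installed and that stepping through the logical CPU list
# lands on distinct physical cores, which is common but not guaranteed.
import subprocess
import psutil

def run_pinned(cmd):
    # Launch the benchmark, then restrict its CPU affinity.
    proc = subprocess.Popen(cmd)
    physical = psutil.cpu_count(logical=False) or 1
    logical = psutil.cpu_count(logical=True) or 1
    step = max(1, logical // physical)      # 2 with HT on, 1 with HT off
    mask = list(range(0, logical, step))    # e.g. [0, 2, 4, 6] on a quad core
    psutil.Process(proc.pid).cpu_affinity(mask)
    return proc.wait()

if __name__ == "__main__":
    # Placeholder path, not a real install location.
    run_pinned([r"C:\bench\super_pi.exe"])
```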