# More 9800 GX2 Pictures Released



## Jimmy 2004 (Feb 29, 2008)

German site Allround-PC.com has acquired a few new pictures of NVIDIA's upcoming 9800 GX2 graphics card, which is set to become the company's flagship consumer card when it's released next month. The card will have two 65nm GeForce 8800 GPUs working in SLI on a single card, with a 600MHz core, 1GHz GDDR3 memory and a 1.5GHz shader clock. The pictures are below.
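For anyone doing back-of-envelope math on those numbers: GDDR3 transfers twice per clock, so a 1GHz memory clock gives a 2GHz effective data rate. Assuming a 256-bit memory bus per GPU (my assumption, not stated in the article), per-GPU bandwidth works out like this:

```python
# Back-of-envelope memory bandwidth for the quoted "1GHz GDDR3" figure.
# Assumption (not from the article): 256-bit memory bus per GPU.
# GDDR3 is double data rate, so a 1GHz memory clock moves data at 2GHz effective.
memory_clock_hz = 1_000_000_000          # 1GHz, as quoted
effective_rate_hz = memory_clock_hz * 2  # DDR: two transfers per clock
bus_width_bits = 256                     # assumed per-GPU bus width

bandwidth_gb_s = effective_rate_hz * bus_width_bits / 8 / 1e9
print(f"~{bandwidth_gb_s:.0f} GB/s per GPU")  # ~64 GB/s under these assumptions
```

Double that for the card as a whole, though as with any SLI setup the two GPUs can't pool their memory.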



 

 

 



*View at TechPowerUp Main Site*


----------



## Psychoholic (Feb 29, 2008)

> The card will have two 65nm GeForce 8800 GPUs working in SLI on a single card, with a 600MHz core, 1GHz GDDR3 memory and a 1.5GHz shader clock.



Single card?  How bout' two cards in a single package.


----------



## a111087 (Feb 29, 2008)

Psychoholic said:


> Single card?  How bout' two cards in a single package.



well, it's a single video card, you can't call it 2 video cards.


----------



## Psychoholic (Feb 29, 2008)

a111087 said:


> well, its a single video card, you can't call it 2 video cards.



sure you can, take a look under the plastic casing, it's the same design as the 7950GX2; the 3870 X2 is a single card.

Not that I care, as long as it performs well and doesn't overheat, and I'm sure it will perform well


----------



## mdm-adph (Feb 29, 2008)

Yeah, "single card" -- that's cute.  By the way, I've got a "single RAM chip" in my computer -- it just so happens to take up four slots, don't worry about that.


----------



## validuz (Feb 29, 2008)

I don't understand anyone's fascination with multiple processors on a video card. It only helps at higher resolutions. I highly doubt that card is going to make any noticeable difference over my 8800GT OC at 1280x1024 in Crysis. If it made at least a 20% increase, I'd totally buy it, but I will bet you real money it won't make even a 5% difference. I could be wrong, but it just doesn't seem realistic that it would.
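For what it's worth, the percentage yardstick here is just relative frame rate. A quick sketch (the Crysis numbers below are made up purely for illustration):

```python
# Percent FPS gain over a baseline card, the yardstick used above.
def percent_gain(baseline_fps: float, new_fps: float) -> float:
    """Return the improvement of new_fps over baseline_fps as a percentage."""
    return (new_fps / baseline_fps - 1) * 100

# Hypothetical benchmark results at 1280x1024, purely illustrative:
print(f"{percent_gain(30.0, 36.0):.1f}%")  # 20.0% -> the "20% increase" threshold
print(f"{percent_gain(30.0, 31.5):.1f}%")  # 5.0%  -> the "5% difference" mark
```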


----------



## a111087 (Feb 29, 2008)

mdm-adph said:


> By the way, I've got a "single RAM chip" in my computer -- it just so happens to take up four slots, don't worry about that.



you don't add ram chips, you add RAM sticks


----------



## jydie (Feb 29, 2008)

I am sure it will be extremely powerful, but that is one of the ugliest video cards I have ever seen.

This will be more power than I need... and out of my price range... but I can't wait to see the reviews and benchmarks.


----------



## AddSub (Feb 29, 2008)

Looks like SLI-in-a-box to me.


----------



## jbizzler (Feb 29, 2008)

What happened to the optical audio out?

It's not one card. It's not one slot. What's the proper terminology? Because it's clearly a single *something*.


----------



## twicksisted (Feb 29, 2008)

well, for anyone on a 24"+ LCD... that's the way to go I guess... for now....
9800GX2 & 3870X2.....

Though to be honest, I finished Crysis already, and the only thing I play on my big screen is COD4, and that runs maxed out just fine with my 2900 Pro


----------



## ShadowFold (Feb 29, 2008)

I love how they admit it's two 8800 GPUs and call it a 9800.


----------



## intel igent (Feb 29, 2008)

ShadowFold said:


> I love how they admit its two 8800 gpus and call it a 9800.



that and the fact they claim it is a "single card"  

Nvidia makes me laugh, they're so unoriginal


----------



## twicksisted (Feb 29, 2008)

come on guys... I know you're bashing it... but we all know you want one  hehehehehe


----------



## Corrosion (Feb 29, 2008)

Yeah, they should have called it the 8800GX2. I wonder if it will HAVE to have an 8-pin connector, or if you can use two 6-pins like on the 2900XT.


----------



## ShadowFold (Feb 29, 2008)

twicksisted said:


> come on guys... i know you bashing it... but we all know you want one  hehehehehe



I think I like my HD 3850 'cause it stays under 80°C, I bet that thing gets hotter than hell lol


----------



## mdm-adph (Feb 29, 2008)

a111087 said:


> you don't add ram chips, you add RAM sticks



Nope, that's how badass I am -- I add individual _chips_. 

What's sad is that, given time, nvidia could release a much better card than this (look at the ultra -- more than a year old and still fast as hell).  This is just an awful, awful hack.


----------



## intel igent (Feb 29, 2008)

twicksisted said:


> come on guys... i know you bashing it... but we all know you want one  hehehehehe



no sorry, I'd rather have a 3850 AGP for my P4 

I've always felt better about ATI over Nvidia in every aspect


----------



## tkpenalty (Feb 29, 2008)

It's two cards facing each other. While it does look like one of the sleekest cards, I really think this card will be a big flop for many reasons. 

1. PCB - The design is simply fragile....
2. Cooling, and heat from the card - Shrouding the card like that isn't smart as it traps heat; even if there is a fan and a massive grille on the side, not all the heat will be able to leave the enclosure
3. Performance - There are heavy implications for performance, as the G92 (8800GT) cores are underclocked by a large amount. This makes this card perform worse than 2x 8800GT in SLI. 
4. Price - Nvidia, please make it cheaper!
5. Aftermarket cooling capabilities - That's next to none. Unlike the simple 7950GX2's design, this.... looks hard.


----------



## [I.R.A]_FBi (Feb 29, 2008)

more epic fail ...


----------



## FR@NK (Feb 29, 2008)

I only see one SLI connector....this card can't do Tri-SLI?


----------



## ShadowFold (Feb 29, 2008)

FR@NK said:


> I only see one SLI connector....this card cant do Tri-SLI?



No, because it's gonna be hard to get even one  The 9800GTX can do Tri-SLI though.


----------



## twicksisted (Feb 29, 2008)

who is going to do Tri-SLI with it, in all honesty... and why would you want to? Unless you were trying to break the 3DMark06 record?

you really don't need it... perhaps one, yes... but not 3
I run everything except Crysis maxed out completely on my 24" 1920x1200 screen on my lowly ATI 2900 Pro

(ok, it's overclocked up to 858/1900 )

Still, these cards will beat it hands down... and unless you were adamant about gaming at 2048x1280 with 16x AA, then it's pointless and very unnecessary....

This may be two 8800GTS cards strapped together, but who cares? Neither Nvidia nor ATI can do better, and it's pretty incredible that they have put these two beasts together... now who can afford one? That's another story hehe


----------



## mandelore (Feb 29, 2008)

lols, i had this argument in another thread ^^


----------



## mandelore (Feb 29, 2008)

twicksisted said:


> both nvidia and ATI cant do better and its pretty incredible that they have put these two beasts together...



but ATI don't use 2 cards for their dual-GPU design


----------



## twicksisted (Feb 29, 2008)

doesn't matter really though.... what matters is that this is the best you can get for your money right now...

buying parts for your PC is all about right now... and sure, in a couple of months they will eclipse it with something better... what matters though is that right now these two are the best you can get... 

whether you want to spend that kind of money, for whatever reason, is beside the point hehe


----------



## erocker (Feb 29, 2008)

I'm sick of people arguing over "two cards, one card"  WHO CARES!!:shadedshu  It's not relevant to anything other than fanboy bickering.  Stop it now!  Now... is that GeForce lit up by LED?


----------



## pentastar111 (Feb 29, 2008)

erocker said:


> I'm sick of people arguing over "two cards, one card"  WHO CARES!!:shadedshu  It's not relevant to anything other than fanboy bickering.  Stop it now!  Now... is that GeForce lit up by LED?


 LOL!!!


----------



## mandelore (Feb 29, 2008)

yeah, anyways. these 2 X2 card offerings are gonna be obsolete very soon


----------



## twicksisted (Feb 29, 2008)

hats off to them for trying though.... the people spoke... the people wanted more power... and they delivered by strapping two of their most powerful cores into one board (well, two boards for Nvidia, but that's not the point)...

basically this is as good as it gets right now... hopefully they have something else up their sleeves for tomorrow and the next day...

moan as you will, but this is it for now... and personally I think that's pretty fucking wicked 
(excuse my french) hehe


----------



## -=l32andon=- (Feb 29, 2008)

why quibble over advancements in technology?


----------



## flashstar (Feb 29, 2008)

-=l32andon=- said:


> why quibble over advancement in technologies?



Because it isn't. The 9800 GX2 is simply two pre-existing cards sandwiched together. Anyone can do that; just take some glue and 1+1=2.


----------



## Exceededgoku (Mar 1, 2008)

flashstar said:


> Because it isn't. The 9800 GX2 is simply two pre-existing cards sandwitched together. Anyone can do that, just take some glue and 1+1=2.



No... :shadedshu

It's called the 9800 because it's 65nm, right? The 8*00 series is all on 90nm AFAIK, and so the new 9 series designation is to make people aware that they are getting a slightly more refined product (notice the use of refined over improved).

Let's make it nice and simple it's a *dual board single slot card* or "Duboscard" for short.
ATI makes a very nice *dual chip single slot card* or "Duchscard" for short.
:rofl:


----------



## DaedalusHelios (Mar 1, 2008)

mandelore said:


> but ATI dont use 2 cards for their dual gpu design



They can't afford to. They are too poor! lol

Have you looked at AMD's worth? The company is in shambles. Nobody in their right mind would buy that stock right now. IBM is looking, and Nvidia might buy if they aren't sold by Q4 2008. The problem is finding a company willing to buy AMD right now despite it being "deep in the red" (debt).

AMD could pull through, pay off its debt, and do well again. That would take a lot of good moves, though.




> It's called the 9800 because it's 65nm right? The 8*00 series is all on 90nm afaik and so the new 9 series designation is to make people aware that they are getting a slightly more refined product (notice the use of refined over improved ).



8800gt 512mb, 8800gt 256mb, 8800gts 512mb, and 8800gs were all 65nm G92's or G92 variants.


----------



## newtekie1 (Mar 1, 2008)

I find it interesting that one PCB uses the 6-pin connector and the other uses the 8-pin.  I wonder if the 8-pin will be required.  The PCB that uses the 6-pin is also the one that plugs into the PCI-E slot, so I assume it needs less power from its power connector, while the other PCB needs all its power from the power connector, hence the beefed-up 8-pin.
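For reference, the PCI Express spec caps each power source, which is probably why the connector mix looks the way it does: 75W from the x16 slot, 75W from a 6-pin, and 150W from an 8-pin. The per-PCB split below is my guess, not something confirmed by the pictures:

```python
# Power budget ceiling implied by the connector mix (PCIe spec limits):
# 75W from the x16 slot, 75W from a 6-pin, 150W from an 8-pin connector.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

total_budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(f"Max spec power budget: {total_budget}W")  # 300W ceiling

# Hypothetical split: the PCB seated in the slot draws from slot + 6-pin,
# while the other PCB would draw only from the 8-pin.
pcb_in_slot_w = SLOT_W + SIX_PIN_W  # 150W available
other_pcb_w = EIGHT_PIN_W           # 150W available
```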



tkpenalty said:


> Its two cards facing each other. While it does look like one of the most sleekest cards, I really think this card will be a big flop due to many reasons.
> 
> 1. PCB - The design is simply fragile....
> 2. Cooling, and heat from card - Shrouding the card like that isnt smart as it traps heat, even if there is a fan and a massive grille on the side, not all the heat will be able to leave the enclosure
> ...



1.) How is the design fragile?  It isn't.
2.) The heat is exhausted out the back of the card. It will definitely run hot, but that is to be expected from any dual-GPU card. I doubt heat will be a major issue; it can't be much worse than the 80°C temps we are seeing on the 3870x2.
3.) Cores are underclocked by a large amount?  You really have no idea what you are talking about, do you?  The cores are clocked EXACTLY the same as the 8800GT (600/1500). The memory is actually 100MHz faster than the 8800GT's. Oh, and there is also the small matter of the fact that the G92-450 cores used in the 9800GX2 will have the full 128 SPs enabled instead of just the 112 SPs on the 8800GT.
4.) No official price has been set yet, so you have no argument until one is. I've seen price estimates ranging from $400 to $650; I'll wait until the thing is actually released before I judge the price.
5.) The standard air cooling solutions are out the window on this card, yes. However, most of them are out the window on the 3870x2 and 7950GX2 too. That said, designing a waterblock for this should be MUCH easier than for the 7950GX2: because the cooler on the 9800GX2 is sandwiched between the cores, a single waterblock can be used that cools both cores, unlike the 7950GX2, which requires 2 separate waterblocks.


----------



## PrudentPrincess (Mar 1, 2008)

tkpenalty said:


> Its two cards facing each other. While it does look like one of the most sleekest cards, I really think this card will be a big flop due to many reasons.
> 
> 1. PCB - The design is simply fragile....
> 2. Cooling, and heat from card - Shrouding the card like that isnt smart as it traps heat, even if there is a fan and a massive grille on the side, not all the heat will be able to leave the enclosure
> ...



I have no respect for you as of now. :shadedshu
Most of what you're saying is false and I'm pretty sure you've done this before.


----------



## DaedalusHelios (Mar 1, 2008)

To everybody:
1. Whats your guess on how much it will cost(MSRP)?
2. Whats your guess on the performance of 9800GX2 X2 versus 9800gtx Tri-SLI?

I am not sure, so please tell me what you think it will be.


----------



## newtekie1 (Mar 1, 2008)

PrudentPrincess said:


> I have no respect for you as of now. :shadedshu
> Most of what you're saying is false and I'm pretty sure you've done this before.



Yep, I just wish more people were as smart as you.  He is a huge ATi fanboy (despite claiming to have an nVidia GPU).  He constantly spreads nVidia misinformation.  And when he is called on it, he usually freaks out, repeats himself over and over again, pretends like he has power around here, and even makes meaningless threats.



DaedalusHelios said:


> To everybody:
> 1. Whats your guess on how much it will cost(MSRP)?
> 2. Whats your guess on the performance of 9800GX2 X2 versus 9800gtx Tri-SLI?
> 
> I am not sure, so please tell me what you think it will be.



1.) It depends on overall performance.  If nVidia thinks the performance will be higher than the 3870x2, then you can bet it will cost more.  Personally, I think the performance could be better than the 3870x2, considering 2 8800GTs in SLI outperform a 3870x2, and the 9800GX2 is better than 2 8800GTs in SLI.  It is really all going to come down to the performance of the card.  I would guess $550, but all we can really do is guess and speculate.  The reason I pick that number is because that is roughly the cost of 2 8800GTS 512MB cards, maybe a little more than that, and that is essentially what you are getting with the 9800GX2.  You aren't going to get as good a price:performance ratio as you get with lower-end G92 cards, but that is always the case as you go up in performance.

2.)  I would guess the 9800GTX in Tri-SLI will outperform a single 9800GX2.  But the price of a single 9800GX2 will be far less than 3 9800GTX cards.  Personally, I don't think Tri-SLI is worth it in any way, it is only good for breaking benchmark records, just like CrossfireX.


----------



## PrudentPrincess (Mar 1, 2008)

newtekie1 said:


> Yep, I just wish more people would be as smart as you.  He is a huge ATi fanboy(despite claiming to have an nVidia GPU).  He constantly spreads nVidia mis-information.  And when he is called on it he usually freaks out and repeats himself over and over again, and pretends like he has power around here, and even makes meaningless threats.



Yeah, here's a taste of what I've seen from him:

"9800GTX is the SAME thing as the 8800GTS 512MB..."

"That cooler is just a 8800GT's cooler multiplied by two and soldered together....... I smell overheating." <talking about 9800


----------



## DaedalusHelios (Mar 1, 2008)

Yeah, but what about two 9800GX2's in SLI? Do you think it will beat 8800ultra Tri-SLI?

Sorry about all the questions. 

That is my planned configuration for my gaming rig that is waiting in my bedroom with an Asus P5N-T 780i !!!


----------



## newtekie1 (Mar 1, 2008)

I don't know, I would guess 2 9800GX2's in SLI would beat 3 9800GTX's or 8800Ultra's in Tri-SLI, but we will have to see once the cards come out.


----------



## brian.ca (Mar 1, 2008)

DaedalusHelios said:


> They can't afford to. They are to poor! lol
> 
> Have you looked at AMD's worth? The company is in shambles. Nobody in there right mind would buy that stock right now. IBM is looking, and Nvidia might buy if they aren't sold by Q4 2008. The problem is finding a company willing to buy AMD right now despite that its "deep in the black"(debt).
> 
> ...



Yeah, and I'm sure it was that and had nothing to do with design aspects of it.....  

Wasn't it Nv that had been asking their partners to cut back on production costs for their cards b/c they were costing too much to make?  Meanwhile, ATI seems to be able to adjust prices as needed and still turn a profit.  ATI didn't make a sandwich b/c the single-PCB route is most likely more practical for them, partners, and consumers, not b/c they can't afford their current operating costs.

I'd probably question the validity of the stock speculation as well... I'll be the first to admit I know jack about the stock market, but if you look at the higher points for AMD, ATI, or Nvidia and consider that in the worst-case scenario AMD would most likely be bought out by someone, I'd imagine buying AMD during a low point won't be the worst buy you'll make.  I mean, I've been looking at it a bit over the past months.. if you had bought it at their worst point you would currently have made like 30% on your investment

A lot of AMD's troubles can be attributed to the purchase of ATI, but it was also predicted that would happen and that it would take a while to digest the acquisition and start to really see the benefit of absorbing ATI.  We now see ATI poised to really compete vs. Nvidia, and even if AMD is not as well positioned, I'd be willing to bet there's still a decent chunk of things it has going for it that people don't really consider.  Their joint ventures with IBM are probably worth a pretty penny.  I don't know how much they get out of the deals, but for example that recent story about the two of them producing a successful EUV-lithographed test chip sounds like it might hold a great amount of potential.  If they stay ahead of the curve, that is tech that Intel, Samsung, Toshiba and any number of companies will be wanting in a bad way.


----------



## brian.ca (Mar 1, 2008)

newtekie1 said:


> 5.) The standard air cooling solutions are out the window on this card, yes.  However, most of them are out the window on the 3870x2 and 7950GX2 also.  However, designing a waterblock for this should be MUCH easier than the 7950GX2.  Because the cooler on the 9800GX2 is sandwiched between the cores, a single waterblock can be used that cools both cores, unlike the 7950GX2 which requires 2 separate waterblocks.



I think it was you who made this comment in another thread, and I questioned it then but didn't get a reply.. Could you explain how that single waterblock idea would work out?

I've never played with water cooling, but if there are two chips, one above & one below, wouldn't the materials still mostly be the same minus the extra connection?  But more importantly, how would that be designed to achieve good contact on both chips while still being practical?  I can't imagine that being something you just slide in; wouldn't it either be too tight (requiring you to force it) going in, or too loose once it's in if you didn't have to force it?

The only way I could see that working is if the 2nd PCB is removable, so you could reattach it over the waterblock after you've put it on against the first chip (but even then, what's the real benefit there?  Would the single double-sided waterblock be significantly cheaper than two normal ones?)

But that brings me back to one of the major concerns (and I think this is what the guy you quoted was getting at) that I think you overlook... the practicality of actually working with this card.  That a normal aftermarket cooler wouldn't work on this card would be a given, obviously.  But should a company design one for the dual chips (essentially similar to their normal models but extended, I would guess?), wouldn't it be a lot easier to replace the cooling on the ATI card, since it should take the usual procedure?  With the Nv card it seems like you'd have to take the whole thing apart and put it back together, or else work in some extremely limited space.


As for #4 on your list, I'm pretty sure there was a post last week that had the roadmap for the 9 series that listed the X2 as < $599, so we can guess it'll at least be between $500 & $600 with a fair amount of certainty.


----------



## TooFast (Mar 1, 2008)

ugly card. they hide the two cards with that crappy cooler lol
not worth $600


----------



## Nick89 (Mar 1, 2008)

tkpenalty said:


> Its two cards facing each other. While it does look like one of the most sleekest cards, I really think this card will be a big flop due to many reasons.
> 
> 1. PCB - The design is simply fragile....
> 2. Cooling, and heat from card - Shrouding the card like that isnt smart as it traps heat, even if there is a fan and a massive grille on the side, not all the heat will be able to leave the enclosure
> ...



I agree with most of what you are saying, tk  but I don't think it will be too fragile.

Also @ newtekie, there's no way a single waterblock could be used safely; when you install the waterblock it would have to scrape against both cores, not to mention how you'd cool the memory.. 

My estimate is that this will get about the same performance as two 9800GTXs, as it looks like it has the same cores as the 9800GTX so far. 

I think it will cost between $600-$800, as I don't see nvidia releasing one 9800GTX for less than $400


----------



## Nick89 (Mar 1, 2008)

newtekie1 said:


> Yep, I just wish more people would be as smart as you.  He is a huge ATi fanboy(despite claiming to have an nVidia GPU).  He constantly spreads nVidia mis-information.  And when he is called on it he usually freaks out and repeats himself over and over again, and pretends like he has power around here, and even makes meaningless threats.
> 
> 
> 
> ...



Look at how you attack his credibility first then discuss the aspects of the card. 
If a card looks junky, people are going to say so regardless of the brand; that doesn't make tkpenalty a fanboy, but your eagerness to defend nVidia sure does cast you in a bad light.


----------



## indybird (Mar 1, 2008)

Here's my usual "how I think this card will turn out" post. 
Warning: Lots of guessing and speculation based on no real facts.

1) It will run hot: about the same as the 8800GT with the original fan.  The covering over it won't cause much of a heat problem.
2) It will outperform the HD3870X2 by a decent margin (HD3870X2: ~15000 3DMarks, 9800GX2: ~16000 3DMarks)
3) It will cost ~$550
4) Fragility won't be a problem for anyone who builds the kind of high-end PC this would go into
5) There will be a few waterblocks available for it
6) 2 of these will beat 3 9800GTXs in Tri-SLI by a small margin
7) The SLI will scale very well (9600GTs are the first nvidia cards to almost reach 200% of single-card performance in an SLI setup)
8) These will overclock very well, but will require watercooling to do so
9) nvidia will make the 8-pin required for overclocking, just like ATI

If this card misses more than 2 of my criteria above, I will probably go for the HD3870X2, because those are looking better every day.

-Indybird
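A note on point 7: "scaling" there is just the SLI result expressed as a multiple of a single card, with 2.0x (200%) being the ideal. A quick sketch with made-up frame rates:

```python
# SLI scaling: the two-card result as a multiple of a single card (2.0x is ideal).
def sli_scaling(single_fps: float, sli_fps: float) -> float:
    """Return SLI performance as a multiple of single-card performance."""
    return sli_fps / single_fps

# Hypothetical benchmark results, purely for illustration:
single, sli = 50.0, 95.0
factor = sli_scaling(single, sli)
print(f"{factor:.2f}x of a single card ({factor * 100:.0f}%)")  # 1.90x (190%)
```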


----------



## Wile E (Mar 1, 2008)

I just wanted to add my 2 cents on the design aspect here. This is an argument that even happens about AMD and Intel quad cores, and my response is the same: the means of getting there doesn't matter, only the end result matters.

That said, I'm still skeptical about this card. Not because of its design, but because of what happened the last time NV released a dual-GPU card right before a new line launched: good driver support dropped off the face of the planet. Hopefully this case is different, however.


----------



## DaedalusHelios (Mar 1, 2008)

brian.ca said:


> Yeah, and I'm sure it was that and had nothing to do with design aspects of it.....
> 
> Wasn't it Nv that had been asking their partners to cut back on production costs for their cards b/c they were currently costing too much to make?  Meanwhile ATI seems to be able to adjust prices as needed and still turn a profit.  ATI didn't make a sandwhich b/c the single PCB route is most likely more practical for them, partners and consumers not b/c they can't afford their current operating costs.



1. It was a joke about them not being able to afford the PCBs. Nvidia has run a surplus for how many business quarters? Exactly. *ATi is deep in debt*, and *Nvidia makes massive profits* _due to cutting back on costs_. ATi is not turning a profit because it is part of AMD, which is falling fast. 

My source is TomsHardware: 

*"Nvidia has a lot of money in the bank, currently about $2.4 billion, but to eat up a company like AMD, it would have to cough up somewhere in the tune of $10 billion, since AMD is $5.4 billion in debt."*



brian.ca said:


> I'd probably question the validity of the stock speculation as well... I'll be the first to admit I know jack about the stock market but if you look at the higher points for AMD, ATI, or Nvidia and consider that in the worst case scenario AMD would most likely be bought out by someone I'd imagine buying AMD during a low point won't be the worst buy you'll make.  I mean I've been looking at it a bit over the past months.. if you bought it at their worst point you would currently have made like 30% on your investment



AMD/ATi could go bankrupt in under a year if this trend continues. I hope they don't because _competition is what moves technology forward _but the writing is on the wall.

Again from TomsHardware:

*"AMD is currently valued at $3.88 billion and Nvidia goes for $13.44 billion." 

"AMD has been trading below a $4 billion market value for some time and there are many companies out there, which could purchase AMD in a snap, if they wanted to get into the CPU business."*

That is a pretty deep low for a company with a one time market cap of $23 billion. 



brian.ca said:


> A lot of AMDs troubles can be attributed to the purchase of ATI but it was also predicted that would happen and it would take a while to digest the acquisition and start to really see the benefit of absorbing ATI.  We now see ATI poised to really compete vs. Nvidia now, and even if AMD is not as well positioned I'd be willing to be there's still a decent chunk of things it still has going for it that people don't really consider.  Their joint ventures with IBM are probably worth a pretty penny.  I don't know how much they get out of the deals but for example that recent story about the two of them prducing the a successful EUV lithographed test chip sounds like it might hold a great amount of potential.  If they stay ahead of the curb that is tech that Intel, Samsung, Toshiba and any number of companies will be wanting in a bad way



What they are making from those deals obviously still won't keep them out of debt, because they are still in debt. They go deeper into debt every month. They would need to take back the CPU market and the GPU market for a full quarter to rise again to their original market cap of $23 billion, considering they also have ATi (the GPU market). 

Source:
http://www.tomshardware.com/2008/02/14/amd_merge_or_not/page3.html

*PS. Brian.ca, don't debate with me about it unless you have done the reading. 
Otherwise it's just about you trying to argue with me. 
I am obviously not an Nvidia fanboy, because my most used computer has an OC'ed 3850. *


----------



## CrAsHnBuRnXp (Mar 1, 2008)

a111087 said:


> well, its a single video card, you can't call it 2 video cards.



That's like saying that Intel's quad cores are two dual cores inside one package. However it's integrated, whether it be 4 cores in a single CPU or dual PCBs inside a single gfx card, it is still a quad core / a single GFX card.


----------



## hat (Mar 1, 2008)

I already know that the 9800GX2 looks like a box. I want to see pictures with the cooler off of the card...


----------



## DaedalusHelios (Mar 1, 2008)

brian.ca said:


> I think it was you that made this comment on another thread and I questioned it then but didn't get a reply.. Could you explain how that single water block idea would work out?
> 
> I've never played with water coolling but if there are two chips, one above & one below, wouldn't the materials still mostly be the same minus the extra connection?  But more importantly how would that be designed to achieve good contact on both chips while still being practical.  I can't imagine that being something you just slide in, wouldn't it either be too tight (requiring you to force it) going in or too loose once it's in if you didn't have to force it?
> 
> ...




Don't you think companies will create a cooler for it easily? Making one by hand fits your assumption. But the ones that will be made and manufactured by Danger Den (liquid cooling), or others, of course wouldn't be hard. First, you would check your fittings and apply your thermal compound. Then, you would put it in, match up the holes, and tighten the screws with a screwdriver.

We are talking about cutting-edge GPUs and you're wondering if a company can make a piece of metal hollow with flat sides? 

It could be engineered to even use the reference design fasteners if they wanted to.


----------



## Wile E (Mar 1, 2008)

DaedalusHelios said:


> Don't you think companies will create a cooler for it easily. Making one by hand fits into your assumption, but one made and manufactured by Danger Den(liquid cooling), or others of course wouldn't be hard.
> 
> We are talking about cutting edge GPU's and you're wondering if a company can make a piece of metal hollow with flat sides?
> 
> It could be engineered to even use the reference design fastener if they wanted to.


It doesn't even have to be a single piece of metal. It can be a piece of metal on each gpu, connected by acetal or something similar.


----------



## a111087 (Mar 1, 2008)




----------



## DaedalusHelios (Mar 1, 2008)

But how will they OC and how will they scale in SLI, Tri-SLI and with the latest drivers when they come out?

I know, we all wish we knew.


----------



## Wile E (Mar 1, 2008)

a111087 said:


>



That doesn't mean anything. That's pre-release drivers. No 9800GX2 benches have meaning until it gets a completed driver.


----------



## a111087 (Mar 1, 2008)

Wile E said:


> That doesn't mean anything. That's pre-release drivers. No 9800GX2 benches have meaning until it gets a completed driver.



yes, but I don't remember if it was you or someone else saying that it still gives a pretty good estimate


----------



## Wile E (Mar 1, 2008)

a111087 said:


> yes, but i don't remember if it was you or someone else saying that it still gives a pretty good estimate



Wasn't me. But, at any rate, I suspect things will be a lot different with proper drivers. I score over 16k with my single 8800GT and a quad-core at 3870MHz.


----------



## candle_86 (Mar 1, 2008)

mdm-adph said:


> Yeah, "single card" -- that's cute.  By the way, I've got a "single RAM chip" in my computer -- it just so happens to take up four slots, don't worry about that.



hmm, well your logic is flawed; it takes two PCIe slots, and the X2 takes 2 PCIe slots too, so what's the problem?


----------



## Wile E (Mar 1, 2008)

candle_86 said:


> hmm, well your logic is flawed it takes two PCIe slots, the x2 takes 2 PCIe slots, whats the problem?


They only plug into one slot, however. Electrically, they are a single slot card.


----------



## brian.ca (Mar 1, 2008)

DaedalusHelios, if I didn't catch your tone, I'll apologize for that. But it was still a silly thing to say (or maybe just a bad joke). Re: the PCB thing, as I remember it they made the GTs in a way that left the margins on them too small, and the cost cutting was a reactionary measure, not proactive, and it didn't make the partners very happy. I wouldn't credit their success to cost effectiveness so much as to good marketing & positioning combined with steady performance.

On ATI's side it's harder to comment... I'd be interested to see how they're actually doing (ie: apart from AMD as a whole). It does sound like, after the initial merger mess, they're doing a lot better. The margins on the 3000 series cards were supposedly pretty good (and they seem to have room to play with the prices still, looking at the recent price cuts), and they seem poised to get ahead of Nv this spring/summer.

Regarding AMD's situation, I'm not completely sure what you're arguing... I did not say AMD is not in a bad situation, but that your comment "No one in their right mind would buy AMD stock right now" may not be the most accurate assessment. Yes, AMD is currently in the shitter, but that's part of the appeal of buying into them atm. Like I said, if you had bought them at their worst 2 months ago, when people were crapping on them the most, you would be up 30% on your investment today. And given AMD/ATI's potential, what they are to a few companies, and how low they currently are, it just seems to me like things are most likely to go up for them (and like the saying goes, you're supposed to buy low). If nothing else, I have a hard time believing someone won't buy them out before they go bankrupt, which (again, I'm not a big finance guy) I would imagine would drive the price up, since there's not much room left to fall before it gets to that point and there are probably a few companies that would have a stake in acquiring them.

With the side deals thing, I don't think you can say they obviously won't help just because they're currently still in debt when the payoff is still a ways away. The EUV thing, for example, isn't expected to be needed for at least another 3 years, I think it was (yes, that seems like a long time, but Intel's already been waiting at least 1 year - as in, that's how long it takes to disassemble, ship, & reassemble the stuff they're waiting for - just to get a tool from Nokia I think it was, so they could do some R&D on the tech AMD+IBM are playing around with).

And I read the same article from Tom's a while ago. I also read one from them a month or so ago that the new one contradicts on a couple of points. When they write a new one that offers further speculation, I'll probably read that as well. I'll take it with a grain of salt, though, rather than convincing myself that reading an article on Tom's equates to a working knowledge of business and finance.



> Don't you think companies will create a cooler for it easily? Making one by hand fits into your assumption. But the ones that will be made and manufactured by Danger Den (liquid cooling), or others, of course wouldn't be hard. First, you would check your fittings and apply your thermal compound. Then you would put it in, match up the holes, and tighten the screws with a screwdriver.
> 
> We are talking about cutting edge GPUs and you're wondering if a company can make a piece of metal hollow with flat sides?
> 
> It could be engineered to even use the reference design fastener if they wanted to.



1) I'm not trying to argue... someone said something, and I'm not completely familiar with water cooling, but I understand the principles, so I was curious how it would actually work b/c it sounded like what he was saying had a flaw in it. If you don't have anything worthwhile to add, feel free to save some gas.

2) I'm sure if a company really wanted to, they could figure something out... whether they would actually do so would depend on the success of the card and the likelihood of selling their cooling solution (talking particularly of aftermarket - I'd imagine Nv partners will design their own cooling on their versions of the card). No matter how smart the design of an aftermarket cooler may be, it's worthless if the end user won't be able to install it.

But here's what I'm asking as it relates to what you said... using your assumptions, it sounds like you think the case will be removable and that there should be enough space between the 2 cards to just slide the block in. The block, if it's like Newtekie was saying, would have to be connected to/between both chips (for anyone else - keep in mind this was all originally directed at him, so that's why I'm referring to this design), so both sides would have to be connected to both boards to secure a tight fit after it's in, no?

Now try this: take 2 pencils... lay one against your palm across the top of your thumb and pinky. Take the other and cross it over the first between your index and middle finger, and hold the two tight in a fist so they stay together. Imagine the one across your pinky and thumb is the top card and the vertical pencil is the water block that is already tight/secured against the top chip/PCB. When you tighten the block to the bottom core, it should pull the block down towards it... if you pull on the bottom of the vertical pencil, maybe you'll get what I was getting at.

When you screw a block or fan down to 1 PCB, the block moves, not the PCB. But if you do it to 2, it's different. If there's any space between them and you go to tighten the block down to both, one or both of the cards will have to bend to make up the difference (the space that allowed easy installation), which I'd imagine would normally be considered a bad thing. If you remove the space from the equation before that point, you have to deal with the possibility of a bad fit/forcing it in, thermal paste scraping off (more annoying than anything), etc. The only way I can think of that the design Newtekie was suggesting would work is if it's designed for no space and it's possible to disassemble the 2 cards, so you can put the block in first and then the top card. But I have a hard time buying that that would be possible.

Aftermarket cooling for this just seems like it would be impractical... especially if its success and Nvidia's commitment to the design end up being questionable.


----------



## newtekie1 (Mar 1, 2008)

brian.ca said:


> I think it was you that made this comment on another thread and I questioned it then but didn't get a reply.. Could you explain how that single water block idea would work out?
> 
> I've never played with water cooling, but if there are two chips, one above & one below, wouldn't the materials still mostly be the same minus the extra connection?  But more importantly, how would that be designed to achieve good contact on both chips while still being practical?  I can't imagine that being something you just slide in; wouldn't it either be too tight (requiring you to force it) going in, or too loose once it's in if you didn't have to force it?
> 
> ...



Just like the 7950GX2, the card will need to be taken apart to put a water block on. It is entirely practical to do, and most people capable of installing water cooling should have no problem doing it. You don't need to slide the cooler in; do you really think that is how nVidia put the air cooler in? Installing a cooler on the 3870X2 is definitely easier, and you are right that you would have to take the whole card apart. However, you won't have to install two separate blocks. You just install a double-sided block, the same dimensions as the current air cooler, that contacts both cores. You install the block on the first PCB like you normally would, then just set the second PCB on top of the block and install that.

As for #4, I remember that post with the roadmap. However, if you notice, there are little stars by some of the information, including the price for the 9800GX2. And at the bottom it says "* Not confirmed, and should be considered with a pinch of salt". In other words, they are just speculating about what the price will be. It will be priced based on performance; if it performs like a $600 card, why shouldn't nVidia sell it at that price point? Just because ATi can't get their cards to perform at that high of a price point doesn't mean nVidia should lower the price of theirs.



Nick89 said:


> I agree with most of what you are saying, tk, but I don't think it will be too fragile.
> 
> Also @ newtekie, there's no way a single water block could be used safely; when you install the water block it would have to scrape against both cores, not to mention how to cool the memory..
> 
> ...



So you agree that the card is clocked lower than the 8800GT?  You agree that it will perform worse than 2 much weaker cards in SLI?

Yes, there is a way a single waterblock can be used.  The two PCBs come apart, just like the 7950GX2.  You don't have to slide the thing in there.  Like I said before, do you really think that is how nVidia got their cooler in there?

The roadmap posted earlier suggests the speculation is that the 9800GTX will be below $400. And that seems about right, considering it is just a slightly beefed-up 8800GTS 512.



Nick89 said:


> Look at how you attack his credibility first then discuss the aspects of the card.
> If a card looks junky, people are going to say so regardless of the brand; that doesn't make tkpenalty a fanboy, but your eagerness to defend nVidia sure does cast you in a bad light.



Actually, I replied to the posts in the order they were posted in.  If the original posts I replied to were reversed I would have answered them in reverse order.

The card doesn't look junky; he is just needlessly bashing it because it is an nVidia card. He has done it in the past, and he is doing it now. And just like in the past, he is bashing the card using blatantly false information.


----------



## Rambotnic (Mar 1, 2008)

Elegant, solid, masculine, representing strength and power, just like me ! Me like it!


----------



## AsphyxiA (Mar 1, 2008)

I'm actually hoping that this card will in fact be a beast! Partly due to the fact that ATI will be ready to step up and try to one-up them, and vice versa; the never-ending war continues. Plus, all of you saying that it looks ugly: WTF, it actually looks wicked. Pardon me for not liking uber-flashy video cards, but I find that less is more! Also, this definitely looks way better than nVidia's last Frankenstein, released a while back; at least it's streamlined!


----------



## DaedalusHelios (Mar 1, 2008)

brian.ca said:


> Regarding AMD's situation, I'm not completely sure what you're arguing... I did not say AMD is not in a bad situation, but that your comment "No one in their right mind would buy AMD stock right now" may not be the most accurate assessment. Yes, AMD is currently in the shitter, but that's part of the appeal of buying into them atm. Like I said, if you had bought them at their worst 2 months ago, when people were crapping on them the most, you would be up 30% on your investment today. And given AMD/ATI's potential, what they are to a few companies, and how low they currently are, it just seems to me like things are most likely to go up for them (and like the saying goes, you're supposed to buy low). If nothing else, I have a hard time believing someone won't buy them out before they go bankrupt, which (again, I'm not a big finance guy) I would imagine would drive the price up, since there's not much room left to fall before it gets to that point and there are probably a few companies that would have a stake in acquiring them.



The buy low, sell high rule of the stock market does not cover super-high-risk investments that sell low for very long periods. Also, they will go bankrupt soon enough unless someone buys them in time or they somehow pull out of it. They have been trading at under $4 billion for way too long.

Considering IBM has been decreasing their use of AMD products lately, opting for Intel parts instead in the workstation and laptop segments, I would say don't count on IBM helping them much. Unless you are talking about R&D only, and then I would say yes. Remember, the evil "chipzilla" that is Intel has much more money to throw at R&D than IBM would ever throw at a company like AMD.


But there is one thing on your side of the debate that I am surprised you haven't mentioned. In the past year, AMD has received tax breaks, incentives, and corporate welfare from the German government to help keep them afloat. If the German government wanted to badly enough, they could pay off all of AMD's debt and wait for them to pick up speed again before getting the money back, as part of an agreement with them.

It probably won't happen, but that's just speculation on my part. I am not sure how their politicians feel towards AMD because I don't follow German politics.


----------



## newtekie1 (Mar 1, 2008)

I don't know what is going on with AMD, but I honestly hope they don't go under. Competition pushes prices down, but more importantly it pushes technology forward. If AMD goes under, the processor and video card markets will both suffer from increased prices and a lack of innovation.


----------



## DaedalusHelios (Mar 1, 2008)

newtekie1 said:


> I don't know what is going on with AMD, but I honestly hope they don't go under. Competition pushes prices down, but more importantly it pushes technology forward. If AMD goes under, the processor and video card markets will both suffer from increased prices and a lack of innovation.




I totally agree with you. But from the looks of things, this is AMD's darkest hour.


----------



## brian.ca (Mar 2, 2008)

Newtekie, thanks for the answer. I never really followed tech stuff much back when the 7950GX2 was released, so I only had a rough idea about it. Like I said, putting it together like a sandwich is bar none the most sensible way to actually do it, so no, I didn't think that's how Nv would be doing it. I would assume Nv / partners would have no problem implementing their own cooling solutions b/c it should be easy to do at the time of production... it's like electrical or plumbing work in a house... if you do it at the time of construction, it's a laughable job. If you try to run a new line through the house when all the insulation and drywall are up, though, it's a completely different story.

I was under the impression that separating that second card to give you the room you would need to work would be more towards the side of impractical. I suppose that would mostly depend on what actually connects the two together, though. ie: if it's a small bridge that disconnects easily, that should be no problem, assuming that the second card's connection to the back plate is also designed to come apart. If it's a solid connection, though, I could see that being a pain in the ass (I was assuming it'd be the latter). Did they design the 7950GX2 to be easily disassembled?


DaedalusHelios, where I disagree is the high-risk assessment. I think a good chunk of their problems can be attributed to the ATI acquisition, and as they go along they should be able to put that behind them if they can manage to press through. Both AMD and ATI have the potential to perform well.

What you mentioned with the German gov relates to what I was saying about what AMD is to different companies and how that mitigates the risk with AMD. Germany wants to see AMD do well b/c of their Dresden plant, so they're bound to treat them well. AMD, I think, was also supposed to be opening a plant in NY as well, no? And lo and behold, NY state files that antitrust suit vs. Intel, so they'll probably have the state of NY on their side as well.

With IBM, what I'm wondering about is the extent of their relationship. I keep hearing about the duo having quite the relationship and a bunch of joint R&D projects together. For IBM to work with AMD, AMD must have something to offer, be it intellectual property or some form of physical asset. If AMD were to go under, would IBM not stand to lose if another company bought that IP, or if they had to find those assets elsewhere? Ex: with the EUV tech thing, part of that test chip was manufactured in Dresden... is that something that is easily & immediately replicable elsewhere? If not, it should be in IBM's best interest to protect AMD when their stability can have an adverse effect on IBM's business. Not for AMD's sake, but their own.


----------



## DaedalusHelios (Mar 2, 2008)

brian.ca said:


> Newtekie, thanks for the answer. I never really followed tech stuff much back when the 7950GX2 was released, so I only had a rough idea about it. Like I said, putting it together like a sandwich is bar none the most sensible way to actually do it, so no, I didn't think that's how Nv would be doing it. I would assume Nv / partners would have no problem implementing their own cooling solutions b/c it should be easy to do at the time of production... it's like electrical or plumbing work in a house... if you do it at the time of construction, it's a laughable job. If you try to run a new line through the house when all the insulation and drywall are up, though, it's a completely different story.
> 
> I was under the impression that separating that second card to give you the room you would need to work would be more towards the side of impractical. I suppose that would mostly depend on what actually connects the two together, though. ie: if it's a small bridge that disconnects easily, that should be no problem, assuming that the second card's connection to the back plate is also designed to come apart. If it's a solid connection, though, I could see that being a pain in the ass (I was assuming it'd be the latter). Did they design the 7950GX2 to be easily disassembled?
> 
> ...



I used to own a 7950GX2, and the performance was alright, but I never had it crash once throughout the months I was using it. At least it was reliable. I never benched it, though.

The best scenario would be IBM buying AMD. That kind of money is nothing to IBM. 

Sadly, IBM's history of success is partially rooted in Nazi support. They created automated methods of keeping track of the Jews with punch card systems, which furthered the Nazis' genocidal goals. Those contracts and relationships helped them grow extremely fast as a company, but at what cost?


----------



## newtekie1 (Mar 2, 2008)

Yes, the 7950GX2 was designed to come apart easily. It had a small PCB connecting the two main PCBs that slid into little connectors on both PCBs. It was actually very easy to take apart and put back together.






There is a picture of the 7950GX2 taken apart to give you an idea.  The two PCBs are only linked by a few screws/risers and a small "link PCB".

There have been a few pictures of the 9800GX2 already dismantled floating around the internet. It seems that it might be a little more complicated to take apart and put back together, but it shouldn't be anything crazy. If you can install a water cooling kit and water blocks, then you can more than likely take apart and reassemble the 9800GX2.


----------



## tkpenalty (Mar 2, 2008)

newtekie1 said:


> Yes, the 7950GX2 was designed to come apart easily. It had a small PCB connecting the two main PCBs that slid into little connectors on both PCBs. It was actually very easy to take apart and put back together.
> 
> 
> 
> ...



The current way is a ribbon cable, and there's a fan hole in the PCB, which honestly looks a bit fragile.


----------



## captainskyhawk (Mar 2, 2008)

DaedalusHelios said:


> Sadly, IBM's history of success is partially rooted in Nazi support. They created automated methods of keeping track of the Jews with punch card systems, which furthered the Nazis' genocidal goals. Those contracts and relationships helped them grow extremely fast as a company, but at what cost?



Hey, IBM wasn't the only company back then helping out the Nazis.

Of course, it's no worse than Google, Yahoo, or Cisco helping out the Chinese government today.  All companies have skeletons in their closet.

Here's to hoping that IBM buys them out instead of nvidia!


----------



## tkpenalty (Mar 2, 2008)

captainskyhawk said:


> Hey, IBM wasn't the only company back then helping out the Nazis.
> 
> Of course, it's no worse than Google, Yahoo, or Cisco helping out the Chinese government today.  All companies have skeletons in their closet.
> 
> Here's to hoping that IBM buys them out instead of nvidia!



China isn't fascist :shadedshu: :shadedshu:

Be wary of how racist your comments sound and all will be okay  (and get back on topic)


----------



## tkpenalty (Mar 2, 2008)

a111087 said:


>



Looks like they need a new CPU


----------



## newtekie1 (Mar 2, 2008)

tkpenalty said:


> The current way is a ribbon cable, and there's a fan hole in the PCB, which honestly looks a bit fragile.



I know the current way; I've seen the pictures. The ribbon cable shouldn't pose a problem: the connectors used are quick-connect, and they easily come apart and go back together.



tkpenalty said:


> Looks like they need a new CPU



Without knowing the details behind those numbers, they are meaningless. However, I actually prefer to do some research instead of just posting ignorant crap statements based on numbers I know nothing about.

If you had bothered to do a little reading about those numbers, you would have seen some interesting things.

1.) The test setup used a stock QX6700 (2.66GHz). 14,000 is about all you are going to get in 3DMark06 with a stock QX6700. With the current high-end graphics cards, 3DMark06 is CPU-limited with C2Q processors at such low speeds. My 8800GTS won't score over 12,100 unless I overclock my Q6600, and then it tops out at 14,000 with the Q6600 at 2.7GHz.

2.) Those runs used rather old drivers, 173.67 to be specific. 174.12 is what I have installed right now, and there are even newer versions available. I'm sure the drivers have come a long way since the ones they used; those old drivers probably barely had support for the 9800GX2. I'm going to guess nVidia has been doing some optimizations since then.

I'm not going to believe anything I see about the performance of these cards until they are released.


----------



## flashstar (Mar 2, 2008)

tkpenalty said:


> China isn't fascist :shadedshu: :shadedshu:
> 
> Be wary of how racist your comments sound and all will be okay  (and get back on topic)



China might have a free market economy, but the government itself has millions of political "dissidents" locked away in prisons.


----------



## DaedalusHelios (Mar 2, 2008)

tkpenalty said:


> China isn't fascist :shadedshu: :shadedshu:
> 
> Be wary of how racist your comments sound and all will be okay  (and get back on topic)




They are worse than your average fascist. Quit insulting fascists like that.

The Chinese government commits genocide. Ever heard of Falun Gong or Falun Dafa? It is a growing pacifist movement in China whose members are being killed by the thousands over there. Don't defend the Chinese government unless you are evil yourself.

The Chinese government commits more human rights violations than I could ever keep track of. I have friends who fled China to avoid being killed by the Chinese government. They are part of Falun Gong.


----------



## Rambotnic (Mar 2, 2008)

China can be considered Russia's alter ego, which does whatever Russia would do if it just had the chance.


----------



## PrudentPrincess (Mar 2, 2008)

Rambotnic said:


> China can be considered Russia's alter ego, which does whatever Russia would do if it just had the chance.



An enemy of Japan is a friend of China.


----------



## AsphyxiA (Mar 2, 2008)

wow... this thread's topic has shifted so much! Video cards to fascism.


----------



## DaedalusHelios (Mar 2, 2008)

AsphyxiA said:


> wow... this thread's topic has shifted so much! Video cards to fascism.



As the news gets older the topic gets more interesting.


----------



## Rambotnic (Mar 2, 2008)

A thread should be made for political discussions.


----------



## mandelore (Mar 2, 2008)

Rambotnic said:


> A thread should be made for political discussions.



no doubt that would get closed down when immature people pop their tops because they can't take criticism of their country


----------



## Rambotnic (Mar 2, 2008)

mandelore said:


> no doubt that would get closed down when immature people pop their tops because they can't take criticism of their country


Yeah, you're probably right. People like that like to call themselves patriots, although I'd call them successful subjects of mind-control practice.


----------



## mandelore (Mar 2, 2008)

Rambotnic said:


> Yeah, you're probably right. People like that like to call themselves patriots, although I'd call them successful subjects of mind-control practice.



Hell knows I love my country, but it's run by t***s, unelected t***s to be precise. And our MPs are now demanding £160 a day just to TURN UP TO WORK??!!

The UK is going to hell in a nicely laid-out picnic basket, fast.


----------



## vampire622003 (Mar 3, 2008)

They said, "Ok, well ATI is selling their 3870X2, so let's slap together a couple of 8800s, case them in some plastic, and call it a 9800GX2." I also heard it is getting over 4,000 points less in 3DMark '06 than a 3870X2. It also keeps getting delayed, mostly because of driver problems.


----------



## imperialreign (Mar 3, 2008)

mandelore said:


> Hell knows I love my country, but it's run by t***s, unelected t***s to be precise. And our MPs are now demanding £160 a day just to TURN UP TO WORK??!!
> 
> The UK is going to hell in a nicely laid-out picnic basket, fast.



and the US is hand in hand with them - hopefully our country will start getting its crap straight next year . . . but a new administration is still a year away, and our current choices don't appear much better than the bumbling idiot currently running the three rings


----------



## DaedalusHelios (Mar 3, 2008)

vampire622003 said:


> They said, "Ok, well ATI is selling their 3870X2, so let's slap together a couple of 8800s, case them in some plastic, and call it a 9800GX2." I also heard it is getting over 4,000 points less in 3DMark '06 than a 3870X2. It also keeps getting delayed, mostly because of driver problems.



If you even moderately followed GPU roadmaps, you would know that the 9800GX2 was in the works just before the launch of the 8800GT. That was long before the 3870X2 was being sold.

Don't be an ATi fanboy. I like ATi slightly more than nVidia, but I don't go out and lie to make ATi look better.


----------



## vampire622003 (Mar 4, 2008)

DaedalusHelios said:


> If you even moderately followed GPU roadmaps, you would know that the 9800GX2 was in the works just before the launch of the 8800GT. That was long before the 3870X2 was being sold.
> 
> Don't be an ATi fanboy. I like ATi slightly more than nVidia, but I don't go out and lie to make ATi look better.


Rofl, I am not a fanboy; I just go with what I think is better at the moment. I have had bad luck with the last few NVIDIA cards I've had (5500, 6600, 7300GT), but not with my ATI cards (9250, 9600 Pro (except this one, it fried, lol), X1300), so I prefer ATI right now. Also, I don't buy cards as soon as they come out; I wait a while so they can fix most of the problems. Mine can play the games I like to play, but if there is a game I can't run that I "absolutely" have to play, I'll upgrade. Nvidia was shown to me first with the MX4000. But that was when DirectX 8.1 gaming was cool, lol.


----------



## DaedalusHelios (Mar 5, 2008)

vampire622003 said:


> Rofl, I am not a fanboy; I just go with what I think is better at the moment. I have had bad luck with the last few NVIDIA cards I've had (5500, 6600, 7300GT), but not with my ATI cards (9250, 9600 Pro (except this one, it fried, lol), X1300), so I prefer ATI right now. Also, I don't buy cards as soon as they come out; I wait a while so they can fix most of the problems. Mine can play the games I like to play, but if there is a game I can't run that I "absolutely" have to play, I'll upgrade. Nvidia was shown to me first with the MX4000. But that was when DirectX 8.1 gaming was cool, lol.




My first ATi was the Rage 128, then the Radeon 7000, but my first nVidia was the Ti4400.


----------



## b1lk1 (Mar 5, 2008)

I just did my favorite thing when I see a large thread I haven't read yet.  I read the first 10 replies, then I jumped to the last 10.  Never fails that the topic is 110% completely different by the last 10 posts, LOL!!!!!!!

PS:  My 8800GTS 512MB card maxes everything out on my 24" LCD with 4XAA/16XAF.  This card is an epic failure out of the gate....


----------



## DaedalusHelios (Mar 5, 2008)

b1lk1 said:


> I just did my favorite thing when I see a large thread I haven't read yet.  I read the first 10 replies, then I jumped to the last 10.  Never fails that the topic is 110% completely different by the last 10 posts, LOL!!!!!!!
> 
> PS:  My 8800GTS 512MB card maxes everything out on my 24" LCD with 4XAA/16XAF.  This card is an epic failure out of the gate....



It hasn't been released yet, and there aren't even finalized drivers. No wait........ dude, you can see the future....

Just playin


----------

