# R700 Supports 4-way CrossFireX?



## btarunr (Jul 13, 2008)

Yes, you read that right. You can use up to four HD 4870 X2 accelerators in tandem for an 8-GPU, 9.6 TFLOP, 8 GB graphics-crunching monster. You need a motherboard with four graphics slots, though. Four-slot solutions are available on the AMD 790FX platform; on the Intel side, X48 does support four slots and images of prototype four-slot X48 boards surfaced months back, but no such board has shipped yet. You do have the Skulltrail platform, and upcoming X58 boards for Bloomfield promise to come in four-slot flavours.
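The headline numbers in fact add up; here is a quick sanity check, assuming the commonly cited RV770 figures (800 stream processors at 750 MHz, with a multiply-add counting as two FLOPs per clock):

```python
# Back-of-the-envelope check of the "8-GPU, 9.6 TFLOP, 8 GB" headline.
# Assumed per-GPU specs: 800 SPs @ 750 MHz, 2 FLOPs per SP per clock (MAD).
stream_processors = 800
core_clock_hz = 750e6
flops_per_sp_per_clock = 2

tflops_per_gpu = stream_processors * core_clock_hz * flops_per_sp_per_clock / 1e12
gpus = 4 * 2  # four HD 4870 X2 cards, two GPUs each

print(f"{tflops_per_gpu:.1f} TFLOPS per GPU")           # 1.2
print(f"{gpus * tflops_per_gpu:.1f} TFLOPS aggregate")  # 9.6
print(f"{gpus * 1} GB of frame buffer in total")        # 8 GB at 1 GB per GPU
```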

In an interview with Indian website TechTree, Raja Koduri, AMD's worldwide CTO (Products Group), dropped this shocker among minor revelations such as "Fusion in 2009":

"AMD has already built a computer that has four 4870X2s in it. So it has eight GPUs; drivers will not be supporting eight GPUs at this point of time."

This implies that AMD is fully geared up to go head-on against GT200b whenever it comes up; it's all a matter of them releasing a supporting driver. Again, unreliable sources point to the possibility of an R700+ based on 'Super-RV770XT' processors also being in the pipeline. If you thought you were witnessing the peak of the GPU battle for supremacy, hold on: you're only getting the appetiser.

*View at TechPowerUp Main Site*


----------



## PrudentPrincess (Jul 13, 2008)

IM HUNGRY


----------



## Fitseries3 (Jul 13, 2008)

sh!t... there goes all my money.


----------



## dark2099 (Jul 13, 2008)

PrudentPrincess said:


> IM HUNGRY



LOL, and it looks like records are going to be shattered, or utterly bombed. Who here is going to get a 2 kW PSU just to test this?


----------



## Fitseries3 (Jul 13, 2008)

dark2099 said:


> LOL, and it looks like records are going to be shattered, or utterly bombed. Who here is going to get a 2 kW PSU just to test this?



rofl... I have a pair of PSUs that would power any rig: a 1 kW Galaxy and an 860 W PC Power & Cooling.


----------



## boogah (Jul 13, 2008)

Wow, imagine a computer-generated movie rendered in days instead of months or years O_O (real-time raytracing +++)


----------



## dark2099 (Jul 13, 2008)

I am going to eat the waffle in your avatar boogah.


----------



## trt740 (Jul 13, 2008)

Good for AMD: make back some cash and push prices down from Intel and GeForce. I've got to hand it to them; they just don't quit, and they're going for the kill shot. I think it's time to buy some AMD; they're going to be top dog soon.


----------



## Megasty (Jul 13, 2008)

Damn, I'm even about to be broke  I guess its better than the wifey spending $2500 on a crystalline doggy paperweight


----------



## ShadowFold (Jul 13, 2008)

Well, I wonder if 4 of them would even last long on a 1 kW PSU. Might need 1200 W or 1500 W... or maybe I'm just insane. But then again, AMD is more insane: 8 GPUs, lol. I can see that breaking some 3DMark records.


----------



## Megasty (Jul 13, 2008)

Gah, now I'm going to have to use both of my 1000watters in one machine


----------



## btarunr (Jul 13, 2008)

Plus, thanks to NGOHQ.com, you'll have a Radeon PhysX driver by then: dedicate a GPU to crunching PhysX and eat the record cake.

Why so much worry over power? A good 700 W PSU is all you need for two cards, and there are 1200 W boxes out there from ABS-Tagan, Silverstone, etc. 1200 W could do it.


----------



## Fitseries3 (Jul 13, 2008)

Honestly..... WTF?

3 GPUs scale well, but they're still having problems getting 4 GPUs to scale as well as 3. You can't add 4 more and expect it to work if 4 doesn't even perform up to par.

You don't add 4 extra tires to a car and expect it to go faster.


----------



## btarunr (Jul 13, 2008)

fitseries3 said:


> Honestly..... WTF?
> 
> 3 GPUs scale well, but they're still having problems getting 4 GPUs to scale as well as 3. You can't add 4 more and expect it to work if 4 doesn't even perform up to par.
> 
> You don't add 4 extra tires to a car and expect it to go faster.



It's not a question of tires here; it's engines. A bomber (aircraft) with four engines carries more load.


----------



## Megasty (Jul 13, 2008)

btarunr said:


> Plus, thanks to NGOHQ.com, you'll have a Radeon PhysX driver by then: dedicate a GPU to crunching PhysX and eat the record cake.
> 
> Why so much worry over power? A good 700 W PSU is all you need for two cards, and there are 1200 W boxes out there from ABS-Tagan, Silverstone, etc. 1200 W could do it.



I love the idea. However, you'll only need 2 of them to destroy Vantage, so you can just use the other 2 for lovely PhysX. 100,000+ CPU Vantage scores, anyone?

BTW, I have an unopened 1600 W X3 uber-monster. This is going to be a very good year; I never thought I'd ever get to use that thing.


----------



## trt740 (Jul 13, 2008)

Megasty said:


> I love the idea. However, you'll only need 2 of them to destroy Vantage, so you can just use the other 2 for lovely PhysX. 100,000+ CPU Vantage scores, anyone?



Speaking of PhysX, NVIDIA is smart as hell letting AMD in on that. You know damn well they're going to sell them a license and then make money off the competition's video card success, and also off any game that uses it. Basically getting all the revenue the PhysX cards were supposed to make, without making the cards, just selling the software. That's smart.


----------



## X-TeNDeR (Jul 13, 2008)

4 cards! 
AMD is going all-out here; let's hope they pull this off nicely and devastate this round.

Daaamn... this setup should be illegal.


----------



## tkpenalty (Jul 13, 2008)

I would LOVE one of these rigs


----------



## ViciousXUSMC (Jul 13, 2008)

More than the power requirement, I'd be concerned about the cooling and case needs. With 4 cards you need a full tower for sure, and probably an extended-ATX form factor mobo. Cooling all of those would be a major pain, and you may as well strip down to your boxer shorts, because after about an hour of Crysis your room will probably be at 100+ degrees.


----------



## HaZe303 (Jul 13, 2008)

8 GPUs not supported by their drivers, hah! I'm happy when 1 of their GPUs is supported properly! I have huge issues with my 4850 whenever I want to play games. All games crash with the current drivers, and I'm not alone with these issues. C'mon ATI, stop your war with NVIDIA for a second and give us some proper support. ATI doesn't even have official drivers for the HD 4800 series yet; you need to search for the hotfix drivers, and they don't even work right... I'm 3 seconds away from selling this junk piece of card!!!


----------



## Wile E (Jul 13, 2008)

ViciousXUSMC said:


> More than the power requirement, I'd be concerned about the cooling and case needs. With 4 cards you need a full tower for sure, and probably an extended-ATX form factor mobo. Cooling all of those would be a major pain, and you may as well strip down to your boxer shorts, because after about an hour of Crysis your room will probably be at 100+ degrees.



Meh, 2 GTX480 rads would be plenty. lol.


----------



## ViciousXUSMC (Jul 13, 2008)

May as well just park the front end of an old car in your room and use a car rad.


----------



## Wile E (Jul 13, 2008)

ViciousXUSMC said:


> May as well just park the front end of an old car in your room and use a car rad.



Hmmmmm, good idea! lol


----------



## candle_86 (Jul 13, 2008)

Hmm, an interesting question though: what AMD CPU can drive this at a resolution we can use today?

We're talking resolutions that haven't even been conceived for a single display; you'd need something like 71111x4000 displays here, ya know.


----------



## selway89 (Jul 13, 2008)

Mmmm, power, lol. But that's 8 GB of graphics memory alone; you're screwed if you run a 32-bit OS, lol.


----------



## tvdang7 (Jul 13, 2008)

i guess you could say " hey i have more video ram than you have ram" lol


----------



## candle_86 (Jul 13, 2008)

lol, right now not all my RAM shows up in dxdiag anyway: 1 GB for the CrossFire, 2 GB in the system, another chunk pulled by my LAN card, and another by my recently acquired X-Fi. System RAM shows up as 2034 MB, lol.


----------



## Hayder_Master (Jul 13, 2008)




----------



## X-TeNDeR (Jul 13, 2008)

hayder.master said:


>



*Exactly.*


----------



## FreedomEclipse (Jul 13, 2008)

4-way CrossFire???

Well, obviously those who wish to invest in such hardware will have to run a 64-bit OS, unless M$ and ATI have found a way to use the GPU RAM as bog-standard system RAM.


Which wouldn't be such a bad idea: being able to convert your 'idle' GPU RAM into RAM your PC can use.


----------



## roberto888 (Jul 13, 2008)

What do they want with 8 GB of VRAM? Will anybody use it to prey on those monsters? I don't think so. Maybe the game developers, but they'd be pleased with 4 GB of VRAM too; there's no need for 8 GB. That's my opinion. Sorry for the bad English.


----------



## btarunr (Jul 13, 2008)

roberto888 said:


> What do they want with 8 GB of VRAM? Will anybody use it to prey on those monsters? I don't think so. Maybe the game developers, but they'd be pleased with 4 GB of VRAM too; there's no need for 8 GB. That's my opinion. Sorry for the bad English.



It's not about what 8 GB can do, it's about what 8 GPUs can.


----------



## lemonadesoda (Jul 13, 2008)

*Architecture*

Would that be 4x 2GB, or 8x 1GB memory?

It ISN'T 8 GB of independent memory, because the same assets are stored multiple times in each memory bank. The GPU doesn't actually have 8 GB to play with, only 1 GB (or is it 2 GB?); the rest of the memory is just COPIES of the same memory space on each GPU's local address bus.

P.S. I'd love to see a FPS or flightsim using all 8 DVI ports independently! WOW.


----------



## btarunr (Jul 13, 2008)

lemonadesoda said:


> Would that be 4x 2GB, or 8x 1GB memory?
> 
> It ISN'T 8 GB of independent memory, because the same assets are stored multiple times in each memory bank. The GPU doesn't actually have 8 GB to play with, only 1 GB (or is it 2 GB?); the rest of the memory is just COPIES of the same memory space on each GPU's local address bus.
> 
> P.S. I'd love to see a FPS or flightsim using all 8 DVI ports independently! WOW.



Technically it's 8 x 1 GB of GDDR5 on a 256-bit bus when the rig is made to render a scene in AFR (alternate frame rendering), where each GPU takes turns... I needn't explain; you have the know-how.

So the application has to load the same stuff (textures, meshes, shader data, wireframe data) into all eight memory arrays. The same applies when the GPUs are made to render a scene with tiling: all the 'stuff' must be available to all GPUs.

Since we still don't have motherboards with four full x16 2.0 slots, populating four slots means x8 2.0 bandwidth per card.
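The memory-mirroring point above can be sketched in a few lines. This is an illustrative model, not driver behaviour: `effective_vram_gb` is a hypothetical helper showing that with AFR or tiling the usable pool is the per-GPU size, not the sum across GPUs.

```python
# Why "8 GB" of mirrored VRAM behaves like 1 GB: in AFR/tiled rendering
# every GPU holds its own copy of the working set, so the application's
# usable pool is the per-GPU capacity. Figures here are illustrative.
def effective_vram_gb(gpus: int, vram_per_gpu_gb: float, mirrored: bool = True) -> float:
    """Usable VRAM an application sees across a multi-GPU array."""
    if mirrored:
        # AFR / tiling: assets duplicated into each GPU's local memory
        return vram_per_gpu_gb
    # Hypothetical unified address space (not how CrossFire works)
    return gpus * vram_per_gpu_gb

print(effective_vram_gb(8, 1.0))                  # 1.0 -> the pool acts like 1 GB
print(effective_vram_gb(8, 1.0, mirrored=False))  # 8.0 -> only if memory were shared
```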


----------



## Jansku07 (Jul 13, 2008)

ot/ That guy from the Mahabharata; what's he drinking (milk)? /ot Wasn't there a project that used four 9800 GX2s to crunch numbers? Were they using a mobo with four x16 PCI-E slots? I faintly remember this, so I might be wrong.


----------



## btarunr (Jul 13, 2008)

Jansku07 said:


> ot/ That guy from the Mahabharata; what's he drinking (milk)? /ot Wasn't there a project that used four 9800 GX2s to crunch numbers? Were they using a mobo with four x16 PCI-E slots? I faintly remember this, so I might be wrong.



For GPGPU work it isn't essential that multiple cards be arranged in an SLI/CrossFire array of some sort; the parallelism is brought about by the driver. Remember, there's no video to play with, no SLI bridge to transport frames to a 'primary' card.

In reality, you can't arrange four 9800 GX2s in SLI. I wonder how they did it with four R700s though, since the PCB has only one CrossFireX finger.


----------



## CDdude55 (Jul 13, 2008)

OMFG... you will need a beast of a PC to handle that, and the cooling will be hell (literally).


----------



## TheMailMan78 (Jul 13, 2008)

I just heard "QUAD DAMAGE" in a quakish voice from the AMD camp.


----------



## btarunr (Jul 13, 2008)

Haha. I miss Q3A. 

The cons are many... 8 GPUs, and say hello to heat, power, motherboard, and micro-stutter issues, etc. But at the end of the day, the fastest available consumer graphics would be made by ATI. That adds brand value all the way down to the Radeon HD 4450: "Hey, these guys make the fastest graphics configurations in the industry."


----------



## Jansku07 (Jul 13, 2008)

> For GPGPU it isn't essential that multiple cards be arranged in an SLI / Crossfire array of some sort. Parallelism is brought about by the driver. Remember, there's no video to play with, no SLI bridge to transport frames to a 'primary' card.


I know that. I wasn't asking about 9800 GX2 octa-SLI support, but the following:
1) Wasn't there a project that used four 9800 GX2s to *crunch numbers*? (crunch numbers, NOT do video)
2) Were they using a mobo with four x16 PCI-E slots?
3) /ot What's the guy drinking in your picture?


----------



## Disparia (Jul 13, 2008)

Jansku07 said:


> I know that. I wasn't asking about 9800 GX2 octa-SLI support, but the following:
> 1) Wasn't there a project that used four 9800 GX2s to *crunch numbers*? (crunch numbers, NOT do video)
> 2) Were they using a mobo with four x16 PCI-E slots?
> 3) /ot What's the guy drinking in your picture?



Yup: http://fastra.ua.ac.be/en/

An MSI K9A2 Platinum was used: four x16 slots, running at x8.


----------



## Jansku07 (Jul 13, 2008)

Thanks for the heads-up. =) So there aren't any boards with four x16 PCI-E (2.0) slots available anywhere?


----------



## btarunr (Jul 13, 2008)

Jansku07 said:


> I know that. I wasn't asking about 9800 GX2 octa-SLI support, but the following:
> 1) Wasn't there a project that used four 9800 GX2s to *crunch numbers*? (crunch numbers, NOT do video)
> 2) Were they using a mobo with four x16 PCI-E slots?
> 3) /ot What's the guy drinking in your picture?



1. I didn't say they did 'video' either. I said GPGPU (general-purpose computing on GPUs), i.e. 'crunching numbers'. You can add as many cards as there are slots and PCI-E switches available; you just can't call it 'multi-GPU', except that each 9800 GX2 has its own local SLI array.

2. Maybe, I don't know. They can't have used Skulltrail, since its last two PCI-E slots are adjacent and the 9800 GX2 is a dual-PCB card. They must've used some nForce Professional-based workstation board with 4 slots spaced out.

3. Not drinking; blowing air into a conch (an ancient-Indian war cry).


----------



## Jansku07 (Jul 13, 2008)

I misunderstood that you misunderstood me. Sorry


----------



## Weer (Jul 13, 2008)

In THEORY this would be like having Quad-Sli GTX 280's.

BUT, let's see how the drivers hold the 8 cores together.


----------



## Megasty (Jul 13, 2008)

btarunr said:


> 1. I didn't say they did 'video' either. I said GPGPU (general-purpose computing on GPUs), i.e. 'crunching numbers'. You can add as many cards as there are slots and PCI-E switches available; you just can't call it 'multi-GPU', except that each 9800 GX2 has its own local SLI array.
> 
> *2. Maybe, I don't know. They can't have used Skulltrail, since its last two PCI-E slots are adjacent and the 9800 GX2 is a dual-PCB card. They must've used some nForce Professional-based workstation board with 4 slots spaced out.*
> 
> 3. Not drinking; blowing air into a conch (an ancient-Indian war cry).



The Foxconn Transformer F1 would work:


----------



## Disparia (Jul 13, 2008)

If we were to split hairs, there are flexible x16-to-x16 riser adapters that would allow the use of double-slot cards even if the slots are single-spaced.


----------



## mandelore (Jul 13, 2008)

Weer said:


> In THEORY this would be like having Quad-Sli GTX 280's.
> 
> BUT, let's see how the drivers hold the 8 cores together.



Wouldn't it be more like having 6-way SLI with 280s, since the 4870 X2 beats the crap out of a single 280?


----------



## Cybrnook2002 (Jul 13, 2008)

PrudentPrincess said:


> IM HUNGRY



You should edit to "DEEEEEEAAAAAAAMNNN"


----------



## Polarman (Jul 13, 2008)

Wow! this is pretty insane. Now you can hook up your PC to your central heating and heat up the whole house for winter!


----------



## btarunr (Jul 13, 2008)

Megasty said:


> The Foxconn Transformer F1 would work:



Yeah, this is the X48 board (that didn't make it) that I was talking about in the OP. Nice find.


----------



## CDdude55 (Jul 13, 2008)

Those are some long boards.


----------



## Waldoinsc (Jul 13, 2008)

Megasty said:


> Gah, now I'm going to have to use both of my 1000watters in one machine



Start pulling that much electricity out of one socket and it can trip the circuit breaker.

Most US houses have a 20 A, 120 VAC breaker. I read an article several months ago saying the practical upper limit for PSUs in US households is about 1800 W (more efficient PSUs can carry higher ratings).
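The ~1800 W figure is just circuit arithmetic: 120 V × 15 A = 1800 W at the wall. A rough sketch, with the caveat that the 80% continuous-load derating applied below is standard US wiring practice rather than anything from the article:

```python
# US branch-circuit arithmetic behind the "~1800 W ceiling".
# Continuous loads are conventionally derated to 80% of the breaker rating.
def circuit_limit_watts(volts: float = 120.0, amps: float = 15.0,
                        continuous_derate: float = 0.8) -> float:
    """Sustained wattage a breaker can supply under the 80% convention."""
    return volts * amps * continuous_derate

print(circuit_limit_watts())           # ~1440 W continuous on a 15 A circuit
print(circuit_limit_watts(amps=20.0))  # ~1920 W continuous on a 20 A circuit
```

So even a nominal 15 A circuit tops out well below what a hypothetical four-card rig plus monitor and the rest of the room might ask for.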


----------



## mlutag (Jul 13, 2008)

You should all be worried instead of celebrating... why?
Well, think about it: the manufacturers are really going to suffer from this kind of war between NVIDIA and ATI. The third-party producers can't be in a good situation right now. And why should the consumer care? Well, look at the whole NVIDIA mobile-GPU fiasco. When manufacturers suffer in this way they start to cut corners, which means shoddy products end up on the market. I wouldn't be surprised if warranty periods were suddenly reduced. We're not all rich folks; I need to spend my hard-earned cash to buy a new GPU, and I expect to get my money's worth. If this GTX 280 dies after a year, I can't even sell it.


----------



## bill_d (Jul 13, 2008)

Waldoinsc said:


> Start pulling that much electricity out of one socket and it can trip the circuit breaker.
> 
> Most US houses have a 20 A, 120 VAC breaker. I read an article several months ago saying the practical upper limit for PSUs in US households is about 1800 W (more efficient PSUs can carry higher ratings).



Most breakers are 15 A for outlets and lighting, and can service more than one room.


----------



## CDdude55 (Jul 13, 2008)

I agree with the above poster (mlutag). I'm just going to buy a single NVIDIA card and shrug this off. This is really exciting for people who benchmark, though, and for people who can go skiing in Aspen and still have money left over to go to Disneyland and pay all their bills at the same time. Which is probably a lot of you guys. But I don't need the best; I just want to get one 8800 or 9800 and play my games on high.


----------



## Waldoinsc (Jul 13, 2008)

I'm not current on the specs... I had a conversation with a home builder a couple of months ago about this (the increasing power requirements of high-end PCs and the limits of house circuits). He indicated that many builders put in 20 A circuit breakers. I don't know if the building codes have changed or not.


----------



## X-TeNDeR (Jul 13, 2008)

Polarman said:


> Wow! this is pretty insane. Now you can hook up your PC to your central heating and heat up the whole house for winter!



^ That's even more value added to the cards.
They can do gaming, crunch numbers, and keep it nice and warm for the night too.


----------



## LiveOrDie (Jul 13, 2008)

Too bad 95% of games won't use all 8 GPUs, and driver support will be bad. Plus, most games are only 32-bit, so they won't even touch most of the memory.


----------



## btarunr (Jul 13, 2008)

Live OR Die said:


> Too bad 95% of games won't use all 8 GPUs, and driver support will be bad. Plus, most games are only 32-bit, so they won't even touch most of the memory.



Games needn't. It's the cards that will render frames taking turns, directed by the driver.


----------



## HTC (Jul 13, 2008)

Live OR Die said:


> *Too bad 95% of games won't use all 8 GPUs, and driver support will be bad*, plus most games are only 32-bit, so they won't even touch most of the memory.



I imagine that's the reason why drivers supporting 8 GPUs don't exist yet.


----------



## Megasty (Jul 13, 2008)

Waldoinsc said:


> I'm not current on the specs... I had a conversation with a home builder a couple of months ago about this (the increasing power requirements of high-end PCs and the limits of house circuits). He indicated that many builders put in 20 A circuit breakers. I don't know if the building codes have changed or not.



Most houses in the US built before 1997 are 15 amp (about 95%). Only houses built within the last 10 yrs are 20 amp, and even those have to be warranted by the owner for a 20 amp breaker. The house I bought in '95 had a 15-amper that I had to have upgraded about 5 yrs ago. I have 13 computer users.

I can easily run this 1600 W monster even with all the other PCs on at the same time. However, a PC that uses that much power (8 GPUs) doesn't even qualify as overkill anymore, since 2 of those cards would be doing nothing but sucking juice.


----------



## LiveOrDie (Jul 13, 2008)

Live OR Die said:


> Too bad 95% of games won't use all 8 GPUs, and driver support will be bad. Plus, most games are only 32-bit, so they won't even touch most of the memory.



Might as well just buy a Quadro and start making my own games.


----------



## btarunr (Jul 13, 2008)

mlutag said:


> You should all be worried instead of celebrating... why?
> Well, think about it: the manufacturers are really going to suffer from this kind of war between NVIDIA and ATI. The third-party producers can't be in a good situation right now. And why should the consumer care? Well, look at the whole NVIDIA mobile-GPU fiasco. When manufacturers suffer in this way they start to cut corners, which means shoddy products end up on the market. I wouldn't be surprised if warranty periods were suddenly reduced. We're not all rich folks; I need to spend my hard-earned cash to buy a new GPU, and I expect to get my money's worth. If this GTX 280 dies after a year, I can't even sell it.



"ATI versus NVIDIA" and "AMD versus Intel" can be seen from many perspectives. I see it not as just another "Boeing versus Airbus" competition, but as one company trying to prevent the other from monopolizing the industry, because if either monopolizes the industry the way Microsoft has with software, we're all doomed.

Thank AMD for preventing Intel from selling you a 5.00 GHz, 200 W, $1000 NetBurst CPU. Thank ATI for preventing NVIDIA from selling you a $350 9800 GTX, or better still, a $500 8800 GTX (after R680 demolished it). The day either Intel or NVIDIA monopolizes the industry, the victims are going to be us, the users. Users turn losers.


----------



## imperialreign (Jul 13, 2008)

One more reason to move to Vista if you want to run 8 GPUs, eh? 


TBH, I think this is ATI flexing their muscle at this point; I wouldn't be surprised if, within the next few GPU generations, the majority of their cards are dual-GPU.


Sure, those of us common consumers would never need more than maybe 4 GPUs; but for people working in industries that need all the rendering power they can get their hands on, being able to run 8 GPUs in cooperation would be a blessing.


I really don't think this is anything more than one of those "look what I can do!!" moments. A big fat :P to the green camp.


----------



## Waldoinsc (Jul 13, 2008)

Megasty said:


> Most houses in the US built before 1997 are 15 amp (about 95%). Only houses built within the last 10 yrs are 20 amp, and even those have to be warranted by the owner for a 20 amp breaker. The house I bought in '95 had a 15-amper that I had to have upgraded about 5 yrs ago. I have 13 computer users.
> 
> I can easily run this 1600 W monster even with all the other PCs on at the same time. However, a PC that uses that much power (8 GPUs) doesn't even qualify as overkill anymore, since 2 of those cards would be doing nothing but sucking juice.



Yeah, I was in the process of building/installing a server closet in my house when this came up. Now I'm relocating for work and want to make sure the new house can handle future requirements; I currently have 5 computers in the house.

I think the 8-GPU capability is an awesome development for all of us, even if everyone can't use it or afford it now; it pushes both companies forward and lets the developments trickle down to everyone.


----------



## bill_d (Jul 13, 2008)

I think 2 X2s will be more than enough for most games at 2560x1600 on my 30" LCD.
Well, maybe not Crysis.
But these 8 GPUs may get DreamWorks back.


----------



## LiveOrDie (Jul 13, 2008)

I'd pay more for a single card with only 2 GB of RAM that worked better in games than for 4x R700 that I could cook my dinner on, because by the time games actually run on 8 GPUs, PCI-E 2.0 will be out of the ark.


----------



## trt740 (Jul 13, 2008)

mlutag said:


> You should all be worried instead of celebrating... why?
> Well, think about it: the manufacturers are really going to suffer from this kind of war between NVIDIA and ATI. The third-party producers can't be in a good situation right now. And why should the consumer care? Well, look at the whole NVIDIA mobile-GPU fiasco. When manufacturers suffer in this way they start to cut corners, which means shoddy products end up on the market. I wouldn't be surprised if warranty periods were suddenly reduced. We're not all rich folks; I need to spend my hard-earned cash to buy a new GPU, and I expect to get my money's worth. If this GTX 280 dies after a year, I can't even sell it.



Well, the good thing is you won't need to sell it, because it's monster fast already. But I do get your point.


----------



## btarunr (Jul 13, 2008)

Live OR Die said:


> I'd pay more for a single card with only 2 GB of RAM that worked better in games than for 4x R700 that I could cook my dinner on, because by the time games actually run on 8 GPUs, PCI-E 2.0 will be out of the ark.



Had the 9800 GX2 supported 3-way SLI, I seriously doubt you would've come up with that argument.


----------



## [I.R.A]_FBi (Jul 13, 2008)

Fanboyism has been on the rise of late. Sour grapes, green camp?


----------



## Megasty (Jul 13, 2008)

mlutag said:


> You should all be worried instead of celebrating... why?
> Well, think about it: the manufacturers are really going to suffer from this kind of war between NVIDIA and ATI. The third-party producers can't be in a good situation right now. And why should the consumer care? Well, look at the whole NVIDIA mobile-GPU fiasco. When manufacturers suffer in this way they start to cut corners, which means shoddy products end up on the market. I wouldn't be surprised if warranty periods were suddenly reduced. We're not all rich folks; I need to spend my hard-earned cash to buy a new GPU, and I expect to get my money's worth. If this GTX 280 dies after a year, I can't even sell it.



The manufacturers that NV & AMD contract with are only 'worried' about the respective products they put out. The most viable component by far is the GPU itself. All the other manufacturers have the greatest confidence in their products; the GPU doesn't have that, due to yield levels and the way it's binned. However, the main design itself always rests on the shoulders of NV & AMD.

Look at the 8800 GTX. When it came out, all the different derivatives of it were $800-1000. A year later you could pick it up for $400-500, and 2 years later you can get it on eBay for $150-200. The GTX 280 was $650, a month later $500. If things keep going like they are, and they will, it'll be about $200 in 11 months or so. The best thing NV buyers can do is buy what they can, then step up when the next thing comes out. The resale value of this thing is going to crash soon, so live it up while you can.


----------



## newconroer (Jul 13, 2008)

All I have to say to this is... well, this:

If you need four GPUs to play your games, or hell, even to do your multimedia work, you are fu**ing with one seriously demanding application.

So much so that I would question its coding...


"Look at the 8800 GTX. When it came out, all the different derivatives of it were $800-1000. A year later you could pick it up for $400-500, and 2 years later you can get it on eBay for $150-200. The GTX 280 was $650, a month later $500. If things keep going like they are, and they will, it'll be about $200 in 11 months or so."


Which begs the question of why everyone is so up in arms over its price....


----------



## imperialreign (Jul 13, 2008)

newconroer said:


> All I have to say to this is ..well this :
> 
> If you need four GPUs to play your games, or hell, even to do your multimedia work, you are fu**ing with one seriously demanding application.
> 
> So much so that I would question its coding...




I'm sorry . . . but the first thing to pop into my head after reading that was Crysis


----------



## mlupple (Jul 13, 2008)

fitseries3 said:


> Honestly..... WTF?
> 
> 3 GPUs scale well, but they're still having problems getting 4 GPUs to scale as well as 3. You can't add 4 more and expect it to work if 4 doesn't even perform up to par.
> 
> You don't add 4 extra tires to a car and expect it to go faster.



Yea, and this guy knows better than ATI and the rest of us.


----------



## W1zzard (Jul 13, 2008)

wow wtf is going on .. 

"So it has eight GPUs; drivers will not be supporting eight GPUs at this point of time"

stop dreaming .. there is no promise that the driver will ever support 8 gpus. everything in the driver is engineered to support a maximum of four active gpus. 

"drivers will not be supporting 4597498547 GPUs at this point of time"
omg ?! ati working on 4597498547-gpu computer?

all raja said is that they inserted 4 r700 cards into a motherboard, not that they powered it, not that it worked, not that it had been used for anything... NOTHING. you people just got so PR-owned by techtree


----------



## btarunr (Jul 13, 2008)

"Drivers will not be supporting eight GPUs at this point of time" are his words. If the driver were never going to support 8 GPUs, would he have used "at this point of time"? Maybe Chris PR-owned us.


----------



## HTC (Jul 13, 2008)

W1zzard said:


> wow wtf is going on ..
> 
> "So it has eight GPUs; drivers will not be supporting eight GPUs at this point of time"
> 
> ...



You're right there 



> *TT*: How soon can we see the 4800 X2s?
> 
> Chris: The HD 4870 X2 is going to be available in August. There will be two different flavors for that, available at different price points.
> 
> As an aside, AMD has already built a computer that has four 4870X2s in it. So it has eight GPUs; drivers will not be supporting eight GPUs at this point of time.



I was so caught up in the "WOW: 8 GPUs!" that I didn't even notice they *didn't say anything about it running*.

@btarunr: I should have read it more closely.


----------



## Dia01 (Jul 13, 2008)

btarunr said:


> "Drivers will not be supporting eight GPUs at this point of time" are his words. If the driver were never going to support 8 GPUs, would he have used "at this point of time"? Maybe Chris PR-owned us.



That would be a shame; I was hoping to replace my 4-slot toaster so I could eat and render the blue screen of death at 1,000,000 FPS. Maybe 8 GPUs didn't meet the Kyoto Protocol?


----------



## Megasty (Jul 13, 2008)

HTC said:


> You're right there
> 
> 
> 
> ...



That's all it really means. It's just like going and getting an MSI K9A2 and plugging four 3870 X2s into it: power all the cards and set 2 CF bridges across each pair. Will the rig turn on? Yeah, but that's it. Two of the cards would work while the other two just suck juice.


----------



## W1zzard (Jul 13, 2008)

btarunr said:


> "Drivers will not be supporting eight GPUs at this point of time" are his words. If the driver were never going to support 8 GPUs, would he have used "at this point of time"? Maybe Chris PR-owned us.



"never" is such a permanent word. what about gpu drivers in 10 years? 

what raja did confirm though is that 8-gpu configurations DO NOT work (at this time). even though the ati fanboys will not be happy it is still some valid input.


----------



## Wile E (Jul 13, 2008)

Waldoinsc said:


> Start pulling that much electricity out of one socket and it can trip the circuit breaker.
> 
> Most US houses have a 20 A, 120 VAC breaker. I read an article several months ago saying the practical upper limit for PSUs in US households is about 1800 W (more efficient PSUs can carry higher ratings).


That's only a problem if you pull around 2200-2400 W from the wall. Seeing as PSUs only draw what they need, even with 2000 W of PSU in his rig and four 4870 X2s, I doubt the power draw would ever come close to that. The limit is more like 1600-1800 W on a 15 A circuit.
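The "PSUs only draw what they need" point is worth spelling out: a PSU's wattage rating is its DC output capacity, while wall draw is the actual DC load divided by efficiency. A small sketch, with the ~80% efficiency and the 1200 W load figure both being assumptions for illustration:

```python
# Wall draw vs. PSU rating: a 2000 W PSU under a 1200 W DC load does not
# pull 2000 W from the socket; it pulls load / efficiency.
def wall_draw_watts(dc_load_w: float, efficiency: float = 0.80) -> float:
    """AC power drawn from the wall for a given DC load (assumed efficiency)."""
    return dc_load_w / efficiency

# Hypothetical 4x dual-GPU rig drawing ~1200 W on the DC side under load:
print(wall_draw_watts(1200))  # ~1500 W from the wall
```

So the breaker sees the load-dependent draw, not the number on the PSU's label.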


----------



## Waldoinsc (Jul 13, 2008)

Wile E said:


> That's only a problem if you pull around 2200-2400W from the wall. Seeing as psu's only use what they need, even with 2000W of psu in his rig and 4 4870x2's, I doubt power draw will ever come close to that. It's 1600-1800W on a 15A circuit.



What, don't you think Crysis would trip breakers in your house if it could?


----------



## zithe (Jul 13, 2008)

I think I'm gonna go emo.

I wanna see this for real. XD!


----------



## yogurt_21 (Jul 13, 2008)

Wile E said:


> Meh, 2 GTX480 rads would be plenty. lol.



One might actually do it; those things are crazy.

But yeah, you'd have to step up the cooling to run it 24/7.


----------



## Wile E (Jul 14, 2008)

yogurt_21 said:


> one might do it actually those things are crazy
> 
> but yeah you'd have to go to a higher cooling to run it 24/7



I know. I would just like an excuse to run 2. lol.


----------



## LiveOrDie (Jul 14, 2008)

btarunr said:


> Had 9800 GX2 supported 3-way SLI, I seriously doubt you would've come up with that argument.



I had a 9800 GX2 and that was bad enough; the performance in games was all over the place, up and down it went. Games don't always like multi-GPU cards. Like I said, a single-GPU card that can max out any game is good enough for me. Anyway, I upgrade every 6-12 months, so by the time Intel's new chip comes out there'll be something better out anyway.


----------



## Siman0 (Jul 14, 2008)

It's possible; a few college students did it with the 9800 GX2 in the FASTRA supercomputer: http://fastra.ua.ac.be/en/index.html. It can be made to work, so I wouldn't go around yelling that it can't be done. Note they also did it on an AMD processor. (Going to be flamed by Intel fans every 3.4 sec after I make this edit, I know it.) But it wasn't designed to play games, so I can't imagine it got the best FPS.


----------



## zithe (Jul 14, 2008)

They wrote drivers to have the video cards perform CPU tasks, so I don't think that counts. It is, however, interesting that they managed what they did.

Edit: They should do the same thing with four 4870 X2s!


----------



## WarEagleAU (Jul 15, 2008)

Holy hell, talk about super graphics. You could run a 500' hd projector with all that muscle.


----------

