# ATI R680 PCB Design and Cooling Early Pics



## malware (Dec 16, 2007)

Here are some leaked pictures of the ATI R680, courtesy of ChipHell. What you'll see on the PCB are 2x RV670XT GPUs and one PLX chip handling communication between the two cores. All sixteen memory chips (8 on the front and 8 on the back side) are missing from the board, probably because of the early development stage (this is not a finished product). The source said the card is using 0.7 ns GDDR4 memory.






*View at TechPowerUp Main Site*


----------



## ccleorina (Dec 16, 2007)

WTF..... Damn... Long card.... Man... Have to change cards and sell the old one again.... I will have one of these cards.... ATI rocks....


----------



## REVHEAD (Dec 16, 2007)

I am genuinely excited about these. I have had it with my 8800GTX; if the price is right I will be running 2 of these in my X38 DQ6 board.


----------



## Necrofire (Dec 16, 2007)

*drools.

So much for fitting that into my mini tower with my micro-atx mobo.


----------



## tkpenalty (Dec 16, 2007)

That card is as long and chunky as a GTX lmao.


----------



## tvdang7 (Dec 16, 2007)

So is this any different from running Crossfire 3870s?


----------



## mandelore (Dec 16, 2007)

tkpenalty said:


> That card is as long and chunky as a GTX lmao.



Only that this one sports 2 dies, and the GTX just one 

I think ATI can be allowed that one


----------



## Kursah (Dec 16, 2007)

Question is, could they run 4 of these in CrossfireX? Imagine harnessing the power of 8 GPUs! I'm sure support for something like that is far off and distant, but if it's properly usable, that would be quite a performance feat.

Looks interesting for sure, but I'm waiting to see what the actual product is capable of.


----------



## DOM (Dec 16, 2007)

wheres the mem on the card


----------



## Judas (Dec 16, 2007)

DOM_ATI_X800XL_PCI-E said:


> wheres the mem on the card



Damn, you beat me to it, I was going to say the same


----------



## ccleorina (Dec 16, 2007)

Hey guys... from what I know, you can run only 2 of these cards in Crossfire, because there's only one Crossfire bridge connector.

Hope it comes with 1 or 2 GB of GDDR4


----------



## btarunr (Dec 16, 2007)

Look at the comedy:

NVidia's best cards have two SLI bridge extensions. This card, being the best one from ATI, has only one. The AMD Spider got pwned even before it took off.

Where the hell are the memory banks??? Please don't tell me it's going to be AMD Cockroach (an evolution of HyperMemory + AMD's weird naming schemes).

EDIT: the card is still in the development stage, hence no memory banks. I guess you can make a guess at how it performs right now. 

1. Take two HD3870 cards, run them across an x8,x8-lane Crossfire on a board like the Gigabyte GA-790X-DS4. Voila! Benchmarks ready. I really don't think the PCI Express lane arbiter made by PLX does anything but assign tasks to the GPUs. Well, that's what arbiters are meant for.  

2x RV680 cards === 4x HD3870
AMD Crossfire X === Good as dead.


----------



## badsykes (Dec 16, 2007)

I am curious if 3x 7950GX2 can be installed in SLI.


----------



## Assimilator (Dec 16, 2007)

Instead of wasting money on developing niche products that may never see the light of day (Spyder anyone?), ATI should be trying to fix the R600 series' crappy AA performance - or even better, pushing to get R700 out the door before nVidia releases their next monster GPU.


----------



## Deleted member 30823 (Dec 16, 2007)

I thought they would put 2 cores in the same die, like CPU's, or even 4 cores!!


----------



## Aeon19 (Dec 16, 2007)

HUUUUGEE!!


----------



## EastCoasthandle (Dec 16, 2007)

The R680, aka RV670 X2, is a two-GPU, single-PCB video card using a PLX switch. So far it appears that it will use a total of 16 memory ICs (8 on the front, 8 on the rear of the card), supposedly totalling 1 GB of Samsung 0.7 ns GDDR4 at 1400 MHz (nothing confirmed as of yet).

The PLX chip for the RV670 X2 is unknown, but we do know that the HD 2600 X2 used a PLX PEX 8547, a multi-purpose PCI Express lane *switch*. This chip has a latency of 110 ns (x16 to x16) and is a 48-lane, 3-port PCI Express switch. Here is a review of the HD 2600 X2. My only issue with that review is that AA should have been used; using AF only tells half the story at higher resolutions. Here is another review.

If a guess were made, the PLX chip on the RV670 X2 should be a gen-2 part, also with 48 lanes, which would make it a PEX 8648, but there is still no confirmation on this yet. Only time will tell if this is true or not. Ultimately this video card will be very dependent on how well profiled and tuned for multi-GPU usage each game is. That can be worrisome if you look at past CF and SLI gaming compatibility.
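As an aside, the quoted 0.7 ns chip rating and the 1400 MHz figure are consistent: a DRAM chip's rated cycle time is just the inverse of its clock, and vendors round down to the marketed speed. A quick sketch of that arithmetic (the 256-bit bus per GPU matches the plain RV670; treating GDDR4 as double data rate is an assumption on my part):

```python
def ns_to_mhz(cycle_ns):
    """A DRAM chip's rated cycle time (ns) is the inverse of its clock (MHz)."""
    return 1000.0 / cycle_ns

def bandwidth_gbs(clock_mhz, bus_bits, transfers_per_clock=2):
    """Peak memory bandwidth in GB/s: clock * data-rate multiplier * bus bytes."""
    return clock_mhz * 1e6 * transfers_per_clock * (bus_bits / 8) / 1e9

print(round(ns_to_mhz(0.7)))               # -> 1429, marketed as 1400 MHz
print(round(bandwidth_gbs(1400, 256), 1))  # -> 89.6 GB/s per GPU at stock clock
```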


----------



## btarunr (Dec 16, 2007)

This product will last no more than a week. I've been screaming in so many threads now: ATI should work on a powerful GPU, not this. How much time does it take NVidia to slap two 8800 GT cores onto a board and roll out an 8800 GX2?? What's going to happen to this two-headed jerk then?


*Increase ROP count!!!*

 ^ An acronym that ATi has turned a blind eye to.


@EastCoastHandle

Dude, just because I used the non-technical term "arbiter" doesn't mean I was getting at something other than what you highlighted. Contextually, the word arbiter means switch. PLX is not the only company that made these for the HD2000 X2; some cards even used one made by IDT.


----------



## wazzledoozle (Dec 16, 2007)

ATI is the new 3dfx







This killed 3dfx.


----------



## btarunr (Dec 16, 2007)

wazzledoozle said:


> ATI is the new 3dfx
> 
> 
> 
> ...



EXACTLY MY FRIEND

This is exactly what 3DFX went through before it collapsed. The AMD/ATi merger had a lot of its top Canadian engineers quit in protest of the merger. Look what it led to.


----------



## EastCoasthandle (Dec 16, 2007)

btarunr said:


> This product will last not more than a week, I've been screaming in so many threads now, ATI should work on a powerful GPU and not this. How much time does it take for NVidia to slap two 8800 GT cores onto a board and roll out a 8800 GX2?? What's going to happen to this two headed jerk then.
> 
> 
> *Increase ROP count!!!*
> ...


Get your emotions in check I wasn't talking about you


----------



## btarunr (Dec 16, 2007)

Which is why I didn't quote you. No malice.

BTW, this is the IDT switch I was talking about


----------



## UnXpectedError (Dec 16, 2007)

wazzledoozle said:


> ATI is the new 3dfx
> 
> 
> 
> ...



My exact thoughts, I actually was gonna post something like that.

ATI is doomed, and no new cards for 2008, wtf, ugg


----------



## snuif09 (Dec 16, 2007)

What about voodoo 6000 style 4 chips on a pcb


----------



## btarunr (Dec 16, 2007)

snuif09 said:


> What about voodoo 6000 style 4 chips on a pcb



You'll need liquid cooling or a four slot cooler to keep 4 RV670s in operational conditions.


----------



## [I.R.A]_FBi (Dec 16, 2007)

DOM_ATI_X800XL_PCI-E said:


> wheres the mem on the card






> All the sixteen memory chips (8 on the front and 8 on the back side) are missing from the board, probably because of the early development stage (that's not a finished product)



Read, folks, don't just look at the pretty pix0rz


----------



## Basard (Dec 16, 2007)

wazzledoozle said:


> This killed 3dfx.



I think partially.... Mostly it was Nvidia and ATI that killed 3dfx, that and 3dfx thought it was a good idea to make THEIR OWN 3D API.

Why not just make a dual-core GPU? Why bother with these huge boards? Maybe they should make a separate add-in card to hold the voltage regulators and mosfets (joking)? 

And don't a lot of the 8800 series have 16 ROPs, just like the HD2900 and 3850/70? Yet Nvidia still pulls ahead with the same number of ROPs... I don't think it's the ROPs' fault, completely, it's gotta be something else.


----------



## regan1985 (Dec 16, 2007)

if the price is good then its great for amd/ati and it could lower the price of current cards, but if its not cheap why not just go crossfire way


----------



## [I.R.A]_FBi (Dec 16, 2007)

Basard said:


> I think partially.... Mostly it was Nvidia and ATI tha killed 3dfx, that and 3dfx thought it was a good idea to make THEIR OWN 3D API.
> 
> Why not just make a dual core gpu? Why bother with these huge boards? Maybe they should make a separate add in card to hold the voltage regulators and mosfets (joking)?
> 
> *And don't a lot of the 8800 series have 16 ROPs just like the HD2900 and 3850/70?  Yet Nvidia still pulls ahead with the same amount of ROPs...  I don't think it's the ROP's fault, completely, it's gotta be something else.*



Read this


----------



## jaystein (Dec 16, 2007)

wazzledoozle said:


> ATI is the new 3dfx
> 
> 
> 
> ...


I have one of those in my closet. The Voodoo5 5500 was a badass card. So even if ATI does go down, at least they will go out with a bang.


----------



## sam0t (Dec 16, 2007)

The 7950GX2 did not kill Nvidia, so why would this card bury AMD/ATI? 

To me this seems much more elegant than Nvidia's Frankenstein (7950GX2), two cards strapped up as one with some MacGyver tape.


----------



## btarunr (Dec 16, 2007)

sam0t said:


> 7950GX2 did not kill Nvidia, so why would this card bury AMD/ATI ?
> 
> To me this seems much more elegant than Nvidias Frankenstein (7950GX2) with two cards strap up as one with some Macgyver tape.





Because at that time, when ATI was dominating the market with the X1900 and X1950, NVidia was busy working on the G80 because DX10 was around the corner, and it paid off. NVidia was well prepared for DX10, unlike ATi, which rolled out its first DX10 offering months after NV. Even today, NV is two steps ahead of ATi.

BTW, the 7950 GX2 is the FASTEST DX9 card. Period. It was just a makeshift card to tackle the X1950 XTX and keep consumers' attention.


----------



## prophylactic (Dec 16, 2007)

Kursah said:


> Question is, could they run 4 of these in CrossfireX? Imagine harnessing the power of 8 GPUs! I'm sure support for something like that is far off and distant, but I'm pretty sure if properly usable, that would be quite a performance feat.
> 
> Looks interesting for sure, but I'm waiting to see what the actual product is capable of.




A spider has eight legs?  The implications of the name aren't too terribly subtle.


----------



## Chewy (Dec 16, 2007)

sam0t said:


> 7950GX2 did not kill Nvidia, so why would this card bury AMD/ATI ?
> 
> To me this seems much more elegant than Nvidias Frankenstein (7950GX2) with two cards strap up as one with some Macgyver tape.



NV wasn't worth under 5 billion at that time. Remember, AMD/ATI is the underdog; they don't sell as much as Intel/Nvidia. Intel owns like 80% or more of the market even when AMD was better, and NV pretty much does the same with its marketing etc.

AMD has been in a tight situation; they're worth less than what they paid for ATI.


I think the Spider might have to do with quad cores and quad GPUs.


----------



## prophylactic (Dec 16, 2007)

Well, there's also the issue with which ATi fans have to deal, in the "Way It's Meant to be Played."  As to say, being that the Green Giant has more money than ATi, they can easily pay developers to "harness" bits of their cards; thus, causing a performance differential in this regard.  Fuck fairness, of course, it's about who has money.  I'm not a fanboy.  I don't care which card I use, but I'm also highly averse to the notion of developers necessarily expressing any degree of favoritism in the developmental process of the games.


----------



## btarunr (Dec 16, 2007)

NVidia's developer relations played a key role in grabbing its market shares. It knows that people read benchmarks before they buy a card so if they fool around with a few numbers like FPS or 3DMarks, they've got the market.....c'mon 320 Stream processors, 740 MHz core, 512 bit memory.....all of that should've translated into something....poor HD2900 XT.


----------



## [I.R.A]_FBi (Dec 16, 2007)

btarunr said:


> NVidia's developer relations played a key role in grabbing its market shares. It knows that people read benchmarks before they buy a card so if they fool around with a few numbers like FPS or 3DMarks, they've got the market.....c'mon 320 Stream processors, 740 MHz core, 512 bit memory.....all of that should've translated into something....poor HD2900 XT.



read this


----------



## acperience7 (Dec 16, 2007)

So how much power consumption are we talking about, even with their new power-saving design? I can't help but think this card will wind up like the 2900XT: great performance on paper, but overpriced and underwhelming for its specs.


----------



## Grings (Dec 16, 2007)

ah, but this didnt kill ati (it did flop however)


----------



## rhythmeister (Dec 16, 2007)

Fingers crossed this eats the 8800gtx; dunno how it'll fit in this Lanbox lite tho


----------



## kwchang007 (Dec 16, 2007)

Back to 2900XT heat and power levels, I see. At least it should be faster than what Nvidia has out now. But maybe when they roll out the 9*00 series it won't be king of the hill anymore... hope this is just a stopgap for ATI.


----------



## TXcharger (Dec 16, 2007)

If y'all are that desperate for a card to beat the 8800GTX... sad... especially considering this is a dual-GPU card and the 8800GTX has been out over a year with a single GPU...

ATi is screwed...


----------



## DarkMatter (Dec 16, 2007)

[I.R.A]_FBi said:


> Read this



I don't really get what you want to point out with the link, except how different those architectures are from each other. First, the link is about R600, and it's compared to G80 there; I'm sure he was talking about G92. HD2000 and HD3000 have a higher pixel fillrate than G92:
HD2900 = 11888 MP/s 
HD3870 = 12400 MP/s
HD3850 = 10700 MP/s
8800GT = 9600 MP/s
The article at AnandTech even suggests that AMD's ROP architecture is more balanced, to say the least, and never worse:


> If we compare this setup with G80, we're not as worried as we are about texture capability. G80 can complete 24 pixels per clock (4 pixels per ROP with six ROPs). Like R600, G80 is capable of 2x Z-only performance with 48 Z/stencil operations per clock with AA enabled. When AA is disabled, the hardware is capable of 192 Z-only samples per clock. The ratio of running threads to ROPs is actually worse on G80 than on R600. At the same time, G80 does offer a higher overall fill rate based on potential pixels per clock and clock speed.


So if ROP capacity were the problem, G92 would never perform better than the Radeons. Back when they wrote the article, someone could have concluded that even if the ROPs on G80 are not as efficient, there are more of them and thus it performed better. That's not the case, though, as G92 proves otherwise.
In the end, he is right when he says it's gotta be something else. Whether it is texturing power or shading power is really difficult to say, since on G80/G92 the SPs and TUs are tied together, and at the same time doubling the texture addressing power in G92 didn't bring any significant improvement over G80.
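For what it's worth, the fillrates quoted above are just ROP count × core clock (one pixel per ROP per clock). A minimal sketch, using the stock core clocks these figures imply:

```python
def pixel_fillrate_mps(rops, core_mhz):
    """Peak pixel fillrate in MP/s: one pixel per ROP per core clock."""
    return rops * core_mhz

# All four parts pair 16 ROPs with different core clocks.
print(pixel_fillrate_mps(16, 743))  # HD2900 XT -> 11888 MP/s
print(pixel_fillrate_mps(16, 775))  # HD3870    -> 12400 MP/s
print(pixel_fillrate_mps(16, 600))  # 8800 GT   ->  9600 MP/s
```

Which is why the comparison only shifts once clocks or ROP counts differ; the architecture argument in the quote is about efficiency per ROP, not the peak number.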


----------



## DarkMatter (Dec 16, 2007)

DarkMatter said:


> I don't really get what do you want to point out with the link, except of how different are those architectures between each other. First, the link is about R600 and it is compared to G80 there. I'm sure he was talking about G92. HD2000 and HD3000 have a higher pixel fillrate than G92:
> HD2900 = 11888 MP/s
> HD3870 = 12400 MP/s
> HD3850 = 10700 MP/s
> ...



EDIT: I have just realized you could have been trying to express exactly what I said instead of being contrary to his opinion. Sorry if that's the case.


----------



## jaystein (Dec 16, 2007)

TXcharger said:


> if yall are that desperate for a card to beat the 8800gtx... sad... especially considering this is a dual GPU and the 8800gtx has been out over a year and has a single gpu...
> 
> ATi is screwed...



If what you say is true, then it's a sad day for the Video Card market.

I switched over to nVIDIA a year ago, but I want to see AMD/ATI survive.


----------



## DarkMatter (Dec 16, 2007)

prophylactic said:


> Well, there's also the issue with which ATi fans have to deal, in the "Way It's Meant to be Played."  As to say, being that the Green Giant has more money than ATi, they can easily pay developers to "harness" bits of their cards; thus, causing a performance differential in this regard.  Fuck fairness, of course, it's about who has money.  I'm not a fanboy.  I don't care which card I use, but I'm also highly averse to the notion of developers necessarily expressing any degree of favoritism in the developmental process of the games.



I won't go so far as to say Nvidia doesn't have more money, because I don't know. But the links below suggest the picture should be otherwise. If Nvidia has more cash, I would say it was ATi's fault. Not only has their annual revenue been higher over the last 4 years, but so has their net income over the last two. And the revenue gap is much larger...

http://www.marketwatch.com/tools/quotes/financials.asp?symb=ATI&sid=160919&report=1&freq=1
http://www.marketwatch.com/tools/quotes/financials.asp?symb=NVDA

Also don't forget that ATi has been in the game 8 years longer than Nvidia. Looking at it from this perspective, who is the giant?


----------



## TXcharger (Dec 16, 2007)

As much as I would like to see AMD put up a fight against Intel and Nvidia... they are not doing much to put up a fight. AMD messed up when they bought ATi: not only were they reeling at the time, but they are bringing down a good company. I used to be an ATi fanboy, but now that I have an nvidia product, I like it, and it has more upside. AMD is going to go bankrupt; the Phenom is not gonna be as great as they expect and Intel is destroying them in sales... You could say the Core 2 Duo architecture was the best thing to happen to Intel.

AMD is gonna go down and prices are gonna skyrocket in the processor war... But once Intel brings in their GPUs it will be a good fight between them and Nvidia.


----------



## imperialreign (Dec 16, 2007)

Grings said:


> ah, but this didnt kill ati (it did flop however)




IIRC, that was a desperate response to 3DFX's VooDoo 5000 series




> I think partially.... Mostly it was Nvidia and ATI tha killed 3dfx, that and 3dfx thought it was a good idea to make THEIR OWN 3D API.



yeah - but the Glide API was superior to standard OGL, and their cards handled OGL applications much faster than nVidia's or ATI's cards of the time.




TBH, with all this leaked info coming out of ATI - which is extremely unusual, but has become more and more common over the last year... who's to say they're not going for some sleight of hand? Y'know, leaking info on stuff supposedly "in development" to get their competition to look the other way while they're developing the real ass-kicker behind lock and key? ATI has done it before: releasing a sub-par "flagship" product while they get their hardware ironed out for the big dog to be taken off the chain. For example, look at the X1800 series and how nVidia reacted to it while ATI prepared the X1900 series for launch.


----------



## btarunr (Dec 16, 2007)

Bullseye!

Remember before the HD2900XT came out, there were pictures of some long cards (prototypes) which people thought would be launched as ATI's answer to the 8800 GTX?

It was the RV680.

ATI has this mantra of devising certain things in advance and testing the market before bringing out a shocker.

After the 7800 GTX, it was the X1800 XT (flop). Then they came up with the X1900 XTX (beat the 7800 GTX); later the 7900 GTX was launched and was countered by the X1950 XTX and Pro.

So there's a similarity, and speculators like us should look for clues.


----------



## mandelore (Dec 16, 2007)

btarunr said:


> NVidia was well prepared for DX10 unlike ATi



They were so prepared that they couldn't even get proper GPU virtualisation to work; they barely got anything working without crying to Microsoft to get DX10 nerfed so they were back in the game. ATI's architecture is waaay ahead of NV's; driver support may be another issue, resulting in a knee-capped card.

NV two steps ahead?? I think not...


----------



## DarkMatter (Dec 16, 2007)

TXcharger said:


> as much as i would like to see AMD put up a fight against Intel and Nvidia... they are not doing much to put up a fight, AMD messed up when they bought ATi. Not only were they reeling at the time but they are bringing down a good company. I used to be a ATi fanboy but now that i have a nvidia product, i like it, and it has more upside.  AMD is going to go bankrupt, the Phenom is not gonna be as great as they expect and Intel is destroying them in sales... You could say the Core2Duo architecture was the best thing to happen to Intel.
> 
> AMD is gonna go down and prices are gonna skyrocket in the processor war... But once Intel brings in there GPU's it will be a good fight between them and Nvidia



I don't really think that the fall of AMD is in Intel's best interest. The better scenario for Intel is a weak but still alive AMD, never a monopoly, IMHO. I will explain.
In this kind of business it's very easy to enter a market if there isn't any competition there. Just for entering the market you could take ~20% of market share, provided the product is good enough; not better than the competition, just good. Samsung has followed this strategy many times, and I don't think I have to say they have succeeded.
On the other hand, trying to enter a market in which there is already competition is very difficult. Can you remember XGI? Their cards were good, more or less on par with Nvidia or Ati (they had some driver issues, but which new card doesn't nowadays?), but they were new to the game, and there were already alternatives to the better yet expensive cards in their respective segments. You could buy the better Radeons or the worse but cheaper Nvidias. In this game there wasn't a place for XGI.
I know it's not the best example, since the Volari had severe rendering issues in some games, but they could have had some market share for non-gaming PCs, for example.
If AMD goes down, someone will buy it; they are just not going to let it totally disappear. The buyer could be IBM or Samsung, for example. If either of those buys AMD it could mean big trouble for Intel, since what AMD lacks, both of those have in excess: money.


----------



## EastCoasthandle (Dec 16, 2007)

mandelore said:


> they were so prepared that they couldnt even get a proper gpu virtualisation to work, they barely got anything working without crying to microsoft to get dx10 nerfed so they were back in the game. ATI architecture is waaay ahead of NV, driver support may be another issue, resulting is a knee-capped card.
> 
> NV 2 steps ahead, ?? I think not...



Ain't that the truth!  How soon we forget the flame wars over at Nvidia's forums back in November, December 2006 over proper DX10 drivers.


----------



## MilkyWay (Dec 16, 2007)

That's exactly what I thought: the problems with the ATi/AMD merger are hurting the two companies' products. Maybe when they overcome the problems, possibly 2008-2010, we will see AMD/ATi go back to what they used to be.

I think if AMD can get out a really powerful CPU that clocks like hell, I mean clocks we've never seen, like 6 GHz and stuff, then it will beat Intel, because Intel just goes for crappy tech but clocks like hell to compensate. They have quad cores that aren't even proper quad cores but clock past 4 GHz and run faster than any of the AMD quad cores, which have proper tech in them but are still shit.

I think 3dfx had great cards, but wtf happened at the end of its life with those multi-GPU concept cards and no support? ATi just brought out cards with great drivers, and Nvidia was coming out with better products than 3dfx.

I had a Voodoo 3 card; it was good at the time.


----------



## Rurouni Strife (Dec 16, 2007)

Actually, what I personally believe is wrong with the general performance of the R600 (and to a lesser extent the RV670) is the superscalar architecture. Check out Beyond3D and probably a few other sites that I can't name; Anand probably has it too. The Radeon cards have the potential to use 320 stream processors if their compiler software and the game software interface perfectly and they all agree and so on. However, at worst, the Radeon cards only use 64 stream processors, or roughly half of what's on an 8800GTX. So I think most games fall in the range of 128 to 256 stream processors for ATI. However, because of a few other differences in the cards, the Nvidia cards run faster in most games. That's why in a few certain games you see the ATI cards running close to a GTX. I think if ATI managed to get something like TWIMTBP off the ground it would help them get better drivers out faster AND get optimizations into more games.
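The best-case and worst-case numbers in the post follow directly from R600's 64 × 5-wide VLIW layout: the "320 stream processors" are really 64 units that each issue up to 5 independent scalar ops per clock, so usable throughput scales with how many slots the compiler can pack. A rough sketch of that arithmetic (the per-game packing figures are of course hypothetical):

```python
def effective_sps(avg_packed_slots, units=64, vliw_width=5):
    """Usable stream processors per clock on an R600-style shader core:
    64 VLIW units, each filling up to 5 slots when the compiler finds
    enough independent scalar ops to pack together."""
    return units * min(avg_packed_slots, vliw_width)

print(effective_sps(5))  # 320: perfect packing, the marketing number
print(effective_sps(1))  # 64: fully dependent code, the worst case
print(effective_sps(3))  # 192: a middling game, in the 128-256 range above
```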


----------



## EastCoasthandle (Dec 16, 2007)

> The long of the short is that AMD believes that Nvidia is locking it out of the market with its TWIMTBP programme—something I’m sure Nvidia would disagree with—and that developers working with Nvidia often make it difficult for AMD to get access to code early enough to develop CrossFire drivers in time for a game’s launch. Whatever the case may be (I try not to get involved with the politics of this industry), I’d really like to see more CrossFire support out of the gate in as many of the big titles as possible - but I don’t think it’s just an issue for the developers to tackle. AMD’s driver and developer relations teams need to pull the strings on AMD’s side of the fence too.
> 
> One interesting tidbit I did learn was that AMD is looking at ways to make multi-GPU as transparent as possible, because it no longer sees a future in making increasingly large GPUs. I’m speculating here, but I can see AMD using something like a HyperTransport bus to pass data between the two (or more) GPUs and a PCI-Express controller, which may also have the render back-ends incorporated that talk directly to the on-board memory. It sounds crazy I know, but I really believe that if multi-GPU is going to be the future, it needs to be as transparent for the user as humanly possible.


Source



> The other thing that still irks me a little is the chip’s architectural efficiency – I can’t help but feel this card should (and would) crucify Nvidia’s GeForce 8800 GT if code was written in such a way to take advantage of the VLIW architecture or if AMD had opted for a more versatile architecture that doesn’t suffer from some of the constraints that we’re used to seeing in GPUs of past years, before the unified shaders came to be.


Source

This really sums it up!


----------



## imperialreign (Dec 16, 2007)

> I think if ATI managed to get something like TWIMTBP off the ground it would help ATI get better drivers out faster AND get optimizations in more games.



I agree with this - but marketing further into the performance gaming market right now would be very hard for ATI, as nVidia's support is massive, and even n00bs are sucked over to their side rather quickly.

Their best bet would be to advertise their HD capabilities as superior to nVidia's. I had the thought that they should start including a small advertisement or logo with all these CGI movies coming out - most have admitted to using ATI's hardware, and I'm sure the studios wouldn't mind a 15s ATI logo brandished at the beginning of a movie. People will go see the film, see the logo (and most will recognize it from the hardware industry), and figure that if it's good enough to create a movie like that, it must be superior in IQ. People looking for the best equipment for their HD capabilities at home would take note too... and we all know how quickly HD broadcasts and movies are moving in.

ATI needs a campaign like TWIMTBP, but they need to target a completely different market right now.


----------



## Assimilator (Dec 16, 2007)

sam0t said:


> 7950GX2 did not kill Nvidia, so why would this card bury AMD/ATI ?
> 
> To me this seems much more elegant than Nvidias Frankenstein (7950GX2) with two cards strap up as one with some Macgyver tape.



The difference is that the 7950 GX2 wasn't brought out as a last-ditch effort to grab back market share. When the GX2 appeared, the G70 series was already a huge success; the GX2 was just a marketing/PR stunt. (Granted, also the fastest DX9 video card in the world - and I should know, I have 2 of 'em.)



mandelore said:


> ATI architecture is waaay ahead of NV, driver support may be another issue, resulting is a knee-capped card.



That's not a "maybe", it's a fact. The raw power is there in the silicon, it's a crying shame that the drivers just can't make effective use of it.


----------



## InnocentCriminal (Dec 16, 2007)

imperialreign said:

> I had the thought that they should start including a small advertisement of logo with all these CGI movies coming out - most have admitted to using ATI's hardware, and I'm sure they wouldn't mind a 15s ATI logo brandished at the beginning of a movie. People will go see the film, see the logo (and most will recognize it from the hardware industry), and say that if it's good enough to create a movie like that, then it must be superior...



If only ay? That's a really good idea, maybe you should pitch it to ATi, linking this thread. ;P


----------



## Judas (Dec 16, 2007)

Do you know what the sad thing is? If AMD goes down the toilet bowl, they will take ATI with them, and then we are fucked.


----------



## mandelore (Dec 16, 2007)

ATI should have purchased AMD, then we would see some goodness


----------



## imperialreign (Dec 16, 2007)

> If only ay? That's a really good idea, maybe you should pitch it to ATi, linking this thread. ;P



I'd love to, but I don't really think they'd take me seriously - not unless there was a strong showing of support for something like that from the fan forums.


----------



## InnocentCriminal (Dec 16, 2007)

Well I'm always up for trying to make a difference!


----------



## Basard (Dec 16, 2007)

[I.R.A]_FBi said:


> Read this



Does that tell me what is at fault, other than the supposed "lack of ROPs" like some people are claiming? It did tell me why the AA settings suck on the ATI cards, and a couple of other things. But the thing is like 30 pages long... and still doesn't really answer my "question". Also it said some stuff about virtualization, that the stream processors could be used to help out Windows somehow, I dunno.

http://www.techpowerup.com/reviews/Zotac/GeForce_8800_GTS_512_MB/  This says the ROPs are 16x2... so that's 32 ROPs (more than any other card)? The only thing I see as "better" on the Nvidia cards is the shader clock. So is it the shader clock giving the Nvidia cards the big advantage? Because the numbers never add up; ATI should be smashing Nvidia, just considering numbers. ATI has 320 stream processors and all this other nonsense that should make it great; it just seems like drivers and software support might be to blame.


----------



## mandelore (Dec 16, 2007)

Basard said:


> it just seems like drivers and software support might be to blame.



Pretty much sums it up: if you've got great hardware but no drivers/software to use it, you're only ever gonna be as powerful as the software in use dictates.


----------



## cool_recep (Dec 16, 2007)

A little collection...


----------



## imperialreign (Dec 16, 2007)

Good ol' 3DFX... sadly missed, and still well respected...


They were out for blood with those setups, and would've gotten it too if they could've lasted another year.


----------



## PrudentPrincess (Dec 16, 2007)

btarunr said:


> You'll need liquid cooling or a four slot cooler to keep 4 RV670s in operational conditions.



Your logic is amazing. 4 chips, 4 slot cooler.


----------



## AsRock (Dec 17, 2007)

mandelore said:


> they were so prepared that they couldn't even get proper GPU virtualisation to work; they barely got anything working without crying to Microsoft to get DX10 nerfed so they were back in the game. ATI's architecture is waaay ahead of NV's; driver support may be another issue, resulting in a knee-capped card.
> 
> NV 2 steps ahead? I think not...




I believe I remember that. Did it not also mean that there was NO REASON whatsoever that DX10 couldn't be on XP?... NV wanted it, so NV got it changed..


----------



## Mussels (Dec 17, 2007)

AsRock said:


> I believe I remember that. Did it not also mean that there was NO REASON whatsoever that DX10 couldn't be on XP?... NV wanted it, so NV got it changed..



Microsoft got that changed, not NV. They just wanted to push Vista, and decided DX10 + the new audio scheme would work better bundled together.


----------



## X-TeNDeR (Dec 17, 2007)

Strange how things turn up lately.. I look forward to some benchies.
Let's just hope AMD won't sell 'em by the inches


----------



## btarunr (Dec 17, 2007)

PrudentPrincess said:


> Your logic is amazing. 4 chips, 4 slot cooler.



Sure is. Try placing four RV670s on a single PCB; you can't. So it has to be two PCBs with two RV670s each, a la the 7950 GX2. You can't put one PCB flush atop another, so there has to be a one-slot gap for a leaf-blower. That works out to four slots in all.


----------



## Mussels (Dec 17, 2007)

btarunr said:


> Sure is. Try placing four RV670s on a single PCB; you can't. So it has to be two PCBs with two RV670s each, a la the 7950 GX2. You can't put one PCB flush atop another, so there has to be a one-slot gap for a leaf-blower. That works out to four slots in all.



he makes a good point here, the GX2 was pretty much two single slot cards stuck together, sharing a single PCI-E slot.


----------



## btarunr (Dec 17, 2007)

imperialreign said:


> I'd love to, but I don't really think they'd take me seriously - not unless there was a strong showing of support for something like that from the fan forums.



Not possible, because the CG companies that make these movies use ATI FireGL hardware they bought just like any other customer. So a producer wouldn't agree to spending 15 seconds (15 x 20 = 300 frames) showing an ATI logo unless ATI paid for it. On the other hand, several games like Unreal Tournament 2004 and 2003, FEAR, etc. show either a 2D image or a 3D animation of the NVidia "The way it's meant to be played" logo, for the reason that NVidia gives them its newest cutting-edge hardware, hardware that wouldn't even have been released to the market at the point when the game was being made. And game developers don't mind putting up a logo, or even making their games work better with NVidia hardware, when NVidia leases them the hardware for peanuts.


----------



## rhythmeister (Dec 17, 2007)

TXcharger said:


> if yall are that desperate for a card to beat the 8800gtx... sad... especially considering this is a dual GPU and the 8800gtx has been out over a year and has a single gpu...
> 
> ATi is screwed...



I don't find anything sad about ATI releasing a single card that'll be the best on the market upon release. I support the underdog anyway, cos monopolies SUCK and we can't all afford an 8800GTX


----------



## Mussels (Dec 17, 2007)

rhythmeister said:


> I don't find anything sad about ATI releasing a single card that'll be the best on the market upon release. I support the underdog anyway, cos monopolies SUCK and we can't all afford an 8800GTX



but we CAN all afford an 8800GT 256


----------



## InnocentCriminal (Dec 17, 2007)

Mussels said:


> but we CAN all afford an 8800GT 256



I can't. Plus I wouldn't want one even if I could.


----------



## KainXS (Dec 17, 2007)

If ATi's goin' crazy and doing stuff like this, then maybe NVIDIA needs to make a dual-GPU 8800 Ultra based on the G92. That would be funny


----------



## btarunr (Dec 17, 2007)

How funny. Each time there's some news about anything pertaining to a CPU or a GPU, a catfight gets triggered in the forum, with people siding with AMD/Intel/NVidia/ATi. Come on, people, let's just be smart buyers and not fanboys. If AMD gives me a better deal for my money today, I'll buy it. If Intel gives a better deal, I'll buy that. Regardless of brand, we all have to get a GPU, something that processes video and lets us play games. Neither ATi nor NVidia is selling things that blow up, so let's not hate a company; they're just earning a living, like you and me. Of course, NVidia is a multi-gazillion-dollar company, but it's run by people in the same economic strata as us.


----------

