Sunday, December 16th 2007

ATI R680 PCB Design and Cooling Early Pics

Here are some leaked pictures of the ATI R680, courtesy of ChipHell. What you'll see on the PCB are two RV670XT GPUs and one PLX chip for communication between the two cores. All sixteen memory chips (eight on the front and eight on the back) are missing from the board, probably because this is an early development stage (not a finished product). The source said the card uses 0.7 ns GDDR4 memory.
Source: ChipHell
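As a side note, the 0.7 ns rating roughly implies the memory clock. This is a back-of-the-envelope sketch of my own, not something from the source:

```python
# Rule of thumb: a DRAM chip's rated clock is the inverse of its access
# time, so 0.7 ns GDDR4 works out to ~1429 MHz command clock, or
# ~2857 MT/s effective per pin at double data rate.
def rated_clock_mhz(access_time_ns: float) -> float:
    return 1000.0 / access_time_ns

def effective_rate_mtps(access_time_ns: float) -> float:
    return 2.0 * rated_clock_mhz(access_time_ns)  # DDR: two transfers/clock

print(round(rated_clock_mhz(0.7)))     # 1429
print(round(effective_rate_mtps(0.7))) # 2857
```

Whether the shipping card actually runs the chips at their full rated speed is another matter.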

77 Comments on ATI R680 PCB Design and Cooling Early Pics

#51
EastCoasthandle
mandelore: they were so prepared that they couldn't even get proper GPU virtualisation to work; they barely got anything working without crying to Microsoft to get DX10 nerfed so they were back in the game. ATI architecture is waaay ahead of NV; driver support may be another issue, resulting in a knee-capped card.

NV two steps ahead? I think not...
Ain't that the truth! How soon we forget the flame wars over at Nvidia's forums back in November/December 2006 over proper DX10 drivers :rolleyes:
#52
MilkyWay
That's exactly what I thought: the problems with the ATi/AMD merger are hurting both companies' products. Maybe once they overcome them, possibly 2008-2010, we'll see AMD/ATi go back to what they used to be.

I think if AMD can get out a really powerful CPU that clocks like hell, I mean clocks we've never seen, like 6GHz and stuff, then it will beat Intel, because Intel just goes for crappy tech but clocks like fuck to compensate. They have quad cores that aren't even proper quad cores but clock past 4GHz and run faster than any of the AMD quad cores, which have proper tech in them but are still shit.

I think 3dfx had great cards, but WTF happened at the end of its life with those multi-GPU concept cards and no support? ATi just brought out cards with great drivers, and Nvidia was coming out with better products than 3dfx.

I had a Voodoo 3 card; it was good at the time.
#53
Rurouni Strife
Actually, what I personally believe is wrong with the general performance of the R600 (and to a lesser extent, the RV670) is the superscalar architecture. Check out Beyond3D and probably a few other sites I can't name; Anand probably has it too. The Radeon cards have the potential to use 320 stream processors if their compiler software and the game software interface perfectly and they all agree and so on. However, at worst, the Radeon cards only use 64 stream processors, or roughly half of what's on an 8800 GTX. So I think most games fall in the range of 128 to 256 stream processors for ATI. However, because of a few other differences in the cards, the Nvidia cards run faster in most games. That's why in a few certain games you see the ATI cards running close to a GTX. I think if ATI managed to get something like TWIMTBP off the ground, it would help ATI get better drivers out faster AND get optimizations in more games.
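The packing argument can be sketched with toy numbers (a hypothetical model of mine, not ATI's actual compiler): the 320 stream processors are really 64 five-wide VLIW units, so effective throughput depends on how many of the five slots get filled per instruction.

```python
# Toy model: R600 exposes 64 VLIW units, each 5 slots wide = "320 SPs".
# Effective SP count scales with how well the compiler packs the slots.
UNITS = 64
VLIW_WIDTH = 5

def effective_sps(avg_slots_filled: float) -> float:
    assert 0 < avg_slots_filled <= VLIW_WIDTH
    return UNITS * avg_slots_filled

print(effective_sps(5.0))  # 320.0 -> ideal packing, every slot busy
print(effective_sps(1.0))  # 64.0  -> purely scalar code, worst case
print(effective_sps(3.0))  # 192.0 -> middling packing, in the 128-256 band
```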
#54
EastCoasthandle
The long and short of it is that AMD believes that Nvidia is locking it out of the market with its TWIMTBP programme—something I’m sure Nvidia would disagree with—and that developers working with Nvidia often make it difficult for AMD to get access to code early enough to develop CrossFire drivers in time for a game’s launch. Whatever the case may be (I try not to get involved with the politics of this industry), I’d really like to see more CrossFire support out of the gate in as many of the big titles as possible - but I don’t think it’s just an issue for the developers to tackle. AMD’s driver and developer relations teams need to pull the strings on AMD’s side of the fence too.

One interesting tidbit I did learn was that AMD is looking at ways to make multi-GPU as transparent as possible, because it no longer sees a future in making increasingly large GPUs. I’m speculating here, but I can see AMD using something like a HyperTransport bus to pass data between the two (or more) GPUs and a PCI-Express controller, which may also have the render back-ends incorporated that talk directly to the on-board memory. It sounds crazy I know, but I really believe that if multi-GPU is going to be the future, it needs to be as transparent for the user as humanly possible.
Source
The other thing that still irks me a little is the chip’s architectural efficiency – I can’t help but feel this card should (and would) crucify Nvidia’s GeForce 8800 GT if code was written in such a way to take advantage of the VLIW architecture or if AMD had opted for a more versatile architecture that doesn’t suffer from some of the constraints that we’re used to seeing in GPUs of past years, before the unified shaders came to be.
Source

This really sums it up!
#55
imperialreign
I think if ATI managed to get something like TWIMTBP off the ground it would help ATI get better drivers out faster AND get optimizations in more games.
I agree with this - but marketing further into the performance gaming market right now would be very hard for ATI, as nVidia support is massive, and even n00bs are sucked over to their side rather quickly.

Their best bet would be to advertise their HD capabilities as superior to nVidia's. I had the thought that they should start including a small advertisement or logo with all these CGI movies coming out - most studios have admitted to using ATI's hardware, and I'm sure they wouldn't mind a 15s ATI logo brandished at the beginning of a movie. People will go see the film, see the logo (and most will recognize it from the hardware industry), and figure that if it's good enough to create a movie like that, then it must be superior in IQ. People looking for the best equipment for their HD capabilities at home would take note, too . . . and we all know how quickly HD broadcasts and movies are moving in.

ATI needs a campaign like TWIMTBP, but they need to target a completely different market right now.
#56
Assimilator
sam0t: 7950GX2 did not kill Nvidia, so why would this card bury AMD/ATI?

To me this seems much more elegant than Nvidia's Frankenstein (7950GX2), two cards strapped together as one with some MacGyver tape.
The difference is that the 7950 GX2 wasn't brought out as a last-ditch effort to grab back market share. When the GX2 appeared, the G70 series was already a huge success; the GX2 was just a marketing/PR stunt. (Granted, it was also the fastest DX9 video card in the world - and I should know, I have 2 of 'em :D.)
mandelore: ATI architecture is waaay ahead of NV, driver support may be another issue, resulting in a knee-capped card.
That's not a "maybe", it's a fact. The raw power is there in the silicon; it's a crying shame that the drivers just can't make effective use of it.
#57
InnocentCriminal
Resident Grammar Amender
imperialreign: I had the thought that they should start including a small advertisement or logo with all these CGI movies coming out - most have admitted to using ATI's hardware, and I'm sure they wouldn't mind a 15s ATI logo brandished at the beginning of a movie. People will go see the film, see the logo (and most will recognize it from the hardware industry), and say that if it's good enough to create a movie like that, then it must be superior...
If only, ay? That's a really good idea; maybe you should pitch it to ATi, linking this thread. ;P
#58
Judas
Do you know what the sad thing is? If AMD go down the toilet bowl, they'll take ATI with them, and then we're fucked.
#59
mandelore
ATI should have purchased AMD, then we would see some goodness ;)
#60
imperialreign
If only, ay? That's a really good idea, maybe you should pitch it to ATi, linking this thread. ;P
I'd love to, but I don't really think they'd take me seriously - not unless there was a strong showing of support for something like that from the fan forums.
#61
InnocentCriminal
Resident Grammar Amender
Well I'm always up for trying to make a difference!
#62
Basard
[I.R.A]_FBi: Read this
Does that tell me what's at fault, other than the supposed "lack of ROPs" like some people are claiming? It did tell me why the AA settings suck on the ATI cards, and a couple of other things. But the thing is like 30 pages long... and still doesn't really answer my "question". Also, it said some stuff about virtualization, that the stream processors could be used to help out Windows somehow, I dunno.

www.techpowerup.com/reviews/Zotac/GeForce_8800_GTS_512_MB/ Here it says the ROPs are 16x2... so that's 32 ROPs (more than any other card)? The only thing I see as "better" on the Nvidia cards is the shader clock. So is it the shader clock that gives the Nvidia cards the big advantage? Because the numbers never add up; ATI should be smashing Nvidia just considering the numbers. ATI has 320 stream processors and all this other stuff that should be making it great; it just seems like drivers and software support might be to blame.
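For what it's worth, the on-paper math really does favour ATI. A rough sketch using the launch specs as I recall them, counting only multiply-add throughput (Nvidia also claimed an extra MUL per clock, which narrows the gap):

```python
# Back-of-the-envelope shader throughput: SPs x shader clock x 2 FLOPs
# per clock (one multiply-add counts as two floating-point operations).
def madd_gflops(stream_processors: int, shader_clock_ghz: float) -> float:
    return stream_processors * shader_clock_ghz * 2.0

gtx_8800 = madd_gflops(128, 1.35)   # GeForce 8800 GTX: 345.6 GFLOPS
hd_2900 = madd_gflops(320, 0.742)   # Radeon HD 2900 XT: ~474.9 GFLOPS

print(gtx_8800, hd_2900)  # ATI ahead on paper, behind in most games
```

That gap between paper FLOPS and actual game performance is exactly where compiler and driver efficiency come in.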
#63
mandelore
Basard: it just seems like drivers and software support might be to blame.
Pretty much sums it up: if you've got great hardware but no drivers/software to use it, you're only ever going to be as powerful as the software in use dictates.
#65
imperialreign
Good ol' 3dfx . . . sadly missed, and still well respected . . .

They were out for blood with those setups :laugh: and they would've gotten it, too, if they could've lasted another year.
#66
PrudentPrincess
btarunr: You'll need liquid cooling or a four-slot cooler to keep 4 RV670s in operational condition.
Your logic is amazing. 4 chips, 4-slot cooler. :roll:
#67
AsRock
TPU addict
mandelore: they were so prepared that they couldn't even get proper GPU virtualisation to work; they barely got anything working without crying to Microsoft to get DX10 nerfed so they were back in the game. ATI architecture is waaay ahead of NV; driver support may be another issue, resulting in a knee-capped card.

NV two steps ahead? I think not...
I believe I remember that. Didn't it also mean there was NO REASON whatsoever that DX10 could not be on XP?... NV wanted it, NV got it changed..
#68
Mussels
Freshwater Moderator
AsRock: I believe I remember that. Didn't it also mean there was NO REASON whatsoever that DX10 could not be on XP?... NV wanted it, NV got it changed..
Microsoft got that changed, not NV. They just wanted to push Vista, and decided DX10 plus the new audio scheme would work better bundled together.
#69
X-TeNDeR
Strange how things are turning out lately... I look forward to some benchies.
Let's just hope AMD won't sell 'em by the inches ;)
#70
btarunr
Editor & Senior Moderator
PrudentPrincess: Your logic is amazing. 4 chips, 4-slot cooler. :roll:
Sure is. Try placing four RV670s on one PCB: you can't. So it has to be two PCBs with two RV670s each, a la the 7950 GX2. You can't simply stack one PCB atop the other, so there has to be a one-slot gap for a leaf-blower. That works out to four slots in all.
#71
Mussels
Freshwater Moderator
btarunr: Sure is. Try placing four RV670s on one PCB: you can't. So it has to be two PCBs with two RV670s each, a la the 7950 GX2. You can't simply stack one PCB atop the other, so there has to be a one-slot gap for a leaf-blower. That works out to four slots in all.
He makes a good point here; the GX2 was pretty much two single-slot cards stuck together, sharing a single PCI-E slot.
#72
btarunr
Editor & Senior Moderator
imperialreign: I'd love to, but I don't really think they'd take me seriously - not unless there was a strong showing of support for something like that from the fan forums.
Not possible, because the CG companies that make these movies use ATI FireGL hardware that they bought just like any other customer. So a producer wouldn't agree to spend 15 seconds (15 x 20 = 300 frames) showing an ATI logo unless ATI paid for it. On the other hand, several games like Unreal Tournament 2003/2004, FEAR, etc. show either a 2D image or a 3D animation of the Nvidia "The Way It's Meant to Be Played" logo, because Nvidia gives them their newest cutting-edge hardware, hardware that would not have been released to the market when the game was being made. A game developer wouldn't mind putting up a logo or even making their games work better with Nvidia hardware; Nvidia leases them the hardware for peanuts.
#73
rhythmeister
TXcharger: if y'all are that desperate for a card to beat the 8800GTX... sad... especially considering this is a dual GPU and the 8800GTX has been out over a year and has a single GPU...

ATi is screwed...
I don't find anything sad about ATI releasing a single card that'll be the best on the market upon release :wtf: I support the underdog anyway, cos monopolies SUCK and we can't all afford an 8800GTX :slap:
#74
Mussels
Freshwater Moderator
rhythmeister: I don't find anything sad about ATI releasing a single card that'll be the best on the market upon release :wtf: I support the underdog anyway, cos monopolies SUCK and we can't all afford an 8800GTX :slap:
But we CAN all afford an 8800GT 256 :toast:
#75
InnocentCriminal
Resident Grammar Amender
Mussels: but we CAN all afford an 8800GT 256
I can't. Plus I wouldn't want one even if I could.

:p