# AMD Radeon HD 5970 Specs Surface



## btarunr (Nov 11, 2009)

In a few days, AMD will unveil its new flagship graphics accelerator, the ATI Radeon HD 5970, which intends to cement the brand's performance leadership over every product from rival NVIDIA. The HD 5970, codenamed "Hemlock", is a dual-GPU accelerator, with two "Cypress" GPUs in an internal CrossFireX configuration. 

Built on the 40 nm process, these GPUs will feature 1600 stream processors each, and will each have a 256-bit wide GDDR5 memory interface to connect to 2 GB of memory (4 GB total on card). The clock speeds are where the specifications of these GPUs differ from their single-GPU avatar, the Radeon HD 5870. The core is clocked at 725 MHz, while the memory runs at 1000 MHz (4000 MHz effective). 

The accelerator's rear panel differs from those of other Radeon HD 5000 series accelerators. It has the usual broad air vent occupying one slot, while the other carries two DVI-D connectors and one mini DisplayPort (DP) connector. The mini DP connector can output DVI through a dongle, so support for ATI Eyefinity technology remains intact. The NDA covering this accelerator is said to expire on the 19th of November, not far away.

*View at TechPowerUp Main Site*


----------



## theorw (Nov 11, 2009)

WOW,so now to use this card u need 64bit OS since it has 4 gb right???


----------



## Marineborn (Nov 11, 2009)

Mine!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!


----------



## skylamer (Nov 11, 2009)

wtfmaaaan?!


----------



## wiak (Nov 11, 2009)

theorw said:


> WOW,so now to use this card u need 64bit OS since it has 4 gb right???


i think so, now you have to love AMD for their x86-64 extension, also known as amd64 or Intel 64/EM64T


btw you can use Eyefinity with Dual DVI and a DisplayPort to btarunr


----------



## Phxprovost (Nov 11, 2009)

lol great, yet another phantom card that looks lovely yet is impossible to buy


----------



## btarunr (Nov 11, 2009)

theorw said:


> WOW,so now to use this card u need 64bit OS since it has 4 gb right???



You needed that for the ASUS MARS as well. Besides anyone who bought the HD 4870 X2 (a 2 GB card) with 2 GB of system memory was expected to be running 64-bit OS anyway, so it's not a big deal for its target consumers.


----------



## afw (Nov 11, 2009)

hmm ....clock and memory speeds of a 5850 but 1600 shaders each .... might end up being a notch better than a 5850 crossfire configuration .... anyways ... it's gonna hold the performance crown .... no card from the NV boys gonna beat that im sure ... ... (until their dual-gpu solutions come out that is) ..... ATi is surely having it their way for now ... 

what's the price of this going to be like ?? $500 ?? $550 ?? $600 .... ??? ..
Im sure it's going to be high and remain high until the NV guys put out their cards ... ( which i think will take another 2-3 months at least ... hmmm )


----------



## mtosev (Nov 11, 2009)

4GB. Hmm... most PCs have that amount of RAM. A lot of memory on that card. Should be good for playing games at max for the next 3 years.


----------



## Mussels (Nov 11, 2009)

theorw said:


> WOW,so now to use this card u need 64bit OS since it has 4 gb right???



yep. expect tons of reviews on 32 bit OS's to find the card 'lacking' 

(technically, crossfire doesn't add ram, so only 2GB is accessible)


----------



## toyo (Nov 11, 2009)

It seems the temperature issues rumours were true... too bad AMD engineers couldn't find a way to keep the stock speeds.


----------



## Lionheart (Nov 11, 2009)

HELL YEAH BITCHES!!!  I WANT ONE RIGHT NOW, BUT THEN AGAIN, 2 HD5870 CARDS IN CROSSFIRE WOULD PROBABLY BEAT THIS MOST LIKELY, BUT THESE HAVE MORE MEMORY, HMMMMM WHAT TO GET, WHAT U GUYS THINK!!! I would get this if these dont have that microstuttering crap!!1


----------



## inferKNOX (Nov 11, 2009)

Hmm... any info on the 5950, if there is one?


----------



## gumpty (Nov 11, 2009)

Hmmm ... looks very tasty.

But while tempted, I can still remember all too freshly the burns I got from RMAing three 4870x2s.

Will be sticking to a single GPU solution for a while yet.


----------



## Lionheart (Nov 11, 2009)

call it the hd5870x2, stop changing the naming schemes, dont follow in nvidias footsteps! lol


----------



## gumpty (Nov 11, 2009)

inferKNOX said:


> Hmm... any info on the 5950, if there is one?



I'm not entirely convinced that these are 5950 & not 5970, but here you go. Source.

First pic has it next to a regular 5870.


----------



## inferKNOX (Nov 11, 2009)

Thanks. I see what you mean though, it is arguable.
If true, that'll put 2 cards standing head and shoulders above the rest as the most powerful in the world.
If nVidia is struggling with single-GPU cards, it's very hard to see them bringing out a dual to knock back these performance giants. Unless the single nV GPU cards will outdo these, but that's crazy... right?


----------



## Lionheart (Nov 11, 2009)

mmmmmm they look nice!!!


----------



## InnocentCriminal (Nov 11, 2009)

I look forward to (p)reviews and such, but I still believe that having two video cards is rather pointless unless you're gaming at stupidly large resolutions. Still, it's nice to have a big ePenis.


----------



## Yellow&Nerdy? (Nov 11, 2009)

That card begs for watercooling. And I think that the pictures are of the 5950, not 5970.

It's freaking 13.5 inches! You're gonna have to own a monster of a case to fit that. Not to mention availability. Probably not a chance to get one until next year, since they're already short on 5870 and 5850 cards.


----------



## wolf (Nov 11, 2009)

*sigh* uninspiring clock speeds IMO.

Looks like for most benchmarks and gaming a pair of 5870's will whip this until it actually gets into its extra memory per GPU, unless the overclocking is magical.

I think if it uses the standard blower cooler the OCing won't be that great, as temps/noise may get out of control. Could be good if it has the cooler shown in Post #16, or a water block 

In any case I really hope this means 1gb 5870/50's take a bit of a drop in price, and availability rises over the next few months.

Also if drivers focus on this it may really help a dual 5850/5870 situation


----------



## PP Mguire (Nov 11, 2009)

I just got a boner enough to say fuck nvidia. dooo want!!


----------



## inferKNOX (Nov 11, 2009)

CHAOS_KILLA said:


> call it the hd5870x2, stop changing the naming schemes, dont follow in nvidias footsteps! lol


Somehow the naming scheme already seems too long as, for example
Sapphire Radeon HD 5900 4GB Vapor-X Edition,
nevermind
Sapphire Radeon HD 5800 x2 4GB (2x2GB) Vapor-X Edition

EDIT: IMO, although it wouldn't happen obviously, but anyway:
Sapphire 5900 4GB Vapor-X
would be sufficient naming, or maybe include the "Radeon"


----------



## HalfAHertz (Nov 11, 2009)

toyo said:


> It seems the temperature issues rumours were true... too bad AMD engineers couldn't find a way to keep the stock speeds.



It's not about temps, but providing enough juice through two pci-e power connectors and the onboard port. A single 5870 uses >190W and the max power envelope is ~300W for a pci-e port+6-pin+8-pin...
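HalfAHertz's power-envelope point is simple arithmetic against the PCI-E limits (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin). A minimal sketch of that math, with the ~190 W HD 5870 figure taken from the post above:

```python
# Rough PCI-E power-budget arithmetic behind the post above.
# Per-connector limits come from the PCI-E spec; the HD 5870
# board-power figure is the ~190 W quoted in the post.

PCIE_SLOT_W = 75       # max draw through the x16 slot itself
SIX_PIN_W = 75         # max draw per 6-pin auxiliary connector
EIGHT_PIN_W = 150      # max draw per 8-pin auxiliary connector

budget = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(budget)          # 300 W envelope for a slot + 6-pin + 8-pin card

single_5870_w = 190    # approximate board power of one HD 5870
print(2 * single_5870_w)  # 380 W: two full-clock Cypress GPUs would bust the envelope
```

Which is why the dual-GPU card has to ship at lower clocks (and voltages) than two discrete 5870s.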


----------



## W1zzard (Nov 11, 2009)

theorw said:


> WOW,so now to use this card u need 64bit OS since it has 4 gb right???



using such a card does not require a 64-bit system. i dont know the exact details how it works, but i sent an email to someone who should know it asking for more info


----------



## Disparia (Nov 11, 2009)

So we might see Eyefinity Crossfire support in the next set of drivers?


----------



## gaximodo (Nov 11, 2009)

it will run @5870's clock only if they make it 15" long


----------



## theorw (Nov 11, 2009)

gumpty said:


> I'm not entirely convinced that these are 5950 & not 5970, but here you go. Source.
> 
> First pic has it next to a regular 5870.
> http://img.hexus.net/v2/news/amd/5950-leak.jpg
> ...



WTF???Is it gonna have that DUorb like cooler from stock?Cos i see ATI brand stickers on...?


----------



## theorw (Nov 11, 2009)

gaximodo said:


> it will run @5870's clock only if they make it 15" long



I wanna see someone hardvMOD this card!!!!
Will be interesting!!I bet u ll need to solder 12 volts directly to the PCB????


----------



## Weer (Nov 11, 2009)

CHAOS_KILLA said:


> call it the hd5870x2, stop changing the naming schemes, dont follow in nvidias footsteps! lol



I'm not going to blame you, because you haven't been here long enough, but it is actually AMD that started this entire fiasco. Not just with changing the names, but mostly with the frequency of going forward to the next series, mostly to generate hype. nVidia just (had) to follow suit.

It's like this:

*DAAMIT
*
ATI - X-series - Spring 2004
ATI - X1k-series - Fall 2005 [+ 1.5 Years]
AMD - HD2k-series - Spring 2007 [+ 1.5 Years]
AMD - HD3k-series - Fall 2007 [+ _0.5_ Years]
AMD - HD4k-series - Spring 2008 [+ _0.5_ Years]
AMD - HD5k-series - Fall 2009 [+ 1.5 Years]

Highlight: HD 2900 XT -> 6 Months -> *HD 3870* -> 6 months -> HD 4870

*nVidia*

nVidia - 6000-series - Spring 2004
nVidia - 7000-series - Spring 2005 [+ 1 Year]
nVidia - 8000-series - Fall 2006 [+ 1.5 Years]
nVidia - 9000-series - Winter 2007/2008 [+ ~1 Year]
nVidia - GTX 200-series - Spring 2008 [+ 0.5 Years]
nVidia - GTX 300-series - Fall 2009 [+ 1.5 Years]

Highlight: 8800 GT/GTS 512 -> 2 Months -> *9800 GT/GTX* -> 4 Months -> GTX 280

So, as you can see, the healthy timeline for the release of a new series from either and both graphics card manufacturers is 1.5 years. The unhealthy is 0.5 years, and also 2 years.
After AMD bought and merged with ATI, they failed to deliver a solid performing chip in the R600. So, in order to be able to compete with nVidia, they required hype. They gained this through changing two series in a single year. What should have been the HD 2950, etc. was thus named 3850, as part of the new and completely fraudulent HD3k series. 
Then, nVidia got wind of this and needed to make a move to equal the hype. So, they used the Exact same GPU they did in the 8000 series, the G92, in the 9000 series, which was even worse than what AMD were doing, because nVidia was blatantly re-marketing their product under a superior name, solely in order to garner hype. Thus, they also jumped through two series in roughly the same amount of time (given the actual linear-based timeline).
And in the end, AMD took themselves by the trousers and fashioned an actually competitively good.. and New, GPU, which started the HD4k series, that lasted for the healthy 1.5 Years. nVidia thus again followed suit, with their GTX 200-series, which will also last for 1.5 Years.
So, in the mean time, all is well in the graphics card kingdom, and the terror of the HD3k and 9000 series, is forgotten. But who knows when these big companies will, again, try to trick us because they are too scared, in this almost childish mindset, to lose any piece of market share. 
All I can say is men like me will be here to enlighten the masses, and protect the commoners.


----------



## MadClown (Nov 11, 2009)

There's my card.


----------



## niko084 (Nov 11, 2009)

theorw said:


> WOW,so now to use this card u need 64bit OS since it has 4 gb right???



No, the card is its own subsystem.
And although it has 4gb of ram, only 2gb is probably usable, just as if you had two 2gb cards in crossfire; you only really have 2gb of video ram for all practical purposes.

This card is going to be massive and heavy.
Get ready to build your braces!


----------



## niko084 (Nov 11, 2009)

Weer said:


> I'm not going to blame you, because you haven't been here long enough, but it is actually AMD that started this entire fiasco. Not just with changing the names, but mostly with the frequency of going forward to the next series, mostly to generate hype. nVidia just (had) to follow suit.



You have it all wrong...

ATI was releasing new technology, new cards, new cores.
Nvidia was just doing a small die shrink and calling an 8800GT a 9800GT.

Calling what should be a 5870x2 a 5890, that is a bit stupid.
But they did not take a 2600 and rename it a 3600.


----------



## Binge (Nov 11, 2009)

I foresee some strangely high amount of suck coming from this card!  It will probably beat the 5870 under load by a good margin, but it will probably be the same as 2x5870 in crossfire idle.  This isn't good for the 5970.


----------



## wolf (Nov 11, 2009)

A wave of disappointment washes over... Dual 5850's with all 1600sp's is not what I wanted from this card. *sigh*

Seems like it will actually let a 5870 down in trifire.


----------



## ToTTenTranz (Nov 11, 2009)

theorw said:


> WOW,so now to use this card u need 64bit OS since it has 4 gb right???



That makes no sense at all! The memory subsystem is handled by the GPU itself, not the CPU.

Only if the PC used a UMA architecture like the consoles (or systems with IGPs) could that constraint exist.


----------



## Binge (Nov 11, 2009)

ToTTenTranz said:


> That makes no sense at all! The memory subsystem is handled by the GPU itself, not the CPU.
> 
> Only if the PC used an UMA architecture like the consoles (or systems with IGPs), that constraint could exist.



It's a valid question regardless.


----------



## gumpty (Nov 11, 2009)

theorw said:


> WTF???Is it gonna have that DUorb like cooler from stock?Cos i see ATI brand stickers on...?



I doubt it - the article said it was an early engineering sample - they just bolt a couple of heat-sinks on so they can test it. I imagine the retail-ready piece will be like the normal stock coolers.


----------



## niko084 (Nov 11, 2009)

Binge said:


> It's a valid question regardless.



Agreed, some people don't understand these things as well, and it's a very common assumption.

To the point that I have heard techs at Microcenter tell customers that. Hopefully just because they didn't feel like explaining the truth, but I doubt it.


----------



## ToTTenTranz (Nov 11, 2009)

Nonetheless, calling it HD5900 instead of HD5800X2 makes sense to me.

It's about time we get to have shorter names in ATI's lineup. 


In fact, I think it's about time that both ATI and nVidia drop the Radeon and Geforce brands. It's been 10 years now, we need new names!

I still remember when nVidia was so confident about NV30 that they even publicly stated that they'd drop the Geforce brand and call it a different name. When the architecture turned out as bad as it did, they had to use the Geforce brand again, for marketing purposes.


----------



## Binge (Nov 11, 2009)

ToTTenTranz said:


> Nonetheless, calling it HD5900 instead of HD5800X2 makes sense to me.
> 
> It's about time we get to have shorter names in ATI's lineup.
> 
> ...



Nice point?  With whom are you arguing?


----------



## ToTTenTranz (Nov 11, 2009)

Binge said:


> Nice point?  With whom are you arguing?



With this post and consequent discussion about the naming schemes.


----------



## pr0n Inspector (Nov 11, 2009)

ToTTenTranz said:


> That makes no sense at all! The memory subsystem is handled by the GPU itself, not the CPU.
> 
> Only if the PC used an UMA architecture like the consoles (or systems with IGPs), that constraint could exist.



Tell that to those who (still insisting on using a 32-bit OS) lost another chunk of their 4GB of RAM due to the vRAM.


----------



## Steevo (Nov 11, 2009)

I just jizzed..........in my pants.


Any way will two of my MCW-60R blocks fit?


----------



## 1933 Poker (Nov 11, 2009)

Good to know thanks for the post! Right On!


----------



## WarEagleAU (Nov 11, 2009)

That actually looks like two Zalman-type coolers, or it could be the DuOrb; either way it is sexy. What I don't like is two DVI and one mini DisplayPort. That was supposed to be a six-screen behemoth. Hell, they should have kept it two DVI, one HDMI (which I use and I've seen monitors with; haven't seen a DisplayPort monitor, not saying there isn't one) and one DisplayPort, if they are not going to make it a six-screen monster.


----------



## devguy (Nov 11, 2009)

This may sound lame, but I would actually prefer it if they delayed the launch of the 5970 to the release of Fermi.  My reasoning is that this card coming out is going to further reduce the stock of available RV870 chips.  Thus, even worse shortages of the more practical 5850/5870.

I mean, in all honesty, how many people buy these $500+ cards at launch?  I know only a few who did, and when they sold them to buy a 5870, they only got around $200 (didn't even come close to covering their 5870 cost).  That is the cost of buying into new technology, sure, but I'd rather AMD focus on getting more 58xx series cards onto the market.  Plus, the 5970 coming out when Fermi does will serve as a sort of distraction to nVidia.

And as for the clocks, nVidia did almost the exact same thing with the GTX 295.  It was a GTX 285 with the memory bandwidth of the GTX 260 and similar clocks to it, yet had the full shader count.  That configuration later became the GTX 275.


----------



## JrRacinFan (Nov 11, 2009)

OK, since this is a dual-GPU card, will we see a 5900 series single-GPU card for trifire? Sorry for the rhetorical question. I just figured to bring up the point that if a certain person picks one of these up, and they indeed keep the naming scheme as a 5970, we won't see CrossfireX with it paired with a current card. Well, for what information we have today ....


----------



## ToTTenTranz (Nov 11, 2009)

pr0n Inspector said:


> Tell that to those who(still insist on using a 32-bit OS) lost another chunk of their 4GB RAM due to the vRAM.



What?!
You're talking about losing memory to the IGP? But that's predictable, with or without a 64bit OS.


----------



## inferKNOX (Nov 11, 2009)

Weer said:


> ... actually AMD that started this entire fiasco. Not just with changing the names, but mostly with the frequency of going forward to the next series, mostly to generate hype. nVidia just (had) to follow suit....


He's talking about nV renaming the die-shrunk 8-series as the 9-series & the blatant rename of the 9800GTX+ to the GTS250.



Weer said:


> *DAAMIT
> *


Dude-bra, that's kinda lame...



ToTTenTranz said:


> That makes no sense at all! The memory subsystem is handled by the GPU itself, not the CPU.
> 
> Only if the PC used an UMA architecture like the consoles (or systems with IGPs), that constraint could exist.


The amount of memory on the card cannot exceed the amount in the system, thus 4GB+ would be necessary in the system, which could only be utilised by a 64-bit system.


Binge said:


> I forsee some strangely high ammount of suck coming from this card!  It will probably beat the 5870 under load by a good margin, but it will probably be the same as 2x5870 in crossfire idle.  This isn't good for the 5970.


Binge, I've noticed some huge hate coming from you for anything non-nV.
And what happened to your specs? I saw them with your nV card 1 day, then just the name the next.


ToTTenTranz said:


> Nonetheless, calling it HD5900 instead of HD5800X2 makes sense to me.
> 
> It's about time we get to have shorter names in ATI's lineup.
> 
> ...


Agreed totally.


----------



## DanishDevil (Nov 11, 2009)

59xx is dual GPU. If you want trifire with 2 cards, you pair a 5970 with a 5870.

I find it interesting that after the die shrink, they still had to underclock the cards to get them to run cool enough. ATi is feeling the heat (from their own cards).


----------



## W1zzard (Nov 11, 2009)

DanishDevil said:


> I find it interesting that after the die shrink, they still had to underclock the cards to get them to run cool enough. ATi is feeling the heat (from their own cards).



wildest speculation, no evidence. maybe ati wants to keep some room for a higher clock version? wait for my reviews before you decide to believe something like that


----------



## ToTTenTranz (Nov 11, 2009)

inferKNOX said:


> The amount of memory on the card cannot exceed the amount in the system, thus 4GB+ would be necessary in the system, which could only be utilised by a 64-bit system.



And why is that?





DanishDevil said:


> 59xx is dual GPU. If you want trifire with 2 cards, you pair a 5970 with a 5870.
> 
> I find it interesting that after the die shrink, they still had to underclock the cards to get them to run cool enough. ATi is feeling the heat (from their own cards).



Hey, these are 2.15 billion transistor behemoths. I'm sure it's not an easy task for a second-gen 40nm process.


----------



## wolf (Nov 11, 2009)

inferKNOX said:


> He's talking about the rename from die shrink of nv 8series to call it 9series & blatant rename of 9800GTX+ to GTS250.



http://www.nvidia.com/object/product_geforce_9800_gtx_plus_us.html

http://www.nvidia.com/object/product_geforce_gts_250_us.html

Funny, I could have sworn there were differences... PCB, PCI-E power requirements, cooler... In any case I really don't want to get into this here; it's hardly the thread. Wanna start one about it?


----------



## JrRacinFan (Nov 11, 2009)

DanishDevil said:


> 59xx is dual GPU. If you want trifire with 2 cards, you pair a 5970 with a 5870.



But last we knew, you can't crossfire cards from different families, e.g. only 58XX with 58XX, 48XX with 48XX ....

unless I am missing something here.


----------



## 1Kurgan1 (Nov 11, 2009)

Not as exciting as I would have hoped, but it's also a good sign. While the clock speeds on this are less than stellar, it's overkill either way. Yes 2x 5870's will be faster, but thats a big deal right now. ATI has said they have that "ace in the hole" for when GT300 lands, which is exactly what they need, and my bet is, when GT300 is announced, ATI will announce 5890 and a 5990.


----------



## devguy (Nov 11, 2009)

JrRacinFan said:


> But last we known you can't crossfire from different families. eg 58XX with 58XX, 48XX with 48XX ....
> 
> unless I am missing something here.



Yeah, the family of the chip is not as important as the core of the chip.  I would believe one could crossfire any cards that contain the RV870 pro or xt cores.  Since R800 is just 2xRV870, then it should be fine with the HD 58xx series.


----------



## aj28 (Nov 11, 2009)

DanishDevil said:


> I find it interesting that after the die shrink, they still had to underclock the cards to get them to run cool enough. ATi is feeling the heat (from their own cards).



Die shrink and double the stream processors. The "underclock" still leaves them at 4870 speeds, which is pretty impressive if you ask me.


----------



## 1Kurgan1 (Nov 11, 2009)

aj28 said:


> Die shrink and double the stream processors. The "underclock" still leaves them at 4870 speeds, which is pretty impressive if you ask me.



Not to mention that when you get a 4870, or especially a 4870x2, there isn't a ton of headroom. The 5870's have a ton though; heck, even with 5850's I'm seeing a ton of people over 1000mhz on the core. Dual-GPU cards are just heat monsters.


----------



## Soylent Joe (Nov 11, 2009)

Wow, this thing is going to be badass, and probably close to a grand.


----------



## eidairaman1 (Nov 11, 2009)

I hope it's overwhelmingly fast, and has certain techniques that make it faster for games that do not support crossfire etc.  A gaming buddy of mine has a 4870X2; apparently MS had a flaw in the OS that was problematic with detecting video cards in multi-card configs, or even dual-chip cards. They fixed it and his performance jumped considerably.


----------



## hv43082 (Nov 11, 2009)

More importantly, HOW MUCH???


----------



## Kenshai (Nov 11, 2009)

hv43082 said:


> More importantly, HOW MUCH???



Expect slightly over the price of two 5870's at launch, but slightly cheaper than two 5870's at its settled price.

This is a guess, but it's what the previous generation did. I assume ATI's going to go with the same thing.


----------



## jaredpace (Nov 11, 2009)

hv43082 said:


> More importantly, HOW MUCH???


----------



## Tatsumaru (Nov 11, 2009)

Damn, a monster, not a card! For its power and its fantastic length.
This thing will kick the crap out of even the strongest Fermi single-chip card, which still has a very long way to go to reach the market.


----------



## AsRock (Nov 11, 2009)

1Kurgan1 said:


> Not as exciting as I would have hoped, but it's also a good sign. While the clock speeds on this are less than stellar, it's overkill either way. *Yes 2x 5870's will be faster, but thats a big deal right now*. ATI has said they have that "ace in the hole" for when GT300 lands, which is exactly what they need, and my bet is, when GT300 is announced, ATI will announce 5890 and a 5990.



But if CF is not supported by the game, is it going to run worse than a single 5870?


----------



## Lazzer408 (Nov 11, 2009)

5970 = 5870x2?



CHAOS_KILLA said:


> call it the hd5870x2, stop changing the naming schemes, dont follow in nvidias footsteps! lol



Agreed.


----------



## newconroer (Nov 11, 2009)

mtosev said:


> 4GB. hmm,... most PCs have that amount of RAM. a lot of memory on that card. should be good for playing games at max for the next 3years.



Maybe, but isn't the architecture still the same? The cards don't technically have access to 4gb in real time, it's 2gb/2gb instead?


----------



## Animalpak (Nov 11, 2009)

Man that thing will be heavy.


----------



## eidairaman1 (Nov 11, 2009)

they may have looked at how the 4870X2 accesses the ram and actually improved on it or changed the technique to make it work better overall.


----------



## Kenshai (Nov 11, 2009)

eidairaman1 said:


> they may have looked at how the 4870X2 accesses the ram and actually improved on it or changed the technique to make it work better overall.



All it's doing is crossfire on a single card, it should work as crossfire does on two cards.


----------



## Lazzer408 (Nov 11, 2009)

Kenshai said:


> All it's doing is crossfire on a single card, it should work as crossfire does on two cards.



I had 5-10% better performance using 2x3870s vs. 1 3870x2 fwiw.


----------



## Kenshai (Nov 11, 2009)

Lazzer408 said:


> I had 5-10% better performance using 2x3870s vs. 1 3870x2 fwiw.



Was referring to the memory, not overall performance.


----------



## W1zzard (Nov 11, 2009)

i just got an answer regarding "4 GB card requires 64-bit OS".

This is not the case. Any such cards will work fine in a 32-bit OS. Once the texture is created in system memory (which you have to do in any case) the GPU is tasked with copying it to video memory without any CPU interaction (DMA transfer). In other words, the GPU is told: "copy 16 MB of texture from main memory address X into GPU memory". Done.
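W1zzard's answer boils down to: the CPU only stages data in system memory and issues a copy command; the card's copy engine then moves the data by DMA, so the CPU never needs the full 4 GB of VRAM in its own address space. A toy model of that flow, with all class and parameter names made up for illustration:

```python
# Toy model of the DMA flow described above. The CPU stages a texture
# in system RAM and issues one command; the (mock) GPU copy engine
# writes it into video memory, so the CPU never maps the card's VRAM.
# MockCopyEngine and its methods are invented names, not a real API.

class MockCopyEngine:
    def __init__(self, vram_bytes: int):
        self.vram = bytearray(vram_bytes)   # stands in for on-card GDDR5

    def dma_copy(self, src: bytes, dst_offset: int) -> None:
        # "copy N bytes of texture from main memory address X into GPU memory"
        self.vram[dst_offset:dst_offset + len(src)] = src

gpu = MockCopyEngine(vram_bytes=64 * 1024)
texture = bytes(16 * 1024)                  # texture created in system memory
gpu.dma_copy(texture, dst_offset=0)         # one command; no CPU mapping of VRAM
print(len(texture))                          # 16384 bytes transferred
```

The same pattern is how real host-to-device copies work in GPU APIs: the driver hands the copy engine a source address, destination address, and length, and the transfer proceeds without the CPU touching device memory.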


----------



## extrasalty (Nov 11, 2009)

Lazzer408 said:


> I had 5-10% better performance using 2x3870s vs. 1 3870x2 fwiw.


3870X2 had PCIe 1.1 bridge chip.


----------



## inferKNOX (Nov 11, 2009)

Kenshai said:


> Expect slightly over the price over 2 5870's at launch but slightly cheaper than 2 5870's at it's settle price.
> 
> This is a guess, but it's what the previous generation did. I assume ATI's going to go with the same thing.


Don't you mean, slightly under the price of 2 5870s? That's how I remember it from the 4 series.
Even now, looking at *jaredpace* post, you can see that it's US$600, where 2x5870s would cost about US$800, considering they're going for about US$400 ATM.

All of which is pretty pricey thanks to nVidia's fumbling. If you think about it however, the performance is worth the price, far more so than previous generations of cards that still couldn't handle the games of their day totally, but were being priced even higher than this.


----------



## inferKNOX (Nov 11, 2009)

W1zzard said:


> i just got an answer regarding "4 GB card requires 64-bit OS".
> 
> This is not the case. Any such cards will work fine in a 32-bit OS. Once the texture is created in system memory (which you have to do in any case) the GPU is tasked with copying it to video memory without any CPU interaction (DMA transfer). In other words, the GPU is told: "copy 16 MB of texture from main memory address X into GPU memory". Done.



And how about how much system memory is needed to use the card? Is it alright to have less system RAM than GFX card RAM and still have the system run okay? (Coz I could have sworn I heard it here on TPU from someone I consider intelligent like yourself, that it can't.
Lol, sorry *ToTTenTranz*)


----------



## Polarman (Nov 11, 2009)

I'm just trying to imagine how much heat and noise this thing will generate. Ouch!


----------



## Zubasa (Nov 11, 2009)

Polarman said:


> I'm just trying to imagine how much heat and noise this thing will generate. Ouch!


Hardly any noise; ATi usually lets these cards run @90C, so it's dead quiet.


----------



## OnBoard (Nov 11, 2009)

W1zzard said:


> i just got an answer regarding "4 GB card requires 64-bit OS".
> 
> This is not the case. Any such cards will work fine in a 32-bit OS. Once the texture is created in system memory (which you have to do in any case) the GPU is tasked with copying it to video memory without any CPU interaction (DMA transfer). In other words, the GPU is told: "copy 16 MB of texture from main memory address X into GPU memory". Done.





			
Microsoft said:

> Various devices in a typical computer require memory-mapped access. This is known as memory-mapped I/O (MMIO). For the MMIO space to be available to 32-bit operating systems, the MMIO space must reside within the first 4 GB of address space.
> 
> For example, if you have a video card that has 256 MB of onboard memory, that memory must be mapped within the first 4 GB of address space. If 4 GB of system memory is already installed, part of that address space must be reserved by the graphics memory mapping. Graphics memory mapping overwrites a part of the system memory. These conditions reduce the total amount of system memory that is available to the operating system.



http://support.microsoft.com/kb/929605/en-us

Doesn't that mean that with HD5970 you can only have max 2GB of RAM no matter how much is installed on your 32bit OS, hence the "you need 64bit OS" for this (and similar) card(s)?


----------



## eidairaman1 (Nov 11, 2009)

The maximum a 32-bit OS supports for main system memory is about 3.25GB; if you want more than that, you need a 64-bit OS.


----------



## W1zzard (Nov 11, 2009)

OnBoard said:


> http://support.microsoft.com/kb/929605/en-us
> 
> Doesn't that mean that with HD5970 you can only have max 2GB of RAM no matter how much is installed on your 32bit OS, hence the "you need 64bit OS" for this (and similar) card(s)?



no. it maps a maximum of 256 MB for the graphics card
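The back-of-envelope math behind W1zzard's answer: only the card's MMIO aperture, not its full VRAM, is carved out of the 32-bit address space. A sketch with illustrative reservation sizes (the 256 MB aperture is the figure from the post; the other-MMIO figure is an assumption for the example):

```python
# 32-bit address-space arithmetic: the GPU's aperture (assumed 256 MB),
# not its full 4 GB of VRAM, is mapped below the 4 GB line. The other
# MMIO reservation size here is illustrative, not measured.

GB = 1024 ** 3
MB = 1024 ** 2

address_space = 4 * GB          # total 32-bit physical address space
gpu_aperture = 256 * MB         # mapped window into VRAM, not all 4 GB
other_mmio = 512 * MB           # chipset, PCI devices, firmware (assumed)

usable_ram = address_space - gpu_aperture - other_mmio
print(usable_ram / GB)          # ~3.25 GB of system RAM left addressable
```

So a 4 GB card costs a 32-bit OS roughly the same address space as a 256 MB card would; the "you need a 64-bit OS" concern applies to installed system RAM, not to VRAM size.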


----------



## Animalpak (Nov 11, 2009)

Zubasa said:


> Hardly any noise, ATi usuaully let these cards run @90C so its dead quiet.



A friend of mine had a 4870 X2 and would say that it warmed the back of the case so much that the DVI and USB cables etc. were soft from the heat.


----------



## 1Kurgan1 (Nov 12, 2009)

Soylent Joe said:


> Wow, this thing is going to be badass, and probably close to a grand.



A grand? Not a chance; this is 2 underclocked 5870 GPU's, so expect it to be under the cost of 2x 5870's. Anyone who's after this card would most likely get 2x 5870's instead, so they have to keep the price under that. The real question is: are these binned-down chips, or are they just underclocked because of heat? If they are underclocked because of heat, anyone who puts a waterblock on this will be getting 2 full-fledged 5870's for cheap.



AsRock said:


> But if CF is not supported by the game it's going run worse than a single 5870 ?.



At stock clocks, yes, the 5970 would be slower in a game that doesnt support CF.



Animalpak said:


> a friend of mine who had a 4870 X2 said it got so warm behind the case that the DVI and USB cables etc. ... were soft from the heat.



You could cook breakfast on the backplate of a 4870x2. And I'm not even kidding you, if I was to actually hold my finger on my backplate of my 4870x2 it would literally burn me. The exhausting air is pretty hot too, but I think your friend was exaggerating a bit there.


----------



## newfellow (Nov 12, 2009)

Looking at the specs, this will cost a small fortune. 4 GB? My god, that's going to be something. I wonder how the little sister, a 5850 X2, will turn out. 2 x 1600 SPs? Hell, they can't even fully utilize low-end 800 SPs, so how do they think 3200 SPs will be utilized, and what on earth would even use such power? Even a G92 is way too much for today's demands.

I'm going to wait and see how far my current equipment stretches before even considering this kind of hardware. Plus, the drivers must improve massively, and fast.


----------



## mtosev (Nov 12, 2009)

jaredpace said:


> http://img188.imageshack.us/img188/7348/5790.png



That equals about 600 EUR in Europe. No thank you, way too much...


----------



## eidairaman1 (Nov 12, 2009)

all i hear is griping, if you want to gripe, send an Email to AMD.


----------



## mtosev (Nov 12, 2009)

newconroer said:


> Maybe, but isn't the architecture still the same? The cards don't technically have access to 4gb in real time, it's 2gb/2gb instead?



there are two GPUs: GPU0 has 2GB available and GPU1 has its own 2GB of ram. if a game supports CF/SLI then all 4GB will be used. if I remember correctly, part of the image is rendered by the first GPU and the other part of the picture is rendered by the second GPU.


----------



## Mussels (Nov 12, 2009)

ToTTenTranz said:


> That makes no sense at all! The memory subsystem is handled by the GPU itself, not the CPU.
> 
> Only if the PC used an UMA architecture like the consoles (or systems with IGPs), that constraint could exist.



it is a valid question, DX9 apps have to be loaded into system ram, then moved to the video card.

if it was larger than 2GB, a 32 bit OS/app would only be able to use the first 2GB.

then again, 32 bit apps can only use 2GB of address space total, so they'd be optimised to use less than that anyway




W1zzard said:


> i just got an answer regarding "4 GB card requires 64-bit OS".
> 
> This is not the case. Any such cards will work fine in a 32-bit OS. Once the texture is created in system memory (which you have to do in any case) the GPU is tasked with copying it to video memory without any CPU interaction (DMA transfer). In other words, the GPU is told: "copy 16 MB of texture from main memory address X into GPU memory". Done.



i missed w1zzards post - but he pretty much said the same thing.

you have a 2GB limit between video ram and system ram for a 32 bit app, and they get to balance it out themselves so it doesnt crash by going over that limit.

sure a 2GB card wont reach its max potential in a 32 bit OS in DX9 - but so few games use 1GB of Vram, it doesnt matter just yet (and due to DX10's saving of system ram, it would also delay the problems for DX10/11 users)



mtosev said:


> there are two GPUs , GPU0 has 2GB available and GPU1 has 2GB of ram. if a game supports CF/SLi then all 4GB will be used. if i remember correctly part of the image is rendered by first gpu and the other part of the picture is rendered by the second gpu.



This is not true. RAM is not additive - each GPU in CrossFire or SLI contains a copy of the RAM in its entirety; it's not divided up in any way
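A toy model makes the point concrete (my own sketch, not anything from the actual drivers):

```python
# Toy model of mirrored VRAM in CrossFire/SLI alternate-frame rendering:
# each GPU renders complete frames on its own, so each one needs a full
# copy of every texture. Memory is mirrored, not pooled.

def effective_vram_mb(per_gpu_mb: int, num_gpus: int) -> int:
    """Unique data a mirrored multi-GPU card can hold, in MB.

    The box advertises per_gpu_mb * num_gpus; software only ever gets
    one copy's worth of distinct assets.
    """
    return per_gpu_mb  # independent of num_gpus: every GPU holds the same data

# A "4 GB" HD 5970 (2 GPUs x 2048 MB) behaves like a 2 GB card:
print(effective_vram_mb(2048, 2))  # -> 2048
```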


----------



## erocker (Nov 12, 2009)

eidairaman1 said:


> all i hear is griping, if you want to gripe, send an Email to AMD.



Bah, and no one should be using a 32bit O/S nowadays anyway.


----------



## pr0n Inspector (Nov 12, 2009)

I will put this simply: to make memory accessible to software, it must have unique addresses. There are only 4G of these addresses in a plain 32-bit OS. The end.

If you want to see it with your own eyes, open _Device Manager_ and look under the _Resources_ tab of your video card. See those funny hex numbers? Those are the addresses assigned to your card's vRAM.
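To put numbers on those hex ranges (the addresses below are hypothetical; read your own off the Resources tab):

```python
# Each resource entry in Device Manager is a start/end pair of physical
# addresses; the span between them is the chunk of the 32-bit (4 GB)
# address space that the device has claimed.

def range_size_mb(start: int, end: int) -> float:
    """Size in MB of an inclusive [start, end] memory resource range."""
    return (end - start + 1) / (1024 * 1024)

# e.g. a 256 MB graphics aperture (made-up location):
start, end = 0xD0000000, 0xDFFFFFFF
print(f"{range_size_mb(start, end):.0f} MB claimed at 0x{start:08X}")  # -> 256 MB
```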


----------



## mtosev (Nov 12, 2009)

Mussels said:


> This is not true. ram is not additive - each GPU in crossfire or SLI contains a copy of the ram in its entirety, its not divided up in any way



I meant that each GPU has its own 2GB of RAM and addresses that amount, and that the other GPU likewise has its own 2GB of RAM.

if i'm wrong please correct me.


----------



## Mussels (Nov 12, 2009)

pr0n Inspector said:


> I will put this simply: to make  memories accessible to software, they must have unique addresses. There is only 4G of these addresses in a plain 32-bit OS. The end.
> 
> If you want to see it with your own eyes, open _Device Manger_, look under the _Resources_ tab of your video card. See those funny hex numbers? Those are the addresses assigned to your card's vRAM.



4GB to the OS, 2GB address space to any one application.

we all know this - i have a nice educational link in my sig where we all hammered it out and finally got some accurate info (even w1z participated, thanks w1zzy)


----------



## pr0n Inspector (Nov 12, 2009)

Mussels said:


> 4GB to the OS, 2GB address space to any one application.
> 
> we all know this - i have a nice educational link in my sig where we all hammered it out and finally got some accurate info (even w1z participated, thanks w1zzy)



No, you are referring to the 50/50 memory split in Windows.

I am talking about the number of addresses available in a 32-bit OS. There are only 4G of addresses, and video card memory takes priority over system memory, thus bigger vRAM = less addresses for system RAM = system RAM "disappeared". What's more, video card RAM isn't the only thing that needs addresses, other devices need them too, so there's even less left for system RAM.


----------



## Monkeywoman (Nov 12, 2009)

i want one soo bad, going to wait till tax return time before i drop 600 on a vid card


----------



## ToTTenTranz (Nov 12, 2009)

W1zzard said:


> i just got an answer regarding "4 GB card requires 64-bit OS".
> 
> This is not the case. Any such cards will work fine in a 32-bit OS. Once the texture is created in system memory (which you have to do in any case) the GPU is tasked with copying it to video memory without any CPU interaction (DMA transfer). In other words, the GPU is told: "copy 16 MB of texture from main memory address X into GPU memory". Done.



Of course. The system's CPU doesn't keep track of what's in the graphics card, which means it doesn't have to spend its registers to address the information in the graphics card.

Unless it's a UMA system, but that's a whole other story.


----------



## eidairaman1 (Nov 12, 2009)

erocker said:


> Bah, and no one should be using a 32bit O/S nowdays anyways.



Trust me, My new machine will have Win 7 X64 and Win XP SP3 in dual boot from separate HDs.

Also FYI i was talking about a topic posted earlier

http://tinyurl.com/ykg9avt


----------



## Mussels (Nov 12, 2009)

ToTTenTranz said:


> Of course. The system's CPU doesn't keep track of what's in the graphics card, which means it doesn't have to spend its registers to address the information in the graphics card.
> 
> Unless it's a UMA system, but that's a whole other story.



it doesnt keep track of whats in video memory, but it does have to keep a copy in local memory.

there are links about this in the x64 thread in my sig.

DX9 = system RAM copied to video RAM. System RAM gets modified, then copied to video RAM before the video card gets it

DX10 (and up) = direct access to video ram. No system copy = less ram/address space used


----------



## EarlZ (Nov 12, 2009)

Just waiting for the GT300 to stomp this card and an X2 version to bury it


----------



## Disparia (Nov 12, 2009)

^ 5970 X2? Yes, I'm liking that idea!

Then Crossfire a couple of them!


----------



## facepunch (Nov 12, 2009)

Jizzler said:


> ^ 5970 X2? Yes, I'm liking that idea!
> 
> Then Crossfire a couple of them!



agreed, then buy a water block and overclock the hell out of them


----------



## Steevo (Nov 12, 2009)

EarlZ said:


> Just waiting for the GT300 to stomp this card and an X2 version to burry it



Let us know how that goes.......next year....


----------



## 1Kurgan1 (Nov 12, 2009)

EarlZ said:


> Just waiting for the GT300 to stomp this card and an X2 version to burry it



You're going to be waiting a while. I highly doubt the GT300 will be as powerful as whatever card ATI announces the instant NV announces the GT300 is coming to market. You know ATI will be holding back the slap to kill NV's happy day of announcing their DX11 card coming to market.


----------



## Imsochobo (Nov 12, 2009)

the card's memory amount does not need to fit within system memory; i have never noticed any problems with that.
A 64-bit OS is just required for the 64-bit extensions, and for getting more than roughly 3.37 GB of usable system memory when running Microsoft Windows operating systems.
Linux can do 4GB in 32-bit just fine.


----------



## Steevo (Nov 12, 2009)

The 32-bit OS & 8GB (or whatever) of added memory isn't a huge issue, but you will see a performance decrease as pages have to be translated to the higher memory addresses.


This 4GB card would be seen as a 2GB card were it to memory-map its entire addressable memory, as the card is basically CrossFire on a stick. So no, you do not have 4GB of memory to use; you have 2GB per core.


Any further disagreements from this will result in a cheese grater and salt vinegar treatment to your ass.


----------



## shevanel (Nov 12, 2009)

I wonder why they went to call it a 5970 and not a 5870x2 or even put it as the 5890.. why 5970?

Was there a 4970?


----------



## pr0n Inspector (Nov 12, 2009)

eidairaman1 said:


> Maximum 32Bit OS supports for Main system memory is *3.25GB*, if you want more than that you need 64bit OS.






Imsochobo said:


> system memory amount does not need to exceed system memory, i have never noticed any problems with that.
> 64 bit OS is just required to get 64 bit extension, and getting over *3.37 gb* of system memory running Microsoft windows operating systems



Hilarious!

hint: That's not a fixed number.



Imsochobo said:


> Linux can do 4gb in 32 bit just fine.



PAE PAE PAE PAE PAE PAE PAE PAE


----------



## Hayder_Master (Nov 12, 2009)

the beast is coming out, welcome to hell


----------



## eidairaman1 (Nov 12, 2009)

pr0n Inspector said:


> Hilarious!
> 
> hint: That's not a fixed number.
> 
> ...




read this, genius. Straight from Microsoft's mouth:

http://www.microsoft.com/whdc/system/platform/server/PAE/PAEdrv.mspx

more useful info

http://en.wikipedia.org/wiki/Physical_Address_Extension

anyway, I'm going back to the discussion of the video cards


----------



## pr0n Inspector (Nov 12, 2009)

eidairaman1 said:


> read this genius, From Microsoft's Mouth
> 
> http://www.microsoft.com/whdc/system/platform/server/PAE/PAEdrv.mspx
> 
> ...



uh... ok? So what exactly are you trying to say? Or are you just trying to link flood without saying anything?

Here's a link to an old but excellent article on Dan's Data, which is where I learned about this issue long ago.


----------



## Mussels (Nov 12, 2009)

PAE applies to windows as well as linux.

its not as simple as allowing more than 4GB in a 32 bit operating system, and its nowhere near as good as a true x64 environment


----------



## pr0n Inspector (Nov 12, 2009)

Mussels said:


> PAE applies to windows as well as linux.
> 
> its not as simple as allowing more than 4GB in a 32 bit operating system, and its nowhere near as good as a true x64 environment



PAE on desktop Windows and the cheaper _Server_ editions is practically useless because Microsoft put an artificial limit on the addressable space (4GB), for compatibility reasons.


----------



## mtosev (Nov 12, 2009)

mtosev said:


> I ment that each GPU has it's own 2GB of ram and it addresses that amount of ram. and that the other gpu has it's own 2GB of ram.
> 
> if i'm wrong please correct me.



so am i correct or not?

on the GTX 295 the GPUs are on separate PCBs and each GPU has its own RAM:


----------



## dinmaster (Nov 12, 2009)

just wanting to know if the mini dp needs an active adapter or a passive adapter, planning on using eyefinity on 3 monitors but they are dvi.


----------



## handsomerichguy (Nov 12, 2009)

So the 5970 can't be connected to six monitors? Where's the card they said could drive six monitors? The last picture I saw of an ATi 5000 card plugged into 6 monitors, I thought it was the Hemlock / 5970.


----------



## pantherx12 (Nov 12, 2009)

shevanel said:


> I wonder why they went to call it a 5970 and not a 5870x2 or even put it as the 5890.. why 5970?
> 
> Was there a 4970?





Just to simplify and shorten things.



3/4/5 generation
5/6/7/8/9 type of card, 5= htpc class, 8 = enthusiast class, 9 = dual GPU
30/50/70/90, card rankings for type of card.

Simples : ]


----------



## Mussels (Nov 12, 2009)

pantherx12 said:


> Just to simplify and shorten things.
> 
> 
> 
> ...



except that the 4890 wasn't dual GPU


----------



## Mussels (Nov 12, 2009)

handsomerichguy said:


> So the 5970 is not connected to six monitors? Where's the card they said can be connected to six monitors? Last time I saw the picture of an ATi 5000 card is plugged with 6 monitors and I thought It's the Hemlock or 5970



a variety of the ATI cards can, they're a special "eyefinity" variant.

so you can get a 5850, 5870 and 5890 all in normal (3/4 monitors) or eyefinity (6 monitors)


----------



## pantherx12 (Nov 12, 2009)

Mussels said:


> except that the 4890 wasn't dual GPU





Re-read my friend.

first number
second number

3rd and 4th numbers.

4
8 <- see single GPU
90 <- how pimp the card is in its class.


----------



## Mussels (Nov 12, 2009)

you say "9 = dual GPU"

i misread that.


you mean the 9 as in x9xx

say... the 4970 that never existed


----------



## pantherx12 (Nov 12, 2009)

yes.

the second digit, not the 3rd : ]

For once I didn't make a typo either


----------



## AsRock (Nov 12, 2009)

devguy said:


> This may sound lame, but I would actually prefer it if they delayed the launch of the 5970 to the time of release of Fermi.  My reasoning is that this card coming out is going to further reduce the stock of available RV870 chips available.  Thus, even worse shortages of the more practical 5850/5870.
> 
> I mean, in all honesty, how many people buy these $500+ dollar cards at launch?  I know only a few who did, and when they sold them to buy a 5870, they only got around $200 (didn't even come close to cover their 5870 cost).  That is the cost for buying into new technology, sure, but I'd rather AMD focus on getting more 58xx series cards available on the market.  Plus, the 5970 coming out when Fermi does, while serve as a sort of distraction to nVidia.
> 
> And as for the clocks, nVidia did the almost the exact same thing with the GTX 295.  It was a GTX 285 with the memory bandwidth of the GTX 260 and similar clocks to it, yet had full shader count.  And became the GTX 275.



Maybe they have a load of chips that aren't clocking as well and will use those? And didn't I hear that they're doing another card after this, so maybe that one will have default clocks.


----------



## ToTTenTranz (Nov 12, 2009)

shevanel said:


> I wonder why they went to call it a 5970 and not a 5870x2 or even put it as the 5890.. why 5970?
> 
> Was there a 4970?



Well, someone at wikipedia thinks that HD5890 will be an Evergreen chip with 384bit memory bus.

I would like that. It would make the HD5890 a high-end but very balanced GPU like the HD4890 was already.


----------



## inferKNOX (Nov 12, 2009)

pr0n Inspector said:


> No. you are referring to the 50/50 memory splitting of Windows.
> 
> I am talking about the number of addresses available in a 32-bit OS. There are only 4G of addresses, and video card memory takes priority over system memory, thus bigger vRAM = less addresses for system RAM = system RAM "disappeared". What's more, video card RAM isn't the only thing that needs addresses, other devices need them too, so there's even less left for system RAM.


50/50 memory splitting? 2GB is just the maximum address space limit a single application is given within a 32-bit system at any given time.


			
Wiki said:

> Ability to run existing 32-bit applications (.exe's) and dynamic link libraries (.dll's). A 32-bit program, if linked with the "large address aware" option, can use up to 4 GB (4,294,967,296 bytes) of virtual address space, as compared to the default 2 GB (2,147,483,648 bytes; optional 3 GB [3,221,225,472 bytes] with /3GB boot.ini option and "large address aware" link option) offered by 32-bit Windows.


source


----------



## Mussels (Nov 12, 2009)

inferKNOX said:


> 50/50 memory splitting? 2GB is just the maximum address space limit a single application is given within a 32-bit system at any given time.
> 
> source



i actually have a tool that lets you mod exe files to make them large address aware - very handy.

worked well on games like stalker, supreme commander (original/early patches) and even the sims 3, preventing eventual crashes as they hit the 2GB barrier.



I'm not entirely sure that all 4GB is addressable, i think its another case of "in a perfect world"

a large address space aware app can use whatever is available, if windows sees only 3.25GB of ram, i get the feeling the app may find a similar limit.
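For the curious, the essence of such a tool is tiny: it flips the IMAGE_FILE_LARGE_ADDRESS_AWARE bit (0x0020) in the PE file's COFF header. A minimal sketch of the idea (run it on a copy of the .exe; apps with baked-in 2GB pointer assumptions can still misbehave afterwards):

```python
import struct

IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020

def set_laa(pe_bytes: bytearray) -> bytearray:
    """Set the large-address-aware flag in an in-memory PE image."""
    # e_lfanew at offset 0x3C of the DOS header points at the "PE\0\0" signature
    pe_off = struct.unpack_from("<I", pe_bytes, 0x3C)[0]
    if pe_bytes[pe_off:pe_off + 4] != b"PE\0\0":
        raise ValueError("not a PE file")
    # Characteristics is the last field of the 20-byte COFF FileHeader,
    # 18 bytes past its start (2 bytes, little-endian)
    char_off = pe_off + 4 + 18
    chars = struct.unpack_from("<H", pe_bytes, char_off)[0]
    struct.pack_into("<H", pe_bytes, char_off,
                     chars | IMAGE_FILE_LARGE_ADDRESS_AWARE)
    return pe_bytes
```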


----------



## inferKNOX (Nov 12, 2009)

Mussels said:


> *i actually have a tool that lets you mod exe files to make the large address aware* - very handy.
> 
> worked well on games like stalker, supreme commander (original/early patches) and even the sims 3, preventing eventual crashes as they hit the 2GB barrier.
> 
> ...


Yes, but that requires modding, it is not active by default in the system.
EDIT: from what I know, there are registers (or something) that take the ~0.8GB chunk out of the 4GB, making it 3.2GB addressable, and out of that, some goes to the kernel, etc leaving the lower (numbered) address spaces to apps. Don't quote me though, I'm not spot on... but that's the idea as I know it.


----------



## pr0n Inspector (Nov 12, 2009)

inferKNOX said:


> 50/50 memory splitting? 2GB is just the maximum address space limit a single application is given within a 32-bit system at any given time.
> 
> source



Half for the kernel, half for the app: 50/50. 4/2 = 2.
Where do you think the "2GB limit" came from?
Not to mention, there won't even be 4GB of usable RAM in the first place.

And please for the love of the $readers_preferred_deity$, stop bringing up numbers like 3.2GB, 3.25GB, 3.3333GB 3.456789GB. They are meaningless. It's not a fixed value, it's what's left of the available addresses after all other devices grabbed their own.
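A back-of-envelope illustration (the reservation sizes below are invented for one hypothetical machine):

```python
# Usable RAM under a plain 32-bit OS is whatever remains of the 4 GB
# address space after every device has claimed its memory-mapped ranges,
# which is exactly why the "missing RAM" figure varies from box to box.
MB, GB = 2**20, 2**30
ADDRESS_SPACE = 4 * GB

reservations = {                  # hypothetical machine
    "graphics aperture":    256 * MB,
    "PCI/chipset MMIO":     512 * MB,
    "firmware, APIC, etc.":  64 * MB,
}
usable = ADDRESS_SPACE - sum(reservations.values())
print(f"{usable / GB:.2f} GB left for system RAM")  # -> 3.19 GB on this box
```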


----------



## imperialreign (Nov 12, 2009)

btarunr said:


> You needed that for the ASUS MARS as well. Besides anyone who bought the HD 4870 X2 (a 2 GB card) with 2 GB of system memory was expected to be running 64-bit OS anyway, so it's not a big deal for its target consumers.



Agreed.

TBH, the whole market really needs to start moving more towards x64 anyways.  Between some of this new video hardware, and the fact that WIN7 includes a fully licensed copy of x64 at no additional cost . . . there's really *no* excuse why the end user can't move over to x64, either (although, I'm sure there will be a ton of excuses like always).

Regarding these cards - looks good.  I'm planning on two of these beasts once they're released.


----------



## Depth (Nov 16, 2009)

I might have room for a few of those if I saw out my HD enclosure.


----------



## vagxtr (Nov 16, 2009)

toyo said:


> It seems the temperature issues rumours were true... too bad AMD engineers couldn't find a way to keep the stock speeds.



The problem wasn't overheating, since the heatpipes and cooling fins could easily handle well over a measly 300 W; rather, they tried to stay under 300 W for the sake of PCIe compliance. Each 6-pin PCIe plug is rated for 75 W and each 8-pin for 150 W, while the PCIe x16 slot itself provides 75 W, so the card's connector configuration caps it at 300 W for safe operation, and most cards of this class top that out with 300 W+ peaks. This card is designed to be fail-safe for GPGPU use in 24/7 operation, so that's the reason they kept the TDP as low as they could.
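The budget arithmetic can be sketched with the standard PCIe limits (slot 75 W, 6-pin 75 W, 8-pin 150 W); the card pairings in the comments are just examples:

```python
# Max spec-compliant board power = slot power + sum of auxiliary plugs.
LIMITS_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def board_power_limit(plugs):
    """Spec ceiling in watts for a card in a PCIe x16 slot."""
    return LIMITS_W["slot"] + sum(LIMITS_W[p] for p in plugs)

print(board_power_limit(["6-pin", "6-pin"]))  # -> 225 (e.g. an HD 5870)
print(board_power_limit(["6-pin", "8-pin"]))  # -> 300 (the HD 5970's layout)
print(board_power_limit(["8-pin", "8-pin"]))  # -> 375
```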


----------



## eidairaman1 (Nov 16, 2009)

That needs a Sticky



Mussels said:


> i actually have a tool that lets you mod exe files to make the large address aware - very handy.
> 
> worked well on games like stalker, supreme commander (original/early patches) and even the sims 3, preventing eventual crashes as they hit the 2GB barrier.
> 
> ...


----------



## vagxtr (Nov 16, 2009)

gumpty said:


> I'm not entirely convinced that these are 5950 & not 5970, but here you go. Source.
> 
> First pic has it next to a regular 5870.



Weird thing is that the supposed HD 5970 in those pics has an old analog TV-out/RGB connector on its backplate, which no other card in the HD 5000 series has. Didn't ATi cut out analog TV-out support with Cypress?


----------



## vagxtr (Nov 16, 2009)

ToTTenTranz said:


> Nonetheless, calling it HD5900 instead of HD5800X2 makes sense to me.
> 
> It's about time we get to have shorter names in ATI's lineup.
> 
> ...



It's not done to shorten the elongated enthusiast X2 names; AMD is just once again copying Intel's and NVIDIA's naming schemes. It seems AMD's marketing herd thinks Intel/NV are smarter than they are. I never liked it. Since NVIDIA broke with its GX2 naming scheme for sandwich cards after the 9800 GX2, ATI decided to follow that path and get rid of a nicely fitting name like X2 (or the R800 codenames, for that matter) in favor of blatantly confusing ones. When the X2 branding was officially introduced with the 3870 X2, ATI explained how well those names fit into their CPU/GPU lineup alongside the X2 and X4 Athlons, and now they're abandoning that just because NV also released a sandwiched dual-G200b card based on a cut-down chip (as this 5970 appears to be): not the full 512-bit bus of the GTX 280/285 but the 448-bit bus of the GTX 260/275, and it claimed the throne over the 4870 X2 all the same. So ATI did just as NV did: use affordable GPUs and lower clock speeds while retaining the less complex 256-bit memory bus per chip. And, as they said, they "didn't want to mislead customers into believing that the 5970 (ex 5870 X2) really is a 5870 times two" when it's just a downclocked pair of fully functional RV870 chips.

And ATI and NVIDIA abandoning their long-established code-naming schemes (the Rx00 series, and recently the Gx00 series) in favor of branding like newcomer Intel's Larrabee is just too stupid. They now have names that mean next to nothing instead of a down-to-earth numbering scheme. We all know where that leads: more and more money spent on marketing "wisdom" instead of on making better chips (and at least they have the tradition and a great starting ground) than the richy-rich player trying to bully its way into the market. As for NV, they at least tried to dress up their CUDA-capable GPUs with scientists' names, though none of them was a great mathematician AFAIK, and they skip from one scientist to another in the blink of an eye; not dedicated at all. But there's probably a reason for that: no one claims rights to those scientists' names. What a gutterly low way of promoting products in a patent-ruled world.


----------



## W1zzard (Nov 16, 2009)

the postman just delivered two packages. oh what it could be


----------



## W1zzard (Nov 16, 2009)

Mussels said:


> i actually have a tool that lets you mod exe files to make the large address aware - very handy.
> 
> worked well on games like stalker, supreme commander (original/early patches) and even the sims 3, preventing eventual crashes as they hit the 2GB barrier.
> 
> ...



for that to work, the app must not be coded in a way that limits it to 2 GB of allocations, and Windows must be running in /3GB mode; then the app can allocate a maximum of 3 GB of memory (no matter how much physical RAM you have, the paging file will handle it)


----------



## shevanel (Nov 16, 2009)

W1zzard said:


> the postman just delivered two packages. oh what it could be



how soon before we can see reviews? (wondering if there was an NDA date or something)


----------



## mtosev (Nov 16, 2009)

W1zzard said:


> the postman just delivered two packages. oh what it could be



one HD 5970 X2

and a mind altering substance that causes you to buy only AMD products. MUST HAVE MORE MORE!!!!!!!!!!1111


----------



## pantherx12 (Nov 16, 2009)

W1zzard said:


> the postman just delivered two packages. oh what it could be



You know what, I'm going to straight up say it, I hate you man 

I wish I got free toys to play with!


----------



## btarunr (Nov 16, 2009)

shevanel said:


> how soon before we can see reviews? )wondering if there was a nda date or something)



19th. A full-fledged launch-o-rama (main review(s), CrossFireX review, possibly PCI-E scaling).


----------



## W1zzard (Nov 16, 2009)

pantherx12 said:


> I wish I got free toys to play with!



free except for the time invested to make the review, the time invested to learn how to do reviews, the time to learn about graphics cards, the time to establish as credible source of reviews, probably more.


----------



## shevanel (Nov 16, 2009)

what happens to the cards that you review after theyve been reviewed?


----------



## DanishDevil (Nov 16, 2009)

He puts them up on a giant GPU billboard with oversized thumb tacks and carries it around with him to show off.


----------



## W1zzard (Nov 16, 2009)

shevanel said:


> what happens to the cards that you review after theyve been reviewed?



where do you think all the data for older cards in reviews comes from?


----------



## Kenshai (Nov 16, 2009)

W1zzard said:


> where do you think does all the data for older cards come from in reviews?



Your gigantic box of video cards? I lost the picture of it but it was pretty amazing


----------



## mtosev (Nov 16, 2009)

W1zzard said:


> free except for the time invested to make the review, the time invested to learn how to do reviews, the time to learn about graphics cards, the time to establish as credible source of reviews, probably more.



When you started the site, your reviews consisted mainly of fans and cheap stuff (< 50 EUR). How did you get from that to what the site is today? When I came here in 2005 the site wasn't well known at all, the forum didn't have a lot of users, etc.


----------



## jaredpace (Nov 16, 2009)

NDA Press deck slides here:

http://forums.techpowerup.com/showpost.php?p=1638260&postcount=114


----------



## verona (Nov 16, 2009)

No, you don't need a 64-bit OS just because there's 4GB of RAM on the card. The 4GB limit applies to system-addressable memory, not graphics memory. No matter how much memory is on the graphics card itself, it only uses up 256MB of system-addressable memory. That's it. I spent a really boring week plugging in all kinds of graphics cards and recording base address registers to verify this. I'm just glad I got paid to do it.


----------



## eidairaman1 (Nov 17, 2009)

it appears the 5970 is a downclocked 5870 without any shaders disabled, running at 725 MHz (which is 5850 speed), with the memory clock at 1 GHz


----------

