
AMD Radeon HD 5970 Specs Surface

So we might see Eyefinity Crossfire support in the next set of drivers?
 
:roll: it will run @5870's clock only if they make it 15" long
 
:roll: it will run @5870's clock only if they make it 15" long

I wanna see someone hard vMOD this card!!!!
Will be interesting!! I bet u'll need to solder 12 volts directly to the PCB????:D:p:nutkick:
 
call it the HD 5870 X2, stop changing the naming schemes, don't follow in nVidia's footsteps! lol

I'm not going to blame you, because you haven't been here long enough, but it is actually AMD that started this entire fiasco. Not just with changing the names, but mostly with how frequently they moved on to the next series, largely to generate hype. nVidia just (had) to follow suit.

It's like this:

DAAMIT

ATI - X-series - Spring 2004
ATI - X1k-series - Fall 2005 [+ 1.5 Years]
AMD - HD2k-series - Spring 2007 [+ 1.5 Years]
AMD - HD3k-series - Fall 2007 [+ 0.5 Years]
AMD - HD4k-series - Spring 2008 [+ 0.5 Years]
AMD - HD5k-series - Fall 2009 [+ 1.5 Years]

Highlight: HD 2900 XT -> 6 Months -> HD 3870 -> 6 months -> HD 4870

nVidia

nVidia - 6000-series - Spring 2004
nVidia - 7000-series - Spring 2005 [+ 1 Year]
nVidia - 8000-series - Fall 2006 [+ 1.5 Years]
nVidia - 9000-series - Winter 2007/2008 [+ ~2 Years]
nVidia - GTX 200-series - Spring 2008 [+ 0.5 Years]
nVidia - GTX 300-series - Fall 2009 [+ 1.5 Years]

Highlight: 8800 GT/GTS 512 -> 2 Months -> 9800 GT/GTX -> 4 Months -> GTX 280

So, as you can see, the healthy interval between new series from either graphics card manufacturer is 1.5 years. The unhealthy ones are 0.5 years and 2 years.
After AMD bought and merged with ATI, they failed to deliver a solid-performing chip in the R600. So, in order to compete with nVidia, they needed hype, which they gained by going through two series in a single year. What should have been the HD 2950, etc. was instead named HD 3850, as part of the new and completely fraudulent HD3k series.
Then nVidia got wind of this and needed to make a move to equal the hype. So they used the exact same GPU from the 8000 series, the G92, in the 9000 series, which was even worse than what AMD was doing, because nVidia was blatantly re-marketing its product under a superior name solely to garner hype. Thus they also jumped through two series in roughly the same amount of time.
And in the end, AMD pulled themselves up by the trousers and fashioned an actually competitive, and genuinely new, GPU, which started the HD4k series that lasted the healthy 1.5 years. nVidia again followed suit with their GTX 200-series, which will also last 1.5 years.
So, in the meantime, all is well in the graphics card kingdom, and the terror of the HD3k and 9000 series is forgotten. But who knows when these big companies will again try to trick us, too scared, in this almost childish mindset, to lose any piece of market share.
All I can say is men like me will be here to enlighten the masses, and protect the commoners.
 
There's my card.
 
WOW, so now to use this card u need a 64-bit OS since it has 4 GB, right???:wtf:

No, the card is its own subsystem.
And although it has 4 GB of RAM, only 2 GB is probably usable, just as if you had two 2 GB cards in CrossFire: you only really have 2 GB of video RAM for all practical purposes.
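To put rough numbers on that (just a sketch of the usual alternate-frame-rendering accounting; the figures and the little helper below are mine, not from the article):

# Illustrative only: under alternate-frame rendering (AFR), each GPU keeps its
# own full copy of textures and buffers, so the usable pool is one GPU's share.
def effective_vram_mb(total_mb, gpu_count):
    # every GPU mirrors the same working set, so divide rather than add
    return total_mb // gpu_count

print(effective_vram_mb(4096, 2))      # dual-GPU board sold as "4 GB" -> 2048
print(effective_vram_mb(2 * 2048, 2))  # two 2 GB cards in CrossFire   -> 2048

Either way you end up with roughly 2 GB of unique frame-buffer data, which is the point above.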

This card is going to be massive and heavy.
Get ready to build your braces!
 
I'm not going to blame you, because you haven't been here long enough, but it is actually AMD that started this entire fiasco. Not just with changing the names, but mostly with how frequently they moved on to the next series, largely to generate hype. nVidia just (had) to follow suit.

You have it all wrong...

ATI was releasing new technology, new cards, new cores.
Nvidia was just doing a small die shrink and renaming the 8800GT into the 9800GT.

Calling what should be a 5870x2 a 5890, that is a bit stupid.
But they did not take a 2600 and rename it a 3600.
 
I foresee some strangely high amount of suck coming from this card! It will probably beat the 5870 under load by a good margin, but it will probably be the same as two 5870s in CrossFire at idle. This isn't good for the 5970.
 
A wave of disappointment washes over... Dual 5850s with all 1600 SPs is not what I wanted from this card. *sigh*

Seems like it will actually hold a 5870 back in TriFire.
 
WOW, so now to use this card u need a 64-bit OS since it has 4 GB, right???:wtf:

That makes no sense at all! The memory subsystem is handled by the GPU itself, not the CPU.

Only if the PC used a UMA architecture like the consoles (or systems with IGPs) could that constraint exist.
 
That makes no sense at all! The memory subsystem is handled by the GPU itself, not the CPU.

Only if the PC used a UMA architecture like the consoles (or systems with IGPs) could that constraint exist.

It's a valid question regardless.
 
WTF??? Is it gonna have that DuOrb-like cooler from stock? Cos I see ATI brand stickers on it...?

I doubt it - the article said it was an early engineering sample - they just bolted a couple of heatsinks on so they can test it. I imagine the retail-ready piece will be like the normal stock coolers.
 
It's a valid question regardless.

Agreed, some people don't understand things as well, and it's a very common assumption.

To the point that I have heard techs at Microcenter tell customers that, hopefully just because they didn't feel like explaining the truth, but I doubt it :roll:
 
Nonetheless, calling it HD5900 instead of HD5800X2 makes sense to me.

It's about time we get to have shorter names in ATI's lineup.


In fact, I think it's about time that both ATI and nVidia drop the Radeon and Geforce brands. It's been 10 years now, we need new names!

I still remember when nVidia was so confident about NV30 that they even publicly stated that they'd drop the Geforce brand and call it a different name. When the architecture turned up as bad as it was, they had to use the Geforce brand again, for marketing purposes.
 
Nonetheless, calling it HD5900 instead of HD5800X2 makes sense to me.

It's about time we get to have shorter names in ATI's lineup.


In fact, I think it's about time that both ATI and nVidia drop the Radeon and Geforce brands. It's been 10 years now, we need new names!

I still remember when nVidia was so confident about NV30 that they even publicly stated that they'd drop the Geforce brand and call it a different name. When the architecture turned up as bad as it was, they had to use the Geforce brand again, for marketing purposes.

Nice point? With whom are you arguing?
 
That makes no sense at all! The memory subsystem is handled by the GPU itself, not the CPU.

Only if the PC used a UMA architecture like the consoles (or systems with IGPs) could that constraint exist.

Tell that to those who (still insisting on using a 32-bit OS) lost another chunk of their 4 GB of RAM due to the vRAM.
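For what it's worth, on 32-bit the chunk you lose isn't the whole 4 GB of vRAM but the card's MMIO aperture plus the other device reservations, all of which have to fit inside the same 4 GB address space as your RAM. A back-of-the-envelope sketch, with aperture sizes that are just typical assumptions on my part:

# Rough 32-bit address-space accounting; the reservation sizes are assumed
# typical values, not measurements of any particular board.
ADDRESS_SPACE_MB = 4096        # everything a 32-bit OS can address
installed_ram_mb = 4096        # physical RAM in the machine

reservations_mb = {
    "GPU apertures (assumed 2 x 256 MB on a dual-GPU card)": 512,
    "chipset, PCI and other device MMIO (rough guess)": 512,
}

reserved_mb = sum(reservations_mb.values())
usable_ram_mb = min(installed_ram_mb, ADDRESS_SPACE_MB - reserved_mb)
print("Usable RAM on 32-bit: about", usable_ram_mb, "MB of", installed_ram_mb, "MB installed")
# A 64-bit OS maps those apertures above the 4 GB line, so all of the
# installed RAM stays addressable.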
 
I just jizzed..........in my pants.


Anyway, will two of my MCW-60R blocks fit?
 
Good to know, thanks for the post! Right on!:slap:
 
That actually looks like two Zalman-type coolers, or it could be the DuOrb; either way it is sexy. What I don't like is two DVI and one Mini DisplayPort. That was supposed to be a six-screen behemoth. Hell, they should have kept it two DVI, one HDMI (which I use and I've seen monitors with; haven't seen a DisplayPort monitor, not saying there isn't one) and one DisplayPort, if they are not going to make it a six-screen monster.
 
This may sound lame, but I would actually prefer it if they delayed the launch of the 5970 until Fermi's release. My reasoning is that this card coming out is going to further reduce the stock of available RV870 chips, and thus mean even worse shortages of the more practical 5850/5870.

I mean, in all honesty, how many people buy these $500+ cards at launch? I know only a few who did, and when they sold them to buy a 5870, they only got around $200 (which didn't even come close to covering their 5870 cost). That is the cost of buying into new technology, sure, but I'd rather AMD focus on getting more 58xx series cards onto the market. Plus, the 5970 coming out when Fermi does would serve as a sort of distraction to nVidia.

And as for the clocks, nVidia did almost the exact same thing with the GTX 295. It was a GTX 285 with the memory bandwidth of the GTX 260 and similar clocks to it, yet with the full shader count. And that configuration later became the GTX 275.
 
OK, since this is a dual-GPU card, will we see a 5900-series single-GPU card for TriFire? Sorry for the rhetorical question. I just figured I'd bring up the point that if a certain person picks one of these up, and they do indeed keep the naming scheme as 5970, we won't see CrossFireX with it paired with a current card. Well, for what information we have today...
 
Tell that to those who (still insisting on using a 32-bit OS) lost another chunk of their 4 GB of RAM due to the vRAM.

What?!
You're talking about losing memory to the IGP? But that's predictable, with or without a 64-bit OS.
 
... actually AMD that started this entire fiasco. Not just with changing the names, but mostly with how frequently they moved on to the next series, largely to generate hype. nVidia just (had) to follow suit....
He's talking about the rename of the die-shrunk nV 8-series into the 9-series, and the blatant rename of the 9800 GTX+ to the GTS 250.:p

Dude-bra, that's kinda lame...:o

That makes no sense at all! The memory subsystem is handled by the GPU itself, not the CPU.

Only if the PC used a UMA architecture like the consoles (or systems with IGPs) could that constraint exist.
The amount of memory on the card cannot exceed the amount in the system, thus 4GB+ would be necessary in the system, which could only be utilised by a 64-bit system.
I foresee some strangely high amount of suck coming from this card! It will probably beat the 5870 under load by a good margin, but it will probably be the same as two 5870s in CrossFire at idle. This isn't good for the 5970.
Binge, I've noticed some huge hate coming from you for anything non-nV.:wtf:
And what happened to your specs? I saw them with your nV card 1 day, then just the name the next.
Nonetheless, calling it HD5900 instead of HD5800X2 makes sense to me.

It's about time we get to have shorter names in ATI's lineup.


In fact, I think it's about time that both ATI and nVidia drop the Radeon and Geforce brands. It's been 10 years now, we need new names!

I still remember when nVidia was so confident about NV30 that they even publicly stated that they'd drop the Geforce brand and call it a different name. When the architecture turned up as bad as it was, they had to use the Geforce brand again, for marketing purposes.
Agreed totally.;)
 