Wednesday, November 11th 2009
AMD Radeon HD 5970 Specs Surface
In a few days from now, AMD will unveil its new flagship graphics accelerator, the ATI Radeon HD 5970, which intends to cement the brand's performance leadership over every product from rival NVIDIA. The HD 5970, codenamed "Hemlock", is a dual-GPU accelerator with two "Cypress" GPUs in an internal CrossFireX configuration.
Built on the 40 nm process, these GPUs will feature 1600 stream processors each, and will each have a 256-bit wide GDDR5 memory interface to connect to 2 GB of memory (4 GB total on card). The clock speeds are where the specifications of these GPUs differ from their single-GPU avatar, the Radeon HD 5870. The core is clocked at 725 MHz, while the memory runs at 1000 MHz (4000 MHz effective).
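As a back-of-the-envelope check of those figures, per-GPU memory bandwidth follows from the bus width and the effective data rate (this sketch assumes GDDR5's usual quad-pumped rate, i.e. 4× the base memory clock):

```python
# Rough memory-bandwidth estimate for one "Cypress" GPU on the HD 5970.
# Assumes GDDR5's effective transfer rate is 4x the base memory clock.
def gddr5_bandwidth_gbs(base_clock_mhz: float, bus_width_bits: int) -> float:
    effective_mtps = base_clock_mhz * 4          # 1000 MHz -> 4000 MT/s
    bytes_per_transfer = bus_width_bits / 8      # 256-bit bus -> 32 bytes
    return effective_mtps * bytes_per_transfer / 1000  # MB/s -> GB/s

print(gddr5_bandwidth_gbs(1000, 256))  # 128.0 GB/s per GPU
```

That works out to roughly 128 GB/s per GPU, or about 256 GB/s combined across the card's two GPUs.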
The accelerator's rear panel differs from those of other Radeon HD 5000 series accelerators. The usual broad air vent occupies one slot, while the other holds two DVI-D connectors and one mini DisplayPort (DP) connector. The mini DP connector can give out DVI output using a dongle, and in this way, support for ATI Eyefinity technology remains intact. The NDA covering this accelerator is said to expire on the 19th of November, not very far away.
Source:
TechConnect Magazine
147 Comments on AMD Radeon HD 5970 Specs Surface
Will be interesting!! I bet you'll need to solder 12 volts directly to the PCB! :D:p:nutkick:
It's like this:
DAAMIT
ATI - X-series - Spring 2004
ATI - X1k-series - Fall 2005 [+ 1.5 Years]
AMD - HD2k-series - Spring 2007 [+ 1.5 Years]
AMD - HD3k-series - Fall 2007 [+ 0.5 Years]
AMD - HD4k-series - Spring 2008 [+ 0.5 Years]
AMD - HD5k-series - Fall 2009 [+ 1.5 Years]
Highlight: HD 2900 XT -> 6 Months -> HD 3870 -> 6 months -> HD 4870
nVidia
nVidia - 6000-series - Spring 2004
nVidia - 7000-series - Spring 2005 [+ 1 Year]
nVidia - 8000-series - Fall 2006 [+ 1.5 Years]
nVidia - 9000-series - Winter 2007/2008 [+ ~1 Year]
nVidia - GTX 200-series - Spring 2008 [+ 0.5 Years]
nVidia - GTX 300-series - Fall 2009 [+ 1.5 Years]
Highlight: 8800 GT/GTS 512 -> 2 Months -> 9800 GT/GTX -> 4 Months -> GTX 280
So, as you can see, the healthy timeline for the release of a new series from either graphics card manufacturer is 1.5 years; anything much shorter or longer is unhealthy.
After AMD acquired ATI, they failed to deliver a solid-performing chip in the R600. So, in order to compete with nVidia, they needed hype. They gained it by cycling through two series in a single year: what should have been the HD 2950 and so on was instead named the HD 3850, as part of the new and completely fraudulent HD3k series.
Then nVidia got wind of this and needed a move to match the hype. So they used the exact same GPU from the 8000 series, the G92, in the 9000 series, which was even worse than what AMD was doing, because nVidia was blatantly re-marketing its product under a superior name solely to garner hype. Thus, they also jumped through two series in roughly the same span of time.
And in the end, AMD pulled itself up by the trousers and fashioned an actually competitive, and genuinely new, GPU, which started the HD4k series, and that series lasted the healthy 1.5 years. nVidia again followed suit with its GTX 200-series, which will also last 1.5 years.
So, in the meantime, all is well in the graphics card kingdom, and the terror of the HD3k and 9000 series is forgotten. But who knows when these big companies will again try to trick us, too scared, in this almost childish mindset, to lose any piece of market share.
All I can say is men like me will be here to enlighten the masses, and protect the commoners.
And although it has 4 GB of RAM, only 2 GB is effectively usable, since each GPU mirrors the same data. Just as if you had two 2 GB cards in CrossFire, you only really have 2 GB of video RAM for all practical purposes.
This card is going to be massive and heavy.
Get ready to build your braces!
ATI was releasing new technology, new cards, new cores.
Nvidia was just doing a small die shrink and renaming the 8800 GT to 9800 GT.
Calling what should be a 5870x2 a 5890 is a bit stupid.
But they did not take a 2600 and rename it a 3600.
Seems like it will actually hold a 5870 back in TriFire.
Only if the PC used a UMA architecture like the consoles (or systems with IGPs) could that constraint exist.
To the point that I have heard techs at Microcenter tell customers that. Hopefully just because they didn't feel like explaining the truth, but I doubt it :roll:
It's about time we get to have shorter names in ATI's lineup.
In fact, I think it's about time that both ATI and nVidia drop the Radeon and Geforce brands. It's been 10 years now, we need new names!
I still remember when nVidia was so confident about NV30 that they even publicly stated that they'd drop the Geforce brand and call it a different name. When the architecture turned up as bad as it was, they had to use the Geforce brand again, for marketing purposes.
Anyway, will two of my MCW-60R blocks fit?
I mean, in all honesty, how many people buy these $500+ cards at launch? I know only a few who did, and when they sold them to buy a 5870, they only got around $200 (which didn't even come close to covering their 5870 cost). That is the cost of buying into new technology, sure, but I'd rather AMD focus on getting more 58xx series cards available on the market. Plus, the 5970 coming out when Fermi does will serve as a sort of distraction to nVidia.
And as for the clocks, nVidia did almost the exact same thing with the GTX 295: each GPU was a GTX 285 with the memory bandwidth of the GTX 260 and similar clocks to it, yet had the full shader count. That same configuration became the GTX 275.
You're talking about losing memory to the IGP? But that's predictable, with or without a 64bit OS.
And what happened to your specs? I saw them with your nV card 1 day, then just the name the next. Agreed totally.;)
I find it interesting that after the die shrink, they still had to underclock the cards to get them to run cool enough. ATi is feeling the heat (from their own cards).