Wednesday, January 26th 2011
AMD Radeon HD 6990 Pictured Up Close
Here's the Radeon HD 6990 up close. The HD 6990 is AMD's new dual-GPU graphics card that extends the performance leadership held by the Radeon HD 5970. The pictures put to rest some speculation surrounding the cooler design. It now appears that the cooler is similar to that of the single-PCB GeForce GTX 295, as far as airflow is concerned. A single long PCB holds two GPU systems, one on either side, with a centrally-located blower pushing air toward both. The exhaust from one GPU is sent out of the case, while that from the other is pushed out of the rear portion of the card.
The Radeon HD 6990 uses two 40 nm Cayman GPUs. It packs a total of 3072 stream processors and 4096 MB of memory between the two GPU systems. It also features a new display output configuration consisting of one dual-link DVI and four mini-DP 1.2 connectors. Power is drawn from one 6-pin and one 8-pin PCIe connector. The card can pair with another of its kind for 4-GPU CrossFireX. It is expected to be released a little later this quarter.
Source:
4Gamer.net
124 Comments on AMD Radeon HD 6990 Pictured Up Close
However, unlike the 5970, it may have a rival from the green camp to drive down prices (presumably eating about 600W of power, but then again I'd have said the same about this card).
And yeah, if you got married, you cannot buy this anyway, as your soul is already taken. [Is that a jab at the marriage paradigm, or a romantic comment? You decide]. :laugh:
I'd rather see a DisplayPort connector or two ditched in favor of an HDMI port or two. Or better yet, ditch the DVI port too and give us 4 Mini-HDMI ports.
The card should be a real beast though. I hope nVidia gets a dual GF110/114 card out soon so this thing isn't priced to the moon.
My workstation PC at work has 3x DP ports, yet I use 3x DVI screens.
It's no issue; a DP to DVI adapter comes with most stuff I've bought. My 5850 even came with one, and it has 2x DVI and HDMI.
I think it's missing its memory card slots
Since they are limited to 5 displays, I feel they've disabled one graphics card's display controllers to save power consumption, and they've probably saved a bit in the PWM too; dual cards usually do.
150 W + 75 W + 75 W = 300 W if I'm not mistaken; AMD likes staying inside 300 W.
Are we anywhere near using 4 GB? I can see 2 GB being too little, but I feel like 8 GB might just be a bit overki-damn it, now I'm 'that guy'. :wtf:
March? April? May? June? July?
What is NDA, by the way?
:wtf:
They will supply at least one free passive Mini-DP to DVI dongle to enable a second DVI monitor.
And if you want to set up 3-monitor Eyefinity, just buy one active Mini-DP to DVI converter.
This is a good design for balancing the number of display outputs against airflow, because DP 1.2 will be able to daisy-chain monitors through a hub in the future.
I'm aware that you can use DVI with DP, using active adapters, and the adapters only cost $30 (and they usually don't come with the cards). My point is that most gamers are still using DVI or HDMI monitors. So to make a card that is capable of Eyefinity with all DVI/HDMI connectors, and then fill it with the relatively unused DP connectors, doesn't make sense. It would have been better to throw a few HDMI or mini-HDMI ports on there instead of 4 DP ports.
The single DVI takes care of like 85%++ of all users. If $60 for adapters for Eyefinity is a concern, you should not be running Eyefinity, IMHO.
This point of view makes no sense to me, and purely based on the fact that Eyefinity is not for those looking to save a buck...it's for those with extra cash to spend on their entertainment.
As far as I am concerned, as an actual Eyefinity user, there should NOT be any DVI on the card. It has been over a year that I've been running Eyefinity. It's like 3D: it has specific hardware requirements, and if they cost more, oh well.
I'm especially dumbfounded that someone who has bought an unlocked Intel CPU is making these statements, to boot. You, Newtekie1, already paid extra for something you didn't need, but that's OK for a CPU, but not a GPU? WHUT?
It is about convenience. One of the main drawbacks of Eyefinity in a lot of people's eyes is the fact that it requires you to use DP. The technical reason you speak of is that each GPU only has a set number of TMDS links, which forces single-GPU users to use a DP connection. This is fine on a single-GPU card because that is the only option. But on a dual-GPU card like this one, there are enough TMDS links to allow 3 or more DVI/HDMI connectors, and it is more convenient for people to use those. So on a card like this I don't really see any good reason to include DP connectors over HDMI/DVI connectors, can you?

Shows how little you know. I didn't pay extra for an unlocked Intel CPU. When the 875K came out it was $200 cheaper than the 870. And besides that, when I bought the 875K I paid less than what the i5 750 was selling for at the time; in fact, I paid less than what an i5 760 goes for today... If it wasn't for the great deal on the 875K I would have bought an i5-750. Actually, I wouldn't have; I would have kept my X3370...
600 dollars please, no more