Monday, June 3rd 2024

AMD Introduces New Radeon PRO W7900 Dual Slot at Computex 2024

In addition to the new Zen 5 Ryzen 9000 series desktop CPUs and Ryzen AI 300 series mobile CPUs, the new Ryzen 5000XT series AM4 desktop CPUs, and updates to the AMD Instinct AI GPU roadmap, AMD rather quietly announced the new Radeon PRO W7900 Dual Slot workstation graphics card at Computex 2024. While not a completely new product, as it is essentially a model update of the currently available flagship Radeon PRO W7900, it is still an important one: AMD managed to squeeze the card into a dual-slot design, which makes it suitable for multi-GPU setups.

As noted, the AMD Radeon PRO W7900 Dual Slot still uses the same Navi 31 GPU with 96 Compute Units (CUs), 96 Ray Accelerators, 192 AI Accelerators, and 6,144 Stream Processors, as well as 48 GB of GDDR6 ECC memory on a 384-bit memory interface, giving it a maximum memory bandwidth of 864 GB/s. It still requires two 8-pin PCIe power connectors and has a Total Board Power (TBP) of 295 W. The card still comes with three DisplayPort 2.1 outputs and one Enhanced Mini DisplayPort 1.2 output. What makes the new Radeon PRO W7900 Dual Slot special is that AMD managed to slim it down to a dual-slot design while keeping the same blower-style cooler design. Unfortunately, it is not clear whether the fan or its fan profiles have changed, but the slimmer design does make the card suitable for multi-GPU configurations.
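As a quick sanity check, the quoted 864 GB/s follows directly from the 384-bit memory interface and the GDDR6 data rate it implies (the 18 Gbps per-pin figure below is inferred from AMD's two published numbers, not quoted separately):

```python
# Back-of-the-envelope check of the W7900's quoted memory bandwidth.
# The per-pin data rate is inferred from the 864 GB/s and 384-bit figures.
bus_width_bits = 384   # memory interface width
data_rate_gbps = 18    # effective GDDR6 data rate per pin (inferred)

bandwidth_gb_s = bus_width_bits * data_rate_gbps / 8  # divide by 8: bits -> bytes
print(f"{bandwidth_gb_s:.0f} GB/s")  # prints "864 GB/s", matching the spec sheet
```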
AMD announced that the AMD Radeon PRO W7900 Dual Slot graphics card will be priced at $3,499 and is expected to be available beginning June 19, 2024, alongside AMD ROCm 6.1 for AMD Radeon GPUs, which is expected to make "AI development and deployment with AMD Radeon desktop GPUs more compatible, accessible, and scalable."
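For context on what AI development on Radeon desktop GPUs looks like in practice, a ROCm-enabled build of PyTorch exposes Radeon cards through the familiar CUDA-style device API (a minimal sketch, assuming PyTorch was installed from a ROCm wheel; the reported device name will vary by card):

```python
# Minimal sketch: verify that a ROCm-enabled PyTorch build can see a Radeon GPU.
# On ROCm, PyTorch reuses the torch.cuda namespace via HIP.
import torch

if torch.cuda.is_available():
    print("Found GPU:", torch.cuda.get_device_name(0))  # e.g. a Radeon PRO card
    x = torch.randn(1024, 1024, device="cuda")          # allocate on the GPU
    y = x @ x                                           # simple matmul to exercise it
    print(y.shape)
else:
    print("No ROCm/HIP device visible to PyTorch")
```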
Source: AMD

9 Comments on AMD Introduces New Radeon PRO W7900 Dual Slot at Computex 2024

#1
HBSound
Seeing places like EKWB support this GPU with pro-level waterblocks would be nice.
#2
bonehead123
Sooo.....now instead of using a single card taking up 3.5-4 slots, you can use 2x cards in that same space AND not waste 2 slots for nothing....

WOW, just friggin WOW....

Now if they could just do this for their consumer cards AND enable SLI again (including drivers & app support), we'd be all set for a glorious return to the old days, oh yea :D

But Jacket Lady is probably too busy picking out her next batch of leather to be bothered with this, heeheehee
#3
Tomorrow
bonehead123: Sooo.....now instead of using a single card taking up 3.5-4 slots, you can use 2x cards in that same space AND not waste 2 slots for nothing....

WOW, just friggin WOW....

Now if they could just do this for their consumer cards AND enable SLI again (including drivers & app support), we'd be all set for a glorious return to the old days, oh yea :D

But Jacket Lady is probably too busy picking out her next batch of leather to be bothered with this, heeheehee
This comment makes no sense. Why should AMD bring back multi-GPU to the consumer space? And unless Jensen has gone through hormone therapy, there is no "jacket lady".
As for the news itself, it's odd that the W7900 was not already dual slot.
#4
bonehead123
Tomorrow: there is no "jacket lady"
So Lisa doesn't count?

no therapy needed AFAIK....but she does frequently wear outfits that are eerily similar to what her leather-clad counterpart does :)
Tomorrow: Why should AMD bring back multi-GPU to the consumer space?
Because they CAN (and this card proves it is possible), and it could possibly give them something to offer their users that those other guys don't have, seeins how they only care about AI everything everywhere all at once ..
#5
Onasi
bonehead123: Because they CAN (and this card proves it is possible), and it could possibly give them something to offer their users that those other guys don't have, seeins how they only care about AI everything everywhere all at once ..
This point has been done to death. Multi-GPU makes sense in professional workloads, and in those it still works, even with consumer cards. In consumer workloads though, i.e. gaming, it suffers from poor scaling and needs either a driver implementation (on older APIs) or game developers to specifically code for it (on newer APIs). Neither makes sense from a cost-benefit point of view.
Crossfire and SLI are dead. They're not coming back. The workloads that support the explicit mGPU way of doing things still do, and there it works fine.
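In compute workloads, "explicit mGPU" simply means the application addresses each card itself rather than relying on driver-level CrossFire/SLI profiles. Below is a minimal sketch of that idea, assuming two GPUs (ROCm or CUDA) are visible to PyTorch; the half-and-half split is only illustrative:

```python
# Minimal sketch: explicit multi-GPU compute, splitting a batch across two cards.
# Assumes two devices ("cuda:0" and "cuda:1") are visible to PyTorch
# (on ROCm builds, Radeon cards show up under the same device names).
import torch

batch = torch.randn(2048, 4096)   # some work to split up
halves = batch.chunk(2)           # one half per GPU

results = []
for i, part in enumerate(halves):
    dev = torch.device(f"cuda:{i}")
    x = part.to(dev)                    # the application chooses the placement
    results.append((x @ x.T).cpu())     # do the work, bring the result back

out = torch.cat(results)
print(out.shape)  # torch.Size([2048, 1024])
```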
#6
Guwapo77
bonehead123: So Lisa doesn't count?

no therapy needed AFAIK....but she does frequently wear outfits that are eerily similar to what her leather-clad counterpart does :)



Because they CAN (and this card proves it is possible), and it could possibly give them something to offer their users that those other guys don't have, seeins how they only care about AI everything everywhere all at once ..
Wasn't it more of a software/game engine issue as to why they quit XFire and SLI? Some games showed an 80% FPS increase while others ranged anywhere from -10% to 50%. It was all over the place. I used to be a big XFire supporter, but I don't miss it either. Life has been far more stable since.
#7
Panther_Seraphin
Guwapo77: Wasn't it more of a software/game engine issue as to why they quit XFire and SLI? Some games showed an 80% FPS increase while others ranged anywhere from -10% to 50%. It was all over the place. I used to be a big XFire supporter, but I don't miss it either. Life has been far more stable since.
Crossfire/SLI was a mixed bag for most people, with a small user base and per-game optimizations needed every time something new came out. I was a big user back in the BF3 days with dual MSI HD6870s.

Things like the mGPU features in DX12 were meant to replace the need for per-game driver optimisations, but they moved the onus away from AMD/nVidia and over to game developers. You only have to look at the number of games that mention mGPU support to realise it's very unlikely to come back.

We need someone like Epic/id Tech to come out with a new Unreal or Doom and go "Hey, this is actually what is possible with mGPU in the 2020s" to kickstart this, and I would argue that with the latest Unreal Engine it would be a killer thing to show off properly.
#8
DemonicRyzen666
Tomorrow: This comment makes no sense. Why should AMD bring back multi-GPU to the consumer space? And unless Jensen has gone through hormone therapy, there is no "jacket lady".
As for the news itself, it's odd that the W7900 was not already dual slot.
mGPU can be enabled on all RDNA variant cards.
Panther_Seraphin: Crossfire/SLI was a mixed bag for most people, with a small user base and per-game optimizations needed every time something new came out. I was a big user back in the BF3 days with dual MSI HD6870s.

Things like the mGPU features in DX12 were meant to replace the need for per-game driver optimisations, but they moved the onus away from AMD/nVidia and over to game developers. You only have to look at the number of games that mention mGPU support to realise it's very unlikely to come back.

We need someone like Epic/id Tech to come out with a new Unreal or Doom and go "Hey, this is actually what is possible with mGPU in the 2020s" to kickstart this, and I would argue that with the latest Unreal Engine it would be a killer thing to show off properly.
Games have still had the same issues without multi-GPU setups for quite some time now. I mean, look at the 14 triple-A games released through 2023 that had stuttering problems on a single card. It's quite obvious now that it was never the driver's problem. I just watched a recent Hardware Canucks video on YouTube about it: a bad save tanked performance in Baldur's Gate 3 from 131 FPS to 50 FPS with low GPU usage.
#9
Panther_Seraphin
DemonicRyzen666: Games have still had the same issues without multi-GPU setups for quite some time now. I mean, look at the 14 triple-A games released through 2023 that had stuttering problems on a single card. It's quite obvious now that it was never the driver's problem. I just watched a recent Hardware Canucks video on YouTube about it: a bad save tanked performance in Baldur's Gate 3 from 131 FPS to 50 FPS with low GPU usage.
So now you're asking companies that can't get their games to run right on one card to dedicate resources to cater to the <5% running two or more GPUs, which is considerably harder to optimise for?

*Grabs Popcorn*

The SLI/Crossfire optimisations that were done in the driver were only ever optimisations/fixes for unintended behaviours from those games' specific implementations. It was never about writing a specific driver for that game with specific configurations.

I would argue the one company that may throw a curveball with this is actually Intel, of all people. I could see them doing something wacky with Celestial, making it attractive if game developers put some time in for it. The reason I say this is that they already have a dual-GPU card out in the wild for Alchemist:

www.techpowerup.com/gpu-specs/data-center-gpu-flex-140.c4072