
AMD CEO Lisa Su: "CrossFire Isn't a Significant Focus"

Joined
Oct 10, 2009
Messages
792 (0.14/day)
Location
Madrid, Spain
System Name Rectangulote
Processor Core I9-9900KF
Motherboard Asus TUF Z390M
Cooling Alphacool Eisbaer Aurora 280 + Eisblock RTX 3090 RE + 2 x 240 ST30
Memory 32 GB DDR4 3600mhz CL16 Crucial Ballistix
Video Card(s) KFA2 RTX 3090 SG
Storage WD Blue 3D 2TB + 2 x WD Black SN750 1TB
Display(s) 2 x Asus ROG Swift PG278QR / Samsung Q60R
Case Corsair 5000D Airflow
Audio Device(s) Evga Nu Audio + Sennheiser HD599SE + Trust GTX 258
Power Supply Corsair RMX850
Mouse Razer Naga Wireless Pro / Logitech MX Master
Keyboard Keychron K4 / Dierya DK61 Pro
Software Windows 11 Pro
The more time passes, the more I'm convinced this was just a stupid rat race started by Nvidia, with ATI just adding fuel to the fire. Nvidia had to come up with an exotic tech derived from 3dfx and promise double the performance at a time when they needed to push hard because there was a lot of competition. But the basis of the idea meant that, in the end, you only had to wait for second-hand prices to fall to get double the performance, and that's something card makers who have to roll out new products every X months can't allow, for fear of investors' fury.

A stunt that worked for maybe 3-5 years. If there had been a real intention to push that tech to the point where multiple CPU cores or CUDA are today, it would be far more developed by now, and probably standardized across brands by Microsoft.
 
Joined
Aug 23, 2017
Messages
112 (0.04/day)
System Name DELL 3630
Processor I7 8700K
Memory 32 gig
Video Card(s) 4070
Consoles killed CrossFire and SLI years ago. They're a waste of developer time and resources, since the number of PC users who run them is minimal anyway, and most PC games are console ports.
 
Joined
Jan 15, 2015
Messages
362 (0.10/day)
But we also place higher demands on our games now; I think the dependency on VRAM was a catalyst for SLI's demise. Nvidia started killing it right around the time AMD started looking at HBM; Nvidia had to move to delta compression, and VRAM capacities doubled overnight.

Now look at today: between Maxwell and Pascal the high-end GPU gained another 4 GB (970 > 1070), and the top card even goes to eleven ;)

This makes it even harder to sell 'wasted' hardware resources like doubled VRAM.
The solution to that is to have socketed VRAM, just as we have socketed RAM on motherboards.

In fact, it shouldn't be that difficult to have socketed GPUs, too. Perhaps, given the insanely high cost of high-end GPUs these days, it's time to start demanding more instead of passively accepting the disposable GPU model. If a GPU board has a strong VRM system and is well made, why replace that instead of upgrading the chip?

Personally, I'd like to see GPUs move to the motherboard and the ATX standard be replaced with a modern one that's efficient to cool. A vapor chamber that can cool both the CPU and the GPU could be nice (and the chipset — 40mm fans in 2019, really?). A unified VRM design would be a lot more efficient.

It's amusing that people balk at buying a $1000 motherboard but seem not to notice the strangeness of disposing of a $1200 GPU rather than being able to upgrade it.
 
Joined
Feb 18, 2017
Messages
688 (0.25/day)
That really sucks for those of us who use CrossFire. Though many games don't support it, the ones that do shine brightly. I was contemplating getting the 5700 XT, but I will probably get a used Vega 7 instead when prices come down.

Well, you got ~52% more performance for 100% more money. You literally threw half of what you spent on the second GPU out of the window. If they spent even 10% of the effort put into CrossFire (or SLI, as NV is also leaving the SLI train) on anything else, everyone would be happier.
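To put rough numbers on that, here's a back-of-the-envelope sketch. The prices and frame rates are made up for illustration; the only figure taken from this thread is the ~52% scaling:

```cpp
#include <cstdio>

// Hypothetical example: a $400 card doing 60 fps on its own, with a second
// identical card adding the ~52% scaling quoted above. All numbers are
// illustrative, not measurements.
int main() {
    const double cardPrice = 400.0; // assumed price of one GPU (USD)
    const double singleFps = 60.0;  // assumed single-GPU frame rate
    const double scaling   = 0.52;  // +52% from the second card

    const double dualFps = singleFps * (1.0 + scaling); // 91.2 fps

    std::printf("single GPU: %5.1f fps at $%.2f per fps\n",
                singleFps, cardPrice / singleFps);       // ~$6.67 per fps
    std::printf("dual GPU:   %5.1f fps at $%.2f per fps\n",
                dualFps, 2.0 * cardPrice / dualFps);     // ~$8.77 per fps
    // The second card alone buys 31.2 extra fps for $400, i.e. ~$12.82 per
    // fps: nearly double the cost per frame of the first card.
    return 0;
}
```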
 
Joined
Apr 10, 2013
Messages
302 (0.07/day)
Location
Michigan, USA
Processor AMD 1700X
Motherboard Crosshair VI Hero
Memory F4-3200C14D-16GFX
Video Card(s) GTX 1070
Storage 960 Pro
Display(s) PG279Q
Case HAF X
Power Supply Silencer MK III 850
Mouse Logitech G700s
Keyboard Logitech G105
Software Windows 10
Good that Lisa could say it plainly... stuff for chumps, just like the 14.4k modem.
 
Joined
Nov 27, 2010
Messages
924 (0.18/day)
System Name future xeon II
Processor DUAL SOCKET xeon e5 2686 v3 , 36c/72t, hacked all cores @3.5ghz, TDP limit hacked
Motherboard asrock rack ep2c612 ws
Cooling case fans,liquid corsair h100iv2 x2
Memory 96 gb ddr4 2133mhz gskill+corsair
Video Card(s) 2x 1080 sc acx3 SLI, @STOCK
Storage Hp ex950 2tb nvme+ adata xpg sx8200 pro 1tb nvme+ sata ssd's+ spinners
Display(s) philips 40" bdm4065uc 4k @60
Case silverstone temjin tj07-b
Audio Device(s) sb Z
Power Supply corsair hx1200i
Mouse corsair m95 16 buttons
Keyboard microsoft internet keyboard pro
Software windows 10 x64 1903 ,enterprise
Benchmark Scores fire strike ultra- 10k time spy- 15k cpu z- 400/15000
They can stop investing in it all they want; I'll do whatever fits performance best.
 

Deleted member 177333

Guest
They can stop investing in it all they want; I'll do whatever fits performance best.

Ya, I sure hope more developers follow the example set in the Tomb Raider games. I'm in the process of doing my first run through those games, and they did such a nice job with the DX12 mGPU implementation (currently playing "Rise"). Having mGPU support built into DX12 & Vulkan, so you don't need driver profiles anymore, is really cool.
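For anyone wondering what "built into DX12" means in practice: with explicit multi-adapter the game itself, not a driver profile, enumerates every GPU and decides how to split work across them. Here's a minimal C++ sketch of just that first step, listing D3D12-capable adapters with standard DXGI calls (error handling trimmed; a real renderer would go on to create a device and queues per adapter):

```cpp
#include <cwchar>
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue; // skip WARP

        // Probe (without creating) a D3D12 device on this adapter. An mGPU
        // title would create one device per adapter and split work between
        // them itself -- no CrossFire/SLI driver profile involved.
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        __uuidof(ID3D12Device), nullptr))) {
            std::wprintf(L"adapter %u: %s (%zu MB VRAM)\n", i, desc.Description,
                         desc.DedicatedVideoMemory / (1024u * 1024u));
        }
    }
    return 0;
}
```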

And I'm with ya on multi-GPU in general - you know, it's never been a technology for folks on a budget or those looking for perfect scaling. We know we're not getting 100% scaling; it just comes down to wanting a certain level of image quality or framerate and knowing we can't get it with a single GPU.

I know not everyone is a fan and some folks have reported bad experiences, but IMO, as someone who has run the technology for years and years, I've had a great experience with it. Sure hope more developers make use of mGPU.
 
Joined
Oct 4, 2017
Messages
706 (0.28/day)
Location
France
Processor RYZEN 7 5800X3D
Motherboard Aorus B-550I Pro AX
Cooling HEATKILLER IV PRO , EKWB Vector FTW3 3080/3090 , Barrow res + Xylem DDC 4.2, SE 240 + Dabel 20b 240
Memory Viper Steel 4000 PVS416G400C6K
Video Card(s) EVGA 3080Ti FTW3
Storage XPG SX8200 Pro 512 GB NVMe + Samsung 980 1TB
Display(s) Dell S2721DGF
Case NR 200
Power Supply CORSAIR SF750
Mouse Logitech G PRO
Keyboard Meletrix Zoom 75 GT Silver
Software Windows 11 22H2
You clearly haven't done your research. Here is a chart with a 2080 Ti @ 4K ultra settings; tell me if those frames are all playable. Good luck with that.
This is the article it is taken from:

That's why you don't rely on just one review, especially not one that comes from some low-reputation website!

Here you have a 35-game sample:

https://www.techspot.com/review/1701-geforce-rtx-2080/page2.html ,
https://www.techspot.com/article/1702-geforce-rtx-2080-mega-benchmark/ ,

from Techspot (known as Hardware Unboxed on YouTube, which is a reference when it comes to GPU reviews). Across those 35 games at 4K, the 2080 Ti averages 92 fps, with 73.31 fps for the 1% lows; furthermore, there are only 4 games out of 35 where the 2080 Ti doesn't hit 60 fps (and it still stays above 50 fps in those). When you compare the review you provided, there are plenty of weird results, such as GTA 5: GpuCheck 59 fps vs. Techspot 122 fps!

Keep in mind this is a stock 2080 Ti; with a memory OC (which matters more at 4K) you're practically guaranteed to hit over 60 fps in all those games... so yeah, come again and tell me about those unplayable games. Sure, there might be some poorly optimized games here and there, but other than that you have to be out of your mind to believe that a 2080 Ti can't drive 4K/60!
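A side note on those "1% low" figures, since reviewers don't all compute them the same way. A common convention (an assumption on my part, not necessarily Techspot's exact method) is to average the slowest 1% of frame times and convert back to fps:

```cpp
#include <algorithm>
#include <cstdio>
#include <functional>
#include <vector>

// Average fps over the slowest 1% of frames, from per-frame times in ms.
double onePercentLowFps(std::vector<double> frameTimesMs) {
    std::sort(frameTimesMs.begin(), frameTimesMs.end(), std::greater<>());
    const size_t n = std::max<size_t>(1, frameTimesMs.size() / 100);
    double sumMs = 0.0;
    for (size_t i = 0; i < n; ++i) sumMs += frameTimesMs[i]; // slowest n frames
    return 1000.0 / (sumMs / n); // mean frame time -> fps
}

int main() {
    // Synthetic run: 300 smooth frames at 10 ms (100 fps) plus 3 stutters at 40 ms.
    std::vector<double> times(300, 10.0);
    times.insert(times.end(), 3, 40.0);

    double total = 0.0;
    for (double t : times) total += t;
    std::printf("average fps: %.1f\n", 1000.0 * times.size() / total); // ~97.1
    std::printf("1%% low fps:  %.1f\n", onePercentLowFps(times));      // 25.0
    return 0;
}
```

Which is also why a card can average 90+ fps and still feel rough if the 1% lows dip hard.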
 
Joined
Jun 2, 2017
Messages
8,716 (3.25/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Well, you got ~52% more performance for 100% more money. You literally threw half of what you spent on the second GPU out of the window. If they spent even 10% of the effort put into CrossFire (or SLI, as NV is also leaving the SLI train) on anything else, everyone would be happier.

That is not the way you do CrossFire. You buy the best GPU around launch (Vega 64), then when used ones (Vega 64) show up you get a second (which cost me less than half). BTW, most of the games that I play fully support CrossFire with more than 52% scaling, especially TWWH; with that I go from 35 to 72 FPS @ 4K Extreme. Other games I play, like Strange Brigade, also support CrossFire through DX12, and the best thing about CrossFire is that if a game does not support it, there is no power draw from the 2nd GPU. Indeed, I have been using SLI/CrossFire since the GTS 450 days for... you guessed it, Total War. I know that TW3K does not support CrossFire, but that game is meh compared to TWWH; in fact, to me it is not as good as TW Shogun 2. There are still plenty of games that support CrossFire, including the Witcher, Watch Dogs, XCOM, Tomb Raider and Project Cars series, to name a few.
 
Joined
Mar 24, 2012
Messages
532 (0.12/day)
That is not the way you do CrossFire. You buy the best GPU around launch (Vega 64), then when used ones (Vega 64) show up you get a second (which cost me less than half). BTW, most of the games that I play fully support CrossFire with more than 52% scaling, especially TWWH; with that I go from 35 to 72 FPS @ 4K Extreme. Other games I play, like Strange Brigade, also support CrossFire through DX12, and the best thing about CrossFire is that if a game does not support it, there is no power draw from the 2nd GPU. Indeed, I have been using SLI/CrossFire since the GTS 450 days for... you guessed it, Total War. I know that TW3K does not support CrossFire, but that game is meh compared to TWWH; in fact, to me it is not as good as TW Shogun 2. There are still plenty of games that support CrossFire, including the Witcher, Watch Dogs, XCOM, Tomb Raider and Project Cars series, to name a few.

Nah. One of the primary reasons you want multi-GPU now is that the fastest single GPU can't provide enough performance; that's the common way we think about multi-GPU today. But in the past, the majority of people (especially those who couldn't really afford high-end hardware) wanted multi-GPU because it allowed them to get performance even faster than the fastest single GPU could provide, at a cost cheaper than buying that single fastest GPU. The caveat was having to deal with the drawbacks of multi-GPU, but the performance uplift in the majority of games and the low cost to make it happen were supposed to outweigh those drawbacks. This is one of the primary examples of it:

 
Joined
Jun 2, 2017
Messages
8,716 (3.25/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Nah. One of the primary reasons you want multi-GPU now is that the fastest single GPU can't provide enough performance; that's the common way we think about multi-GPU today. But in the past, the majority of people (especially those who couldn't really afford high-end hardware) wanted multi-GPU because it allowed them to get performance even faster than the fastest single GPU could provide, at a cost cheaper than buying that single fastest GPU. The caveat was having to deal with the drawbacks of multi-GPU, but the performance uplift in the majority of games and the low cost to make it happen were supposed to outweigh those drawbacks. This is one of the primary examples of it:


You are absolutely right.
 
Joined
Jan 15, 2015
Messages
362 (0.10/day)
That's why you don't rely on just one review, especially not one that comes from some low-reputation website!

Here you have a 35-game sample:

https://www.techspot.com/review/1701-geforce-rtx-2080/page2.html ,
https://www.techspot.com/article/1702-geforce-rtx-2080-mega-benchmark/ ,

from Techspot (known as Hardware Unboxed on YouTube, which is a reference when it comes to GPU reviews). Across those 35 games at 4K, the 2080 Ti averages 92 fps, with 73.31 fps for the 1% lows; furthermore, there are only 4 games out of 35 where the 2080 Ti doesn't hit 60 fps (and it still stays above 50 fps in those). When you compare the review you provided, there are plenty of weird results, such as GTA 5: GpuCheck 59 fps vs. Techspot 122 fps!

Keep in mind this is a stock 2080 Ti; with a memory OC (which matters more at 4K) you're practically guaranteed to hit over 60 fps in all those games... so yeah, come again and tell me about those unplayable games. Sure, there might be some poorly optimized games here and there, but other than that you have to be out of your mind to believe that a 2080 Ti can't drive 4K/60!
So, we've been getting propaganda to tell us 8K is really important, or, at the very least, something better than 4K is an important upgrade. But, in order to run good ole 4K well we need to spend... how much on exactly one GPU?

So, a reeeaallly expensive GPU that has zero competition in the market... Sounds like a recipe for a bargain, not the situation one is in when there is a monopoly that raises prices artificially.

(I've told people before that it's in Nvidia's interest to get rid of multi-GPU as long as AMD isn't competing at the high end. Since AMD is competing against the PC gaming platform by peddling console hardware, it also doesn't have as much incentive to compete at the high end. Letting Nvidia increase prices helps AMD peddle its midrange hardware at a price premium while still undercutting Nvidia's higher premium. But why not cheerlead for a situation where we can have any color we want as long as it's black?)

Sell a kidney to play at 4K or stick with 1440. Who needs dual GPU these days? Most everyone's got two kidneys.

(Given AMD's success with Zen 2 chiplets I would expect that future is going to be multi-GPU, only the extra GPU chips will be chiplets.)
 
Joined
Mar 24, 2012
Messages
532 (0.12/day)
So, we've been getting propaganda to tell us 8K is really important, or, at the very least, something better than 4K is an important upgrade. But, in order to run good ole 4K well we need to spend... how much on exactly one GPU?

So, a reeeaallly expensive GPU that has zero competition in the market... Sounds like a recipe for a bargain, not the situation one is in when there is a monopoly that raises prices artificially.

(I've told people before that it's in Nvidia's interest to get rid of multi-GPU as long as AMD isn't competing at the high end. Since AMD is competing against the PC gaming platform by peddling console hardware, it also doesn't have as much incentive to compete at the high end. Letting Nvidia increase prices helps AMD peddle its midrange hardware at a price premium while still undercutting Nvidia's higher premium. But why not cheerlead for a situation where we can have any color we want as long as it's black?)

Sell a kidney to play at 4K or stick with 1440. Who needs dual GPU these days? Most everyone's got two kidneys.

(Given AMD's success with Zen 2 chiplets I would expect that future is going to be multi-GPU, only the extra GPU chips will be chiplets.)

But even if they are successful on that front, I don't think it will solve the issue many people have with high-end GPUs right now: the price.
 
Joined
Oct 4, 2017
Messages
706 (0.28/day)
Location
France
Processor RYZEN 7 5800X3D
Motherboard Aorus B-550I Pro AX
Cooling HEATKILLER IV PRO , EKWB Vector FTW3 3080/3090 , Barrow res + Xylem DDC 4.2, SE 240 + Dabel 20b 240
Memory Viper Steel 4000 PVS416G400C6K
Video Card(s) EVGA 3080Ti FTW3
Storage XPG SX8200 Pro 512 GB NVMe + Samsung 980 1TB
Display(s) Dell S2721DGF
Case NR 200
Power Supply CORSAIR SF750
Mouse Logitech G PRO
Keyboard Meletrix Zoom 75 GT Silver
Software Windows 11 22H2
So, we've been getting propaganda to tell us 8K is really important, or, at the very least, something better than 4K is an important upgrade. But, in order to run good ole 4K well we need to spend... how much on exactly one GPU? So, a reeeaallly expensive GPU that has zero competition in the market... Sounds like a recipe for a bargain, not the situation one is in when there is a monopoly that raises prices artificially.

Totally unrelated to the topic, which is whether single GPUs nowadays can drive 4K/60 Ultra...

(I've told people before that it's in Nvidia's interest to get rid of multi-GPU as long as AMD isn't competing at the high end. Since AMD is competing against the PC gaming platform by peddling console hardware, it also doesn't have as much incentive to compete at the high end. Letting Nvidia increase prices helps AMD peddle its midrange hardware at a price premium while still undercutting Nvidia's higher premium. But why not cheerlead for a situation where we can have any color we want as long as it's black?)

Interesting theory of yours. Now I would like to hear your theory on why AMD shares Nvidia's interest in getting rid of multi-GPU ( https://www.techpowerup.com/258522/amd-ceo-lisa-su-crossfire-isnt-a-significant-focus ), considering they are far from having Nvidia's position in the market, and considering that, according to your theory, it would be in their interest to maintain multi-GPU support in order to stand a chance of challenging Nvidia's high end with their midrange GPUs... in other words, your theory doesn't hold water!

It's OK, I get it; bashing Nvidia for no reason will never get old for some people!

Sell a kidney to play at 4K or stick with 1440. Who needs dual GPU these days? Most everyone's got two kidneys.

I've been playing at 4K since 2014. I started with an R9 290, then moved to R9 290 CF, then to a single GTX 970 (for heat and noise reasons, and because one of my 290s died on me), then to a GTX 1060, and recently to a GTX 1080 Ti, which can handle anything I throw at it at 4K/60. All the GPUs I named prior to the 1080 Ti were perfectly able to play at 4K. Obviously, sometimes you had to lower some settings depending on the game and the GPU, but with very minimal impact on visual quality (in most games there is barely any difference between Ultra and High). Nowadays, GPUs like the GTX 1080 can be found dirt cheap and are perfectly able to handle 4K, assuming you are not one of those "ultra everything" elitists and are smart enough to optimize your game settings to make the most of your hardware!

So to answer your question: no, people don't need to sell a kidney to play at 4K, they just need to buy a brain! I don't know who needs dual GPU these days, but what I do know is that if you value silence, thermals and power consumption, there is absolutely no need to go dual GPU for 4K gaming, especially nowadays.

(Given AMD's success with Zen 2 chiplets I would expect that future is going to be multi-GPU, only the extra GPU chips will be chiplets.)

That's a reasonable expectation, and a much more elegant solution (in terms of architecture / noise / thermals) than dual discrete GPU systems!
 