
AMD Readying a 10-core AM4 Processor to Thwart Core i9-9900K?

Joined
May 2, 2017
Messages
7,762 (2.72/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
For most users that's actually enough.
I don't disagree, but with the increasing popularity of NVMe drives another few lanes would be a nice form of future-proofing.

LOL! No it isn't. A solid 20% of my clients run either CrossFire or SLI (mostly SLI). Multi-GPU gaming has maintained a steady level of popularity for more than 10 years. It's not going anywhere.
Well, that would place them solidly in the "more money than sense" category. I suppose you can't stop people from wasting money. The RTX 2070 doesn't support SLI, meaning that the minimum price for a dual GPU setup in the future will be ~$1400. According to GamersNexus' recent test of RTX SLI, scaling is actually slightly better than it has been (probably thanks to NVLink), but still unpredictable and requires per-game profiles, limiting support to a handful of titles per year. In other words, in most games your $1400 SLI setup will perform no better than a $699 single card, let alone a $1200 Ti - which works in every game. That's dead enough for me.
 

hat

Enthusiast
Joined
Nov 20, 2006
Messages
21,750 (3.26/day)
Location
Ohio
System Name Starlifter :: Dragonfly
Processor i7 2600k 4.4GHz :: i5 10400
Motherboard ASUS P8P67 Pro :: ASUS Prime H570-Plus
Cooling Cryorig M9 :: Stock
Memory 4x4GB DDR3 2133 :: 2x8GB DDR4 2400
Video Card(s) PNY GTX1070 :: Integrated UHD 630
Storage Crucial MX500 1TB, 2x1TB Seagate RAID 0 :: Mushkin Enhanced 60GB SSD, 3x4TB Seagate HDD RAID5
Display(s) Onn 165hz 1080p :: Acer 1080p
Case Antec SOHO 1030B :: Old White Full Tower
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro - Bose Companion 2 Series III :: None
Power Supply FSP Hydro GE 550w :: EVGA Supernova 550
Software Windows 10 Pro - Plex Server on Dragonfly
Benchmark Scores >9000
I don't disagree, but with the increasing popularity of NVMe drives another few lanes would be a nice form of future-proofing.

Supposedly Ryzen (at least the 2600x) has 20 lanes, 16 for video card and 4 for... whatever else, usually those NVMe drives you speak of. Beyond that the chipset provides more. Not sure what else you would be looking for.

Well, that would place them solidly in the "more money than sense" category. I suppose you can't stop people from wasting money. The RTX 2070 doesn't support SLI, meaning that the minimum price for a dual GPU setup in the future will be ~$1400. According to GamersNexus' recent test of RTX SLI, scaling is actually slightly better than it has been (probably thanks to NVLink), but still unpredictable and requires per-game profiles, limiting support to a handful of titles per year. In other words, in most games your $1400 SLI setup will perform no better than a $699 single card, let alone a $1200 Ti - which works in every game. That's dead enough for me.

This I agree with. SLI is plagued with problems and meh performance gains even in titles that support it. Even if someone handed me $10,000 and told me I MUST use it to buy myself a computer, SLI (or xfire) would still not be on the list. I have two 1070s right now only because of mining. If it weren't for that, I would still more than likely be rocking my old 660 Ti.
 
Joined
Jul 5, 2013
Messages
29,307 (6.89/day)
I don't disagree, but with the increasing popularity of NVMe drives another few lanes would be a nice form of future-proofing.
But..
Supposedly Ryzen (at least the 2600x) has 20 lanes, 16 for video card and 4 for... whatever else, usually those NVMe drives you speak of. Beyond that the chipset provides more. Not sure what else you would be looking for.
This.
Well, that would place them solidly in the "more money than sense" category.
Or people that have the money and can, comfortably or not, afford the setup and want the extra performance.
limiting support to a handful of titles per year.
Rubbish, SLI/Crossfire support is driver-centric. All games will run fine in a dual GPU config.
SLI is plagued with problems and meh performance gains even in titles that support it.
Also rubbish. I haven't seen a show-stopping bug/glitch/problem in over three years, and the last one had an easy workaround until AMD fixed the driver. I haven't seen an SLI-related problem that affected the systems I've built in over five years.
 

hat

Enthusiast
Joined
Nov 20, 2006
Messages
21,750 (3.26/day)
Also rubbish. I haven't seen a show-stopping bug/glitch/problem in over three years, and the last one had an easy workaround until AMD fixed the driver. I haven't seen an SLI-related problem that affected the systems I've built in over five years.

I've seen problems with SLI personally (in my uncle's system), but granted that was ages ago with two 8800GTS 320MB cards. Between that and the nonstop lamenting over SLI/xFire before and after, to this day, all over the net is more than enough to put me off of it.
 
Joined
Jul 5, 2013
Messages
29,307 (6.89/day)
I've seen problems with SLI personally (in my uncle's system), but granted that was ages ago with two 8800GTS 320MB cards. Between that and the nonstop lamenting over SLI/xFire before and after, to this day, all over the net is more than enough to put me off of it.
I'm not saying dual-GPU systems don't have glitches and issues once in a while, but these problems, like many others, get blown way out of proportion. I've been building gaming PCs since the original Voodoo SLI and have never seen the kind of problems a lot of people lament over. The worst problem with multi-GPU setups I ever encountered was with the Voodoo2s. Even that was just a matter of figuring out what the problem was.

With the new RTX series cards, SLI seems like an attractive prospect for those who can afford it. However..
The RTX 2070 doesn't support SLI
I had to look this up. The 2070 and below will not have NVLink. That does not mean they will not still have the standard SLI bridge; Nvidia has not stated that it will not be available.
 
Joined
May 2, 2017
Messages
7,762 (2.72/day)
Supposedly Ryzen (at least the 2600x) has 20 lanes, 16 for video card and 4 for... whatever else, usually those NVMe drives you speak of. Beyond that the chipset provides more. Not sure what else you would be looking for.
That's the 16+4+4 I mentioned above: 16 for graphics (and general usage, really), 4 for NVMe (or, again, anything, really), and 4 for the chipset link. The chipsets only provide PCIe 2.0 (8 lanes on the x70-series chipsets, 6 on the x50). So if you want/need more than one full-speed NVMe SSD (which is growing more likely as time passes), you need to eat into the 16 GPU lanes, which means that no/few motherboards will provide more than one NVMe port from the CPU-connected lanes - they'll use the chipset 2.0 lanes instead. Of course, running your GPU at x8 isn't actually a problem, but putting an SSD on the freed-up lanes requires a motherboard that splits them out, or a riser/adapter card.
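To put rough numbers on why the chipset lanes matter, here's a back-of-the-envelope sketch. The ~0.5 GB/s and ~0.985 GB/s per-lane figures are the usual rule-of-thumb approximations for PCIe 2.0 and 3.0, not something from this thread:

```python
# Approximate usable per-lane bandwidth (GB/s, one direction) for
# PCIe 2.0 and 3.0 - rough rule-of-thumb figures, not measurements.
GBPS_PER_LANE = {2: 0.5, 3: 0.985}

def link_bandwidth(gen: int, lanes: int) -> float:
    """Approximate one-way bandwidth of a PCIe link in GB/s."""
    return GBPS_PER_LANE[gen] * lanes

# An NVMe SSD on the CPU's dedicated x4 Gen3 lanes vs. one hanging
# off the chipset's Gen2 lanes:
cpu_nvme = link_bandwidth(3, 4)      # ~3.94 GB/s
chipset_nvme = link_bandwidth(2, 4)  # ~2.00 GB/s

print(f"CPU-attached NVMe:     ~{cpu_nvme:.2f} GB/s")
print(f"Chipset-attached NVMe: ~{chipset_nvme:.2f} GB/s")
```

In other words, a second NVMe drive behind the chipset gets roughly half the bandwidth of one on the CPU's dedicated x4 link, which is the trade-off being described above.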

Rubbish, SLI/Crossfire support is driver-centric. All games will run fine in a dual GPU config.
Yes, it depends on drivers - SLI profiles in the drivers, specifically. SLI has zero effect without a bespoke profile for the game in question (activating it in a game without a profile usually leads to a tiny but measurable performance drop, bugginess, or nothing at all happening). For some games, modders even make their own profiles, with varying success. The only difference between SLI and DX12 multi-GPU in this regard is that the effort lies with Nvidia and not the developer. The statement that "all games will run fine in a dual GPU config" is thus either false (no performance scaling without a profile) or meaningless (defining "running fine" as not requiring performance scaling, invalidating the point of SLI).

As for the 2070 having SLI, there are no SLI fingers visible on the back of the board (scroll down for a picture of the back). For previous cards, the SLI fingers needed a cut-out in the backplate, so unless they've redesigned the entire SLI interface, it doesn't have it. There isn't room to fit the bridge connector between the backplate and the PCB, so a cutout or fingers sticking up past the backplate would be necessary. The NVLink slot is also visible from the back on the 2080 and 2080 Ti. Nvidia cut SLI from the third-largest die (then the 60-series) previously, so it's no surprise if they keep to this line now that the third-largest die is in the 70-series.

SLI gives you the ultimate performance in the (relatively few) games that support it for the people who can afford it, but given the cost and what you gain back, it's an utter waste of money.
 
Joined
Jul 5, 2013
Messages
29,307 (6.89/day)
As for the 2070 having SLI, there are no SLI fingers visible on the back of the board (scroll down for a picture of the back).
That's a CGI mock-up, not an actual photograph. And the FE RTX cards and many of the AIB cards have a removable cover for the NVLink connector; the FE RTX 2070 cards likely have the same.
The statement that "all games will run fine in a dual GPU config" is thus either false (no performance scaling without a profile) or meaningless (defining "running fine" as not requiring performance scaling, invalidating the point of SLI).
What I meant was that all games will benefit from SLI/CF. I have yet to see a game that doesn't get at least some performance increase from a multi-GPU setup, when properly configured.
SLI gives you the ultimate performance in the (relatively few) games that support it for the people who can afford it, but given the cost and what you gain back, it's an utter waste of money.
That is entirely your opinion, not shared by all.
 
Joined
May 2, 2017
Messages
7,762 (2.72/day)
That's a CGI mock-up, not an actual photograph. And the FE RTX cards and many of the AIB cards have a removable cover for the NVLink connector; the FE RTX 2070 cards likely have the same.
The same mock-ups of the 2080 and 2080 Ti have the NVLink connector (with its cover) very clearly visible (it protrudes slightly from the edge of the backplate). Official renders of the 970 and 1070 also clearly showed the SLI fingers. The renders of the 2070 show nothing but a straight edge there - clearly no NVLink connector, and no SLI finger cutout either. Official product renders for Founders Edition cards also tend to match the final product quite closely.

What I meant was that all games will benefit from SLI/CF. I have yet to see a game that doesn't get at least some performance increase from a multiGPU setup, when properly configured.
I think what you mean is that all games can benefit from it. Will implies that it'll happen in time, which it won't - even Nvidia doesn't have the resources to do all that development. The problem is that >95% of games never come close to "properly configured" for SLI. The vast majority never even get profiles, and many of those that do never see more than 30-40% scaling (there are exceptions - in the GamersNexus 2080 Ti SLI scaling review I referenced above, they had a title with >90% scaling!). Of course, some people don't mind paying 2x the price for 1.4x the performance in <5% of titles, and that's of course their right - but that won't make me stop calling it dumb, bad value, poorly implemented, and generally problematic. 'Cause it is.
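For what it's worth, the perf-per-dollar argument can be sketched numerically. The $699 card price is from earlier in the thread; the 40% scaling figure is an illustrative typical-case assumption, not a measurement:

```python
# Illustrative perf-per-dollar comparison for a two-card SLI setup,
# using the ~$699 card price mentioned above and an assumed ~40%
# typical scaling in profiled titles. All figures are rough.
single_price = 699.0
sli_price = 2 * single_price            # two cards, ignoring the bridge

single_perf = 1.0                       # one card normalized to 1.0
sli_perf_profiled = 1.4                 # ~40% scaling with a good profile
sli_perf_no_profile = 1.0               # no profile -> no scaling at all

print(single_perf / single_price)       # value of one card
print(sli_perf_profiled / sli_price)    # best case: ~30% worse value
print(sli_perf_no_profile / sli_price)  # unprofiled case: half the value
```

Even in the best case here, the SLI setup delivers only 70% of the single card's performance per dollar; in an unprofiled game it's exactly half.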

I'm quite a fan of the concept behind SLI/CF (I even had CF 4850s back in the day, which worked decently in supported titles up until their 512MB of RAM started being an issue). The problem is that until multi-GPU can be implemented universally and transparently on a general driver basis (meaning no developer effort required, unlike DX12 multi-GPU, and much less driver development required, unlike SLI/CF), it's going to be a niche solution with disappointing results and egregiously poor value.
 
Joined
Jul 5, 2013
Messages
29,307 (6.89/day)
The same mock-ups of the 2080 and 2080 Ti have the NVLink connector (with its cover) very clearly visible (it protrudes slightly from the edge of the backplate). Official renders of the 970 and 1070 also clearly showed the SLI fingers. The renders of the 2070 show nothing but a straight edge there - clearly no NVLink connector, and no SLI finger cutout either. Official product renders for Founders Edition cards also tend to match the final product quite closely.
Until they actually officially announce that they will not have SLI, I'm not willing to accept such. They would be shooting themselves in the foot and handing AMD a whole class of customers if they didn't continue SLI on mid-range cards.
I think what you mean is that all games can benefit from it. Will implies that it'll happen in time
Right, bad choice of vocabulary.

We are way off topic here, let's rein it in...
 
Joined
May 2, 2017
Messages
7,762 (2.72/day)
Until they actually officially announce that they will not have SLI, I'm not willing to accept such. They would be shooting themselves in the foot and handing AMD a whole class of customers if they didn't continue SLI on mid-range cards.
Don't disagree here (I'm generally not a fan of making features exclusive to high-end SKUs), but seeing how they cut it from XX06 cards last generation, it'd be a bit strange for them to bring back support for this chip tier this generation. Of course, the separation of the three topmost SKUs into three separate silicon dice is itself unprecedented, so who knows what they'll end up doing?
 
Joined
May 31, 2016
Messages
4,486 (1.41/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Here are some SLI tests for the 2080 Ti. It looks pretty good.

 

Earlzmoade

New Member
Joined
Sep 24, 2018
Messages
4 (0.00/day)
Lol. This is gonna be embarrassing. I've decided to find those old slides for you, and out of all the sources the first one that came up on Google was an article from WCCFtech with the informative title "Fake AMD Ryzen 2800X 12 Core 5.1GHz Slide Sends Media Into Frenzy" ))))
So much for keeping up with news.... :banghead:

So all we have to go on is a now-taken-down and nonexistent MSI promotional video for a B450 motherboard that claimed "8-core and up CPU" support... All clues and hints have been meticulously erased.


Not just MSI.

If you look at the manual for the ASRock X370 Pro BTC+, you can see that the BIOS has CPU overclocking options for 16 cores.
 
Joined
May 2, 2017
Messages
7,762 (2.72/day)
Not just MSI.

If you look at the manual for the ASRock X370 Pro BTC+, you can see that the BIOS has CPU overclocking options for 16 cores.
... Which fits perfectly with AMD moving their top-end consumer parts to the same silicon as the currently sampling 7nm EPYC with the 3000-series (with two 8-core CCXes per die for a maximum of 64 cores in 4-die EPYC). There's no reason to suspect this is relevant before then.

This aligns with AMD's current strategy, as well as reasonable expectations of its extension into the future. It is as such the answer that requires the fewest new assumptions (no deviations from current strategy, no unknown silicon, no reconfiguration of the architecture that we don't know of) and is thus the best hypothesis according to Occam's razor.
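The arithmetic behind that hypothesis is simple enough to sketch. Note the 2x 8-core-CCX die layout is the rumor described above, not a confirmed spec:

```python
# Core counts implied by the CCX/die configurations discussed above.
# The 8-core-CCX 7nm die is rumored, not confirmed.
def total_cores(dies: int, ccx_per_die: int, cores_per_ccx: int) -> int:
    """Total cores for a package built from full (undisabled) CCXs."""
    return dies * ccx_per_die * cores_per_ccx

print(total_cores(1, 2, 4))  # current Ryzen die (Zen/Zen+): 8
print(total_cores(1, 2, 8))  # rumored 7nm die on AM4: 16
print(total_cores(4, 2, 8))  # four such dies in an EPYC package: 64
```

A BIOS exposing overclocking options for 16 cores fits the single-die 7nm case exactly, with no new silicon assumed.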
 
Joined
Oct 22, 2014
Messages
14,358 (3.80/day)
Location
Sunshine Coast
System Name H7 Flow 2024
Processor AMD 5800X3D
Motherboard Asus X570 Tough Gaming
Cooling Custom liquid
Memory 32 GB DDR4
Video Card(s) Intel ARC A750
Storage Crucial P5 Plus 2TB.
Display(s) AOC 24" Freesync 1m.s. 75Hz
Mouse Lenovo
Keyboard Eweadn Mechanical
Software W11 Pro 64 bit
Joined
May 2, 2017
Messages
7,762 (2.72/day)
Wrong thread.
Not really - we went off on a bit of a tangent for the past couple of pages. Still, not really on-topic, but neither were the posts preceding it.
 
Joined
Sep 26, 2018
Messages
47 (0.02/day)
With four cores per CCX, 10 cores aren't possible? Well then, why not go to 12 cores? Some rejects with two failed cores turned off would be the 10-core parts.
 
Joined
Jul 5, 2013
Messages
29,307 (6.89/day)
With four cores per CCX, 10 cores aren't possible? Well then, why not go to 12 cores? Some rejects with two failed cores turned off would be the 10-core parts.
That concept has already been covered/suggested. I agree that it's possible with some sort of combination. We will see what happens.
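The salvage idea can even be enumerated directly. A quick sketch, assuming the hypothetical 3-CCX, 12-core die implied by the post above:

```python
from itertools import product

# Hypothetical 12-core die built from 3 CCXs of 4 cores each.
# Enumerate the per-CCX enabled-core counts that yield a 10-core
# salvage part (two failed cores fused off somewhere on the die).
CCXS, CORES_PER_CCX = 3, 4

shapes = sorted({tuple(sorted(cfg))
                 for cfg in product(range(CORES_PER_CCX + 1), repeat=CCXS)
                 if sum(cfg) == 10})
print(shapes)  # [(2, 4, 4), (3, 3, 4)]
```

Either shape gets you to 10 cores; which one would ship depends on how symmetric the core disabling has to be.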
 
Joined
May 7, 2009
Messages
5,392 (0.93/day)
Location
Carrollton, GA
System Name ODIN
Processor AMD Ryzen 7 5800X
Motherboard Gigabyte B550 Aorus Elite AX V2
Cooling Dark Rock 4
Memory G Skill RipjawsV F4 3600 Mhz C16
Video Card(s) MSI GeForce RTX 3080 Ventus 3X OC LHR
Storage Crucial 2 TB M.2 SSD :: WD Blue M.2 1TB SSD :: 1 TB WD Black VelociRaptor
Display(s) Dell S2716DG 27" 144 Hz G-SYNC
Case Fractal Meshify C
Audio Device(s) Onboard Audio
Power Supply Antec HCP 850 80+ Gold
Mouse Corsair M65
Keyboard Corsair K70 RGB Lux
Software Windows 10 Pro 64-bit
Benchmark Scores I don't benchmark.
With four cores per CCX, 10 cores aren't possible? Well then, why not go to 12 cores? Some rejects with two failed cores turned off would be the 10-core parts.

They will not do that, for the same reason they will not make a 16-core Ryzen chip, and why I think a 10-core Ryzen 2800X is likely just a rumor: it doesn't make any sense to cannibalize your own market segments. We already have a 12-core/24-thread Threadripper chip in the 1920X, with a possible 2920X on the way. There is no reason to shoot yourself in the foot by offering a 10- or 12-core Ryzen chip.
 
Joined
Jul 5, 2013
Messages
29,307 (6.89/day)
There is no reason to shoot yourself in the foot by offering a 10- or 12-core Ryzen chip.
But that isn't what would happen. They would be making use of inventory that would otherwise sit unused. That's not shooting oneself in the foot, it's being smart. Shooting themselves in the foot would be wasting those unused dies.
 
Joined
Sep 25, 2012
Messages
424 (0.09/day)
Location
Brooklyn, New York
Processor AMD Ryzen 7 2700X
Motherboard MSI Gaming M7 AC
Cooling AlphaCool Eisbaer 360
Memory G. Skill Trident X DDR4 8GBx2 (16 GB) 4266mhz dimms
Video Card(s) MSI Gaming X Twin Frozr GTX 1080 Ti
Storage 512GB Samsung 960 EVO M2 NVMe drive,500 GB Samsung 860 EVO ssd, 1 TB Samsung 840 EVO SSD
Display(s) Samsung 28 inch 4k Freesync monitor
Case ThermalTake V71 Full tower gaming case
Power Supply Corsair 1200 watt HX Platinum PSU
Mouse Razor Mamba Tournament Edition
Keyboard Das tactile mechanical gaming keyboard
Software Windows 10 Pro
Benchmark Scores Cinebench 15 64 bit Open GL 146.7 FPS Cinebench 15 CPU 1958 at 4.25 GHZ Priority set to real-time
Not based on a tinge of evidence, and it would likely massively increase the current die size for Zen+. It's an idiotic article by people who need to generate buzz when there is nothing out there at all to substantiate it. Fake news, in this case.
 
Joined
Sep 25, 2012
Messages
424 (0.09/day)
That's the 16+4+4 I mentioned above: 16 for graphics (and general usage, really), 4 for NVMe (or, again, anything, really), and 4 for the chipset link. The chipsets only provide PCIe 2.0 (8 lanes on the x70-series chipsets, 6 on the x50). So if you want/need more than one full-speed NVMe SSD (which is growing more likely as time passes), you need to eat into the 16 GPU lanes, which means that no/few motherboards will provide more than one NVMe port from the CPU-connected lanes - they'll use the chipset 2.0 lanes instead. Of course, running your GPU at x8 isn't actually a problem, but putting an SSD on the freed-up lanes requires a motherboard that splits them out, or a riser/adapter card.



The X470 chipset did away with PCIe 2.0 lanes; all chipset lanes are PCIe 3.0 on X470.
 
Joined
May 2, 2017
Messages
7,762 (2.72/day)
The X470 chipset did away with PCIe 2.0 lanes; all chipset lanes are PCIe 3.0 on X470.
No.
AMD said:
PCI EXPRESS® GP*
x8 Gen2 (plus x2 PCIe® Gen3 when no x4 NVMe)
Link (scroll down the page).
 