Thursday, September 3rd 2020

NVIDIA Reserves NVLink Support For The RTX 3090

NVIDIA has dealt another major blow to multi-GPU gaming with its recent RTX 30 series announcement. The only card to support NVLink SLI in this latest generation will be the RTX 3090, and it will require a new NVLink bridge costing 79 USD. In its Turing range, NVIDIA reserved NVLink support for the RTX 2070 Super, RTX 2080 Super, RTX 2080, and RTX 2080 Ti. AMD's CrossFire multi-GPU solution has also become irrelevant after support for it was dropped with RDNA. Developer support for the feature has declined due to the high cost of implementation, the small user base, and often poor performance improvements. With the NVIDIA RTX 3090 set to retail for 1499 USD, the cost of a multi-GPU setup will exceed 3077 USD, reserving the feature for only the wealthiest of gamers.
Source: NVIDIA

81 Comments on NVIDIA Reserves NVLink Support For The RTX 3090

#26
RedelZaVedno
Using a 3-way SLI 1080 Ti setup with Adobe Premiere for tasks such as render in-to-out and exporting (the most time-consuming tasks) was an unbeatable price/performance combo. I used to do video editing professionally and build workstation rigs in a big IT company, and even though we could get Quadros, our department opted for 1080 Ti setups because they were the most cost-effective option out there, good enough for our needs, and the GeForce Pascal drivers were rock solid with Adobe apps. I can see why NVIDIA didn't approve of such configurations. 3x 1080 Ti cost $2,100 while 3x Quadro P5000 cost around $10K.

I could see myself building a 3x 3080 combo today if SLI were still supported. That setup would beat a single top-tier Quadro in Premiere by far, for less than half the price. Getting 26,000 CUDA cores for $2,100 would be insane value. Of course NVIDIA couldn't allow this.
Posted on Reply
#27
bug
DemonicRyzen666and high end cards have been around longer
store.steampowered.com/hwsurvey/videocard/

RTX 2080 Ti with 0.98%, why keep making them then?

That's less than 5% too.
Return on investment. When you build thousands of chips, a few of them will be defect-free, so you can enable more stuff, slap them on a slightly beefed-up PCB, and end up with a product that will reflect well on your image.
For SLI/CrossFire, you need considerable dedicated effort from both your software guys and each developer. And you end up with a product that earns you the "gratitude" of people who spent a lot of money on something that sometimes, maybe, happens to work.
Posted on Reply
#28
Vya Domus
DuxCroBtw. does anyone doubt that there will be RTX 3080 Ti?
I do, the gap between the 3080 and 3090 is exceedingly small already. By NVIDIA's own admission the 3090 is theoretically only 20% faster; in the real world that will translate to maybe 10-15% at best. So how much faster would a 3080 Ti or Super or whatever they'll call it be? 5%? That's peanuts. And keep in mind they would somehow have to keep that sweet $1,500 3090 relevant too.

Obviously I can't know for sure, but it would be really bizarre. It would be one of the smallest speed bumps ever between a Ti and non-Ti card.
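For context, the ~20% theoretical figure can be sanity-checked from the launch specs. A rough Python sketch (CUDA core counts and boost clocks as announced at launch; assumes 2 FMA FLOPs per core per cycle, and real-world scaling is lower):

```python
def fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    """Peak FP32 throughput: each CUDA core does 2 FLOPs (one FMA) per cycle."""
    return cuda_cores * 2 * boost_ghz / 1000

rtx_3080 = fp32_tflops(8704, 1.71)   # ~29.8 TFLOPS
rtx_3090 = fp32_tflops(10496, 1.70)  # ~35.7 TFLOPS
print(f"theoretical gap: {rtx_3090 / rtx_3080 - 1:.0%}")  # prints "theoretical gap: 20%"
```

Games rarely track peak FP32, which is why that 20% on paper tends to shrink to 10-15% in benchmarks.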
Posted on Reply
#29
Nater
ncrsIt's a gaming card. Professionals use Quadros (or rarely Titans) because of the certified drivers that make a 1060-equivalent Quadro P2000 over 7x faster than a 2080 Ti in professional use cases:

I've yet to see an answer on whether the 3090 is truly being treated as the "new" Titan. Will it get that sort of driver support for apps like NX and sit 50% or better above the Titan RTX on that chart?
Posted on Reply
#30
Berfs1
Daisho11Nvidia is approaching Apple-tier ridiculous pricing. But do they have the brainwashed, cult-like following, like Apple does, to get away with fleecing their customers?
Yea 30 series is totally expensive.

Have you seen the price/performance??
Posted on Reply
#31
M2B
Vya DomusI do, the gap between the 3080 and 3090 is exceedingly small already. By NVIDIA's own admission the 3090 is theoretically only 20% faster; in the real world that will translate to maybe 10-15% at best. So how much faster would a 3080 Ti or Super or whatever they'll call it be? 5%? That's peanuts. And keep in mind they would somehow have to keep that sweet $1,500 3090 relevant too.

Obviously I can't know for sure, but it would be really bizarre. It would be one of the smallest speed bumps ever between a Ti and non-Ti card.
The 3080 Ti/3080S could be just as fast as a 3090, and both would have their own market.
The 24GB buffer on the 3090 will keep it relevant; even now the only reason it is relevant is the VRAM, otherwise the extra ~15% of performance is not worth that much money over the 3080.
I have almost no doubt that if AMD can compete with the 3080, NVIDIA is ready to basically drop a 12GB 3090 and call it 3080 Ti/3080 Super or whatever, for maybe $899-999.
Posted on Reply
#32
Krzych
As a big SLI fan I find it a bit disappointing; 2x RTX 3080 would have been perfect this gen, and for the price of a single 3090, but I guess there's no cheaping out this time. Still, it isn't that big of a deal. SLI has been a high-end enthusiast thing for years anyway; if you get it now, it is 99% for already existing games, not future ones. It will now slowly be neglected even more than before. But this time it is less because games are made with bare-minimum effort and more because it is being replaced with AI that provides similar gains without requiring a second GPU, so it is understandable now more than ever, with software solutions driving the future and replacing brute force. I am still getting two cards, as I have a ton of games lined up that have support, but if NVIDIA decides to drop it completely with the 4000 series, then this setup will be the last of its kind.

It will certainly take a lot away from the building process. Only one GPU, maybe even some puny 16-lane platform will be enough, all games working to the setup's full potential plug-and-play; that's so boring. I will need to find something to make things interesting again, like 24/7 sub-ambient cooling, or making the system as small as possible.
Posted on Reply
#33
SamuelL
SandboThey are really worried that we can use the card for machine learning, I guess.

I really hope AMD could do something about the situation.
I think it's probably a concern, but SLI doesn't factor in. You actually had to remove SLI bridges to run the cards together (at least with the ML libraries I was playing with).
Raendoryes, Huang can’t sleep at night because he’s afraid someone will use 3080 or god forbid 3070 for professional machine learning activities. Don’t be ridiculous.
The 1080 Ti and 2080 Ti were/are favorites for ML tasks, even at bigger companies. ML (usually) doesn't require additional precision, so the only limiting factor for these cards vs. Quadro was memory size. 11 GB was already the breaking point for a lot of workloads, and now NVIDIA has dropped that to 8 GB on the 3070 and 10 GB on the 3080, specifically for this reason. So let's say you've been using models that consume the full 11 GB you're used to. To move forward without changes, you'll now need to buy into the 3090 or Quadros.
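The memory ceiling is easy to sketch with a back-of-the-envelope estimate (plain Python; the 16 bytes/parameter figure assumes fp32 weights plus gradients plus Adam's two moment buffers, and ignores activations and framework overhead):

```python
def training_vram_gib(n_params: int, bytes_per_param: int = 16) -> float:
    """Rough training footprint: 4 B weights + 4 B gradients
    + 8 B Adam moments = ~16 B per parameter (activations excluded)."""
    return n_params * bytes_per_param / 2**30

# Even a ~700M-parameter model overflows a 10-11 GiB card once
# gradients and optimizer state are counted:
print(round(training_vram_gib(700_000_000), 1))  # prints 10.4
```

By that estimate, the 3080's 10 GB is a step back for anyone already saturating 11 GB, while the 3090's 24 GB leaves room to grow.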
Posted on Reply
#34
CrAsHnBuRnXp
DuxCroIf you buy RTX 3090 for gaming, something is wrong with you. I don't care if you are rich. It just means you are a fool with too much money. It's a card for professional use and you would be better off just buying RTX 3080, watercooling it and OC-ing the shit out of it. Btw. does anyone doubt that there will be RTX 3080 Ti?
The 3090 is basically a Titan replacement, but it is being marketed as a gaming GPU. Only NVIDIA made Titan GPUs, but in this case board partners such as ASUS, Gigabyte, EVGA, etc. are making 3090s, which basically makes it a gaming card.
Posted on Reply
#35
bug
CrAsHnBuRnXpThe 3090 is basically a Titan replacement, but it is being marketed as a gaming GPU. Only NVIDIA made Titan GPUs, but in this case board partners such as ASUS, Gigabyte, EVGA, etc. are making 3090s, which basically makes it a gaming card.
NVIDIA themselves had a tough time placing the Titan. They always wanted to paint it as something other than a gaming card (because of price and positioning), but Titans never ran on pro drivers. Make of that what you wish.
Posted on Reply
#36
Paganstomp
... reserving the feature for only the wealthiest of gamers.

Goddamn right there! :p

WoW 170 FPS @ 2K with NV Linked RTX 2080 Supers
Posted on Reply
#37
Dux
CrAsHnBuRnXpThe 3090 is basically a Titan replacement, but it is being marketed as a gaming GPU. Only NVIDIA made Titan GPUs, but in this case board partners such as ASUS, Gigabyte, EVGA, etc. are making 3090s, which basically makes it a gaming card.
Yup. I'm sure people will buy it for gaming. But I don't know if the performance difference is big enough to justify an almost 115% higher price compared to the RTX 3080. According to Digital Foundry's discussion video after the NVIDIA presentation, the gaming performance of the RTX 3090 could maybe be reached by overclocking an RTX 3080. Unless you really do plan on gaming on an 8K display, I see no point in paying that price. But some people will always buy the best regardless of price. It might as well cost $3,000 as far as they're concerned.
Posted on Reply
#38
Krzych
DuxCroYup. I'm sure people will buy it for gaming. But I don't know if the performance difference is big enough to justify an almost 115% higher price compared to the RTX 3080. According to Digital Foundry's discussion video after the NVIDIA presentation, the gaming performance of the RTX 3090 could maybe be reached by overclocking an RTX 3080. Unless you really do plan on gaming on an 8K display, I see no point in paying that price. But some people will always buy the best regardless of price. It might as well cost $3,000 as far as they're concerned.
This is the "GTX 980 makes no sense, you can overclock a GTX 970" kind of talk: comparing a much cheaper, heavily overclocked and tuned GPU against a more expensive stock card that is thermal- and power-throttled. It makes no real sense and assumes you will get Maxwell-like 20%+ gains from overclocking. It is like people undervolting their AMD cards, comparing against NVIDIA's stock power results, and claiming similar efficiency. If you want to compare, give the same treatment to both. Especially these days, when custom cards overclock by themselves and you really need to compare your score to a bare thermal- and power-throttled FE to get a double-digit percentage gain; otherwise you get 5-7%.
Posted on Reply
#39
CrAsHnBuRnXp
DuxCroYup. I'm sure people will buy it for gaming. But I don't know if the performance difference is big enough to justify an almost 115% higher price compared to the RTX 3080. According to Digital Foundry's discussion video after the NVIDIA presentation, the gaming performance of the RTX 3090 could maybe be reached by overclocking an RTX 3080. Unless you really do plan on gaming on an 8K display, I see no point in paying that price. But some people will always buy the best regardless of price. It might as well cost $3,000 as far as they're concerned.
It will depend on where the 3080 lands FPS-wise in 4K gaming compared to the 3090. I plan on getting the 3090 because I have some points saved up on Amazon, so I can get it cheaper. But I will have to wait and see.
Posted on Reply
#40
xorbe
ncrsIt's a gaming card. Professionals use Quadros (or rarely Titans) because of the certified drivers that make a 1060-equivalent Quadro P2000 over 7x faster than a 2080 Ti in professional use cases:
Let's be honest, it's that the driver purposefully makes the 2080 Ti over 7x slower on the code path professionals use. There's no secret sauce that makes the 1060-class hardware 7x faster than a 2080 Ti. (Just sayin')

I thought SLI was already dead and gone. I thought you needed a special unlock software key from NVIDIA to even use it.

I'm waiting for an nVidia card with <= 250W and 16GB.
Posted on Reply
#41
CrAsHnBuRnXp
Paganstomp... reserving the feature for only the wealthiest of gamers.

Goddamn right there! :p

WoW 170 FPS @ 2K with NV Linked RTX 2080 Supers
Rogue 4 Lyfe! :rockout:

What is your FPS like?
Posted on Reply
#42
Paganstomp
CrAsHnBuRnXpRogue 4 Lyfe! :rockout:

What is your FPS like?
You should see it in the screenshot. 170 FPS in Orgrimmar, 120 FPS in Boralus Harbor.
Posted on Reply
#43
Sandbo
SamuelLI think it's probably a concern, but SLI doesn't factor in. You actually had to remove SLI bridges to run the cards together (at least with the ML libraries I was playing with).


The 1080 Ti and 2080 Ti were/are favorites for ML tasks, even at bigger companies. ML (usually) doesn't require additional precision, so the only limiting factor for these cards vs. Quadro was memory size. 11 GB was already the breaking point for a lot of workloads, and now NVIDIA has dropped that to 8 GB on the 3070 and 10 GB on the 3080, specifically for this reason. So let's say you've been using models that consume the full 11 GB you're used to. To move forward without changes, you'll now need to buy into the 3090 or Quadros.
I guess I might have been confused, then.
I thought NVLink was what enabled high-speed data sharing among the GPUs, not limited to SLI, unless that's only the case for GeForce cards.

And for those who think gaming cards can't run professional machine learning workloads, I guess they just have no idea.
Posted on Reply
#44
uio77
Now... the question: should I buy another 1080 Ti for SLI, or should I sell my current 1080 Ti and get a 3080? Gaming in ultrawide 3440x1440.
Posted on Reply
#45
ncrs
xorbeLet's be honest, it's that the driver purposefully makes the 2080 Ti over 7x slower on the code path professionals use. There's no secret sauce that makes the 1060-class hardware 7x faster than a 2080 Ti. (Just sayin')
Oh for sure, by buying Quadro you're paying extra for the driver and ECC enablement, since the chips are the same as GeForce ;)
Posted on Reply
#46
CrAsHnBuRnXp
PaganstompYou should see it in the screenshot. 170 FPS in Orgrimmar, 120 FPS in Boralus Harbor.
I didn't even see that down there XD.
uio77Now ..the question.. Should I buy another 1080ti for SLI or should I sell my current 1080ti and get me a 3080?? Gaming in ultrawide 3440x1440
A 3080 would be your best bet. SLI isn't supported much anymore, and you'll most likely use less power going with the 3080.
Posted on Reply
#47
Vayra86
Paganstomp... reserving the feature for only the wealthiest of gamers.

Goddamn right there! :p

WoW 170 FPS @ 2K with NV Linked RTX 2080 Supers
Is this impressive? I ran WoW Legion at a fixed 120 FPS on a single GTX 1080... The game is barely GPU-limited; the options dragging it down hit the CPU.

You can rest assured those Supers aren't scaling all that much...
Posted on Reply
#48
CrAsHnBuRnXp
Vayra86Is this impressive? I ran WoW Legion at a fixed 120 FPS on a single GTX 1080... The game is barely GPU-limited; the options dragging it down hit the CPU.

You can rest assured those Supers aren't scaling all that much...
Depends on the screen resolution and the number of addons you have running. Sometimes addons can affect performance.
Posted on Reply
#49
Vayra86
CrAsHnBuRnXpDepends on the screen resolution and the number of addons you have running. Sometimes addons can affect performance.
A bit, not much. The biggest factor is draw distances for environment details and LODs. And those all involve the CPU, too.
Posted on Reply
#50
Xaled
This is better for gamers and gaming.
I still feel bad for the rich fellows who will buy more than one 3090 and ruin their gaming experience just for benchmarks and higher but stuttering frame rates.
Posted on Reply