
any video editors here who use the 2080ti/sli?

Joined
Sep 23, 2023
Messages
383 (0.98/day)
Unpopular opinion, obviously thinking out loud here. Everywhere I look, in all the benches, the 2080 Ti beats the 3070 and 2080 Ti SLI beats the 3090.

I'm curious about a budget build to get into editing. Maybe Premiere, maybe Resolve.

3950X and dual 2080 Ti in SLI.

Yes, I've read the same cons everywhere: SLI sucks, no driver support, not all games scale, it's dead, RT and DLSS suck. "SLI sucks" is an empty statement. No driver support? I don't update drivers often, if at all, maybe once or twice over the life of a GPU. Not all games scale? Even a single 2080 Ti will be enough for games for me. I'm not a serious gamer; sometimes I can't play more than 5 minutes without getting annoyed. RT/DLSS doesn't interest me. I know people are always jumping on the newest because they say newest is best, but I don't see the whole package as being better. Maybe I'm missing something.

From every YT video and bench I've seen, one 2080 Ti performs better than a 3070, and dual 2080 Ti outperforms a 3080 and a 3090. It's a cheap entry into high performance while people are selling their 3090s for high prices, and it's not that I can't pay the price, it's that I don't want to. Just like paying $1000 for a phone: I always buy old flagships for $250 because they have more than I will ever use. I don't game, browse, bank, or do social media on my phone. I do very little on my phone.

I'm not a serious gamer today. I don't need the things most people buy RTX cards for. I don't need RT or DLSS; 60 fps makes me super happy. By the way, I've never played multiplayer, only solo campaigns, in any game.

I'm thinking about the 2080 Ti SLI first for editing, then other things like some gaming. I think even one 2080 Ti would be OK for my gaming needs.

Sure, you can buy a new Porsche for $150k, but you can also buy a BMW M5 E39 that's no slouch for less money and does what you need.

I'm trying to weigh the cons of 2080 Ti SLI against a 3090. A 3080 with 10/12 GB of VRAM is not enough for heavy 4K editing; you will run out of memory and crash. So better to get it right from the get-go than compromise. And dual 2080 Ti outperforms a 3080.
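To put a rough number on that VRAM concern, here is a back-of-the-envelope sketch (my own arithmetic, not a figure from any benchmark), assuming the editor works internally in 32-bit float RGBA, which is a common working format; caches, temporal effects and stacked layers keep many such frames resident at once, on top of the effects' own buffers.

```python
# Back-of-the-envelope VRAM estimate for UHD frames in 32-bit float RGBA
# (assumed internal working format; real usage depends on the app, effects and caches).
width, height = 3840, 2160
channels, bytes_per_channel = 4, 4                  # RGBA, float32
frame_bytes = width * height * channels * bytes_per_channel

frames_resident = 24                                # hypothetical: caches, temporal NR, layered timeline
total_gib = frames_resident * frame_bytes / 2**30

print(f"one frame : {frame_bytes / 2**20:.0f} MiB")      # ~127 MiB
print(f"{frames_resident} frames: {total_gib:.1f} GiB")  # ~3.0 GiB before any effect allocations
```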
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
41,670 (6.60/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
OK, dual-GPU support was passed on to game devs to integrate into their games as mGPU (explicit multi-GPU); if a game doesn't support mGPU, it's not going to do you any favors.
 

Solaris17

Super Dainty Moderator
Staff member
Joined
Aug 16, 2005
Messages
26,743 (3.82/day)
Location
Alabama
System Name RogueOne
Processor Xeon W9-3495x
Motherboard ASUS w790E Sage SE
Cooling SilverStone XE360-4677
Memory 128gb Gskill Zeta R5 DDR5 RDIMMs
Video Card(s) MSI SUPRIM Liquid X 4090
Storage 1x 2TB WD SN850X | 2x 8TB GAMMIX S70
Display(s) Odyssey OLED G9 (G95SC)
Case Thermaltake Core P3 Pro Snow
Audio Device(s) Moondrop S8's on schitt Modi+ & Valhalla 2
Power Supply Seasonic Prime TX-1600
Mouse Lamzu Atlantis mini (White)
Keyboard Monsgeek M3 Lavender, Akko Crystal Blues
VR HMD Quest 3
Software Windows 11 Pro Workstation
Benchmark Scores I dont have time for that.
OK, dual-GPU support was passed on to game devs to integrate into their games as mGPU (explicit multi-GPU); if a game doesn't support mGPU, it's not going to do you any favors.
This; Premiere won't use it.

DaVinci Resolve does, but not SLI, just multi-GPU, and only in the paid Studio tier. Even then I'm not certain it still supports 20-series cards.

But I mean, the OP could have found all of this with 35 seconds of googling and a visit to the support site for each.
 
Joined
Sep 23, 2023
Messages
383 (0.98/day)
This; Premiere won't use it.

DaVinci Resolve does, but not SLI, just multi-GPU, and only in the paid Studio tier. Even then I'm not certain it still supports 20-series cards.

But I mean, the OP could have found all of this with 35 seconds of googling and a visit to the support site for each.
In the PugetBench charts for Resolve, 2×2080 Ti performs better than the 3090.

And I don't use Google.
 
Joined
Sep 23, 2023
Messages
383 (0.98/day)
OK, Bing or DuckDuckGo? A search engine, if you prefer.
I searched DDG, I searched Reddit, asked on Reddit, asked in video forums, and watched tons of YT videos. PugetBench shows dual 2080 Ti on top.

"Not supported" is a term that says Nvidia has influenced them to push the 30/40-series cards, nothing else. When you have older cards outperforming your newer cards and fewer sales, well, then you pressure them. I also think AMD/Nvidia pay Blackmagic and Adobe to make their cards more compatible. Not sure why AMD didn't pay up to make their cards more compatible with Resolve; they are absolutely terrible in video editing performance versus Nvidia.
 
Joined
Jul 5, 2013
Messages
27,107 (6.57/day)
I searched Reddit.
Well, there's part of your problem: don't do that. Reddit is a cesspool of a website, not worthy of a moment's consideration.
PugetBench shows dual 2080 Ti on top.
Naturally. DaVinci Resolve is capable of multi-GPU use. Not sure it's specifically SLI, I kinda doubt it. To be fair, unless money is tight, you should focus on an RTX 4070 Ti or better for the editing you're going to do. A pair of 2080 Tis might work, but a single 4070 Ti would be easier in the long run.
 
Joined
Sep 7, 2010
Messages
853 (0.17/day)
Location
Nairobi, Kenya
Processor Intel Core i7-14700K
Motherboard ASUS ROG STRIX Z790-H
Cooling DeepCool AK500 WH
Memory Crucial Pro 32GB Kit (16GB x 2) DDR5-5600 (CP2K16G56C46U5)
Video Card(s) Intel ARC A770 Limited Edition
Storage Solidigm P44 Pro (2TB x 2) / PNY CS3140 2TB
Display(s) Philips 32M1N5800A
Case Lian Li O11 Air Mini (White)
Power Supply Seasonic Prime Fanless Titanium 600W
Keyboard Dell KM714 Wireless
Software Windows 11 Pro x64
In the PugetBench charts for Resolve, 2×2080 Ti performs better than the 3090.
Benchmarks aside, how a GPU does depends heavily on your workflow and the codecs you are using or plan to use.
"Not supported" is a term that says Nvidia has influenced them to push the 30/40-series cards, nothing else. When you have older cards outperforming your newer cards and fewer sales, well, then you pressure them. I also think AMD/Nvidia pay Blackmagic and Adobe to make their cards more compatible. Not sure why AMD didn't pay up to make their cards more compatible with Resolve; they are absolutely terrible in video editing performance versus Nvidia.
Mostly, for editing in Adobe Premiere or DaVinci Resolve, the codecs supported by the GPU play a very big role in accelerating timeline scrubbing and the final export.
I'd say Intel and Nvidia have better codec support than AMD, but as of the RX 7000 series the codec support on AMD is also good.

Though with the software you mentioned, I'd personally lean towards Intel's Quick Sync; it has the best codec support of the three manufacturers.

Check out this video, it might shed some light.
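To make the codec point concrete, a quick way to see what you would actually be asking the GPU to decode is to probe your source files. A minimal sketch, assuming ffprobe (part of FFmpeg) is installed and on the PATH; the file name is hypothetical:

```python
# Print the codec, profile and pixel format of a clip's first video stream.
# Whether a given GPU can hardware-decode the clip depends on exactly these fields.
import json
import subprocess

def probe_video(path: str) -> dict:
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_streams", "-select_streams", "v:0", path],
        capture_output=True, text=True, check=True,
    )
    return json.loads(out.stdout)["streams"][0]

stream = probe_video("wedding_clip.mp4")  # hypothetical file name
print(stream["codec_name"], stream.get("profile"), stream.get("pix_fmt"))
# e.g. HEVC Main 10 4:2:0 (yuv420p10le) decodes fine on a 2080 Ti's NVDEC,
# while the 4:2:2 10-bit H.264/HEVC that many Sony cameras record generally does not.
```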

 
Joined
Sep 23, 2023
Messages
383 (0.98/day)
A pair of 2080 Tis might work, but a single 4070 Ti would be easier in the long run.
OK, so I'm curious: what won't I be able to do in 2-4 years with 2080 Ti SLI IN VIDEO EDITING that the 4070 Ti will?
Gaming I don't care about. I'll be perfectly happy even with no SLI support. I'm not much of a gamer today; I'm still playing Mirror's Edge from 2016.

Benchmarks aside, how a GPU does depends heavily on your workflow and the codecs you are using or plan to use.

Mostly, for editing in Adobe Premiere or DaVinci Resolve, the codecs supported by the GPU play a very big role in accelerating timeline scrubbing and the final export.
I'd say Intel and Nvidia have better codec support than AMD, but as of the RX 7000 series the codec support on AMD is also good.

Though with the software you mentioned, I'd personally lean towards Intel's Quick Sync; it has the best codec support of the three manufacturers.

Check out this video, it might shed some light.

I saw charts for the 7000 series; even the 7900 isn't that strong. I'd jump on AMD if they had the same results as Nvidia, but it's not because of the GPU, it's because of "business" that it doesn't work as well.
 
Joined
Sep 7, 2010
Messages
853 (0.17/day)
Location
Nairobi, Kenya
Processor Intel Core i7-14700K
Motherboard ASUS ROG STRIX Z790-H
Cooling DeepCool AK500 WH
Memory Crucial Pro 32GB Kit (16GB x 2) DDR5-5600 (CP2K16G56C46U5)
Video Card(s) Intel ARC A770 Limited Edition
Storage Solidigm P44 Pro (2TB x 2) / PNY CS3140 2TB
Display(s) Philips 32M1N5800A
Case Lian Li O11 Air Mini (White)
Power Supply Seasonic Prime Fanless Titanium 600W
Keyboard Dell KM714 Wireless
Software Windows 11 Pro x64
OK, so I'm curious: what won't I be able to do in 2-4 years with 2080 Ti SLI IN VIDEO EDITING that the 4070 Ti will?
Gaming I don't care about. I'll be perfectly happy even with no SLI support. I'm not much of a gamer today; I'm still playing Mirror's Edge from 2016.
What you can't do now, and won't be able to do in 2-4 years' time, with a 2080 Ti is AV1 encoding and decoding; it doesn't have hardware support for that. Anyway, that's assuming you have AV1 in your workflow or would want to use it in the future. That's just one example.
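As a quick self-check on the AV1 point, you can ask an FFmpeg build which AV1 encoders it exposes. Note this only reflects what the build was compiled with; the hardware encoders still need a GPU that has the silicon, so av1_nvenc will fail to initialize on a 2080 Ti. A minimal sketch, assuming ffmpeg is on the PATH:

```python
# List the AV1 encoders this FFmpeg build was compiled with.
# Build support is not the same as hardware support: av1_nvenc will refuse to
# initialize at runtime on cards without an AV1 hardware encoder (e.g. a 2080 Ti).
import subprocess

out = subprocess.run(["ffmpeg", "-hide_banner", "-encoders"],
                     capture_output=True, text=True).stdout
for line in out.splitlines():
    if "av1" in line.lower():
        print(line.rstrip())
# Typical entries: libaom-av1 / libsvtav1 (CPU), av1_nvenc / av1_qsv / av1_amf (hardware).
```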
 
Joined
Sep 23, 2023
Messages
383 (0.98/day)
Good information, thanks.

My post is about verifying before going in. I'm thinking out loud. I'm still not sure, but judging by the number of closed-minded folks here in my country, I'll mostly be doing wedding edits shot on Sony cameras. So we'll see. Not sure I'll have a need for AV1.
 
Joined
Sep 7, 2010
Messages
853 (0.17/day)
Location
Nairobi, Kenya
Processor Intel Core i7-14700K
Motherboard ASUS ROG STRIX Z790-H
Cooling DeepCool AK500 WH
Memory Crucial Pro 32GB Kit (16GB x 2) DDR5-5600 (CP2K16G56C46U5)
Video Card(s) Intel ARC A770 Limited Edition
Storage Solidigm P44 Pro (2TB x 2) / PNY CS3140 2TB
Display(s) Philips 32M1N5800A
Case Lian Li O11 Air Mini (White)
Power Supply Seasonic Prime Fanless Titanium 600W
Keyboard Dell KM714 Wireless
Software Windows 11 Pro x64
Good information, thanks.

My post is about verifying before going in. I'm thinking out loud. I'm still not sure, but judging by the number of closed-minded folks here in my country, I'll mostly be doing wedding edits shot on Sony cameras. So we'll see. Not sure I'll have a need for AV1.
You might want to look into bit depth and chroma subsampling; the sooner you familiarize yourself with those, the easier it will be to choose a workflow.
For example, 4:2:2 10-bit is mostly supported by Intel Quick Sync, which is found in their iGPUs and dGPUs, and is also supported on some Apple M-series chips.
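If it helps to tie those terms to actual files, FFmpeg reports both in the pixel-format name: yuv420p is 4:2:0 8-bit, yuv422p10le is 4:2:2 10-bit (what a lot of Sony XAVC footage uses). A quick-and-dirty sketch for reading those names; the string matching is simplistic and only meant as an illustration:

```python
# Roughly interpret an FFmpeg pixel-format name as chroma subsampling + bit depth.
# Simplistic matching for illustration only; it ignores many exotic formats.
def describe_pix_fmt(pix_fmt: str) -> tuple[str, int]:
    if "444" in pix_fmt:
        subsampling = "4:4:4"
    elif "422" in pix_fmt:
        subsampling = "4:2:2"
    elif "420" in pix_fmt:
        subsampling = "4:2:0"
    else:
        subsampling = "unknown"
    bit_depth = 12 if "12" in pix_fmt else 10 if "10" in pix_fmt else 8
    return subsampling, bit_depth

print(describe_pix_fmt("yuv420p"))      # ('4:2:0', 8)  -- typical consumer H.264
print(describe_pix_fmt("yuv422p10le"))  # ('4:2:2', 10) -- common in Sony XAVC recordings
```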
 
Joined
Sep 23, 2023
Messages
383 (0.98/day)
You might want to look into bit depth and chroma subsampling; the sooner you familiarize yourself with those, the easier it will be to choose a workflow.
For example, 4:2:2 10-bit is mostly supported by Intel Quick Sync, which is found in their iGPUs and dGPUs, and is also supported on some Apple M-series chips.
OK, thanks. I have much to learn, that's for sure. Everyone starts at zero.

I posted in a WhatsApp group of wedding photographers/videographers. All said that no one uses AV1 here and probably won't. Some (I did say closed-minded, haha) didn't know what AV1 is.

My brother, the "better buy something newer" guy, bummed me out. He says my motherboard may not support dual cards at x16/x8, only x16 and x4.

I don't see myself getting an Intel GPU. I don't think they have much to offer, yet.
 
Joined
Apr 21, 2021
Messages
240 (0.19/day)
System Name Silicon Graphics O2
Processor R5000 / 180MHz
Cooling noisy fan
Memory 384 MB
Storage 4 GB
Case the one with the old logo and proud of it ;)
Software IRIX 6.5
In the PugetBench charts for Resolve, 2×2080 Ti performs better than the 3090.
Naturally. DaVinci Resolve is capable of multi-GPU use. Not sure it's specifically SLI, I kinda doubt it.
SLI was basically never a thing in professional video editing. There were some niche solutions for 3D animation and 3D compositing in the early days, from what I remember, but I don't think it was very popular even then.
Simplistically speaking, multi-GPU usually came/comes in two flavors for video editing: multiple GPUs for multiple monitors or multiple GPUs for display out & scale-out rendering.

DaVinci Resolve Studio on all platforms supports up to 8 GPUs, including the iGPUs, which are used for accelerating certain compressed formats during decoding and encoding, as others mentioned already. The free version on Windows and Linux is limited to a single GPU, usually the dGPU. macOS has more options for the free version, but considering the price of the Studio license, don't bother with the free version if you are actually going to buy a Mac.

Having a single beefy dGPU will usually be better than two or more cheaper dGPUs in Resolve Studio, e.g. rather buy one 4090 than two 4080s. Also, take PugetBench with a huge grain of salt: it basically only measures standalone features/functions separately as render output. It doesn't do much to show how a GPU behaves when working in the timeline or when dealing with complex projects, etc.
Avoid mixed dGPU setups; the slower/lower-end one will limit the higher-end one, e.g. if you have a 12 GB and a 16 GB card, Resolve will only use 12 GB of VRAM on each.
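A quick way to sanity-check that last point on your own machine is to list each card's VRAM; in a mixed setup the effective per-GPU budget is the smallest card's. A minimal sketch, assuming Nvidia cards with nvidia-smi available:

```python
# List each Nvidia GPU's total VRAM and the effective per-GPU budget a
# mixed-card Resolve setup would be held to (the smallest card's memory).
import subprocess

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,memory.total", "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
).stdout

gpus = []
for line in out.strip().splitlines():
    name, mem_mib = (field.strip() for field in line.split(","))
    gpus.append((name, int(mem_mib)))
    print(f"{name}: {mem_mib} MiB")

if len(gpus) > 1:
    print(f"effective per-GPU budget: {min(mem for _, mem in gpus)} MiB")
```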
 
Joined
Dec 25, 2020
Messages
6,353 (4.56/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
RTX 4070 Ti SUPER (AD103-based) for dual-engine NVENC. Get one of those.
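For context on what NVENC buys you at export time, the same encoder the NLEs call into can be exercised directly through FFmpeg. A minimal sketch with hypothetical file names and bitrate; Resolve and Premiere drive NVENC through their own export settings rather than a command like this:

```python
# Offload an H.265 export to the GPU's NVENC encoder via FFmpeg.
# File names and bitrate are hypothetical; dual-encoder cards can split work
# like this across both NVENC engines in applications that support it.
import subprocess

subprocess.run(
    ["ffmpeg", "-y",
     "-i", "timeline_export.mov",   # hypothetical source/mezzanine file
     "-c:v", "hevc_nvenc",          # hardware H.265 encode on NVENC
     "-preset", "p5",               # quality/speed middle ground (p1 fastest .. p7 slowest)
     "-b:v", "40M",
     "-c:a", "copy",
     "delivery.mp4"],
    check=True,
)
```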
 