
NVIDIA GeForce RTX 3090 Looks Huge When Installed

bug

Joined
May 22, 2015
Messages
13,779 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Too close to the PCIe slot and probably would've been too squished between GPU and motherboard.


I'm sure they considered that option. If they didn't go for it, they probably had a reason. Don't forget that unlike the 1060, this is a 350W card.
I'm actually really looking forward to seeing how much this card actually draws under various conditions (RTX, non-RTX, compute).
 
Joined
Jan 25, 2006
Messages
1,470 (0.21/day)
Processor Ryzen 1600AF @4.2Ghz 1.35v
Motherboard MSI B450M PRO-A-MAX
Cooling Deepcool Gammaxx L120t
Memory 16GB Team Group Dark Pro Sammy-B-die 3400mhz 14.15.14.30-1.4v
Video Card(s) XFX RX 5600 XT THICC II PRO
Storage 240GB Brave eagle SSD/ 2TB Seagate Barracuda
Display(s) Dell SE2719HR
Case MSI Mag Vampiric 011C AMD Ryzen Edition
Power Supply EVGA 600W 80+
Software Windows 10 Pro
From the Nvidia presentation regarding Ampere, it seems there is no IPC or architectural increase compared to Turing. In fact, the "huge" performance increase seems to have come from doubling down on the shader units rather than from an architectural point of view. Obviously there are power consumption benefits, possibly from moving from TSMC 12nm to Samsung 8nm (a refined 10nm), but aside from that, if Ampere had the same number of shaders as Turing, there would likely be virtually no difference in performance, ray tracing excluded of course.
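The "more shaders, not more IPC" argument is easy to sanity-check with back-of-envelope math: peak FP32 throughput is roughly shader count × boost clock × 2 FLOPs per cycle (one FMA). A minimal sketch using Nvidia's published shader counts and boost clocks; keep in mind Ampere's doubled FP32 units share a datapath with INT32, so real-game gains are smaller than the raw numbers suggest:

```python
# Rough peak FP32 throughput: shaders x boost clock (GHz) x 2 FLOPs/cycle (FMA).
def peak_fp32_tflops(shaders: int, boost_ghz: float) -> float:
    return shaders * boost_ghz * 2 / 1000

# Published shader counts and boost clocks.
cards = {
    "RTX 2080 Ti": (4352, 1.545),
    "RTX 3080":    (8704, 1.710),
    "RTX 3090":    (10496, 1.695),
}
for name, (shaders, boost) in cards.items():
    print(f"{name}: {peak_fp32_tflops(shaders, boost):.1f} TFLOPS")
```

On paper that is more than double the 2080 Ti, which lines up with the idea that the uplift comes from unit count rather than per-shader IPC.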
 

bug

From the Nvidia presentation regarding Ampere, it seems there is no IPC or architectural increase compared to Turing. In fact, the "huge" performance increase seems to have come from doubling down on the shader units rather than from an architectural point of view. Obviously there are power consumption benefits, possibly from moving from TSMC 12nm to Samsung 8nm (a refined 10nm), but aside from that, if Ampere had the same number of shaders as Turing, there would likely be virtually no difference in performance, ray tracing excluded of course.
Is it bad if Ampere is Turing with beefed up RT and tensor cores?
 
Joined
Jan 25, 2006
Messages
1,470 (0.21/day)
Processor Ryzen 1600AF @4.2Ghz 1.35v
Motherboard MSI B450M PRO-A-MAX
Cooling Deepcool Gammaxx L120t
Memory 16GB Team Group Dark Pro Sammy-B-die 3400mhz 14.15.14.30-1.4v
Video Card(s) XFX RX 5600 XT THICC II PRO
Storage 240GB Brave eagle SSD/ 2TB Seagate Barracuda
Display(s) Dell SE2719HR
Case MSI Mag Vampiric 011C AMD Ryzen Edition
Power Supply EVGA 600W 80+
Software Windows 10 Pro
Is it bad if Ampere is Turing with beefed up RT and tensor cores?
Not at all, I'm just expressing an opinion. There don't seem to be any IPC or architectural improvements; yes, there is more performance, but that seems to have been brought about by the shader increase rather than by a newer/more refined architecture. Still, 2080 SLI performance in a single GPU is a massive deal, and it costs less than half of the previous gen to achieve, so it's still a win.

In fact, if anything they should be lauded for their power efficiency, as it's Turing IPC and shaders with roughly a third less power consumption, probably down to the node shrink and more conservative core/boost clocks, plus, like you said, "beefed up" RT and tensor cores.
 
Last edited:
Joined
Aug 9, 2006
Messages
1,065 (0.16/day)
System Name [Primary Workstation]
Processor Intel Core i7-920 Bloomfield @ 3.8GHz/4.55GHz [24-7/Bench]
Motherboard EVGA X58 E758-A1 [Tweaked right!]
Cooling Cooler Master V8 [stock fan + two 133CFM ULTRA KAZE fans]
Memory 12GB [Kingston HyperX]
Video Card(s) constantly upgrading/downgrading [prefer nVidia]
Storage constantly upgrading/downgrading [prefer Hitachi/Samsung]
Display(s) Triple LCD [40 inch primary + 32 & 28 inch auxiliary displays]
Case Cooler Master Cosmos 1000 [Mesh Mod, CFM Overload]
Audio Device(s) ASUS Xonar D1 + onboard Realtek ALC889A [Logitech Z-5300 Spk., Niko 650-HP 5.1 Hp., X-Bass Hp.]
Power Supply Corsair TX950W [aka Reactor]
Software This and that... [All software 100% legit and paid for, 0% pirated]
Benchmark Scores Ridiculously good scores!!!
What's the ROP count on these?

...
..
.
 
Joined
Dec 16, 2017
Messages
2,918 (1.15/day)
System Name System V
Processor AMD Ryzen 5 3600
Motherboard Asus Prime X570-P
Cooling Cooler Master Hyper 212 // a bunch of 120 mm Xigmatek 1500 RPM fans (2 ins, 3 outs)
Memory 2x8GB Ballistix Sport LT 3200 MHz (BLS8G4D32AESCK.M8FE) (CL16-18-18-36)
Video Card(s) Gigabyte AORUS Radeon RX 580 8 GB
Storage SHFS37A240G / DT01ACA200 / ST10000VN0008 / ST8000VN004 / SA400S37960G / SNV21000G / NM620 2TB
Display(s) LG 22MP55 IPS Display
Case NZXT Source 210
Audio Device(s) Logitech G430 Headset
Power Supply Corsair CX650M
Software Whatever build of Windows 11 is being served in Canary channel at the time.
Benchmark Scores Corona 1.3: 3120620 r/s Cinebench R20: 3355 FireStrike: 12490 TimeSpy: 4624
What's the ROP count on these?

...
..
.
(attached screenshot: 20200910-201759.png)


Ignore the ones that have just "2020" as Release date. Those are just placeholders.
 
Joined
Mar 31, 2012
Messages
860 (0.19/day)
Location
NL
System Name SIGSEGV
Processor INTEL i7-7700K | AMD Ryzen 2700X | AMD Ryzen 9 9950X
Motherboard QUANTA | ASUS Crosshair VII Hero | MSI MEG ACE X670E
Cooling Air cooling 4 heatpipes | Corsair H115i | Noctua NF-A14 IndustrialPPC Fan 3000RPM | Arctic P14 MAX
Memory Micron 16 Gb DDR4 2400 | GSkill Ripjaws 32Gb DDR4 3400(OC) CL14@1.38v | Fury Beast 64 Gb CL30
Video Card(s) Nvidia 1060 6GB | Gigabyte 1080Ti Aorus | TUF 4090 OC
Storage 1TB 7200/256 SSD PCIE | ~ TB | 970 Evo | WD Black SN850X 2TB
Display(s) 15,5" / 27" /34"
Case Black & Grey | Phanteks P400S | O11 EVO XL
Audio Device(s) Realtek
Power Supply Li Battery | Seasonic Focus Gold 750W | FSP Hydro TI 1000
Mouse g402
Keyboard Leopold|Ducky
Software LinuxMint
Benchmark Scores i dont care about scores

Cosmocalypse

New Member
Joined
Sep 11, 2020
Messages
1 (0.00/day)
I'll probably get buried by Nvidia fanboys here, but I've been doing some thinking. If, as some people on the net say, the RTX 3090 is around 20% faster than the RTX 3080, isn't this card basically an RTX 3080 Ti with a bunch of VRAM slapped on it? But this time, instead of charging $1,200 for it, Nvidia had the clever idea to call it a Titan replacement and charge $300 more, knowing that people who were willing to spend $1,200 on the previous top card will spend an extra $300 on this one.

So the RTX x70 segment costs the same $499 again, and the x80 segment costs $699 again, but the top segment is now $300 more, for generational performance gains like those from the 700 to 900 series and from the 900 to 1000 series, which used to cost considerably less. The 1080 Ti was $699.

It is a Titan replacement. This is always how they do it. Releasing the Ti version at the same time as the regular card is something that normally doesn't happen (like it did with the 20xx series). In 6 months they'll release a slightly cut-down 3090 with less VRAM, and that will be the 3080 Ti. They charge a high price for the latest tech and those capable will buy it as early adopters. Like the Titan, it's really a workstation card. Gamers should go for the 3080.
 
Joined
Aug 11, 2020
Messages
245 (0.16/day)
Location
2nd Earth
Processor Ryzen 5700X
Motherboard Gigabyte AX-370 Gaming 5, BIOS F51h
Cooling MSI Core Frozr L
Memory 32GB 3200MHz CL16
Video Card(s) MSI GTX 1080 Ti Trio
Storage Crucial MX300 525GB + Samsung 970 Evo 1TB + 3TB 7.2k + 4TB 5.4k
Display(s) LG 34UC99 3440x1440 75Hz + LG 24MP88HM
Case Phanteks Enthoo Evolv ATX TG Galaxy Silver
Audio Device(s) Edifier XM6PF 2.1
Power Supply EVGA Supernova 750 G3
Mouse Steelseries Rival 3
Keyboard Razer Blackwidow Lite Stormtrooper Edition
I don't know; at first glance the card looks clean, but I'm not digging it once it's installed. Maybe the fan and power connector placement ruin the otherwise clean aesthetic. Just my opinion. I'm more into the classic look with a plain backplate and the fans facing down.

Although the FE card is never sold in my country anyway.
 
Joined
Aug 24, 2004
Messages
217 (0.03/day)
wHY OH Why did they put the power connector in the middle of the card, thereby creating moar visible cable clutter, especially with the adapter?

f.A.i.L....

Are you buying the 3090 to look pretty or to give you incredible 4k fps?

I'll take awesome fps in 4k for $1499 Alex.
 

Jcguy

New Member
Joined
Sep 11, 2020
Messages
1 (0.00/day)
I'll probably get buried by Nvidia fanboys here, but I've been doing some thinking. If, as some people on the net say, the RTX 3090 is around 20% faster than the RTX 3080, isn't this card basically an RTX 3080 Ti with a bunch of VRAM slapped on it? But this time, instead of charging $1,200 for it, Nvidia had the clever idea to call it a Titan replacement and charge $300 more, knowing that people who were willing to spend $1,200 on the previous top card will spend an extra $300 on this one.

So the RTX x70 segment costs the same $499 again, and the x80 segment costs $699 again, but the top segment is now $300 more, for generational performance gains like those from the 700 to 900 series and from the 900 to 1000 series, which used to cost considerably less. The 1080 Ti was $699.

Why are you worried about what people are willing to spend? If you can't afford it, don't buy it. That simple.
 
Joined
Sep 2, 2014
Messages
660 (0.18/day)
Location
Scotland
Processor 5800x
Motherboard b550-e
Cooling full - custom liquid loop
Memory cl16 - 32gb
Video Card(s) 6800xt
Storage nvme 1TB + ssd 750gb
Display(s) xg32vc
Case hyte y60
Power Supply 1000W - gold
Software 10
EVGA gives 3 years, and another 2 if you register the item, so 5 in total.
EVGA only gives a 24-month warranty in my country, while most other brands give 36 months.
 
Joined
May 8, 2016
Messages
1,910 (0.61/day)
System Name BOX
Processor Core i7 6950X @ 4,26GHz (1,28V)
Motherboard X99 SOC Champion (BIOS F23c + bifurcation mod)
Cooling Thermalright Venomous-X + 2x Delta 38mm PWM (Push-Pull)
Memory Patriot Viper Steel 4000MHz CL16 4x8GB (@3240MHz CL12.12.12.24 CR2T @ 1,48V)
Video Card(s) Titan V (~1650MHz @ 0.77V, HBM2 1GHz, Forced P2 state [OFF])
Storage WD SN850X 2TB + Samsung EVO 2TB (SATA) + Seagate Exos X20 20TB (4Kn mode)
Display(s) LG 27GP950-B
Case Fractal Design Meshify 2 XL
Audio Device(s) Motu M4 (audio interface) + ATH-A900Z + Behringer C-1
Power Supply Seasonic X-760 (760W)
Mouse Logitech RX-250
Keyboard HP KB-9970
Software Windows 10 Pro x64
From the Nvidia presentation regarding Ampere, it seems there is no IPC or architectural increase compared to Turing. In fact, the "huge" performance increase seems to have come from doubling down on the shader units rather than from an architectural point of view. Obviously there are power consumption benefits, possibly from moving from TSMC 12nm to Samsung 8nm (a refined 10nm), but aside from that, if Ampere had the same number of shaders as Turing, there would likely be virtually no difference in performance, ray tracing excluded of course.
Arch changes are kinda like Sandy/Ivy Bridge vs. Haswell (if you like IPC comparisons).
You get more execution hardware (AVX2), with more cache bandwidth to not starve it.
 

BlackWater

New Member
Joined
Sep 11, 2020
Messages
11 (0.01/day)
The amount of VRAM on the 3090 made me think about the following:

Let's assume most enthusiasts right now game at 1440p/144 Hz, and that Nvidia is doing a strong push for 4K/144 Hz. OK, so far, so good. But even then, we know that 4K doesn't need 24 GB of VRAM. They say the card is capable of "8K", but that is with DLSS upscaling, so we are not talking about actual native 8K rendering. Regardless of IPC improvements or not, I absolutely don't believe we have the processing power to do 8K yet, and even if we did, we're gonna do 8K on what exactly? After all, this is a PC GPU; how many people are going to attach one to a gigantic 8K TV? And let's not even mention ultra-high-resolution monitors: the very few that exist are strictly professional equipment with five-figure prices...

So, considering that 1440p is 3.7 Mpixels, 4K is 8.3 Mpixels and 8K is 33.2 Mpixels, perhaps a more realistic application for the 3090 is triple-monitor 1440p/4K @ 144 Hz? Triple 1440p is 11.1 Mpixels, only slightly above a single 4K display, so it shouldn't have any trouble driving that, and with DLSS, triple 4K at about 25 Mpixels seems somewhat possible; perhaps that's where the 24 GB of VRAM would come into play?

But even then, where are the 4K monitors? At the moment the choice is very limited, and let's be honest, 4K on a 27" panel makes no sense, while the few monitors at 40+" don't really work for a triple-monitor setup either. So either a wave of new 4K/144 Hz monitors at about 30" is coming, or the 3090 doesn't really make much sense at all. And I'm not even talking about the price here; it's irrelevant. The question is: why does the 3090 actually exist, and what is the actual application for it? Titan replacement or not, Nvidia is strongly pushing the card as a gaming product, which is all fine, but I fail to see the scenario where the 24 GB of VRAM is relevant to gaming. Regardless, in about 2 weeks, benchmarks will tell us all we need to know. :D
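The pixel counts in the post above are simple arithmetic, and it's worth adding how little of the 24 GB a raw framebuffer actually uses (texture assets, not framebuffers, are what eat VRAM). A quick sketch, assuming a plain 4-bytes-per-pixel RGBA8 buffer with no MSAA:

```python
# Pixel counts and raw framebuffer sizes for the resolutions discussed above.
RESOLUTIONS = {
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

def megapixels(width: int, height: int) -> float:
    return width * height / 1e6

def framebuffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    # RGBA8: 4 bytes per pixel, single buffer, no MSAA.
    return width * height * bytes_per_pixel / 2**20

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {megapixels(w, h):.1f} MP, {framebuffer_mib(w, h):.0f} MiB per RGBA8 buffer")
```

Even a triple-buffered 8K swapchain stays well under 1 GB, so the 24 GB only starts to matter with huge texture sets or non-gaming workloads.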
 
Joined
Aug 11, 2012
Messages
20 (0.00/day)
Location
South Borneo
Processor 9900k, 3950x
Motherboard Z370 Strix, Max Hero
Cooling NZXT X72, TT M360 Plus
Memory Trident Z RGB 32g, Royal 32gb
Video Card(s) 2080Ti FE, 3080 Strix Gundam, 3090 Strix
Display(s) Dell S2716DG, 34" Ultrawide
Case Lian-Li O11DW, TT Core P3 White
Power Supply Seasonic
I want a $799/$849 3090 12GB with the 3080 FE's size/2-slot cooler (for an ITX build). With half the VRAM it would be cheaper, and I think 12 gigs is just enough for me: a 49" 32:9 1080p or triple 16:9 1080p/1440p racing setup.

I can wait till Nov/Dec, after all the Navi and 3070 16GB / 3080 20GB cards are released, to decide.
 
Last edited:
Joined
Dec 18, 2018
Messages
24 (0.01/day)
System Name godzilla
Processor Intel Core i7-920
Cooling Air
Memory 12GB
Video Card(s) Nvidia Geforce 970
Storage Samsung 970 EVO
Display(s) LG OLED55C9
Audio Device(s) Sennheiser HD 800 S
Mouse Logitech G Pro Wireless
Keyboard Logitech G19
But even then, where are the 4K monitors? At the moment the choice is very limited, and let's be honest, 4K on a 27" panel makes no sense, while the few monitors at 40+" don't really work for a triple-monitor setup either. So either a wave of new 4K/144 Hz monitors at about 30" is coming
40" is also kind of small for 4K. At the distance I'm sitting, 55" hits the right spot for me.
-t owner of LG OLED55C9 TV
 
Joined
Jan 5, 2006
Messages
18,584 (2.69/day)
System Name AlderLake
Processor Intel i7 12700K P-Cores @ 5Ghz
Motherboard Gigabyte Z690 Aorus Master
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut Extreme + 5 case fans
Memory 32GB DDR5 Corsair Dominator Platinum RGB 6000MT/s CL36
Video Card(s) MSI RTX 2070 Super Gaming X Trio
Storage Samsung 980 Pro 1TB + 970 Evo 500GB + 850 Pro 512GB + 860 Evo 1TB x2
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p
Case Be quiet! Silent Base 600 - Window
Audio Device(s) Panasonic SA-PMX94 / Realtek onboard + B&O speaker system / Harman Kardon Go + Play / Logitech G533
Power Supply Seasonic Focus Plus Gold 750W
Mouse Logitech MX Anywhere 2 Laser wireless
Keyboard RAPOO E9270P Black 5GHz wireless
Software Windows 11
Benchmark Scores Cinebench R23 (Single Core) 1936 @ stock Cinebench R23 (Multi Core) 23006 @ stock

BlackWater

40" is also kind of small for 4K. At the distance I'm sitting, 55" hits the right spot for me.
-t owner of LG OLED55C9 TV

Yeah, true, but in your use case, do you use the TV more as a TV, or more as a monitor? I'm wondering from the standpoint of someone hooking the 3090 up to a monitor or several monitors on a desk, in which case the viewing distance would be... about 0.5 to 1 meter, perhaps? I currently have a 27" 1440p monitor and sit about 60-70 cm from it, and I was looking at monitors like the Asus PG43UQ: great specs on paper, but it's pretty much the size of a TV, and I just can't see myself, or anyone really, hooking up three of those on a desk... You'd have to sit quite far back, otherwise you'd literally have to constantly turn your head left and right, and at that point you're better off just getting a large TV and using it from a distance, as you'd use a TV normally.

So that's what I was thinking originally: if we assume the 3090 is targeted at multi-monitor 4K gaming at 120-144 Hz, where are the monitors suitable for that? IMO the ideal desk implementation of 4K is 30-34", yet pretty much all the available ones we can call 'gaming' monitors are 40+", and some are even straight-up TVs without the TV tuner (the BFGDs)... So I kind of don't really get what exactly the 3090 is supposed to do. If you want big-screen gaming in the living room, the 3080 can (allegedly) easily do that, so then what even is the purpose of the 3090? It's surely not professional or scientific research, since Nvidia is pretty much pushing it as the top-end gaming card. To me it seems that if you try to figure out what it's supposed to do, it just comes out as a slightly bigger chip than the 3080 with a strangely large amount of VRAM, just so Nvidia can say "look what we can do, lol".
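One way to put numbers on the "4K on 27" makes no sense, ideal desk 4K is 30-34"" intuition is pixel density: PPI is the diagonal pixel count divided by the diagonal size in inches. A small sketch (the panel sizes are just illustrative picks, not specific products):

```python
import math

# Pixel density in PPI: diagonal resolution divided by diagonal size in inches.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

panels = {
    '27" 1440p': (2560, 1440, 27),
    '32" 4K':    (3840, 2160, 32),
    '43" 4K':    (3840, 2160, 43),
    '55" 4K':    (3840, 2160, 55),
}
for name, args in panels.items():
    print(f'{name}: {ppi(*args):.0f} PPI')
```

A 27" 1440p panel lands around 109 PPI, and 4K only drops to that same density at roughly 40", which is why desk-distance 4K means either a very dense 27-32" panel or a TV-sized screen viewed from further away.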
 
Joined
Dec 18, 2018
Messages
24 (0.01/day)
System Name godzilla
Processor Intel Core i7-920
Cooling Air
Memory 12GB
Video Card(s) Nvidia Geforce 970
Storage Samsung 970 EVO
Display(s) LG OLED55C9
Audio Device(s) Sennheiser HD 800 S
Mouse Logitech G Pro Wireless
Keyboard Logitech G19
I'm using it mainly as a PC monitor for games; I don't watch that many movies. 0.5-1 m sounds about right. It's great in fullscreen mode for shooters, as well as for running other genres like RPGs or MMOs in a 2K window while also having a browser and some other stuff open.

I'm personally fine with Nvidia rebranding big chungus Ampere from Titan to an xx90. I had issues with the Titan series: reference coolers only, which meant it got outperformed by non-reference-cooled xx80 Tis at half the price, and it was available only from the Nvidia store, which doesn't serve my country.
But here I can just buy a non-reference triple-slot monstrosity and have peace of mind for a couple of years that for all the unoptimized Ubisoft garbage I can set all the graphics sliders to the right and get a minimum of 4K 60 FPS, something the 2080 Ti was not capable of. And if it has twice the memory I will ever need for 4K gaming... I can live with that.
 
Joined
Feb 25, 2016
Messages
396 (0.12/day)
System Name 06/2023
Processor R7 7800X3D
Motherboard ROG STRIX B650E-I GAMING WIFI
Cooling Custom 240mm cooling (for CPU) with noctua nfa12x25 and Phantek T30
Memory 32gb Gskill 6000 CL30
Video Card(s) RTX 4070 dual asus deshrouded with 120mm NF-A12x25
Storage 2tb samsung 990 pro + 4tb samsung 870 evo
Display(s) Asus 27" Oled PG27AQDM + Asus 27" IPS PG279QM
Case Ncase M1 v6.1
Audio Device(s) Steelseries arctis pro wireless + Shure SM7b with Steinberg UR
Power Supply Corsair SF750 Platinum
Mouse Corsair scimitar pro (this mouse need an overall guys pls) + Logitech G Pro wireless with powerplay
Keyboard Sharkoon purewriter
Software windows 11
Benchmark Scores Over 9000 !
I'll probably get buried by Nvidia fanboys here, but I've been doing some thinking. If, as some people on the net say, the RTX 3090 is around 20% faster than the RTX 3080, isn't this card basically an RTX 3080 Ti with a bunch of VRAM slapped on it? But this time, instead of charging $1,200 for it, Nvidia had the clever idea to call it a Titan replacement and charge $300 more, knowing that people who were willing to spend $1,200 on the previous top card will spend an extra $300 on this one.

So the RTX x70 segment costs the same $499 again, and the x80 segment costs $699 again, but the top segment is now $300 more, for generational performance gains like those from the 700 to 900 series and from the 900 to 1000 series, which used to cost considerably less. The 1080 Ti was $699.

I don't understand your question. The RTX 3090 is the 3080 Ti; they changed the name to 3090, maybe because they need the Ti for something else.
 
Joined
Jun 21, 2013
Messages
602 (0.14/day)
Processor Ryzen 9 3900x
Motherboard MSI B550 Gaming Plus
Cooling be quiet! Dark Rock Pro 4
Memory 32GB GSkill Ripjaws V 3600CL16
Video Card(s) 3060Ti FE 0.9v
Storage Samsung 970 EVO 1TB, 2x Samsung 840 EVO 1TB
Display(s) ASUS ProArt PA278QV
Case be quiet! Pure Base 500
Audio Device(s) Edifier R1850DB
Power Supply Super Flower Leadex III 650W
Mouse A4Tech X-748K
Keyboard Logitech K300
Software Win 10 Pro 64bit
They did not even try to make an unobtrusive cable adapter.
 

bug
