
NVIDIA Readies RTX 3060 8GB and RTX 3080 20GB Models

Joined
Nov 11, 2016
Messages
3,459 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razer DeathAdder V3
Keyboard Razer Huntsman V3 Pro TKL
Software win11
AMD buyer's cycle: "I'm waiting for the next generation" [New product launches] "Seems nice, but I'll wait for the next one which will be even better"

Nvidia buyer's cycle: "I NEED IT RIGHT NOW, GIMME GIMME" [New product launches] "SHIT, I SHOULD'VE WAITED FOR THE BETTER VERSION"

Perfect summary, and that's why Nvidia is so filthy rich right now :respect:
 
Joined
Jul 5, 2013
Messages
28,260 (6.75/day)
It's not listed, but I could see an RTX 3060S 16GB being possible as well eventually.
That would be interesting.

Whatever happened to just giving the different tiers of GPUs normal amounts of VRAM (1, 2, 4, 8, 16, 32GB, etc.)? I don't get why we need 10GB or 20GB...
With you on that one. I want a 16GB model of the 3080. I don't mind if it's only 256bit memory bus. A 16GB 3070 would also be good.
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.55/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x256GBSSD 2x2TBGB|256GB sata|512GB nvme
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
What are the price/performance expectations for the RTX 3060?

Price: ~$350
Performance: ~RTX 2070
 
Joined
Jan 8, 2017
Messages
9,504 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
A 20GB 3080 would definitely be more enticing but I don't want to find out what the price would be. By the way, 12GB is much more likely than 20.

Whatever happened to just giving the different tiers of GPUs normal amounts of VRAM (1, 2, 4, 8, 16, 32GB, etc.)? I don't get why we need 10GB or 20GB...

You can't just use any memory configuration; the VRAM capacity is tied to what the GPU's memory controllers and bus interfaces can do.
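That linkage can be sketched with a little arithmetic. Each GDDR6/6X chip sits on a 32-bit slice of the memory bus, so capacity comes in multiples of the channel count; the per-chip densities below are illustrative, not a claim about which dies actually ship:

```python
# Capacity = number of 32-bit channels x per-chip density (GB).
# This is why a 320-bit card lands on 10GB or 20GB, not 16GB.
def vram_options(bus_width_bits, chip_densities_gb=(1, 2)):
    chips = bus_width_bits // 32  # one chip per 32-bit channel
    return [chips * d for d in chip_densities_gb]

print(vram_options(320))  # 320-bit bus (3080-style) -> [10, 20]
print(vram_options(256))  # 256-bit bus              -> [8, 16]
print(vram_options(192))  # 192-bit bus              -> [6, 12]
```

A 16GB config on a 320-bit bus would need uneven chip densities or a narrower bus, which is exactly the constraint being described.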
 
Joined
Jan 14, 2019
Messages
12,572 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Are people so desperate that they can't wait another month to see what AMD have to offer? Nvidia have a history of mugging off their customers; day-one purchasers are setting themselves up for buyer's remorse.

Give it a month or two and, who knows, Turing cards may start tumbling, or AMD could knock it out of the park. Personally, I'm just being sensible and buying a PS5 for the same cost as one of these overpriced GPUs.
That's cool, though buying a console that's only good for playing games, buying again all ~300 games I already own on Steam, and playing them with a useless controller instead of WASD is totally out of the question for me. Besides, building a new PC is fun; plugging a box into my TV is boring.

As for the desperation part: I agree. Better to wait than to buy the first released, inferior product.
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.36/day)
What are the price/performance expectations for the RTX 3060?

Price: ~$350
Performance: ~RTX 2070

$350 is usually where ~60% of the performance of the $700 xx80 card lands, like the 1060 and 2060 did.

RTX 3060 8G (4864 CUDA) ≈ RTX 2080/Super ≈ 60% of the RTX 3080.

The RTX 3080 10G with its 8704 new shaders is 31% faster overall than the 2080 Ti with 4352 Turing shaders, while the memory is only 23% faster. To average out at +31%, the shaders must be pulling ahead at roughly 39% faster: equivalent to ~6144 FP32 shaders plus 2560 INT32.
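One way to read that arithmetic is as a geometric-mean blend of shader and memory-bandwidth speedups. This is a rough back-of-the-envelope heuristic, not any official performance model:

```python
from math import sqrt

# Model overall speedup as the geometric mean of the shader speedup
# and the memory-bandwidth speedup (a crude but common heuristic).
def blended_speedup(shader_ratio, bandwidth_ratio):
    return sqrt(shader_ratio * bandwidth_ratio)

# +39% shaders combined with +23% memory bandwidth lands close to
# the +31% average uplift quoted above.
print(round(blended_speedup(1.39, 1.23), 2))  # ~1.31
```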
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,425 (4.69/day)
Location
Kepler-186f
Processor 7800X3D -25 all core
Motherboard B650 Steel Legend
Cooling Frost Commander 140
Video Card(s) Merc 310 7900 XT @3100 core -.75v
Display(s) Agon 27" QD-OLED Glossy 240hz 1440p
Case NZXT H710 (Red/Black)
Audio Device(s) Asgard 2, Modi 3, HD58X
Power Supply Corsair RM850x Gold
Maybe they should launch the original 3080 first? I really don't consider meeting less than 1% of the demand a real launch.

Look at my signature. Soon you will join me in the true gaming realm my padawan.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Because it is not the classic 60-tier GPU. From leaks and details we have so far, both 3070 and 3060 will be based on GA104. When it comes to performance we will have to wait and see but 3060 should be roughly around PS5/XBSX performance level, so it will be enough for a long while. 16GB is more of a marketing argument (and maybe preemptive strike against possible AMD competition).
That makes no sense. What defines a 60-series card? Mainly that it's two tiers down from the 80-series. Which die it's based on, how wide a memory bus it has, etc. is all variable and dependent on factors that come before product segmentation (die yields, production capacity, etc.). Which die the 3060 is based on doesn't matter whatsoever; its specifications decide performance. Heck, there have been 2060s based on at least three different Turing dies, and they all perform identically in most workloads. The 3060 might obviously be around the XSX performance level (saying "XSX/PS5 performance level" is quite broad given that one is 20% faster than the other), or at least the PS5 performance level, but that still doesn't mean 8GB isn't plenty for it. People really need to stop this idiocy about VRAM usage being the be-all, end-all of performance longevity - that is only true for a select few GPUs throughout history.
PCI-e 4.0 x16 does not really seem to be a bottleneck yet and probably won't be a big one for a long while when we look at how the scaling testing has gone with 3.0 and 2.0. Fitting 4 lanes worth of data shouldn't matter all that much. On the other hand, I think this shader-augmented compression is short-lived - if it proves very useful, compression will move into hardware as it has already supposedly done in consoles.

Moving the storage to be attached to the GPU does not really make sense for the desktop/gaming use case. More bandwidth through compression, plus some type of QoS scheme to prioritize data as needed, should be enough, and that is where things really seem to be heading.
I agree that it'll likely move into dedicated hardware, but that hardware is likely to live on the GPU, as that is where the data will be needed. Adding this to CPUs makes little sense - people keep CPUs longer than GPUs, GPUs have (much) bigger power budgets, and for the type of compression in question (specifically DirectStorage-supported algorithms) games are likely to be >99% of the use cases.

As for creating a bottleneck, it's still going to be quite a while until GPUs saturate PCIe 4.0 x16 (or even 3.0 x16), but SSDs use a lot of bandwidth and need to communicate with the entire system, not just the GPU. Sure, the GPU will be what needs the biggest chunks of data the quickest, but chaining an SSD off the GPU still makes far less sense than just keeping it directly attached to the PCIe bus like we do today. That way everything gets near optimal access. The only major improvement over this would be the GPU using the SSD as an expanded memory of sorts (like that oddball Radeon Pro did), but that would mean the system isn't able to use it as storage. And I sincerely doubt people would be particularly willing to add the cost of another multi-TB SSD to their PCs without getting any additional storage in return.
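For scale, the bandwidth gap being discussed can be sketched from per-lane PCIe rates. These are theoretical ceilings derived from the raw transfer rate and 128b/130b line encoding; real-world throughput is lower:

```python
# Theoretical PCIe bandwidth ceilings in GB/s.
# Gen3 runs 8 GT/s per lane, gen4 runs 16 GT/s, both 128b/130b encoded.
def pcie_gbps(gen, lanes):
    gt_per_s = {3: 8, 4: 16}[gen]
    return gt_per_s * (128 / 130) / 8 * lanes  # bits -> bytes

print(round(pcie_gbps(4, 16), 1))  # GPU slot, gen4 x16 -> ~31.5 GB/s
print(round(pcie_gbps(3, 16), 1))  # GPU slot, gen3 x16 -> ~15.8 GB/s
print(round(pcie_gbps(4, 4), 1))   # NVMe SSD, gen4 x4  -> ~7.9 GB/s
```

The x4 SSD link is a quarter of the GPU's x16 link, which is why fitting the SSD's traffic alongside the GPU's is not where the bottleneck lies.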
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
14,019 (2.34/day)
Location
Louisiana
Processor Core i9-9900k
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax ETS-T50 Black CPU cooler
Memory 32GB (2x16) Mushkin Redline DDR-4 3200
Video Card(s) ASUS RTX 4070 Ti Super OC 16GB
Storage 1x 1TB MX500 (OS); 2x 6TB WD Black; 1x 2TB MX500; 1x 1TB BX500 SSD; 1x 6TB WD Blue storage (eSATA)
Display(s) Infievo 27" 165Hz @ 2560 x 1440
Case Fractal Design Define R4 Black -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic Focus GX-1000 Gold
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
Not sure, but I thought the idea was to load textures/animations/model vertex data etc. into VRAM, where it needs to go anyway.
Quite a few games just have lazy devs that load far more into VRAM than necessary to play the game. We are still not at the stage where we NEED that much VRAM.
 
Joined
Jun 13, 2019
Messages
549 (0.27/day)
System Name Fractal
Processor Intel Core i5 13600K
Motherboard Asus ProArt Z790 Creator WiFi
Cooling Arctic Cooling Liquid Freezer II 360
Memory 16GBx2 G.SKILL Ripjaws S5 DDR5 6000 CL30-40-40-96 (F5-6000J3040F16GX2-RS5K)
Video Card(s) PNY RTX A2000 6GB
Storage SK Hynix Platinum P41 2TB
Display(s) LG 34GK950F-B (34"/IPS/1440p/21:9/144Hz/FreeSync)
Case Fractal Design R6 Gunmetal Blackout w/ USB-C
Audio Device(s) Steelseries Arctis 7 Wireless/Klipsch Pro-Media 2.1BT
Power Supply Seasonic Prime 850w 80+ Titanium
Mouse Logitech G700S
Keyboard Corsair K68
Software Windows 11 Pro
Quite a few games just have lazy devs that load far more into VRAM than necessary to play the game. We are still not at the stage where we NEED that much VRAM.

Every thread is going on about this VRAM "issue". Look at the system requirements for Cyberpunk 2077...I don't think we're hitting a wall here anytime soon.
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.36/day)
The 3060 might obviously be around the XSX performance level (saying "XSX/PS5 performance level" is quite broad given that one is 20% faster than the other),

3060 6GB: 3840 CUDA cores.
3060 8GB: 4864 CUDA cores.

There you have it: the 8GB version has roughly 27% more CUDA cores, so potentially about that much faster.
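Checking the core-count ratio directly (this assumes equal clocks and perfect scaling with CUDA cores, which is optimistic for real games):

```python
# Relative uplift from CUDA core counts alone.
cores_6gb, cores_8gb = 3840, 4864
uplift = cores_8gb / cores_6gb - 1
print(f"{uplift:.0%}")  # -> 27%
```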
 
Joined
Sep 17, 2014
Messages
22,668 (6.04/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I thought 10GB was enough guys... :)

Guess Nvidia doesn't agree and brings us the real deal after the initial wave of stupid bought the subpar cards.

Well played, Huang.
 
Joined
Apr 30, 2008
Messages
4,902 (0.81/day)
Location
Multidimensional
System Name Boomer Master Race
Processor AMD Ryzen 7 8745H
Motherboard MinisForum 870 Slim Board
Cooling Mini PC Cooling
Memory Crucial 32GB 5600Mhz
Video Card(s) Radeon 780M
Storage Kingston 1TB SSD
Display(s) Sony 4K Bravia X85J 43Inch TV 120Hz
Case MinisForum 870 Slim Case
Audio Device(s) Built In Realtek Digital Audio HD
Power Supply 120w External Power Brick
Mouse Logitech G203 Lightsync
Keyboard Atrix RGB Slim Keyboard
VR HMD ( ◔ ʖ̯ ◔ )
Software Windows 11 Pro 64bit
Benchmark Scores Don't do them anymore.
If nVidia taught us something with the 20 (Super) series, it's the fact that early buyers get inferior products. That's why I'm going to wait for the 3070 Super/Ti with 16 GB VRAM and a (hopefully) fully unlocked die, unless AMD's RDNA 2 proves to be a huge hit.

This times 100. I'm playing the waiting game; tbh we all are, because no one can get a new card anyway.
 
Joined
Feb 14, 2012
Messages
2,356 (0.50/day)
System Name msdos
Processor 8086
Motherboard mainboard
Cooling passive
Memory 640KB + 384KB extended
Video Card(s) EGA
Storage 5.25"
Display(s) 80x25
Case plastic
Audio Device(s) modchip
Power Supply 45 watts
Mouse serial
Keyboard yes
Software disk commander
Benchmark Scores still running
Need to see 3070S 16GB vs 2080Ti benchmarks.
 
Joined
Oct 19, 2007
Messages
8,261 (1.32/day)
Processor Intel i9 9900K @5GHz w/ Corsair H150i Pro CPU AiO w/Corsair HD120 RBG fan
Motherboard Asus Z390 Maximus XI Code
Cooling 6x120mm Corsair HD120 RBG fans
Memory Corsair Vengeance RBG 2x8GB 3600MHz
Video Card(s) Asus RTX 3080Ti STRIX OC
Storage Samsung 970 EVO Plus 500GB , 970 EVO 1TB, Samsung 850 EVO 1TB SSD, 10TB Synology DS1621+ RAID5
Display(s) Corsair Xeneon 32" 32UHD144 4K
Case Corsair 570x RBG Tempered Glass
Audio Device(s) Onboard / Corsair Virtuoso XT Wireless RGB
Power Supply Corsair HX850w Platinum Series
Mouse Logitech G604s
Keyboard Corsair K70 Rapidfire
Software Windows 11 x64 Professional
Benchmark Scores Firestrike - 23520 Heaven - 3670
3080 20GB is going to be the Ti variant. Calling it now.
 
Joined
Mar 21, 2016
Messages
2,508 (0.78/day)
Why on earth would a 60-tier GPU in 2020 need 16GB of VRAM? Even 8GB is plenty for the types of games and settings that GPU will be capable of handling for its useful lifetime. Shader and RT performance will become a bottleneck at that tier long before 8GB of VRAM does. This is no 1060 3GB.

RAM chip availability is likely the most important part here. The FE PCB only has that many pads for VRAM, so they'd need double density chips, which likely aren't available yet (at least at any type of scale). Given that GDDR6X is a proprietary Micron standard, there's only one supplier, and it would be very weird if Nvidia didn't first use 100% of available capacity to produce the chips that will go with the vast majority of SKUs.

Does that actually make sense, though? That would mean the GPU sharing the PCIe bus with storage for all CPU/RAM accesses to said storage (and either adding some sort of switch to the card, or adding switching/passthrough capabilities to the GPU die), rather than it being used for only data relevant to the GPU. Isn't a huge part of the point of DirectStorage the ability to transfer compressed data directly to the GPU, reducing bandwidth requirements while also offloading the CPU and also shortening the data path significantly? The savings from having the storage hooked directly to the GPU rather than the PC's PCIe bus seem minuscule in comparison to this - unless you're also positing that this on-board storage will have a much wider interface than PCIe 4.0 x4, which would be a whole other can of worms. I guess it might happen (again) for HPC and the like, for those crunching multi-TB datasets, but other than that this seems nigh on impossible both in terms of cost and board space, and impractical in terms of providing actual performance gains.

Btw, the image also lists two 3080 Super SKUs that the news post doesn't mention.
I didn't say 16GB would be practical, but I could still see it happening. To be fair, if they could piggy-back 12GB onto an RTX 3060 down the road, that would make much more sense in relation to the weaker hardware. Similarly, for the RTX 3080 going from 10GB to 20GB, if they could piggy-back on only a few of the chips rather than every GDDR chip, and scale the density further that way, that's probably a more ideal scenario for everyone involved except Micron.

Because it is not the classic 60-tier GPU. From leaks and details we have so far, both 3070 and 3060 will be based on GA104. When it comes to performance we will have to wait and see but 3060 should be roughly around PS5/XBSX performance level, so it will be enough for a long while. 16GB is more of a marketing argument (and maybe preemptive strike against possible AMD competition).
Agreed, it's more marketing than practicality. That said, it's a new-generation GPU with new capabilities, so while it might be anemic and stretched thin in resources for that amount of VRAM, perhaps newer hardware is at least more capable of utilizing it by managing it intelligently with other techniques: DLSS, VRS, mesh shading, etc. But we'll see; the RTX 3060 isn't even out yet, so we don't know nearly enough to conclude what it'll be like. On the plus side, GPUs are becoming more flexible at managing resources every generation.
 
Joined
Jan 5, 2006
Messages
18,584 (2.68/day)
System Name AlderLake
Processor Intel i7 12700K P-Cores @ 5Ghz
Motherboard Gigabyte Z690 Aorus Master
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut Extreme + 5 case fans
Memory 32GB DDR5 Corsair Dominator Platinum RGB 6000MT/s CL36
Video Card(s) MSI RTX 2070 Super Gaming X Trio
Storage Samsung 980 Pro 1TB + 970 Evo 500GB + 850 Pro 512GB + 860 Evo 1TB x2
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p
Case Be quiet! Silent Base 600 - Window
Audio Device(s) Panasonic SA-PMX94 / Realtek onboard + B&O speaker system / Harman Kardon Go + Play / Logitech G533
Power Supply Seasonic Focus Plus Gold 750W
Mouse Logitech MX Anywhere 2 Laser wireless
Keyboard RAPOO E9270P Black 5GHz wireless
Software Windows 11
Benchmark Scores Cinebench R23 (Single Core) 1936 @ stock Cinebench R23 (Multi Core) 23006 @ stock
by piggy-backing two of these chips per 32-bit channel (chips on either side of the PCB).

How hot would these chips get with just some thermal pads and a backplate on top of them?
 
Joined
Sep 10, 2020
Messages
60 (0.04/day)
Answer from u/NV_Tim, Community Manager from NVIDIA GeForce Global Community Team

Why only 10 GB of memory for RTX 3080? How was that determined to be a sufficient number, when it is stagnant from the previous generation?
-
Justin Walker, Director of GeForce product management

We’re constantly analyzing memory requirements of the latest games and regularly review with game developers to understand their memory needs for current and upcoming games. The goal of 3080 is to give you great performance at up to 4k resolution with all the settings maxed out at the best possible price.

In order to do this, you need a very powerful GPU with high speed memory and enough memory to meet the needs of the games. A few examples. If you look at Shadow of the Tomb Raider, Assassin’s Creed Odyssey, Metro Exodus, Wolfenstein Youngblood, Gears of War 5, Borderlands 3 and Red Dead Redemption 2 running on a 3080 at 4k with Max settings (including any applicable high res texture packs) and RTX On, when the game supports it, you get in the range of 60-100fps and use anywhere from 4GB to 6GB of memory.

Extra memory is always nice to have but it would increase the price of the graphics card, so we need to find the right balance.
-
 

ppn

Joined
Aug 18, 2015
Messages
1,231 (0.36/day)
If MVDDC power shows 70 W on the 3080, that's 7 W per chip, which is hard to cool. On the 3090, I think those are higher-density chips, not piggy-backed.

According to the specs, the 3070S should land at around 70% of the 3080, with the 2080 Ti at 76%. Pretty close.
 
Joined
Apr 13, 2009
Messages
230 (0.04/day)
System Name NERV
Processor AMD Ryzen 5 2600X
Motherboard ASROCK B450M Steel Legend
Cooling Artic Freezer 33 One + 2x Akasa 140mm
Memory 2x8 Crucial Ballistix Sport 2993 MHz
Video Card(s) KFA2 GeForce RTX 3060 Ti
Storage Crucial Mx500 500GB + 1TB HDD
Display(s) Samsung C34H892
Case CM Masterbox Q300L
Audio Device(s) ALC892 + Topping D30
Power Supply Corsair RM650
Mouse CM Mastermouse Lite S
Keyboard Logitech G510
Software Win 10 Pro x64
Benchmark Scores No bech, only game!
Joined
Jul 7, 2019
Messages
932 (0.47/day)
I feel the Radeon SSG prototype helped inform the direct access capability that both upcoming consoles use in slightly different ways, as well as a future possible interim-upgrade path on theoretical mid-high end GPU models in both the gaming and professional areas. Professionally, the SSG showed it can both be used as extra storage on top of being used as an ultra-fast scratch drive, according to Anandtech's article, with only the main hurdle being getting software devs to incorporate the necessary API stuff. It could be a neat feature to install your high-end games onto the GPU drive or save your project to said drive, and let the GPU load it direct from there and effectively "stream" the project/game assets in realtime.

I could see a future Radeon x700+ series and NVIDIA x070+ series of GPUs, and their professional equivalents, incorporating an option to install an NVMe PCIe 4.0 drive (or 5.0, since that tech is supposedly due late next year or in 2022 and is expected to last for quite a while) onto the card, as a way to boost available memory for either professional or gaming purposes. Or maybe Intel could beat the competition to market, using Optane add-ons for their own respective GPUs, acting more like reserve VRAM expansion thanks to higher read/write performance than typical NVMe, but less than GDDR/HBM.
 
Joined
Dec 30, 2010
Messages
2,200 (0.43/day)
The logical next step to DirectStorage and RTX-IO is graphics cards with resident non-volatile storage (i.e. SSDs on the card). I think there have already been some professional cards with onboard storage (though not leveraging tech like DirectStorage). You install optimized games directly onto this resident storage, and the GPU has its own downstream PCIe root complex that deals with it.

So I expect RTX 40-series "Hopper" to have 2 TB ~ 8 TB variants.

It doesn't work like that. It's only meant/designed as a cache, without streaming the data from SSD/HDD/memory over the PCIe bus.

But I'm sure you could utilize all that memory one day as some kind of storage.
 
Joined
Mar 21, 2016
Messages
2,508 (0.78/day)
How hot would these chips get with some thermal pads and just a backplate on top of it.
Who knows; I'm sure that if they got really hot, the backplate would be an obvious indicator. RAM generally isn't particularly hot in the first place, though. And it's not like it couldn't be resolved trivially by connecting it with some heat-pipe cooling to the bottom heatsink.
I feel the Radeon SSG prototype helped inform the direct access capability that both upcoming consoles use in slightly different ways, as well as a future possible interim-upgrade path on theoretical mid-high end GPU models in both the gaming and professional areas. Professionally, the SSG showed it can both be used as extra storage on top of being used as an ultra-fast scratch drive, according to Anandtech's article, with only the main hurdle being getting software devs to incorporate the necessary API stuff. It could be a neat feature to install your high-end games onto the GPU drive or save your project to said drive, and let the GPU load it direct from there and effectively "stream" the project/game assets in realtime.

I could see a future Radeon X700+ series and NVIDIA X070+ series of GPUs and their professional equivalents incorporating an option to install an nVME PCIe 4.0 (or 5.0, since that's tech supposedly due late next year or 2022 and expected to last for quite awhile) onto the card, as a way to boost available memory for either professional or gaming purposes. Or maybe Intel could beat the competition to market, using Optane add-ons to their own respective GPUs, acting more like reserve VRAM expansion thanks to higher Read/Write performance than typical NVMe, but less than GDDR/HBM.
I already thought about the Optane thing. AMD could counter that with a DDR or LPDDR DIMM combined with a microSD card, doing RAM-disk backups to the non-volatile storage and leveraging an integrated CPU chip that could also handle compression/decompression, which would further improve performance. Optane is a fair amount slower than even the DDR option. Optane is cheaper per gigabyte than DDR, but it doesn't take much DDR to speed up a whole lot of memory; the reality is you're mostly confined by the interface. That's why the Gigabyte i-RAM was a bit of a dud on performance relative to what you'd hope for, and it's also a big limitation on HDDs' on-board cache performance: it too is limited by the interface protocol it's attached to. NVMe is infinitely better than SATA III in that regard, especially on PCIe 4.0. Unfortunately, HDDs have pretty much ceased to innovate on the performance side since SSDs took over.
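That interface-ceiling point can be made concrete with the raw link rates (theoretical maxima before protocol overhead; SATA III uses 8b/10b encoding, PCIe 3.0/4.0 use 128b/130b):

```python
# Theoretical throughput ceiling of a storage link in GB/s:
# raw rate x encoding efficiency / 8 bits-per-byte x lane count.
def ceiling_gbps(raw_gt_per_s, enc_num, enc_den, lanes=1):
    return raw_gt_per_s * (enc_num / enc_den) / 8 * lanes

print(round(ceiling_gbps(6, 8, 10), 2))         # SATA III     -> 0.6 GB/s
print(round(ceiling_gbps(8, 128, 130, 4), 2))   # NVMe 3.0 x4  -> ~3.94 GB/s
print(round(ceiling_gbps(16, 128, 130, 4), 2))  # NVMe 4.0 x4  -> ~7.88 GB/s
```

No matter how fast the medium or cache behind the link is, throughput is capped by these ceilings, which is the same reason the i-RAM and HDD on-board caches underwhelmed.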
 