
Nvidia's Neural Texture Compression technology should alleviate VRAM concerns

Joined
Feb 1, 2019
Messages
3,383 (1.63/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
To keep slapping on more and more gobs of VRAM generation after generation isn't a complete solution, because that VRAM costs money. It raises the price of a card at a time when card prices are already ridiculous. IMO there needs to be an increase in the VRAM offered, but keeping VRAM requirements from getting absurd is also part of the solution.
One could apply the same argument to things like slapping on RT cores, spending loads of money to expand rendering performance.

The difference with VRAM, though, despite what Nvidia likes to present, is that it's not an expensive component. It's just given the feel of being expensive because Nvidia is so stingy with it, and of course VRAM is not modular, so one has to buy an entire new GPU to upgrade it.

If you look at it from the point of view of TFLOPS vs VRAM capacity on Nvidia cards, it's regressed; if you look at what Nvidia provides vs the console market, it's regressed. At the very least I think mid and high end cards should get a bump over what the consoles have.

Now, if Nvidia adds e.g. 8 GB of VRAM and then adds $400 to the price, that's them being greedy; it's not what it costs them.

I think it's OK to apply minimalism (to a degree) on budget cards while charging an appropriate price, so 8 GB on a 4060 for $300. But it shouldn't be that aggressive on a 4070 Ti, same with the 3080.

Remember, Intel stuck 8 GB on a £250 card.

Isn't the only other use of tensor cores for DLSS? If they can find another use for them, all power to Nvidia -- for games that don't implement DLSS at all!
In my opinion the tensor cores at least have more value than the RT cores, as DLSS can now be used in many games thanks to DLDSR or whatever it's called, while RT is only in a tiny fraction of games, with questionable value. That said, I expect that if Nvidia wanted, they could probably implement DLDSR using normal shaders.

If I had a choice between, say, a 16 GB 4070 Ti GTX or a 12 GB 4070 Ti RTX at the same price, I'd absolutely, 100%, be picking the former. I suspect I wouldn't be alone, either.
 
Joined
May 17, 2021
Messages
3,005 (2.43/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Perless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
No, you're just talking nonsense; AMD does offer more memory at every price point, that's just a fact.

It's one thing to sell an $800 video card and skimp on VRAM and a totally different thing to sell a $350 one and do the same; if you somehow think the two are the same, then you're the fanboy here. I wouldn't expect a lower end video card to come with a ton of VRAM, but I would expect it from one that sells for the same money that used to buy you a flagship card not too long ago.
A 60-class card is not lower end, and on the 3070 issue we're talking about now: the 3070 doesn't cost $800 now. Nothing there makes sense.

I think most people have been consistent that 8GB is fine on an entry level card. I still think even $250 is likely too much for this card, but it's much better than what Nvidia seems to want to do and stick 8GB on a $400 one.
$250 for the 7060, and it comes with a free plane ticket to a magic land where pink elephants fly.
 
Joined
Sep 10, 2018
Messages
6,726 (3.04/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
$250 for the 7060, and it comes with a free plane ticket to a magic land where pink elephants fly.

If it's over $300 it's a joke for sure... it's 6nm, which is much cheaper than 5nm, and the die isn't much larger than a 6500XT's, so AMD would be pretty bold to price it any higher than $300. I'm pretty sure even at that price the margins would be pretty high.
 
Joined
Nov 9, 2010
Messages
5,678 (1.12/day)
System Name Space Station
Processor Intel 13700K
Motherboard ASRock Z790 PG Riptide
Cooling Arctic Liquid Freezer II 420
Memory Corsair Vengeance 6400 2x16GB @ CL34
Video Card(s) PNY RTX 4080
Storage SSDs - Nextorage 4TB, Samsung EVO 970 500GB, Plextor M5Pro 128GB, HDDs - WD Black 6TB, 2x 1TB
Display(s) LG C3 OLED 42"
Case Corsair 7000D Airflow
Audio Device(s) Yamaha RX-V371
Power Supply SeaSonic Vertex 1200w Gold
Mouse Razer Basilisk V3
Keyboard Bloody B840-LK
Software Windows 11 Pro 23H2
They will have cores for path tracing and will call it the PTX 6090; gone are the days of RTX :clap:

Or they could just call them PX and make them available in Military PX stores, because you'll probably need a hefty discount to afford them. LOL
 
Joined
May 17, 2021
Messages
3,005 (2.43/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Perless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
If it's over $300 it's a joke for sure... it's 6nm, which is much cheaper than 5nm, and the die isn't much larger than a 6500XT's, so AMD would be pretty bold to price it any higher than $300. I'm pretty sure even at that price the margins would be pretty high.

I'm sure the problem isn't your misaligned expectations. In what universe did AMD give any indication they are going to do those prices?
It's a bit like people who keep preordering games and are pissed when they all come out like crap. Completely unexpected; goldfish syndrome.
 
Joined
Sep 10, 2018
Messages
6,726 (3.04/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
I'm sure the problem isn't your misaligned expectations. In what universe did AMD give any indication they are going to do those prices?
It's a bit like people who keep preordering games and are pissed when they all come out like crap. Completely unexpected; goldfish syndrome.

I mean, it could cost $500 for all I or anyone else knows, but the rumors floating around are $250-330, so that's all we have to go by, with performance in the neighborhood of the 6650XT.
 
Joined
Jan 10, 2011
Messages
1,405 (0.28/day)
Location
[Formerly] Khartoum, Sudan.
System Name 192.168.1.1~192.168.1.100
Processor AMD Ryzen5 5600G.
Motherboard Gigabyte B550m DS3H.
Cooling AMD Wraith Stealth.
Memory 16GB Crucial DDR4.
Video Card(s) Gigabyte GTX 1080 OC (Underclocked, underpowered).
Storage Samsung 980 NVME 500GB && Assortment of SSDs.
Display(s) ViewSonic VA2406-MH 75Hz
Case Bitfenix Nova Midi
Audio Device(s) On-Board.
Power Supply SeaSonic CORE GM-650.
Mouse Logitech G300s
Keyboard Kingston HyperX Alloy FPS.
VR HMD A pair of OP spectacles.
Software Ubuntu 24.04 LTS.
Benchmark Scores Me no know English. What bench mean? Bench like one sit on?
Moreover, they're compressing, which means they have all the original data,
The compression (which is lossy) is done offline at the developer's end.
The decompression is what happens on the user's side.
But your other guess isn't far off. The paper does acknowledge the possibility of "generative textures," although that's not part of the scope. IIRC, they already did something similar with the Remix AI tech.
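To make that split concrete, here's a minimal, purely illustrative sketch (plain palette quantization in Python, not NTC itself; every name and size in it is made up): the expensive, lossy fitting happens once at build time, and the user's machine only pays for a cheap lookup.

```python
import numpy as np

# --- Offline, at the developer's end (slow, lossy) ---
# Fit a 16-colour palette to an RGB texture with a few k-means steps,
# then ship only the palette plus a small index per texel.
def compress(texels, palette_size=16, iters=8):
    rng = np.random.default_rng(0)
    palette = texels[rng.choice(len(texels), palette_size, replace=False)].astype(float)
    for _ in range(iters):
        # assign every texel to its nearest palette entry
        dists = np.linalg.norm(texels[:, None, :] - palette[None, :, :], axis=2)
        idx = dists.argmin(axis=1)
        # move each palette entry to the mean of its assigned texels
        for k in range(palette_size):
            if np.any(idx == k):
                palette[k] = texels[idx == k].mean(axis=0)
    return palette.astype(np.uint8), idx.astype(np.uint8)

# --- At runtime, on the user's side (cheap) ---
def decompress(palette, idx):
    return palette[idx]  # a single gather per texel

texels = np.random.randint(0, 256, size=(64 * 64, 3))
palette, idx = compress(texels)
restored = decompress(palette, idx)  # lossy reconstruction
print("mean abs error:", np.abs(restored.astype(int) - texels).mean())
```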

As for the grayscale idea, there are already ways to tell the engine the colour of an object without encoding it in a texture's pixels: use a material with a single RGB value for diffuse, or use vertex colours.

Depending on how large the textures are, couldn't transferring the compressed textures possibly save you enough time over transferring uncompressed textures that the decompression could be done for free, from a time perspective?
That's how it already works.
Major graphics APIs use block compression formats that are unpacked on the GPU. Some engines take it further by running another layer of compression that is unpacked on the CPU.
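For a sense of what "unpacked on the GPU" means in practice, here's a simplified Python sketch of decoding one BC1/DXT1 block, the classic format those APIs support: 64 bits encode two RGB565 endpoint colours plus 2-bit indices for a 4x4 tile, and the hardware rebuilds texels by interpolating the endpoints. (This covers only the opaque four-colour mode; the real format also has a punch-through-alpha mode.)

```python
import struct

def decode_bc1_block(block: bytes):
    """Decode one 64-bit BC1/DXT1 block into a 4x4 tile of RGB texels.
    Simplified: handles only the four-colour (opaque) mode."""
    c0, c1, indices = struct.unpack("<HHI", block)

    def rgb565(c):  # expand a 16-bit 5:6:5 colour to 8-bit-per-channel RGB
        r = (c >> 11) & 0x1F
        g = (c >> 5) & 0x3F
        b = c & 0x1F
        return (r * 255 // 31, g * 255 // 63, b * 255 // 31)

    p0, p1 = rgb565(c0), rgb565(c1)
    # Two extra colours are interpolated between the stored endpoints,
    # so each texel's colour costs only a 2-bit index.
    p2 = tuple((2 * a + b) // 3 for a, b in zip(p0, p1))
    p3 = tuple((a + 2 * b) // 3 for a, b in zip(p0, p1))
    lut = (p0, p1, p2, p3)

    # 32 bits of indices = 2 bits per texel for the 4x4 tile
    return [[lut[(indices >> (2 * (4 * y + x))) & 0b11] for x in range(4)]
            for y in range(4)]

# 8 bytes in, 48 bytes of RGB888 out: 6:1 compression, decoded per fetch.
tile = decode_bc1_block(bytes.fromhex("00f8001fa5a5a5a5"))
print(tile[0])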

It has the potential to reduce VRAM usage, yes, but given the current market, for end consumers the benefit is likely to be nothing. Can you blame people for not rooting for a feature that will only be used to further increase Nvidia's margins?
But if it's just the "tensor cores" that Nvidia uses, then it will most likely be exclusive to Nvidia cards.
The concept isn't limited to Nvidia cards. While they could, in a cynic's view of the future, throw some proprietary sauce in the mix, for the method to actually be adopted it needs to be supported by the major APIs. And for that, the means to implement at least the decoder must be accessible to all applicable vendors.
No developer would implement a critical pipeline stage that isn't widely supported across vendors. Unlike DLSS or whatever proprietary crap is common these days, texture decompression isn't just something you can toggle on and off. The only way they'd implement this (and we should remember that this is feasibility research, not a ready-to-use product) is either by having a custom decompressor for non-supporting hardware, or by shipping two sets of textures, NTC and DXT/BC. Both approaches obviously mean more work, more storage, and less performance.
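For contrast with the BC decode above, here's a toy sketch of the kind of decoder an NTC-style scheme implies, based on the general idea in the paper rather than on Nvidia's actual code (all sizes, weights, and names here are invented): instead of fixed-function endpoint interpolation, every texel fetch samples a learned latent grid and runs it through a small MLP, which is exactly the kind of matrix work tensor hardware accelerates and generic shaders do far more slowly.

```python
import numpy as np

# Toy stand-in for an NTC-style decode: real NTC uses learned feature
# grids and a trained MLP; the shapes and values here are made up.
rng = np.random.default_rng(0)
GRID, FEAT, HIDDEN = 128, 8, 16
feature_grid = rng.standard_normal((GRID, GRID, FEAT)).astype(np.float32)
w1 = rng.standard_normal((FEAT, HIDDEN)).astype(np.float32)
w2 = rng.standard_normal((HIDDEN, 3)).astype(np.float32)

def decode_texel(u, v):
    """'Sample' the compressed texture at (u, v) in [0, 1)^2.
    One fetch = bilinear gather from the latent grid + a tiny MLP."""
    x, y = u * (GRID - 1), v * (GRID - 1)
    x0, y0 = int(x), int(y)
    fx, fy = x - x0, y - y0
    # bilinear interpolation of the learned latent features
    f = ((1 - fx) * (1 - fy) * feature_grid[y0, x0] +
         fx * (1 - fy) * feature_grid[y0, x0 + 1] +
         (1 - fx) * fy * feature_grid[y0 + 1, x0] +
         fx * fy * feature_grid[y0 + 1, x0 + 1])
    # a small MLP turns latent features into an RGB value; on real
    # hardware these matmuls are what tensor cores would accelerate
    h = np.maximum(f @ w1, 0.0)  # ReLU hidden layer
    return h @ w2                # RGB output

print(decode_texel(0.3, 0.7))
```

The point of the sketch: without some hardware path for those per-texel matmuls on every vendor, this stage either needs a (slow) shader fallback or a second set of BC textures on disk, which is the adoption problem described above.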
 
Joined
Jan 20, 2019
Messages
1,457 (0.70/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
I'm all for finding ways to use hardware better, but why does Nvidia seem so reluctant to increase VRAM? Is there an industry shortage?
This was a thought I had: https://www.techpowerup.com/forums/threads/general-nonsense.232862/post-5013923

No industry shortage, just pandemic-style lockdowns (on VRAM) alongside pandemic-prolonged MSRPs. Pay more, get less, and repeat more often (scheming obsolescence).

The reluctance doesn't end there. There are heaps of professional/content-creation workloads which demand more than 8GB, and Nvidia's got a firm footing in those Pro-build use cases with a pretty strong client base. The last thing they want is abundant VRAM on more affordable mid-tier consumer cards, which would consequently mean less profitability. Speaking of Pro-series workloads, it's not just the pricier enterprise cards (Quadro and whatnot) but also the extortionately priced prosumer cards (very popular outside of gaming).

8GB is still highly relevant, seeing as not everyone is playing the latest and most graphically challenging titles, or doing so on higher-resolution displays. It's also a good solution for tight-budget builds where the compromises are accepted. All of that belongs in the ~$250/~$300 category. For a $400-$500 card, the mid segment of high-performance cards, the 8GB rationing is disconcerting and a tough one to swallow, seeing as some games are already showing signs of tipping over the limit. That's the bit that annoys me the most... pushed to the limit before change can materialise, with oodles of graphical splendour wished away by current VRAM limitations (tip or no tip).
 
Joined
Jan 12, 2023
Messages
193 (0.31/day)
System Name IZALITH (or just "Lith")
Processor AMD Ryzen 7 7800X3D (4.2Ghz base 5.0Ghz boost, -30 PBO offset)
Motherboard Gigabyte X670E Aorus Master Rev 1.0
Cooling Deepcool Gammaxx AG400 Single Tower
Memory Corsair Vengeance 64GB (2x32GB) 6000MHz CL40 DDR5 XMP (XMP enabled)
Video Card(s) PowerColor Radeon RX 7900 XTX Red Devil OC 24GB
Storage 2x1TB SSD, 2x2TB SSD, 2x 8TB HDD
Display(s) Samsung Odyssey G51C 27" QHD (1440p 165Hz) + Samsung Odyssey G3 24" FHD (1080p 165Hz)
Case Corsair 7000D Airflow Full Tower
Audio Device(s) Corsair HS55 Surround Wired Headset/LG Z407 Speaker Set
Power Supply Corsair HX1000 Platinum Modular (1000W)
Mouse Corsair RGB Harpoon PRO Wired Mouse
Keyboard Corsair K60 RGB PRO Mechanical Gaming Keyboard
Software Arch Linux
Interesting ideas from Nvidia, but I share the concerns of others in the thread that this will be used as a shortcut for bad development, similar to upscaling now. Why bother with optimization when DLSS/FSR/shiny-new-AI-technology-of-the-day will fix it? I guess only time will tell.

I don't think this is damage control from Nvidia in response to low-VRAM complaints. Rather, I think AI is currently Nvidia's focus and this is another product banking on it (Microsoft and the cloud, anyone?). The timing is mildly inconvenient from a perception standpoint, but I doubt Nvidia is fussed by that.
 
Joined
Oct 15, 2011
Messages
2,303 (0.49/day)
Location
Springfield, Vermont
System Name KHR-1
Processor Ryzen 9 5900X
Motherboard ASRock B550 PG Velocita (UEFI-BIOS P3.40)
Memory 32 GB G.Skill RipJawsV F4-3200C16D-32GVR
Video Card(s) Sapphire Nitro+ Radeon RX 6750 XT
Storage Western Digital Black SN850 1 TB NVMe SSD
Display(s) Alienware AW3423DWF OLED-ASRock PG27Q15R2A (backup)
Case Corsair 275R
Audio Device(s) Technics SA-EX140 receiver with Polk VT60 speakers
Power Supply eVGA Supernova G3 750W
Mouse Logitech G Pro (Hero)
Software Windows 11 Pro x64 23H2
They will have cores for path tracing and will call it the PTX 6090; gone are the days of RTX :clap:
Wasn't path-tracing the predecessor to ray-tracing?
 
Joined
Sep 17, 2014
Messages
22,042 (6.01/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Be happy with what good can come (from this) and not look for the bad.
Are we happy with the good Nvidia has for us today?
Are you?

I'm certainly not.

We have a live analogy to this fancy new technology: it's called DLSS 3, and Nvidia sells cards at a premium with it.
DLSS also shows how developers use it to save themselves time, instead of it making games truly run better. I'm trying not to look for the bad here, but I just don't have that much cognitive dissonance in my gut, sorry.

If you were hoping for a circlejerk topic on Nvidia's next marketing push... wow. If you were thinking this was a serious technology in its current state... wow x2. Read the article, and please get a reality check. It's full of fantasy, like most current-day articles about the infinite capabilities of AI. Remember Big Data? It was going to make our lives complete in much the same way. They call these buzzwords: good for clicks and attention, and inflated shareholder value.

It can. Suppose they make a 5060 with 24 GB of physical VRAM; if it can neurally show me textures four times larger, let it.

I'm sure more VRAM would draw more power, so first find a way to keep things from flying over 500W, then talk about adding VRAM.
That's easy: you just clock your GPU 100-200 MHz lower instead of pulling boost frequencies way beyond the optimal curve.

There is just one card flying over 500W, and it's the one that has 24GB ;) And it can run way below 500W while still carrying 24GB and still topping the performance charts.
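The arithmetic behind that is simple: dynamic power scales roughly as frequency times voltage squared, and the top of the boost curve demands disproportionate voltage, so a small clock cut buys a large power cut. A back-of-the-envelope Python sketch with illustrative, made-up V/F points:

```python
# Rough dynamic-power model: P ~ f * V^2 (capacitance folded into the units).
# The voltage/frequency points below are illustrative, not measured.
def rel_power(f_mhz, v):
    return f_mhz * v * v

stock = rel_power(2800, 1.10)   # boosting near the top of the V/F curve
tuned = rel_power(2600, 0.95)   # ~7% lower clock at a saner voltage

print(f"clock loss:  {1 - 2600 / 2800:.1%}")   # ~7%
print(f"power saved: {1 - tuned / stock:.1%}")  # ~31%
```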

I'm all for finding ways to use hardware better, but why does Nvidia seem so reluctant to increase VRAM? Is there an industry shortage?
This was a thought I had: https://www.techpowerup.com/forums/threads/general-nonsense.232862/post-5013923
Nvidia has historically used VRAM to segment its market and limit the potential uses of its products, especially on GeForce.
They have also always pushed heavily on reducing VRAM footprint and maximizing its usage, so in that sense it's understandable that Nvidia wants to cash in on that R&D; that's why they sell their proprietary crap instead of their hardware these days. The hardware is pretty efficient. But it's also monolithic, so Nvidia is rapidly going the Intel way, and it shows, with exploding TDPs even on highly efficient Ada. If it weren't for Ada's major efficiency gap over Ampere, the generation probably wouldn't even have a right to exist; there is literally nothing else in it over Ampere.

For much the same reasons AMD has stalled on RDNA3, and they've forgotten to add some spicy software to keep people interested. But there is a similar efficiency jump there too, and they've focused their R&D on the hardware/packaging front, which is arguably a better long-term plan to keep costs down without cutting the hardware down completely.

There are no shortages; there is only a matter of cost and profit.
 