
NVIDIA RTX 4060 Ti 16GB Model Features 5W Higher TDP, Slightly Different ASIC Code

Joined
Jun 14, 2020
Messages
3,275 (2.13/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
They've somewhat already done this by comparing the 3070 to its pro counterpart; if you missed it, you might want to watch it:


It's quite the terrifying difference, and I bet it's had a lot to do with people's sudden change of heart on 16 GB RAM + 8 GB GPU PCs in recent months.
His videos lately are just BS. He chooses settings that hog the VRAM of the 3070, and then acts surprised it stutters like crazy. The actual question is: what is the image quality impact in those games if you drop textures to high instead of ultra? Not a lot, I'd imagine; that's why he's not testing it. It won't generate as many clicks as just pooping on Nvidia will.

Why isn't he, for example, testing AMD vs. Nvidia on the new path-traced Cyberpunk upgrade? I really wonder.

Contrary to popular belief, 80 fps at native settings isn't necessary for it to be considered playable :laugh:

I'm of the "60 fps or go home" school of thought, but a lot of people may be happy with 40-ish in games like that. FSR would sacrifice some image quality to keep it at the upper end of that range, closer to 60 really.
Generally speaking, sure, but in Cyberpunk specifically, for some freaking reason, 40 fps is a horrible experience for me. I've played games at 30 fps and it doesn't bother me that much, but 40 in Cyberpunk feels atrocious.
 
Joined
Sep 26, 2022
Messages
224 (0.32/day)
Location
Portugal
System Name Main
Processor 5700X
Motherboard MSI B450M Mortar
Cooling Corsair H80i v2
Memory G.SKILL Ripjaws V 32GB (2x16GB) DDR4-3600MHz CL16
Video Card(s) MSI RTX 3060 Ti VENTUS 2X OC 8GB GDDR6X
Display(s) LG 32GK850G
Case NOX HUMMER ZN
Power Supply Seasonic GX-750
They've somewhat already done this by comparing the 3070 to its pro counterpart; if you missed it, you might want to watch it:


It's quite the terrifying difference, and I bet it's had a lot to do with people's sudden change of heart on 16 GB RAM + 8 GB GPU PCs in recent months.
Yes, I saw it. But we're still not 100% sure where the 4060 Ti will sit in the stack.
Since the 3070 is already at around €450, I really want the 4060 Ti to be better than the 3070, but at the same time I don't want to set very high expectations.
 
Joined
Dec 25, 2020
Messages
5,903 (4.39/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Galax Stealth STL-03
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Yes, I saw it. But we're still not 100% sure where the 4060 Ti will sit in the stack.
Since the 3070 is already at around €450, I really want the 4060 Ti to be better than the 3070, but at the same time I don't want to set very high expectations.

It will 100% undoubtedly suffer from the same problems the 3070 and 3070 Ti do, to a lesser degree thanks to Ada's special sauce (highly optimized BVH traversal, which saves some memory and a significant amount of memory bandwidth, on top of the large on-die cache), but I do not think it's going to be enough to save these cards' proverbial bacon :laugh:
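
To put a rough number on the cache argument: only cache misses actually go out to the GDDR bus, so effective bandwidth scales with the hit rate. A toy model (the hit rates here are illustrative assumptions, not Nvidia's figures):

```python
# Toy model: a large on-die cache stretches effective memory bandwidth
# because only misses go out to GDDR, so effective = raw / miss_rate.
# Hit rates below are illustrative assumptions, not vendor figures.

def effective_bandwidth_gbps(raw_gbps: float, hit_rate: float) -> float:
    """Effective bandwidth seen by the GPU for a given cache hit rate."""
    return raw_gbps / (1.0 - hit_rate)

raw = 288.0  # GB/s, e.g. a 128-bit bus at 18 Gbps GDDR6
for hit in (0.0, 0.3, 0.5):
    print(f"hit rate {hit:.0%}: ~{effective_bandwidth_gbps(raw, hit):.0f} GB/s effective")
```

The catch, of course, is that the cache helps bandwidth, not capacity - an 8 GB card is still an 8 GB card.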

His videos lately are just BS. He chooses settings that hog the VRAM of the 3070, and then acts surprised it stutters like crazy. The actual question is: what is the image quality impact in those games if you drop textures to high instead of ultra? Not a lot, I'd imagine; that's why he's not testing it. It won't generate as many clicks as just pooping on Nvidia will.

Why isn't he, for example, testing AMD vs. Nvidia on the new path-traced Cyberpunk upgrade? I really wonder.


Generally speaking, sure, but in Cyberpunk specifically, for some freaking reason, 40 fps is a horrible experience for me. I've played games at 30 fps and it doesn't bother me that much, but 40 in Cyberpunk feels atrocious.

I believe you missed the point: the RTX 3070 would be a significantly more capable graphics card if it had the memory to deal with the workload it is presented with. That is not a flaw the similarly performing RX 6750 XT and 6800 share, which makes them a generally better product for the same money, especially as the cards age and VRAM requirements rise, which they are doing.

Regarding Cyberpunk: Understandable, playing an atrocious game at an atrocious frame rate just isn't a good thing :laugh:

Real talk though: this is likely due to the crazy blur (even with the motion blur setting off) from the imperfect path tracing, plus motion imperfections from the lower-than-intended frame rate, on top of the general engine jankiness; it's definitely not going to be a great experience.
 
Joined
Sep 17, 2014
Messages
21,816 (6.00/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
How is it playable on a 7900 XT? It averages 70-80 fps on my 4090 with DLSS on.
I had indistinguishable IQ and 60+ FPS with FSR Quality and some minor tweaks, compared to native maxed out. And I've been staring at several scenes for about half an hour, swapping between options; I was genuinely curious. The only caveat is that FSR will have its typical ghosting thingy in places.

Path traced, the game runs at 15 FPS :D And considering the visual upgrade is in fact more of a sidegrade, I nearly fell off my chair laughing at the ridiculousness and pointlessness of it. Seriously.

But then, the things that truly matter... the game isn't fun. I genuinely don't care how it runs anymore; I played it on a GTX 1080 at 50-55 FPS, with FSR.
It's a neat tech demo to me, with a decent story and an otherwise entirely forgettable experience. I don't even feel the urge to play it again with a new GPU that runs it ten times better and adds RT, go figure.

Yes, I saw it. But we're still not 100% sure where the 4060 Ti will sit in the stack.
Since the 3070 is already at around €450, I really want the 4060 Ti to be better than the 3070, but at the same time I don't want to set very high expectations.
The 4060 Ti has a stronger core, that's for damn sure, so with 8 GB it'll be starved; there is no doubt whatsoever.
 
Joined
Jun 14, 2020
Messages
3,275 (2.13/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
It will 100% undoubtedly suffer from the same problems the 3070 and 3070 Ti do, to a lesser degree thanks to Ada's special sauce (highly optimized BVH traversal, which saves some memory and a significant amount of memory bandwidth, on top of the large on-die cache), but I do not think it's going to be enough to save these cards' proverbial bacon :laugh:



I believe you missed the point: the RTX 3070 would be a significantly more capable graphics card if it had the memory to deal with the workload it is presented with. That is not a flaw the similarly performing RX 6750 XT and 6800 share, which makes them a generally better product for the same money, especially as the cards age and VRAM requirements rise, which they are doing.

Regarding Cyberpunk: Understandable, playing an atrocious game at an atrocious frame rate just isn't a good thing :laugh:

Real talk though: this is likely due to the crazy blur (even with the motion blur setting off) from the imperfect path tracing, plus motion imperfections from the lower-than-intended frame rate, on top of the general engine jankiness; it's definitely not going to be a great experience.
Of course the 3070 would have been better with more VRAM, but does it matter? The question is: does the current 3070 offer better image quality than its competitor, the 6700 XT? I'd argue probably, because DLSS + high textures is better than FSR + ultra textures, at least in some games, Hogwarts being the prime example.
 
Joined
Sep 17, 2014
Messages
21,816 (6.00/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Of course the 3070 would have been better with more VRAM, but does it matter? The question is: does the current 3070 offer better image quality than its competitor, the 6700 XT? I'd argue probably, because DLSS + high textures is better than FSR + ultra textures, at least in some games, Hogwarts being the prime example.
Now you're just grasping at straws, buddy, please don't continue; this is turning into sad pixel-peeper DLSS/FSR comparison territory. I'm sure Hogwarts has fantastic, unforgettable generic walls to look at.

Of course it damn well matters that the 3070 would have been better with more VRAM. That is precisely the damn point being made wrt your average Nvidia release. Again: are you going to keep parroting the marketing story, or can we just concede that Nvidia is anal probing us all? This is no you-vs-me debate... it's us vs. them.

I agree the sensationalist YouTuber tone of voice is annoying as fuck, but the facts just don't lie, and the 16 GB-endowed 3070 shows the facts.
People need to stop the denial; the proof is everywhere, even if it doesn't directly affect your personal use case - in the same way I tweak some settings to get Cyberpunk playable. But when people start saying 'lower quality textures do look better' to somehow defend the absence of VRAM... wow, just wow. It's the same category of selective blindness as someone up here stating a 4070 Ti is unfit for 4K. It is absolute nonsense.

It reminds me of Al Gore's 'An Inconvenient Truth'. Look where we are today wrt climate. We humans are exceptionally good at denial if it doesn't fit our agenda. Recognize that. I'm using a rhetorical sledgehammer to keep reminding people here.
 
Joined
Dec 25, 2020
Messages
5,903 (4.39/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Galax Stealth STL-03
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Of course the 3070 would have been better with more VRAM, but does it matter? The question is: does the current 3070 offer better image quality than its competitor, the 6700 XT? I'd argue probably, because DLSS + high textures is better than FSR + ultra textures, at least in some games, Hogwarts being the prime example.

Sorry, I'm gonna have to disagree. Despite DLSS indeed being generally superior to FSR (especially with the DLSS 2.5-series DLLs installed), IMHO the single biggest improvement in image quality you can get in a game, ranking above even rendering resolution (thanks to excellent upscaling tech from both vendors), is high resolution assets and textures.

That's why I bought the 3090, actually (and would have bought the Titan RTX even earlier had it been available in my country at all). The 24 GB lets me use and abuse high resolution assets, even in the most unoptimized formats and engines you can imagine.

I'm an avid fan of Bethesda's RPGs; I promise you haven't seen a game chug VRAM until you run Fallout 4 or 76 with a proper UHD texture pack :laugh:
 
Joined
Jun 14, 2020
Messages
3,275 (2.13/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Now you're just grasping at straws, buddy, please don't continue; this is turning into sad pixel-peeper DLSS/FSR comparison territory. I'm sure Hogwarts has fantastic, unforgettable generic walls to look at.

Of course it damn well matters that the 3070 would have been better with more VRAM. That is precisely the damn point being made wrt your average Nvidia release. Again: are you going to keep parroting the marketing story, or can we just concede that Nvidia is anal probing us all? This is no you-vs-me debate... it's us vs. them.

I agree the sensationalist YouTuber tone of voice is annoying as fuck, but the facts just don't lie, and the 16 GB-endowed 3070 shows the facts.
People need to stop the denial; the proof is everywhere, even if it doesn't directly affect your personal use case - in the same way I tweak some settings to get Cyberpunk playable. But when people start saying 'lower quality textures do look better' to somehow defend the absence of VRAM... wow, just wow. It's the same category of selective blindness as someone up here stating a 4070 Ti is unfit for 4K. It is absolute nonsense.

It reminds me of Al Gore's 'An Inconvenient Truth'. Look where we are today wrt climate. We humans are exceptionally good at denial if it doesn't fit our agenda. Recognize that. I'm using a rhetorical sledgehammer to keep reminding people here.
The same argument can be used about textures. If you can't tell the difference between DLSS and FSR, can you tell the difference between ultra and high textures? I've done some tests in Hogwarts, and it is the case that high textures + DLSS looks better than ultra + FSR. I don't know if it's the case in other games as well, but that's what Hardware Unboxed should be testing, imo.

I don't care which card runs higher presets, I care about which card offers higher image quality, and none of his testing shows us which is which.

Sorry, I'm gonna have to disagree. Despite DLSS indeed being generally superior to FSR (especially with the DLSS 2.5-series DLLs installed), IMHO the single biggest improvement in image quality you can get in a game, ranking above even rendering resolution (thanks to excellent upscaling tech from both vendors), is high resolution assets and textures.

That's why I bought the 3090, actually (and would have bought the Titan RTX even earlier had it been available in my country at all). The 24 GB lets me use and abuse high resolution assets, even in the most unoptimized formats and engines you can imagine.

I'm an avid fan of Bethesda's RPGs; I promise you haven't seen a game chug VRAM until you run Fallout 4 or 76 with a proper UHD texture pack :laugh:
That is assuming games on ultra textures actually use 4K textures, which they do not. Even TLOU uses a combination of 256s and 512s. All I'm saying is, until DLSS + high textures vs. FSR + ultra textures is actually tested, I can't say one card is better than the other because of VRAM.
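
For a sense of scale, here's a minimal sketch of what individual textures cost in VRAM. The per-pixel rates are standard format sizes (BC7 block compression stores 1 byte per pixel, uncompressed RGBA8 stores 4), and the ~4/3 factor for a full mip chain is the usual approximation:

```python
# Approximate VRAM footprint of one square texture with a full mip
# chain: side * side * bytes_per_pixel * 4/3. BC7 stores 1 byte/px,
# uncompressed RGBA8 stores 4 bytes/px.
MIP_CHAIN_FACTOR = 4 / 3

def texture_mib(side_px: int, bytes_per_px: float) -> float:
    return side_px * side_px * bytes_per_px * MIP_CHAIN_FACTOR / 2**20

for side in (256, 512, 2048, 4096):
    print(f"{side}x{side}: BC7 ~{texture_mib(side, 1):5.2f} MiB, "
          f"RGBA8 ~{texture_mib(side, 4):6.2f} MiB")
```

Individual textures are cheap; it's the thousands resident at once, plus render targets and RT acceleration structures, that fill an 8 GB card.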
 
Joined
Feb 20, 2019
Messages
7,878 (3.90/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
How expensive is it to double-stack or use double-density GDDR6? That's probably a question aimed at @TheLostSwede.

I'm curious why AMD, Intel and Nvidia didn't just double all their VRAM this generation. 8 GB has been a problem for something approaching a year now, and in the professional space, 12-24 GB has been crippling the "mainstream" tier, where people are trying to GPU-accelerate things that used to run in 64-128 GB of system RAM. We never bought a 48 GB Quadro RTX 8000; I demoed one and it was great, but the markup Nvidia put on it was so high that it was vastly cheaper for us to just farm those jobs out to Amazon/Deadline in the cloud. You'd need to be using your RTX 8000 24/7 on high-value projects to justify it.
 

TheLostSwede

News Editor
Joined
Nov 11, 2004
Messages
16,916 (2.34/day)
Location
Sweden
System Name Overlord Mk MLI
Processor AMD Ryzen 7 7800X3D
Motherboard Gigabyte X670E Aorus Master
Cooling Noctua NH-D15 SE with offsets
Memory 32GB Team T-Create Expert DDR5 6000 MHz @ CL30-34-34-68
Video Card(s) Gainward GeForce RTX 4080 Phantom GS
Storage 1TB Solidigm P44 Pro, 2 TB Corsair MP600 Pro, 2TB Kingston KC3000
Display(s) Acer XV272K LVbmiipruzx 4K@160Hz
Case Fractal Design Torrent Compact
Audio Device(s) Corsair Virtuoso SE
Power Supply be quiet! Pure Power 12 M 850 W
Mouse Logitech G502 Lightspeed
Keyboard Corsair K70 Max
Software Windows 10 Pro
Benchmark Scores https://valid.x86.fr/yfsd9w
How expensive is it to double-stack or use double-density GDDR6? That's probably a question aimed at @TheLostSwede.

I'm curious why AMD, Intel and Nvidia didn't just double all their VRAM this generation. 8 GB has been a problem for something approaching a year now, and in the professional space, 12-24 GB has been crippling the "mainstream" tier, where people are trying to GPU-accelerate things that used to run in 64-128 GB of system RAM. We never bought a 48 GB Quadro RTX 8000; I demoed one and it was great, but the markup Nvidia put on it was so high that it was vastly cheaper for us to just farm those jobs out to Amazon/Deadline in the cloud. You'd need to be using your RTX 8000 24/7 on high-value projects to justify it.
Sorry, not a graphics card person, but twice the memory in any application is rarely twice the price.
Sadly, DRAMeXchange only lists pricing for 1 GB chips; the average price there for 1 GB (8 Gbit) of GDDR6 is US$3.40, so 8 GB would be just over $27, and 16 GB, if we assume the same per-gigabyte cost, would be around $54. Keep in mind that these are spot prices; contract prices are negotiated months ahead of any production and can as such be both higher and lower.

As for the actual cost at the fab that makes the memory ICs, I really don't know, but again, it's hardly going to be twice the price.
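
Spelling out that arithmetic (linear scaling with capacity is an assumption here; contract prices will differ from spot):

```python
# Back-of-the-envelope GDDR6 cost at the ~US$3.40/GB spot price quoted
# above, assuming cost scales linearly with capacity.
SPOT_USD_PER_GB = 3.40

def vram_spot_cost_usd(capacity_gb: int) -> float:
    return capacity_gb * SPOT_USD_PER_GB

for gb in (8, 12, 16):
    print(f"{gb:>2} GB -> ~${vram_spot_cost_usd(gb):.2f}")
# 8 GB -> ~$27.20, 12 GB -> ~$40.80, 16 GB -> ~$54.40
```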
 
Joined
Dec 25, 2020
Messages
5,903 (4.39/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Galax Stealth STL-03
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
How expensive is it to double-stack or use double-density GDDR6? That's probably a question aimed at @TheLostSwede.

I'm curious why AMD, Intel and Nvidia didn't just double all their VRAM this generation. 8 GB has been a problem for something approaching a year now, and in the professional space, 12-24 GB has been crippling the "mainstream" tier, where people are trying to GPU-accelerate things that used to run in 64-128 GB of system RAM. We never bought a 48 GB Quadro RTX 8000; I demoed one and it was great, but the markup Nvidia put on it was so high that it was vastly cheaper for us to just farm those jobs out to Amazon/Deadline in the cloud. You'd need to be using your RTX 8000 24/7 on high-value projects to justify it.

It's pricier, but something customers would accept given recent price increases. However, they do not want to eat their own lunch, so to speak; they need to give people a reason - quite an artificial one - to buy their workstation-grade hardware. Also, you don't want clamshell fast memory if it can be helped: see the RTX 3090's extreme VRAM energy consumption.

That is assuming games on ultra textures actually use 4K textures, which they do not. Even TLOU uses a combination of 256s and 512s. All I'm saying is, until DLSS + high textures vs. FSR + ultra textures is actually tested, I can't say one card is better than the other because of VRAM.

Like I said, try some modded Bethesda RPGs with actual handmade, high resolution assets; you'll find that it's well beyond a midrange GPU's capabilities :)

 
Joined
May 17, 2021
Messages
3,005 (2.50/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
The forum is in a loop at this point with this subject.
 
Joined
Feb 20, 2019
Messages
7,878 (3.90/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
The forum is in a loop at this point with this subject.
Yes, because games need more VRAM and many new cards don't have enough VRAM.
The cycle will continue until the problem is fixed, one way or another, and I highly doubt that game devs are going to take two steps backwards just to accommodate GPU manufacturers being cheap.
 
Joined
May 17, 2021
Messages
3,005 (2.50/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
Yes, because games need more VRAM and many new cards don't have enough VRAM.
The cycle will continue until the problem is fixed, one way or another, and I highly doubt that game devs are going to take two steps backwards just to accommodate GPU manufacturers being cheap.

Considering that 80% or more of gamers don't have more than 8 GB of VRAM, I would say they are just dumb for doing so. But these are the same geniuses who can't release a finished game to save their lives, so I guess it checks out.
 
Joined
Sep 17, 2014
Messages
21,816 (6.00/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
The same argument can be used about textures. If you can't tell the difference between DLSS and FSR, can you tell the difference between ultra and high textures? I've done some tests in Hogwarts, and it is the case that high textures + DLSS looks better than ultra + FSR. I don't know if it's the case in other games as well, but that's what Hardware Unboxed should be testing, imo.

I don't care which card runs higher presets, I care about which card offers higher image quality, and none of his testing shows us which is which.

That is assuming games on ultra textures actually use 4K textures, which they do not. Even TLOU uses a combination of 256s and 512s. All I'm saying is, until DLSS + high textures vs. FSR + ultra textures is actually tested, I can't say one card is better than the other because of VRAM.
You are totally glossing over the fact that even high textures might go out of reach for low-VRAM GPUs while still being in reach for higher-VRAM GPUs. After all, if the fidelity of said texturing is SO incredible, you're talking about big chunks of data. Case in point: Hogwarts eats VRAM like candy.

The problem hasn't changed and won't change; your argument only exists because Nvidia is stingy on VRAM, not because AMD has 'lower RT perf' or because FSR differs from DLSS. The two are not the same thing and never will be. Especially if you consider that higher RT perf will also eat a chunk of VRAM: you might end up having to choose between those sweet, sweet high textures and RT, again because of VRAM constraints. Your comparison makes no sense. Another easy way to see that is by turning the argument around: what if high + DLSS actually looks worse than ultra? Still picking that 'better image quality' now... or are you actually left with no choice but to compromise after all?

And on top of that, you're fully reliant on Nvidia's DLSS support to get that supposed higher image quality in the odd title. If this isn't grasping at straws on your end, I don't know what is. You're comparing the quality of a per-game proprietary implementation with the presence of the hardware chips needed to make any kind of graphics happen. Da. Fuq. It takes AMD one copy/paste iteration of FSR to get that done. That's a driver update. Did you download additional RAM yet?

The forum is in a loop at this point with this subject.
As much as you're still in denial that the world is actually moving on wrt VRAM ;) After all, the majority of the market has 8 GB, you say. But then you keep forgetting the console market.
You can comfort yourself that you're not the only one in denial. See the above subject.

The PS4 was the era when 8 GB was enough. That era has ended, and PC GPUs then suffered a pandemic and crypto, several times over, while at the end of that cycle we noticed raster perf is actually stalling. All those factors weigh in on the fact that 8 GB is history. To get gains, yes, you will see more stuff moved to VRAM. It's the whole reason consoles have that special memory subsystem, and it's (part of) the whole reason AMD is also moving to chiplets for GPUs. This train has been coming for a decade, and it has now arrived.

Did anyone REALLY think the consoles were given 16 GB of GDDR to let it sit idle, or to keep 2 GB unused for the sake of the poor PC market? If you ever did, you clearly don't understand consoles at all.
 
Joined
May 17, 2021
Messages
3,005 (2.50/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
As much as you're still in denial that the world is actually moving on wrt VRAM ;) After all, the majority of the market has 8 GB, you say. But then you keep forgetting the console market.
You can comfort yourself that you're not the only one in denial. See the above subject.

I've been lowering settings since before you were born, probably, and will do it again. Like someone said before, I don't care about ultra, nor have I ever cared about it; I doubt anyone can tell the difference in gameplay, and I can't and won't be examining power lines in a game or the pimple on an NPC's face.

My case is settled, so it's not really a choice I have to make. Going out stupid on VRAM will only damn the industry, because like I said, the vast majority of gamers don't really have more than 8 GB, and that won't change anytime soon, especially with stupid prices like we have now.

I don't live in a bubble thinking everyone is buying $600, 32 GB VRAM cards. PC gaming will not survive on the small number of sales from the few deep-pocketed gamers. And no, the majority is not slowing down progress; they are the ones who keep the industry alive, not the vanishingly small minority.
 
Joined
Sep 17, 2014
Messages
21,816 (6.00/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
I've been lowering settings since before you were born, probably, and will do it again. Like someone said before, I don't care about ultra, nor have I ever cared about it; I doubt anyone can tell the difference in gameplay, and I can't and won't be examining power lines in a game or the pimple on an NPC's face.

My case is settled, so it's not really a choice I have to make. Going out stupid on VRAM will only damn the industry, because like I said, the vast majority of gamers don't really have more than 8 GB, and that won't change anytime soon, especially with stupid prices like we have now.

I don't live in a bubble thinking everyone is buying $600, 32 GB VRAM cards. PC gaming will not survive on the small number of sales from the few deep-pocketed gamers. And no, the majority is not slowing down progress; they are the ones who keep the industry alive, not the vanishingly small minority.
I think we fully agree on those points, honestly.
But nobody is going out 'stupid on VRAM'; that is a figment of your imagination, and thát is what I'm calling out.
What will really happen to the PC market, though, is that the 8 GB baseline will slowly die off and get replaced with 12 or 16. You don't need deep pockets to buy a 12 or 16 GB GPU; they're readily available and they are not expensive. And commerce also wants you to buy them - both publishers and camps red & green. You will get those killer apps you can't do without, and then you'll empty that wallet, or you've just switched off from gaming entirely and would have regardless.

Just because Nvidia has chosen a market strategy of progressive planned obsolescence over the last three generations of cards, and 'because they have the larger market share' (news flash: they don't, in x86 gaming as a whole), that does not mean they get to decide what the bottom line of mainstream happens to be. Just as they don't dictate when 'the industry adopts RT' - as we can clearly see. The industry is ready when it's ready, and when it moves, it moves. Despite what Nvidia loves to tell us, they're not dictating shit. Consoles dictate where gaming goes.

History repeats...

Here's PC gaming over the years. Pandemic, war, inflation, crypto... irrelevant! It is a growth market, almost as certain as the revenue of bread or milk. Even the 'pandemic effect' - major growth in 2020 - hasn't been corrected to this day; people started gaming and they keep doing it. Sorry about the weird thing over the title, I had to grab the screen fast or it would get covered in popups. Link below.


 
Joined
Feb 24, 2021
Messages
126 (0.10/day)
System Name Upgraded CyberpowerPC Ultra 5 Elite Gaming PC
Processor AMD Ryzen 7 5800X3D
Motherboard MSI B450M Pro-VDH Plus
Cooling Thermalright Peerless Assassin 120 SE
Memory CM4X8GD3000C16K4D (OC to CL14)
Video Card(s) XFX Speedster MERC RX 7800 XT
Storage TCSunbow X3 1TB, ADATA SU630 240GB, Seagate BarraCuda ST2000DM008 2TB
Display(s) AOC Agon AG241QX 1440p 144Hz
Case Cooler Master MasterBox MB520 (CyberpowerPC variant)
Power Supply 600W Cooler Master
Yeah, pretty sure the biggest package GDDR6 chips come in is 16 Gbit. Dunno where the OP pulled 32 Gbit from... AFAIK the only 32 Gbit ones are the GDDR6W chips Samsung made, but I'm not even aware of any consumer card using them.
GDDR6W is also 64-bit per package instead of 32-bit like normal GDDR6 and GDDR6X, so it doesn't increase the amount of VRAM you can install for a given bus width. All three types top out at 4 GB per 64 bits of bus width, unless you use a clamshell topology.
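
To put numbers on that, here's a minimal sketch under those assumptions (32-bit packages topping out at 16 Gbit / 2 GB; clamshell hangs two chips off each channel, doubling capacity at the same bus width). The 128-bit row is exactly the 4060 Ti situation: 8 GB normally, 16 GB in clamshell:

```python
# Max GDDR6/GDDR6X capacity for a given bus width, assuming the largest
# common package is 16 Gbit (2 GB) on a 32-bit interface. Clamshell
# mode puts two packages on each channel (16 bits per chip), doubling
# capacity without widening the bus.
CHANNEL_BITS = 32
MAX_GB_PER_PACKAGE = 2  # 16 Gbit

def max_vram_gb(bus_bits: int, clamshell: bool = False) -> int:
    channels = bus_bits // CHANNEL_BITS
    return channels * MAX_GB_PER_PACKAGE * (2 if clamshell else 1)

for bus in (128, 192, 256, 384):
    print(f"{bus:>3}-bit: {max_vram_gb(bus):>2} GB normal, "
          f"{max_vram_gb(bus, clamshell=True):>2} GB clamshell")
```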
 

64K

Joined
Mar 13, 2014
Messages
6,540 (1.71/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Contrary to popular belief, 80 fps at native settings isn't necessary for it to be considered playable :laugh:

I'm of the "60 fps or go home" school of thought, but a lot of people may be happy with 40-ish in games like that. FSR would sacrifice some image quality to keep it at the upper end of that range, closer to 60 really.

I want 60 FPS in shooters. I will be alright with a little less in other genres.
 
Joined
May 17, 2021
Messages
3,005 (2.50/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
I think we fully agree on those points, honestly.
But nobody is going out 'stupid on VRAM'; that is a figment of your imagination, and thát is what I'm calling out.
What will really happen to the PC market, though, is that the 8 GB baseline will slowly die off and get replaced with 12 or 16. You don't need deep pockets to buy a 12 or 16 GB GPU; they're readily available and they are not expensive. And commerce also wants you to buy them - both publishers and camps red & green. You will get those killer apps you can't do without, and then you'll empty that wallet, or you've just switched off from gaming entirely and would have regardless.

Just because Nvidia has chosen a market strategy of progressive planned obsolescence over the last three generations of cards, and 'because they have the larger market share' (news flash: they don't, in x86 gaming as a whole), that does not mean they get to decide what the bottom line of mainstream happens to be. Just as they don't dictate when 'the industry adopts RT' - as we can clearly see. The industry is ready when it's ready, and when it moves, it moves. Despite what Nvidia loves to tell us, they're not dictating shit. Consoles dictate where gaming goes.

History repeats...

Here's PC gaming over the years. Pandemic, war, inflation, crypto... irrelevant! It is a growth market, almost as certain as the revenue of bread or milk. Sorry about the weird thing over the title, I had to grab the screen fast or it would get covered in popups. Link below.


AMD is just about to release a new 8 GB card; this fixation on Nvidia is idiotic at this point, a case for your psychologist to diagnose.

All tech companies are down: sales are down, shares are down, profits are down, layoffs, production cuts, etc... Only in your dreams would you say things are going well.
 
Joined
Jun 14, 2020
Messages
3,275 (2.13/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
You are totally glossing over the fact that even high textures might go out of reach for low-VRAM GPUs while still being in reach for higher-VRAM GPUs. After all, if the fidelity of said texturing is SO incredible, you're talking about big chunks of data. Case in point: Hogwarts eats VRAM like candy.

The problem hasn't changed and won't change; your argument only exists because Nvidia is stingy on VRAM, not because AMD has 'lower RT perf' or because FSR differs from DLSS. The two are not the same thing and never will be. Especially if you consider that higher RT perf will also eat a chunk of VRAM: you might end up having to choose between those sweet, sweet high textures and RT, again because of VRAM constraints. Your comparison makes no sense. Another easy way to see that is by turning the argument around: what if high + DLSS actually looks worse than ultra? Still picking that 'better image quality' now... or are you actually left with no choice but to compromise after all?

And on top of that, you're fully reliant on Nvidia's DLSS support to get that supposed higher image quality in the odd title. If this isn't grasping at straws on your end, I don't know what is. You're comparing the quality of a per-game proprietary implementation with the presence of the hardware chips needed to make any kind of graphics happen. Da. Fuq. It takes AMD one copy/paste iteration of FSR to get that done. That's a driver update. Did you download additional RAM yet?
If high textures go out of reach on an 8 GB 3070 or 3060 Ti, then ultra will probably go out of reach for the 10 and 12 GB 6700 and 6700 XT as well. Up until now, that hasn't happened. I know, because I have a 3060 Ti; it works great in Hogwarts, for example, with textures at high.

My argument exists because Nvidia is stingy on VRAM and AMD is stingy on RT and upscaling. If high textures + DLSS looks worse than ultra textures + FSR, then obviously AMD cards are the better choice. That's why I want someone to test this first before concluding one way or another.

Until FSR matches DLSS, my point stands. After all, Ampere cards are already approaching the three-year mark. If the AMD user has to wait 4-5 years for his midrange card to match or beat the Nvidia card in image quality... then I'd call that a bad purchase.

Of course it comes down to preferences after all, but personally I can't stand FSR, because I dislike sharpening with a passion and FSR adds a ton of it. I don't use FSR even on my 6700S laptop, while I'm always using DLSS on my 4090 desktop, because it is that good.
 
Joined
Sep 17, 2014
Messages
21,816 (6.00/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
AMD is just about to release a new 8 GB card; this fixation on Nvidia is idiotic at this point, a case for your psychologist to diagnose.

All tech companies are down: sales are down, shares are down, profits are down, layoffs, production cuts, etc... Only in your dreams would you say things are going well.
Yes, AMD will too, and its price will make or break it; that is clearly the sentiment, and I agree with it. 8 GB is fine, it just depends where on the ladder it sits. And I reckon there is a good chance AMD will also price it too high. The fixation on 'Nvidia' is focused on pretty much the 4060 Ti and up.

As for things going 'good': there is a big gap between PC gaming dying and 'good'. What I'm saying is that despite all the crap we've endured, it remains a growth market - even despite all the shit you too describe. Tech down... gaming up. And that explains why I have confidence the baseline will be pushed up and you will be forced to move up from 8 GB. 8 GB has been the midrange standard since 2016. It's 2023.

Of course it comes down to preferences
Let's leave it there, because on that point we can indeed agree.
 
Joined
May 17, 2021
Messages
3,005 (2.50/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
Yes, AMD will too, and its price will make or break it; that is clearly the sentiment, and I agree with it. 8 GB is fine, it just depends where on the ladder it sits. And I reckon there is a good chance AMD will also price it too high. The fixation on 'Nvidia' is focused on pretty much the 4060 Ti and up.

As for things going 'good': there is a big gap between PC gaming dying and 'good'. What I'm saying is that despite all the crap we've endured, it remains a growth market - even despite all the shit you too describe. Tech down... gaming up. And that explains why I have confidence the baseline will be pushed up and you will be forced to move up from 8 GB. 8 GB has been the midrange standard since 2016. It's 2023.


Let's leave it there, because on that point we can indeed agree.

PC gaming is a lot of things: PC gaming is CS:GO, is PUBG, is Dota, is The Sims 4, is microtransactions, is a lot of things. I doubt this is a broad discussion when we are talking about the 8 GB VRAM issue. Your statement amounts to 'more VRAM is fine, people will just buy new GPUs', so coming in with a general PC gaming revenue chart is disingenuous at best. Specifically, and that's what matters: GPU sales are down, SSD sales are down, chip production is being cut, motherboard sales are down, etc., etc., etc.

People buy software; what remains to be seen is which software. If hardware sales are down, it's not mega-hyper-super AAA releases that demand 120 GB of VRAM, that's for sure.
 

64K

Joined
Mar 13, 2014
Messages
6,540 (1.71/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) Temporary MSI RTX 4070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Temporary Viewsonic 4K 60 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
As for things going 'good': there is a big gap between PC gaming dying and 'good'. What I'm saying is that despite all the crap we've endured, it remains a growth market

True. The doom-and-gloomers have been around for a long, long time saying PC gaming is dying. They have cut down on their predictions of doom on forums a good bit, but they are still out there. The fact is that PC gaming hardware growth has shrunk since the pandemic and mining boom, when growth in PC hardware sales (and prices) ballooned enormously, but it is still positive growth and the future looks bright.

PC gaming market figures from JPR.


 
Joined
Sep 17, 2014
Messages
21,816 (6.00/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
PC gaming is a lot of things: PC gaming is CS:GO, is PUBG, is Dota, is The Sims 4, is microtransactions, is a lot of things. I doubt this is a broad discussion when we are talking about the 8 GB VRAM issue. Your statement amounts to 'more VRAM is fine, people will just buy new GPUs', so coming in with a general PC gaming revenue chart is disingenuous at best. Specifically, and that's what matters: GPU sales are down, SSD sales are down, chip production is being cut, motherboard sales are down, etc., etc., etc.

People buy software; what remains to be seen is which software. If hardware sales are down, it's not mega-hyper-super AAA releases that demand 120 GB of VRAM, that's for sure.
That is a strong argument, indeed. People will simply not play things they can't run. Time will tell who blinks first: the consumer, or the industry push.

But consider for a moment that usually the very same people dó applaud the RT push, the new tech, etc. ;) They already blinked.
 