
NVIDIA RTX 4060 Ti 16GB Model Features 5W Higher TDP, Slightly Different ASIC Code

Joined
May 17, 2021
Messages
3,005 (2.43/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
That is a strong argument, indeed. People will simply not play things they can't run. Time will tell who blinks first, consumer or industry push.

But consider for a moment that usually the very same people do applaud the RT push, the new tech, etc. ;) They already blinked.

What will save them is that gamers will still buy a game if it's really good, because they want to play it, and they'll just play at low settings until they can upgrade, like everyone has done since PC gaming started. People play games on toasters, and that has never stopped anyone.

The shitty games won't sell and will blame the weather and whatnot. These days, that describes most AAA games.
 
Joined
Apr 21, 2005
Messages
174 (0.02/day)
His videos lately are just BS. He chooses settings that hog the 3070's VRAM, and then acts surprised it stutters like crazy. The actual question is: what is the image quality impact in those games if you drop textures from ultra to high? Not a lot, I'd imagine, which is why he isn't testing it. It wouldn't generate as many clicks as just pooping on Nvidia does.

Why isn't he testing, for example, AMD vs. Nvidia on the new path-traced Cyberpunk upgrade? I really wonder.

Given that price-comparable cards can run the games at the chosen settings without stuttering or really ugly texture swapping, it is an issue with the 3070. The fact that you need to downgrade the IQ on the 3070 versus price-comparable cards is not great.

This is only going to happen in more and more games as devs drop PS4 and Xbox One development and focus solely on PC, PS5 and Series X.
 

the54thvoid

Super Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,891 (2.38/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsung 960 Pro M.2 512GB
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure Power M12 850W Gold (ATX3.0)
Software W10
But consider for a moment that usually the very same people do applaud the RT push, the new tech, etc. ;) They already blinked.

I'm not taking that personally; I came from a long line of top-end NV cards, up to the awfully priced 2080ti (I had money come my way - so why not). But in that backstory, I bought a hardware G-Sync monitor (and it works well, especially in sub-60fps games like Control and CP2077). I'm tied into that ecosystem to a degree; otherwise, I'd probably have bought a 7900XT, although the same rules would apply -- no more expensive than £800. Also, I've another rule, just for the hell of it: I'll only move up a tier when I can get 50% extra perf for the same power budget. The 4070ti gave that over my 2080ti. The 7900XT didn't quite. Plus, my card's ultra silent.

It's not always about blinking first.
 
Joined
Apr 13, 2023
Messages
307 (0.57/day)
System Name Can it run Warhammer 3?
Processor 7800X3D @ 5Ghz
Motherboard Gigabyte B650 Aorus Elite AX
Cooling Enermax Liqmax III 360mm
Memory Teamgroup DDR5 CL30 6000Mhz 32GB
Video Card(s) Gigabyte 4090
Storage Silicon Power XS70, Corsair T700
Display(s) BenQ EX2710Q, BenQ EX270M
Case NZXT H7 Flow
Audio Device(s) AudioTechnica M50xBT
Power Supply SuperFlower Leadex III 850W
I don't think Nvidia cares much about whether or not you buy their new card. They are not putting much VRAM in, so as to stop pros from going for the mainstream models instead of the Quadros that sell for much more.

Now, with that said, someone could make the same argument about AMD: they are RT-performance starved to force you to upgrade. The thing is, anyone with a 3080 or a 3090 will feel that the 7900XTX is a sidegrade in some areas and a downgrade in others, which it is when it comes to losing DLSS and the RT performance.
If losing some features in only a handful of titles in exchange for a 24-32% performance uplift is a "sidegrade", that end user was never considering AMD anyway.
 
Joined
Apr 10, 2020
Messages
496 (0.30/day)
The 4070 Ti is a shitshow and the 4060 Ti will be even more so, due to an even narrower memory bus, so why even bother with 16 GB on a 128-bit bus dGPU?

I was misled by the 4070 Ti's review benchmarks. I pulled the trigger on it to replace my 3090 due to its favorable power consumption, and oh boy, what a mistake that has been. My primary use of these GPUs is VR gaming at around 3124 x 3056 px*2 resolution (the HP Reverb G2's 1SS resolution), and at this res the 4070 Ti's narrow bus comes into play, causing really bad micro stutters with huge ms spikes, making it borderline unusable for VR. True, the 4070 Ti's fps averages are slightly better than the 3090's, but the 3090 totally kills it in terms of smoothness, even when undervolted/underclocked to 1750MHz/750mV to get consumption down to 250W. The whole Ada gen, with the exception of the 4090, is an utter joke :banghead:
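A rough sketch of the bandwidth math behind that complaint: peak memory bandwidth is bus width (in bytes) times the per-pin data rate, and the widely published specs for these cards (quoted here for illustration, not measured by me) show how much the narrower buses give up.

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) * per-pin data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Published specs: RTX 3090 = 384-bit GDDR6X @ 19.5 Gbps,
# RTX 4070 Ti = 192-bit GDDR6X @ 21 Gbps, RTX 4060 Ti = 128-bit GDDR6 @ 18 Gbps
print(bandwidth_gb_s(384, 19.5))  # 936.0 GB/s
print(bandwidth_gb_s(192, 21.0))  # 504.0 GB/s
print(bandwidth_gb_s(128, 18.0))  # 288.0 GB/s
```

Ada's larger L2 cache offsets some of that gap at lower resolutions, but at VR resolutions like the G2's the working set spills past the cache and the raw numbers start to matter.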
 
Joined
Sep 17, 2014
Messages
22,050 (6.00/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
I'm not taking that personally; I came from a long line of top-end NV cards, up to the awfully priced 2080ti (I had money come my way - so why not). But in that backstory, I bought a hardware G-Sync monitor (and it works well, especially in sub-60fps games like Control and CP2077). I'm tied into that ecosystem to a degree; otherwise, I'd probably have bought a 7900XT, although the same rules would apply -- no more expensive than £800. Also, I've another rule, just for the hell of it: I'll only move up a tier when I can get 50% extra perf for the same power budget. The 4070ti gave that over my 2080ti. The 7900XT didn't quite. Plus, my card's ultra silent.

It's not always about blinking first.
I agree on the blinking. That 7900XT wasn't the 'optimal' choice to me either... just the most interesting, really. Could have waited even longer... just didn't want to.
 
Joined
Feb 20, 2019
Messages
8,044 (3.91/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Odyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
Considering that 80% or more of gamers don't have more than 8GB of VRAM, I would say they are just dumb for doing so. But these are the same geniuses who can't release a finished game to save their lives, so I guess it checks out.
More than half of all gamers are on current-gen consoles, where VRAM is 10GB+ and those 10GB+ are more efficiently allocated than on a desktop GPU with Windows and driver overheads. So no, it's not 80% of gamers. It's the 20% of the 80% of PC gamers who actually play AAA games on PC. As a percentage of the AAA gaming market, something approaching 90% of gamers have more than 8GB of VRAM.

It's about choosing the relevant statistic for the issue at hand, and the issue is that developers make games for the largest demographic - which is console-first and PC second. Of those 80% of machines in the Steam Hardware Survey that have 8GB or less, only a tiny fraction of them will be playing the latest AAA titles.
 
Joined
Oct 29, 2019
Messages
466 (0.26/day)
The 4070 Ti is a shitshow and the 4060 Ti will be even more so, due to an even narrower memory bus, so why even bother with 16 GB on a 128-bit bus dGPU?

I was misled by the 4070 Ti's review benchmarks. I pulled the trigger on it to replace my 3090 due to its favorable power consumption, and oh boy, what a mistake that has been. My primary use of these GPUs is VR gaming at around 3124 x 3056 px*2 resolution (the HP Reverb G2's 1SS resolution), and at this res the 4070 Ti's narrow bus comes into play, causing really bad micro stutters with huge ms spikes, making it borderline unusable for VR. True, the 4070 Ti's fps averages are slightly better than the 3090's, but the 3090 totally kills it in terms of smoothness, even when undervolted/underclocked to 1750MHz/750mV to get consumption down to 250W. The whole Ada gen, with the exception of the 4090, is an utter joke :banghead:
Yeah, I'm having that issue as well. My wife bought me a Quest 2 and I really wanted to play Flight Simulator in VR.

The Quest 2 doesn't like the AMD encoder, so I can't really go with them. On the Nvidia side, the 4070s are gimped, and lots of people on the Flight Simulator forum were complaining of the stutter you're experiencing.

Thanks to Nvidia artificially raising the price of the 4070 into the $600-650 range, I can't even really get a worthwhile price on a used 3080 (and I would need a worthwhile price if I'm going to gamble on a card that has a 50% chance of having been mined on).
 
Joined
Dec 25, 2020
Messages
6,270 (4.54/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
I want 60 FPS in shooters. I will be alright with a little less in other genres.

Yeah, I consider 60 to be the bare minimum myself, regardless of genre. 120 fps is nicer, but not a must, as long as I can maintain the settings and use tools like Special K to do some frametime magic and keep it smooth :)

The 4070 Ti is a shitshow and the 4060 Ti will be even more so, due to an even narrower memory bus, so why even bother with 16 GB on a 128-bit bus dGPU?

I was misled by the 4070 Ti's review benchmarks. I pulled the trigger on it to replace my 3090 due to its favorable power consumption, and oh boy, what a mistake that has been. My primary use of these GPUs is VR gaming at around 3124 x 3056 px*2 resolution (the HP Reverb G2's 1SS resolution), and at this res the 4070 Ti's narrow bus comes into play, causing really bad micro stutters with huge ms spikes, making it borderline unusable for VR. True, the 4070 Ti's fps averages are slightly better than the 3090's, but the 3090 totally kills it in terms of smoothness, even when undervolted/underclocked to 1750MHz/750mV to get consumption down to 250W. The whole Ada gen, with the exception of the 4090, is an utter joke :banghead:

Yep, that is exactly why I did not upgrade this generation. I'm unwilling to pay what they ask for the 4090, and no other GPU provides me with an upgrade worth my time. The 4080 isn't enough and the 7900 XTX has too many downsides for it to be worth it.

I'll be waiting for RDNA 4 and Blackwell; in the meantime, I upgraded my processor to something that will last a considerable amount of time, and I will finally be going after a super high-end display.
 
Joined
Jun 14, 2020
Messages
3,275 (2.08/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Given that price-comparable cards can run the games at the chosen settings without stuttering or really ugly texture swapping, it is an issue with the 3070. The fact that you need to downgrade the IQ on the 3070 versus price-comparable cards is not great.

This is only going to happen in more and more games as devs drop PS4 and Xbox One development and focus solely on PC, PS5 and Series X.
But you need to downgrade the IQ on the comparable card as well. A 6700 XT, for instance, doesn't max out Hogwarts Legacy, not because of VRAM but because of a lack of performance, both raster and RT. So at 1440p you have to activate FSR to get a decent framerate. My question is very simple: do FSR Quality + Ultra textures look better than the DLSS Quality + High textures you'll play at on a 3060 Ti / 3070? That's what I want to see tested before I conclude which card is better.
 