
NVIDIA GeForce RTX 4060 Ti 16 GB

Joined
Dec 22, 2011
Messages
3,890 (0.82/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
I think a lack of VRAM is holding this card back; apparently a lack of VRAM holds all cards back.
 
Joined
Mar 21, 2016
Messages
2,508 (0.78/day)
I think a lack of VRAM is holding this card back; apparently a lack of VRAM holds all cards back.

64-bit bus, 288 GB/s "effective" bandwidth, 32 GB VRAM, here we come! RTX 5060 SUPER Ti... have they ever done a SUPER Ti, or is the marketing team sleeping on making it sound more impressive than it actually is, which is what they specialize in?
 
Joined
Aug 25, 2021
Messages
1,183 (0.97/day)
It does make me wonder, with the bigger titles at least, whether it is a matter of the games originating on consoles. Both the PS5 and Xbox Series X have 16 GB of memory, and both have at least 256-bit buses. The Xbox Series X actually splits its memory, with 10 GB on the faster portion of its 320-bit bus and 6 GB on a slower portion, and both consoles have much larger bandwidth than this card.
Major releases are primarily optimized for consoles, as those have more uniform and standardised hardware. Optimizations for PC hardware, with its diverse VRAM and performance tiers, come after that. Devs don't have the time and money to cover everything perfectly. The obvious victim, in an increasing number of cases, is 8 GB: not all textures load, and the game automatically looks much better on consoles.
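For context, here is a minimal sketch of where bandwidth figures like these come from, using the usual bus-width times data-rate formula; the memory clocks below are the commonly quoted spec-sheet values and should be treated as approximate:

```python
# Rough GPU memory bandwidth: bus width (bits) / 8 * effective data rate (Gbps)
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# Commonly quoted spec-sheet figures; treat as approximate.
parts = {
    "RTX 4060 Ti (128-bit, 18 Gbps GDDR6)":     bandwidth_gb_s(128, 18.0),  # ~288 GB/s
    "PS5 (256-bit, 14 Gbps GDDR6)":             bandwidth_gb_s(256, 14.0),  # ~448 GB/s
    "Xbox Series X, 10 GB fast pool (320-bit)": bandwidth_gb_s(320, 14.0),  # ~560 GB/s
}

for name, bw in parts.items():
    print(f"{name}: {bw:.0f} GB/s")
```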
 
Joined
Nov 4, 2005
Messages
12,014 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
Sooo, all the crap talking and it means nothing. Where dem Green bois at?
 
Joined
Feb 24, 2023
Messages
3,126 (4.68/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
>10 GB and >500 GB/s is where you must be for any new purchase or you are wasting time and precious money.
I don't think this statement is valid if you're investing $40 in an RX 480 or something like that. Yeah, this card sucks, but $40 is also a joke of a price. A good joke, unlike the 4060 Ti's.

I still have this GPU as my backup and it allows me to play, if not everything, then at least most games. Even demanding ones like Cyberpunk 2077 are more or less playable (with an asterisk, but still).

There are also games which don't need you to have a serious GPU. StarCraft 2, for example: it's very CPU-taxing, but even an HD 6950 is enough, and that GPU is 13 years old or so. Some gamers just ignore graphics settings and can live with everything on low if the framerate and frametimes are comfortable. CS:GO is a clear example of this approach.

But yeah, if you need ultimate 1080p performance you have to acquire something with a double-digit VRAM amount and at least 500 GB/s of VRAM bandwidth. RX 6800 / RTX 3080 / RTX 4070 as the minimum if we consider 1% lows below 60 FPS unacceptable.
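As a minimal sketch of that rule of thumb (the VRAM and bandwidth numbers are spec-sheet values, and the thresholds are simply the ones suggested above):

```python
# Rule of thumb from above: double-digit VRAM and roughly 500 GB/s of bandwidth
# for comfortable maxed-out 1080p. Spec-sheet numbers, approximate.
cards = [
    ("RTX 4060 Ti 16 GB", 16, 288),
    ("RTX 3080 10 GB",    10, 760),
    ("RTX 4070",          12, 504),
    ("RX 6800",           16, 512),
]

MIN_VRAM_GB = 10
MIN_BANDWIDTH_GB_S = 500

for name, vram_gb, bandwidth_gb_s in cards:
    ok = vram_gb >= MIN_VRAM_GB and bandwidth_gb_s >= MIN_BANDWIDTH_GB_S
    print(f"{name}: {'passes' if ok else 'fails'} the rule of thumb")
```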
 
Joined
Oct 18, 2017
Messages
189 (0.07/day)
System Name 1080p 144hz
Processor 7800X3D
Motherboard Asus X670E crosshair hero
Cooling Noctua NH-D15
Memory G.skill flare X5 2*16 GB DDR5 6000 Mhz CL30
Video Card(s) Nvidia RTX 4070 FE
Storage Western digital SN850 1 TB NVME
Display(s) Asus PG248Q
Case Phanteks P600S
Audio Device(s) Logitech pro X2 lightspeed
Power Supply EVGA 1200 P2
Mouse Logitech G PRO
Keyboard Logitech G710+
Software Windows 11 24H2
Benchmark Scores https://www.3dmark.com/sw/1143551
Thanks for the review, but you should include DLSS 3 frame generation FPS in your game benchmark graphs for the games which support it, like Cyberpunk.

Because who is going to buy a 40-series card and play these games with DLSS 3 and FG off? It doesn't make any sense. It's like buying an electric bike but only testing it without using the motor.

For people who are not familiar with the latest technologies and only look at graphs, it gives the false impression that a 3070 is better at running Cyberpunk than a 4060, while the 4060 crushes the 3070 with FG.

But at least this review shows, for all the people crying on the net about NVIDIA cards not having enough VRAM, that right now 16 GB and 8 GB on mid-range cards deliver the same performance. Of course more VRAM will become beneficial in the coming years, the same way a 6 GB 1060 is way better than a 3 GB 1060 nowadays, but it won't come to that for a few years.

In other words, for people who change their GPU every 2-3 years there is no reason to pay an extra $100 for the 16 GB 4060 Ti.
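For illustration only, here is a rough sketch of how frame-generated FPS relates to rendered FPS; the 0.85 efficiency factor and the example frame rates are assumptions, not measured numbers:

```python
# Illustrative only: DLSS 3 frame generation inserts one generated frame between
# rendered frames, so presented FPS can approach 2x the render rate, while input
# responsiveness still tracks the underlying render rate.
def frame_gen_estimate(render_fps: float, efficiency: float = 0.85):
    """Return (presented_fps, ms_per_rendered_frame); 'efficiency' is an assumed
    overhead factor, not a measured value."""
    presented_fps = render_fps * 2 * efficiency
    ms_per_rendered_frame = 1000.0 / render_fps
    return presented_fps, ms_per_rendered_frame

for fps in (45, 60):  # hypothetical render rates, not review numbers
    shown, frame_ms = frame_gen_estimate(fps)
    print(f"render {fps} fps -> ~{shown:.0f} fps presented, ~{frame_ms:.0f} ms per rendered frame")
```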
 
Last edited:
Joined
Feb 20, 2021
Messages
311 (0.22/day)
System Name Office,Home and Game PC
Processor Intel Core i5 12600k Up to 4.9 GHz
Motherboard Z690 Gaming X Gigabyte DDR4 Version
Cooling Fuma 2 Air Cooler
Memory 32GB DDR4 2x16 3600 MHz Patriot Viper Steel RAM
Video Card(s) NVIDIA GeForce GTX 1080 and RTX 3070
Storage 512 GB M2 PCI Ex 3.0 NVMe SX6000 Pro, 1TB NV2 Kingston M2 PCI Ex 4.0 and 4TB WD Blue SATA 3.0 HDD
Display(s) 27 inç 75 Hz LG
Case Cooler Master MB511
Audio Device(s) Creative 2+1
Power Supply 750W 80+ Bronze PSU High Power Element
Mouse Logitech Wireless
Keyboard Microsoft
VR HMD N/A
Software Windows 10-11
The test results show why NVIDIA isn't too keen on submitting this product for review. Except for a few poorly optimized games, the 16 GB model doesn't make a difference in performance.
Assuming the intended use of this card is Full HD, it won't matter for a long time.
In short, this GPU is a complete waste of sand.
 

bblzd

New Member
Joined
Mar 9, 2023
Messages
3 (0.00/day)
Portal RTX would be a good title to test and compare 8 GB vs 16 GB.

The problem is that 16 GB helps most at higher resolutions, but the 4060 Ti already struggles at resolutions above 1080p due to its narrow bus width. There are too many other bottlenecks before it hits the VRAM pool size limit.
 
Joined
Sep 15, 2011
Messages
6,762 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Question.
Is this THE WORST or the 2nd worst Video Card released by NVidia, price/performance wise?
 
Joined
Aug 21, 2015
Messages
1,752 (0.51/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marthon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
To anyone who thinks this should have been a 50-tier card... No. Even the 4060 (which I'll grant could be argued as a 4050 ti) holds up reasonably well against its predecessors. Yes, it costs too much, by at least USD50, and has too narrow a memory bus. But let's look at the last 9 generations of x60 cards.

I collated the results from TPU testing at the second-highest tested resolution. 960 is listed at two resolutions as the transition point to 1440p. The 560, 660, 960, 1060 and 1660 are all within 5 FPS of 60 and range from USD200-250 and 120-150W. The 760 was an outlier pre-Turing. It's also tied for hungriest here, as well as tied for most expensive until the 2060.

Note that with the 2060, price and power both took a big jump to achieve the accompanying performance increase. This was where Nvidia decided to redefine what 60-series, and all performance tiers, meant. If one simply looks at model numbers, yeah; the 4060 is a slap in the face relative to the 3060. But look at power. Gen-on-gen, performance of x60 cards pre-RT was pretty consistent relative to contemporary titles, as was power and price (again, 760 excepted). Anyone expecting the 4060 ti or even 4060 to be sold for USD200 is bordering on delusional. But at $250 and, say, $280? Whole different ball game. Downward price pressure is nearly always weaker than upward. Barring another crypto-style event, we could see "normal" 60-series P/P and pricing in a gen or two.

Model | Price | Watts | VRAM  | Bus width | Resolution | Avg FPS
560   | $200  | 150 W | 1 GB  | 256-bit   | 1920x1200  | 60
660   | $230  | 140 W | 2 GB  | 192-bit   | 1920x1200  | 63
760   | $250  | 170 W | 2 GB  | 256-bit   | 1920x1080  | 73
960   | $200  | 120 W | 2 GB  | 128-bit   | 1920x1080  | 65
960   | $200  | 120 W | 2 GB  | 128-bit   | 2560x1440  | 43
1060  | $250  | 120 W | 6 GB  | 192-bit   | 2560x1440  | 55
1660  | $220  | 120 W | 6 GB  | 192-bit   | 2560x1440  | 56
2060  | $350  | 160 W | 6 GB  | 192-bit   | 2560x1440  | 85
3060  | $330  | 170 W | 12 GB | 192-bit   | 2560x1440  | 85
4060  | $300  | 115 W | 8 GB  | 128-bit   | 2560x1440  | 70

I will beat this drum until people start listening: performance expectations are, in many cases, simply too high. As resolutions increase, rendering demand rises right along with them. x60 cards were meant, pre-Turing (and for Turing if you count the 1660), to deliver around 60 FPS at mainstream to upper-mainstream resolutions for 150 W / $250 or less. Then NVIDIA belched out RTX, and PC gamers got Stockholm syndrome and let NV nudge "mainstream" up the ladder. Hell, by the above numbers, the 3050 should have launched as a 60-series card: $250*, 130 W, 60 FPS at 1440p. Inflation adjusted, it's practically a dead ringer for the 960.

*Yes, I know the 3050 never actually listed for its "launch" price.
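Working from the table above, here is a small sketch of the per-dollar and per-watt arithmetic behind this argument (values copied from the table, so the resolution caveats apply):

```python
# (model, launch price USD, board power W, avg FPS at the resolution listed above)
# Values copied from the table; note the older cards are tested at lower resolutions,
# and the 960 row used here is the 1080p one.
cards = [
    ("560",  200, 150, 60), ("660",  230, 140, 63), ("760",  250, 170, 73),
    ("960",  200, 120, 65), ("1060", 250, 120, 55), ("1660", 220, 120, 56),
    ("2060", 350, 160, 85), ("3060", 330, 170, 85), ("4060", 300, 115, 70),
]

for model, price, watts, fps in cards:
    print(f"{model}: {fps / price * 100:.1f} FPS per $100, {fps / watts:.2f} FPS per watt")
```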

Question.
Is this THE WORST or the 2nd worst Video Card released by NVidia, price/performance wise?

4th worst at 1080p, 3rd worst above (sticking to Ada, anyway).

[attached chart]
 
Last edited:
Joined
Feb 24, 2023
Messages
3,126 (4.68/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Is this THE WORST or the 2nd worst Video Card released by NVidia, price/performance wise?
Laaaaaaadies and gentlemen, and all the other 65535 genders, welcome to our Ridiculous Crap Chart 2023!

Today's participants are the Ada Lovelace GPUs, and the question is: who gives the least bang per buck?

We are measuring their bangs in relative 4K gaming performance from the 16 GB 4060 Ti's TPU review, the one and only to this moment!
We are measuring their bucks in United States dollars. $20 is $20.

RTX 4060 is 60% the price and 78% the performance. Not bad considering how puny and cheeky this GPU looks.
RTX 4060 Ti 8 GB is 80% the price and 98% the performance. Record breaking efficiency here! Call in the cops!
RTX 4060 Ti 16 GB is 100% the price and 100% the performance. How come!
RTX 4070 is 120% the price and 139% the performance. Not like we didn't expect this but... Smells fishy.
RTX 4070 Ti is 160% the price and 169% the performance. This participant tried their best to disappoint us and our jury rated their attempts high, yet not high enough.
RTX 4080 is 240% the price and 213% the performance. That's how we laze!
RTX 4090 is 320% the price and 274% the performance. This participant has been disqualified for being unable to run at 100% of its potential due to the current state of CPU performance.

So our golden-tier underperformer is the RTX 4090, yet its award has to be revoked: it takes two to tango, and CPUs nowadays are too slow to make completely fair use of a 4090. Thus, the golden award is bestowed upon the RTX 4080, which has snatched failure from the jaws of victory. And the silver award, indeed, goes to the 4060 Ti 16 GB, the second-worst performer of this generation.

The bronze award should have gone to the RTX 4070 Ti but, all things considered, this GPU doesn't deserve the credit.
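For anyone who wants to re-run the award ceremony, here is a minimal sketch using the percentages quoted above (everything normalised to the 4060 Ti 16 GB):

```python
# Relative price and relative 4K performance, both normalised to the
# 4060 Ti 16 GB = 100%, exactly as quoted above.
lineup = {
    "RTX 4060":          (60,  78),
    "RTX 4060 Ti 8 GB":  (80,  98),
    "RTX 4060 Ti 16 GB": (100, 100),
    "RTX 4070":          (120, 139),
    "RTX 4070 Ti":       (160, 169),
    "RTX 4080":          (240, 213),
    "RTX 4090":          (320, 274),
}

# Sort worst value first: performance divided by price, where 1.00x = the 4060 Ti 16 GB.
for name, (price, perf) in sorted(lineup.items(), key=lambda kv: kv[1][1] / kv[1][0]):
    print(f"{name}: {perf / price:.2f}x the value of the 4060 Ti 16 GB")
```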
 
Joined
Oct 1, 2006
Messages
4,934 (0.74/day)
Location
Hong Kong
Processor Core i7-12700k
Motherboard Z690 Aero G D4
Cooling Custom loop water, 3x 420 Rad
Video Card(s) RX 7900 XTX Phantom Gaming
Storage Plextor M10P 2TB
Display(s) InnoCN 27M2V
Case Thermaltake Level 20 XT
Audio Device(s) Soundblaster AE-5 Plus
Power Supply FSP Aurum PT 1200W
Software Windows 11 Pro 64-bit
So much for games needing more than 8 GB VRAM. In 4K with RT on, perhaps.
It is not as simple as that; many games just straight up don't load some assets when they run out of VRAM, meaning the image might not be the same on both cards.
It also depends on the scene: often the game starts out smooth, but as VRAM fills up the 8 GB cards can start stuttering. So it also depends on how long the benchmark run is.
 
Last edited:

Outback Bronze

Super Moderator
Staff member
Joined
Aug 3, 2011
Messages
2,042 (0.42/day)
Location
Walkabout Creek
System Name Raptor Baked
Processor 14900k w.c.
Motherboard Z790 Hero
Cooling w.c.
Memory 48GB G.Skill 7200
Video Card(s) Zotac 4080 w.c.
Storage 2TB Kingston kc3k
Display(s) Samsung 34" G8
Case Corsair 460X
Audio Device(s) Onboard
Power Supply PCIe5 850w
Mouse Asus
Keyboard Corsair
Software Win 11
Benchmark Scores Cool n Quiet.
"The fact is that NVIDIA owns the GPU market"

Your statement sums up everything for me about the 40-series release, W1z.

Thanks for the purchase and review.
 
Joined
Feb 24, 2023
Messages
3,126 (4.68/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
To anyone who thinks this should have been a 50-tier card... No. Even the 4060 (which I'll grant could be argued as a 4050 ti) holds up reasonably well against its predecessors.
Whilst performance-wise you're more correct than not, there are other things to consider.

0. All Ada Lovelace products, and especially the lower-tier ones, lack a reasonable amount of VRAM and, most importantly, bandwidth. This shifts their real tier one step lower.
1. All Ada Lovelace products are more cut down than their predecessors. The 3060 carries the same share of Ampere's potential as the 4070 does of Ada's. But hey, the 4070 suddenly costs almost twice as much despite being nowhere near twice as fast!
2. All Ada Lovelace products launched at a time when almost nobody wants a GPU. There is little demand, and most would-be buyers are now either picky, or broke, or both. There are almost no miners around any more with their "please give us another million GPUs at whatever price" attitude. They're gone. NVIDIA has to consider this fact as well.
3. If everyone submits to NVIDIA's policy and pays whatever they ask, NVIDIA will see no reason to improve. Now, seeing the worst demand ever, they are forced to think. The mining-fever hangover is still in the air, but it will fade away some day. And then NVIDIA will sober up its pricing, unless AMD and Intel fail to achieve anything. And they are currently achieving very little, with only two meaningful products available, namely the A770 from Intel and the RX 7900 XTX from AMD. The rest is uncompetitive. Or non-existent.
4. You can't take a GPU's naming seriously if it is not faster than its predecessor in every possible scenario. And yes, all 4060-series GPUs lose to their 3060-series predecessors in certain tasks, which has never happened before.
 
Joined
Jan 14, 2019
Messages
12,584 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
... only two meaningful products available, namely the A770 from Intel and the RX 7900 XTX from AMD. The rest is uncompetitive. Or non-existent.
I agree with your points, except for this. RDNA 2 is still alive and kicking and deserving of attention. Both the 6600 and 6700 series are great value, even the 6800 and 6900 series are okay with the right discount.
 
Joined
Nov 16, 2020
Messages
43 (0.03/day)
Processor i7-11700F, undervolt 3.6GHz 0.96V
Motherboard ASUS TUF GAMING B560-PLUS WIFI
Cooling Cooler Master Hyper 212 Black Edition, 1x12cm case FAN
Memory 2x16GB DDR4 3200MHz CL16, Kingston FURY (KF432C16BBK2/32)
Video Card(s) GeForce RTX 2060 SUPER, ASUS DUAL O8G EVO V2, 70%, +120MHz core
Storage Crucial MX500 250GB, Crucial MX500 500GB, Seagate Barracuda 2.5" 2TB
Display(s) DELL P2417H
Case Fractal Design Focus G Black
Power Supply 550W, 80+ Gold, SilentiumPC Supremo M2 SPC140 rev 1.2
Mouse E-BLUE Silenz
Keyboard Genius KB-110X
This card has hidden value for plenty of gamers on a budget. They will see the value one or two years from now when the price drops. DLSS + frame gen + Reflex + the large L2 cache + 16 GB of VRAM is a deadly combo for future games, which will be better adapted to this generation. The combo was very well designed by NVIDIA.
It is good that the 16 GB of VRAM went to this model, because 60-class cards are used by gamers who keep their cards for many years, unlike 4070/4080 buyers who swap to new cards more often.

Another point gamers are overlooking is that we have plenty of options for managing FPS through game settings (even without frame gen), but only a few for managing VRAM utilization/allocation.

16 GB could be used immediately by any modern open-world game. CP77 is constantly reading game data from disk at an average speed of 30 MB/s on Ultra settings.
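For scale, a back-of-the-envelope sketch of what that streaming rate implies; the 30 MB/s figure is the one quoted above, the rest is plain arithmetic:

```python
# How long would streaming at the ~30 MB/s average quoted above take to move
# one full 16 GB VRAM pool's worth of data?
vram_gb = 16
stream_mb_s = 30

seconds = vram_gb * 1024 / stream_mb_s
print(f"~{seconds / 60:.0f} minutes to stream {vram_gb} GB at {stream_mb_s} MB/s")
# Roughly 9 minutes: disk streaming refreshes the pool slowly, so a larger pool
# mostly reduces how often assets have to be re-fetched mid-game.
```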

Graphics hardware is not responsible for the software we run on it; the software needs to be adapted to the hardware. The bad product simply does not exist, every product has its place.
All the comments I see on the web about this generation have almost zero value for me.
 
Joined
Feb 24, 2023
Messages
3,126 (4.68/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC / FULLRETARD
Processor i5-12400F / 10600KF / C2D E6750
Motherboard Gigabyte B760M DS3H / Z490 Vision D / P5GC-MX/1333
Cooling Laminar RM1 / Gammaxx 400 / 775 Box cooler
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333 / 3 GB DDR2-700
Video Card(s) RX 6700 XT / R9 380 2 GB / 9600 GT
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 / 500 GB HDD
Display(s) Compit HA2704 / MSi G2712 / non-existent
Case Matrexx 55 / Junkyard special / non-existent
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / non-existent
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 11 / 10 / 8
Both the 6600 and 6700 series are great value, even the 6800 and 6900 series are okay with the right discount.
No clue (asterisk) about pricing abroad, but in my particular piece of the globe the whole of RDNA2 makes zero sense to buy. The RX 6600 is absolutely destroyed by aftermarket 2080s, the RX 6600 XT is trashed by the 3060 12 GB, the 6700 does not exist, the 6700 XT is somehow priced like a 3070, which is far from passing the reality check, the 6800 is priced a bit lower than the 3080 (and y'know, the 3080 DESTROYS the 6800 in RT and doesn't let it relax in non-RT), the 6800 XT is priced like the 3080 (but do you remember the absence of RT performance and DLSS? I do), and the 6900s are priced so high it makes no sense to even consider them. I'm speaking about both BNIB and aftermarket cards. They're equally biased, and not in favour of AMD's perf/price ratio.

Half a year ago, yes, my purchase of a 6700 XT was questionable, yet somewhat reasonable. In today's market, I'd be ballin' on a used 3080, or perhaps a 6800 if my [Barter 95] knocked some sense into a vendor.

* just checked Newegg.
RX 6600 XT is more expensive than the RTX 3060 12 GB. Fail.
RX 6700 non-XT is the same price as the 4060/3060 Ti, yet it's noticeably slower than them, especially in RT-heavy scenarios. Fail.
RX 6700 XT is 10 bucks more expensive. Fair enough, but "great value" is a very ambitious way to describe it.
RX 6800 makes no sense since the XT version is only 50 bucks more expensive and provides significantly more pace.
The XT version, though, makes sense.
 
Joined
May 3, 2016
Messages
137 (0.04/day)
Maybe the upcoming Radeon RX 7800 XT and RX 7700 XT can change that.

All the leaks point to non-XT versions. We might see the XT ones once NVIDIA refreshes its line-up sometime next year.
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Thanks for the review, but you should include DLSS 3 frame generation FPS in your game benchmark graphs for the games which support it, like Cyberpunk.

Because who is going to buy a 40-series card and play these games with DLSS 3 and FG off? It doesn't make any sense. It's like buying an electric bike but only testing it without using the motor.

For people who are not familiar with the latest technologies and only look at graphs, it gives the false impression that a 3070 is better at running Cyberpunk than a 4060, while the 4060 crushes the 3070 with FG.

But at least this review shows, for all the people crying on the net about NVIDIA cards not having enough VRAM, that right now 16 GB and 8 GB on mid-range cards deliver the same performance. Of course more VRAM will become beneficial in the coming years, the same way a 6 GB 1060 is way better than a 3 GB 1060 nowadays, but it won't come to that for a few years.

In other words, for people who change their GPU every 2-3 years there is no reason to pay an extra $100 for the 16 GB 4060 Ti.
Wrong, DLSS 3 is not a universal feature, so no. It needs its own chart, much as a feature like Hairworks did way back. It's proprietary + game-specific + hardware-specific (Ada only), and its performance is even driver/patch/update-specific. It should have no effect on the overall averages.

"The fact is that NVIDIA owns the GPU market"

Your statement sums up everything for me about the 40-series release, W1z.

Thanks for the purchase and review.
GPUs in x86 consoles amount to an equal market share and drive most gaming forward. On top of that, PC-first releases are a rarity these days rather than commonplace. A good thing to keep in mind. This is also the main reason we see games ported to PC as badly as they are now, and why features like RT are still an afterthought more often than not, even though there are three generations of RT-capable hardware from this supposed GPU market 'owner'...
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,966 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Wrong, DLSS 3 is not a universal feature, so no. It needs its own chart, much as a feature like Hairworks did way back. It's proprietary + game-specific + hardware-specific (Ada only), and its performance is even driver/patch/update-specific. It should have no effect on the overall averages.
Yup .. with the new H2 2023 test platform I've added testing in 3 games to the DLSS page:

Didn't feel it was relevant for the 16 GB model, so I didn't test.

This will be included in all new releases going forward, but I have no plans to bring DLSS performance results into our "normal" testing.

This card has hidden value for plenty of gamers on a budget. They will see the value one or two years from now when the price drops
No doubt, if the price is right, this will be an interesting choice. /doubt on the value in a few years though; how's the 3080 12 GB doing these days? Seems to me most of the action on the used market is for the 10 GB model (I could be wrong though)
 
Joined
Jan 14, 2019
Messages
12,584 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
No clue (asterisk) about pricing abroad, but in my particular piece of the globe the whole of RDNA2 makes zero sense to buy. The RX 6600 is absolutely destroyed by aftermarket 2080s, the RX 6600 XT is trashed by the 3060 12 GB, the 6700 does not exist, the 6700 XT is somehow priced like a 3070, which is far from passing the reality check, the 6800 is priced a bit lower than the 3080 (and y'know, the 3080 DESTROYS the 6800 in RT and doesn't let it relax in non-RT), the 6800 XT is priced like the 3080 (but do you remember the absence of RT performance and DLSS? I do), and the 6900s are priced so high it makes no sense to even consider them. I'm speaking about both BNIB and aftermarket cards. They're equally biased, and not in favour of AMD's perf/price ratio.

Half a year ago, yes, my purchase of a 6700 XT was questionable, yet somewhat reasonable. In today's market, I'd be ballin' on a used 3080, or perhaps a 6800 if my [Barter 95] knocked some sense into a vendor.

* just checked Newegg.
RX 6600 XT is more expensive than the RTX 3060 12 GB. Fail.
RX 6700 non-XT is the same price as the 4060/3060 Ti, yet it's noticeably slower than them, especially in RT-heavy scenarios. Fail.
RX 6700 XT is 10 bucks more expensive. Fair enough, but "great value" is a very ambitious way to describe it.
RX 6800 makes no sense since the XT version is only 50 bucks more expensive and provides significantly more pace.
The XT version, though, makes sense.
At Scan UK, the 6650 XT is £100 cheaper than the 3060 - that's about a third off. The 6700 XT is the same price as the 3060. The 6800 is on par with the 3070. There is an XFX 6950 XT that is £50 cheaper than the 3080 10 GB right now. So yeah, I think AMD offers damn good value right now, at least here.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.61/day)
Location
Ex-usa | slava the trolls
It is a *50-tier card, rebadged to mislead customers into thinking it belongs to a higher tier. No.

Card       | Chip  | Die area
RTX 4060   | AD107 | 159 mm^2
RTX 3050   | GA106 | 276 mm^2
GTX 1650 S | TU116 | 284 mm^2
GTX 1650   | TU117 | 200 mm^2
GTX 1050   | GP107 | 132 mm^2
GTX 950    | GM206 | 228 mm^2
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I agree with your points, except for this. RDNA 2 is still alive and kicking and deserving of attention. Both the 6600 and 6700 series are great value, even the 6800 and 6900 series are okay with the right discount.
Sure, but having to watch out for PSU limitations, with RDNA3 and Ada to compete against, makes them a bit problematic imho.
 
Joined
Jan 14, 2019
Messages
12,584 (5.80/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
Sure, but having to watch out for PSU limitations, with RDNA3 and Ada to compete against, makes them a bit problematic imho.
What PSU limitations? A basic 500 W unit (not the cheapest kind) can drive a 6600 XT with any CPU.

If by "Ada", you mean the 4060, then sure, I agree. The rest are too expensive, imo.
 
Joined
Apr 1, 2009
Messages
60 (0.01/day)
To anyone who thinks this should have been a 50-tier card... No. Even the 4060 (which I'll grant could be argued as a 4050 ti) holds up reasonably well against its predecessors. Yes, it costs too much, by at least USD50, and has too narrow a memory bus. But let's look at the last 9 generations of x60 cards.

I collated the results from TPU testing at the second-highest tested resolution. 960 is listed at two resolutions as the transition point to 1440p. The 560, 660, 960, 1060 and 1660 are all within 5 FPS of 60 and range from USD200-250 and 120-150W. The 760 was an outlier pre-Turing. It's also tied for hungriest here, as well as tied for most expensive until the 2060.

Note that with the 2060, price and power both took a big jump to achieve the accompanying performance increase. This was where Nvidia decided to redefine what 60-series, and all performance tiers, meant. If one simply looks at model numbers, yeah; the 4060 is a slap in the face relative to the 3060. But look at power. Gen-on-gen, performance of x60 cards pre-RT was pretty consistent relative to contemporary titles, as was power and price (again, 760 excepted). Anyone expecting the 4060 ti or even 4060 to be sold for USD200 is bordering on delusional. But at $250 and, say, $280? Whole different ball game. Downward price pressure is nearly always weaker than upward. Barring another crypto-style event, we could see "normal" 60-series P/P and pricing in a gen or two.

Model | Price | Watts | VRAM  | Bus width | Resolution | Avg FPS
560   | $200  | 150 W | 1 GB  | 256-bit   | 1920x1200  | 60
660   | $230  | 140 W | 2 GB  | 192-bit   | 1920x1200  | 63
760   | $250  | 170 W | 2 GB  | 256-bit   | 1920x1080  | 73
960   | $200  | 120 W | 2 GB  | 128-bit   | 1920x1080  | 65
960   | $200  | 120 W | 2 GB  | 128-bit   | 2560x1440  | 43
1060  | $250  | 120 W | 6 GB  | 192-bit   | 2560x1440  | 55
1660  | $220  | 120 W | 6 GB  | 192-bit   | 2560x1440  | 56
2060  | $350  | 160 W | 6 GB  | 192-bit   | 2560x1440  | 85
3060  | $330  | 170 W | 12 GB | 192-bit   | 2560x1440  | 85
4060  | $300  | 115 W | 8 GB  | 128-bit   | 2560x1440  | 70

I will beat this drum until people start listening: performance expectations are, in many cases, simply too high. As resolutions increase, rendering demand rises right along with them. x60 cards were meant, pre-Turing (and for Turing if you count the 1660), to deliver around 60 FPS at mainstream to upper-mainstream resolutions for 150 W / $250 or less. Then NVIDIA belched out RTX, and PC gamers got Stockholm syndrome and let NV nudge "mainstream" up the ladder. Hell, by the above numbers, the 3050 should have launched as a 60-series card: $250*, 130 W, 60 FPS at 1440p. Inflation adjusted, it's practically a dead ringer for the 960.

*Yes, I know the 3050 never actually listed for its "launch" price.



4th worst at 1080p, 3rd worst above (sticking to Ada, anyway).

View attachment 306153

Major releases are primarily optimized for consoles, as those have more uniform and standardised hardware. Optimizations for PC hardware, with its diverse VRAM and performance tiers, come after that. Devs don't have the time and money to cover everything perfectly. The obvious victim, in an increasing number of cases, is 8 GB: not all textures load, and the game automatically looks much better on consoles.

The problem is that in the past you could always lower settings a bit and run at higher resolutions. The 4060 Ti is clearly, 100 percent, only a 1080p card. In fact, Hardware Unboxed took it further and said the 4060 Ti is only a 720p card because it can't load all textures in a lot of games at 1080p.

I don't think expectations are too high. We have had two console generations in a row that support 4K gaming in one form or another. A 4060 Ti doesn't get you 4K at all, and it costs up to over $500 for the 16 GB versions. Here is a metric for you: the first 1080p graphics card was the 8800 GTS, released in 2007. Sixteen years later, PC gamers are still expected to pay $500 or more to game at 1080p, just on the GPU side of things, while for $400 or $500 (with a disc drive) you can buy a PS5 or Xbox Series X that games at 4K.

Also remember that the 4070 Ti was originally supposed to be a 4080 model. That means the whole product stack was originally planned to sit one tier higher, and those plans included a $100 price increase for the '4080' that eventually became the 4070 Ti. By that logic, the 4070 was supposed to be the 4070 Ti, the 4060 Ti was supposed to be the 4070, and the 4060 was supposed to be the 4060 Ti.

That also means they meant to sell what is now the 4060 Ti at probably $600. That is a total joke.
 