
NVIDIA GeForce RTX 4060 Ti 16 GB

Joined
Sep 17, 2014
Messages
21,881 (6.00/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
No.
AMD has nothing to replace these cards with. They are still in production and will be in production for a very long time.
You can undervolt the RX 6800 and approach the RTX 4060 Ti power consumption.
The 6800 XTs are definitely running thin over here in NL. A few are available at around 560 EUR at this time, and some are priced at double that for reasons unknown. It signifies they're EOL. The 6800 and 6700 are still better stocked, though.
 
Joined
Nov 26, 2021
Messages
1,517 (1.49/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
But wouldn't faster system RAM help with performance, somehow? I mean, DDR5-6000 provides 96 GB/s in dual-channel - fairly decent, I'd say.
It's 1/3 of what the actual bandwidth of the 4060 Ti is. They could use the system RAM to store textures in the distance, and dedicated VRAM for the closer proximity (if that's possible?).
I know graphics cards do this (taking system RAM), but I believe there's some unexplored potential here: differences should be more significant going from 51.2 GB/s to 96 GB/s.
No it won't as the GPU can't access that RAM at anywhere close to 96 GB/s. The 4060 Ti has 8 lanes of PCIe 4 so it won't be able to access system RAM at more than 15.75 GB/s. Real bandwidth will be lower than that.
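For anyone curious where those numbers come from, here's a rough back-of-the-envelope sketch (a minimal illustration; the 128b/130b factor is the standard PCIe 3.0+ line coding, and the function names are mine):

```python
# Peak theoretical bandwidth: dual-channel DDR5-6000 vs. a PCIe 4.0 x8 link.
# Real-world figures will be lower than both of these ceilings.

def ddr5_dual_channel_gbps(mt_per_s: int, channels: int = 2, bus_bits: int = 64) -> float:
    """Peak DRAM bandwidth in GB/s: transfers/s x bytes per transfer x channels."""
    return mt_per_s * 1e6 * (bus_bits / 8) * channels / 1e9

def pcie4_x_gbps(lanes: int) -> float:
    """PCIe 4.0: 16 GT/s per lane with 128b/130b encoding, ~1.97 GB/s per lane."""
    return lanes * 16e9 * (128 / 130) / 8 / 1e9

print(f"DDR5-6000 dual channel: {ddr5_dual_channel_gbps(6000):.1f} GB/s")  # 96.0
print(f"PCIe 4.0 x8 link:       {pcie4_x_gbps(8):.2f} GB/s")               # 15.75
```

So even before protocol overhead, the GPU's path to system RAM is roughly six times narrower than the RAM itself.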
 
Joined
Jan 14, 2019
Messages
11,279 (5.46/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2, 4 + 8 TB Seagate Barracuda 3.5"
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Windows 10 Pro
But wouldn't faster system RAM help with performance, somehow? I mean, DDR5-6000 provides 96 GB/s in dual-channel - fairly decent, I'd say.
It's 1/3 of what the actual bandwidth of the 4060 Ti is. They could use the system RAM to store textures in the distance, and dedicated VRAM for the closer proximity (if that's possible?).
I know graphics cards do this (taking system RAM), but I believe there's some unexplored potential here: differences should be more significant going from 51.2 GB/s to 96 GB/s.
Not when you're limited to PCI-e x8.

No it won't as the GPU can't access that RAM at anywhere close to 96 GB/s. The 4060 Ti has 8 lanes of PCIe 4 so it won't be able to access system RAM at more than 15.75 GB/s. Real bandwidth will be lower than that.
Oops, didn't see your comment before posting. Sorry. :ohwell:
 
Joined
Apr 9, 2013
Messages
280 (0.07/day)
Location
Chippenham, UK
System Name Hulk
Processor 7800X3D
Motherboard Asus ROG Strix X670E-F Gaming Wi-Fi
Cooling Custom water
Memory 32GB 3600 CL18
Video Card(s) 4090
Display(s) LG 42C2 + Gigabyte Aorus FI32U 32" 4k 120Hz IPS
Case Corsair 750D
Power Supply beQuiet Dark Power Pro 1200W
Mouse SteelSeries Rival 700
Keyboard Logitech G815 GL-Tactile
VR HMD Quest 2
I bet you've never tried it yourself
I have a 4090 & I've tried FG & I was also very underwhelmed. It was really interesting to try: when the base fps is low (the fps before FG slots in the generated frames), I could really feel the input lag, yet I was still seeing a pretty smooth image. To me it felt a bit like gaming on an old TV with high input lag - the image looks smooth but everything feels a bit like it's moving through jelly.
It's a nice option to have & I hope a version that's supported by all GPUs comes soon, but for me personally it's only useful when I've got a decent frame rate already anyway (100+ fps).
 

3x0

Joined
Oct 6, 2022
Messages
954 (1.35/day)
Processor AMD Ryzen 7 5800X3D
Motherboard MSI MPG B550I Gaming Edge Wi-Fi ITX
Cooling Scythe Fuma 2 rev. B Noctua NF-A12x25 Edition
Memory 2x16GiB G.Skill TridentZ DDR4 3200Mb/s CL14 F4-3200C14D-32GTZKW
Video Card(s) PowerColor Radeon RX7800 XT Hellhound 16GiB
Storage Western Digital Black SN850 WDS100T1X0E-00AFY0 1TiB, Western Digital Blue 3D WDS200T2B0A 2TiB
Display(s) Dell G2724D 27" IPS 1440P 165Hz, ASUS VG259QM 25” IPS 1080P 240Hz
Case Cooler Master NR200P ITX
Audio Device(s) Altec Lansing 220, HyperX Cloud II
Power Supply Corsair SF750 Platinum 750W SFX
Mouse Lamzu Atlantis Mini Wireless
Keyboard HyperX Alloy Origins Aqua
@W1zzard Sorry if it was asked before, but are there any notable differences in PCB complexity (number of layers etc.) between the 8GB and 16GB versions? I've read that PCB complexity is also a big factor in the price increase, according to some sources. Which is complete nonsense IMO, but I'd like to confirm.
 
Joined
Feb 24, 2023
Messages
2,483 (4.40/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / R9 380 2 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / Viewsonic VX3276-MHD-2
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / DQ550ST [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
I've read that the PCB complexity is also a big factor in the price increase
Yes, if we compare the PCBs of a 3090 vs. a 3060. In this case, the difference is about $5, if it exists at all.
 
Joined
Dec 10, 2022
Messages
484 (0.76/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
The 6800 XT is irrelevant to the matter at hand. It's a much higher-segment, previous-generation card built on a far larger and much more advanced GPU that draws twice as much power; of course both it and the RTX 3080 are going to clobber the 4060 Ti.
I don't know what logic you're trying to use but you're wrong. When two cards are $10 apart, they're relevant to each other. If someone has ~$500 to spend on a card, they're going to be looking at everything that's available at that price point. If power draw was such a big deal to people, then nobody would buy an Intel 13th-gen CPU.
The price analogy doesn't work very well because neither of those cards is manufactured any longer; availability relies on leftover stock that hasn't been sold yet. In essence, these cards don't matter. Stocks of any remaining new units are depleting fast.
Be that as it may, they're still around. When there aren't any left, then you'll be correct. At the moment, you are not.
I'll go out on a limb and say that the ones smoking moon rocks were those kvetching about Nvidia not putting enough VRAM on their GPUs, chiefly, an AMD camp complaint.
Does this look like AMD to you?
It sure doesn't look like AMD to me.
Nvidia's own lack of interest in this SKU makes it look like they just put this out to prove naysayers wrong.
Well, they failed in that endeavour because their sales numbers are in the toilet. The amount of faith that they and their AIBs have in the RTX 4060 Ti is so low that they didn't even sample it out or give it any fanfare. It's like when they silently released the RTX 3060 8GB.
Of course, convenient detail to hide @Vayra86's excellent point of "the GPU is only as strong as its weakest link", and this cutdown AD106 on 128-bit should never have been sold as anything other than an RTX 4050, but that's a problem this entire generation is facing.
I haven't seen the 128-bit bus preventing the RTX 4060 Ti from addressing and using all 16GB of its VRAM. What I have seen, though, is that it suffers bigger performance drops when the resolution is increased than cards with buses wider than 128-bit.
I just find it bizarre that this card has effectively the same fundamental design flaw as the RX 6500 XT: an overreliance on cache to make up for extremely anemic memory bandwidth. That is to say, both are low-end, power-efficient chips you'd otherwise find in a budget laptop owned by your average League of Legends player.
Oh don't kid yourself, I personally dumped all over the RX 6500 XT. Whoever it was at AMD who thought that the RX 6500 XT was a good idea should've been fired. However, it had its "moment in the crapper" long ago. Now it's the RTX 4060 Ti's turn. It's not about who did what, it's about who did what and when.

I actually bought an RX 6500 XT. Not because I thought it was any good, but because I thought the R9 Fury in my mother's HTPC was a waste of electricity. She doesn't need a 275W card to be a glorified video adapter, and Canada Computers happened to have a PowerColor Radeon RX 6500 XT ITX model on clearance for what was, at the time, the equivalent of ~$113USD. She doesn't need a hardware encoder, a 3D accelerator or more than 4GB of VRAM (she wouldn't know what to do with any of those anyway), so at that price, even the RX 6500 XT looks good. It was perfect for her.

If and when the price of the RTX 4060 Ti variants drop to what they should be ($300 for 8GB and $350 for 16GB), then they will be compelling products. As it stands right now, they are far from that.
 
Joined
Aug 21, 2015
Messages
1,698 (0.51/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marthon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
Whilst performance-wise you're more correct than not there are other things.

0. All Ada Lovelace products, especially the lower-tier ones, lack a reasonable amount of VRAM and, most importantly, bandwidth. This shifts their real tier one tier lower.
1. All Ada Lovelace products are more cut down than their predecessors. The 3060 sports the same share of Ampere's potential as the 4070 does of Ada's. But hey, the 4070 suddenly costs almost twice as much despite being nowhere near twice as fast!
2. All Ada Lovelace products launched at a time when almost nobody wants a GPU. There is little demand, and most remaining buyers are either picky, or broke, or both. There are almost no miners with their "please give us another million GPUs for whatever price" around as of now. They're gone. nVidia has to consider this fact as well.
3. If everyone submits to nVidia's pricing policy and pays what they ask, they will see no reason to improve. Now, seeing the worst demand ever, they are forced to think. The mining-fever hangover is still in the air, but it will fade away some day. And then nVidia will sober their pricing up, unless AMD and Intel fail to achieve anything. And they are currently achieving very little, with only two meaningful products available, namely the A770 from Intel and the RX 7900 XTX from AMD. The rest is uncompetitive. Or non-existent.
4. You can't take a GPU's naming seriously if it is not faster than its predecessor in every possible scenario. And yes, all 4060 Series GPUs do lose to their 3060 Series predecessors in certain tasks, which had never happened before.

Ya, Ada 60 should, or at least could, be better and/or cheaper. As mentioned, I can see the justification behind the argument for the 4060 being a 4050 ti (and/or 4060 ti being a 4060), and the memory bandwidth definitely seems to be hampering AD10[6,7]. When it comes to model naming, performance matters. Nvidia's not going to throw anyone a bone unless absolutely forced. New take (but probably not an original one) after mulling this whole thing over for another day: AD10[6,7] are bandwidth-constrained on purpose. A 192-bit AD60 family may have been too competitive with the 4070, and cannibalized those higher-margin sales. If AD70 is moving slower than NV would like, the above would make some sense. Beyond that, lots of buyers don't cross-shop brands, so NV might figure they can save a few bucks on the BOM without driving said buyers AMD's way.

Now for my argument against the 4060 ti being a 50-series card. The 4060 fits the mold in some ways (power, bus width), but the ti just doesn't. The sole thing it shares is the memory bus. It literally DOUBLES contemporary performance relative to pre-RTX x50 cards, and clobbers the 3050 (which, btw, performed on par with pre-RTX x60 parts) by a full 50%.

Model    | Price | Watts | VRAM | Bus width | Resolution | Avg FPS
550 ti   | 150   | 116   | 1GB  | 128       | 1680x1050  | 51
650 ti   | 150   | 110   | 1GB  | 128       | 1680x1050  | 50
750 ti   | 150   | 60    | 2GB  | 128       | 1600x900   | 58
950      | 160   | 90    | 2GB  | 128       | 1600x900   | 65
950      | 160   | 90    | 2GB  | 128       | 1080p      | 52
1050 ti  | 140   | 75    | 4GB  | 128       | 1080p      | 48
1650     | 150   | 75    | 4GB  | 128       | 1080p      | 55
3050     | 250   | 130   | 8GB  | 128       | 1080p      | 85
4060     | 300   | 115   | 8GB  | 128       | 1080p      | 101
4060 ti  | 400   | 160   | 8GB  | 128       | 1080p      | 120
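One way to read the 1080p rows of that table is dollars per average frame. This little sketch uses only the table's own numbers (launch prices and FPS as listed, not independent benchmarks):

```python
# Dollars per average 1080p frame, straight from the table above.
# A crude metric, but it shows pricing outpacing delivered performance.

cards = {  # model: (launch price USD, avg FPS at 1080p)
    "1050 ti": (140, 48),
    "1650":    (150, 55),
    "3050":    (250, 85),
    "4060":    (300, 101),
    "4060 ti": (400, 120),
}

for model, (price, fps) in cards.items():
    print(f"{model:8s} ${price / fps:.2f} per avg frame")
```

By this measure the 4060 ti is the worst value of the lot (~$3.33 per frame vs ~$2.73 for the 1650), even before any inflation adjustment.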

If and when the price of the RTX 4060 Ti variants drop to what they should be ($300 for 8GB and $350 for 16GB), then they will be compelling products. As it stands right now, they are far from that.

Nice to see that somebody gets it.
 
Joined
Mar 7, 2023
Messages
767 (1.39/day)
System Name BarnacleMan
Processor 14700KF
Motherboard Gigabyte B760 Aorus Elite Ax DDR5
Cooling ARCTIC Liquid Freezer II 240 + P12 Max Fans
Memory 32GB Kingston Fury Beast @ 6000 cl32 + Trefi 40,000
Video Card(s) Asus Tuf 4090 24GB
Storage 4TB sn850x, 2TB sn850x, 2TB Netac Nv7000 + 2TB p5 plus, 4TB MX500 * 2 = 18TB. Plus dvd burner.
Display(s) Dell 23.5" 1440P IPS panel
Case Lian Li LANCOOL II MESH Performance Mid-Tower
Audio Device(s) Logitech Z623
Power Supply Gigabyte 850w
Mouse Some piece of shit from China
Keyboard Some piece of shit from China
Software Yes Please?
I hope this is the very last generation of 8GB cards from both vendors.
Isn't the 4050 rumoured to have 6GB? That would make me think that maybe the 5050 would still have 8GB, or less. Though I suppose that's more excusable on a 50-series card, if it's priced accordingly.
 
Joined
Jan 20, 2019
Messages
1,430 (0.69/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
RTX 2060 6GB
RTX 2060 12GB (increased TMUs, increased cores)

RTX 3060 8GB
RTX 3060 12GB (increased bus)

RTX 3080 10GB
RTX 3080 12GB (increased bus, increased TMUs, increased cores)

RTX 4060 TI 8GB
RTX 4060 TI 16GB (nothing)

Yep, absolutely nothing - they just slapped on 16 gigs and called it a night. Reminds me of those funny-looking hybrid blokes in the gym. BIGUPPERBOD.littleskinnylegs - it just doesn't work together.

Anyway, you can dress her up however you want; it's still, and always will be, a praiseworthy x50-class cutback with a huge gap from Lovelace's best. Actually, forget captain halo: even the 4080 sees an enormous 80-115% increase in perf over the 4060 "ti" (that's crazy!). Both the 20 and 30 series carried the x60 and x80 subdivisions with a well-received 30-40% relative perf gap - what on earth were they thinking with the 40 series? To make matters tremendously worse, the awarded MSRP of $500 is just sinfully offensive.
 
A 192-bit AD60 family may have been too competitive with the 4070
No. The 4060 Ti's core is too weak to compete with the 4070. They would still be miles away from each other, just like the GTX 1080 vs. the GTX 1070, with the latter being up to 30 percent slower despite having the same 256-bit VRAM bus.
It literally DOUBLES contemporary performance relative to pre-RTX x50 cards
And it more than doubles their price, making its speed nothing too impressive. It's also almost twice as watt-hungry as an average xx50 card (the GTX 750 is either 60 or 75W, the 950 is 90W, the 1050 is 75W, the 1650 is 75W, whereas the 4060 Ti eats up to 160W).
clobbers the 3050
Not an achievement. Almost everything clobbers the 3050, with the 3070 Ti being the only Ampere GPU with a worse perf/(W*USD) ratio.
the 4060 ti being a 50-series card
It STILL technically is. It loses to the plain OG RTX 3060, the 12 GB one, in a couple of scenarios where 8 GB of VRAM just doesn't cut it and 12 GB allows for something despite the very slow GPU. Playing at 15 FPS is better than not playing at all, y'know.
and/or cheaper
The GTX 1070 from 2016 was priced at $380 at launch (which roughly translates to $460 in 2022 dollars, considering inflation). The only game that wasn't comfortable for this card at 1080p was Deus Ex: Mankind Divided; the others were playable at 60+ FPS. https://www.techpowerup.com/review/msi-gtx-1070-quick-silver-oc/12.html
The RTX 4060 series already has a handful of games that are COMPLETELY unplayable at 1080p at ultra settings. Do I need to remind you that this was not the case for the GTX 1060 ($250, or $305 in 2022 dollars), which was merely slow in some titles but never ran into slideshow and texture-quality shenanigans?
This was probably not the case for the GTX 1050 Ti either. Just a reminder: that was a $140 ($170 as of 2022) GPU. https://www.techpowerup.com/review/asus-gtx-1050-ti-strix-oc/12.html
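Those inflation figures can be roughly reproduced with a single flat multiplier (~1.21 for 2016 to 2022 - an approximation picked to land near the quoted numbers, not an official CPI series):

```python
# Reproducing the 2016 -> 2022 inflation adjustments quoted above with a
# flat ~21% multiplier (approximation, not official CPI data).

INFLATION_2016_TO_2022 = 1.21

for name, launch_usd in [("GTX 1070", 380), ("GTX 1060", 250), ("GTX 1050 Ti", 140)]:
    adjusted = launch_usd * INFLATION_2016_TO_2022
    print(f"{name}: ${launch_usd} in 2016 is roughly ${adjusted:.0f} in 2022")
```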

And now, with 10+ GB VRAM GPUs becoming more and more popular, game studios will definitely increase the VRAM tax in their future products. That means an utterly garbage state of affairs for the RTX 4060 series, with the sole survivor, the 4060 Ti 16 GB, being the only one able to actually play games from, say, 2026. But when you pay half a thousand dollars for a card, you expect it to run them, not snail them.
 
No. 4060 Ti's core is too weak to compete with 4070. They still would be miles away from each other, just like GTX 1080 VS GTX 1070 with the latter being up to 30 percent slower despite quite the same VRAM bus of 256 bits.

30% is significant, but not miles IMO. Let's assume for the moment that a better memory subsystem would have gotten it to within 20%, or at least alleviated the 1% and 0.1% lows. Nvidia's already asking buyers to spend $200 - an extra 50% - over the 4060 ti to get that 30%. If the performance delta were smaller, you don't think a portion of potential 4070 buyers would step down? Yeah, it gets you that magic 60 fps number at 4K, so if you bleed green and are set on 4K, AD60's not gonna do it for ya. For the rest of us, 115 vs 85 at 1440p doesn't seem like a good trade for those two Benjamins. At least not to me. (115 vs 85 is +26%, which I guess rounds up to 30% if you're going in tens.)

And more than doubles their price making its speed nothing all too impressive. It's also almost double as Watt-hungry compared to an average xx50 card (GTX 750 is either 60 or 75W, 950 is 90W, 1050 is 75W, 1650 is 75W, whereas 4060 Ti eats up to 160W).

We all know the price is out of whack. Re: power, 160W being too much for x50 is what I'm saying here.

Not an achievement. Almost everything clobbers the 3050; with 3070 Ti being the only Ampere GPU which is worse in Perf/W*USD ratio.

Because NV abandoned the midrange. The 3050 performed relative to contemporary titles at the same level that pre-RTX x60 cards did relative to theirs. Even on spec, it's very similar to the 960, which never got anything like the 3050's hate, then or now.

It STILL technically is. It loses to plain OG RTX 3060, the one of 12 GB, in a couple scenarios where 8 GB of VRAM just don't cut it and 12 GB allow for something despite very slow GPU. Playing at 15 FPS is better than not playing at all, y'know.

Gotta disagree there. I'm going to go out on a limb and say that if a game can't at least run - and ideally be configurable to run at ~30fps - on all current-gen cards at the game's release, that's on the game.

The GTX 1070 from 2016 was priced $380 at launch (which roughly translates to 2022's $460 considering inflation). So, the only game which wasn't comfortable for this card at 1080p was Deus Ex: Mankind Divided, and others were playable at 60+ FPS. https://www.techpowerup.com/review/msi-gtx-1070-quick-silver-oc/12.html
The RTX 4060 series do already have a handful of games which are COMPLETELY unplayable at 1080p at ultra settings. Do I need to remind this was not the case for the GTX 1060 ($250, or $305 of 2022) which was just slow in some titles but has not run into slideshow and texture quality shenanigans?

Pascal was really good. This is well known. It's also known that pricing is completely out of whack for Nvidia cards, so I'm not sure what you're getting at here.

What games are unplayable on AD60 at 1080p Ultra? Everything but two in the TPU suite hits at least 60 average on the 4060, and those don't miss by much.

This probably has not been the case for the GTX 1050 Ti as well. Just a reminder: that was a $140 ($170 as of 2022) GPU. https://www.techpowerup.com/review/asus-gtx-1050-ti-strix-oc/12.html

Yes, I know. Every x50 was in that range or less until Ampere.

And now, with 10+ GB VRAM GPUs getting more and more popularity, game studios will definitely increase the VRAM tax in their future products. Which means the utter garbage state of things for RTX 4060 series with the sole survivor by the name of 4060 Ti 16 GB being able to actually play games from, say, 2026. But when you pay half a thousand dollars for a card you expect it to run them, not to snail them.

Let me be very clear: I am not claiming AD60 is good, especially not at RRP. The ti def should have had 12. What I'm claiming is that it's not 50-series (the 4060 ti in particular), regardless of the number of bits on the bus. From Fermi to Turing, x50 cost significantly less than 200 bucks, pulled less than 120 watts, and provided playable performance at mainstream resolutions. The 4060 ti is $400 (should be no more than $300), draws 160W, and puts up a reasonable showing at 1440p. Not everything quite hits 60, but with every AAA title apparently trying to be the next Crysis, that's hardly surprising.

Actually forget captain halo, even the 4080 sees an enormous 80-115% increase in perf over the 4060 "ti" (thats crazy!). Both 20 and 30 series cards carried the x60 and x80 subdivisions on a well received 30-40% relative perf discrepancy - what on earth were they thinking with 40-series?

The 4080 beats the 4060 ti by 80-115% by pulling 100% more power. I'm not sure why this is surprising. There was the same gap between the 3060 and 3080: +95% performance for +90% watts. Turing's 60-to-80 delta was as you describe.
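That near-linear perf-for-power tradeoff is easy to sanity-check. This sketch just plugs in the percentages from the posts above (the figures are theirs, not fresh benchmarks):

```python
# Perf/W scaling check: ratio of perf-per-watt after a jump vs. before.
# A result near 1.0 means efficiency held flat across the product stack.

def perf_per_watt_scaling(perf_gain_pct: float, power_gain_pct: float) -> float:
    return (1 + perf_gain_pct / 100) / (1 + power_gain_pct / 100)

print(f"3060 -> 3080:    {perf_per_watt_scaling(95, 90):.2f}x perf/W")   # ~1.03x, flat
print(f"4060 Ti -> 4080: {perf_per_watt_scaling(80, 100):.2f}x perf/W")  # worst case
```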
 
Joined
Dec 25, 2020
Messages
6,010 (4.44/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Galax Stealth STL-03
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
30% is significant, but not miles IMO.

30% is more than the delta between the RTX 4080 and 4090. Despite the latter's much wider core and abundance of execution resources (roughly +70% resources for +20-25% performance), it didn't make sense to get one, even from an enthusiast's point of view. Ada's smoothest chips imo are AD104 and AD103. Neither is wasteful, and both come in well-balanced configurations, even if AD104 doesn't quite live up to the "RTX 4080" branding.

I don't know what logic you're trying to use but you're wrong. When two cards are $10 apart, they're relevant to each other. If someone has ~$500 to spend on a card, they're going to be looking at everything that's available at that price point. If power draw was such a big deal to people, then nobody would buy an Intel 13th-gen CPU.

Be that as it may, they're still around. When there aren't any left, then you'll be correct. At the moment, you are not.

Does this look like AMD to you?
It sure doesn't look like AMD to me.

Well, they failed in that endeavour because their sales numbers are in the toilet. The amount of faith that they and their AIBs have in the RTX 4060 Ti is so low that they didn't even sample it out or give it any fanfare. It's like when they silently released the RTX 3060 8GB.

I haven't seen the 128-bit bus preventin the RTX 4060 Ti from addressing and using all 16GB of its VRAM. What I have seen though is that it suffers bigger performance drops when resolution is increased than cards with buses that are greater than 128-bit.

Oh don't kid yourself, I personally dumped all over the RX 6500 XT. Whoever it was at AMD who thought that the RX 6500 XT was a good idea should've been fired. However, it had its "moment in the crapper" long ago. Now it's the RTX 4060 Ti's turn. It's not about who did what, it's about who did what and when.

I actually bought an RX 6500 XT. Not because I thought that it was any good but because I thought that an R9 Fury in my mother's HTPC was a waste of electricity. She doesn't need a 275W card to be a glorified video adapter and Canada Computers happened to have a Powercolor Radeon RX 6500 XT ITX model on clearance for what was at the time, the equivalent of ~$113USD. At that price, even the RX 6500 XT looks good and since she doesn't need a hardware encoder, a 3D accelerator or more than 4GB of VRAM (she wouldn't know what to do with any of those anyway), at that price, it was perfect for her.

If and when the price of the RTX 4060 Ti variants drop to what they should be ($300 for 8GB and $350 for 16GB), then they will be compelling products. As it stands right now, they are far from that.

1. The logic I'm applying is that you're dishonestly comparing a previous-generation product that's going for clearance prices to a current-generation product that is indubitably priced high, but exactly where the manufacturer intended it to be priced all along. Forgot the 6950 XT's original $1199 MSRP?

2. Good, buy the AMD card while you still can. I'm merely pointing out that this is the exception and not the norm; once RDNA 2 and Ampere stocks deplete, you will not be able to do this.

3. Nvidia GPUs outsell AMD ones 9 to 1, so it's irrelevant. Nvidia themselves have shown little interest in this 16 GB model; as I mentioned earlier, it's more of an SKU they've released to prove a point than anything else.

4. You could have linked the video where HUB tested the 8 GB 3070 against the 16 GB RTX A4000 to make a fairer point, as those cards share the exact same architecture and feature set. But as I've mentioned on the forum before, the situation between the 4060 Ti 8 GB and 4060 Ti 16 GB is exactly as described in that video: the 16 GB card is better when you need the memory... but with this anemic a configuration, it matters less than ever. It's just a bad card all around.

5. I never said you did or didn't dump on the 6500 XT, and my point wasn't to demean it as a product, just to say that Nvidia committed the same reckless cost-saving measure that cost that card the performance it could have had. The 64-bit bus in Navi 24 is the primary bottleneck even on such a small die.

6. Agreed that the prices are obnoxiously high. At $500 this GPU is simply not worth the price of admission, even if last-gen 6800 XTs and RTX 3080s weren't, for the time being, accessible at this price.
 
Nah, they didn't. They just moved it up by a few hundred bucks together with all the other tiers. :roll:

At least they released something there... AMD didn't yet :(
 
Joined
Apr 13, 2023
Messages
38 (0.07/day)
Especially if you consider that today this 7900 XT rivals a 4080 in raster performance for nearly half the price. Or better, since an AIB 4080 easily goes for above 1200. I bought it at 899 and I still can't say I'm unhappy. It's highly competitive, to say the very least. There's also definitely room now for a 7800 XT at around 600 with 16 GB and a 10-15% performance deficit. Even in RT, these two cards would do just fine comparatively. Without proprietary BS as a bonus: no game-ready driver or DLSS 3 updates required, the performance is just there out of the box. As it should be.

It's a no-brainer to me, tbh.
The 7900 XT is less than 4% behind the 4080 at 1440p and 8% behind at 4K. Both the XT and XTX got huge FPS boosts from recent drivers in FH5 and TLOU.



Since the 4060 Ti is set at $400/$500 and the 4070 at $600, AMD can easily take over these three lower-mid to mid-range tiers. They could rebrand the 6950 XT and 6800 XT as the 7800 XTX and 7800 XT, and the 6800 as the 7800, charging $600, $500 and $400 respectively. Of course with 16 GB, lower power draw and RDNA features. Do you think that's fair?
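If you want to sanity-check how those price tiers would stack up, cost-per-frame is trivial to compute yourself. A quick sketch (the prices and fps figures below are made-up placeholders for illustration, not benchmark data):

```python
# Rough cost-per-frame comparison for hypothetical price/performance tiers.
# All numbers are illustrative placeholders, NOT real review results.
cards = {
    "hypothetical $400 tier": (400, 60.0),   # (price in USD, average fps)
    "hypothetical $500 tier": (500, 72.0),
    "hypothetical $600 tier": (600, 90.0),
}

def dollars_per_frame(price, fps):
    """Lower is better: dollars paid per average frame per second."""
    return price / fps

for name, (price, fps) in cards.items():
    print(f"{name}: ${dollars_per_frame(price, fps):.2f}/fps")
```

With real review averages plugged in, the same three lines make it obvious whether a rebranded tier actually undercuts the 4060 Ti on value or just matches it.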
 
Last edited:
Joined
Feb 24, 2023
Messages
2,483 (4.40/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / R9 380 2 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / Viewsonic VX3276-MHD-2
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / Corsair CX650M / DQ550ST [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
AMD didn't yet
Because they are on their mission! The Very Important Mission of losing in every price segment as hard as possible. It's as if Nvidia bribed them with several times the US national debt to never be competitive.
30% is significant, but not miles
It's just enough to justify the fact that these cards don't belong to the same segment. And if you look at 1% lows, it's more than 30%. Giving the 4060 its 192-bit bus back would've shrunk this difference a bit, but the 4070 would still be significantly better despite being a de facto rebranded 4060... Ti, perhaps? Nah, that's paying too much respect. A rebranded 4060 it is.
If the performance delta were smaller, you don't think a portion of potential 4070 buyers would step down?
Yes, but this doesn't matter, because if nGreedia had put 1.5 times more RAM on the 4060s they'd still have made a whole lot of profit, simply because Ada dies are cheaper to produce than their "equivalent" Amperes while selling at historical maximums (MSRP-wise, not street-price-wise).
Because NV abandoned the midrange
Incorrect. They abandoned reasonable pricing and adopted misleading branding.
at least 60 average
It doesn't matter how many millions you have in your average framerate if the 1% low is a single-digit number. And by 2025, the number of games allergic to 8 GB cards will by no means be a single-digit one (pun intended).
What I'm claiming is that it's not, the 4060 ti in particular, 50-series
And you're incorrect. A real 4060 must beat the 3060 in EVERY POSSIBLE scenario. If there is even ONE case where the 4060 is slower than the 3060, it's a fail, and you (meaning Nvidia, not you personally) should've named it at least half a tier lower, namely 4050 Super or Ti. Or at least stopped pretending that $400 isn't twice the reasonable price.
Pascal was really good
Maxwell, if we don't count the GTX 960 (a really BAD product, all things considered), was even better, considering how far it came from Kepler.
 
Joined
Apr 13, 2023
Messages
38 (0.07/day)
I don't know what logic you're trying to use but you're wrong. When two cards are $10 apart, they're relevant to each other. If someone has ~$500 to spend on a card, they're going to be looking at everything that's available at that price point. If power draw was such a big deal to people, then nobody would buy an Intel 13th-gen CPU.

Be that as it may, they're still around. When there aren't any left, then you'll be correct. At the moment, you are not.

Does this look like AMD to you?
It sure doesn't look like AMD to me.

Well, they failed in that endeavour because their sales numbers are in the toilet. The amount of faith that they and their AIBs have in the RTX 4060 Ti is so low that they didn't even sample it out or give it any fanfare. It's like when they silently released the RTX 3060 8GB.

I haven't seen the 128-bit bus preventing the RTX 4060 Ti from addressing and using all 16GB of its VRAM. What I have seen, though, is that it suffers bigger performance drops when resolution is increased than cards with buses wider than 128-bit.

Oh don't kid yourself, I personally dumped all over the RX 6500 XT. Whoever it was at AMD who thought that the RX 6500 XT was a good idea should've been fired. However, it had its "moment in the crapper" long ago. Now it's the RTX 4060 Ti's turn. It's not about who did what, it's about who did what and when.

I actually bought an RX 6500 XT. Not because I thought that it was any good, but because I thought that an R9 Fury in my mother's HTPC was a waste of electricity. She doesn't need a 275W card to be a glorified video adapter, and Canada Computers happened to have a Powercolor Radeon RX 6500 XT ITX model on clearance for what was, at the time, the equivalent of ~$113 USD. At that price, even the RX 6500 XT looks good, and since she doesn't need a hardware encoder, a 3D accelerator or more than 4GB of VRAM (she wouldn't know what to do with any of those anyway), it was perfect for her.

If and when the price of the RTX 4060 Ti variants drop to what they should be ($300 for 8GB and $350 for 16GB), then they will be compelling products. As it stands right now, they are far from that.
I find it hilarious and annoying at the same time watching Steve make up excuse after excuse for how the nVidia GeForce RTX 3070 8GB didn't suck, even though his own data says the 6800 is the superior card. He also chose the 3070 over the 6800 in the day-one review because MuH rAy tRaCiNg.
 
Joined
Feb 20, 2019
Messages
7,914 (3.90/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
TPU states that it's so-so fine even for 4K. Yellow colour means depends on the settings and use cases.

I believe that the sweet spot for the potential RTX 4060 Ti owners is 1440p screens, but some enthusiasts can move to the better quality 2160p screens.
Personally I don't think people buying $500 graphics cards are gaming at 60Hz. If they are, then it's a 1440p card at best, and unsuitable for 4K IMO.

If you have a high-refresh display like most gamers do these days, it only matches the display's capabilities in older games and esports. Here's the kicker: if you're only playing older games and esports, you don't need a $500 graphics card.
 
Joined
Sep 17, 2014
Messages
21,881 (6.00/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
I don't know what logic you're trying to use but you're wrong. When two cards are $10 apart, they're relevant to each other. If someone has ~$500 to spend on a card, they're going to be looking at everything that's available at that price point. If power draw was such a big deal to people, then nobody would buy an Intel 13th-gen CPU.

Be that as it may, they're still around. When there aren't any left, then you'll be correct. At the moment, you are not.

Does this look like AMD to you?
It sure doesn't look like AMD to me.

Well, they failed in that endeavour because their sales numbers are in the toilet. The amount of faith that they and their AIBs have in the RTX 4060 Ti is so low that they didn't even sample it out or give it any fanfare. It's like when they silently released the RTX 3060 8GB.

I haven't seen the 128-bit bus preventing the RTX 4060 Ti from addressing and using all 16GB of its VRAM. What I have seen, though, is that it suffers bigger performance drops when resolution is increased than cards with buses wider than 128-bit.

Oh don't kid yourself, I personally dumped all over the RX 6500 XT. Whoever it was at AMD who thought that the RX 6500 XT was a good idea should've been fired. However, it had its "moment in the crapper" long ago. Now it's the RTX 4060 Ti's turn. It's not about who did what, it's about who did what and when.

I actually bought an RX 6500 XT. Not because I thought that it was any good, but because I thought that an R9 Fury in my mother's HTPC was a waste of electricity. She doesn't need a 275W card to be a glorified video adapter, and Canada Computers happened to have a Powercolor Radeon RX 6500 XT ITX model on clearance for what was, at the time, the equivalent of ~$113 USD. At that price, even the RX 6500 XT looks good, and since she doesn't need a hardware encoder, a 3D accelerator or more than 4GB of VRAM (she wouldn't know what to do with any of those anyway), it was perfect for her.

If and when the price of the RTX 4060 Ti variants drop to what they should be ($300 for 8GB and $350 for 16GB), then they will be compelling products. As it stands right now, they are far from that.
Cards need to be readily available to be relevant in a comparison, and the 6800 XTs are slowly running out. It's a while-stocks-last thing, and on top of that, local availability can be worse, making the price of such highly competitive cards skyrocket. I see them go over 800 EUR here for some AIB models, and a rare couple over 1200 (!!). Theory vs practice...

At least they released something there... AMD didn't yet :(
They really should be pushing that 7800 (XT) button just about now; the momentum is there, given how poorly 8GB cards are turning out. Even a 12GB 7700 would be marvellous... if they position it at 450, it will sell. But like @Beginner Micro Device stated... Always Mighty Delayed...
 
Last edited:
Joined
Jun 14, 2020
Messages
3,275 (2.11/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
It's quite obvious from the results that Nvidia's 8 GB cards are more comparable to AMD's 10 GB ones. Look at the 7600: in some cases it loses 90% of its performance moving from 1440p to 4K, while the 4060 8GB is doing fine.

More and more games will need lots of VRAM. More VRAM means more future-proofing. AMD chose to equip the RX 6800 with as much as 16 GB and it will pay off in the long run - fine wine.

View attachment 306188
View attachment 306187
What fine wine are you drinking? RDNA 2 is slow as heck due to RT being non-existent.

Especially if you consider today this 7900XT rivals a 4080 at raster perf for nearly half the price.
It doesn't. The XT is in fact closer to the 4070 Ti in raster performance than it is to the 4080. So, the 4070 Ti rivals the 7900 XT in raster.
 
Joined
Jan 20, 2019
Messages
1,430 (0.69/day)
Location
London, UK
System Name ❶ Oooh (2024) ❷ Aaaah (2021) ❸ Ahemm (2017)
Processor ❶ 5800X3D ❷ i7-9700K ❸ i7-7700K
Motherboard ❶ X570-F ❷ Z390-E ❸ Z270-E
Cooling ❶ ALFIII 360 ❷ X62 + X72 (GPU mod) ❸ X62
Memory ❶ 32-3600/16 ❷ 32-3200/16 ❸ 16-3200/16
Video Card(s) ❶ 3080 X Trio ❷ 2080TI (AIOmod) ❸ 1080TI
Storage ❶ NVME/SSD/HDD ❷ <SAME ❸ SSD/HDD
Display(s) ❶ 1440/165/IPS ❷ 1440/144/IPS ❸ 1080/144/IPS
Case ❶ BQ Silent 601 ❷ Cors 465X ❸ Frac Mesh C
Audio Device(s) ❶ HyperX C2 ❷ HyperX C2 ❸ Logi G432
Power Supply ❶ HX1200 Plat ❷ RM750X ❸ EVGA 650W G2
Mouse ❶ Logi G Pro ❷ Razer Bas V3 ❸ Logi G502
Keyboard ❶ Logi G915 TKL ❷ Anne P2 ❸ Logi G610
Software ❶ Win 11 ❷ 10 ❸ 10
Benchmark Scores I have wrestled bandwidths, Tussled with voltages, Handcuffed Overclocks, Thrown Gigahertz in Jail
The 4080 beats the 4060 Ti by 80-115% while pulling 100% more power. I'm not sure why this is surprising.

With this sort of context it gets uglier: e.g. if next gen the x60 vs x80 comparison saw a ~200% performance discrepancy alongside a 200% power variation, it wouldn't mean the x60 is a good fit.

Anyway, perf-per-watt is off-topic to what was being referenced. What was being suggested is that it's crazy seeing an "x60 (Ti)" cut down this massively, with an 80-115% performance gap to the 4080. Keep in mind that's without mentioning the wider 4090 delta, or the standard x60 (non-Ti) vs x80 with a 100-135% performance cutback. This sort of elongated performance disparity cannot be justified in an x60-class product, only an x50. Whether it's gimping on bandwidth, bus width, core count, memory, etc., sometimes it's sufficient to look at raw performance comparisons alone to conclude something is amiss... or out of character... or in this case, out of class!

There was the same gap between the 3060 and 3080: +95% performance for +90% watts.

You might want to check the x60 (Ti) 30-series benchmarks again.
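For what it's worth, the "+95% performance for +90% watts" comparison quoted above works out to near-identical efficiency, which is easy to verify with a one-liner:

```python
# Compare efficiency of two cards given the faster card's relative
# performance gain and power increase, both in percent.
def perf_per_watt_ratio(perf_gain_pct, power_gain_pct):
    """Ratio > 1 means the faster card is also the more efficient one;
    exactly 1 means both cards deliver the same performance per watt."""
    return (1 + perf_gain_pct / 100) / (1 + power_gain_pct / 100)

# 3080 vs 3060 per the quote: +95% performance for +90% power.
print(round(perf_per_watt_ratio(95, 90), 3))  # ~1.026, basically a wash
```

So by that metric the 3080 and 3060 were within ~3% of each other in efficiency, which is the point being debated: a big raw performance gap between tiers doesn't by itself prove the smaller card is mis-classed.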
 
Last edited:
Joined
Sep 17, 2014
Messages
21,881 (6.00/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
It's quite obvious from the results that Nvidia's 8 GB cards are more comparable to AMD's 10 GB ones. Look at the 7600: in some cases it loses 90% of its performance moving from 1440p to 4K, while the 4060 8GB is doing fine.


What fine wine are you drinking? RDNA 2 is slow as heck due to RT being non existent


It doesn't? The XT is in fact closer to the 4070ti in raster performance than it is to the 4080. So, the 4070ti rivals the 7900xt in raster.
EC87846E-FEF3-4E7D-892C-0035462A0EE4.png

Depends where you look ;)

As for RT.. sure, but old news and x60 is not really RT capable.
 
Joined
Jun 14, 2020
Messages
3,275 (2.11/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
View attachment 306431
Depends where you look ;)

As for RT.. sure, but old news and x60 is not really RT capable.
No, it doesn't depend on where you look; you just cherry-picked. I can find a game where the 4070 Ti is similar to or better than the XTX. That doesn't mean much. The XT certainly does not rival the 4080, not in raster or anything else for that matter.
 