
NVIDIA GeForce RTX 4060 Ti 16 GB

Joined
Dec 25, 2020
Messages
7,013 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
7900 GRE performance is fine for a $500-600 target price, and it ought to be more power-efficient on smaller Navi 32 silicon rather than on big Navi 31's worst yields with tons of dead die area.

10-20% faster than a 6800 XT at 10-20% lower power draw is a win. Pricing should (hopefully) be in line with the current market price of RDNA 2 - and it's no faster than the 6900 XT, which has been selling for sub-$600 for a long time now.
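To put a number on that claim, here's a quick perf-per-watt sketch using the midpoints of those ranges; the +15%/-15% figures are assumptions, not measured values:

```python
# Perf-per-watt estimate for "10-20% faster at 10-20% lower power draw".
# Baseline = RX 6800 XT; the midpoint figures below are assumptions.
perf_gain = 1.15   # assumed: 15% faster than a 6800 XT
rel_power = 0.85   # assumed: 15% lower power draw

efficiency_gain = perf_gain / rel_power
print(f"~{(efficiency_gain - 1) * 100:.0f}% better perf/W")  # prints "~35% better perf/W"
```

Even at the midpoints, the compounding of higher performance and lower power works out to roughly a third better efficiency, which is why the combination reads as a win.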

It's only competing against RDNA 2 and Ampere. It's not competing against the 4070/Ti in this segment, because this far into the enthusiast price range, features like DLSS 3 and RT performance are actually relevant. With ray tracing and DLSS 3, even comparing a 4070 to a 7900 XTX is basically no contest: upscaling is required, so DLSS is worth having over FSR. Frame generation is a bonus on top of that which AMD doesn't have at all, and Nvidia's base RT performance is significantly ahead of AMD's, so the end result is an unplayable 20-30 fps on one and a silky-smooth 80 fps on the other, at better image quality to boot...

10% faster than a 6800 XT (bearing in mind that the 6800 XT is a Navi 21 processor with 10% of its execution units disabled, 72 of 80 present), while offering practically no new features whatsoever, after 3 years is... a disaster, and that's being nice. It's no wonder that the 6950 XT beats it.
 
Joined
Dec 10, 2022
Messages
486 (0.65/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
1. The logic I'm applying is that you're dishonestly comparing a previous-generation product that's going for clearance prices to a current-generation product that is indubitably priced high, but priced where the manufacturer intended it all along. Have you forgotten the 6950 XT's original $1,199 MSRP?
I do remember and at the time, I crapped all over it. It wasn't worth anywhere close to that, especially since it was only slightly faster than my RX 6800 XT, which cost a lot less. The difference was that, at launch, AMD was very candid about how the RX 6900 XT (and by extension the 6950 XT) was not a card for anyone who wanted any kind of value. That's what the RX 6800 XT was for. As far as the generation is concerned, what difference does it make? A new card with high performance is a new card with high performance. It would be far more dishonest to ignore it than to talk about it. If you care about generation, then you're in a small minority, because the sales numbers of the RTX 4060 Ti vis-à-vis the sales numbers of the RX 6700 XT paint a picture that pretty much screams that nobody cares about what generation something is; they care about what it can do for what they have to pay, and I don't blame them for that. For gamers, especially ones that don't have more money than brains, price-to-performance is very important.

Since you brought up the RX 6950 XT, let's look at it and compare it to the "that which should not be compared because it's a newer-gen" RTX 4070 Ti.
XFX Radeon RX 6950 XT MERC319 OC 16GB - $630 at Newegg
PNY GeForce RTX 4070 Ti 12GB - $790 at Best Buy
Performance delta between the two cards: The RTX 4070 Ti is all of 4% faster than the RX 6950 XT and has 4 fewer gigabytes of VRAM (TPU GPU Database Numbers). I'm also comparing a factory OC XFX to what appears to be a Vanilla PNY so the performance difference is probably even less than that but I don't care, I'll still say that it's 4%.

Now, I don't know about you and I can only speak for myself in this situation but.... I would have a REALLY hard time spending an extra $160 (an extra 25%) for only 4% extra performance. To be fair, I'm not the least bit impressed with the current implementations of RT and I have little to no interest in using the fake frames of DLSS3 or FSR3 because I'm not buying software here, I'm buying hardware. There's also the fact that I don't care about upscaling because these are high-end cards and by the time they need upscaling, DLSS, FSR and XeSS will be completely different from what they are now just as they are now completely different from their iterations from three years ago. That's why I never understood people caring about what upscaler a card has when it's a high-end card. I mean, sure, DLSS can be a big deal but keep in mind that both XeSS and FSR also look good if they're all that you have.

Now sure, the RTX 4070 Ti probably uses way less electricity but if people actually cared about that, nobody would be buying Intel 13th-gen CPUs which use about twice as much juice as comparable Zen4 Ryzens, but they do, so they don't. Not only that, even in the UK, one of the most expensive places in the world for electricity right now, it would take about 6 years just to break even when it comes to power use and cost and all costs are less painful when amortised over long periods of time.
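The six-year break-even figure can be sanity-checked with a rough calculation. All the inputs below (price gap, power delta, daily usage, UK-level electricity price) are illustrative assumptions, not measurements:

```python
# Rough break-even time for a pricier but more power-efficient card.
# Every figure here is an assumed round number, not a measurement.
price_gap_usd = 160       # assumed price premium (e.g. 4070 Ti over 6950 XT)
power_gap_w = 50          # assumed average gaming power difference, in watts
hours_per_day = 4         # assumed daily gaming time
price_per_kwh_usd = 0.40  # assumed UK-level electricity price

kwh_saved_per_year = power_gap_w / 1000 * hours_per_day * 365
savings_per_year_usd = kwh_saved_per_year * price_per_kwh_usd
years_to_break_even = price_gap_usd / savings_per_year_usd
print(f"{years_to_break_even:.1f} years to break even")  # prints "5.5 years to break even"
```

Even with fairly generous assumptions about usage and electricity prices, the payback period lands in the same multi-year ballpark the post describes.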

This is why I don't care about what gen a card is (and from what I've seen, neither do most people) so calling that comparison dishonest is completely out of touch with what most people would consider simple reality. You can call it dishonest all you want, but believe it or not, it doesn't get more true just because someone keeps repeating it.
2. Good, buy the AMD card while you still can. I'm merely pointing out that this is the exception and not the norm; once RDNA 2 and Ampere stocks deplete, you will not be able to do this.
Absolutely. When the situation changes, my opinion will change to meet the new conditions. I'm only talking about right now.
3. Nvidia GPUs outsell AMD ones 9 to 1; it's irrelevant. Nvidia themselves have shown little interest in this 16 GB model; as I mentioned earlier, it's more of an SKU they've released to prove a point than anything else.
So... You're saying that they released something pointless to prove a point? :roll:
Ok... I'll bite... What point was nVidia trying to prove? That they could make a worse release than the 8GB RTX 4060 Ti? (I'll grant them that they totally succeeded at that!) ;)
4. You could have linked to the video where HUB tested the 8 GB 3070 v. the 16 GB RTX A4000 to make a fairer point, as those cards share the exact same architecture and feature set, but like I mentioned on the forum before, the situation between the 4060 Ti 8 GB and 4060 Ti 16 GB is exactly as described in the aforementioned video. The 16 GB card is better when you need the memory... but with this anemic a configuration, it matters less than ever. It's just a bad card all around.
I couldn't agree more. It's the worst version of the card that itself is likely the worst release in GPU history. I'd like to say that it can't get worse but I don't want to jinx anything.
5. I never said you didn't or did dump on the 6500 XT, and my point wasn't to demean it as a product, just to say that Nvidia committed the same reckless cost-saving measure that cost that card the performance it could have had; the 64-bit bus in Navi 24 is the primary bottleneck even on such a small die.
Well, I sure as hell have no problem demeaning the RX 6500 XT, because as a product, it's incomplete. It's like back in the day when Hyundais were awful. Sure, they were junk, but since they were dirt cheap, they still sold like crazy. The RX 6500 XT is a very niche product that doesn't have enough VRAM, has a PCIe 4.0 x4 connection and is about as potent as a GTX 980. If you want to sell a glorified video adapter, I don't have a problem with that. What I did have a problem with was AMD trying to market it as a gaming card and pricing it as such. I realise that they were just trying to get something out there, but they would've been better off with cut-down versions of the RX 6600.
6. Agreed that the prices are obnoxiously high; at $500 this GPU is simply not worth the price of admission, even if last-gen 6800 XTs and RTX 3080s weren't, for the time being, accessible at this price.
Yep. The fact that the RX 6800 XT is available at $500 (I haven't seen a new RTX 3080 in forever) only makes things even worse.

I find it hilarious and annoying at the same time watching Steve make up excuse after excuse for how the nVidia GeForce RTX 3070 8GB didn't suck, even though his own data says the 6800 is the superior card. He also chose the 3070 over the 6800 in his day-one review because MuH rAy tRaCiNg.
I know, and in some cases he dismissed the VRAM disparity because "We only review for today." which made me scratch my head and think "Where is the Steve Walton that I know and love from so many years ago?". :(

Cards need to be readily available for them to be relevant in comparisons; the 6800 XTs are slowly running out. It's a while-stocks-last thing, and on top of that, local availability can be worse, making the prices of such highly competitive cards skyrocket. I see them go for over 800 EUR here for some AIB models, and a rare couple over 1,200 (!!). Theory vs. practice...
When the stock runs out, then they won't be readily available and I'll be totally in agreement with you. Right now though... They are an obstacle to both companies and I'm glad that they are because it gives consumers a better option than to buy the crap that they're putting in front of us and it's forcing the prices down. Years from now, we might be thanking RDNA2 for having blocked nVidia and AMD from just charging whatever they felt like.
They really should be pushing that 7800 (XT) button just about now; the momentum is there, given how poorly 8GB cards are turning out. Even a 12GB 7700 would be marvellous... if they position it at $450, it will sell. But like @Beginner Micro Device stated... Always Mighty Delayed...
That RX 7800 XT is going to be a pointless product because if the RX 7900 GRE is comparable to the RX 6800 XT, then the RX 7800 XT is going to be inferior to the RX 6800 XT and AMD's going to have to sell it for no more than $450. Even if they do that, they'll still have lots of egg on their faces for releasing a card that was inferior to its predecessor.

It's like I said, if you thought that the review for the RTX 4060 Ti 8GB was the worst that you've ever seen (and it certainly was the worst that I can remember), just wait, because the 16GB review will be even worse.

Sure enough...

I swear man, the ones in charge over at nVidia must be smoking moon rocks because putting this card up against the RX 6800 XT is asking for a Romulan Bloodbath (green blood will flow).
You know, I think that Daniel Owen has been reading my "Romulan Bloodbath" posts because...

This is why I don't understand anyone who calls the RX 6800 XT "irrelevant" to the RTX 4060 Ti 16GB, because it's obviously relevant while it's still out there! :laugh:
 

Myfists

New Member
Joined
Aug 1, 2023
Messages
1 (0.00/day)
I think you and most reviewers are failing to see the point of the 16 GB of VRAM. This is not a gaming GPU; I think this is Nvidia's entry-level generative AI GPU. With 16 GB of VRAM you get to run up to a 13B-parameter LLM, and you'll have no VRAM issues while using Stable Diffusion; even the larger SDXL models will work fine. Generative AI is about VRAM capacity, not VRAM speed, because you need to load the entire model into the available VRAM. Your next bet for a generative AI GPU after this is the much pricier 4080, or even the absurd 4090.

Not saying it's good value or a wise decision, but I think there is reason behind the madness...
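The VRAM argument is easy to sanity-check: just holding an LLM's weights takes roughly parameter count times bytes per parameter. A minimal back-of-the-envelope sketch (the quantization levels shown are common ones; real usage adds KV cache, activations and framework overhead on top):

```python
# Approximate VRAM needed to hold just an LLM's weights.
# Ignores KV cache, activations and framework overhead, so real
# usage is higher; figures are back-of-the-envelope only.

def weights_vram_gb(params_billion: float, bytes_per_param: float) -> float:
    """GiB of VRAM for the model weights alone."""
    return params_billion * 1e9 * bytes_per_param / 1024**3

for label, bpp in [("fp16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)]:
    print(f"13B @ {label}: {weights_vram_gb(13, bpp):.1f} GB")
```

At fp16 a 13B model (~24 GB of weights) overflows the card, but an 8-bit quantized copy (~12 GB) fits in 16 GB with room to spare, which lines up with the poster's claim.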
 
Joined
Aug 5, 2019
Messages
153 (0.08/day)
System Name Locutus TT P90 open air case
Processor Intel I7 12700K
Motherboard Asus Z690 WIFI D4
Cooling NZXT 280 mm AIO
Memory 32 Gig Corsair Vengeance 3600 DDR4
Video Card(s) Zotac 3080 Holo LHR
Storage 3 various branded SSDs and 6 TB Seagate HD
Display(s) LG 32'' g-sync 144 Hz VA IPS
Case P90 open air case
Power Supply EVGA G2 1 KW
Mouse Logitech G900
Keyboard Corsair Strafe
Software Win 11 Pro
The review doesn't paint the whole picture. Either the chip itself is so slow that it cannot make use of more than 8 GB, or Nvidia cheats and automatically lowers image quality to fit within the available VRAM buffer.

There is a difference if you know where to look.

View attachment 306114
Boring; only a 6 FPS difference here. If I were to build two identical rigs and game on them, you couldn't tell me which one is the higher-FPS rig. VRR makes this even more difficult to judge. And what is the sigma deviation/range here?

NV marketing has a bunch of peeps brainwashed.
 
Joined
Apr 13, 2023
Messages
38 (0.06/day)
You know, I think that Daniel Owen has been reading my "Romulan Bloodbath" posts because...

This is why I don't understand anyone who calls the RX 6800 XT "irrelevant" to the RTX 4060 Ti 16GB, because it's obviously relevant while it's still out there! :laugh:
That guy harps on RT too much and productivity for a gaming channel and now he's gotten big with over 100K subs. He's just like HUB picking RT over VRAM and raw performance.
 
Joined
Nov 16, 2020
Messages
43 (0.03/day)
Processor i7-11700F, undervolt 3.6GHz 0.96V
Motherboard ASUS TUF GAMING B560-PLUS WIFI
Cooling Cooler Master Hyper 212 Black Edition, 1x12cm case FAN
Memory 2x16GB DDR4 3200MHz CL16, Kingston FURY (KF432C16BBK2/32)
Video Card(s) GeForce RTX 2060 SUPER, ASUS DUAL O8G EVO V2, 70%, +120MHz core
Storage Crucial MX500 250GB, Crucial MX500 500GB, Seagate Barracuda 2.5" 2TB
Display(s) DELL P2417H
Case Fractal Design Focus G Black
Power Supply 550W, 80+ Gold, SilentiumPC Supremo M2 SPC140 rev 1.2
Mouse E-BLUE Silenz
Keyboard Genius KB-110X
This is not a gaming GPU; I think this is Nvidia's entry-level generative AI GPU.
It is still a gaming GPU, with great potential to play upcoming games, which will be more VRAM-hungry and will use frame gen routinely.
Frame gen plus an additional 8 GB is a perfect combo, even on a 128-bit bus. This is simply a card for the masses still playing at 1080p. The price is adequate.
People and reviewers made their decision too soon.
It's also perfect for non-gaming industries, as you mentioned.
 
Joined
Dec 25, 2020
Messages
7,013 (4.81/day)
Location
São Paulo, Brazil
That RX 7800 XT is going to be a pointless product because if the RX 7900 GRE is comparable to the RX 6800 XT, then the RX 7800 XT is going to be inferior to the RX 6800 XT and AMD's going to have to sell it for no more than $450. Even if they do that, they'll still have lots of egg on their faces for releasing a card that was inferior to its predecessor.

What amazes me is that you actually reached the conclusion I was pushing you towards, yet you still felt a compulsive need to justify it for AMD. Romulan bloodbath, eh?
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
That RX 7800 XT is going to be a pointless product because if the RX 7900 GRE is comparable to the RX 6800 XT, then the RX 7800 XT is going to be inferior to the RX 6800 XT and AMD's going to have to sell it for no more than $450. Even if they do that, they'll still have lots of egg on their faces for releasing a card that was inferior to its predecessor.
The GRE is not comparable to the 6800 XT; why would it be? It's noticeably faster even if it just matches a 4070 Ti, and the first results show it to be faster than a 4070, which is about equal to a 6800 XT.

There is 35% of performance there to play with; that's more than a full tier of GPU (which I consider to be a 25-30% gap). It also won't require a larger-than-life PSU to fight transient spikes.
 
Joined
Nov 26, 2021
Messages
1,705 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
It is still a gaming GPU, with great potential to play upcoming games, which will be more VRAM-hungry and will use frame gen routinely.
Frame gen plus an additional 8 GB is a perfect combo, even on a 128-bit bus. This is simply a card for the masses still playing at 1080p. The price is adequate.
People and reviewers made their decision too soon.
It's also perfect for non-gaming industries, as you mentioned.
You could make an argument for AI, but for gaming, you would be better off going with a 4070 or an RDNA2/RDNA3 equivalent.
 
Joined
Aug 21, 2015
Messages
1,752 (0.51/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marthon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
You could make an argument for AI, but for gaming, you would be better off going with a 4070 or an RDNA2/RDNA3 equivalent.

Except the 4070 is two hundred dollars more, a full 50% more expensive, for a single step up the ladder. And it's not even close to 50% faster. If the 4060 Ti is truly as poor a value as everyone's saying, it's weird to recommend a card whose P/P ratio is even worse.
 
Joined
Nov 26, 2021
Messages
1,705 (1.52/day)
Location
Mississauga, Canada
Except the 4070 is two hundred dollars more, a full 50% more expensive for a single step up the ladder. And it's not even close to 50% faster. If the 4060 Ti is truly as poor a value as everyone says, it's weird to recommend a card whose price-to-performance ratio is even worse.
The poster I replied to was praising the 4060 Ti 16 GB, not its cheaper sibling. That card is only $100 less than the 4070, and the gulf in performance between them is far more than 20%. In fact, two variants of the 4070 are available for $590 right now, and others have sold for a bit less than that recently.

 
Last edited:
Joined
Nov 16, 2020
Messages
43 (0.03/day)
Processor i7-11700F, undervolt 3.6GHz 0.96V
Motherboard ASUS TUF GAMING B560-PLUS WIFI
Cooling Cooler Master Hyper 212 Black Edition, 1x12cm case FAN
Memory 2x16GB DDR4 3200MHz CL16, Kingston FURY (KF432C16BBK2/32)
Video Card(s) GeForce RTX 2060 SUPER, ASUS DUAL O8G EVO V2, 70%, +120MHz core
Storage Crucial MX500 250GB, Crucial MX500 500GB, Seagate Barracuda 2.5" 2TB
Display(s) DELL P2417H
Case Fractal Design Focus G Black
Power Supply 550W, 80+ Gold, SilentiumPC Supremo M2 SPC140 rev 1.2
Mouse E-BLUE Silenz
Keyboard Genius KB-110X
You could make an argument for AI, but for gaming, you would be better off going with a 4070 or a RDNA2/RDNA3 equivalent.
A 4070 or an RDNA2/RDNA3 equivalent is a good choice for people who plan to sell it in 2-3 years, or simply put, who change cards more often.
But for a long-term strategy (6+ years), 16GB and frame gen support are the better pick at the price we now see on the 4060 Ti 16GB.

The 4060 Ti can easily make up its ~30% performance deficit against the 4070 by adjusting game settings. To cut 4GB of VRAM consumption (16 -> 12), you need to sacrifice far more settings (especially in new games 6 years from now).
My prediction is that the existence of a 16GB version in the low-end segment will change how developers set minimum VRAM recommendations in the future. Games are already big now; there is no problem allocating even 24GB of VRAM in today's games, because games (mostly open worlds) are constantly loading data from the SSD, e.g. CP77 at ~30 MB/s.
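For scale, the arithmetic behind that streaming point can be sketched quickly. The 30 MB/s figure is the one quoted for CP77; everything else is illustrative:

```python
def refill_time_s(vram_gb: float, stream_mb_s: float) -> float:
    """Seconds needed to stream in a full VRAM pool's worth of assets."""
    return vram_gb * 1024 / stream_mb_s

# At ~30 MB/s of background streaming, turning over a 16 GB pool takes
# roughly nine minutes, so a large pool behaves like a long-lived asset
# cache rather than something constantly rewritten from the SSD.
print(f"{refill_time_s(16, 30):.0f} s")
```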
 
Joined
Nov 26, 2021
Messages
1,705 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
A 4070 or an RDNA2/RDNA3 equivalent is a good choice for people who plan to sell it in 2-3 years, or simply put, who change cards more often.
But for a long-term strategy (6+ years), 16GB and frame gen support are the better pick at the price we now see on the 4060 Ti 16GB.

The 4060 Ti can easily make up its ~30% performance deficit against the 4070 by adjusting game settings. To cut 4GB of VRAM consumption (16 -> 12), you need to sacrifice far more settings (especially in new games 6 years from now).
My prediction is that the existence of a 16GB version in the low-end segment will change how developers set minimum VRAM recommendations in the future. Games are already big now; there is no problem allocating even 24GB of VRAM in today's games, because games (mostly open worlds) are constantly loading data from the SSD, e.g. CP77 at ~30 MB/s.
The gap between the 4060 Ti and the 4070 is bigger than the gap between the 7900 XTX and the 4090. Even with adjusted settings, it would still be slower than a 4070 running those same settings. I reiterate: for gaming, this SKU makes no sense. The only use case is AI models that hunger for VRAM.

I agree with one point. With time, games will use more VRAM, but given that both AMD and Nvidia have opted to equip their mass-market GPUs with only 8 GB, the time for 16 GB isn't here yet.
 
Joined
Nov 16, 2020
Messages
43 (0.03/day)
Processor i7-11700F, undervolt 3.6GHz 0.96V
Motherboard ASUS TUF GAMING B560-PLUS WIFI
Cooling Cooler Master Hyper 212 Black Edition, 1x12cm case FAN
Memory 2x16GB DDR4 3200MHz CL16, Kingston FURY (KF432C16BBK2/32)
Video Card(s) GeForce RTX 2060 SUPER, ASUS DUAL O8G EVO V2, 70%, +120MHz core
Storage Crucial MX500 250GB, Crucial MX500 500GB, Seagate Barracuda 2.5" 2TB
Display(s) DELL P2417H
Case Fractal Design Focus G Black
Power Supply 550W, 80+ Gold, SilentiumPC Supremo M2 SPC140 rev 1.2
Mouse E-BLUE Silenz
Keyboard Genius KB-110X
The gap between the 4060 Ti and the 4070 is bigger than the gap between the 7900 XTX and the 4090. Even with adjusted settings, it would still be slower than a 4070 running those same settings. I reiterate: for gaming, this SKU makes no sense. The only use case is AI models that hunger for VRAM.

I agree with one point. With time, games will use more VRAM, but given that both AMD and Nvidia have opted to equip their mass-market GPUs with only 8 GB, the time for 16 GB isn't here yet.
Why are you comparing that gap with another gap (7900 XTX/4090)?
 
Joined
Nov 26, 2021
Messages
1,705 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
To show how big the gap is. It's also bigger than the gap between the 4060 and the 4060 Ti.
Comparing gaps the way you did does not make sense.

The point is that in the long or very long term (x060 gamers), the 4070 with 12GB will be useless. It is better to grab a slower card with larger VRAM. You do not have many options to decrease VRAM usage in games, but you have tons of options to increase fps (core performance).
 
Joined
Nov 26, 2021
Messages
1,705 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Comparing gaps the way you did does not make sense.

The point is that in the long or very long term (x060 gamers), the 4070 with 12GB will be useless. It is better to grab a slower card with larger VRAM. You do not have many options to decrease VRAM usage in games, but you have tons of options to increase fps (core performance).
I don't understand what's nonsensical about showing that the 4060 Ti 16 GB is the worst gaming GPU by the fps-per-$ metric. If you're a gamer worried about VRAM, then buy a 7800 next month. I agree that lowering VRAM usage isn't easy, and the setting that lowers it most is also the one that harms image quality the most: texture quality.
 
Joined
Nov 16, 2020
Messages
43 (0.03/day)
Processor i7-11700F, undervolt 3.6GHz 0.96V
Motherboard ASUS TUF GAMING B560-PLUS WIFI
Cooling Cooler Master Hyper 212 Black Edition, 1x12cm case FAN
Memory 2x16GB DDR4 3200MHz CL16, Kingston FURY (KF432C16BBK2/32)
Video Card(s) GeForce RTX 2060 SUPER, ASUS DUAL O8G EVO V2, 70%, +120MHz core
Storage Crucial MX500 250GB, Crucial MX500 500GB, Seagate Barracuda 2.5" 2TB
Display(s) DELL P2417H
Case Fractal Design Focus G Black
Power Supply 550W, 80+ Gold, SilentiumPC Supremo M2 SPC140 rev 1.2
Mouse E-BLUE Silenz
Keyboard Genius KB-110X
I don't understand what's nonsensical about showing that the 4060 Ti 16 GB is the worst gaming GPU by the fps-per-$ metric. If you're a gamer worried about VRAM, then buy a 7800 next month. I agree that lowering VRAM usage isn't easy, and the setting that lowers it most is also the one that harms image quality the most: texture quality.
The fps-per-$ metric gives us a false illusion of product value. It has a very narrow field of view in the decision process. It mostly reflects software performance, not hardware performance, features, and benefits. It can work for short-term scenarios and specific players' needs.
The 4060 Ti 16GB is a valuable product as a long-term strategy at that price. The short-term fps/$ metric can't point to that value.

The 7800 is out of the running: still no frame gen. Again, a false illusion for future games (it's not only about VRAM; everything counts if you're making a complex decision).
 
Joined
Dec 10, 2022
Messages
486 (0.65/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
I think you and most reviewers are failing to see the point of the 16 GB of VRAM. This is not a gaming GPU; I think this is Nvidia's entry-level generative AI GPU. With 16 GB of VRAM you get to run up to a 13B-parameter LLM, and you'll have no VRAM issues using Stable Diffusion; even the larger SDXL models will work fine. Generative AI is about VRAM capacity, not VRAM speed, because you need to load the entire model into VRAM. Your next option for a generative AI GPU after this is the much pricier 4080 or even the absurd 4090.

Not saying it's good value or a wise decision, but I think there is reason behind the madness...
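A rough back-of-envelope check of that 13B claim, assuming the weights dominate VRAM use and adding ~20% overhead for KV cache and activations (the overhead factor is a guess, not a measured value):

```python
def model_vram_gb(params_billion: float, bytes_per_param: float,
                  overhead: float = 1.2) -> float:
    """Estimated VRAM (GiB) to hold model weights plus ~20% overhead.

    Illustrative arithmetic only; real usage varies by runtime and
    context length.
    """
    return params_billion * 1e9 * bytes_per_param * overhead / 1024**3

print(f"13B @ fp16:  {model_vram_gb(13, 2.0):.1f} GiB")  # ~29 GiB, over 16 GB
print(f"13B @ int8:  {model_vram_gb(13, 1.0):.1f} GiB")  # ~14.5 GiB, just fits
print(f"13B @ 4-bit: {model_vram_gb(13, 0.5):.1f} GiB")  # ~7.3 GiB, easy fit
```

So the claim holds for quantized 13B models; an unquantized fp16 13B model would still spill out of 16 GB.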
I kinda think that the reason behind the madness is that Jensen knows that there are a lot of people who will buy a card in a green box no matter how good or bad the value is. I'm pretty sure that the reason behind all of nVidia's madness is profit, pure and simple.

It makes no sense for the 4060 Ti to have 16GB of VRAM while the 4070 and 4070 Ti, two cards that would be a lot more useful for AI, only have 12.

That guy harps too much on RT and productivity for a gaming channel, and now he's gotten big with over 100K subs. He's just like HUB, picking RT over VRAM and raw performance.
You're right, he does, but at the same time, he had no qualms about showing everyone the impossible position that the RTX 4060 Ti 16GB was forced into by nVidia. He showed that he has much bigger cojones than HUB because they somehow managed to "neglect" to do this comparison. There's no excuse for that because the comparison was immediately obvious to anyone with more than two brain cells to rub together and HUB isn't that stupid.

I'm sure that there's another reason why they didn't do that comparison. Based on Daniel Owen's results, we can be sure that the only ones who wouldn't be pleased with such a comparison would be nVidia. Now, I don't know for certain why HUB didn't do that ridiculously obvious comparison, but when one does the math...

What amazes me is that you actually reached the conclusion I was pushing you towards yet you still felt a compulsive need to justify it for AMD. Romulan bloodbath, eh
What are you talking about? You weren't pushing me anywhere and I didn't justify a damn thing for AMD. I predicted a bloodbath and since nVidia is green, I ironically called it a "Romulan Bloodbath".

You're really starting to make me wonder about you. :kookoo:

The GRE is not comparable to the 6800XT, why would it be? Its noticeably faster even if it just matches a 4070ti, and first results show it to be faster than a 4070 which is about equal to a 6800XT.

There is 35% perf there to play with, that's more than a full tier of GPU (which I consider to be a 25-30% gap). It also won't require a larger than life PSU to fight transient spikes.
That's not exactly what I meant. What I meant was that there's only a 20% performance difference between it and the RX 6800 XT. Where does that leave the RX 7800 XT? People are going to expect more than a 10% performance uplift between the 7800 and 7900 cards.

Even if the RX 7800 XT is 10% faster than the RX 6800 XT, they're going to have to price it pretty low (like $450-$500) if it's going to sell, because the market is currently saturated around the $500-$600 price point. There just isn't enough of a performance delta for the RX 7800 XT to have a proper place in the market because the RX 6950 XT still exists.

AMD knew this all too well, which is why they released the RX 7600 first. They were hoping that the RX 6800 XT and RX 6950 XT would sell out before they were forced to release the RX 7800 XT, but that hasn't happened. If it is 10% faster than the RX 6800 XT, then it would be compelling at $500 but not so much at $550, because it would be 10% more expensive for 10% more performance. That's not exactly something to get excited about, especially if the RX 6800 XT's price gets cut even further. At $450, it would be the de facto market leader and the most praised card of this generation regardless of what colour box it comes in.

Of course, that would require foresight and AMD has demonstrated the exact opposite of that for this entire generation so far. :kookoo:

AMD will shoot themselves in the foot yet again. I'd like to say that it's stupidity but they're not stupid people. This whole situation was caused by an anti-consumer play by AMD that they failed to get away with and I'm really glad that they didn't. AMD is going to suffer pretty badly this generation and it's 100% their fault. I'm just going to sit back and laugh at them as they receive exactly what they deserve.

If the RX 7800 XT releases at $450, then I'll be the first to happily take back everything that I said. I would love to be wrong here but I wouldn't even bet a dime on it.
 
Joined
Dec 25, 2020
Messages
7,013 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
What are you talking about? You weren't pushing me anywhere and I didn't justify a damn thing for AMD. I predicted a bloodbath and since nVidia is green, I ironically called it a "Romulan Bloodbath".

You're really starting to make me wonder about you. :kookoo:

For a bloodbath (or at the very least, a much needed price war) to occur, first AMD needs to have a better product, and then they need to sell that product to the masses and earn the mindshare of the people.

Currently, they do not meet any of these requirements. But don't worry, AMD's been granted another generational reprieve, the AI nuts are gonna buy up all the Radeon stock that gamers won't, so maybe RDNA 4 will be a better AI accelerator if not a better gaming card. They'll persist. :)

 
Joined
Nov 26, 2021
Messages
1,705 (1.52/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
For a bloodbath (or at the very least, a much needed price war) to occur, first AMD needs to have a better product, and then they need to sell that product to the masses and earn the mindshare of the people.

Currently, they do not meet any of these requirements. But don't worry, AMD's been granted another generational reprieve, the AI nuts are gonna buy up all the Radeon stock that gamers won't, so maybe RDNA 4 will be a better AI accelerator if not a better gaming card. They'll persist. :)

That piece about one AI-focused company buying up Radeons could be troublesome if it actually turns into a trend.
 
Joined
Dec 25, 2020
Messages
7,013 (4.81/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Audio Device(s) Apple USB-C + Sony MDR-V7 headphones
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard IBM Model M type 1391405 (distribución española)
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
That piece about one AI-focused company buying up Radeons could be troublesome if it actually turns into a trend.

The AI trend is only starting, and it may grow larger than the crypto one ever did. Once Ethereum went proof-of-stake, GPU mining practically died overnight. Prices still haven't fully normalized, especially at the budget end, which is flooded with expensive overstock purchased by distributors during the price highs, and with low-quality recycled hardware and mining castoffs. This has also helped keep the performance segment (4060, 4060 Ti, RX 7600) and the budget segment (RX 6400, RX 6500 XT, GTX 1630 and 1650, Arc A380) at basically twice the price they should currently be at. GPUs from the Arc A750 and up are already priced at near-normal (pre-mining, pre-pandemic) levels; the markup is usually no more than 25-30%.

It was one of the things I took into consideration when I decided to sell my 3090 and upgrade. I couldn't bear having it for another full generation with no hope of upgrading due to even crazier prices, and were I a betting man, I would say that this will happen within the year.
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
The fps-per-$ metric gives us a false illusion of product value. It has a very narrow field of view in the decision process. It mostly reflects software performance, not hardware performance, features, and benefits. It can work for short-term scenarios and specific players' needs.
The 4060 Ti 16GB is a valuable product as a long-term strategy at that price. The short-term fps/$ metric can't point to that value.

The 7800 is out of the running: still no frame gen. Again, a false illusion for future games (it's not only about VRAM; everything counts if you're making a complex decision).
No, the 16GB 4060 Ti still has far too little bandwidth to properly use the large framebuffer in most games. This product is handicapped well beyond just VRAM capacity. Compare it to the 4070 and you know its 12GB is a much better offer even at a somewhat higher price. Still not optimal, but certainly not as poor as this 4060 Ti, which is a total dud.
 
Joined
Nov 16, 2020
Messages
43 (0.03/day)
Processor i7-11700F, undervolt 3.6GHz 0.96V
Motherboard ASUS TUF GAMING B560-PLUS WIFI
Cooling Cooler Master Hyper 212 Black Edition, 1x12cm case FAN
Memory 2x16GB DDR4 3200MHz CL16, Kingston FURY (KF432C16BBK2/32)
Video Card(s) GeForce RTX 2060 SUPER, ASUS DUAL O8G EVO V2, 70%, +120MHz core
Storage Crucial MX500 250GB, Crucial MX500 500GB, Seagate Barracuda 2.5" 2TB
Display(s) DELL P2417H
Case Fractal Design Focus G Black
Power Supply 550W, 80+ Gold, SilentiumPC Supremo M2 SPC140 rev 1.2
Mouse E-BLUE Silenz
Keyboard Genius KB-110X
No, the 16GB 4060 Ti still has far too little bandwidth to properly use the large framebuffer in most games. This product is handicapped well beyond just VRAM capacity. Compare it to the 4070 and you know its 12GB is a much better offer even at a somewhat higher price. Still not optimal, but certainly not as poor as this 4060 Ti, which is a total dud.
Check the memory bus utilization during gameplay; there is a big, big reserve. Frame gen and the L2 cache make the reserve even bigger.
The main purpose of VRAM is fast storage for game assets. Most of the hard work is done by the caches. No need to worry that low bandwidth can't properly use a frame buffer (of any size).
Loading assets over PCIe is way slower. Having larger VRAM is simply more beneficial, especially for open-world games.
Owners of the 4060 Ti 16GB will be able to play modern titles easily even after 5-7 years. Of course, only at 1080p. But that is the point: for $500 now, play for many, many years into the future.

I was watching CP77 gameplay on a GTX 1060 6GB at 1080p. That card is still not penalized for its VRAM capacity; now try to imagine there were frame gen for that card. I see a very, very bright future for the 4060 Ti 16GB, much brighter than for the 4070 12GB.
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Check the memory bus utilization during gameplay; there is a big, big reserve. Frame gen and the L2 cache make the reserve even bigger.
The main purpose of VRAM is fast storage for game assets. Most of the hard work is done by the caches. No need to worry that low bandwidth can't properly use a frame buffer (of any size).
Loading assets over PCIe is way slower. Having larger VRAM is simply more beneficial, especially for open-world games.
Owners of the 4060 Ti 16GB will be able to play modern titles easily even after 5-7 years. Of course, only at 1080p. But that is the point: for $500 now, play for many, many years into the future.

I was watching CP77 gameplay on a GTX 1060 6GB at 1080p. That card is still not penalized for its VRAM capacity; now try to imagine there were frame gen for that card. I see a very, very bright future for the 4060 Ti 16GB, much brighter than for the 4070 12GB.
The 1060 6GB has about 200 GB/s of bandwidth, while the 4060 Ti is 162% faster on a mere 288 GB/s.

It's definitely not going to be able to use all that core advantage with even twice a 1060's memory capacity filled up. The cache isn't large, so it will simply choke on complex scenes when more data is needed at once. You say it right: 1080p only, yes. But even then, there is more data. The VRAM a game needs at 1080p versus 4K often differs by no more than 2GB. The 16GB will help in games that exceed 8GB, but it won't fix the lack of bandwidth everywhere else.

The reason the 1060 6GB isn't penalized for its 6GB is that every game pegs the core at 100% anyway, and that card has relatively solid bandwidth to feed it. Comparatively, it has more real bandwidth than whatever the 4060 Ti has in bandwidth + cache advantage. The same thing applies throughout Pascal's stack. The 1080 already sits at 500GB/s while being only ~60% faster than a 1060 6GB - and here's the kicker: if you OC a 1080's memory to 11Gbps (a +500 on the rev. 1; every single one can do it) you gain real frames, consistently. So even thát card still has core to spare for its given bandwidth. It will choke only when games exceed its framebuffer ánd require large amounts of new data on top - a situation that applies, for example, to TW Warhammer 3 at 1440p ultra/high and up, which will readily use north of 10GB of VRAM.

As for bright futures, I think they're both pretty weak offers; the 4060 Ti and the 4070 are both priced too high. With the 4060 Ti you spend $500 for what is virtually entry-level hardware.
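That bandwidth argument can be made concrete as memory traffic available per frame. The fps figures below are hypothetical, chosen only to reflect the ~2.6x core-speed gap quoted for these two cards:

```python
def gb_per_frame(bandwidth_gb_s: float, fps: float) -> float:
    """Memory traffic budget available to render one frame."""
    return bandwidth_gb_s / fps

old = gb_per_frame(200, 60)    # GTX 1060 6GB at a hypothetical 60 fps
new = gb_per_frame(288, 157)   # 4060 Ti rendering ~2.6x the frames

# Despite 44% more absolute bandwidth, the faster card has roughly half
# the traffic budget per frame, which is the choke point described above.
print(f"{old:.2f} GB/frame vs {new:.2f} GB/frame")
```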
 
Last edited: