
NVIDIA GeForce RTX 4060 Ti Possible Specs Surface—160 W Power, Debuts AD106 Silicon

Joined
Oct 15, 2010
Messages
951 (0.18/day)
System Name Little Boy / New Guy
Processor AMD Ryzen 9 5900X / Intel Core I5 10400F
Motherboard Asrock X470 Taichi Ultimate / Asus H410M Prime
Cooling ARCTIC Liquid Freezer II 280 A-RGB / ARCTIC Freezer 34 eSports DUO
Memory TeamGroup Zeus 2x16GB 3200Mhz CL16 / Teamgroup 1x16GB 3000Mhz CL18
Video Card(s) Asrock Phantom RX 6800 XT 16GB / Asus RTX 3060 Ti 8GB DUAL Mini V2
Storage Patriot Viper VPN100 Nvme 1TB / OCZ Vertex 4 256GB Sata / Ultrastar 2TB / IronWolf 4TB / WD Red 8TB
Display(s) Compumax MF32C 144Hz QHD / ViewSonic OMNI 27 144Hz QHD
Case Phanteks Eclipse P400A / Montech X3 Mesh
Power Supply Aresgame 850W 80+ Gold / Aerocool 850W Plus bronze
Mouse Gigabyte Force M7 Thor
Keyboard Gigabyte Aivia K8100
Software Windows 10 Pro 64 Bits
Joined
Dec 12, 2012
Messages
777 (0.18/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
Actually, because this is 2½ years later, the cost of manufacturing will have gone DOWN. As any technology matures, it becomes cheaper. This includes silicon fabrication because otherwise, with the tiny nodes that we have today, CPUs would cost literally MILLIONS of dollars. TSMC doesn't charge nVidia "a price per transistor" but that seems to be the pricing structure that nVidia has adopted for consumers.

The thing is, it is publicly known that TSMC was charging $10k for 7 nm wafers, but they are charging $16k for 5 nm wafers.

It is getting harder and harder to shrink transistors, and R&D costs have gone up significantly as well. But 7 to 5 nm is still just a 60% increase, not 113%.

3 nm wafers are said to cost $20k. To compare, 28 nm used to cost $3k a decade ago.

Costs are going up, but not to the degree NVIDIA and AMD are trying to convince us they are.
 
Joined
Sep 26, 2022
Messages
235 (0.29/day)
Location
Portugal
System Name Main
Processor 5700X
Motherboard MSI B450M Mortar
Cooling Corsair H80i v2
Memory G.SKILL Ripjaws V 32GB (2x16GB) DDR4-3600MHz CL16
Video Card(s) MSI RTX 3060 Ti VENTUS 2X OC 8GB GDDR6X
Display(s) LG 32GK850G
Case NOX HUMMER ZN
Power Supply Seasonic GX-750
Basically this. They want fatter margins.
Almost everyone does.
But I feel something else is at play here. Technically, with the 4080 and 4070 Ti they started cannibalizing/undercutting the previous generation, yet some people think it's bad value. And by not announcing "official" price cuts they also keep AIBs happy-ish, letting them keep the margins on old-gen models for those who prefer to save some cash and go for the 3080, which in some places has similar perf/$ to the 4070 Ti.
 

Count von Schwalbe

Nocturnus Moderatus
Staff member
Joined
Nov 15, 2021
Messages
3,167 (2.80/day)
Location
Knoxville, TN, USA
System Name Work Computer | Unfinished Computer
Processor Core i7-6700 | Ryzen 5 5600X
Motherboard Dell Q170 | Gigabyte Aorus Elite Wi-Fi
Cooling A fan? | Truly Custom Loop
Memory 4x4GB Crucial 2133 C17 | 4x8GB Corsair Vengeance RGB 3600 C26
Video Card(s) Dell Radeon R7 450 | RTX 2080 Ti FE
Storage Crucial BX500 2TB | TBD
Display(s) 3x LG QHD 32" GSM5B96 | TBD
Case Dell | Heavily Modified Phanteks P400
Power Supply Dell TFX Non-standard | EVGA BQ 650W
Mouse Monster No-Name $7 Gaming Mouse| TBD
with the 4080 and 4070Ti they started cannibalizing/undercutting the previous
With the exception of Turing, where did they not do this?

The thing is, it is publicly known that TSMC was charging $10k for 7 nm wafers, but they are charging $16k for 5 nm wafers.
And the transistor density is announced to be 80% higher.

None of the "mitigating circumstances" in Nvidia's favor are any different from any other generational/process change, yet in the Kepler/Maxwell example they cut the price rather than increasing it.
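Putting those two figures together, a quick back-of-the-envelope check using the $10k/$16k wafer prices and the announced ~80% density gain quoted above (with the usual caveat that SRAM and I/O don't shrink at the headline logic-density figure):

```python
# If a 5 nm wafer costs 60% more but packs ~80% more transistors,
# cost per transistor actually drops (figures from the posts above).
price_ratio = 16_000 / 10_000   # 5 nm vs 7 nm wafer price
density_ratio = 1.80            # ~80% more transistors per mm^2 (announced)

cost_per_transistor_ratio = price_ratio / density_ratio
print(f"{cost_per_transistor_ratio:.2f}")  # ~0.89, i.e. roughly 11% cheaper per transistor
```

So by this crude estimate, the node change alone doesn't explain a price increase at all.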
 
With the exception of Turing, where did they not do this?
They said they wouldn't do it
CEO Jen Hsun says that the first products will be "layered on top" of the current-generation "Ampere" products, so they don't cannibalize the sales of current-generation products.
The first products announced were the 4090, 4080 16 GB, and 4080 12 GB (later the 4070 Ti), and at least two of those three undercut 3000-series models.

I wasn't saying they didn't cannibalize in the past, but some people in this thread are implying that the 4060Ti will bring very little to the table and I'm trying to show reasons to be a little bit more optimistic.
My apologies for not being clear enough before.
 

Count von Schwalbe

Nocturnus Moderatus
Staff member
They said they wouldn't do it
They also said:
The remainder of 2022 could see a high-end debut of the RTX 40-series, selling alongside attractively priced RTX 30-series cards.

I wasn't saying they didn't cannibalize in the past, but some people in this thread are implying that the 4060Ti will bring very little to the table and I'm trying to show reasons to be a little bit more optimistic.
My apologies for not being clear enough before.
It will bring plenty to the table; however, I did some calculations a little while ago. Usually, halo and high-end products have large price gaps, while mid-range and lower have small ones. The 30-series had $100 price gaps from the 3070 Ti down to the 3060 Ti; with the 4070 Ti launched at $800, where does that leave us?

If the 4060 Ti is $400, following the 3060 Ti, we have a $400 gap for a single card to fill - the 4070. Not even plausible.

If the 4060 Ti is $500, that is still a massive jump to the 4070, and another pretty big one to the 4070 Ti. Even so, where does it leave those of us who don't want to pay massive amounts of money? Used/older GPUs, which should be well below MSRP but, due to the lack of replacements, are not. Heck, even a quick PCPartPicker search shows 0 models of the 3060 available at MSRP at the moment.
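For what it's worth, the gap arithmetic above can be laid out explicitly (the 30-series numbers are launch MSRPs; the $400 4060 Ti figure is this thread's hypothetical, not an announced price):

```python
# 30-series launch MSRPs (USD) from the 3060 Ti up, to show the spacing
# the post is talking about.
ladder_30 = {"3060 Ti": 400, "3070": 500, "3070 Ti": 600, "3080": 700}

names = list(ladder_30)
for lo, hi in zip(names, names[1:]):
    print(f"{lo} -> {hi}: ${ladder_30[hi] - ladder_30[lo]} gap")

# If the 4060 Ti lands at $400 and the 4070 Ti sits at $800, a lone 4070
# would have to bridge a $400 spread, four times the 30-series spacing.
print(800 - 400)
```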

If there are no replacements, prices will not fall. This simply begins a vicious cycle: people priced out of PC gaming move to consoles, leading to fewer GPUs sold, fewer available, rinse and repeat.

That is saying nothing about older GPUs lacking modern features - the most popular ones out there (1660, RX 580, 1060) are locked out of DLSS, and if there are no sub-$300 Ada cards? DLSS 3 will only exist to excuse crappy optimization, not help those who actually NEED more frames.
 
If the 4060 Ti is $400, following the 3060 Ti, we have a $400 gap for a single card to fill - the 4070. Not even plausible.
If the 4060 Ti is $500, that is still a massive jump to the 4070, and another pretty big one to the 4070 Ti.
Maybe they'll throw a 4060 Super between the 4060Ti and the 4070. I'll admit it's wishful thinking.

But while we're in "fantasy land", please follow me:
  • 3080 (700~750$) > 3080Ti > 3090 > 3090Ti (1500$) > 4080 (1200$)
  • 3070Ti (600$) > 3080 > 3080Ti > 3090 (~950$?) > 4070Ti (800$)
Looking at this example, going by the naming scheme, we can see that although prices increased between generations, each new card "overtakes" three of the previous models (I'm putting the 3080 10 GB and 12 GB together). So then:
  • 3060Ti (~400$) > 3070 (500$) > 3070Ti (600$) > 3080 (700~750$) > 4060Ti (500$ ???)
500$ for 3080 performance is still a bit more than I would consider spending, but otherwise "reasonable". I also think this could be too much to ask from a 160 W card, but dreaming is still free! That's why I'm fairly confident it should at least match the 3070 Ti, and in that case I hope they'll ask for less than 500$.
In this scenario the 4060 could go for ~400$ and should slot above the 3070.

That is saying nothing about older GPUs lacking modern features - the most popular ones out there (1660, RX 580, 1060) are locked out of DLSS, and if there are no sub-$300 Ada cards? DLSS 3 will only exist to excuse crappy optimization, not help those who actually NEED more frames.
RTX 4050 / 4050 Ti. But seeing how they botched the 3050 (slower than the 2060), I don't expect much; AMD will probably be the one to cater to that crowd.
If they do go ahead with the 4050, just for fun, how much VRAM and how wide would the bus be? 6 GB at 96-bit? :roll:
 
Joined
Dec 28, 2012
Messages
3,954 (0.90/day)
System Name Skunkworks 3.0
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software Manjaro
Actually, because this is 2½ years later, the cost of manufacturing will have gone DOWN. As any technology matures, it becomes cheaper. This includes silicon fabrication because otherwise, with the tiny nodes that we have today, CPUs would cost literally MILLIONS of dollars. TSMC doesn't charge nVidia "a price per transistor" but that seems to be the pricing structure that nVidia has adopted for consumers.
Absolutely NOT. Unless you've been living under a rock for the last 2 1/2 years, there was this little pandemic that shut down the world, and many things resulted from that: labor costs went through the roof, so did transportation and fuel costs, and let's not forget (for the 4000 series) that the cost of wafers has significantly increased, nearly doubling from an estimated $9,000 per wafer at Samsung to $16,000 per wafer at TSMC. Even for 7 nm, TSMC has raised prices, and I'd bet good money Samsung has as well.
 

Count von Schwalbe

Nocturnus Moderatus
Staff member
If they do go ahead with the 4050, just for fun, how much VRAM and how wide would the bus be? 6 GB at 96-bit?
8 GB @ 128-bit, according to the database.

Have you looked at the expected specs of the 4060 vs the Ti model? Something goofy there: the 4060 is shown as outperforming the Ti by having nearly double the clock speed.
 
Joined
Dec 10, 2022
Messages
486 (0.65/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
The thing is, it is publicly known that TSMC was charging $10k for 7 nm wafers, but they are charging $16k for 5 nm wafers.

It is getting harder and harder to shrink transistors, and R&D costs have gone up significantly as well. But 7 to 5 nm is still just a 60% increase, not 113%.

3 nm wafers are said to cost $20k. To compare, 28 nm used to cost $3k a decade ago.

Costs are going up, but not to the degree NVIDIA and AMD are trying to convince us they are.
What I was saying is that the cost of manufacturing goes down over time, but I didn't mean on the new nodes. New nodes do cost more, but not to the degree that we're being told (or charged), because a smaller node fits more dice on a wafer. The thing is, look at the performance improvements that Intel managed while stuck on the same 14 nm node. They did what, like, six generations at 14 nm?
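The "more dice per wafer" point can be sketched with the standard first-order dies-per-wafer estimate (a rough approximation that ignores defect density and scribe lines; the die sizes below are made up purely to show the scaling effect):

```python
import math

def dies_per_wafer(die_w_mm, die_h_mm, wafer_d_mm=300):
    """First-order estimate: usable wafer area divided by die area,
    minus an edge-loss term. Ignores defects and scribe lines."""
    die_area = die_w_mm * die_h_mm
    wafer_r = wafer_d_mm / 2
    return int(math.pi * wafer_r**2 / die_area
               - math.pi * wafer_d_mm / math.sqrt(2 * die_area))

# Hypothetical die sizes just to show the effect of a shrink:
print(dies_per_wafer(20, 20))  # larger die on the older node
print(dies_per_wafer(15, 15))  # same design shrunk to ~56% of the area
```

With these made-up numbers the shrunk die yields roughly 1.9x as many candidates per wafer, which is why a pricier wafer does not automatically mean a pricier chip.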

Manufacturers should consider following Intel's example and not necessarily move to a smaller node every generation if they can still make improvements on the one they're on. I may hate Intel, but it's incredible how they managed to pull performance gains seemingly out of thin air at 14 nm over and over and over again. Intel showed that it can be done, so what's stopping everyone else from doing it?

Here's a perfect example of what would have been better:

AMD created the RX 6500 XT which, while a far better deal than anything nVidia had at that price point, was a dog. The previous generation's RX 5500 XT was a superior card all around because it wasn't limited to PCI-Express x4 and it had encoders. It was built on TSMC's 7 nm process and had an MSRP of $169. With the pandemic and the silicon shortage, it would've been far better to continue production of the RX 5500 XT instead of wasting 6 nm allocation (which was more expensive) on a card with that level of performance.

The same could've especially been said for the RX 5700 XT in the place of the RX 6600. Could you imagine just how many of those cards AMD could've sold during the mining craze if they were in production? If I were AMD, I would've been scrambling to find a way to produce the RX 5700 XT again, even if it meant going to GloFo. That would've alleviated so much of the strain on the new stock because the preferred mining cards were the RX 5700 XT and the RTX 3060 Ti.

We see nVidia still producing the GTX 1600-series and that makes a lot of sense. When availability is an issue, continuing to produce older models just makes sense, especially since the older models will be lower-end. Hell, I wouldn't even mind it if models were re-branded and sold at a lower price. The RX 6600's MSRP was $329 while the RX 5700 XT's MSRP was $399. Ignoring the mining craze, if AMD re-branded the 5700 XT as the 6600 and sold it at $300, people would've gone nuts for it. Even reviewers would've accepted it because while, sure, it was just a re-branded last-gen part, the price would've fallen by $100. The 7nm process would've been more mature and yields would've most likely been even better than before, only adding to the other economic benefits.

For all those weirdos saying "But, but, ray-tracing!" I would point out that all RT performance below the RTX 3070 is essentially garbage so who cares? There were a lot of gaps in availability that could've easily been filled by previous-gen silicon. It would've been cheaper and easier to produce and people would've been happy that they were getting reasonable performance for a great price.

If AMD decided to re-brand the RX 6800 and RX 6800 XT respectively as the RX 7600 and RX 7600 XT and sold them for $350-$400 each, I don't think that anyone could complain as long as they were transparent about it. It would also free up more of the new node capacity for the higher-tier products. There would still be plenty of use for the imperfect dice because they'd be perfect for mobile GPUs as the smaller nodes are more power-efficient, something that isn't a huge deal on the desktop. I think that this would be a great answer to the lack of stock issue, would increase sales and profits dramatically and would help AMD avoid pitfalls like the RX 6500 XT. After all, nobody complained when the HD 7970 was re-branded as the R9-280X because AMD admitted it openly. In fact, they dropped the price of all cards that were branded as the HD 7970, added three free games to choose per card and that's why I bought two of them at the same time (well, also for ARMA III). :laugh:

It's not like it hasn't been done before and it's such a simple concept that I can't believe I'm the first one to bring it up, but here we are. :rolleyes:

  • Nvidia are greedy (which every company is)
I think that it mostly boils down to this. ;)
 
Last edited:
What I was saying is that the cost of manufacturing goes down over time, but I didn't mean on the new nodes. New nodes do cost more, but not to the degree that we're being told (or charged), because a smaller node fits more dice on a wafer. The thing is, look at the performance improvements that Intel managed while stuck on the same 14 nm node. They did what, like, six generations at 14 nm?

Manufacturers should consider following Intel's example and not necessarily move to a smaller node every generation if they can still make improvements on the one they're on. I may hate Intel, but it's incredible how they managed to pull performance gains seemingly out of thin air at 14 nm over and over and over again. Intel showed that it can be done, so what's stopping everyone else from doing it?

We definitely need another Maxwell. It was absolutely incredible how much better it was over Kepler using the same node.

Overall it seems like they need to scale down GPUs, at least the ones for gaming. Ada is great for professional applications, and they can charge thousands of dollars for those cards.

But gaming GPUs should go back to 250 W for the top-end model. Progress in visuals is very slow now, but they keep trying to push it; games take forever to develop, and most companies are starting to have financial trouble.

The gaming industry is in a really bad place right now. We need affordable hardware and games that are actually polished and fun to play. Graphics are the least important thing in gaming. It is all going in the wrong direction.
 
We definitely need another Maxwell. It was absolutely incredible how much better it was over Kepler using the same node.

Overall it seems like they need to scale down GPUs, at least the ones for gaming. Ada is great for professional applications, and they can charge thousands of dollars for those cards.

But gaming GPUs should go back to 250 W for the top-end model. Progress in visuals is very slow now, but they keep trying to push it; games take forever to develop, and most companies are starting to have financial trouble.
Tell me about it. How else could people get all excited about (and spend hundreds of dollars for) something as ineffectual as ray-tracing?
The gaming industry is in a really bad place right now. We need affordable hardware and games that are actually polished and fun to play. Graphics are the least important thing in gaming. It is all going in the wrong direction.
Exactly, and manufacturers re-branding their upper-tier models as the next-gen's lower-tier models could seriously turn it around. Let's say that you game primarily at 1080p or 1440p and you wanted to upgrade from, say, a GTX 1070. Don't you think that an RX 6800 XT re-branded as an RX 7600 XT for like $400 would be a fantastic option to have? I sure do!

We wouldn't have this issue of low stock anymore either because lots of people would be buying the previous-gen cards under their new branding, leaving the high-end new-gen cards for the fools and the fanboys (what I call "early adopters"). :laugh:
 

Count von Schwalbe

Nocturnus Moderatus
Staff member
Exactly, and manufacturers re-branding their upper-tier models as the next-gen's lower-tier models could seriously turn it around.
There are a couple of issues, primarily power consumption. Older cards that draw more power are fundamentally more expensive to own: even if the card itself is cheaper, you need a larger PSU. It also leaves SFF builders and non-gaming/non-GPU workloads in the dust.

The other main issue is feature lockout. RT is not an issue; no low-end card has any business playing in that market, but things like DLSS are. Low-end is where that kind of stuff is actually needed.
 
There are a couple of issues, primarily power consumption. Older cards that draw more power are fundamentally more expensive to own: even if the card itself is cheaper, you need a larger PSU. It also leaves SFF builders and non-gaming/non-GPU workloads in the dust.

The other main issue is feature lockout. RT is not an issue; no low-end card has any business playing in that market, but things like DLSS are. Low-end is where that kind of stuff is actually needed.
So, an RTX 3060 8 GB rebranded as the RTX 4050 at $250, and the 3060 12 GB as the RTX 4050 Ti at $280~300, would be a good compromise?

Edit:
They could also rebrand the RTX 2060 as the RTX 4050, since it's still an upgrade over the RTX 3050...
 
Last edited:
There are a couple of issues, primarily power consumption. Older cards that draw more power are fundamentally more expensive to own: even if the card itself is cheaper, you need a larger PSU. It also leaves SFF builders and non-gaming/non-GPU workloads in the dust.
Well, yes and no. The newer cards tend to be more power-efficient but, let's be honest here, most cards at the mid and lower tiers will work just fine with current PSUs. The RX 5700 is 7% slower than the RX 6600 (no biggie) and has a TDP that's 48 W higher (180 W vs 132 W). I don't see how someone's PSU would have trouble with fewer than 50 extra watts. Of course, there would be instances where it might be necessary, but I think those would be rather rare, and the cure is simply to be in the habit of getting a good PSU in the first place. Nobody with a 650 W PSU would have any trouble with any non-specialty card (like a Pro Duo or DEVIL13) released in the last 10 years before the RTX 4090.
The other main issue is feature lockout. RT is not an issue; no low-end card has any business playing in that market, but things like DLSS are. Low-end is where that kind of stuff is actually needed.
That's only really an issue for nVidia, so who cares? If people are happy bending over and taking it from nVidia, a company that happily locks even its own customers out of features (as it does with the GTX and pre-40 RTX series), then who cares? In the case of RT, low- to mid-tier cards shouldn't be using it yet anyway, and the ship sailed on the RX 5000-to-RX 6000 possibility long ago. OTOH, AMD's FSR can be used on ANY card (even the old R9 cards), so it would at least be feasible.

What we have now is clearly no longer working, and this would alleviate the biggest problem we're facing: price.
 
Last edited: