Wednesday, December 14th 2022

NVIDIA GeForce RTX 4060 Ti to Feature Shorter PCB, 220 Watt TDP, and 16-Pin 12VHPWR Power Connector

While NVIDIA has launched the high-end GeForce RTX 4090 and RTX 4080 GPUs from its Ada Lovelace family, middle and lower-end products are brewing to satisfy the rest of the consumer market. Today, according to well-known leaker kopite7kimi, we have potential information about the configuration of the upcoming GeForce RTX 4060 Ti graphics card. The card is powered by an AD106-350-A1 die featuring 4352 FP32 CUDA cores and 32 MB of L2 cache. It is paired with 8 GB of 18 Gbps GDDR6 memory, which should be enough for the 1440p gaming this card is aimed at.

The card's reference PG190 PCB is supposedly very short, making it ideal for the ITX-sized designs we could see from NVIDIA's AIB partners. Interestingly, with a TDP of 220 Watts, the reference card is powered by the infamous 16-pin 12VHPWR connector, capable of supplying 600 Watts of power. The reasoning behind this choice of connector is unclear; however, it could be NVIDIA's push to standardize its usage across the entire Ada Lovelace product stack. While the card should not need the full potential of the connector, it signals that the company may be moving to this type of connector exclusively for all of its future designs.
Source: @kopite7kimi (Twitter)

82 Comments on NVIDIA GeForce RTX 4060 Ti to Feature Shorter PCB, 220 Watt TDP, and 16-Pin 12VHPWR Power Connector

#1
Crackong
Let me guess

$500 MSRP and $600 actual AIB price
Posted on Reply
#2
P4-630
AleksandarK: the reference card is powered by the infamous 16-pin 12VHPWR connector
Oh noooos

Posted on Reply
#3
Ed_1
Those specs sound pretty bad considering the 3060 Ti had 4864 shaders, 8 GB of GDDR6, a 256-bit bus and 448 GB/s of bandwidth. I would have thought 10 GB of memory and at least the same bandwidth.
These specs point to a 128-bit bus with 288 GB/s of bandwidth.
Seems Nvidia is shifting the whole product stack down, which is bad given the prices they are asking now.
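For reference, here is a quick sanity check of how those bandwidth figures fall out of the rumored data rate and bus width (the 18 Gbps speed and 128-bit bus are leaked, not confirmed, values):

```python
# Peak GDDR bandwidth check (rumored figures, not confirmed specs):
# bandwidth (GB/s) = per-pin data rate (Gbps) * bus width (bits) / 8 bits per byte

def gddr_bandwidth(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

print(gddr_bandwidth(18, 128))  # rumored RTX 4060 Ti: 288.0 GB/s
print(gddr_bandwidth(14, 256))  # RTX 3060 Ti (14 Gbps GDDR6): 448.0 GB/s
```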
Posted on Reply
#4
ZoneDymo
My RX 480 also has 8 GB of RAM, soooo yeah, idk why Nvidia is being so stingy on RAM. Same with the RTX 4080; it just seems like they really don't want those cards to last long
Posted on Reply
#5
Quitessa
Crackong: Let me guess

$500 MSRP and $600 actual AIB price
Bit optimistic there, aren't you?
Pretty sure you're going to need to get a mortgage on any GPU for the NVIDIA 4xxx and AMD 7xxx generation >.<
Posted on Reply
#6
TheLostSwede
News Editor
ZoneDymo: My RX 480 also has 8 GB of RAM, soooo yeah, idk why Nvidia is being so stingy on RAM. Same with the RTX 4080; it just seems like they really don't want those cards to last long
GDDR6X is pricey... :rolleyes:
Posted on Reply
#7
Dristun
Calling it now: 3070 Ti performance for 3070 Ti money, but 2 years later and with a crap power connector. Now with fake interpolated frames! It's safe to upgrade.
Posted on Reply
#8
Arkz
Over 200W TDP on a 0060 Ti series card? Woof.
Dristun: Calling it now: 3070 Ti performance for 3070 Ti money, but 2 years later and with a crap power connector. Now with fake interpolated frames! It's safe to upgrade.
The 3070 Ti FE had that connector too, just without the 4 sense pins. TBH the connector was fine till they put it on a 4090. Sending 4x 8-pins' worth of power through the little thing was always a bad idea, but 2 or 3 is fine.
Posted on Reply
#9
SirB
Nvidia has lost its mind. My next card WILL be AMD.
Posted on Reply
#10
cristi_io
High-end GPUs like the GTX 1080 used to have a power consumption of 180 W; now even mainstream models are more power hungry.
Posted on Reply
#11
Hofnaerrchen
"The reference card is powered by the infamous 16-pin 12VHPWR connector, capable of supplying 600 Watts of power. This choice of connector is unclear..."
If I had to guess: Mr. Jen-Hsun Huang single-handedly baked a good amount of the connectors in his kitchen and told his engineers that they have to be used with all new devices, even if not needed.

Joking aside: if you want a new standard to be widely used, you have to put it everywhere. From an economic point of view it does not make any sense to have multiple plugs/connectors, and it seems the molten plugs were caused by people having too much money and not enough DIY skills.
Posted on Reply
#12
InVasMani
RTX 2060 12GB refresh buy me please!!! RTX 4060 [Ti] 8GB take it or leave it f*ckerz...
Posted on Reply
#13
CyberCT
It's stupid they're sticking to this new connector on the lower tier models. What's the point of a small GPU when you can't fit it in a small case because of the connector length before the bend? Straight up dumb, NVIDIA. My 30 series cards are going to hold me over for a while, which isn't a problem.
Posted on Reply
#14
ARF
Poor evil nvidia :kookoo:

RTX 3060 Ti - 4864 shaders, 8 GB, 200 watts
RTX 4060 Ti - 4352 shaders, 8 GB, 220 watts

So, 4060 Ti will be slower and yet more power hungry? :kookoo:
Posted on Reply
#15
N3utro
If this 8 GB of VRAM is confirmed, it will be a problem for most potential buyers. 12 GB is mandatory for all new high-end cards; the 1080 Ti had 11 GB nearly six years ago.
ARF: Poor evil nvidia :kookoo:

RTX 3060 Ti - 4864 shaders, 8 GB, 200 watts
RTX 4060 Ti - 4352 shaders, 8 GB, 220 watts

So, 4060 Ti will be slower and yet more power hungry? :kookoo:
This is not the same technology. Fewer shaders doesn't mean less performance.
Posted on Reply
#16
ARF
N3utro: This is not the same technology. Fewer shaders doesn't mean less performance.
Let's compare RTX 3090 Ti and RTX 4090.

RTX 3090 Ti - 10752 shaders at 1860 MHz
RTX 4090 - 16384 (52% more) shaders at 2520 MHz (35% higher)

Net performance increase: 45%.

So, RTX 4090 has 52% more shaders at 35% higher clocks and yet manages to return only 45% higher performance.
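For context, a minimal sketch of how those two figures multiply into raw throughput (using the boost clocks quoted above; this is only theoretical FP32 throughput, not a performance model, and real games are limited by memory bandwidth and other factors):

```python
# Rough check of the shader/clock scaling quoted above (boost clocks assumed).
shaders_3090ti, clock_3090ti = 10752, 1860   # MHz
shaders_4090,   clock_4090   = 16384, 2520   # MHz

shader_gain = shaders_4090 / shaders_3090ti - 1              # ~0.52 -> 52% more shaders
clock_gain = clock_4090 / clock_3090ti - 1                   # ~0.35 -> 35% higher clocks
throughput_gain = (1 + shader_gain) * (1 + clock_gain) - 1   # ~1.06 -> ~106% more raw throughput

print(f"{shader_gain:.0%} more shaders, {clock_gain:.0%} higher clocks, "
      f"{throughput_gain:.0%} more theoretical FP32 throughput vs ~45% measured")
```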
Posted on Reply
#17
bug
Ed_1: Those specs sound pretty bad considering the 3060 Ti had 4864 shaders, 8 GB of GDDR6, a 256-bit bus and 448 GB/s of bandwidth. I would have thought 10 GB of memory and at least the same bandwidth.
These specs point to a 128-bit bus with 288 GB/s of bandwidth.
Seems Nvidia is shifting the whole product stack down, which is bad given the prices they are asking now.
The 4080 has slightly fewer shaders than the 3090 and a narrower memory interface. It still ends up ~20% faster.
Posted on Reply
#18
ARF
bug: The 4080 has slightly fewer shaders than the 3090 and a narrower memory interface. It still ends up ~20% faster.
It has 800-1000 MHz higher clocks.
Posted on Reply
#19
TheDeeGee
Passing this one up due to 8GB.

I plan on keeping it 5 years or more, so I want at least 12 GB. That means a 4070 Ti, or a 4070 if that has 12 GB as well.
Posted on Reply
#20
bug
ARF: It has 800-1000 MHz higher clocks.
And? You know for a fact the 4060 Ti won't?
Posted on Reply
#21
ARF
TheDeeGee: Passing this one up due to 8GB.

I plan on keeping it 5 years or more, so I want at least 12 GB. That means a 4070 Ti, or a 4070 if that has 12 GB as well.
Why not a Radeon with 16 GB?
bug: And?
Period.. :D
Posted on Reply
#22
TheDeeGee
ARF: Why not a Radeon with 16 GB?
Period.. :D
Because I also play games from the late '90s and early 2000s, and AMD lacks the hidden anti-aliasing features that Nvidia has via Nvidia Inspector.

Apart from that, my Ryzen 5000 experience was a giant disaster, so I'll never want anything AMD again.
Posted on Reply
#23
ARF
TheDeeGee: Apart from that, my Ryzen 5000 experience was a giant disaster, so I'll never want anything AMD again.
Why a disaster? Maybe you had a particular issue with your motherboard vendor. The silicon labelled Ryzen is silicon like any other with a different label... So it's not the Ryzen chip itself but something else, which you have to find.
Posted on Reply
#24
Prince Valiant
TheLostSwede: GDDR6X is pricey... :rolleyes:
PCBs too; Nvidia needs to cut off as much as they can. Think of their savings :).
Posted on Reply
#25
mechtech
When will the non-Ti be widely available?
Posted on Reply