Sunday, May 14th 2023

NVIDIA GeForce RTX 4060 Ti to Feature a PCI-Express 4.0 x8 Bus Interface

NVIDIA has traditionally refrained from lowering the PCIe lane counts on its mid-range GPUs, doing so only with its most entry-level SKUs. However, this is about to change with the GeForce RTX 40-series. A VideoCardz report says that the upcoming GeForce RTX 4060 Ti, based on the AD106 silicon, comes with a PCI-Express 4.0 x8 host interface.

While this is still plenty of interface bandwidth for a GPU of this market segment (comparable to that of PCI-Express 3.0 x16), using the RTX 4060 Ti on older platforms, such as 10th Gen Intel Core "Comet Lake," or even much newer processors such as the AMD Ryzen 7 5700G "Cezanne," would run the GPU at PCI-Express 3.0 x8, as the GPU physically lacks the remaining eight lanes. The lower PCIe lane count should simplify board design for AIC partners, as it reduces the PCB traces and SMDs associated with each individual PCIe lane. Much like DRAM chip traces, PCIe traces are meticulously routed by EDA software (and later validated) to be of equal length across all lanes, for signal integrity.
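The bandwidth equivalence mentioned above follows from simple arithmetic: PCIe 4.0 doubles the per-lane transfer rate of PCIe 3.0, so halving the lane count lands at the same total. A quick back-of-the-envelope sketch (figures are approximate per-direction rates after 128b/130b encoding overhead; the function name is just for illustration):

```python
# Approximate usable per-direction bandwidth per lane, in GB/s, after
# 128b/130b encoding overhead:
#   PCIe 3.0:  8 GT/s * 128/130 / 8 bits-per-byte ≈ 0.985 GB/s
#   PCIe 4.0: 16 GT/s * 128/130 / 8 bits-per-byte ≈ 1.969 GB/s
PER_LANE_GBPS = {"3.0": 8 * 128 / 130 / 8, "4.0": 16 * 128 / 130 / 8}

def bandwidth(gen: str, lanes: int) -> float:
    """Approximate usable per-direction bandwidth in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

print(f"PCIe 4.0 x8:  {bandwidth('4.0', 8):.2f} GB/s")   # ~15.75 GB/s
print(f"PCIe 3.0 x16: {bandwidth('3.0', 16):.2f} GB/s")  # ~15.75 GB/s
print(f"PCIe 3.0 x8:  {bandwidth('3.0', 8):.2f} GB/s")   # ~7.88 GB/s
```

The last line is the case the article warns about: on a PCIe 3.0 platform the card falls back to x8 at the older rate, halving its available link bandwidth.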
Source: VideoCardz

58 Comments on NVIDIA GeForce RTX 4060 Ti to Feature a PCI-Express 4.0 x8 Bus Interface

#1
csendesmark
I am using my RX 7900 XT at PCIe gen4 8×, and feel no drawbacks.
So the RTX 4060 will be just fine, but I would love to see benchmarks with gen3 8×.
Posted on Reply
#2
SL2
csendesmarkI am using my RX 7900 XT at PCIe gen4 8×, and feel no drawbacks.
So the RTX 4060 will be just fine, but I would love to see benchmarks with gen3 8×.
For the 16 GB version, yes. For 8 GB, I'm not so sure.
Posted on Reply
#3
Unregistered
While this shouldn't be a concern performance-wise, it should be reflected in the price. RTX 4000 cards are way cheaper to make than usual (except for the silicon, though even that is smaller than in previous generations), yet the prices are ridiculous (I'm referring to the fake MSRPs; the 4070 launched at the same price as the 3080).
Posted on Edit | Reply
#4
Makaveli
csendesmarkI am using my RX 7900 XT at PCIe gen4 8×, and feel no drawbacks.
So the RTX 4060 will be just fine, but I would love to see benchmarks with gen3 8×.
Agreed, the 4060 should be fine at gen4 x8.
Posted on Reply
#5
Lew Zealand
It'll be fine; the 6600 XT has a PCIe 4.0 x8 interface, and it loses about 2% on average when restricted to PCIe 3.0. The 4060 Ti should be a faster GPU, so maybe it'll lose 3%. Nobody will notice.
Posted on Reply
#6
btarunr
Editor & Senior Moderator
csendesmarkI am using my RX 7900 XT at PCIe gen4 8×, and feel no drawbacks.
So the RTX 4060 will be just fine, but I would love to see benchmarks with gen3 8×.
We'll definitely do an RTX 4060 Ti PCIe scaling article at some point.
Posted on Reply
#7
eidairaman1
The Exiled Airman
Lew ZealandIt'll be fine; the 6600 XT has a PCIe 4.0 x8 interface, and it loses about 2% on average when restricted to PCIe 3.0. The 4060 Ti should be a faster GPU, so maybe it'll lose 3%. Nobody will notice.
Yet people were skewering amd for a low class gpu having a smaller bus. So where's the outrage over nv doing it?
Posted on Reply
#8
Arkz
For what they charge for these things, this takes the piss a bit. There's millions of potential customers still on gen3 boards that could look at this as a possible upgrade, and you're limiting it. Even if you only lost like 4% perf on average on gen3x8, it's still a loss. Since when did using all 16 lanes on a significantly priced GPU become a lost art? For the 4050 I would expect it, but not the 4060Ti.
Posted on Reply
#9
R-T-B
eidairaman1Yet people were skewering amd for a low class gpu having a smaller bus. So where's the outrage over nv doing it?
There'll be outrage if it hurts at PCIe 3 speeds. If it doesn't, it's a nonissue.
Posted on Reply
#10
The Von Matrices
eidairaman1Yet people were skewering amd for a low class gpu having a smaller bus. So where's the outrage over nv doing it?
Because AMD went with 4 lanes only, which made having PCIe 4.0 much more important.
Posted on Reply
#11
Zubasa
The Von MatricesBecause AMD went with 4 lanes only, which made having PCIe 4.0 much more important.
People were complaining about the 6600 having 4.0 x8 as well.
The 6500 / 6400 were always going to be meh, given that they lack video encoders and AV1 decode, which are more important for low-end GPUs.
Posted on Reply
#12
Paranoir andando
""NVIDIA has traditionally refrained from lowering the PCIe lane counts on its mid-range GPUs, doing so only with its most entry-level SKUs,...""

. . .and that is exactly what Nvidia is still doing now. This is a card with 24% of the CUDA cores, TMUs, Tensor cores, and RT cores, just like the previous 3050, 1650, 950, and other x50 cards it replaces. It's a real 4050 with a new name. This 4060 Ti is not mid-range. Nvidia does not deceive anyone.

P.S.: Only the 3050 has PCIe x8; the others (1650, 1050, 950, ...) have PCIe x16. Nvidia is getting worse.
Posted on Reply
#13
cerulliber
btarunrWe'll definitely do an RTX 4060 Ti PCIe scaling article at some point.
People would definitely read an x8 vs. x16 PCI-Express 4.0 article over their morning coffee.
Posted on Reply
#14
Lew Zealand
The Von MatricesBecause AMD went with 4 lanes only, which made having PCIe 4.0 much more important.
But not very important. I used an RX 6400 for a few months before sending it to a new home and I didn't have any problems with it in any PCIe 3.0 machines because I own none with PCIe 4.0. There is the occasional game which can run out of video memory with 4GB when you play at Ultra (in reviews) so you lower the textures, but I didn't end up playing any game with that problem. However the more expensive 6500 XT really should have a 6GB 96-bit memory setup so most of the complaints (even about PCIe 4.0 x4) would have been greatly reduced.
Posted on Reply
#15
Jism
ArkzFor what they charge for these things, this takes the piss a bit. There's millions of potential customers still on gen3 boards that could look at this as a possible upgrade, and you're limiting it. Even if you only lost like 4% perf on average on gen3x8, it's still a loss. Since when did using all 16 lanes on a significantly priced GPU become a lost art? For the 4050 I would expect it, but not the 4060Ti.
If you buy a new computer today, it likely has PCI-E 4.0 or even 5.0.

These "new" products aren't designed with older platforms in mind; they are, however, backwards compatible.

PCI-E 3.0 x16 still isn't fully taxed. The demand for PCI-E 5.0 comes mainly from the enterprise market.
Posted on Reply
#16
cerulliber
Paranoir andando""NVIDIA has traditionally refrained from lowering the PCIe lane counts on its mid-range GPUs, doing so only with its most entry-level SKUs,...""

. . .and that is exactly what Nvidia is still doing now. This is a card with a 24% of Cudas, TMUs, Tensor, RTcores... just like the previous 3050, 1650, 950, etc x50 to which it replaces. It's a real 4050 with a new name. This 4060 Ti is not a mid-range. Nvidia does not deceive anyone.

P.D: Only 3050 has pcie 8X, the others 1650, 1050, 950,... has pcie 16X. Nvidia is getting worse.
I think we have to see the bigger picture. The strongest iGPU I know of, the Radeon 780M, is about 2% stronger than the GTX 1650 (the GTX 1050 Ti replacement, with adequate codec upgrades). If the plan is to kill off low-end GPUs by replacing them with iGPUs, then what they're doing makes total sense. The only con is that consumers have to buy more RAM.
Posted on Reply
#17
Vayra86
eidairaman1Yet people were skewering amd for a low class gpu having a smaller bus. So where's the outrage over nv doing it?
Don't worry, I'm sure there's a battery of YouTubers firing up their cherry-picked scenarios to show us how you can lose more than 3%.
Posted on Reply
#20
LupintheIII
ArkzFor what they charge for these things, this takes the piss a bit. There's millions of potential customers still on gen3 boards that could look at this as a possible upgrade, and you're limiting it. Even if you only lost like 4% perf on average on gen3x8, it's still a loss. Since when did using all 16 lanes on a significantly priced GPU become a lost art? For the 4050 I would expect it, but not the 4060Ti.
It's because of laptops; mobile CPUs have an x8 interface anyway, so you (by which I mean Nvidia) are better off skipping half the PCIe lanes from the get-go to save money and power.
As always, desktop folks get the leftovers.

Still, it's funny that with RTX 4000 Nvidia is doing everything AMD did for the RX 6000 series; Lovelace is literally a crossover between RDNA2 and Ampere.
Posted on Reply
#21
Bomby569
BwazeBut is it really a "pay cut," as the article's title states, when it then explains that most of his "salary" is actually shareholder dividends, not his actual salary?
His base salary is insanely low for a CEO of a company like Nvidia ($1M); it's basically symbolic (I know, I know, poor him lol). Almost all of his compensation is through shares (if what's there is true, obviously; I have no idea). Down is down.
Posted on Reply
#22
SL2
Lew ZealandIt'll be fine; the 6600 XT has a PCIe 4.0 x8 interface, and it loses about 2% on average when restricted to PCIe 3.0. The 4060 Ti should be a faster GPU, so maybe it'll lose 3%. Nobody will notice.
Maybe you're right, but it's a step in the wrong direction. Still half the lanes compared to the 3060 Ti.
LupintheIIIStill, it's funny that with RTX 4000 Nvidia is doing everything AMD did for the RX 6000 series; Lovelace is literally a crossover between RDNA2 and Ampere.
RTX 4 didn't get worse at ray tracing, right? ;)
Posted on Reply
#23
R0H1T
Paranoir andandoNvidia does not deceive anyone.
Oh but that's where you're wrong! They have a massive rabid fanbase & probably an even larger influencer base o_O

Unless of course you think they don't like being lied to :slap:
Posted on Reply
#24
Bomby569
A page from the AMD book, they are just trying to see who can go lower at this point.
We need Intel to get their shit together
Posted on Reply
#25
Wirko
JismIf you buy a new computer today, it likely has PCI-E 4.0 or even 5.0.

These "new" products aren't designed with older platforms in mind; they are, however, backwards compatible.

PCI-E 3.0 x16 still isn't fully taxed. The demand for PCI-E 5.0 comes mainly from the enterprise market.
Sure, but I also see PCIe 5 as a long-term investment, more so on AMD platforms.

Nvidia missed the opportunity to introduce PCIe 5 x8 with the 4060 Ti in order to create some push towards newer platforms.
Posted on Reply