Sunday, May 14th 2023

NVIDIA GeForce RTX 4060 Ti to Feature a PCI-Express 4.0 x8 Bus Interface

NVIDIA has traditionally refrained from lowering the PCIe lane counts on its mid-range GPUs, doing so only with its most entry-level SKUs; however, this is about to change with the GeForce RTX 40-series. A VideoCardz report says that the upcoming GeForce RTX 4060 Ti, based on the AD106 silicon, comes with a PCI-Express 4.0 x8 host interface.

While this is still plenty of interface bandwidth for a GPU of this market segment, with bandwidth comparable to that of PCI-Express 3.0 x16, using the RTX 4060 Ti on older platforms, such as 10th Gen Intel Core "Comet Lake," or even much newer processors such as the AMD Ryzen 7 5700G "Cezanne," would run the GPU at PCI-Express 3.0 x8, as the GPU physically lacks the remaining 8 lanes. The lower PCIe lane count should simplify board design for AIC partners, as it reduces the PCB traces and SMDs associated with each individual PCIe lane. Much like DRAM chip traces, PCIe traces are meticulously designed by EDA software (and later validated) to be of equal length across all lanes, for signal integrity.
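To put those interface figures in perspective, per-direction PCIe bandwidth can be estimated from the per-lane transfer rate and the line-coding overhead. The sketch below uses the standard published rates; it is a back-of-the-envelope check, not vendor figures.

```python
# Approximate per-direction PCIe bandwidth from transfer rate and line coding.
# Gen 1/2 use 8b/10b encoding; Gen 3/4/5 use 128b/130b.
GT_PER_LANE = {1: 2.5, 2: 5.0, 3: 8.0, 4: 16.0, 5: 32.0}  # gigatransfers/s
ENCODING = {1: 8 / 10, 2: 8 / 10, 3: 128 / 130, 4: 128 / 130, 5: 128 / 130}

def bandwidth_gb_s(gen: int, lanes: int) -> float:
    """Usable bandwidth in GB/s (1 GB = 1e9 bytes), one direction."""
    return GT_PER_LANE[gen] * ENCODING[gen] * lanes / 8

print(f"Gen3 x16: {bandwidth_gb_s(3, 16):.2f} GB/s")  # ~15.75
print(f"Gen4 x8:  {bandwidth_gb_s(4, 8):.2f} GB/s")   # ~15.75, same as Gen3 x16
print(f"Gen3 x8:  {bandwidth_gb_s(3, 8):.2f} GB/s")   # ~7.88, the fallback case
```

On a Gen 3 platform, the x8 card therefore gets roughly half the bandwidth it was designed around.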
Source: VideoCardz

58 Comments on NVIDIA GeForce RTX 4060 Ti to Feature a PCI-Express 4.0 x8 Bus Interface

#26
AnarchoPrimitiv
Bomby569A page from the AMD book, they are just trying to see who can go lower at this point.
We need Intel to get their shit together
If you think Intel wouldn't do the same you're mistaken. Never forget, all these companies exist in the same capitalist system, in which they're only beholden to shareholders.
Posted on Reply
#27
Wirko
btarunrMuch like DRAM chip traces, PCIe traces are meticulously designed by EDA software (and later validated), to be of equal length across all lanes, for signal integrity.
A more exact explanation: the two wires that make up each lane must be really tightly matched, to within 5 mils (0.005 in = 0.127 mm). Between lanes, the requirements are loose. A DDR DIMM is a different and more demanding animal; it has a true parallel interface, not a multi-lane serial one.
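For a sense of scale, that 5 mil intra-pair budget corresponds to well under a picosecond of skew. A rough sketch, assuming a typical ~170 ps/inch FR-4 propagation delay (an assumed figure, not a spec value; the real number depends on the stackup):

```python
# Skew implied by a 5 mil intra-pair length mismatch.
# 170 ps/inch is an assumed, typical FR-4 stripline delay, not a spec value.
MISMATCH_IN = 0.005        # 5 mils = 0.005 inch
DELAY_PS_PER_INCH = 170.0  # assumed propagation delay

skew_ps = MISMATCH_IN * DELAY_PS_PER_INCH  # 0.85 ps
ui_gen4_ps = 1e12 / 16e9                   # one unit interval at 16 GT/s = 62.5 ps

print(f"intra-pair skew: {skew_ps:.2f} ps vs. Gen4 UI: {ui_gen4_ps:.1f} ps")
```

Even at Gen 4 speeds, the allowed mismatch eats only a small fraction of one unit interval.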
Posted on Reply
#28
SL2
AnarchoPrimitivIf you think Intel wouldn't do the same you're mistaken. Never forget, all these companies exist in the same capitalist system, in which they're only beholden to shareholders.
Yes, Intel would do the same, if they had a substantial share of the graphics card market. Which they don't.
Posted on Reply
#29
Klemc
So, does it become x4 when the motherboard shares lanes?
Posted on Reply
#30
Bomby569
AnarchoPrimitivIf you think Intel wouldn't do the same you're mistaken. Never forget, all these companies exist in the same capitalist system, in which they're only beholden to shareholders.
Just like AMD is a bit less bad because they're the underdogs, I believe Intel would be even a little less bad. Plus, more competition.
Posted on Reply
#31
Wirko
AnarchoPrimitivIf you think Intel wouldn't do the same you're mistaken. Never forget, all these companies exist in the same capitalist system, in which they're only beholden to shareholders.
Which shareholders? Many are people like us here, they own some RTX and they also own some NVDA. Hey, even I own 5.3 billionths of TSMC and 3.9 billionths of Samsung Electronics through mutual funds.
KlemcSo, does it become x4 when the motherboard shares lanes?
I don't see how this could be possible. Consumer CPUs can't split the 16 graphics lanes any other way than 8+8. Maybe Ryzens can, but AMD hasn't been clear about that.
Posted on Reply
#32
csendesmark
KlemcSo, does it become x4 when the motherboard shares lanes?
No loss in that case.
The card only uses the first 8 lanes when put in a fully fledged (primary PCIe) x16 slot.
If you share it with a second slot, the lanes bifurcate as 8+8 (or 8+4+4), so the card still gets its full 8 lanes, as you would expect.
Posted on Reply
#33
Wirko
Bomby569his base salary is insanely low for a CEO of a company like Nvidia (1M), it becomes just symbolic (i know, i know, poor him lol). Almost all his payment is through shares (if what's there is true obviously, i have no idea). Down is down.
He also owns 3.6% of his company. Which is surprisingly little given that he's the co-founder and all-time CEO. But yeah, down is down. Could have been more.
Posted on Reply
#35
Chrispy_
It's a perfectly valid trade-off to reduce costs.

It's only valid to the consumer if those reduced costs are also passed on - and that's where I suspect most people will be angry, because the 4060 Ti is cut down to reduce cost in so many ways - VRAM capacity, TDP, memory bus width, PCIe lane count - and yet the rumoured pricing is still sky high.

Maybe Nvidia are pulling their sly old tactic of leaking a high price all over the media and then "surprising" people at the last minute with a lower MSRP at launch. Honestly, even if the 8GB card launches at an unexpectedly low $399 it's still not a great deal, and at $449 it's DOA for anyone with a clue about the way the game industry is dropping support for older consoles and moving VRAM requirements higher. It happens with every console generation, but this is the first time the PC industry has been caught with its pants down: mainstream VRAM size has barely budged at all in 7-8 years.
Posted on Reply
#36
chrcoluk
There are a lot of people out there still on Gen 3 PCIe (probably more than Gen 4/5), and this card is more likely to be aimed at that market than at those on cutting-edge platforms, so I'm not a fan of this move.

I bet the saving is about $5-10 on a product that will probably cost at least $400.

Wouldn't surprise me if agreements have been made with board vendors to accelerate obsolescence of Gen 3.
Posted on Reply
#37
Squared
There's nothing wrong with this graphics card running at PCIe 4.0 x8 speeds. But people with a PCIe 3.0 x16 slot will be interested in this card too, and I think they may want to look elsewhere.
Posted on Reply
#38
Chrispy_
chrcolukThere are a lot of people out there still on Gen 3 PCIe (probably more than Gen 4/5), and this card is more likely to be aimed at that market than at those on cutting-edge platforms, so I'm not a fan of this move.

I bet the saving is about $5-10 on a product that will probably cost at least $400.

Wouldn't surprise me if agreements have been made with board vendors to accelerate obsolescence of Gen 3.
If you can afford a $450 GPU you can also afford a $100 B550 board and a Ryzen 5 5600, especially if you sell your old CPU+board. Let's face it, if you are on old Zen 2 or slower you probably have a CPU bottleneck with a 4060 Ti anyway, which means you're wasting some of its potential long before the Gen 3 interface is the issue.

If you're on a classic Ryzen 5 3600 or i5-10400F, then a used 3060 Ti or RX 6700 10GB is going to be a better match for your system. Not only will you save a lot of cash, you'll also get all 16 Gen 3 lanes.
Posted on Reply
#39
Lew Zealand
chrcolukThere are a lot of people out there still on Gen 3 PCIe (probably more than Gen 4/5), and this card is more likely to be aimed at that market than at those on cutting-edge platforms, so I'm not a fan of this move.
If it's 15% faster than the same price GPU with an x16 connector and it takes a 3% hit to frames at PCIe 3.0, then it's still 12% faster than the competition in any board made in the last decade. 3% performance penalty is not enough to keep this type of card out of any PCIe 3.0 system.
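The compounding behind that estimate (both percentages are the commenter's hypothetical figures) multiplies rather than subtracts, which still lands near the quoted 12%:

```python
# Hypothetical inputs from the comment above: a card 15% faster than the
# competition, taking a 3% frame-rate hit when limited to PCIe 3.0.
speedup = 1.15
pcie_penalty = 0.03

relative = speedup * (1 - pcie_penalty)  # 1.15 * 0.97 = 1.1155
print(f"net advantage: {(relative - 1) * 100:.1f}%")
```

That works out to about an 11.6% net advantage, close enough to the rounded 12% in the comment.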
Posted on Reply
#40
Alan Smithee
- GeForce GTX 650 8 GB/s (gen3 x16) - in 2012!
- GeForce GTX 750 8 GB/s (gen3 x16)
- GeForce GTX 950 8 GB/s (gen3 x16)
- GeForce GTX 1050 8 GB/s (gen3 x16)
- GeForce GTX 1650 8 GB/s (gen3 x16)
- GeForce RTX 3050 8 GB/s (gen4 x8)
- GeForce RTX 4060 Ti: 8 GB/s (gen4 x8)

So not only are they giving us the same bus bandwidth as a low-end 2012 card, but they have moved the low-end bus bandwidth up 2 SKUs from the xx50 series to the xx60 Ti series.
Posted on Reply
#41
Recus
Alan Smithee- GeForce GTX 650 8 GB/s (gen3 x16) - in 2012!
- GeForce GTX 750 8 GB/s (gen3 x16)
- GeForce GTX 950 8 GB/s (gen3 x16)
- GeForce GTX 1050 8 GB/s (gen3 x16)
- GeForce GTX 1650 8 GB/s (gen3 x16)
- GeForce RTX 3050 8 GB/s (gen4 x8)
- GeForce RTX 4060 Ti: 8 GB/s (gen4 x8)

So not only are they giving us the same bus bandwidth as a low-end 2012 card, but they have moved the low-end bus bandwidth up 2 SKUs from the xx50 series to the xx60 Ti series.
Gen4 x8 is 16 GB/s.
Posted on Reply
#42
Hxx
Bomby569his base salary is insanely low for a CEO of a company like Nvidia (1M), it becomes just symbolic (i know, i know, poor him lol). Almost all his payment is through shares (if what's there is true obviously, i have no idea). Down is down.
No way his base pay is $1M. Where did you find this? Did you look at an 8-K?
Posted on Reply
#43
Bomby569
HxxNo way his base pay is $1M. Where did you find this? Did you look at an 8-K?
why don't you just read the topic you are replying to? did you just choose a random comment and reply without reading the context? how do you think that is going to end for you?
Posted on Reply
#44
lexluthermiester
btarunrNVIDIA GeForce RTX 4060 Ti to Feature a PCI-Express 4.0 x8 Bus Interface
"Feature"?!? Don't you mean;
"NVIDIA GeForce RTX 4060 Ti to be limited to a PCI-Express 4.0 x8 Bus Interface"
That seems more accurate and less like marketing BS.
Posted on Reply
#45
Arkz
JismIf you buy a new computer today, it's likely it has PCI-E 4.0 or even 5.0.

These "new" products aren't designed with older platforms in mind; however, they are backwards compatible.

PCI-E 3.0 x16 still isn't fully taxed. That PCI-E 5.0 demand simply comes from the enterprise market.
3.0 x16 can be fully taxed, hence the worse performance with high-end cards vs. Gen 4. So 3.0 x8 will be a limiting factor on future mid-range cards that use it.
LupintheIIIIt's because of laptops. Mobile CPUs have an x8 interface anyway, so you (by you I mean NVIDIA) are better off skipping half the PCIe lanes from the get-go to save money and power.
As always, desktop folks will get the leftovers.

Still, it's funny that with RTX 4000 NVIDIA is doing everything AMD did for the RX 6000 series; Lovelace is literally a crossover between RDNA2 and Ampere.
But the 4060 Ti isn't some mobile GPU slapped on a PCI-E board. We hardly ever see that.
Posted on Reply
#46
wheresmycar
Guys get with the program... What on earth did you think Jensen was cooking in that oven of his? He's left one too many clues for us to fail at this game.

the 4090 is the 4080

4080 > 4070

4070 > 4060

4060 > 4050

The mathematical approach to unravel the brainteaser: take the MSRP of the former and divide it in half to correctly identify the latter's price.

4080 - $800

4070 - $600

4060 - $300

4050 - $150/$200

So not bad, the 4050 is an 8GB VRAM card skating on a 4.0 x8 bus interface. What more could you ask for?

I thought cracking the code would get me a free GPU or something... he just threw a fist in the air and smiled.

So why are you paying double you might ask? Cheeky! You'll have to wait for part #2 of the great wheresmycar deciphering escapades
Posted on Reply
#47
csendesmark
lexluthermiester"Feature"?!? Don't you mean:
"NVIDIA GeForce RTX 4060 Ti to be limited to a PCI-Express 4.0 x8 Bus Interface"
That seems more accurate and less like marketing BS.
The same card with a PCIe Gen 4 x16 interface would not do any better, so I would not call it a limitation.
Posted on Reply
#48
chrcoluk
Chrispy_If you can afford a $450 GPU you can also afford a $100 B550 board and a Ryzen 5 5600, especially if you sell your old CPU+board. Let's face it, if you are on old Zen 2 or slower you probably have a CPU bottleneck with a 4060 Ti anyway, which means you're wasting some of its potential long before the Gen 3 interface is the issue.

If you're on a classic Ryzen 5 3600 or i5-10400F, then a used 3060 Ti or RX 6700 10GB is going to be a better match for your system. Not only will you save a lot of cash, you'll also get all 16 Gen 3 lanes.
That's a bold statement. I am sure there will be people whose budget can cover the GPU but not the other parts you mention; that's a massive assumption to make. You also dodged the point, as if you think it's easier for people to stretch tight budgets than to add $5 to board manufacturing.

Is it easier for a multi-billion-dollar company to add 1% to manufacturing cost, or for the consumer to add 50% to their budget? Maybe that makes you think about it more rationally (not to mention a $100 board is typically junk these days).

The reality instead would be that either the GPU gets downgraded to pay for the Gen 4 board, or the consumer loses performance on their purchase, or they forego the purchase altogether. I actually think your scenario would be the least likely outcome.
Posted on Reply
#49
Chrispy_
chrcolukThat's a bold statement. I am sure there will be people whose budget can cover the GPU but not the other parts you mention; that's a massive assumption to make. You also dodged the point, as if you think it's easier for people to stretch tight budgets than to add $5 to board manufacturing.

Is it easier for a multi-billion-dollar company to add 1% to manufacturing cost, or for the consumer to add 50% to their budget? Maybe that makes you think about it more rationally (not to mention a $100 board is typically junk these days).

The reality instead would be that either the GPU gets downgraded to pay for the Gen 4 board, or the consumer loses performance on their purchase, or they forego the purchase altogether. I actually think your scenario would be the least likely outcome.
I don't understand what you're getting at.

You're suggesting that people with an older board and CPU should overspend on a GPU that's too fast for their old platform?
In an ideal world, yes - this would have 16 lanes, but it doesn't, so you have to analyze based on what we are getting, not on what you'd like to be getting.

If someone is tight on cash, overpaying for a GPU isn't the right answer. IMO they should buy a cheaper GPU that better matches their existing CPU performance and save the money.

If someone is okay with buying a $450 GPU, I don't personally feel that they're tight on cash, because a $450 GPU is definitely entering "luxury purchase" territory, and people who are barely scraping by don't typically make short-lived luxury purchases.
Posted on Reply
#50
Hxx
Bomby569why don't you just read the topic you are replying to? did you just choose a random comment and reply without reading the context? how do you think that is going to end for you?
That's because you quoted some PC Gamer article and said you have no idea if they're correct. Here is where you need to look to get his pay: it includes cash, and it's actually $3M plus shares up to $25M (page 53). But thanks for playing.
Inline XBRL Viewer (sec.gov)
Posted on Reply