Thursday, June 2nd 2022

Alleged NVIDIA AD102 PCB Drawing Reveals NVLink is Here to Stay, Launch Timelines Revealed

An alleged technical drawing of the PCB for reference-design NVIDIA "Ada" AD102 silicon was leaked to the web, courtesy of Igor's Lab. It reveals a large GPU pad that's roughly the size of the GA102 (the size of the fiberglass substrate or package only, not the die), surrounded by twelve memory chips, which are likely GDDR6X. There is also provision for at least 24 power phases, although not all of them are populated with sets of chokes and DrMOS in the final products (a few of them end up vacant).

We also spy the 16-pin ATX 3.0 power connector that's capable of delivering up to 600 W of power, and four display outputs, including a USB-C in lieu of a larger connector (such as DP or HDMI). A curious thing to note is that the card continues to have an NVLink connector. Multi-GPU is dead, which means the NVLink on the reference design will likely be rudimentary in the GeForce RTX product (unless it is used for implicit multi-GPU). The connector may play a bigger role in the professional-visualization graphics cards (RTX AD-series) based on this silicon.
Igor's Lab also put out timelines for the product-development cycle of the next-generation GeForce RTX product based on the AD102. It speaks of a bill-of-materials release in June 2022 (which should help NVIDIA finalize pricing); design and validation running through the summer (June through August); and mass-production commencing either toward the end of August or sometime in September. It could be late October or November by the time these cards reach the shelves. This leaves NVIDIA with quite some time to let the market digest existing inventory of GeForce "Ampere" graphics cards, which could see steadily decreasing prices and increasing availability. Thanks to the crypto-crash, miners could also flood the market with used cards at attractive prices, which could draw some attention, too.
Sources: Igor's Lab, VideoCardz

62 Comments on Alleged NVIDIA AD102 PCB Drawing Reveals NVLink is Here to Stay, Launch Timelines Revealed

#26
medi01
ARF: Why would anyone not want to see multi-GPU configurations healthy, up and running?
Because in this case it means using more than one oversized GPU.

I would like to see "chiplet"-like multi-chip solutions though (less wasted silicon, cheaper to produce, etc.).
Posted on Reply
#27
bug
ARF: I don't know.
I can only guess that there is very strong anti-multi-GPU opposition from nvidia because AMD's original strategy for "sweet spot" GPUs was to use X2 cards.
And nvidia, because of its enormous power consumption, was always beaten by AMD's better performance-per-watt solutions.

Look at the Radeon HD 3870 and Radeon HD 3870 X2.
Then the Radeon HD 4870 and Radeon HD 4870 X2...

Today the CF technology is much better than it was 15 years ago, and if they push for it, they can make it work.
Even if they split those R&D costs across the entire product line - the lineup is extremely overpriced today anyway, and can absorb those expenses.

If only I were the manager in charge at AMD...
Are you aware multi-GPU support still exists? It's built right into DirectX now. What changed is AMD and Nvidia gave up doing all the work. Game engine developers are still able to offer you multi-GPU. But they also aren't eager to do the work, because, as I have already told you, the work doesn't pay off.
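For context on "built right into DirectX": DirectX 12 lets an engine enumerate every GPU as an adapter and drive each one explicitly (so-called explicit multi-adapter), and a bridge-linked pair additionally shows up as multiple nodes on a single device. A minimal enumeration sketch, just to show the building blocks (Windows-only, standard DXGI/D3D12 headers; error handling trimmed):

```cpp
// Enumerate GPUs the way a DX12 engine doing explicit multi-adapter would.
// Each adapter can get its own ID3D12Device; splitting work between them
// (AFR, split-frame, or asymmetric tasks) is entirely up to the engine.
#include <dxgi1_4.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;  // skip the WARP software adapter

        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device)))) {
            // GetNodeCount() > 1 indicates a linked-adapter (bridge-connected) group.
            wprintf(L"Adapter %u: %s, nodes: %u\n",
                    i, desc.Description, device->GetNodeCount());
        }
    }
    return 0;
}
```

Everything beyond this - deciding which work goes to which adapter and copying results between them - is left entirely to the engine, which is exactly the burden described above.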
Posted on Reply
#28
ARF
bug: Are you aware multi-GPU support still exists? It's built right into DirectX now. What changed is AMD and Nvidia gave up doing all the work. Game engine developers are still able to offer you multi-GPU. But they also aren't eager to do the work, because, as I have already told you, the work doesn't pay off.
Keeping flagships up and running is very good advertising for these companies.
The Radeon RX 6950 XT is the current flagship, but it is slower than the competition.

Imagine if AMD had kept its original "sweet-spot" strategy and released the ultimate dual-GPU Radeon RX 6995X2 with two Radeon RX 6800 GPUs, which would ultimately destroy any nvidia offering... Would it not be better for AMD?
Of course...

But who thinks about it?
Posted on Reply
#29
Patriot
Mussels: This seems obvious; NVLink is for the data center/enterprise segment, so it'll stick around.


SLI, on the other hand, is dead. So very, very dead.
Yeah, why were they expecting it to leave?
Posted on Reply
#30
ghazi
ARF: Why would anyone not want to see multi-GPU configurations healthy, up and running?
AMD's CrossFire has been a very nice approach to get some more FPS, significantly so in the best-optimised scenarios.

You can't get any more performance out of a single Radeon RX 6950 XT, but if you pair two in CF, you will probably get at least a 50-60% FPS uplift on average.


Using multiple chiplets on a single PCB is a very similar approach to multi-GPU on several PCBs.
Well, the answers are obvious and have already been given. How do you split the workload between two entirely separate compute engines? You either split the frame or do alternate frame rendering. Both introduce lag and stutter that is completely unavoidable and inherent to the design. I too miss the days of CFX/SLI, but at the end of the day, it was not worth the investment: maybe it was for a few people running quadfire 7970s with an 8000x2560 eyefinity setup, but not all the industry players who had to support the crap, or the market as a whole.

The whole point of the chiplet direction is that the separate dice no longer function as separate processors. That's not the case yet for the HPC cards, where those scheduling concerns don't play into the massively parallel workloads. But what you want is a single command front-end along with some other modifications that allow the compute chiplets to work in tandem with a single master chiplet that interfaces with the rest of the system. This results in an entirely software agnostic implementation where the MCM is seen as one GPU by the OS/driver.
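As an aside, the pacing problem with alternate-frame rendering mentioned above is easy to see even in a toy model: give two GPUs slightly different frame times, hand frames out round-robin, and the present-to-present gaps become uneven. The numbers below are purely illustrative assumptions, not measurements:

```cpp
// Toy model of alternate-frame rendering (AFR): two GPUs render frames
// round-robin, each at its own (assumed) frame time. Even with perfect
// scaling, presentation intervals depend on when *each* GPU finishes,
// which is where the classic AFR micro-stutter comes from.
#include <cstdio>
#include <vector>

int main() {
    const double kRenderMs[2] = {16.0, 18.0};  // GPU1 slightly slower (toy assumption)
    double gpuFreeAt[2] = {0.0, 0.0};          // when each GPU can start its next frame
    std::vector<double> presentTimes;

    for (int frame = 0; frame < 8; ++frame) {
        int gpu = frame % 2;                            // AFR: round-robin dispatch
        double done = gpuFreeAt[gpu] + kRenderMs[gpu];  // this frame finishes here
        gpuFreeAt[gpu] = done;
        presentTimes.push_back(done);
    }
    for (size_t i = 1; i < presentTimes.size(); ++i)
        std::printf("frame %zu presented after %.1f ms gap\n",
                    i, presentTimes[i] - presentTimes[i - 1]);
    return 0;
}
```

With a 16 ms and an 18 ms GPU the gaps come out as 2, 14, 4, 12, 6, 10, 8 ms: the average interval is about 8 ms (so the average frame rate roughly doubles), but the unevenness is baked into the dispatch scheme itself.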
ARF: Yes, yes. :rolleyes:

The market has two options:
1. Top-down, a forced kill-off which doesn't take the users' preferences into account;
2. Bottom-up, where users force the game developers to get their act together and begin to resurrect CF/multi-GPU support:

AMD Radeon RX 6800 XT tested in Multi-GPU configuration - VideoCardz.com
That simply will not happen. Putting the onus on the game developers with mGPU led to its own death. A lot of people were saying things like, "why would devs bother investing for RTX so 5% of the market can get some fancy reflections at 30 FPS for a few minutes until they turn it back off". That same line of reasoning applies several times over in this case. The benchmarks you show are almost all of the benchmarks that exist because no games offer the support.

I would also like to remind you that, quite ironically, your benchmarks are looking at average FPS, primarily in games that are already above 144fps. Is 230 avg 120 min better than 160 avg 120 min? I would say no...
Posted on Reply
#31
DemonicRyzen666
AMD's current cards support more mGPU games than Nvidia's. It has to do with how driver flags from DX11 are implemented in DX12 games; certain game engines still look for those flags, even on DX12.

Every generation so far, both manufacturers have only had an increase of about 50-60% over their last-generation GPU. Consider that going from 1080p to 2160p quadruples the pixel count (doubling both dimensions quadruples the area). It is pointless to think that a single card in the next 3-4 years will be able to do 2160p at 120 FPS or higher with that limited increase in performance; mathematically it does not add up. And I haven't even added in what ray tracing takes away in performance.
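As a quick sanity check of that arithmetic, taking the ~50-60% per-generation figure above at face value (the generation count below is just an illustration of the math, not a claim from the post):

```latex
\[
\frac{3840 \times 2160}{1920 \times 1080} = 4, \qquad
1.5^{\,n} = 4 \;\Rightarrow\; n = \frac{\ln 4}{\ln 1.5} \approx 3.4, \qquad
1.6^{\,n} = 4 \;\Rightarrow\; n \approx 2.9
\]
```

In other words, roughly three to four such generational jumps are needed before raw throughput alone covers the move from 1080p to 2160p at an unchanged frame rate.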
Posted on Reply
#32
Mussels
Freshwater Moderator
ARF: Why would anyone not want to see multi-GPU configurations healthy, up and running?
AMD's CrossFire has been a very nice approach to get some more FPS, significantly so in the best-optimised scenarios.

You can't get any more performance out of a single Radeon RX 6950 XT, but if you pair two in CF, you will probably get at least a 50-60% FPS uplift on average.


Using multiple chiplets on a single PCB is a very similar approach to multi-GPU on several PCBs.
Because the previous implementations honestly sucked; they were a hacked-in solution with a lot of complications.

VRAM didn't add up, it was broken in various game engines, and it required non-stop tweaks and updates from the GPU developers to stop it breaking titles, as well as precisely matching cards (yes, AMD let you vary that a little, by reducing performance).


DX12 was meant to fix that situation, but it never did.
Posted on Reply
#33
Vayra86
DemonicRyzen666: AMD's current cards support more mGPU games than Nvidia's. It has to do with how driver flags from DX11 are implemented in DX12 games; certain game engines still look for those flags, even on DX12.

Every generation so far, both manufacturers have only had an increase of about 50-60% over their last-generation GPU. Consider that going from 1080p to 2160p quadruples the pixel count (doubling both dimensions quadruples the area). It is pointless to think that a single card in the next 3-4 years will be able to do 2160p at 120 FPS or higher with that limited increase in performance; mathematically it does not add up. And I haven't even added in what ray tracing takes away in performance.
You do realize that in the SLI days, like Kepler, the gen-to-gen performance jump was in fact smaller? 670 > 770 was +25%. The 970 was again about 30% faster.

Now fast-forward to today. Maxwell > Pascal was a 45~50% performance jump per tier. Turing wasn't as big a jump, but it still rivalled older generational jumps while adding hardware for RT. Ampere was again a major leap, albeit with a power increase; the generational performance increase has truly never been higher on GPUs. I'm not really mentioning AMD here because there is no consistency, but they've either followed suit or stalled along the way.

It is in fact the polar opposite of what you're thinking. Right now, we've already had such major performance jumps that 4K has become feasible within two generations of it slowly gaining market share on monitors and TVs. Resolution isn't what kills your performance. New games do.

It's really about a bit more than API flags, btw. I ran 660s in SLI... it was horror. More often than not, the experience was either subpar or you just got half the performance of what you paid for, but STILL had the extra heat, noise, etc.
ARF: I don't know.
I can only guess that there is very strong anti-multi-GPU opposition from nvidia because AMD's original strategy for "sweet spot" GPUs was to use X2 cards.

If only I were the manager in charge at AMD...
Right. Remember this?

arstechnica.com/gadgets/2016/07/amd-rx-480-crossfire-vs-nvidia-gtx-1080-ashes/

You'd be another Raja ;)
Posted on Reply
#34
chrcoluk
ghazi: Well, the answers are obvious and have already been given. How do you split the workload between two entirely separate compute engines? You either split the frame or do alternate frame rendering. Both introduce lag and stutter that is completely unavoidable and inherent to the design. I too miss the days of CFX/SLI, but at the end of the day, it was not worth the investment: maybe it was for a few people running quadfire 7970s with an 8000x2560 eyefinity setup, but not all the industry players who had to support the crap, or the market as a whole.

The whole point of the chiplet direction is that the separate dice no longer function as separate processors. That's not the case yet for the HPC cards, where those scheduling concerns don't play into the massively parallel workloads. But what you want is a single command front-end along with some other modifications that allow the compute chiplets to work in tandem with a single master chiplet that interfaces with the rest of the system. This results in an entirely software agnostic implementation where the MCM is seen as one GPU by the OS/driver.

That simply will not happen. Putting the onus on the game developers with mGPU led to its own death. A lot of people were saying things like, "why would devs bother investing for RTX so 5% of the market can get some fancy reflections at 30 FPS for a few minutes until they turn it back off". That same line of reasoning applies several times over in this case. The benchmarks you show are almost all of the benchmarks that exist because no games offer the support.

I would also like to remind you that, quite ironically, your benchmarks are looking at average FPS, primarily in games that are already above 144fps. Is 230 avg 120 min better than 160 avg 120 min? I would say no...
60 FPS and I'm happy :).

SLI is inefficient, hard to get working properly, and would make GPUs even more scarce, so yep, I agree with the popular view. :)
Posted on Reply
#35
Lycanwolfen
I'm not running an IPS 4K monitor. They are small, and well, smaller with higher res is expensive. In the TV market you can find some sweet deals, but you have to really look at the specs. I bought a Samsung TV for like $299.00 in a Boxing Day sale. There was a 40-inch and even a 43-inch that was on sale for $100.00 less, but the 43-inch did not have the specs. The 40-inch supported HDMI 2.1 and a 120 Hz refresh rate, and it also supported 1.07 billion colors versus 16.7 million.
I have had it for 6 years now, and 4K gaming on it is sweet. It also has 3 HDMI inputs, all supporting it, so I have my PC, PS4 Pro, and original PS3 all plugged up.

Good site to check out is
www.rtings.com/tv/reviews/best/by-usage/video-gaming

You can get a 40 inch TV cheaper than any monitor and get almost the same results.
chrcoluk: 60 FPS and I'm happy :).

SLI is inefficient, hard to get working properly, and would make GPUs even more scarce, so yep, I agree with the popular view. :)
SLI is easy to set up. The major problem I always found with SLI setups, back when I ran my own computer store, was that people would plug the cards in and think they had SLI right off the bat. One simple little click in the driver and voila, SLI working in all games.

I could get SLI working in games that did not even support it.

Look, not seeing SLI inside the game config does not mean the game does not support it. There is no SLI config in Unreal Tournament 2003 or Quake 3 Arena; all SLI configuration was done in the driver, not the game.
Posted on Reply
#36
64K
Mussels: This seems obvious; NVLink is for the data center/enterprise segment, so it'll stick around.


SLI, on the other hand, is dead. So very, very dead.
SLI was dead a long, long time ago. It caused more problems than it solved, even when it was supported by developers and Nvidia.

The last dual-GPU gaming card from Nvidia was the Titan Z. That was 8 years ago, and it was absurdly priced at $3,000. That should have been a warning, to anyone paying at least a little attention, that SLI was going to be history.

The future is MCM.
Posted on Reply
#37
Lycanwolfen
Just wait till 7680×4320 comes out. Unless there is SLI or CrossFire on a single card, there will be no way to run it, unless you like playing games at 20 FPS.

Oh, and yes, the 3090 has that feature where you turn on AI deep learning to increase FPS. You know what it really does? It just cuts corners to produce those higher FPS: if it does not need to render long-distance items, it blurs them.
Posted on Reply
#38
64K
Lycanwolfen: Just wait till 7680×4320 comes out. Unless there is SLI or CrossFire on a single card, there will be no way to run it, unless you like playing games at 20 FPS.
According to the Steam Hardware Survey, the vast majority are still on 1080p. Only 2.6% report using 4K.

8K is not going to go mainstream for decades. Anyone who plans to buy an 8K gaming monitor before then will find it's just a money pit to game on and hardly worth it.
Posted on Reply
#39
Vayra86
Lycanwolfen: Just wait till 7680×4320 comes out. Unless there is SLI or CrossFire on a single card, there will be no way to run it, unless you like playing games at 20 FPS.

Oh, and yes, the 3090 has that feature where you turn on AI deep learning to increase FPS. You know what it really does? It just cuts corners to produce those higher FPS: if it does not need to render long-distance items, it blurs them.
Bigger numbers do not equal better games or gaming. PPI is a thing; most people game at a downscaled res, or a lower internal render res, on 4K panels, and they own one for the simple fact that there is nothing else anymore.

Even ultrawide has more traction in the gaming market than a 4x resolution boost, for very little reason other than 'omg its more pixuls'. After 15 minutes of gaming you stop noticing the added detail because, quite simply, PPI is a thing your brain won't ignore. In a desktop setting, 4K is overkill.

The fact that some enthusiasts and YouTubers do the marketing for corporations does not make it something the majority either needs or wants. 8K is another step in the wrong, utterly wasteful direction. Maybe you missed the climate discussion lately... the age of rampant growth is finite; in comes an age of bare necessity. It's about time, too. mGPU is a horribly wasteful tech.
Posted on Reply
#40
Lycanwolfen
Vayra86: Bigger numbers do not equal better games or gaming. PPI is a thing; most people game at a downscaled res, or a lower internal render res, on 4K panels, and they own one for the simple fact that there is nothing else anymore.

Even ultrawide has more traction in the gaming market than a 4x resolution boost, for very little reason other than 'omg its more pixuls'. After 15 minutes of gaming you stop noticing the added detail because, quite simply, PPI is a thing your brain won't ignore. In a desktop setting, 4K is overkill.

The fact that some enthusiasts and YouTubers do the marketing for corporations does not make it something the majority either needs or wants. 8K is another step in the wrong, utterly wasteful direction. Maybe you missed the climate discussion lately... the age of rampant growth is finite; in comes an age of bare necessity. It's about time, too. mGPU is a horribly wasteful tech.
You see, if we go into submission and settle for 1440p, then 4K becomes nothing but a pipe dream. I love 4K, man, it's sweet; games never looked so good. Gamers did not strive for it, they just went into submission, and marketing brainwashed them into less for more. Just as an example, a 3090 is like 5 grand in Canada, if you can find one; I can buy a second-hand car for that price. I think Nvidia will do an SLI single card, maybe a dual-GPU setup, for high-end gamers. I mean, that's what Voodoo did in the 2000s, 20 years ago.
Posted on Reply
#41
64K
Lycanwolfen: You see, if we go into submission and settle for 1440p, then 4K becomes nothing but a pipe dream. I love 4K, man, it's sweet; games never looked so good. Gamers did not strive for it, they just went into submission, and marketing brainwashed them into less for more. Just as an example, a 3090 is like 5 grand in Canada, if you can find one; I can buy a second-hand car for that price. I think Nvidia will do an SLI single card, maybe a dual-GPU setup, for high-end gamers. I mean, that's what Voodoo did in the 2000s, 20 years ago.
SLI cards aren't the problem. It's the developer support in games that is not available. A good example is the dual-GPU GTX 690 versus the single-GPU GTX 680. A 690 is two 680s paired together on one card, but even years ago we saw them bench the same FPS because only one of the GPUs on the 690 was being used.

There's no place for SLI anymore. There are only a few die shrinks to be had before they make GPUs too expensive to continue with.

Multi-chip modules are the future.
Posted on Reply
#42
Vayra86
Lycanwolfen: You see, if we go into submission and settle for 1440p, then 4K becomes nothing but a pipe dream. I love 4K, man, it's sweet; games never looked so good. Gamers did not strive for it, they just went into submission, and marketing brainwashed them into less for more. Just as an example, a 3090 is like 5 grand in Canada, if you can find one; I can buy a second-hand car for that price. I think Nvidia will do an SLI single card, maybe a dual-GPU setup, for high-end gamers. I mean, that's what Voodoo did in the 2000s, 20 years ago.
Games got heavier irrespective of resolution. You can run tons of games in 4K, but not the supposed triple-A cutting edge... which in fact is nothing but a bunch of ultra settings that make very little sense... and perhaps a few weak RT effects. The core of most game development is scoped to console capability, not the PC.

Again... marketing vs reality vs your view on what's what. The vast majority of games release on engines that are fully able to push 4K efficiently. Tricks are already deployed, as I pointed out. Don't mistake submission for a complete lack of actual demand. You can run anything at any resolution; just tweak it to fit your use case. Between FSR and DLSS, things only got more accessible.

People want games before pixels; higher res was always pure luxury, and it will remain so. It's as old as running 1600x1200 on your CRT.

Also, what is submission to 1440p in the eyes of a GPU vendor anyway? Not exactly a good business case, is it? Nvidia and AMD have every reason to push the bar ever higher. Why do you think the AMD-powered consoles tout themselves as 4K machines?

The fact is, they figured out some time ago that people won't just buy into 4K, for all the aforementioned reasons. They ain't as stupid as we are... so Nvidia quickly pushed RTX out to create a new buyer incentive... too bad they forgot to bring some actual content along. They rushed it to market because they knew Pascal was fine, too fine even, and AMD owned the console space. They had to have a new way to differentiate and sell product.
Wake up, man ;)
64K: SLI cards aren't the problem. It's the developer support in games that is not available. A good example is the dual-GPU GTX 690 versus the single-GPU GTX 680. A 690 is two 680s paired together on one card, but even years ago we saw them bench the same FPS because only one of the GPUs on the 690 was being used.

There's no place for SLI anymore. There are only a few die shrinks to be had before they make GPUs too expensive to continue with.

Multi-chip modules are the future.
MCM is a case of seeing is believing... but yeah, it seems to be moving that way.
Posted on Reply
#43
64K
Vayra86: Games got heavier irrespective of resolution. You can run tons of games in 4K, but not the supposed triple-A cutting edge... which in fact is nothing but a bunch of ultra settings that make very little sense... and perhaps a few weak RT effects. The core of most game development is scoped to console capability, not the PC.

Again... marketing vs reality vs your view on what's what. The vast majority of games release on engines that are fully able to push 4K efficiently. Tricks are already deployed, as I pointed out. Don't mistake submission for a complete lack of actual demand. You can run anything at any resolution; just tweak it to fit your use case. Between FSR and DLSS, things only got more accessible.

People want games before pixels; higher res was always pure luxury, and it will remain so. It's as old as running 1600x1200 on your CRT.

Also, what is submission to 1440p in the eyes of a GPU vendor anyway? Not exactly a good business case, is it? Nvidia and AMD have every reason to push the bar ever higher. Why do you think the AMD-powered consoles tout themselves as 4K machines?

The fact is, they figured out some time ago that people won't just buy into 4K, for all the aforementioned reasons. They ain't as stupid as we are... so Nvidia quickly pushed RTX out to create a new buyer incentive... too bad they forgot to bring some actual content along. They rushed it to market because they knew Pascal was fine, too fine even, and AMD owned the console space. They had to have a new way to differentiate and sell product.
Wake up, man ;)


MCM is a case of seeing is believing... but yeah, it seems to be moving that way.
The latest rumor is that the RX 7970 XT will be MCM and release Q4 this year. It could easily get delayed though.
Posted on Reply
#44
tpu7887
Lycanwolfen: You see, if we go into submission and settle for 1440p, then 4K becomes nothing but a pipe dream. I love 4K, man, it's sweet; games never looked so good. Gamers did not strive for it, they just went into submission, and marketing brainwashed them into less for more. Just as an example, a 3090 is like 5 grand in Canada, if you can find one; I can buy a second-hand car for that price. I think Nvidia will do an SLI single card, maybe a dual-GPU setup, for high-end gamers. I mean, that's what Voodoo did in the 2000s, 20 years ago.
Why do you blatantly lie?
(rhetorical question)

3090s are accessible and less than $2k.

www.canadacomputers.com/product_info.php?cPath=43_557_559&item_id=181351
Posted on Reply
#45
ARF
tpu7887: Why do you blatantly lie?
(rhetorical question)

3090s are accessible and less than $2k.

www.canadacomputers.com/product_info.php?cPath=43_557_559&item_id=181351
The question is why you would pay even that much for it. It's 2020 technology.
Until very recently, the guy was speaking the absolute truth.

About 4K 3840x2160 - it should be the mainstream resolution today. 2560x1440 should be low-end, entry-level, and 5K, 6K, 8K, etc. should be enthusiast-class, but not necessary to upgrade to...
Posted on Reply
#46
tpu7887
ARF: The question is why you would pay even that much for it. It's 2020 technology.
Until very recently, the guy was speaking the absolute truth.

About 4K 3840x2160 - it should be the mainstream resolution today. 2560x1440 should be low-end, entry-level, and 5K, 6K, 8K, etc. should be enthusiast-class, but not necessary to upgrade to...
"very recently"
Prices have been down for months!
I never said the state of affairs wasn't ridiculous, I just hate blatant exaggerations.

For reference, I didn't pay much attention to games for a good 7-8 years from 2013. Except UT2004, I played that now and then. When I started checking things out again I thought for sure I'd be blown away by the graphics - I was underwhelmed. Extremely. Literally almost no progression. If you disregard ray tracing (which you can pretty much do) there's actually been no progression. Sure frame rates are up and 720p/1080p/1440p is now 1080p/1440p/4k, and if you look veeeerry carefully things look ever so slightly more realistic...
Posted on Reply
#47
Vayra86
tpu7887"very recently"
Prices have been down for months!
I never said the state of affairs wasn't ridiculous, I just hate blatant exaggerations.

For reference, I didn't pay much attention to games for a good 7-8 years from 2013. Except UT2004, I played that now and then. When I started checking things out again I thought for sure I'd be blown away by the graphics - I was underwhelmed. Extremely. Literally almost no progression. If you disregard ray tracing (which you can pretty much do) there's actually been no progression. Sure frame rates are up and 720p/1080p/1440p is now 1080p/1440p/4k, and if you look veeeerry carefully things look ever so slightly more realistic...
Well... there are real gems in graphics that are truly new and it definitely is more detailed but yeah, none of these shiny effects truly enabled 'new gameplay'.

It's really more of the same; even RT effects are just another 'pass' of polish over what is essentially the same core of graphics. I would have expected way better integration; even 15 years ago, games would play around with light and shadow, tying gameplay elements to it (stealth, for example). That kind of integration of gameplay and graphics, or of things such as physics, is so rare... I don't get it.
ARF: About 4K 3840x2160 - it should be the mainstream resolution today. 2560x1440 should be low-end, entry-level, and 5K, 6K, 8K, etc. should be enthusiast-class, but not necessary to upgrade to...
If everyone is on 4K, what is the upgrade path exactly? How would Nv sell the 4xxx series?
Posted on Reply
#48
ARF
Vayra86: If everyone is on 4K, what is the upgrade path exactly? How would Nv sell the 4xxx series?
- Higher FPS;
- New more demanding games;
- Higher Hz 4K screens - 4K@60, 4K@120, 4K@144, 4K@360...
Posted on Reply
#49
Vayra86
ARF: - Higher FPS;
- New more demanding games;
- Higher Hz 4K screens - 4K@60, 4K@120, 4K@144, 4K@360...
Yes, and now reality: even HDMI 2.1 is not mainstream yet. You're mistaking wishful thinking for economic reality.
Posted on Reply
#50
ARF
Vayra86: Yes, and now reality: even HDMI 2.1 is not mainstream yet. You're mistaking wishful thinking for economic reality.
Innovation can't be driven by pure corporate greed; you should help by explaining to everyone around you why they should move to higher resolutions - crisper, higher-quality images, much improved ergonomics, etc.
Posted on Reply