Thursday, June 2nd 2022

Alleged NVIDIA AD102 PCB Drawing Reveals NVLink is Here to Stay, Launch Timelines Revealed

An alleged technical drawing of the PCB for reference-design NVIDIA "Ada" AD102 silicon was leaked to the web, courtesy of Igor's Lab. It reveals a large GPU pad that's roughly the size of the GA102 package (that is, the fiberglass substrate, not the die), surrounded by twelve memory chips, which are likely GDDR6X. There is also provision for at least 24 power phases, although not all of them are populated with sets of chokes and DrMOS in the final products (a few of them end up vacant).

We also spy the 16-pin ATX 3.0 power connector capable of delivering up to 600 W of power, and four display outputs, including a USB-C port in lieu of a larger connector (such as DP or HDMI). A curious thing to note is that the card continues to have an NVLink connector. Multi-GPU gaming is all but dead, which means NVLink will likely be rudimentary on the GeForce RTX product based on this board (unless it is used for implicit multi-GPU); the connector may play a bigger role in the professional-visualization graphics cards (RTX AD-series) based on this silicon.
Igor's Lab also put out timelines for the product-development cycle of the next-generation GeForce RTX product based on the AD102. It speaks of a bill-of-materials release in June 2022 (which should help NVIDIA finalize pricing), design and validation running through the summer (June through August), and mass production commencing toward the end of August or sometime in September. It could be late October or November by the time these cards reach the shelves. This leaves NVIDIA with quite some time to get the market to digest existing inventory of GeForce "Ampere" graphics cards, which could see steadily decreasing prices and increasing availability. Thanks to the crypto crash, miners could flood the market with used cards at attractive prices, which could get some attention, too.
Sources: Igor's Lab, VideoCardz

62 Comments on Alleged NVIDIA AD102 PCB Drawing Reveals NVLink is Here to Stay, Launch Timelines Revealed

#1
ZetZet
October/November instead of July/August as the leakers were saying. Not surprising.
Posted on Reply
#2
Mussels
Freshwater Moderator
This seems obvious: NVLink is for the data center/enterprise segment, so it'll stick around.


SLI, on the other hand, is dead. So very, very dead.
Posted on Reply
#3
ARF
Why would anyone not want to see multi-GPU configurations healthy, up and running?
AMD's CrossFire has been a very nice approach to get some more FPS, significantly so in the best optimised scenarios.

You can't get any more performance out of a single Radeon RX 6950 XT, but if you pair two in CF, you'll probably get at least a 50-60% FPS uplift on average.


Using multiple chiplets on a single PCB is a very similar approach to multi-GPU on several PCBs.
Posted on Reply
#4
Vayra86
ARF: Why would anyone not want to see multi-GPU configurations healthy, up and running?
AMD's CrossFire has been a very nice approach to get some more FPS, significantly so in the best optimised scenarios.

You can't get any more performance out of a single Radeon RX 6950 XT, but if you pair two in CF, you'll probably get at least a 50-60% FPS uplift on average.


Using multiple chiplets on a single PCB is a very similar approach to multi-GPU on several PCBs.
Power usage / scaling. More chips for equal perf. Frametime consistency issues. Required dev and driver support. Engine support. Heat and limited OC potential because of the top GPU...

Want more reasons? This was really never a great idea, but rather a necessity to cater to a higher-end segment. Now that demand/segment is served by a GPU stack with a higher price ceiling instead, and a much bigger perf gap from top to bottom.
Posted on Reply
#5
ARF
Vayra86: Power usage / scaling. More chips for equal perf. Frametime consistency issues. Required dev and driver support. Engine support. Heat and limited OC potential because of the top GPU...

Want more reasons? This was really never a great idea, but rather a necessity to cater to a higher-end segment. Now that demand/segment is served by a GPU stack with a higher price ceiling instead, and a much bigger perf gap from top to bottom.
I see, but we also have sufficiently improved AI technologies that could be used for better frame-time consistency, AFR-related micro-stutter, etc.
I guess someone didn't want to put enough effort into getting the whole initiative up and running well.
Posted on Reply
#6
Unregistered
Yeah, I really miss multi-GPU. I had a great experience with all of my mGPU setups, and with DX12/Vulkan having the support integrated into the API (so the devs just have to code for it in their games), I had hoped it would be a good thing for the market and remove the whole driver-profile requirement.
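
For context on what that "integrated support" boils down to at the API level, here is a minimal, illustrative sketch (my own illustration, not anything from this thread), assuming Windows 10+ with the standard Windows SDK headers. It simply lists the GPUs a DX12 title could drive with explicit multi-adapter, and how many linked nodes (the kind an SLI/NVLink bridge exposes) each one reports:

// Minimal, illustrative sketch: enumerate the GPUs a DX12 title could drive
// with explicit multi-adapter, and show how many linked nodes each exposes.
// Assumes Windows 10+ with the Windows SDK; build with MSVC.
#include <windows.h>
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the WARP software rasterizer

        // Explicit (unlinked) multi-adapter: the app creates one device per
        // adapter and moves work and resources between them itself. Linked
        // adapters (driver-level SLI/CrossFire) instead show up as a single
        // device with more than one node.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
        {
            wprintf(L"Adapter %u: %ls, %u node(s)\n",
                    i, desc.Description, device->GetNodeCount());
        }
    }
    return 0;
}

Nothing in the API distributes frames automatically, though; splitting work across those adapters is entirely on the engine, which is a big part of why so few games ever shipped it.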
#7
Vayra86
ARF: I see, but we also have sufficiently improved AI technologies that could be used for better frame-time consistency, AFR-related micro-stutter, etc.
I guess someone didn't want to put enough effort into getting the whole initiative up and running well.
The effort just isn't economically interesting at all.

It never really was; it was always a highly wasteful practice. Anything that won't scale 100% is a waste in that sense. The reasons SLI/CrossFire existed are simple: a duo of lower-tier GPUs was often price-competitive against a single top-end GPU, and by combining two top-end GPUs you could get a ridiculous number of frames. Badly scaled, regardless, but big numbers.

So what market was this really for, anyway?! The budget gamer that wants more bang for the buck (and gets, along with a 10% saving, a whole bunch of problems), and a niche that now buys upwards of 2,000-dollar GPUs anyway.
Posted on Reply
#8
ARF
I don't think it would cost that much to pay good software developers to write the code properly to support CF.
You probably know that scaling with added shader count is never linear: 100% more shaders doesn't mean 100% higher FPS.

CF, on the other hand, can sometimes mitigate bottlenecks elsewhere and give a higher-than-100% FPS uplift.


Crossfire scaling for the Fury X is over 100% (Firestrike) with voltage unlocked + OC : Amd (reddit.com)

Fury X Crossfire Almost 100% scaling @ 4k ! - Graphics Cards - Linus Tech Tips

Radeon 7770 crossfire 100% scaling @ 1200p | Tom's Hardware Forum (tomshardware.com)
Posted on Reply
#9
Vayra86
ARF: I don't think it would cost that much to pay good software developers to write the code properly to support CF.
You probably know that scaling with added shader count is never linear: 100% more shaders doesn't mean 100% higher FPS.

CF, on the other hand, can sometimes mitigate bottlenecks elsewhere and give a higher-than-100% FPS uplift.


Crossfire scaling for the Fury X is over 100% (Firestrike) with voltage unlocked + OC : Amd (reddit.com)

Fury X Crossfire Almost 100% scaling @ 4k ! - Graphics Cards - Linus Tech Tips

Radeon 7770 crossfire 100% scaling @ 1200p | Tom's Hardware Forum (tomshardware.com)
Okay. You think so, but the entire market killed this idea, and not because they hate making money. The fact of the matter is that even DX12's mGPU plan was dropped like a stone.
Posted on Reply
#10
ARF
Yes, yes. :rolleyes:

The market has two options:
1. A top-down forced kill-off, which doesn't take users' preferences and likes into account;
2. A bottom-up push, where users force the game developers to get their act together and begin to resurrect CF/multi-GPU support:

AMD Radeon RX 6800 XT tested in Multi-GPU configuration - VideoCardz.com
As expected, the mGPU support makes more sense as the resolution increases (4K is heavily GPU bound). Depending on the game settings we are looking at a 101% to 156% performance scaling at 1080p resolution when two RX 6800 XT cards are used. At 1440p Deus Ex the performance even drops to 95% when mGPU is enabled. Overall it appears that Strange Brigade and Sniper Elite 4 are well optimized for mGPU settings:
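
For clarity, those percentages appear to follow the usual convention of dual-card FPS divided by single-card FPS, so 156% means 1.56x a single card and anything under 100% means enabling mGPU actually hurt performance. A quick sketch with made-up numbers (not benchmark data):

#include <cstdio>

// Illustrative only: the FPS figures below are placeholders, not benchmark data.
// Assumption: "scaling %" = dual-GPU FPS / single-GPU FPS * 100.
static double scaling_percent(double single_gpu_fps, double dual_gpu_fps)
{
    return dual_gpu_fps / single_gpu_fps * 100.0;
}

int main()
{
    printf("%.0f%%\n", scaling_percent(60.0, 93.6)); // 156%: a 56% uplift over one card
    printf("%.0f%%\n", scaling_percent(60.0, 57.0)); // 95%: mGPU made things slower
    return 0;
}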


Posted on Reply
#11
Solaris17
Super Dainty Moderator
btarunr: miners could flood the market with used cards at attractive prices, which could get some attention, too
While I know this will happen, I hope that every. single. one of them has to eat the cost of that card and they don't sell.
Posted on Reply
#12
R-T-B
Solaris17: While I know this will happen, I hope that every. single. one of them has to eat the cost of that card and they don't sell.
That's likely to hurt GPU buyers more than the miners, honestly.
Posted on Reply
#13
TheEndIsNear
I miss multiple GPUs. I've still got the 1500-watt power supply from my 295X2 and 290X tri-fire setup. That was a space heater. I don't care about power usage or heat. I want to have fun.
Posted on Reply
#14
MikeMurphy
Mussels: This seems obvious: NVLink is for the data center/enterprise segment, so it'll stick around.


SLI, on the other hand, is dead. So very, very dead.
Multi-chip modules (or chiplets) are the next generation of GPUs, as there are some major advantages over huge monolithic dies.

The same engineering that makes the chiplets work for GPU workloads may also allow some synergy with chiplets on other packages/cards.

So, I'm not sure it's dead. Very exciting stuff.
Posted on Reply
#15
Lycanwolfen
Who needs multi-GPU any more? The hardcore gamers have gotten soft. They have settled for less for more; 1440p is now the new 1080p. Gamers basically went into submission. I myself was looking forward to 2160p and even 8K gaming on home computers, and SLI and multi-GPU setups would have taken us there. But nope, everyone wanted 1440p and "the more Hz, the more it matters."

I remember playing on my 27-inch Flatron Samsung TV running 1024x1280 on my Voodoo 2s in SLI, playing Quake 3 Arena at 60 Hz and over 500 FPS; man, that was smooth. Then we switched to 1080p, games like Crysis and Far Cry came out, and our mouths dropped to the floor. We all needed new video cards to run these new games; I remember that just getting 60 FPS on my 8800 GTX was hard. It took a few years before the tech improved enough that we could play Crysis and Far Cry at 1080p above 200 FPS.

Then the next big jump was 2160p. That 4K crown was amazing, and whoever could game at it was jaw-dropping. At the time I had two 660 Tis in SLI, and at 1080p they ran everything so sweet, but when I switched to 4K, OMG, games that did 150 FPS dropped to 5 FPS. DOOM was the big thing back in 2016, and running it at 4K was like the king of the gaming crown. I ran my 660 Tis at 1080p on my 4K monitor till I got twin 1080 Tis in SLI. Finally 4K was real, and I was in gamer heaven: 100+ FPS in DOOM 2016 at 4K, man, it was sweet. I even had friends over, and man, they wanted that setup.

When the 2080 series came out I looked at it and said to myself, where is SLI? I wanted to see improvements at 4K; I wanted to see NVIDIA do the same as in the past: faster GPU and less power. I looked at all the benchmarks and reviews and never saw a giant leap toward 4K or 8K. Then I kept hearing about the 3000 series, and when it came out it was all about 1440p. I was like, wait a minute, did I miss something? Are we going backwards in time now, not forward? Why on earth would I want to go from 2160p to 1440p? I wanted to see improvements at the high end; I wanted to see a push for even 8K gaming. But no, nothing: here is a super expensive 3090 card that does great at 1440p. Why would I spend a huge amount of money on something that is hardly an improvement? All the review sites raving about the new card, "oh, the output is insane": a bunch of freaking lies is all I heard. Also, the power needed to run these cards was triple or even more than two 1080 Tis. NVIDIA's roadmap in 2016 was more GPU power, higher FPS and less power consumption. Now it seems to be more power consumption, less GPU power, and sell it to gamers and make them think they are getting the best.

I tested a 3090 just to see if it was better. I ran all my games at 4K, and the average FPS was maybe 10 FPS higher than my 1080 Tis in SLI; no real big improvement at all. I took the card back and said I wanted a refund. The guy at the counter asked why. I said: until NVIDIA actually starts making improvements on video cards. Even today 8K is becoming the new norm and 12K is starting to come out. I will wait and see about the 4000 series; if 4K does over 200 FPS then I will be impressed, and I might buy one to replace my SLI setup. Till then, NVIDIA and AMD do not impress me one damn bit. Also, the gamers and reviewers stuck in the past at 1440p: grow up!
Posted on Reply
#16
eidairaman1
The Exiled Airman
Solaris17: While I know this will happen, I hope that every. single. one of them has to eat the cost of that card and they don't sell.
There is one on here trying to sell many for 875 apiece; I doubt anyone has responded.
Posted on Reply
#17
Bwaze
Lycanwolfen: Who needs multi-GPU any more? The hardcore gamers have gotten soft. They have settled for less for more; 1440p is now the new 1080p. Gamers basically went into submission...
Lycanwolfen: ... Also, the gamers and reviewers stuck in the past at 1440p: grow up!
I never had that feeling. All reviewers now also include 4K numbers, and there are always articles about how to get there even with lower-end cards. And the high-end cards of this generation got more uplift at the higher resolution, so the 3080 and upwards really show their strength in 4K.

The feeling I got, rather, is that the screen industry is somewhat lagging. It took a long time for relatively good 27-28 inch 4K high-refresh IPS monitors to arrive, and high-refresh 4K 30+ inch monitors are only just arriving. But they are pricey, and it's hard for monitor companies to sell us old technologies (VA, IPS) with all their flaws at a premium when we can go and buy a 55" high-refresh OLED for roughly the same money. Something isn't priced right here!
Posted on Reply
#18
ARF
Bwaze: The feeling I got, rather, is that the screen industry is somewhat lagging. It took a long time for relatively good 27-28 inch 4K high-refresh IPS monitors to arrive, and high-refresh 4K 30+ inch monitors are only just arriving. But they are pricey
I think we have just reached the historical low in the price level of 3840x2160 screens:


4K Monitor Preisvergleich | Günstig bei idealo kaufen
Posted on Reply
#19
bug
ARF: I see, but we also have sufficiently improved AI technologies that could be used for better frame-time consistency, AFR-related micro-stutter, etc.
I guess someone didn't want to put enough effort into getting the whole initiative up and running well.
Spot on! I mean, they just launched SLI in 2004 and bam! only a short 18 years later, they give up.

The idea did not work, plain and simple. Most of its touted appeal was "instead of an upgrade, just throw a second (cheap) card in there and you're good". That clearly didn't work; buying a new-generation card was always better. SLI/CrossFire only made some sense for those that had already bought the highest-end GPUs and couldn't get better performance anywhere else. But since those are a tiny fraction of the market, the business case just wasn't there.
Posted on Reply
#20
ARF
bug: Spot on! I mean, they just launched SLI in 2004 and bam! only a short 18 years later, they give up.

The idea did not work, plain and simple. Most of its touted appeal was "instead of an upgrade, just throw a second (cheap) card in there and you're good". That clearly didn't work; buying a new-generation card was always better. SLI/CrossFire only made some sense for those that had already bought the highest-end GPUs and couldn't get better performance anywhere else. But since those are a tiny fraction of the market, the business case just wasn't there.
It works. Just with some compromises, but still better than a single, much slower one-GPU setup.

You mean two Radeon RX 6800 XT 16 GB in CrossFire in 2020 are better than one Radeon RX 7800 XT in 2023. :D
Posted on Reply
#21
Bwaze
SLI/CrossFire was always completely uninteresting to me, since it never worked correctly with less popular but still graphically demanding games such as flight simulators.

Sure, reviews could gather a handful of games that did scale well with two cards (even if the issues with frame pacing were largely ignored), and they did show you that in some games it doesn't work at all, and you could even get negative scaling...

But if you weren't playing the latest AAA game with the latest engine, you were screwed. And even with games that did work, every game update and every new driver meant new fears that something would be broken. SLI/CrossFire was never treated as a function that had to work.
Posted on Reply
#22
bug
ARF: It works. Just with some compromises, but still better than a single, much slower one-GPU setup.

You mean two Radeon RX 6800 XT 16 GB in CrossFire in 2020 are better than one Radeon RX 7800 XT in 2023. :D
Once again, those GPUs are well below 10% market share. There's no business case here.
Posted on Reply
#23
ARF
bug: Once again, those GPUs are well below 10% market share. There's no business case here.
Sometimes you must invest more effort in brand image and reputation.
Of course, my expectations are too high, since we all know that AMD's business case is very poor, and its direct competitors manage the market better.
Posted on Reply
#24
bug
ARF: Sometimes you must invest more effort in brand image and reputation.
Of course, my expectations are too high, since we all know that AMD's business case is very poor, and its direct competitors manage the market better.
And there is zero chance that, after more than 15 years, they simply gave up because it just didn't make sense paying for testing, support and keeping profiles updated for games both new and a decade old. Right?
Posted on Reply
#25
ARF
bug: And there is zero chance that, after more than 15 years, they simply gave up because it just didn't make sense paying for testing, support and keeping profiles updated for games both new and a decade old. Right?
I don't know.
I can only guess that there is very strong anti-multi-GPU opposition from NVIDIA, because AMD's original strategy for "sweet spot" GPUs was to use X2 cards.
And NVIDIA, because of its enormous power consumption, was always beaten by AMD's better performance-per-watt solutions.

Look at the Radeon HD 3870 and Radeon HD 3870 X2.
Then the Radeon HD 4870 and Radeon HD 4870 X2...

Today the CF technology is much better than it was 15 years ago, and if they push for it, they can make it work.
Even if they spread those R&D costs across the entire product line, the lineup is extremely overpriced today anyway and can absorb those expenses.

If only I were the manager in charge at AMD...
Posted on Reply