Thursday, June 2nd 2022
Alleged NVIDIA AD102 PCB Drawing Reveals NVLink is Here to Stay, Launch Timelines Revealed
An alleged technical drawing of the PCB of the reference-design board for NVIDIA's "Ada" AD102 silicon was leaked to the web, courtesy of Igor's Lab. It reveals a large GPU pad that's roughly the size of the GA102 (the size of the fiberglass substrate or package only, not the die), surrounded by twelve memory chips, which are likely GDDR6X. There are also provisions for at least 24 power phases, although not all of them are populated with sets of chokes and DrMOS in the final products (a few of them end up vacant).
We also spy the 16-pin ATX 3.0 power connector that's capable of delivering up to 600 W of power, and four display outputs, including a USB-C in lieu of a larger connector (such as DP or HDMI). A curious thing to note is that the card continues to have an NVLink connector. Explicit multi-GPU is all but dead, which means the NVLink on the reference design will likely play only a rudimentary role in the GeForce RTX product (unless it's used for implicit multi-GPU). The connector may play a bigger role in the professional-visualization graphics cards (RTX AD-series) based on this silicon.

Igor's Lab also put out timelines for the product-development cycle of the next-generation GeForce RTX product based on the AD102. It speaks of a bill-of-materials release in June 2022 (which should help NVIDIA finalize pricing), with design and validation running through the summer (June through August), and mass-production commencing either toward the end of August or sometime in September. It could be late October or November by the time these cards reach the shelves. This leaves NVIDIA with quite some time to get the market to digest existing inventory of GeForce "Ampere" graphics cards, which could see steadily decreasing prices and increased market availability. Thanks to the crypto-crash, miners could also flood the market with used cards at attractive prices, which could draw some attention, too.
Sources:
Igor's Lab, VideoCardz
SLI, on the other hand, is dead. So very, very dead.
AMD's CrossFire has been a very nice approach to getting some more FPS, significantly so in the best-optimised scenarios.
You can't get any more performance out of a single Radeon RX 6950 XT, but if you pair two in CF, you will probably get at least a 50-60% FPS uplift on average.
Using multiple chiplets on a single PCB is a very similar approach to multi-GPU on several PCBs.
Want more reasons? This was really never a great idea, but rather a necessity to cater to a higher-end segment. Now that demand/segment is served by a GPU stack with a higher price ceiling instead, and with a much bigger performance gap from top to bottom.
I guess someone didn't want to put enough effort into getting the whole initiative up and running well.
It never really was; it was always a highly wasteful practice. Anything that won't scale 100% is a waste in that sense. The reasons SLI/CrossFire existed are simple: the price of two lower-tier GPUs was often competitive against a single top-end GPU, and by combining two top-end GPUs, you could get a ridiculous amount of frames. Badly scaled, regardless, but big numbers.
So what market was this really for, anyway? The budget gamer who wants more bang for the buck (and gets a whole bunch of problems along with maybe 10% in savings), and a niche that now buys upwards of 2000-dollar GPUs anyway.
You probably know that scaling with added shader count is never linear - 100% more shaders doesn't mean 100% higher FPS.
CF, on the other hand, can sometimes mitigate bottlenecks elsewhere and give a higher-than-100% FPS uplift:
Crossfire scaling for the Fury X is over 100% (Firestrike) with voltage unlocked + OC : Amd (reddit.com)
Fury X Crossfire Almost 100% scaling @ 4k ! - Graphics Cards - Linus Tech Tips
Radeon 7770 crossfire 100% scaling @ 1200p | Tom's Hardware Forum (tomshardware.com)
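To put numbers on uplift claims like the ones quoted above, here's a minimal Python sketch of how scaling percentages are typically derived; every FPS value in it is a hypothetical placeholder, not a measured result:

```python
# Minimal sketch: quantifying multi-GPU scaling from FPS measurements.
# All FPS values below are hypothetical placeholders, not benchmark data.

def dual_gpu_uplift(fps_single: float, fps_dual: float) -> float:
    """FPS gain from adding a second GPU, as a fraction (1.0 == a perfect +100%)."""
    return fps_dual / fps_single - 1.0

examples = {
    "well-optimised title":    (60.0, 115.0),  # close to ideal scaling
    "typical CF/SLI profile":  (60.0,  95.0),  # the oft-quoted ~50-60% uplift
    "broken/negative profile": (60.0,  54.0),  # second GPU actively hurts
}

for name, (single, dual) in examples.items():
    uplift = dual_gpu_uplift(single, dual)
    print(f"{name:25s} {single:5.1f} -> {dual:5.1f} FPS  uplift {uplift:+.0%}")
```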
The market has two options:
1. A top-down forced kill-off, which doesn't take users' preferences into account;
2. A bottom-up push, where users force the game developers to get their act together and begin resurrecting CF/multi-GPU support:
AMD Radeon RX 6800 XT tested in Multi-GPU configuration - VideoCardz.com
The same engineering that makes the chiplets work for GPU workloads may also allow some synergy with chiplets on other packages/cards.
So, I'm not sure it's dead. Very exciting stuff.
The feeling I got was rather that the monitor industry is somewhat lagging. It took a long time for relatively good 27-28 inch 4K high-refresh IPS monitors to arrive, and high-refresh 4K 30+ inch monitors are only just arriving. But they are pricey, and it's hard for monitor companies to sell us old technologies (VA, IPS) with all their flaws at a premium when we can go and buy a 55" high-refresh OLED for roughly the same money. Something isn't priced right here!
4K Monitor price comparison | Buy cheap at idealo
The idea did not work, plain and simple. Most of its touted appeal was "instead of an upgrade, just throw a second (cheap) card in there and you're good." That clearly didn't work; buying a new-generation card was always better. SLI/CrossFire only made some sense for those that had already bought the highest-end GPUs and couldn't get better performance any other way. But since those are a tiny fraction of the market, the business case just wasn't there.
You mean two Radeon RX 6800 XT 16 GB cards in CrossFire in 2020 are better than one Radeon RX 7800 XT in 2023? :D
Sure, reviews could gather a handful of games that did scale well with two cards (even if the issues with frame pacing were largely ignored), and they did show that in some games it didn't work at all, and you could even get negative scaling...
But if you weren't playing the latest AAA game on the latest engine, you were out of luck. And even with games that did work, every game update and every new driver meant new fears that something would be broken. SLI/CrossFire was never treated as a feature that had to work.
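A minimal sketch of why that frame-pacing caveat matters, using synthetic frame times rather than captured data: alternate-frame rendering can post a higher average FPS while the slowest frames, which are what you actually feel, get worse.

```python
# Minimal sketch of why frame pacing matters for multi-GPU averages.
# The frame times below are synthetic examples, not captured data:
# alternate-frame rendering often produces short/long frame pairs
# (micro-stutter) even when the average FPS looks healthy.

import statistics

# Hypothetical per-frame times in milliseconds.
single_gpu = [16.7] * 100      # steady ~60 FPS
dual_gpu   = [8.0, 22.0] * 50  # higher average, but uneven pacing

def report(label: str, frame_times_ms: list[float]) -> None:
    avg_fps = 1000.0 / statistics.mean(frame_times_ms)
    # "1% low" style metric: FPS implied by the slowest 1% of frames.
    worst = sorted(frame_times_ms)[-max(1, len(frame_times_ms) // 100):]
    low_fps = 1000.0 / statistics.mean(worst)
    print(f"{label:12s} avg {avg_fps:5.1f} FPS, 1% low {low_fps:5.1f} FPS")

report("single GPU", single_gpu)
report("dual GPU", dual_gpu)
```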
Of course, my expectations are too high, since we all know that AMD's business case is very poor, and its direct competitors manage the market better.
I can only guess that there is very strong anti-multi-GPU opposition from NVIDIA, because AMD's original strategy for "sweet spot" GPUs was to use X2 cards.
And NVIDIA, with its enormous power consumption, was always beaten by AMD's better performance-per-watt solutions.
Look at the Radeon HD 3870 and Radeon HD 3870 x2.
Then Radeon HD 4870 and Radeon HD 4870 x2...
Today the CF technology is much better than it was 15 years ago, and if they push for it, they can make it work.
Even if they split those R&D costs across the entire product line - the lineup is extremely overpriced today anyway, and can absorb those expenses.
If only I were the manager in charge at AMD...