Thursday, June 2nd 2022
Alleged NVIDIA AD102 PCB Drawing Reveals NVLink is Here to Stay, Launch Timelines Revealed
An alleged technical drawing of the PCB for reference-design NVIDIA "Ada" AD102 silicon was leaked to the web, courtesy of Igor's Lab. It reveals a large GPU pad roughly the size of the GA102 (the size of the fiberglass substrate or package only, not the die), surrounded by twelve memory chips, which are likely GDDR6X. There is also provision for at least 24 power phases, although not all of them will be populated with chokes and DrMOS in the final products (a few will end up vacant).
We also spy the 16-pin ATX 3.0 power connector that's capable of delivering up to 600 W of power, and four display outputs, including a USB-C in lieu of a larger connector (such as DP or HDMI). A curious thing to note is that the card continues to have an NVLink connector. Multi-GPU is dead, which means the NVLink on the reference design will likely be rudimentary on the GeForce RTX product (unless used for implicit multi-GPU). The connector may play a bigger role in the professional-visualization graphics cards (RTX AD-series) based on this silicon.

Igor's Lab also put out timelines for the product-development cycle of the next-generation GeForce RTX product based on the AD102. It speaks of a bill-of-materials release in June 2022 (which should help NVIDIA finalize pricing); design and validation running through the Summer (June through August); and mass production commencing either toward the end of August or sometime in September. It could be late October or November by the time these cards reach the shelves. This leaves NVIDIA with quite some time to let the market digest existing inventory of GeForce "Ampere" graphics cards, which could see steadily decreasing prices and increased availability. Thanks to the crypto-crash, miners could also flood the market with used cards at attractive prices, which could get some attention.
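As a purely illustrative aside, here is a rough sketch of why a board might provision that many phases. Every number below is an assumption made for the sake of the arithmetic, not something read off the leaked drawing: even if a large chunk of the 600 W budget lands on the GPU core rail, the per-phase current stays well within what typical DrMOS power stages handle.

# Hypothetical VRM load estimate; none of these values come from the leak.
core_power_w = 450.0      # assumed share of board power on the GPU core rail
core_voltage_v = 1.0      # assumed core voltage under load
populated_phases = 20     # assumed populated subset of the ~24 provisioned phases

core_current_a = core_power_w / core_voltage_v            # total core-rail current
current_per_phase_a = core_current_a / populated_phases   # load on each power stage

print(f"Total core current: {core_current_a:.0f} A")
print(f"Per-phase current:  {current_per_phase_a:.1f} A")
# ~22.5 A per phase, comfortably under the 50-70 A ratings of common DrMOS
# stages, which leaves headroom for efficiency and transient spikes.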
Sources:
Igor's Lab, VideoCardz
62 Comments on Alleged NVIDIA AD102 PCB Drawing Reveals NVLink is Here to Stay, Launch Timelines Revealed
I would like to see "chiplet"-like multi-chip solutions though (less waste of silicon, cheaper to produce, etc.).
The Radeon RX 6950 XT is the current flagship, but it is slower than the competition.
Imagine if AMD had kept its original "sweet-spot" strategy and released the ultimate dual-GPU Radeon RX 6995X2 with two Radeon RX 6800 GPUs that would ultimately destroy any NVIDIA offering... Would it not be better for AMD?
Of course...
But who thinks about it?
The whole point of the chiplet direction is that the separate dice no longer function as separate processors. That's not the case yet for the HPC cards, where those scheduling concerns don't matter for the massively parallel workloads. But what you want is a single command front-end, along with some other modifications, so the compute chiplets work in tandem with a single master chiplet that interfaces with the rest of the system. This results in an entirely software-agnostic implementation where the MCM is seen as one GPU by the OS/driver. Going back to explicit multi-GPU simply will not happen; putting the onus on the game developers is what led to mGPU's death. A lot of people were saying things like, "why would devs bother investing in RTX so 5% of the market can get some fancy reflections at 30 FPS for a few minutes until they turn it back off?" That same line of reasoning applies several times over here. The benchmarks you show are almost all of the benchmarks that exist, because no games offer the support.
I would also like to remind you that, quite ironically, your benchmarks are looking at average FPS, primarily in games that are already above 144 FPS. Is 230 avg / 120 min better than 160 avg / 120 min? I would say no...
So far, both manufacturers have only managed an increase of about 50-60% over their last-generation GPUs. Consider that going from 1080p to 2160p quadruples the pixel count (you double both dimensions of the screen). It is pointless to think that a single card in the next 3-4 years will be able to do 2160p at 120 FPS or higher with that limited increase in performance; mathematically it does not add up. And I haven't even added in what ray tracing in games takes away in performance.
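A quick back-of-the-envelope sketch of that argument (the ~55% per-generation uplift is the assumed figure from above; the 4x pixel count follows from the resolutions themselves):

# Rough sketch: how many generations of ~50-60% gains does the 4x
# pixel-count jump from 1080p to 2160p eat up? The 1.55x per-generation
# uplift is an assumed figure, not a measured one.
pixels_1080p = 1920 * 1080                  # 2,073,600 pixels
pixels_2160p = 3840 * 2160                  # 8,294,400 pixels
pixel_ratio = pixels_2160p / pixels_1080p   # exactly 4.0

per_gen_gain = 1.55                 # assumed ~55% uplift per generation
generations = 0
cumulative_gain = 1.0
while cumulative_gain < pixel_ratio:
    cumulative_gain *= per_gen_gain
    generations += 1

print(f"Pixel ratio 1080p -> 2160p: {pixel_ratio:.1f}x")
print(f"Generations needed at ~55% per gen: {generations}")
# 1.55^3 ~= 3.7 and 1.55^4 ~= 5.8, so roughly four generations just to hold
# the same frame rate at 4K, before ray tracing or higher refresh rates take
# their own cut.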
Not to mention that VRAM doesn't add up, it broke in various game engines, it required non-stop tweaks and updates from the GPU developers to stop it breaking titles, and it required precisely matching cards (yes, AMD let you vary that a little, at the cost of performance).
DX12 was meant to fix that situation, but it never did.
Now fast-forward to today. Maxwell > Pascal was a 45~50% perf jump per tier. Turing wasn't as big a jump, but still rivalled older generational jumps while adding hardware for RT. Ampere was again a major leap, albeit with a power increase; the generational performance increase has truly never been higher on GPUs. I'm not really mentioning AMD here because there is no consistency, but they've either followed suit or stalled in that scenario.
It is in fact the polar opposite of what you're thinking. Right now, we've already had such major performance jumps that 4K has become feasible within two generations of it slowly gaining market share on monitors and TVs. Resolution isn't what kills your performance. New games do.
It's really about a bit more than API flags, btw. I ran 660s in SLI... it was horror. More often than not, the experience was either subpar or you just had half the performance of what you paid for, but STILL had the extra heat, noise, etc. Right. Remember this?
arstechnica.com/gadgets/2016/07/amd-rx-480-crossfire-vs-nvidia-gtx-1080-ashes/
You'd be another Raja ;)
SLI is inefficient, hard to get working properly, and would make GPUs even more scarce, so yep, I agree with the popular view. :)
I have had it for 6 years now; 4K gaming on it is sweet. It also has 3 HDMI inputs, all supporting it, so I have my PC, PS4 Pro, and original PS3 all plugged in.
Good site to check out is
www.rtings.com/tv/reviews/best/by-usage/video-gaming
You can get a 40-inch TV cheaper than any monitor and get almost the same results. SLI is easy to set up. The major problem I always found with SLI setups, back when I ran my own computer store, was that people would plug the cards in and think right off the bat they had SLI. One simple little click in the driver and voila, SLI working in all games.
I could get SLI working in games that did not even support it.
Look, just because you do not see SLI inside the game config does not mean it does not support it. There is no SLI config in Unreal 2003 or Quake 3 Arena; all SLI config was done in the driver, not the game.
The last dual-GPU gaming card from Nvidia was the Titan Z. That was 8 years ago, and it was silly-priced at $3,000. That should have been a warning to anyone paying even a little attention that SLI was going to be history.
The future is MCM.
Oh, and yes, the 3090 has that feature where you turn on the AI deep learning to increase FPS. You know what it really does? It just cuts corners to produce those higher FPS: if it does not need to render long-distance items, it blurs them.
8K is not going to go mainstream for decades. If anyone plans to buy an 8K gaming monitor before then they will see it's just a money pit to game on and hardly worth it.
Even ultrawide has more traction in the gaming market than a 4x resolution boost, for very little reason other than 'omg its more pixuls'. 15 minutes of gaming and you stop noticing the added detail, because quite simply, PPI is a thing your brain won't ignore. In a desktop setting, 4K is overkill.
The fact that some enthusiasts and tubers do the marketing for corporations does not make it something the majority either needs or wants. 8K is another step in the wrong, utterly wasteful direction. Maybe you missed the climate discussion lately... the age of rampant growth is finite; in comes an age of bare necessity. It's about time, too. mGPU is a horribly wasteful tech.
There's no place for SLI anymore. There are only a few die shrinks left to be had before GPUs become too expensive to keep shrinking.
Multi-chip modules are the future.
Again... marketing vs. reality vs. your view on what's what. The vast majority of games release on engines that are fully capable of pushing 4K efficiently. Tricks are deployed already, as I pointed out. Don't mistake submission for a complete lack of actual demand. You can run anything at any resolution, just tweak it to fit your use case. Between FSR and DLSS, things only got more accessible.
People want games before pixels; higher res was always pure luxury, and it will remain so. It's as old as running 1600x1200 on your CRT.
Also, what is submission to 1440p in the eyes of a GPU vendor anyway? Not exactly a good business case, is it? Nvidia and AMD have every reason to push the bar ever higher. Why do you think the AMD-powered consoles tout themselves as 4K machines?
The fact is, they figured out some time ago that people won't just buy into 4K, for all the aforementioned reasons. They ain't as stupid as we are... so Nvidia quickly pushed RTX out to create a new buyer incentive... too bad they forgot to bring some actual content along. They rushed it to market because they knew Pascal was fine, too fine even, and AMD owned the console space. They had to have a new way to differentiate and sell product.
Wake up man ;) MCM is a case of seeing is believing... but yeah, it does seem to be moving that way.
(rhetorical question)
3090s are accessible and less than $2k:
www.canadacomputers.com/product_info.php?cPath=43_557_559&item_id=181351
Up until very recently, the guy was talking absolute truth.
About 4K (3840x2160): it should be the mainstream resolution today. 2560x1440 should be low-end, entry level, and 5K, 6K, 8K, etc. should be enthusiast class, but not necessary to upgrade to...
Prices have been down for months!
I never said the state of affairs wasn't ridiculous, I just hate blatant exaggerations.
For reference, I didn't pay much attention to games for a good 7-8 years from 2013, except UT2004; I played that now and then. When I started checking things out again, I thought for sure I'd be blown away by the graphics. I was underwhelmed. Extremely. Literally almost no progression. If you disregard ray tracing (which you pretty much can), there's actually been no progression. Sure, frame rates are up and 720p/1080p/1440p is now 1080p/1440p/4K, and if you look veeeerry carefully things look ever so slightly more realistic...
It's really more of the same; even RT effects are just another 'pass' of polish over what is essentially the same core of graphics. I would have expected far better integration; even 15 years ago, games would play around with light and shadow, tying gameplay elements to them (stealth, for example). Those kinds of integrations of gameplay and graphics, or things such as physics, are so rare... I don't get it. If everyone is on 4K, what is the upgrade path exactly? How would Nvidia sell a 4xxx series?
- New more demanding games;
- Higher Hz 4K screens - 4K@60, 4K@120, 4K@144, 4K@360...