Thursday, June 2nd 2022
Alleged NVIDIA AD102 PCB Drawing Reveals NVLink is Here to Stay, Launch Timelines Revealed
An alleged technical drawing of the PCB of the reference-design NVIDIA "Ada" AD102 board was leaked to the web, courtesy of Igor's Lab. It reveals a large GPU pad that's roughly the size of the GA102's (the size of the fiberglass substrate or package only, not the die), surrounded by twelve memory chips, which are likely GDDR6X. There are also provisions for at least 24 power phases, although not all of them are populated with sets of chokes and DrMOS in the final products (a few of them end up vacant).
We also spy the 16-pin ATX 3.0 power connector that's capable of delivering up to 600 W of power; and four display outputs, including a USB-C in lieu of a larger connector (such as DP or HDMI). A curious thing to note is that the card retains an NVLink connector. With explicit multi-GPU all but dead, the NVLink on the reference design will likely play a rudimentary role in the GeForce RTX product (unless used for implicit multi-GPU). The connector may play a bigger role in the professional-visualization graphics cards (RTX AD-series) based on this silicon.

Igor's Lab also put out timelines for the product-development cycle of the next-generation GeForce RTX product based on the AD102. It speaks of a bill-of-materials release in June 2022 (which should help NVIDIA finalize pricing); design and validation running through the summer (June through August), with mass-production commencing either toward the end of August or sometime in September. It could be late October or November by the time these cards reach the shelves. This leaves NVIDIA with quite some time to get the market to digest existing inventory of GeForce "Ampere" graphics cards, which could see steadily decreasing prices and increased market availability. Thanks to the crypto-crash, miners could pump the market with used cards at attractive prices, which could get some attention, too.
Sources:
Igor's Lab, VideoCardz
62 Comments on Alleged NVIDIA AD102 PCB Drawing Reveals NVLink is Here to Stay, Launch Timelines Revealed
It is not gaming that drives the higher resolutions, but, as I said, ergonomics - Full HD is a very old resolution dating back to the 90s: High-definition television - Wikipedia
In this particular case, I've been experiencing increasingly higher resolutions since I started using my ZX Spectrum clone. I have never thought of that as a problem.
HDR is far more important (and noticeable) than 4k, but for some reason, creators decided HDR and FHD do not go together so you need 4k for HDR anyway :wtf:
4K is perfectly fine for all screen sizes, including 40-inch, 43-inch, 50-inch.
If you don't see the difference, that is another topic for discussion.
And yes, people don't care to switch to the HD channels, sometimes they are either used to the SD numbers on their remote control, or they simply don't know about the higher-res channels.
You can delete the SD channels and leave the HD channels.
Oh wait... nope. It never was. Gaming evolved irrespective of resolutions and/or the devices you could run it on. Even today people happily play low-res content - on their phones - and whatever they play today is not low res, but rather low poly count. Mobile games are of a different, more simplified nature in every possible way. But yeah, they run in '4K'... which nobody really notices or even cares about. Phones have never sold on the basis of pixel count. Even Apple's Retina displays didn't.
You're really not connecting the right dots mate, this is devolving into nonsense.
And eh... 850 eur for a late-in-gen 2021 GPU to run something I can run with a 2016 500 eur GPU at 1440p... surely you see the problem here that is 'the price of 4K', right? That's a massive gap for playing the exact same games in much the same quality. 4K will never lose this gap compared to a more sensible resolution. You seem to want to ignore that this 850 eur 6800 XT will also have trouble running 4K in two years' time, while I'm still content running medium settings at 1440p with that same 2016 GPU. It's really that simple. Chasing top resolutions means chasing the top-end GPUs all the time. There are no '4K GPUs'. And let's not even start about high refresh, which for gaming is arguably a bigger win than more pixels.
There is pre-empting a mainstream resolution and there is trailing it. The bottom line: one is overpriced and early adopting, the other is cheap and effective gaming. To each their own, but the market will never start early adopting en masse, as it never has.
No, the people in question could literally not tell the difference on their size TV. I'm sure they would notice it on a 50-inch-plus screen, but their 70-year-old eyes just couldn't discern the additional picture clarity of HD on their TV. I was flabbergasted at the time. I could see the difference, but they also would rather watch 4:3 content stretched than use letterboxing. Stretched content drives me bananas.

Some people just don't see it. My wife thinks 4K is just marketing BS. She can't see the difference. I explained to her that there were four times as many pixels and explained what pixels meant; she didn't care. If the bigger 4K TV makes me happy, she is alright with it, but she doesn't see the big deal. She is still in her thirties and doesn't require vision correction according to the eye professionals or the government. Some people can't see it, some people don't really care. I'm sure there are people who care and can see it but just can't afford it as well. $800 may sound like a deal for a 6800 XT to some people, but for a lot of people that is way more than they can afford. I'm sure, like anything, the adoption of 4K for PC gaming is a combination of a lot of things.

That has nothing to do with the quality of the screens. Most people again have no idea about PPI. Smartphone adoption is about convenience, plain and simple. A lot of people choose their phone based on what their friends suggest or what the salesman says. Sure, there are enthusiasts who care, and of course there are the Apple zealots that care about whatever Apple tells them to care about. Ask someone who has an iPhone why they have an iPhone and most people will say something like "oh, I'm used to Apple phones" or "my daughter or son said I should get one" or "it's so easy to use". I've never had a person say, "well, this is the highest-PPI screen I could find". On top of that, phones versus PC monitors is apples and oranges. People hold a phone 6 inches from their face.
I sure as heck don't sit 6 inches from my monitor.
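As an aside, the "four times as many pixels" figure mentioned above is easy to verify with a quick sketch (resolution names and numbers are the standard 16:9 definitions, nothing from this thread):

```python
# Compare total pixel counts of common 16:9 resolutions against Full HD.
resolutions = {
    "Full HD (1080p)": (1920, 1080),
    "1440p": (2560, 1440),
    "4K UHD (2160p)": (3840, 2160),
}

fhd_pixels = 1920 * 1080  # 2,073,600 pixels

for name, (width, height) in resolutions.items():
    pixels = width * height
    print(f"{name}: {pixels:,} pixels ({pixels / fhd_pixels:.2f}x Full HD)")
# 4K UHD works out to 8,294,400 pixels, exactly 4.00x Full HD;
# 1440p sits in between at roughly 1.78x.
```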
And about going with 30x0 cards if the 40x0 proves inefficient - remember the 20x0 release. In spring of 2018 we had a crypto-market collapse, and used GTX 1080 Ti cards were plentiful in the summer (but prices in stores remained high; the market was still drunk on the success of the crypto boom). When the RTX 2080 was released in September 2018, the prices of used 1080 Ti cards actually shot up - because the new generation didn't bring any price/performance increase - you paid to be a guinea pig for new technologies like RT and DLSS, which games were only very slowly adopting.
None of this is catching on as it should for widespread adoption. I reckon all of these options are also competing against each other, people can only spend money once.
It's not new either, as VR isn't - we've already seen past attempts at doing something along the lines of VR and AR. I remember the EyeToy :D And where did PSVR go? Not hearing a lot about it anymore.
They simply do not know when VR will catch on and, whenever it looks like VR is coming back, they have to jump on the bandwagon so they don't get left behind. The same happens with 3D TV from time to time, and will happen with flying cars at some point.