
NVIDIA AD103 and AD104 Chips Powering RTX 4080 Series Detailed

Joined
Jul 13, 2016
Messages
3,337 (1.08/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Y the flying f* do you care about the manufacturer's profit?? How can it be a factor at all?
If the product suits your needs, fits within your budget, and is the best price/performance in its segment, then get it.
Simple as that.

As I pointed out in my last comment, it lets you know how much value you are getting relative to the entire stack or the market at large.

If there is a large difference in die size between SKUs (and, by extension, manufacturing cost), that indicates to the customer that Nvidia is likely to release products to fill that gap, or AMD will do it for them. In addition, comparing the die sizes of the 3080 and 4080 shows that you are getting less than half the die area. Even accounting for inflation and the cost increases of smaller nodes, it does not come close to making a die under 300 mm² worth $900 USD, especially compared to last-gen products.
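To put rough numbers on that point, here is a back-of-the-envelope sketch in Python, using the commonly reported die sizes (~628 mm² for the 3080's GA102, ~295 mm² for the 4080 12GB's AD104) and the $700/$900 launch MSRPs; actual manufacturing cost obviously depends on far more than die area, so treat this purely as an illustration:

```python
# Back-of-the-envelope: launch MSRP per mm^2 of die, RTX 3080 vs RTX 4080 12GB.
# Die sizes are the commonly reported figures; prices are launch MSRPs in USD.
cards = {
    "RTX 3080 (GA102)":      {"die_mm2": 628, "msrp": 700},
    "RTX 4080 12GB (AD104)": {"die_mm2": 295, "msrp": 900},
}

for name, c in cards.items():
    print(f"{name}: {c['die_mm2']} mm^2 at ${c['msrp']} "
          f"-> ${c['msrp'] / c['die_mm2']:.2f} per mm^2")

area_ratio = cards["RTX 4080 12GB (AD104)"]["die_mm2"] / cards["RTX 3080 (GA102)"]["die_mm2"]
print(f"AD104 is {area_ratio:.0%} of GA102's die area")  # ~47%, i.e. less than half
```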

I think just about any customer would be mad to know they are getting much less relative to the 4090 than in the prior GPU generation, while also being charged $200 more.

Your criteria for what product to buy are simply far too naive. You advise customers to just blindly buy without considering factors that could net them massive savings or a better end product.
 
Joined
Mar 10, 2010
Messages
11,878 (2.20/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506




The only post-Fermi x80s NOT based on a 104 were the 780 and 3080.
Hey we all get shit wrong sometimes.
 
Joined
Oct 27, 2020
Messages
799 (0.53/day)
It is closest to TSMC's 10 nm; TSMC's 12 nm is 16 nm with different standard cells.
It's definitely closest to N10 (N10 has zero frequency benefit vs N16 but double the logic density).
Samsung's 8LPP has similar frequency potential to N10 but even better density (around +17%).
16nm GP104 has 7.2b transistors at 314 mm², while GA104 has 17.4b transistors at 392 mm² (1.9× denser, not just in logic but across the overall design).
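A quick sanity check of that 1.9× figure, using the transistor counts and die sizes quoted above (overall density, with logic, SRAM, and IO averaged together):

```python
# Overall transistor density: GP104 (TSMC 16nm) vs GA104 (Samsung 8N).
gp104_density = 7.2e9 / 314    # transistors per mm^2, ~22.9 million
ga104_density = 17.4e9 / 392   # transistors per mm^2, ~44.4 million

print(f"GP104: {gp104_density / 1e6:.1f} MTr/mm^2")
print(f"GA104: {ga104_density / 1e6:.1f} MTr/mm^2")
print(f"Density ratio: {ga104_density / gp104_density:.2f}x")  # ~1.9x across the whole design
```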

---------------------------------

New info regarding Ada Lovelace features

"DLSS 3 combines Super Resolution, Frame Generation, and NVIDIA Reflex!"
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.61/day)
Location
Ex-usa | slava the trolls
JHH is not OK. Someone should tell him that he has completely lost his mind with this ridiculous new pricing scheme.
This is something that has never been seen before.

Compare Ada Lovelace (AL) to Ampere (AMP):

chip | die size | price (USD)
AD102 | 608 mm² | 1600
AD103 | 379 mm² | 1200
AD104 | 295 mm² | 900

GA102 | 628 mm² | 2000
GA102 | 628 mm² | 1500
GA102 | 628 mm² | 1200
GA102 | 628 mm² | 700
GA104 | 392 mm² | 600
GA104 | 392 mm² | 500
GA104 | 392 mm² | 400
GA106 | 276 mm² | 330
GA106 | 276 mm² | 250
GA107 | 200 mm² | 200

JHH literally priced something that should cost around $350 at most some 2.6 times higher, at up to $900!!
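For anyone who wants to check the numbers, here is a quick sketch that computes dollars per mm² for every SKU in the table above and the "2.6 times" multiple (launch MSRPs and reported die sizes only; yields, memory, board and cooler costs are deliberately ignored):

```python
# Launch MSRP per mm^2 of die for the Ada and Ampere SKUs listed above (USD).
skus = [
    ("AD102", 608, 1600), ("AD103", 379, 1200), ("AD104", 295, 900),
    ("GA102", 628, 2000), ("GA102", 628, 1500), ("GA102", 628, 1200),
    ("GA102", 628, 700),  ("GA104", 392, 600),  ("GA104", 392, 500),
    ("GA104", 392, 400),  ("GA106", 276, 330),  ("GA106", 276, 250),
    ("GA107", 200, 200),
]

for chip, die_mm2, price in skus:
    print(f"{chip}: {die_mm2} mm^2, ${price} -> ${price / die_mm2:.2f}/mm^2")

# The multiple quoted in the post: $900 vs a hypothetical ~$350 price point.
print(f"900 / 350 = {900 / 350:.1f}x")  # ~2.6x
```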
 
Joined
Jun 10, 2021
Messages
126 (0.10/day)
Wuuuut? GTX 1080 begs to differ... or GTX 980... or the 680, or...

The new kid on the block here is the fact that there is a 103 SKU in between the 102 and the 104. And even the 102 is a spin-off from the Titan's 110.

The fact is we're looking at the exact opposite situation: they can't place a 104 that high in the stack anymore. They need TWO bigger SKUs to cater to the top end of the stack, and the 104 is upper midrange at best - it no longer stands in as early high end. It used to be succeeded by a 102 later in the generation; now they have to place the 102 at the front of the generation to make even a tiny bit of impact compared to the last one. Ada's current positioning is the best example: they're not even removing the 3xxx cards below it; we're looking at ONLY 102 dies populating half their new stack from gen to gen for a while. These are all signs that Nvidia is running into new territory with respect to their wiggle room: we're paying for that wiggle room now on the GeForce stack; the 102/103 SKUs are simply new tiers in terms of price as well, and they need every single piece of it to survive against the competition.

Back in the HD days, they could make do with a 104 and then destroy AMD with a 102, either cut down or full, later on. Which they kept doing up until they started pushing RT. Ever since, the changes happened: prices soared and VRAM magically went poof. The price of RT... again... ;) Are we still laughing about this fantastic tech?

Agree. Don't forget that EE/cooler design is also WAY more expensive now. Nvidia was screwing people for years; people have clearly memory-holed it or are just new to the scene.

Like I said in a previous post, the 4080 12GB IS overpriced, but it's only about $200, give or take, above some of the older x80 cards with fully enabled 104 dies (adjusted for inflation).

The only problem I can see from here is how the future 4070/4060 will be priced. Those won't scale as linearly with inflation metrics... Not that the 4080 12GB does, but there's at least some compensation in card "upgrades" and EE requirements.

My assumption? The RTX 30 die cost on Samsung 8 was insanely cheap. Nvidia priced and positioned it as a saving grace after how horrible RTX 20 was for general rasterization improvement over Pascal.

Turing had HUGE dies, and Nvidia ate so much well-deserved crap because RTX/DLSS was essentially vaporware. Generational gains were also weak relative to the 10/9 series. Of course most logical people skipped it.
 
Last edited:
Joined
Aug 4, 2020
Messages
1,624 (1.01/day)
Location
::1
The x90 cards are really Titans in all but name. Whether they are called Titan or 4090 is a marketing decision that now has a habit of changing without notice.

It's not wise to compare NVIDIA's graphics card generations by comparing model numbers since they aren't consistent with what each model number represents. It's really just a numerical slot relative to a given card's placement within that generation's product stack.

Clearly NVIDIA's current strategy is to use binning to maximize gross margin. They're not going to sell Joe Gamer an $800 graphics card with a GPU that can be overclocked +30%. Those days are over. They're going to keep those binned chips, relabel them, and sell them at a higher price like $1200.
the biggest problem i see here is that, previously the x80ti was basically a x90/titan w/ half the memory (same/very similar core count/config); this'd be the first time where we wouldn't have this option - we either swallow it and buy the x90 or stick w/ a gpu half as powerful, basically.
and the 4080's not cheap either
 
Joined
Sep 15, 2011
Messages
6,762 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Serious question.
Why didn't nGreedia use the proper naming convention for their GPUs, such as:
AD102 = RTX 4080 (NOT RTX 4090)
AD103 = RTX 4070 (NOT RTX 4080 16 GB)
AD104 = RTX 4060 (NOT RTX 4080 12 GB)

To be honest, those could also have been the Ti variants, compared to previous iterations, or just gotten some bumped-up specs.
Their naming convention is totally retarded and BS tbh.
 
Last edited:
Joined
Dec 22, 2011
Messages
3,890 (0.82/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
Because they are greedy? You lot get upset every single friggin launch.

TIME TO ACCEPT THEY ARE GREEDY, buy AMD, as far as multi-billion corps that just want your cash go, they are your only choice.

Sad times, but hey ho.
 
Joined
Jun 21, 2021
Messages
3,121 (2.43/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
the biggest problem i see here is that, previously the x80ti was basically a x90/titan w/ half the memory (same/very similar core count/config); this'd be the first time where we wouldn't have this option - we either swallow it and buy the x90 or stick w/ a gpu half as powerful, basically.
and the 4080's not cheap either

We didn't have that option with the Ampere generation when it debuted.

Remember, both 3090 and 3080 10G were announced September 1, 2020. 3080 Ti was announced on May 31, 2021. For nine months, there was 3090 or 3080 10G and nothing in between. The 3080 12G didn't arrive until January of this year.

Will there be a 4080 Ti at some point? Probably. Looking at the numbers of cores (CUDA, RT, Tensor) and pricing in the three Ada Lovelace cards announced, it appears that NVIDIA can slot in offerings around these three (above, in between, below).

NVIDIA did not release the entire Ampere lineup at the same time. People will have more choices if they choose to be patient, whether it be additional NVIDIA 40 series models or whatever AMD releases. And like NVIDIA, AMD did not release the entire RDNA2 lineup at the same time.
 
Last edited:
Joined
Aug 25, 2021
Messages
1,183 (0.97/day)
Unfortunately, Nvidia has disappointed in the connectivity department with the 4000 series cards. They do not offer DisplayPort 2.0 on cards that cost up to $1600. Completely silly and unacceptable.
They will be the last GPU vendor to offer this port, in... 2024 on the 5000 series, unless Nvidia consumers send them a clear message of disapproval and demand this upgrade on the Ti cards.

Intel's A380, the lowest tier card, has DP 2.0 port at 40 Gbps.
 
Joined
Jun 10, 2021
Messages
126 (0.10/day)
Because they are greedy? You lot get upset every single friggin launch.

TIME TO ACCEPT THEY ARE GREEDY, buy AMD, as far as multi-billion corps that just want your cash go, they are your only choice.

Sad times, but hey ho.

Yes, buy AMD... They literally jacked up the MSRP of both the 6700XT and 6600XT by around $100 USD prior to launch due to mining demand. :laugh:

Today you can grab a 6700XT for $350 with a promo code + rebate off Newegg... lol. Both are scum.
 
Joined
Aug 4, 2020
Messages
1,624 (1.01/day)
Location
::1
We didn't have that option with the Ampere generation when it debuted.

Remember, both 3090 and 3080 10G were announced September 1, 2020. 3080 Ti was announced on May 31, 2021. For nine months, there was 3090 or 3080 10G and nothing in between. The 3080 12G didn't arrive until January of this year.

Will there be a 4080 Ti at some point? Probably. Looking at the numbers of cores (CUDA, RT, Tensor) and pricing in the three Ada Lovelace cards announced, it appears that NVIDIA can slot in offerings around these three (above, in between, below).

NVIDIA did not release the entire Ampere lineup at the same time. People will have more choices if they choose to be patient, whether it be additional NVIDIA 40 series models or whatever AMD releases. And like NVIDIA, AMD did not release the entire RDNA2 lineup at the same time.
i'll count the 3080-10gb as just that, for all intents and purposes (neither the 1080ti nor the 2080ti had the full memory bus enabled anyway) since it's close enough. the different 12gb flavors of the 3080 were just nv's attempt to milk a market that bought any- and everything they release.

but my point is: the 3080 had a core count, and thus performance, similar to the 3090's, just like the 2080ti & the titan rtx, et cetera.
 
Joined
Jun 21, 2021
Messages
3,121 (2.43/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
i'll count the 3080-10gb as just that, for all intents and purposes (neither the 1080ti nor the 2080ti had the full memory bus enabled anyway) since it's close enough. the different 12gb flavors of the 3080 were just nv's attempt to milk a market that bought any- and everything they release.

but my point is: the 3080 had a core count, and thus performance, similar to the 3090's, just like the 2080ti & the titan rtx, et cetera.

I own the 3080 Ti. I bought it because it's very near to the 3090 in all specs apart from VRAM. I didn't want to pay the premium for the extra 12GB of graphics memory.

In any case, it isn't worth comparing different generations of NVIDIA cards strictly by model number since they change how they think about them. It's really just a way to rank the various models within the product stack of a given generation.

Especially with the Ada Lovelace generation, NVIDIA was motivated to change the numbering system yet again to make sure the Ada 4080 performs above the Ampere 30 Series cards that they will continue selling alongside the new releases in order to draw down channel inventory.
 
Joined
Aug 21, 2013
Messages
1,940 (0.47/day)
Unfortunately, Nvidia has disappointed in the connectivity department with the 4000 series cards. They do not offer DisplayPort 2.0 on cards that cost up to $1600. Completely silly and unacceptable.
They will be the last GPU vendor to offer this port, in... 2024 on the 5000 series, unless Nvidia consumers send them a clear message of disapproval and demand this upgrade on the Ti cards.

Intel's A380, the lowest tier card, has DP 2.0 port at 40 Gbps.
Well, actually Intel has its own problems with HDMI 2.1 on their cards. So it seems everyone has their own shortcomings. Perhaps AMD's 7000 series will offer full support for both HDMI 2.1 and DP 2.0.
 
Joined
Apr 15, 2021
Messages
884 (0.65/day)
No competition? What do you mean by that? RDNA2 matched or beat 30 series in raster, FSR 2.0 has great reviews, and most certainly RDNA3 will compete, and because AMD's chiplet approach should be cheaper to manufacture, RDNA3 should offer better performance per dollar....but despite all of that, everyone will buy Nvidia and reward their behavior and perpetuate Nvidia's constant price increases in perpetuity.

Let's be honest everyone, AMD could release a GPU that matched Nvidia in every way including raytracing, and have FSR equal to DLSS in every way and charge less than Nvidia for it, and everyone would STILL buy Nvidia (which only proves consumer choices are quite irrational and are NOT decided by simply comparing specs, as the existence of fanboys should testify to...)...and as long as that's true, the GPU market will ALWAYS be hostile to consumers. The ONLY way things are going to improve for consumers is if AMD starts capturing marketshare and Nvidia is punished by consumers... but based on historical precedent, I have no hope for that...

And I don't believe Intel's presence would have improved the situation much, not as much as a wholly new company in the GPU space would have, because Intel would have leveraged success in the GPU market (which would have probably been carved away from AMD's limited marketshare instead of Nvidia's and would have resulted in Nvidia's marketshare remaining at 80% and AMD's 20% being divided between AMD and Intel) to further marginalize AMD in the x86 space (for example, by using influence with OEMs to have an Intel CPU matched with an Intel GPU and further diminish AMD's position among OEMs, which is how Intel devastated AMD in the 2000s BTW), and it would have been trading a marginally better GPU market for a much worse CPU market, imo. Although it'd never happen, what would really improve the market would be if Nvidia got broken up like AT&T did in the 80s...
So what's your point? If AMD were in NVidia's place, they would be pulling the same horseshit. Yes, I'll still buy NVidia's graphics cards because we can't do Iray rendering with an AMD GPU. Of course, if I were only gaming, then it would be a different story.
 
Joined
Jan 31, 2010
Messages
5,560 (1.02/day)
Location
Gougeland (NZ)
System Name Cumquat 2021
Processor AMD RyZen R7 7800X3D
Motherboard Asus Strix X670E - E Gaming WIFI
Cooling Deep Cool LT720 + CM MasterGel Pro TP + Lian Li Uni Fan V2
Memory 32GB GSkill Trident Z5 Neo 6000
Video Card(s) PowerColor HellHound RX7800XT 2550cclk/2450mclk
Storage 1x Adata SX8200PRO NVMe 1TB gen3 x4 1X Samsung 980 Pro NVMe Gen 4 x4 1TB, 12TB of HDD Storage
Display(s) AOC 24G2 IPS 144Hz FreeSync Premium 1920x1080p
Case Lian Li O11D XL ROG edition
Audio Device(s) RX7800XT via HDMI + Pioneer VSX-531 amp Technics 100W 5.1 Speaker set
Power Supply EVGA 1000W G5 Gold
Mouse Logitech G502 Proteus Core Wired
Keyboard Logitech G915 Wireless
Software Windows 11 X64 PRO (build 24H2)
Benchmark Scores it sucks even more less now ;)
(image attachment)
 
Joined
Aug 25, 2021
Messages
1,183 (0.97/day)
Well, actually Intel has its own problems with HDMI 2.1 on their cards. So it seems everyone has their own shortcomings. Perhaps AMD's 7000 series will offer full support for both HDMI 2.1 and DP 2.0.
I am not aware of HDMI 2.1 issues of Arc. Could you, please, elaborate a bit and share the problems identified? Much appreciated.
 
Joined
Aug 21, 2013
Messages
1,940 (0.47/day)
I am not aware of HDMI 2.1 issues of Arc. Could you, please, elaborate a bit and share the problems identified? Much appreciated.

So if I read this correctly, it affects the 3 series in that AIBs need to include an extra conversion chip. The 5 and 7 series, per Ryan Shrout, natively include this and thus should be 2.1 compliant. It just seems like an odd omission, even on budget cards. HDMI 2.1 is not exactly new, and AMD's 6400 card even supports it.
 
Joined
Feb 22, 2022
Messages
109 (0.11/day)
System Name Lexx
Processor Threadripper 2950X
Motherboard Asus ROG Zenith Extreme
Cooling Custom Water
Memory 32/64GB Corsair 3200MHz
Video Card(s) Liquid Devil 6900XT
Storage 4TB Solid State PCI/NVME/M.2
Display(s) LG 34" Curved Ultrawide 160Hz
Case Thermaltake View T71
Audio Device(s) Onboard
Power Supply Corsair 1000W
Mouse Logitech G502
Keyboard Asus
VR HMD NA
Software Windows 10 Pro
Fine products at very fair prices... just buy them!
 
Joined
Oct 26, 2019
Messages
169 (0.09/day)
So, the fin width is 6 nm on a TSMC 10FF process?
The fin top dimension doesn't mean anything on its own. Look at density and gate pitch. Density is a little higher than TSMC 10FF, but it falls into the same generation.
And Samsung 8N is part of Samsung's 10 nm family; it's 10 nm+++, to put it in plain former-Intel language.
 
Joined
Aug 25, 2021
Messages
1,183 (0.97/day)

So if I read this correctly, it affects the 3 series in that AIBs need to include an extra conversion chip. The 5 and 7 series, per Ryan Shrout, natively include this and thus should be 2.1 compliant. It just seems like an odd omission, even on budget cards. HDMI 2.1 is not exactly new, and AMD's 6400 card even supports it.
Thank you for this. I really appreciate it.
Wow! I cannot believe that Intel waited this long to tell us that only the A750 and A770 Limited Edition cards have a PCON converter chip on the PCB for an HDMI 2.1 FRL signal (unclear whether 24 Gbps, 32 Gbps, 40 Gbps, or 48 Gbps), and all other cards are HDMI 2.0 at 18 Gbps.

This is what happens when the HDMI Administrator decides to rebrand 2.0 as 2.1, and companies sell us a total mess...
 
Joined
Oct 26, 2019
Messages
169 (0.09/day)
No competition? What do you mean by that? RDNA2 matched or beat 30 series in raster, FSR 2.0 has great reviews, and most certainly RDNA3 will compete, and because AMD's chiplet approach should be cheaper to manufacture, RDNA3 should offer better performance per dollar....but despite all of that, everyone will buy Nvidia and reward their behavior and perpetuate Nvidia's constant price increases in perpetuity.
AD103 and AD104 are not 30 series; they are RTX 40 series. RDNA3 is not even here yet, so when it's released we will see. And Nvidia will see. Maybe they will rename them back to 4070 and 4060. Maybe. At least RDNA3 might offer quite a leap in the midrange: double the ALUs and higher clocks.

Let's be honest everyone, AMD could release a GPU that matched Nvidia in every way including raytracing, and have FSR equal to DLSS in every way and charge less than Nvidia for it, and everyone would STILL buy Nvidia (which only proves consumer choices are quite irrational and are NOT decided by simply comparing specs, as the existence of fanboys should testify to...)...and as long as that's true, the GPU market will ALWAYS be hostile to consumers. The ONLY way things are going to improve for consumers is if AMD starts capturing marketshare and Nvidia is punished by consumers... but based on historical precedent, I have no hope for that...
FSR 2.0 is a huge step over FSR 1.0, which was absolutely useless, but DLSS is still quite a bit ahead, not only in picture quality but also in performance, thanks to dedicated hardware blocks.

Thank you for this. I really appreciate it.
Wow! I cannot believe that Intel waited this long to tell us that only the A750 and A770 Limited Edition cards have a PCON converter chip on the PCB for an HDMI 2.1 FRL signal (unclear whether 24 Gbps, 32 Gbps, 40 Gbps, or 48 Gbps), and all other cards are HDMI 2.0 at 18 Gbps.

This is what happens when the HDMI Administrator decides to rebrand 2.0 as 2.1, and companies sell us a total mess...
Why would anyone want to use HDMI 2.1 when there is DisplayPort UHBR 20, which, according to rumors, will be supported by Arc?
 
Last edited:
Joined
Aug 25, 2021
Messages
1,183 (0.97/day)
Why would anyone want to use HDMI 2.1 when there is DisplayPort UHBR 20, which, according to rumors, will be supported by Arc?
It's a UHBR10 port at 40 Gbps, which is still going through VESA's certification process. Have a look at the A380 spec on Intel's website.
Anyone can use either port and match it with whatever monitor or TV port they have at home.
Why not use HDMI 2.1 if someone has a 4K/120 OLED TV? No TV has a DisplayPort input.
The HDMI 2.1 port is currently superior in terms of bandwidth, until DP 2.0 ports finally start working and become more mainstream.
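As a rough illustration of why that bandwidth matters for a 4K/120 OLED TV, here is the active-pixel payload for 4K at 120 Hz with 10-bit RGB (blanking and protocol overhead push the real requirement higher, so treat this as a lower bound):

```python
# Approximate uncompressed pixel data rate for 3840x2160 @ 120 Hz, 10 bits per channel RGB.
width, height, refresh_hz = 3840, 2160, 120
bits_per_pixel = 3 * 10  # RGB, 10 bits per channel

data_rate_gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"~{data_rate_gbps:.1f} Gbit/s of pixel data")  # ~29.9 Gbit/s

# HDMI 2.0 carries ~14.4 Gbit/s of effective payload (18 Gbit/s raw), so it cannot do
# this without chroma subsampling; HDMI 2.1 FRL (~42.7 Gbit/s effective of 48) can.
```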

Nvidia customers will unfortunately not enjoy this pleasure until 2024 and the 5000 series, as the company has disappointed big time with the omission of DP 2.0 on cards that cost up to $1600. Shambles. So much for promoting innovative connectivity technologies.
 
Joined
Nov 11, 2016
Messages
3,459 (1.17/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
Man, people don't realize that TSMC is charging way more dough than Samsung: Samsung 8N costs ~$4k per wafer while TSMC 5nm costs ~$17k per wafer, according to 2020 data.

The AD104 chip could cost 2x as much as the GA102 chip that's in the 3090Ti/3090/3080Ti/3080.

That means Ampere will keep the cost advantage for now, while Ada will have the performance and efficiency advantage.
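A very rough sketch of where a "2x the cost" estimate could come from, combining those quoted 2020-era wafer prices with a naive gross dies-per-wafer approximation (300 mm wafer, no yield, defect, packaging, or memory costs, so this is an order-of-magnitude illustration only):

```python
import math

# Naive gross dies per 300 mm wafer: wafer area / die area, minus an edge-loss term.
def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    radius = wafer_diameter_mm / 2
    gross = math.pi * radius**2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

# Quoted 2020-era wafer prices: Samsung 8N ~$4k, TSMC 5nm-class ~$17k per wafer.
ga102_die_cost = 4_000 / dies_per_wafer(628)    # GA102 (628 mm^2) on Samsung 8N
ad104_die_cost = 17_000 / dies_per_wafer(295)   # AD104 (295 mm^2) on TSMC 5nm-class "4N"

print(f"GA102: ~${ga102_die_cost:.0f} per die")
print(f"AD104: ~${ad104_die_cost:.0f} per die")
print(f"Cost ratio: ~{ad104_die_cost / ga102_die_cost:.1f}x")  # roughly in the 2x ballpark
```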
 
Joined
Oct 26, 2019
Messages
169 (0.09/day)
It's a UHBR10 port at 40 Gbps, which is still going through VESA's certification process. Have a look at the A380 spec on Intel's website.
The A380 uses the ACM-G11 chip; the rest will use ACM-G10. Specs aren't fully announced yet.

Why not use HDMI 2.1 if someone has 4K/120 OLED TV?
It's paid, it's proprietary, and it's worse in every way. Why pay for that? The industry should completely abolish HDMI.
 