
Alleged NVIDIA AD102 PCB Drawing Reveals NVLink is Here to Stay, Launch Timelines Revealed

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,244 (7.54/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
An alleged technical drawing of the PCB of the reference-design NVIDIA "Ada" AD102 board was leaked to the web, courtesy of Igor's Lab. It reveals a large GPU pad roughly the size of the GA102's (the size of the fiberglass substrate or package only, not the die), surrounded by twelve memory chips, which are likely GDDR6X. There is also provision for at least 24 power phases, although not all of them are populated with chokes and DrMOS in the final products (a few end up vacant).

We also spy the 16-pin ATX 3.0 power connector that's capable of delivering up to 600 W, and four display outputs, including a USB-C port in lieu of a larger connector (such as DP or HDMI). A curious thing to note is that the card continues to have an NVLink connector. Multi-GPU is dead, which means NVLink on the reference design will likely be rudimentary in the GeForce RTX product (unless it's used for implicit multi-GPU). The connector may play a bigger role in the professional-visualization graphics cards (RTX AD-series) based on this silicon.



Igor's Lab also put out timelines for the product-development cycle of the next-generation GeForce RTX product based on the AD102. It speaks of a bill-of-materials release in June 2022 (which should help NVIDIA finalize pricing); design and validation running through the summer (June through August); and mass-production commencing either toward the end of August or sometime in September. It could be late October or November by the time these cards reach the shelves. This leaves NVIDIA quite some time to get the market to digest existing inventory of GeForce "Ampere" graphics cards, which could see steadily decreasing prices and increasing availability. Thanks to the crypto-crash, miners could pump the market with used cards at attractive prices, which could get some attention, too.

View at TechPowerUp Main Site | Source
 
Joined
Feb 27, 2013
Messages
445 (0.10/day)
Location
Lithuania
October/November instead of July/August as the leakers were saying. Not surprising.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.94/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700MHz 0.750v (375W down to 250W)
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
This seems obvious: NVLink is for the data center/enterprise segment, so it'll stick around.


SLI, on the other hand, is dead. So very, very dead.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.64/day)
Location
Ex-usa | slava the trolls
Why would anyone not want to see multi-GPU configurations healthy, up and running?
AMD's CrossFire has been a very nice approach to get some more FPS, significantly so in the best optimised scenarios.

You can't get any more performance out of a single Radeon RX 6950 XT, but if you pair two in CF, you will probably get at least a 50-60% FPS uplift on average.


Using multiple chiplets on a single PCB is a very similar approach to multi-GPU on several PCBs.
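
For a rough sense of what a 50-60% average uplift would mean in frames, here's a back-of-envelope sketch; the 60 FPS baseline and the efficiency values are assumed for illustration, not measured:

```python
# Back-of-envelope CrossFire scaling estimate (illustrative numbers only).
single_gpu_fps = 60.0  # assumed single RX 6950 XT baseline in a 4K title

for efficiency in (0.5, 0.6, 0.8, 1.0):  # fraction of the 2nd card's perf realized
    dual_gpu_fps = single_gpu_fps * (1 + efficiency)
    print(f"scaling efficiency {efficiency:.0%}: "
          f"{dual_gpu_fps:.0f} FPS (+{efficiency:.0%} uplift)")
```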
 
Joined
Sep 17, 2014
Messages
22,461 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Why would anyone not want to see multi-GPU configurations healthy, up and running?
AMD's CrossFire has been a very nice approach to get some more FPS, significantly so in the best optimised scenarios.

You can't get any more performance out of a single Radeon RX 6950 XT, but if you pair two in CF, you will probably get at least a 50-60% FPS uplift on average.


Using multiple chiplets on a single PCB is a very similar approach to multi-GPU on several PCBs.
Power usage / scaling. More chips for equal perf. Frametime consistency issues. Required dev and driver support. Engine support. Heat and limited OC potential because of the top GPU...

Want more reasons? This was never really a great idea, but rather a necessity to cater to a higher-end segment. That demand segment now has a GPU stack with a higher price ceiling instead, and a much bigger perf gap from top to bottom.
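
On the frame-time point in particular, here's a minimal sketch of why AFR's average FPS can mislead; all the frame-gap numbers below are assumed purely for illustration:

```python
# Minimal AFR micro-stutter illustration (all frame-gap numbers assumed).
# Two GPUs render alternate frames: average FPS doubles, but if frames
# arrive unevenly spaced, perceived smoothness tracks the WORST gap.

paced_gaps   = [10.0, 10.0, 10.0, 10.0]  # ideal AFR presentation gaps (ms)
unpaced_gaps = [4.0, 16.0, 4.0, 16.0]    # badly paced gaps (ms), same average

for name, gaps in (("ideal AFR", paced_gaps), ("bad pacing", unpaced_gaps)):
    avg_fps = 1000.0 / (sum(gaps) / len(gaps))
    felt_fps = 1000.0 / max(gaps)  # smoothness bounded by the longest gap
    print(f"{name}: avg {avg_fps:.0f} FPS, feels closer to {felt_fps:.1f} FPS")
```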
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.64/day)
Location
Ex-usa | slava the trolls
Power usage / scaling. More chips for equal perf. Frametime consistency issues. Required dev and driver support. Engine support. Heat and limited OC potential because of the top GPU...

Want more reasons? This was never really a great idea, but rather a necessity to cater to a higher-end segment. That demand segment now has a GPU stack with a higher price ceiling instead, and a much bigger perf gap from top to bottom.

I see, but we also have sufficiently improved AI technologies that could be used to address frame-time consistency, AFR-related micro-stutter, etc.
I guess someone didn't want to put enough effort into getting the whole initiative up and running well.
 
D

Deleted member 177333

Guest
Yeah, I really miss multi-GPU. I had a great experience with all of my mGPU setups, and with DX12/Vulkan integrating the support so that devs just have to code for it in their games, I had hoped it would be a good thing for the market and remove the whole driver-profile requirement.
 
Joined
Sep 17, 2014
Messages
22,461 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I see, but we also have sufficiently improved AI technologies that could be used to address frame-time consistency, AFR-related micro-stutter, etc.
I guess someone didn't want to put enough effort into getting the whole initiative up and running well.
The effort just isn't economically interesting at all.

It never really was; it was always a highly wasteful practice. Anything that won't scale 100% is a waste in that sense. The reasons SLI/Crossfire existed are simple: a pair of lower-tier GPUs was often price-competitive against a single top-end GPU, and by combining two top-end GPUs, you could get a ridiculous number of frames. Badly scaled, regardless, but big numbers.

So what market was this really for, anyway? The budget gamer who wants more bang for the buck (and gets, along with roughly 10% savings, a whole bunch of problems), and a niche that now buys upwards of 2000-dollar GPUs anyway.
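
To put a rough number on that bang-for-buck pitch, a quick cost-per-frame sketch; all prices and FPS figures here are assumptions for illustration:

```python
# "Two cheaper cards vs one flagship" cost-per-frame sketch.
# All prices and FPS figures are assumed for illustration.

flagship = {"price": 1000, "fps": 100}
midrange = {"price": 500, "fps": 65}

afr_efficiency = 0.7  # assumed fraction of the 2nd card's perf realized
dual_fps = midrange["fps"] * (1 + afr_efficiency)  # ~110 FPS
dual_price = 2 * midrange["price"]                 # 1000

print(f"flagship: ${flagship['price'] / flagship['fps']:.2f} per FPS")
print(f"2x midrange (AFR): ${dual_price / dual_fps:.2f} per FPS")
# Roughly a 10% saving per frame, traded for all the problems listed above.
```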
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.64/day)
Location
Ex-usa | slava the trolls
I don't think it would cost that much to pay good software developers to write the code properly to support CF.
You probably know that scaling with shader count is never linear - 100% more shaders doesn't mean 100% higher FPS.

CF, on the other hand, can sometimes mitigate bottlenecks elsewhere and give more than a 100% FPS uplift.


Crossfire scaling for the Fury X is over 100% (Firestrike) with voltage unlocked + OC : Amd (reddit.com)

Fury X Crossfire Almost 100% scaling @ 4k ! - Graphics Cards - Linus Tech Tips

Radeon 7770 crossfire 100% scaling @ 1200p | Tom's Hardware Forum (tomshardware.com)
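
A quick Amdahl's-law-style sketch of that non-linear shader scaling, and of why a second full card (which also doubles memory bandwidth and front-end resources) can come closer to a clean 2x; the serial_fraction value is an assumption, not a measured figure:

```python
# Amdahl's-law-style sketch: only the shader-bound part of the frame
# scales when you add shaders; the serial_fraction is assumed.

def speedup(shader_multiple: float, serial_fraction: float) -> float:
    """Speedup when only the parallel (shader-bound) part scales."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / shader_multiple)

serial_fraction = 0.15  # assumed share of frame time not limited by shaders

print(f"2x shaders on one die: {speedup(2.0, serial_fraction):.2f}x")
# A second full card also duplicates memory bandwidth, ROPs and front end,
# which is one reason well-behaved AFR titles can land nearer a clean 2.00x.
```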
 
Joined
Sep 17, 2014
Messages
22,461 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I don't think it would cost that much to pay good software developers to write the code properly to support CF.
You probably know that scaling with shader count is never linear - 100% more shaders doesn't mean 100% higher FPS.

CF, on the other hand, can sometimes mitigate bottlenecks elsewhere and give more than a 100% FPS uplift.

Crossfire scaling for the Fury X is over 100% (Firestrike) with voltage unlocked + OC : Amd (reddit.com)

Fury X Crossfire Almost 100% scaling @ 4k ! - Graphics Cards - Linus Tech Tips

Radeon 7770 crossfire 100% scaling @ 1200p | Tom's Hardware Forum (tomshardware.com)
Okay. You think so, but the entire market killed this idea, and not because they hate making money. Fact of the matter is that even DX12's mGPU plan was dropped like a stone.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.64/day)
Location
Ex-usa | slava the trolls
Yes, yes. :rolleyes:

The market has two options:
1. A top-down forced kill-off, which doesn't take the users' preferences and likes into account;
2. A bottom-up push, where the users force the game developers to get their act together and begin to resurrect CF/multi-GPU support:

AMD Radeon RX 6800 XT tested in Multi-GPU configuration - VideoCardz.com

As expected, mGPU support makes more sense as the resolution increases (4K is heavily GPU-bound). Depending on the game settings, we are looking at 101% to 156% performance scaling at 1080p when two RX 6800 XT cards are used. At 1440p, Deus Ex performance even drops to 95% when mGPU is enabled. Overall, it appears that Strange Brigade and Sniper Elite 4 are well optimized for mGPU.
 

Solaris17

Super Dainty Moderator
Staff member
Joined
Aug 16, 2005
Messages
26,975 (3.83/day)
Location
Alabama
System Name RogueOne
Processor Xeon W9-3495x
Motherboard ASUS w790E Sage SE
Cooling SilverStone XE360-4677
Memory 128gb Gskill Zeta R5 DDR5 RDIMMs
Video Card(s) MSI SUPRIM Liquid X 4090
Storage 1x 2TB WD SN850X | 2x 8TB GAMMIX S70
Display(s) 49" Philips Evnia OLED (49M2C8900)
Case Thermaltake Core P3 Pro Snow
Audio Device(s) Moondrop S8's on schitt Gunnr
Power Supply Seasonic Prime TX-1600
Mouse Lamzu Maya Grey
Keyboard Monsgeek M3 Lavender, Moondrop Luna lights
VR HMD Quest 3
Software Windows 11 Pro Workstation
Benchmark Scores I dont have time for that.
miners could pump the market with used cards at attractive prices, which could get some attention, too

While I know this will happen, I hope that every. single. one of them has to eat the cost of that card and they don't sell.
 
Joined
Aug 20, 2007
Messages
21,476 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 905p Optane 960GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
While I know this will happen, I hope that every. single. one of them has to eat the cost of that card and they don't sell.
That's likely to hurt GPU buyers more than the miners, honestly.
 
Joined
Jul 27, 2020
Messages
129 (0.08/day)
I miss multiple GPUs. I've still got the 1500 W power supply from my 295X2 and 290X tri-fire. That was a space heater. I don't care about power usage or heat. I want to have fun.
 
Joined
Feb 3, 2005
Messages
499 (0.07/day)
This seems obvious: NVLink is for the data center/enterprise segment, so it'll stick around.


SLI, on the other hand, is dead. So very, very dead.

Multi-chip modules (or chiplets) are the next generation of GPUs, as they offer some major advantages over huge monolithic dies.

The same engineering that makes the chiplets work for GPU workloads may also allow some synergy with chiplets on other packages/cards.

So, I'm not sure it's dead. Very exciting stuff.
 
Joined
Jun 11, 2017
Messages
276 (0.10/day)
Location
Montreal Canada
Who needs multi-GPU any more? The hardcore gamers have gotten soft. They have settled for less for more. 1440p is now the new 1080p. Gamers basically went into submission. I myself was looking forward to 2160p and even 8K gaming on home computers. SLI and multi-GPUs would have taken us there. But nope, everyone wanted 1440p and "the more Hz, the more it matters."

I remember playing on my 27-inch Flatron Samsung TV running 1280x1024 on my Voodoo 2s in SLI, playing Quake 3 Arena at 60 Hz and over 500 FPS; man, that was smooth. Then we switched to 1080p, and games like Crysis and Far Cry came out, and our mouths dropped to the floor. We all needed new video cards to run these new games; I remember my 8800 GTX, and just getting 60 FPS was hard. It took a few years before the tech improved enough that we could play Crysis and Far Cry at 1080p above 200 FPS.

Then the next big jump was 2160p. That 4K crown was amazing, and whoever could game at it was jaw-dropping. At the time I had two 660 Tis in SLI, and at 1080p they ran everything so sweet, but when I switched to 4K, OMG, games that did 150 FPS dropped to 5 FPS. DOOM was the big thing back in 2016, and running it at 4K was like the king of the gaming crown. I ran my 660 Tis at 1080p on my 4K monitor till I got twin 1080 Tis in SLI. Finally, 4K was real. I was in gamer heaven: 100+ FPS in DOOM (2016) at 4K; man, it was sweet. I even had friends over, and man, they wanted that setup.

When the 2080 series came out, I looked at it and said to myself, where is SLI? I wanted to see improvements at 4K; I wanted to see NVIDIA do the same as in the past: faster GPU and less power. I looked at all the benchmarks and reviews and never saw a giant leap to 4K or 8K. Then I kept hearing about the 3000 series. When it came out, it was all about 1440p. I was like, wait a minute, did I miss something? Are we going backwards in time now, not forward? Why the hell would I want to go from 2160p to 1440p? I wanted to see improvements at the high end; I wanted to see a push for even 8K gaming. But no, nothing: here is a super expensive 3090 card that does great at 1440p. Why on this earth would I spend a huge amount of money on something that is hardly an improvement? All the review sites were raving about the new card, "oh, the output is insane" - a bunch of freaking lies is all I heard. Also, the power needed to run these cards was triple or even more than two 1080 Tis. NVIDIA's roadmap in 2016 was more GPU power, higher FPS, and less power consumption. Now it seems to be more power consumption and less GPU power, sold to gamers to make them think they are getting the best.

I tested a 3090 just to see if it was better. I ran all my games at 4K, and average FPS was maybe 10 FPS higher than my 1080 Tis in SLI; no real big improvement at all. I took the card back and said I wanted a refund. The guy at the counter asked me why. I said, till NVIDIA actually starts making improvements on video cards. Even today, 8K is becoming the new norm, and 12K is starting to come out. I will wait and see about the 4000 series; if I see 4K doing over 200 FPS, then I will be impressed, and I might buy one to replace my SLI setup. Till then, NVIDIA and AMD do not impress me one damn bit. Also, the gamers and reviews stuck in the past at 1440p: grow up!
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,247 (6.64/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
While I know this will happen, I hope that every. single. one of them has to eat the cost of that card and they don't sell.
There is one on here trying to sell many for 875 apiece; I doubt anyone has responded.
 
Joined
May 11, 2018
Messages
1,257 (0.52/day)
Who needs multi-GPU any more? The hardcore gamers have gotten soft. They have settled for less for more. 1440p is now the new 1080p. Gamers basically went into submission...
... Also, the gamers and reviews stuck in the past at 1440p: grow up!

I never had that feeling. All reviewers now include 4K numbers as well, and there are always articles about how to get there even with lower-end cards. And the high-end cards of this generation get more uplift at higher resolutions, so the 3080 and upwards really show their strength at 4K.

My feeling was rather that the screen industry is somewhat lagging. It took a long time for relatively good 27-28 inch 4K high-refresh IPS monitors to arrive, and high-refresh 4K 30+ inch monitors are only just arriving. But they are pricey, and how can monitor companies sell us old technologies (VA, IPS) with all their flaws at a premium when we can go and buy a 55" high-refresh OLED for roughly the same money? Something isn't priced right here!
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.64/day)
Location
Ex-usa | slava the trolls
My feeling was rather that the screen industry is somewhat lagging. It took a long time for relatively good 27-28 inch 4K high-refresh IPS monitors to arrive, and high-refresh 4K 30+ inch monitors are only just arriving. But they are pricey

I think we have just reached the historical low for 3840x2160 screen prices:


4K Monitor Price Comparison | Buy Cheap at idealo (idealo.de)
 

bug

Joined
May 22, 2015
Messages
13,780 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
I see but also we have sufficiently improved AI technologies that can be used for better frame time consistency, AFR related micro-stutter, etc.
I guess someone didn't want to put enough effort in making the whole initiative up and running well.
Spot on! I mean, they just launched SLI in 2004 and bam! Only a short 18 years later, they give up.

The idea did not work, plain and simple. Most of its touted appeal was "instead of an upgrade, just throw a second (cheap) card in there and you're good". That clearly didn't work; buying a new-generation card was always better. SLI/Crossfire only made some sense for those who had already bought the highest-end GPUs and couldn't get better performance anywhere else. But since those are a tiny fraction of the market, the business case just wasn't there.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.64/day)
Location
Ex-usa | slava the trolls
Spot on! I mean, they just launched SLI in 2004 and bam! Only a short 18 years later, they give up.

The idea did not work, plain and simple. Most of its touted appeal was "instead of an upgrade, just throw a second (cheap) card in there and you're good". That clearly didn't work; buying a new-generation card was always better. SLI/Crossfire only made some sense for those who had already bought the highest-end GPUs and couldn't get better performance anywhere else. But since those are a tiny fraction of the market, the business case just wasn't there.

It works, just with some compromises, but it's still better than a single, much slower one-GPU setup.

You mean two Radeon RX 6800 XT 16 GB in CrossFire in 2020 are better than one Radeon RX 7800 XT in 2023. :D
 
Joined
May 11, 2018
Messages
1,257 (0.52/day)
SLI/Crossfire was always completely uninteresting to me, since it never worked correctly with less popular but still graphically demanding games, such as flight simulators.

Sure, reviews could gather a handful of games that did scale well with two cards (even if the issues with frame pacing were largely ignored), and they did show you that in some games it doesn't work at all, and you could even get negative scaling...

But if you weren't playing the latest AAA game with the latest gaming engine, you were screwed. And even with games that did work, every game update and every new driver meant new fears that something would be broken. SLI/Crossfire was never treated as a function that had to work.
 

bug

Joined
May 22, 2015
Messages
13,780 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
It works, just with some compromises, but it's still better than a single, much slower one-GPU setup.

You mean two Radeon RX 6800 XT 16 GB in CrossFire in 2020 are better than one Radeon RX 7800 XT in 2023. :D
Once again, those GPUs are well below 10% market share. There's no business case here.
 

ARF

Joined
Jan 28, 2020
Messages
4,670 (2.64/day)
Location
Ex-usa | slava the trolls
Once again, those GPUs are well below 10% market share. There's no business case here.

Sometimes you must invest more effort in brand image and reputation.
Of course, my expectations are too high, since we all know that AMD's business case is very poor and its direct competitors manage the market better.
 

bug

Joined
May 22, 2015
Messages
13,780 (3.96/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Sometimes you must invest more effort in brand image and reputation.
Of course, my expectations are too high, since we all know that AMD's business case is very poor and its direct competitors manage the market better.
And there is zero chance that, after more than 15 years, they simply gave up because it just didn't make sense paying for testing, support, and keeping profiles updated for games both new and a decade old. Right?
 