
Potential graphics card shortage incoming... interesting idea. A.I. demand taking up TSMC space.

Joined
Jun 21, 2021
Messages
3,093 (2.51/day)
System Name daily driver Mac mini M2 Pro
Processor Apple proprietary M2 Pro (6 p-cores, 4 e-cores)
Motherboard Apple proprietary
Cooling Apple proprietary
Memory Apple proprietary 16GB LPDDR5 unified memory
Video Card(s) Apple proprietary M2 Pro (16-core GPU)
Storage Apple proprietary onboard 512GB SSD + various external HDDs
Display(s) LG UltraFine 27UL850W (4K@60Hz IPS)
Case Apple proprietary
Audio Device(s) Apple proprietary
Power Supply Apple proprietary
Mouse Apple Magic Trackpad 2
Keyboard Keychron K1 tenkeyless (Gateron Reds)
VR HMD Oculus Rift S (hosted on a different PC)
Software macOS Sonoma 14.7
Benchmark Scores (My Windows daily driver is a Beelink Mini S12 Pro. I'm not interested in benchmarking.)
This demand for machine learning hardware should not come as a surprise to anyone who follows semiconductor technology.

Nvidia has enjoyed a flourishing and fast-growing AI business for several years. You can read about how various companies and industries use machine learning in the enterprise-focused sections of Nvidia's website; examples include grocery chains doing inventory analysis.

Also, Nvidia's datacenter business eclipsed their gaming business a while ago (a year ago? maybe more?).

As we know, from a consumer angle, Nvidia's most prominent machine learning feature has been DLSS so far, but other applications have been in use for a while. I've used both Nvidia Broadcast and Nvidia Canvas for a couple of years.

Apple added machine learning cores (their "Neural Engine") starting with the A11 SoC back in 2017, and opened them up to third-party developers the following year with the A12 SoC.

Without a doubt, Nvidia will continue serving the graphics industry rather than becoming an AI pure play.

Perhaps the most interesting thing about Jensen's SIGGRAPH keynote was the heavy emphasis on the Nvidia Omniverse platform as a cloud technology. You can do the basic prototyping on an AI hardware equipped workstation but do all of the analysis and other heavy lifting on datacenter-hosted cloud systems.

That means Nvidia isn't planning on selling every GPU chip. They rent out GPUs for AI, just like Amazon AWS has been renting out spare computing resources since the mid-2000s.

In the long run, a big corporation will find it more cost effective to buy its own GPU hardware and integrate it into its data centers. But for smaller companies, startups, and newcomers to machine learning, renting GPU cycles from Nvidia is a possibility, just like using Amazon EC2 instances for small computing projects.
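The rent-vs-buy tradeoff comes down to simple break-even arithmetic. A rough sketch, with every price here a made-up placeholder rather than any real Nvidia or AWS rate:

```python
# Break-even point between renting GPU hours and buying your own hardware.
# All dollar figures below are hypothetical, for illustration only.
RENT_PER_GPU_HOUR = 2.50        # assumed cloud price per GPU-hour ($)
PURCHASE_PRICE = 25_000.0       # assumed upfront cost of one datacenter GPU ($)
POWER_AND_OPS_PER_HOUR = 0.40   # assumed electricity + hosting cost per hour ($)

def breakeven_hours(rent: float, purchase: float, ops: float) -> float:
    """Hours of utilization at which owning becomes cheaper than renting."""
    return purchase / (rent - ops)

hours = breakeven_hours(RENT_PER_GPU_HOUR, PURCHASE_PRICE, POWER_AND_OPS_PER_HOUR)
print(f"Owning wins after ~{hours:,.0f} GPU-hours "
      f"(~{hours / 24 / 365:.1f} years at 100% utilization)")
```

With these made-up numbers, a corporation running GPUs around the clock crosses the break-even in under two years, while a startup with bursty usage may never reach it, which is exactly the argument for renting.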

CAGR for the machine learning market blows the doors off the consumer graphics business, which plods along. Nothing new there; it has been in Nvidia's quarterly earnings statements and the slide decks they post on their site. Nvidia is putting its focus on the market with the most upside over the next ten years. It would make little sense for them to prioritize churning out graphics cards for the DIY consumer PC market when margins are so much higher elsewhere. Nvidia is a publicly owned company, and its number one priority is to increase shareholder value.
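For reference, CAGR is just compound growth solved for the annual rate. A quick sketch with purely illustrative numbers (not actual market figures):

```python
def cagr(start_value: float, end_value: float, years: float) -> float:
    """Compound annual growth rate: (end/start)^(1/years) - 1."""
    return (end_value / start_value) ** (1 / years) - 1

# Hypothetical example: a market growing from $10B to $40B over 4 years.
rate = cagr(10, 40, 4)
print(f"CAGR: {rate:.1%}")  # ~41.4% per year
```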
 

Deleted member 229121

Guest
Even if the demand is high I don't see things being as bad as they were during the pandemic coupled with a Crypto boom.
At least this is being planned for, that was multiple crap-storms colliding all at once.

Already have a 4090 dgaf.



Had to lol.
 
Joined
Aug 6, 2020
Messages
729 (0.47/day)
Anyone who did not know this was not paying attention. For one of many examples: if Nvidia isn't relying on people buying video cards, they can mark prices up as much as they like, and AMD will follow along, with Intel too if they get their act together.

yeah, the continued absence of sub-$350 NVIDIA GPUs at MSRP from major retailers, a year post-Ethereum, is a pretty clear sign of where the industry is headed!

the move to 8 GB VRAM/128-bit bus on the 4060/Ti is a step in the right direction, but it didn't stop the 3050 from sitting $50-100 over MSRP since launch!

the only thing that dropped those 3050 prices was the 4060, and now the 4060 looks to be the first value card worth buying in years!

 
Joined
Sep 10, 2018
Messages
6,831 (3.04/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
Even if the demand is high I don't see things being as bad as they were during the pandemic coupled with a Crypto boom.
At least this is being planned for, that was multiple crap-storms colliding all at once.





Had to lol.

I've actually expected this to be the next thing for a while, and even if it isn't as bad as the crypto/pandemic shortages, we are already paying more than in prior generations, with pretty much every SKU moved up a tier in pricing or offering little to no improvement gen on gen.

With the recent rumors of AMD ditching the high end and Nvidia solely caring about AI it isn't a very pretty picture for gamers.
 
Joined
Jan 18, 2020
Messages
804 (0.46/day)
What is the killer end user app for LLMs and how do I use and / or invest in it?
 

AsRock

TPU addict
Joined
Jun 23, 2007
Messages
19,064 (3.00/day)
Location
UK\USA
I honestly don't see the economic payback there... it would seem to my noobish experience with AI that it would be much cheaper and more efficient for someone to rent AWS or AZURE rigs for AI than try to build out a gfx farm....

Also isn't the value of AI in the training and the application? What is the application of a small ai farm?

It's getting bad already: there are already fake YouTube streamers, and some charge to watch, too.
 
Joined
Feb 3, 2023
Messages
212 (0.33/day)
Cryptocurrency was easy: just buy as many GPUs as you can afford, run a miner, and profit. Great for the little guy, but not very enticing for corporations, since they mostly avoid dynamic and volatile markets. With generative tools, and AI more broadly, it's different; they're here to stay, so corporations will divert manufacturing for their needs, and the little guy can still profit from this. I personally know several people who have already started building small "AI farms", creating fake social media/internet prostitution accounts and beginning, as they call it, to "milk simps" with generated photos, chatbots posing as females, and such. As it turns out, the ROI is much higher than with cryptocurrency, because this kind of consumer is surprisingly naive and easy to separate from his money, and there are a lot of them. You can start by using available services, but as demand increases, it becomes much more cost effective to run your own. It's also very much a parallelized workload (since you're serving multiple people at once), so splitting it between several "consumer-grade" machines is not an impediment.
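The "parallelized workload" point holds because each chat session or image prompt is independent, so requests can simply be fanned out round-robin across whatever machines are available. A minimal sketch; the machine names and `handle_request` stub are hypothetical stand-ins for forwarding prompts to models running on those boxes:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical pool of consumer-grade machines, each running its own model.
WORKERS = ["box-1", "box-2", "box-3"]

def handle_request(worker: str, request: str) -> str:
    # Placeholder: in practice this would forward the prompt to the model
    # running on `worker` and return the generated reply.
    return f"{worker} served: {request}"

# Independent requests from different users can be served concurrently.
requests = [f"request-{i}" for i in range(6)]

with ThreadPoolExecutor(max_workers=len(WORKERS)) as pool:
    results = list(pool.map(
        handle_request,
        (WORKERS[i % len(WORKERS)] for i in range(len(requests))),  # round-robin
        requests,
    ))
print(results)
```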
That being said, I'm sure that corporations diverting their attention and manufacturing capacity towards the much more profitable market will be felt by consumers more significantly.
 
Joined
Jan 18, 2020
Messages
804 (0.46/day)
That has all existed for a long time. Are people writing their own models and training them, or is it off-the-shelf code from somewhere? That is, once they've got the hardware to run it.

To me, this is all the biggest load of BS in a long time. Sure, there are some decent use cases for LLMs, but nothing about them screams massive game changer; just better chatbots and programming tools, for the most part.

There's more utility than blockchain, of course; not a very high bar!
 
Joined
Feb 3, 2023
Messages
212 (0.33/day)
Such things existed, but they were much less advanced. Now you can easily generate "good enough" looking photos in seconds and build much more personalized and realistic language models. There is an increasing number of off-the-shelf solutions that you can easily customize to behave like a single person with a defined personality.

For me, one of the best use cases, a game changer so to speak, is something like the "summarizer" in Brave Search: no visiting random websites with bullshit SEO texts to get to the information I need, just a straight, short answer.
 
Joined
Jun 21, 2021
Messages
3,093 (2.51/day)
For anyone really interested in this topic, watch Jensen's keynote from this week's SIGGRAPH conference.

There's far more to AI than LLMs. In fact, he barely mentioned them during his presentation, which is understandable since it's a computer graphics conference.
 
Joined
Jan 5, 2006
Messages
18,584 (2.70/day)
System Name AlderLake
Processor Intel i7 12700K P-Cores @ 5Ghz
Motherboard Gigabyte Z690 Aorus Master
Cooling Noctua NH-U12A 2 fans + Thermal Grizzly Kryonaut Extreme + 5 case fans
Memory 32GB DDR5 Corsair Dominator Platinum RGB 6000MT/s CL36
Video Card(s) MSI RTX 2070 Super Gaming X Trio
Storage Samsung 980 Pro 1TB + 970 Evo 500GB + 850 Pro 512GB + 860 Evo 1TB x2
Display(s) 23.8" Dell S2417DG 165Hz G-Sync 1440p
Case Be quiet! Silent Base 600 - Window
Audio Device(s) Panasonic SA-PMX94 / Realtek onboard + B&O speaker system / Harman Kardon Go + Play / Logitech G533
Power Supply Seasonic Focus Plus Gold 750W
Mouse Logitech MX Anywhere 2 Laser wireless
Keyboard RAPOO E9270P Black 5GHz wireless
Software Windows 11
Benchmark Scores Cinebench R23 (Single Core) 1936 @ stock Cinebench R23 (Multi Core) 23006 @ stock
Joined
Aug 14, 2013
Messages
2,373 (0.58/day)
System Name boomer--->zoomer not your typical millenial build
Processor i5-760 @ 3.8ghz + turbo ~goes wayyyyyyyyy fast cuz turboooooz~
Motherboard P55-GD80 ~best motherboard ever designed~
Cooling NH-D15 ~double stack thot twerk all day~
Memory 16GB Crucial Ballistix LP ~memory gone AWOL~
Video Card(s) MSI GTX 970 ~*~GOLDEN EDITION~*~ RAWRRRRRR
Storage 500GB Samsung 850 Evo (OS X, *nix), 128GB Samsung 840 Pro (W10 Pro), 1TB SpinPoint F3 ~best in class
Display(s) ASUS VW246H ~best 24" you've seen *FULL HD* *1O80PP* *SLAPS*~
Case FT02-W ~the W stands for white but it's brushed aluminum except for the disgusting ODD bays; *cries*
Audio Device(s) A LOT
Power Supply 850W EVGA SuperNova G2 ~hot fire like champagne~
Mouse CM Spawn ~cmcz R c00l seth mcfarlane darawss~
Keyboard CM QF Rapid - Browns ~fastrrr kees for fstr teens~
Software integrated into the chassis
Benchmark Scores 9999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999999
That’s been happening for centuries… there are far higher stakes for AI if you’re worried about shallow/deep-fakes
 

AsRock

TPU addict
Joined
Jun 23, 2007
Messages
19,064 (3.00/day)
Location
UK\USA
That’s been happening for centuries… there are far higher stakes for AI if you’re worried about shallow/deep-fakes

Yeah, it was one of many examples; it was just one where I thought I might not get accused of something.
 
Joined
Jan 14, 2019
Messages
12,247 (5.77/day)
Location
Midlands, UK
System Name Nebulon B
Processor AMD Ryzen 7 7800X3D
Motherboard MSi PRO B650M-A WiFi
Cooling be quiet! Dark Rock 4
Memory 2x 24 GB Corsair Vengeance DDR5-4800
Video Card(s) AMD Radeon RX 6750 XT 12 GB
Storage 2 TB Corsair MP600 GS, 2 TB Corsair MP600 R2
Display(s) Dell S3422DWG, 7" Waveshare touchscreen
Case Kolink Citadel Mesh black
Audio Device(s) Logitech Z333 2.1 speakers, AKG Y50 headphones
Power Supply Seasonic Prime GX-750
Mouse Logitech MX Master 2S
Keyboard Logitech G413 SE
Software Bazzite (Fedora Linux) KDE
I honestly don't see the economic payback there... it would seem to my noobish experience with AI that it would be much cheaper and more efficient for someone to rent AWS or AZURE rigs for AI than try to build out a gfx farm....

Also isn't the value of AI in the training and the application? What is the application of a small ai farm?
Exactly my thoughts watching the video... home-built AI farms... yeah, sure, for what purpose?
 
Joined
Sep 17, 2014
Messages
22,300 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I've actually expected this to be the next thing for a while, and even if it isn't as bad as the crypto/pandemic shortages, we are already paying more than in prior generations, with pretty much every SKU moved up a tier in pricing or offering little to no improvement gen on gen.

With the recent rumors of AMD ditching the high end and Nvidia solely caring about AI it isn't a very pretty picture for gamers.
It's fine; just remove RT and Unreal Engine and we can run any game on our current hardware. Gaming graphics have been done for over ten years, really. It shows, because all the improvement today is in fact regression; we already had all of what gets sold to us as new today. The best games are in history, not on the horizon, and this has been the case for years now.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,058 (4.65/day)
Location
Kepler-186f
It's fine; just remove RT and Unreal Engine and we can run any game on our current hardware. Gaming graphics have been done for over ten years, really. It shows, because all the improvement today is in fact regression; we already had all of what gets sold to us as new today. The best games are in history, not on the horizon, and this has been the case for years now.

that's all very subjective mate, I mean Endwalker is a relatively new game and is my all time favorite game, and Dawntrail the next game is probably also going to be a lot of fun for me.

and until I can push 150+ fps in all games at 1440p ultra with ray tracing turned off, I won't be fully content. I prefer the extra smoothness of high refresh, even if you don't.
 
Joined
Sep 17, 2014
Messages
22,300 (6.02/day)
Location
The Washing Machine
that's all very subjective mate, I mean Endwalker is a relatively new game and is my all time favorite game, and Dawntrail the next game is probably also going to be a lot of fun for me.

and until I can push 150+ fps in all games at 1440p ultra with ray tracing turned off, I won't be fully content. I prefer the extra smoothness of high refresh, even if you don't.
Preference and necessity are two entirely different things. In the end you play what you can run, right?

Though content-wise, sure, new games still get released, but they don't need newer graphics. Much of the graphical improvement we get now is more of a sidegrade, or it's a GPU trick like DSR or DLSS, with its own traits. It's barely up to the engine now; all engines look fine. It's what devs put into the game in terms of art and design that makes the difference, and that echoes down history. Graphics are the technology, not the content, and a means, not a purpose.
 
Joined
Jan 14, 2019
Messages
12,247 (5.77/day)
Location
Midlands, UK
Preference and necessity are two entirely different things. In the end you play what you can run, right?

Though content-wise, sure, new games still get released, but they don't need newer graphics. Much of the graphical improvement we get now is more of a sidegrade, or it's a GPU trick like DSR or DLSS, with its own traits. It's barely up to the engine now; all engines look fine. It's what devs put into the game in terms of art and design that makes the difference, and that echoes down history. Graphics are the technology, not the content, and a means, not a purpose.
Where have the times gone when we played Doom 95 Shareware locked at 30 FPS and had tons of fun?
 
Joined
Sep 10, 2018
Messages
6,831 (3.04/day)
Location
California
Its fine, just remove RT and unreal engine and we can run any game on our current hardware. Gaming graphics have been done for over ten years, really. It shows because all improvement today is in fact regression. We already had all of what we get sold as new today. The best games are in history not on the horizon; this has been the case for years now.

I mean, I'm still looking forward to games like Gears of War 6 and The Witcher 4, or whatever it's called, really pushing the envelope when it comes to visual design. A big reason I bought my 4090 was to play The Witcher 3 next-gen maxed at 4K, and while the base graphics are more than fine, the game is overall more immersive maxed out; even some more recent games look quite a bit worse, in my opinion, than the full-RT version of it. Now, if people want to play old games with zero visual improvements, good for them.

Where have the times gone when we played Doom 95 Shareware locked at 30 FPS and had tons of fun?

We have higher expectations nearly 30 years later. Although plenty of people game on consoles or the Steam Deck at 30 fps and are fine with it, I never liked it, not even in the '90s.
 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,058 (4.65/day)
Location
Kepler-186f
I mean, I'm still looking forward to games like Gears of War 6 and The Witcher 4, or whatever it's called, really pushing the envelope when it comes to visual design. A big reason I bought my 4090 was to play The Witcher 3 next-gen maxed at 4K, and while the base graphics are more than fine, the game is overall more immersive maxed out; even some more recent games look quite a bit worse, in my opinion, than the full-RT version of it. Now, if people want to play old games with zero visual improvements, good for them.



We have higher expectations nearly 30 years later. Although plenty of people game on consoles or the Steam Deck at 30 fps and are fine with it, I never liked it, not even in the '90s.

I'm a high refresh snob, and many here know that. But 30 fps capped games look great on the Steam Deck; for some reason I honestly don't mind capped games on the Deck. Not sure if Linux is smoothing out the frames or what, but 30 fps just looks better on the Deck than 30 fps on Windows. Not sure why.

But yeah, I love my Steam Deck for capped games; it complements my high refresh gaming PC nicely. I imagine the Steam Deck 2 being OLED will improve whatever it is doing even more, so I can't wait for that.
 
Joined
Sep 17, 2014
Messages
22,300 (6.02/day)
Location
The Washing Machine
I mean, I'm still looking forward to games like Gears of War 6 and The Witcher 4, or whatever it's called, really pushing the envelope when it comes to visual design. A big reason I bought my 4090 was to play The Witcher 3 next-gen maxed at 4K, and while the base graphics are more than fine, the game is overall more immersive maxed out; even some more recent games look quite a bit worse, in my opinion, than the full-RT version of it. Now, if people want to play old games with zero visual improvements, good for them.



We have higher expectations nearly 30 years later. Although plenty of people game on consoles or the Steam Deck at 30 fps and are fine with it, I never liked it, not even in the '90s.
We do, and yet I never really cared much for them. I mean, I enjoyed watching graphics grow from Pong to The Witcher 3, but I never had the feeling that graphics make or break a game. Mechanics, content, and world/setting make the game for me. Whether it's Mystic Quest on the SNES or Skyrim with six dozen visual enhancers, I can't say I have more fun in Skyrim. On the contrary, perhaps, because the focus is on 'how good can I make it look' rather than 'how awesome is the gameplay'. Mystic Quest is a similar game with a large game world and RPG elements, but it has actually functional combat, while Skyrim, with all its focus on presentation, left us with broken combat, ultra-clunky movement, and the ability to jump up the side of mountains for whatever reason. Yay for 3D...

Cyberpunk in RT? I honestly didn't care for it for one second. I tried another playthrough on the 7900XT; it's just completely skippable to me, because the content was old. Okay, things are more shiny. It became a gimmick the moment I realized it doesn't affect the game, and it really didn't add to immersion either. Immersion happens when the game grabs me, not its graphics, and frankly, CBP looks plasticky, artificial, and stylized rather than hyperrealistic, and RT almost makes the whole thing a reflective parody of itself.
 
Joined
Aug 10, 2023
Messages
341 (0.75/day)
Mechanics, content, world/setting makes the game for me.
Right; for me, a game is foremost about how it feels, the atmosphere it gives, the vibes. That has never had anything to do with the technology it uses, and far more with how well the game was made and how much love the dev put into it, for example.
 