
AMD Completes Acquisition of Silo AI

Joined
Nov 4, 2005
Messages
12,057 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
Saying this is like the dotcom bubble isn't the negative implication you seem to think it is. Online retail rose during the dotcom bubble and became successful because of the funding it provided:


"In 2000, the dot-com bubble burst, and many dot-com startups went out of business after burning through their venture capital and failing to become profitable.[5] However, many others, particularly online retailers like eBay and Amazon, blossomed and became highly profitable."

This is the same for any new market: a ton of startups pop up, and most of them go out of business finding out what does and doesn't work.

If you are implying that AI has no use, you'd be dead wrong; just from a medical and engineering perspective it's already indispensable.

The AI of everything will burst.

Bubble? Burst? It was a medium-size correction before the cancer-like growth that hasn't stopped since; it only eased a bit in 2023. One more bubble burst like that, and Nvidia + Amazon + Apple + Microsoft + Google + a couple of AI newcomers will constitute 80% of the world economy before 2050.


It's a bubble that is going to burst once people realize how little actual work it is doing and how many resources it's sucking up for mediocre AI BS; it can't make me coffee yet. The dot-com bubble burst when people didn't flock to all the websites that were flashy but did nothing, and this will be the same. Then it will be put to use where it should be: drug research, climate modeling, DNA function modeling, physics. The fact that even with a stack of hardware it can't handle the data density means it's never going to scale to truly functional status within current silicon limitations, except in some supercomputers or for niche applications. A company I work with uses machine learning to determine seed health/viability in almost real time, and it takes a whopping two GPUs. I don't need AI to text my wife and ask what's for dinner.
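
For a sense of scale, a seed-viability checker like that is essentially a small model doing fast inference on camera frames. A purely hypothetical sketch (the model, layer sizes, and two-GPU split are invented for illustration, not the company's actual system):

```python
import torch
import torch.nn as nn

class SeedNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 16 * 16, 2)   # two classes: viable / not viable

    def forward(self, x):
        x = self.features(x)
        return self.head(x.flatten(1))

model = SeedNet().eval()
use_two_gpus = torch.cuda.device_count() >= 2
if use_two_gpus:
    # Split each incoming batch of seed images across the two GPUs.
    model = nn.DataParallel(model.cuda(), device_ids=[0, 1])

with torch.no_grad():
    batch = torch.rand(64, 3, 64, 64)             # stand-in for camera frames
    if use_two_gpus:
        batch = batch.cuda()
    scores = torch.softmax(model(batch), dim=1)   # per-seed viability scores
    print(scores.shape)                           # torch.Size([64, 2])
```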
 
Joined
Jul 13, 2016
Messages
3,418 (1.10/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
The AI of everything will burst.

That is not going to happen, given that AI is already used in designing computer chips and many other products.

As an example, AI-based OPC produces a much higher-quality lithography mask that is more resistant to errors and variation. Part of that variation comes down to the difference between a design and what the lithography equipment actually produces, and AI-based OPC is better at accounting for it and adjusting the mask so that what gets printed more closely reflects the intended design.
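
The rough idea: a model learns to pre-distort the mask so that what the scanner actually prints matches the intended layout. A toy sketch of that, assuming a crude blur-plus-threshold stand-in for the litho step (nothing like a production OPC flow; the sizes and the "psf" kernel below are invented):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def print_pattern(mask, kernel):
    """Crude stand-in for the scanner: blur the mask, then soft-threshold."""
    blurred = F.conv2d(mask, kernel, padding=kernel.shape[-1] // 2)
    return torch.sigmoid(10 * (blurred - 0.5))

# Fixed blur kernel playing the role of the optical point-spread function.
psf = torch.ones(1, 1, 5, 5) / 25.0

# Tiny "corrector" network: intended layout in, pre-distorted mask out.
net = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, 3, padding=1), nn.Sigmoid(),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(200):
    target = (torch.rand(8, 1, 32, 32) > 0.7).float()   # random stand-in layouts
    mask = net(target)                     # predicted pre-distorted mask
    printed = print_pattern(mask, psf)     # what the "scanner" would produce
    loss = F.mse_loss(printed, target)     # printed result vs. intended design
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"final training loss: {loss.item():.4f}")
```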

In addition, there are already products on the market that utilize AI-based reductive design (check out the Titan fingertip mouse for an example). This is a process by which a design is optimized by an AI through the removal of unnecessary material. This single use of the technology alone has the potential to reduce cost and waste on a massive scale.
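
The reductive-design idea boils down to iteratively stripping out material that isn't pulling its weight. A deliberately crude sketch, where a fake per-cell "importance" score stands in for the real stress analysis (or neural surrogate) such tools actually run:

```python
import numpy as np

def importance(density):
    """Stand-in for a real stress analysis: a crude bending-moment-like profile
    that peaks at mid-span, so material near the middle of the part is treated
    as more load-bearing."""
    _, w = density.shape
    x = np.linspace(0, 1, w)
    moment = np.minimum(x, 1 - x) * 2            # 0 at the supports, 1 at mid-span
    return density * (0.2 + moment[None, :])

grid = np.ones((20, 60))          # start from a solid block of material
target_fraction = 0.4             # keep 40% of the original mass

while grid.mean() > target_fraction:
    score = importance(grid)
    score[grid == 0] = np.inf                    # skip already-removed cells
    r, c = np.unravel_index(np.argmin(score), score.shape)
    grid[r, c] = 0.0                             # remove the least useful cell

print(f"remaining material: {grid.mean():.0%}")
```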

The fact that even with a stack of hardware it can't handle the data density means it's never going to scale to truly functional status within current silicon limitations, except in some supercomputers or for niche applications. A company I work with uses machine learning to determine seed health/viability in almost real time, and it takes a whopping two GPUs. I don't need AI to text my wife and ask what's for dinner.

I wouldn't take a non-purpose-built device like a GPU as a true measure of what future AI ASICs will be able to accomplish in terms of performance per watt.
 
Joined
May 3, 2018
Messages
2,881 (1.17/day)
Saying this is like the dotcom bubble isn't the negative implication you seem to think it is. Online retail rose during the dotcom bubble and became successful because of the funding it provided:


"In 2000, the dot-com bubble burst, and many dot-com startups went out of business after burning through their venture capital and failing to become profitable.[5] However, many others, particularly online retailers like eBay and Amazon, blossomed and became highly profitable."

This is the same for any new market: a ton of startups pop up, and most of them go out of business finding out what does and doesn't work.

If you are implying that AI has no use, you'd be dead wrong; just from a medical and engineering perspective it's already indispensable.
Well, my research group was basically wiped out and hundreds of us lost our jobs. And we were a high-tech optics group doing fibre optics and the like, not some retailer.
 
Joined
Apr 24, 2020
Messages
2,794 (1.61/day)
As an example, AI-based OPC produces a much higher-quality lithography mask that is more resistant to errors and variation. Part of that variation comes down to the difference between a design and what the lithography equipment actually produces, and AI-based OPC is better at accounting for it and adjusting the mask so that what gets printed more closely reflects the intended design.

In addition, there are already products on the market that utilize AI-based reductive design (check out the Titan fingertip mouse for an example). This is a process by which a design is optimized by an AI through the removal of unnecessary material. This single use of the technology alone has the potential to reduce cost and waste on a massive scale.

As I've said before: it's never called "AI" when it's a true product with an actual use.

Various steps of chip manufacturing are called automated theorem proving, binary decision diagrams, verification, layout and routing, and more. They were called "AI" back in the 1980s, during another AI bubble, but today these "old AI" tricks have proper terms that everyone understands, along with the vocabulary to describe why one algorithm is better than another.

Ex: Automated Layout and Routing can be tuned for less space or for higher performance, depending on the chip design. It's no longer "AI"; it's just a tool used by chip manufacturers.
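
That tuning is just a weight in an ordinary cost function. A toy placer in that spirit, as a sketch (schematic simulated annealing over a handful of cells; real EDA placers are vastly more sophisticated):

```python
import math
import random

NETS = [(0, 1), (1, 2), (2, 3), (0, 3), (1, 3)]     # which cells are connected
N_CELLS = 4

def cost(pos, area_weight):
    wirelength = sum(abs(pos[a][0] - pos[b][0]) + abs(pos[a][1] - pos[b][1])
                     for a, b in NETS)
    xs, ys = zip(*pos)
    area = (max(xs) - min(xs) + 1) * (max(ys) - min(ys) + 1)   # bounding box
    return wirelength + area_weight * area

def place(area_weight, steps=5000, temp=5.0):
    pos = [(random.randint(0, 9), random.randint(0, 9)) for _ in range(N_CELLS)]
    cur = cost(pos, area_weight)
    for _ in range(steps):
        i = random.randrange(N_CELLS)
        old = pos[i]
        pos[i] = (random.randint(0, 9), random.randint(0, 9))  # try moving one cell
        new = cost(pos, area_weight)
        # Accept improvements always; accept worse moves with annealing probability.
        if new <= cur or random.random() < math.exp((cur - new) / temp):
            cur = new
        else:
            pos[i] = old                                       # reject the move
        temp *= 0.999                                          # cool down
    return pos, cur

# Tune for density (high area weight) or for wirelength/speed (low area weight).
print(place(area_weight=2.0))
print(place(area_weight=0.1))
```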

And mind you: these tools don't need data centers full of H100s. The hardware that's actually kinda-sorta useful for this task is 3D V-Cache EPYC servers, IIRC.

----------

Today's modern AI is summarized as:

1. Buy (or rent) NVidia chips

2. Run them with huge amounts of electricity.

3. ???!?!?!?!????

4. Profit.

And everyone seems to trip up at what step#3 entails exactly. Don't get me wrong, there are plenty of legitimate uses at step#3. But in my experience, the people who know what that task is don't call it "AI". They call it "Bin Packing", or "NP Complete solution", or "Automated Routing". More specific words that other technical people recognize, so that we know who the bullshitters are versus the real wizards.
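
"Bin Packing", for instance, is a concrete, well-named problem with classic heuristics that need no GPUs at all. A minimal first-fit-decreasing sketch:

```python
def first_fit_decreasing(items, capacity):
    bins = []                                   # each bin is a list of item sizes
    for item in sorted(items, reverse=True):    # biggest items first
        for b in bins:
            if sum(b) + item <= capacity:       # drop it in the first bin it fits
                b.append(item)
                break
        else:
            bins.append([item])                 # otherwise open a new bin
    return bins

# Example: packing jobs of various sizes onto machines of capacity 10.
print(first_fit_decreasing([9, 8, 2, 2, 5, 4], capacity=10))
# -> [[9], [8, 2], [5, 4], [2]]
```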

Not that I'm a high-wizard or anything. But I've seen them talk once or twice.
 
Joined
Aug 22, 2007
Messages
3,617 (0.57/day)
Location
Terra
System Name :)
Processor Intel 13700k
Motherboard Gigabyte z790 UD AC
Cooling Noctua NH-D15
Memory 64GB GSKILL DDR5
Video Card(s) Gigabyte RTX 4090 Gaming OC
Storage 960GB Optane 905P U.2 SSD + 4TB PCIe4 U.2 SSD
Display(s) Alienware AW3423DW 175Hz QD-OLED + AOC Agon Pro AG276QZD2 240Hz QD-OLED
Case Fractal Design Torrent
Audio Device(s) MOTU M4 - JBL 305P MKII w/2x JL Audio 10 Sealed --- X-Fi Titanium HD - Presonus Eris E5 - JBL 4412
Power Supply Silverstone 1000W
Mouse Roccat Kain 122 AIMO
Keyboard KBD67 Lite / Mammoth75
VR HMD Reverb G2 V2
Software Win 11 Pro
More specific words that other technical people recognize, so that we know who the bullshitters are versus the real wizards.
Showtime Recording GIF by CBS
 
Joined
Jul 13, 2016
Messages
3,418 (1.10/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
As I've said before: it's never called "AI" when it's a true product with an actual use.

This is for the most part true but it's an entirely separate argument.

Various steps of chip manufacturing are called automated theorem proving, binary decision diagrams, verification, layout and routing, and more. They were called "AI" back in the 1980s, during another AI bubble, but today these "old AI" tricks have proper terms that everyone understands, along with the vocabulary to describe why one algorithm is better than another.

Ex: Automated Layout and Routing can be tuned for less space or for higher performance, depending on the chip design. It's no longer "AI"; it's just a tool used by chip manufacturers.

And mind you: these tools don't need data centers full of H100s. The hardware that's actually kinda-sorta useful for this task is 3D V-Cache EPYC servers, IIRC.

The examples I pointed out were not traditional algorithm-based techniques, and they were certainly not available in the 1980s. The advancements I pointed out are specifically based on neural networks and push OPC and manufacturing beyond what is achievable with a traditional algorithm.
 
Joined
Apr 24, 2020
Messages
2,794 (1.61/day)
The examples I pointed out were not traditional algorithm-based techniques, and they were certainly not available in the 1980s. The advancements I pointed out are specifically based on neural networks and push OPC and manufacturing beyond what is achievable with a traditional algorithm.

Oh, so it's faster and better than the old 1980s algorithm?

Why are we pouring Billions of dollars into GPUs and using so much electricity that the capacity auction on the East Coast just jumped 900% this past year?

That's why we need to get specific. There is a lot of bullshit and a lot of money involved here. A faster and better algorithm leads to FEWER computers needed and LESS power usage.

Not.... Whatever the hell is going on in Virginia's data center expansion or the ridiculous sums of money sent to NVidia.
 
Joined
Apr 12, 2013
Messages
7,598 (1.77/day)
And everyone seems to trip up at what step#3 entails exactly.
Not everyone, although the monetization part is still tricky. Given enough hours of training these robots on menial jobs, you can basically kiss Foxconn's cheap(?) slave factories goodbye :ohwell:
 
Joined
Apr 24, 2020
Messages
2,794 (1.61/day)
Not everyone, although the monetization part is still tricky. Given enough hours of training these robots on menial jobs, you can basically kiss Foxconn's cheap(?) slave factories goodbye :ohwell:




We've had robots for years. None of these systems require $Billion GPU data centers.

The vast majority of electronics work at Foxconn and other Chinese factories is done by pick-and-place robots, reflow robots, automated testing robots, and automatic X-ray inspection robots.

Oh right, and it was all automated back in the Sony Walkman days. Remember lights-out manufacturing? That was back in the 90s, when a factory could run without any humans in it, so there was no need to install lightbulbs anymore. That was 30 years ago.

------------------

In any case, industrial automation (be it palletizers, depalletizers, automated farm equipment, pick-and-place machines, reflow ovens, PCB manufacturing, verification and test, etc.) is "not AI". It doesn't even have GPUs or neural nets in it. It's just normal algorithms and regular ol' control theory.
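
The "regular ol' control theory" in most of that equipment looks something like a textbook PID loop. A minimal sketch, with made-up gains and a made-up first-order plant (not tuned for any real machine):

```python
def simulate_pid(setpoint=100.0, kp=0.6, ki=0.1, kd=0.05, dt=0.1, steps=400):
    value = 0.0              # e.g. oven temperature, conveyor speed, arm position
    integral = 0.0
    prev_error = setpoint - value
    for _ in range(steps):
        error = setpoint - value
        integral += error * dt
        derivative = (error - prev_error) / dt
        output = kp * error + ki * integral + kd * derivative
        value += (output - 0.2 * value) * dt    # crude first-order plant response
        prev_error = error
    return value

print(f"value after the control loop: {simulate_pid():.1f}")   # settles near 100
```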

Going back to my earlier statement: the issue is:

#1: Buy (or rent) $Billion worth of GPUs.

Which necessarily requires $Billion in profits (not revenue) before that's a worthwhile investment. And when #2 is "Shove $hundred-millions of electricity into those GPUs", it's going to be difficult to turn a profit.
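
Back-of-envelope, with round illustrative numbers (every figure below is an assumption, not a quoted price):

```python
# Illustrative assumptions only: cluster size, GPU price, power draw, and
# electricity rate are made-up round numbers, not real quotes.
gpus                = 100_000       # hypothetical cluster size
price_per_gpu       = 30_000        # USD per accelerator, assumed
power_per_gpu_kw    = 1.0           # GPU plus its share of cooling/networking
electricity_usd_kwh = 0.10          # assumed industrial rate
hours_per_year      = 24 * 365

capex = gpus * price_per_gpu
energy_per_year = gpus * power_per_gpu_kw * hours_per_year * electricity_usd_kwh

print(f"hardware outlay:      ${capex:,.0f}")            # $3,000,000,000
print(f"electricity per year: ${energy_per_year:,.0f}")  # $87,600,000
```

And that's before staff, buildings, networking, and depreciation; whatever happens at step#3 has to clear that bar just to break even.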
 
Last edited:
Joined
Sep 6, 2013
Messages
3,462 (0.83/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 7600 / Ryzen 5 4600G / Ryzen 5 5500
Motherboard X670E Gaming Plus WiFi / MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2)
Cooling Aigo ICE 400SE / Segotep T4 / Νoctua U12S
Memory Kingston FURY Beast 32GB DDR5 6000 / 16GB JUHOR / 32GB G.Skill RIPJAWS 3600 + Aegis 3200
Video Card(s) ASRock RX 6600 / Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes / NVMes, SATA Storage / NVMe, SATA, external storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) / 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10