
NEO Semiconductor Announces 3D X-AI Chip as HBM Successor

Nomad76

News Editor
Staff member
Joined
May 21, 2024
Messages
725 (3.42/day)
NEO Semiconductor, a leading developer of innovative technologies for 3D NAND flash memory and 3D DRAM, announced today the development of its 3D X-AI chip technology, intended to replace the DRAM dies inside current high bandwidth memory (HBM) and relieve data-bus bottlenecks by performing AI processing directly in 3D DRAM. 3D X-AI can sharply reduce the amount of data transferred between HBM and GPUs during AI workloads. NEO's innovation is set to revolutionize the performance, power consumption, and cost of AI Chips for AI applications like generative AI.

AI Chips with NEO's 3D X-AI technology can achieve:
  • 100X Performance Acceleration: contains 8,000 neuron circuits to perform AI processing in 3D memory.
  • 99% Power Reduction: minimizes the requirement of transferring data to the GPU for calculation, reducing power consumption and heat generation by the data bus.
  • 8X Memory Density: contains 300 memory layers, allowing HBM to store larger AI models.





"Current AI Chips waste significant amounts of performance and power due to architectural and technological inefficiencies," said Andy Hsu, Founder & CEO of NEO Semiconductor. "The current AI Chip architecture stores data in HBM and relies on a GPU to perform all calculations. This separated data storage and data processing architecture makes the data bus an unavoidable performance bottleneck. Transferring huge amounts of data through the data bus causes limited performance and very high power consumption. 3D X-AI can perform AI processing in each HBM chip. This can drastically reduce the data transferred between HBM and GPU to improve performance and reduce power consumption dramatically."

A single 3D X-AI die includes 300 layers of 3D DRAM cells with 128 GB capacity and one layer of neural circuit with 8,000 neurons. According to NEO's estimation, this can support up to 10 TB/s of AI processing throughput per die. Using twelve 3D X-AI dies stacked with HBM packaging can achieve 120 TB/s processing throughput, resulting in a 100X performance increase.
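The stack-level figures above follow directly from the per-die claims. A back-of-the-envelope check (a sketch using only the numbers quoted in the press release, not independently measured values):

```python
# Claimed per-die figures from NEO's announcement
per_die_throughput_tbs = 10   # AI processing throughput per 3D X-AI die, TB/s
per_die_capacity_gb = 128     # 300 layers of 3D DRAM cells per die
dies_per_stack = 12           # dies stacked with HBM packaging

# Stack-level throughput: 12 dies x 10 TB/s = 120 TB/s, the quoted figure
stack_throughput_tbs = per_die_throughput_tbs * dies_per_stack
print(f"{stack_throughput_tbs} TB/s per stack")   # 120 TB/s

# Stack-level capacity: 12 dies x 128 GB = 1536 GB = 1.5 TB
stack_capacity_tb = per_die_capacity_gb * dies_per_stack / 1024
print(f"{stack_capacity_tb:.1f} TB per stack")    # 1.5 TB
```

Note that the quoted 100X speedup is relative to NEO's baseline for conventional HBM-plus-GPU processing, which the release does not spell out.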

"The application of 3D X-AI technology can accelerate the development of emerging AI use cases and promote the creation of new ones," said Jay Kramer, President of Network Storage Advisors. "Harnessing 3D X-AI technology to create the next generation of optimized AI Chips will spark a new era of innovation for AI Apps."

View at TechPowerUp Main Site | Source
 
WOOOOWWWW!!!!!
 
99% power reduction. Uh huh. I believe them & need no further evidence.
 
Wonder how long it will be before a company tries to acquire them. This seems quite promising.
 

tfp

3D, AI, and X? Did they miss any buzzwords/abbreviations? I'm assuming "e" and "i" are no longer in vogue.
 
99% power reduction. Uh huh. I believe them & need no further evidence.
In AI workloads. We have already seen other announced products like this that slash AI power usage compared to GPU/GPGPU approaches, which rely on a lot of memory swaps and matrix multiplications.
 
This looks like one of the 'neuro-emulative' memory technologies Intel was looking to compete with using 3D XPoint memory.

If even remotely related, I wonder what IBM, etc. are working on/invested in?
 