Friday, December 22nd 2023
Samsung and Naver Developing an AI Chip Claiming to be 8x More Power Efficient than NVIDIA H100
Naver, the firm behind the HyperCLOVA X large language model (LLM), has been working with Samsung Electronics on the development of power-efficient AI accelerators. The collaboration pairs Naver's LLM expertise with Samsung's extensive silicon design IP, its ability to build complex SoCs, its semiconductor fabrication capacity, and its broad portfolio of DRAM technologies. The two recently completed a proof of concept for an upcoming AI chip, which they iterated on an FPGA. Naver claims the chip it is co-developing with Samsung will be 8 times more energy efficient than an NVIDIA H100 AI accelerator, but did not elaborate on its actual throughput. Among other things, the solution leverages energy-efficient LPDDR memory from Samsung. The two companies have been working on the project since December 2022.
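The 8x figure is an energy-efficiency claim, i.e. performance per watt, so it only becomes a meaningful comparison once throughput is known. Below is a minimal sketch of the arithmetic in Python; the H100 SXM's roughly 700 W TDP is public, but every throughput number and both chip scenarios are hypothetical placeholders, not figures from Naver or Samsung.

```python
# Illustrative arithmetic only: energy efficiency = throughput / power.
# An "8x more efficient" chip can get there by matching H100 throughput
# at 1/8 the power, or at much lower throughput and far lower power --
# the claim alone doesn't say which.

def efficiency_tops_per_watt(throughput_tops: float, power_w: float) -> float:
    """Return throughput per watt (TOPS/W)."""
    return throughput_tops / power_w

# H100 SXM TDP is roughly 700 W (public spec); the throughput value here
# is a placeholder standing in for whichever precision/metric you compare.
h100 = efficiency_tops_per_watt(throughput_tops=2000.0, power_w=700.0)

# Hypothetical scenario A: same throughput at one eighth the power.
chip_a = efficiency_tops_per_watt(throughput_tops=2000.0, power_w=700.0 / 8)

# Hypothetical scenario B: a quarter of the throughput at 1/32 the power --
# also "8x more efficient", but a very different class of accelerator.
chip_b = efficiency_tops_per_watt(throughput_tops=500.0, power_w=700.0 / 32)

print(f"H100:   {h100:.2f} TOPS/W")
print(f"Chip A: {chip_a:.2f} TOPS/W ({chip_a / h100:.1f}x)")
print(f"Chip B: {chip_b:.2f} TOPS/W ({chip_b / h100:.1f}x)")
```

Both hypothetical chips hit the same 8x efficiency ratio, which is why the undisclosed throughput matters: the claim constrains the ratio, not the absolute performance.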
Source: BusinessKorea
Comments
Are they saying that the new chip will be capable of the same level of general-purpose compute as the H100 while being 8x more power efficient? If not, and it's just for AI, then this isn't really revolutionary, since multiple AI-specific chips are already on the market.
Oh, and they still need to face CUDA's dominance in the AI/ML software ecosystem.