Samsung Electronics and NAVER Partner to Develop AI-Optimized Semiconductor Solutions at Scale

Both companies plan to combine their prowess in semiconductor design and manufacturing with proven AI capabilities to maximize the speed and power efficiency of large-scale AI models.

Samsung Electronics, a world leader in advanced memory technology, and NAVER Corporation, a global internet company with cutting-edge AI technology, today announced a wide-ranging collaboration to develop semiconductor solutions tailored for large-scale artificial intelligence (AI) models. By leveraging Samsung’s next-generation memory technologies, including computational storage, processing-in-memory (PIM), processing-near-memory (PNM) and Compute Express Link (CXL), the two companies plan to consolidate their hardware and software resources to accelerate the handling of massive AI workloads.

Recent advances in AI at scale have led to exponential growth in the volumes of data to be processed. However, the performance and efficiency limitations of today’s computing systems pose significant challenges in meeting these heavy computational demands, fueling the need for new AI-optimized semiconductor solutions.

Developing such solutions requires a deep convergence of the semiconductor and AI disciplines. Samsung will combine its expertise in semiconductor design and manufacturing with NAVER’s experience in developing and verifying AI algorithms and AI-powered services to create solutions that take large-scale AI performance and energy efficiency to a new level.

Over the years, Samsung has introduced memory and storage that support high-speed data processing in AI applications, from computational storage (SmartSSD) and PIM-enabled high-bandwidth memory (HBM-PIM) to next-generation memory supporting Compute Express Link (CXL). Samsung will now team up with NAVER to optimize these memory technologies for advancing large-scale AI systems.

NAVER will continue to refine HyperCLOVA, a hyperscale language model with over 200 billion parameters, while improving its model compression algorithms to create a lighter model that significantly increases computational efficiency.

“Through our collaboration with NAVER, we will develop cutting-edge semiconductor solutions to solve the memory bottleneck in large-scale AI systems,” said Jinman Han, Executive Vice President of Memory Global Sales & Marketing at Samsung Electronics. “With tailored solutions that reflect the most pressing needs of AI service providers and users, we are committed to broadening our market-leading memory portfolio, including computational storage, PIM and beyond, to fully accommodate the ever-growing scale of data.”

“By combining our knowledge and know-how acquired from HyperCLOVA with Samsung’s prowess in semiconductor manufacturing, we believe we can create a new class of solutions that better addresses the challenges of today’s most advanced AI technologies,” said Suk Geun Chung, Head of NAVER CLOVA CIC. “We look forward to broadening our AI capabilities and strengthening our competitive edge through this strategic partnership.”
