The facility, with locations in both the US and South Korea, is set to develop chips that can meet the processing needs of ‘artificial general intelligence,’ a term denoting AI capabilities that equal or surpass human abilities.
In a bold move, Samsung has established a research lab focused on developing a new class of chip designed to power artificial intelligence that matches or exceeds human intelligence.
The Samsung Semiconductor AGI Computing Lab will operate under Dong Hyuk Woo, a Samsung senior vice president, with locations in both the US and South Korea.
The lab was created with the sole purpose of building semiconductors capable of meeting the compute-intensive processing requirements of what Samsung calls “artificial general intelligence,” or AGI, according to a LinkedIn post by Kye Hyun Kyung, the CEO of Samsung Semiconductor.
Samsung defines AGI as a form of AI whose capabilities equal or surpass human intelligence and that can learn on its own, without first being trained on human-curated data. Such models generally demand far more computational power than the large language models (LLMs) behind today’s AI, which must first be trained by humans on a variety of data sources.
The lab will initially develop chips for LLMs, concentrating on AI inference and serving applications, according to Kyung. That work sets the stage for the future development of more complex chips aimed at AGI.
“In order to develop chips that dramatically reduce the power required to operate LLMs, we are taking a fresh look at every aspect of chip architecture. This includes memory design, light-weight model optimization, high-speed interconnect, advanced packaging, and much more,” Kyung wrote in his post.
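To give a sense of what “light-weight model optimization” can mean in practice, the sketch below shows one widely used technique, post-training dynamic quantization in PyTorch, which stores linear-layer weights as 8-bit integers to cut memory traffic and power at inference time. This is only an illustrative example, not a description of Samsung’s actual approach, and the toy model stands in for a real LLM.

```python
# Illustrative only: NOT Samsung's method, just a common example of
# "light-weight model optimization" -- post-training dynamic quantization,
# which packs linear-layer weights into int8 for cheaper inference.
import torch
import torch.nn as nn

# A small stand-in model; a real LLM would have billions of parameters.
model = nn.Sequential(
    nn.Linear(512, 2048),
    nn.ReLU(),
    nn.Linear(2048, 512),
)

# Convert the Linear layers to use int8 weights at inference time.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# The quantized model runs the same forward pass with roughly 4x less
# weight storage for the converted layers.
x = torch.randn(1, 512)
with torch.no_grad():
    out = quantized(x)
print(out.shape)  # torch.Size([1, 512])
```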
Over time, Samsung plans to keep releasing new chip designs from the AGI Computing Lab as part of “an iterative model designed to deliver stronger performance and larger model support at a reduced cost and power consumption,” he added.
“Through the creation of the AGI Computing Lab, I am confident that we will be better positioned to solve the complex system-level challenges inherent in AGI, while also contributing affordable and sustainable methods for the future generation of advanced AI/ML models,” Kyung wrote.
Samsung’s move appears, in part, to be an effort to find new revenue streams in an as-yet untapped market, as its core memory business has become a commodity, noted Gaurav Gupta, VP analyst for emerging trends and technologies at Gartner.
“They are looking for another opportunity to grow,” he said. “This is where chips for inference come in.”
Indeed, most companies that build processing and memory components are pursuing their own strategies to keep pace with AI’s rapid evolution and provide cost-effective computing resources.
As things stand, Nvidia holds the majority of the generative AI chip market for model training, with AMD also claiming some share, according to Gupta. But those models run on GPUs, which can be both scarce and expensive, making them unsustainable in the long term for many AI applications.
Companies increasingly want alternative hardware for running AI inference, and Samsung plans to get ahead of the curve by offering semiconductors for that work, Gupta said.
He explained, “When you consider the implementation of AI at the edge or endpoint for various use cases, it’s anticipated that inference will take place on bespoke chips, specifically designed to run optimized models. This is Samsung’s ambition: to participate in the design of these chips and penetrate this market, which has yet to fully blossom.”