Meta is planning phase two of its RSC supercomputer, among other new AI-related projects. The RSC was unveiled in 2022.
Meta has shared details of ambitious plans for AI advancements, including a custom chip for running AI models, further upgrades to its supercomputer for AI research and an AI-optimised data centre design.
In a blogpost published to accompany a press event yesterday (18 May), Meta predicted that its “AI compute needs will grow dramatically over the next decade” as it works to “break new ground” in AI research. It also mentioned the ever-present metaverse, adding that these developments will assist it in building its “long-term” vision.
“We are now executing on an ambitious plan to build the next generation of Meta’s infrastructure backbone – specifically built for AI – and in this blogpost we’re sharing some details on our recent progress.”
One of the key parts of Meta’s AI strategy has been its supercomputer for AI research, which it initially unveiled last year.
The RSC was built to train the next generation of large AI models to power new augmented reality tools, content understanding systems and real-time translation tech.
The 16,000-GPU machine has been powering research projects such as LLaMA, the large language model Meta built and shared earlier this year. The supercomputer is now entering what Meta calls the second phase of its development.
The tech giant is also planning a new data centre design that is optimised for AI. “This new data centre will be an AI-optimised design, supporting liquid-cooled AI hardware and a high-performance AI network connecting thousands of AI chips for data centre–scale AI training clusters.”
According to the blogpost, the new design will be faster and more cost-effective to build. The company hopes it will complement some of its other new hardware, such as MSVP (Meta Scalable Video Processor), its first in-house-developed video transcoding ASIC, which was designed to power Meta's growing video workloads.
To round off its AI plans, Meta is working on an in-house, custom accelerator "chip family" to target inference workloads. The company claims the MTIA (Meta Training and Inference Accelerator) chip can provide "greater compute power and efficiency than CPUs", and it has been customised for Meta's internal workloads.
In the past, Meta’s other AI-related announcements have also come in threes. In 2022, it said it was working on concepts such as universal speech translation, AI that can learn like a human and a more conversational AI assistant.