Meta Platforms has announced that it is working on a custom chip “family” for its data centers to better support artificial intelligence (AI) workloads. The company has designed a first-generation chip under its Meta Training and Inference Accelerator (MTIA) program to improve efficiency for the recommendation models that serve ads and other content in news feeds. That first chip focuses exclusively on inference, the AI process in which a trained model makes judgments, such as deciding what content to show in a user’s feed. Meta has also provided an update on its plans to redesign its data centers around more modern AI-oriented networking and cooling systems.
Details of the Chip
Meta Platforms, the owner of Facebook and Instagram, has shared details of the MTIA program, which aims to make its data centers better suited to AI workloads. The program includes a custom chip “family” that the company is developing in-house. The first-generation chip is dedicated to inference: the process in which algorithms trained on huge amounts of data make judgments about what content to show in a user’s feed. Meta described the chip as a learning opportunity and says it has folded the lessons from its design into the program’s roadmap.
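To make the training/inference distinction concrete, here is a minimal, purely illustrative sketch (not Meta’s actual system, and all names are hypothetical): a trained recommendation model has already produced an embedding vector for the user and for each candidate post, so inference reduces to scoring the candidates and ranking them.

```python
# Hypothetical sketch of recommendation inference: the expensive training
# step has already happened elsewhere; at serving time we only score and
# rank pre-computed embeddings.

def dot(a, b):
    """Inner product of two equal-length vectors."""
    return sum(x * y for x, y in zip(a, b))

def rank_candidates(user_embedding, candidates):
    """Score each candidate post against the user; return post IDs, best first.

    `candidates` maps a post ID to that post's embedding vector.
    """
    scores = {post_id: dot(user_embedding, emb)
              for post_id, emb in candidates.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Toy example: two candidate posts with 3-dimensional embeddings.
user = [0.9, 0.1, 0.0]
posts = {"cat_video": [1.0, 0.0, 0.0], "stock_tips": [0.0, 1.0, 0.0]}
print(rank_candidates(user, posts))  # ['cat_video', 'stock_tips']
```

Real production models are far larger and run billions of such scoring operations, which is why purpose-built inference silicon can pay off.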
Meta Platforms has not disclosed deployment timelines or elaborated on its plans to develop chips capable of training AI models. The company has spent the past year upgrading its AI infrastructure after executives realized it lacked the hardware and software needed to meet demand from product teams building AI-powered features. Meta previously scrapped plans for a large-scale rollout of an in-house inference chip and instead began work on a more ambitious chip that can perform both training and inference.
Meta acknowledged in its blog posts that the first MTIA chip struggled with high-complexity AI models but handled low- and medium-complexity models more efficiently than competing chips. The chip consumed only 25 watts, a fraction of the power drawn by market-leading chips from suppliers such as Nvidia, and was built on the open-source RISC-V chip architecture.
Data Center Redesign
Meta Platforms has also provided an update on plans to redesign its data centers around more modern AI-oriented networking and cooling systems. The company plans to break ground on its first such facility this year. The new design would be 31 percent cheaper and could be built twice as quickly as the company’s current data centers, according to an employee in a video explaining the changes.