Meta Platforms unveils next-gen AI chip to power ecosystem

Meta Platforms has disclosed details about its latest in-house artificial intelligence accelerator chip, marking a significant step in the company’s efforts to bolster its infrastructure for AI-driven products across platforms such as Facebook, Instagram, and WhatsApp.

The unveiling of the new chip, internally known as “Artemis,” underscores Meta’s strategy to reduce its reliance on third-party AI chips, particularly those from Nvidia, while simultaneously enhancing efficiency and reducing energy costs.

The chip, officially named the Meta Training and Inference Accelerator, is designed to balance compute power, memory bandwidth, and memory capacity so it can handle ranking and recommendation models effectively, according to a blog post by the company.

Meta’s custom silicon efforts extend beyond chip development, encompassing broader hardware systems as well as software optimization to harness the full potential of its infrastructure.

Flagship chips

In addition to its in-house chip development, Meta continues to invest in external AI chip suppliers. Earlier this year, CEO Mark Zuckerberg announced plans to acquire approximately 350,000 flagship H100 chips from Nvidia and, alongside purchases from other suppliers, to amass the equivalent of 600,000 H100 chips by the end of the year.

The new MTIA chip, manufactured by Taiwan Semiconductor Manufacturing Co on its advanced “5nm” process, boasts three times the performance of its predecessor.

Already deployed in data centers, the MTIA chip is actively involved in serving AI applications, with ongoing efforts to expand its capabilities to support generative AI workloads.

Meta’s latest move underscores its commitment to advancing AI technologies to power its diverse range of products and services, signaling an era of enhanced efficiency and innovation within the Meta ecosystem.
