December 25, 2024

Omar Marks | Sopa Images | Light Rocket | Getty Images

Meta Platforms on Wednesday released details about the company’s next-generation in-house artificial intelligence accelerator chip.

Reuters reported earlier this year that Meta plans to deploy a new version of its customized data center chip to address the growing computing power required to run artificial intelligence products at Facebook, Instagram and WhatsApp. The chip, known internally as “Artemis,” will help Meta reduce its reliance on Nvidia’s AI chips and reduce overall energy costs.

“The chip’s architecture is fundamentally focused on providing the right balance of compute, memory bandwidth and memory capacity for ranking and recommendation models,” the company wrote in a blog post.

The new Meta Training and Inference Accelerator (MTIA) chip is part of the company’s extensive custom chip work that also includes other hardware systems. In addition to building chips and hardware, Meta has invested heavily in developing the necessary software to harness the power of its infrastructure in the most efficient way.


The company has also spent billions of dollars buying chips from Nvidia and other artificial intelligence chipmakers: This year, CEO Mark Zuckerberg said the company plans to purchase about 350,000 of Nvidia’s flagship H100 chips. Combined with chips from other suppliers, he said, Meta plans to accumulate the equivalent of 600,000 H100s this year.

Taiwan Semiconductor Manufacturing Co. will produce the new chip on its “5nm” process. Meta says it delivers three times the performance of its first-generation processor.

The chip has already been deployed in data centers, where it is dedicated to serving AI applications. The company said it has multiple ongoing initiatives “designed to expand the scope of MTIA,” including support for generative AI workloads.

