Meta’s new AI chips run faster than before


Illustration by Nick Barclay / The Verge

Meta promises the next generation of its custom AI chips will be more powerful and able to train its ranking models much faster.

The Meta Training and Inference Accelerator (MTIA) is designed to work best with Meta’s ranking and recommendation models. The chips can help make training more efficient and inference — that is, running a trained model to produce predictions — faster.

The company said in a blog post that MTIA is a big piece of its long-term plan to build infrastructure around how it uses AI in its services. It wants to design its chips to work with its current technology infrastructure and future advancements in GPUs.

“Meeting our ambitions for our custom silicon means investing not only in compute silicon but also in memory bandwidth, networking,…

