Meta will use AI to create lip-synced translations of creators’ Reels


Meta just announced an intriguing tool that uses AI to automatically dub Reels into other languages, complete with lip-sync. CEO Mark Zuckerberg revealed the feature during the annual Meta Connect livestream event.

Zuckerberg showed this off during the keynote, and everything seemed to work flawlessly. According to Meta, the technology not only translates the content but will also “simulate the speaker’s voice in another language and sync their lips to match.” It’s worth noting that this didn’t appear to be a live demo, but it was still pretty impressive.

As for a rollout, the company says the feature will arrive first on “some creators’ videos” in English and Spanish in the US and Latin America, which indicates it’ll be tied to those two languages at launch. Meta didn’t give a timetable, but it did say more languages are coming soon.

That wasn’t the only AI tool spotlighted during Meta Connect. The company’s AI platform will now allow voice chats, with a selection of celebrity voices to choose from. Meta AI is also getting new image capabilities: it will be able to edit photos based on instructions from text chats within Instagram, Messenger and WhatsApp.

This article originally appeared on Engadget at https://www.engadget.com/ai/meta-will-use-ai-to-create-lip-synced-translations-of-creators-reels-175949373.html?src=rss
