Meta is introducing the voices of Judi Dench, John Cena, and Keegan-Michael Key to its AI chatbot.

Facebook and Instagram users can now chat with voices resembling John Cena and Judi Dench, though the actors themselves won't be on the line: the voices are AI-generated and built into Meta's chatbot. On Wednesday, Meta announced that it is adding voice interactions, including celebrity voices, to its AI chatbot, Meta AI. Users can hold real-time conversations with the chatbot and choose from a range of synthesized voices, among them those of well-known celebrities.

Meta has partnered with actors like Cena, Dench, Kristen Bell, Awkwafina, and Keegan-Michael Key to develop these AI voices. This enhancement is part of Meta’s strategy to keep its chatbot, available on Facebook, Instagram, WhatsApp, and Threads, competitive with alternatives like ChatGPT, which is also introducing advanced voice capabilities. Meta CEO Mark Zuckerberg expressed confidence that Meta AI will become “the most used AI assistant in the world” by the end of the year, largely driven by the vast user base of over 3 billion people who use the company’s apps daily. However, details on how Meta tracks chatbot engagement remain unclear.

Earlier this year, competitor OpenAI faced criticism when it introduced its own voice mode for ChatGPT with a demo voice that sounded strikingly similar to Scarlett Johansson's. Johansson later said she had declined to participate in the project. OpenAI denied that the voice was modeled on her but paused its use regardless. In contrast, Meta appears to have formal agreements with the actors whose voices it is using.

Zuckerberg announced these new voice features during his keynote at Meta’s annual Meta Connect conference, where he also highlighted advancements in AI, a new budget-friendly Meta Quest headset, and updates to Meta’s augmented reality Ray-Ban glasses.

In addition, Meta revealed that influencers will be able to create AI versions of themselves, letting followers engage in virtual, video-call-style conversations with those AI doubles, an expansion of a previous feature that supported only text interactions.

Meta is also enhancing its Reels feature by auto-translating and dubbing foreign-language videos for users. For example, if an English-speaking user views a Reel originally in Spanish, it will appear in English with natural-looking mouth edits to match the dubbed audio.

Furthermore, users may soon see AI-generated content based on their interests or trending topics appearing in their Facebook and Instagram feeds. This feature, dubbed “imagined for you,” raises questions about whether users can opt out if they prefer to see content only from human sources.

Additionally, Meta’s augmented reality (AR) glasses are getting an update with live AI-powered translation capabilities, allowing users to hear real-time translations during conversations in foreign languages.

Lastly, Zuckerberg teased a prototype of advanced AR glasses called “Orion.” Unlike current AR headsets, which use screens to display overlays of digital content via cameras, the Orion glasses will have transparent lenses that use holograms to project content like emails, text messages, or 3D avatars of friends, floating in real space. Though touted as “the most advanced glasses the world has ever seen,” Orion is still in development and is not yet available for purchase. Meta plans to continue refining the technology with select third-party developers before releasing a consumer version.
