Meta’s multimodal AI, which powers its Ray-Ban smart glasses, can now recognize landmarks and describe them to users, the company’s CTO said.
The feature, still in beta, lets smart glasses owners get detailed information about a landmark simply by looking at it.
Andrew Bosworth, Meta’s CTO, shared several examples of the smart glasses recognizing the Golden Gate Bridge, the Painted Ladies, and Coit Tower.
Users prompt the device with “tell me more about this landmark,” and it responds with historical details about the sight in view.
Meanwhile, Mark Zuckerberg showcased the new capabilities on Instagram, with the glasses giving an audio description of Big Sky Mountain and recounting the history of the Roosevelt Arch.
Meta said it is working on enhancing the overall smart glasses experience, such as adding voice commands for sharing Meta AI content on WhatsApp and Messenger and for composing texts.
“And if you’re a podcast listener at 1.5x or greater speed, you’ll soon be able to configure Meta AI readouts to be slower or faster under voice settings,” Bosworth said.
Meta’s Ray-Ban smart glasses come equipped with the company’s AI assistant and can live-stream what the wearer is seeing, snap photos, and play audio.