Meta CEO Mark Zuckerberg has shared a new post on his Instagram account that gives us a first look at how the company's new Meta AI multimodal model can both “see and hear” using the Ray-Ban Meta smart glasses.
Meta, the parent company of Facebook, is introducing this functionality for customers to try through an early access program. “You can use Meta AI to spark creativity, get information, and control the glasses just by using your voice. Today we’re introducing new updates to make the glasses smarter and more helpful than ever before,” Meta said.
What Can This New Model Do?
In the demo shared by Zuckerberg, we see him asking Meta AI to suggest pants for the striped shirt he picked. The AI analyses what would pair well with the shirt and then suggests that dark-washed jeans or a pair of solid-coloured trousers would suit it well. This is just one example of what users can do with the new AI model on the Ray-Ban Meta smart glasses.
Meta further notes, “You won’t just be able to speak with your glasses—the glasses will be able to understand what you’re seeing using the built-in camera.”
It added, “You can ask Meta AI for help writing a caption for a photo taken during a hike, or you can ask Meta AI to describe an object you’re holding.” This means you can ask it about a multitude of things; it isn’t limited to a specific category. Meta is also partnering with Microsoft Bing to bring even more functionality: you can ask Meta AI for real-time information such as sports scores, local landmarks, restaurants, stocks, or even details about a nearby pharmacy.
Notably, the early access program is limited to Ray-Ban Meta smart glasses owners in the US for now. To test it, users need to open the Meta View app on iOS or Android and sign up there.