Meta has unveiled the latest version of its smart glasses, now equipped with a built-in AI assistant. With an entry price of $299, the Ray-Ban Meta smart glasses aim to shake up the wearable tech market. In this article, we look at the features and potential implications of Meta's new technology.
Unlike previous iterations, Meta's AI assistant is designed to be controlled primarily by voice. The wearer can talk to it much as they would to Amazon's Alexa or Apple's Siri, which makes it intuitive to use. This voice-first approach opens up new possibilities for smart glasses, allowing genuinely hands-free operation and a smoother user experience.
User Interface and Functionality
The user interface of Meta's multimodal AI on the smart glasses shows promise. In a demonstration, the assistant correctly answered a query about an art piece, identifying it as a "wooden sculpture" and describing it as "beautiful." A single demo is hardly proof of robustness, but it hints at what the assistant can do. It is also worth noting that Meta has invested heavily in AI, with a strong emphasis on open-source development through its Llama 2 large language model.
Generative AI has been slow to enter the hardware category, with only a few startups venturing into dedicated AI devices. One example is Humane, whose "Ai Pin" runs on OpenAI's GPT-4V. Meta's Ray-Ban Meta smart glasses are the latest entrant in this emerging area. OpenAI, for its part, has taken a different route, offering its own multimodal model, GPT-4V, through the ChatGPT app for iOS and Android.
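For readers curious what a multimodal query looks like in practice, here is a minimal sketch of sending a photo and a question to a GPT-4V-class model through OpenAI's Python SDK. This is purely illustrative and says nothing about how the Ray-Ban Meta glasses work internally; the model name and image URL are placeholders.

```python
# Illustrative sketch: a text + image query to a vision-capable model
# via the OpenAI Python SDK. Model name and image URL are placeholders.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

response = client.chat.completions.create(
    model="gpt-4-vision-preview",  # assumed vision-capable model identifier
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What kind of art piece is this?"},
                {
                    "type": "image_url",
                    "image_url": {"url": "https://example.com/sculpture.jpg"},
                },
            ],
        }
    ],
    max_tokens=100,
)

# Print the model's description of the image.
print(response.choices[0].message.content)
```

The same pattern, a question paired with an image in a single request, is the basic shape of the "look at this and tell me about it" interactions these devices are built around.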
Comparison to Google Glass
Meta's smart glasses invite comparisons to Google's ill-fated Glass prototype from the 2010s. Google Glass faced backlash for its fashion sense, or lack thereof, and for its association with early adopters labeled "Glassholes." Its limited practical use cases also contributed to a lackluster reception despite the initial hype. It remains to be seen whether Meta's new smart glasses can learn from those mistakes and avoid falling into the same trap.
Overcoming the “Glasshole” Trap
The success of the Ray-Ban Meta smart glasses hinges on whether public perception has evolved since the introduction of Google Glass. Strapping a camera to one's face was met with skepticism in the early days, but society's acceptance of wearable technology may have shifted. Meta's emphasis on voice control and improved functionality, coupled with a sleek design from its partnership with Ray-Ban, might alleviate those concerns and attract a wider user base.
Meta's latest smart glasses with a built-in AI assistant represent a significant leap forward for the wearable tech market. With an affordable entry price and a promising interface, the glasses position Meta to make a splash in the industry. By incorporating voice control and actively addressing past missteps, Meta aims to steer clear of the negative associations that plagued Google Glass. Only time will tell whether these glasses reshape the way we interact with AI assistants and expand what wearable tech can do.