Meta debuts new features for AR glasses

AR Features

Meta unveiled several new features for its Ray-Ban smart glasses at the Meta Connect event. Mark Zuckerberg showcased improvements in interaction via multimodal AI, enhancing the user experience. One standout feature is live translation.

The glasses will soon offer real-time translations for Spanish, French, and Italian. Zuckerberg demonstrated this by conversing with a Spanish speaker, with the glasses translating the conversation into English almost instantaneously.

Another beneficial upgrade is the glasses’ “photographic memory.” In a demo, the glasses solved the problem of remembering where you parked.

By simply looking at a parking spot number and saying, “Remember where I parked,” the user can later ask, “Hey Meta, where did I park?” and receive the spot number in response. These updates illustrate the potential of AI-enhanced wearable technology, promising to make everyday tasks more manageable and interactions more fluid.

All around Meta’s Menlo Park campus, cameras stared at me.

By cameras, I mean the Ray-Ban Meta smart glasses, which Meta hopes we’ll all wear one day. These glasses were prominent during Meta’s Connect conference, where nearly every hardware product involved cameras. Meta’s smart glasses are becoming more common.

Zuckerberg envisions a world where these glasses replace many functions of smartphones. Though not yet full-fledged augmented reality glasses, they capture photos and videos through built-in cameras without the need for a phone. The glasses come in classic Ray-Ban styles, equipped with a 12MP ultrawide camera and an indicator light that flashes when recording.

Meta’s Ray-Ban translation feature

As I wandered around the campus, spotting these glasses on person after person, I became increasingly aware of the possibility of being recorded inadvertently. When I tried the glasses myself, the perspective shifted.

With the camera at eye level, capturing moments felt more seamless than using a phone. A simple button press recorded what I saw without fumbling for a phone. This ease of recording could lead to new ways of sharing daily experiences but also raises privacy concerns.

The prevalence of these smart glasses at Meta’s campus highlights a future where seamless recording could be the norm. This vision brings both exciting possibilities and significant ethical questions. Society might need to navigate the balance between convenience and privacy as these devices become more common.

Zuckerberg also introduced a prototype of the Orion holographic smart glasses, which allow users to see digital objects overlaid on the real world. These glasses can be controlled with brain signals through a “neural interface.” The glasses are not yet available, and Meta provided no specific timeline for their release or price. “This is not pass-through, this is the real world with holograms overlaid on it,” Zuckerberg explained as he demonstrated the new display technology.

The glasses feature voice and AI capabilities, hand and eye-tracking, and the innovative neural interface that allows users to send signals directly from their brain to the device. Meta has invested heavily in building out its computing infrastructure to support a new generation of compute-intensive generative AI models. The company has integrated its AI tools across its portfolio of internet apps, including Instagram and WhatsApp.

Nearly 500 million users now use Meta’s AI products, and Zuckerberg claimed that Meta AI is on track to be the most used AI in the world.
