Be My Eyes Brings Its Volunteer Service To Meta Smart Glasses

In this episode of Double Tap, Steven Scott and Shaun Preece speak with Mike Buckley and Bryan Bashin from Be My Eyes about their new partnership with Meta and the integration of Be My Eyes volunteer support into the Meta Ray-Ban smart glasses. The conversation explores how this collaboration aims to provide blind and low-vision users with a seamless, hands-free way to access visual assistance in everyday situations.

“Our mission has always been to connect our community and make the world more accessible,” says Mike. “For years, people have been asking for a hands-free way to interact with Be My Eyes, and this partnership with Meta is the first step toward making that a reality.” Bryan adds, “As a blind person, I want to do things in the world, and the easier it is, the happier I am. Having both hands free while accessing visual assistance changes everything.”

Discussing how the partnership came about, Mike reveals that initially, Meta was hesitant. “When I first proposed this, they said no,” he admits. “But I kept asking, and what really convinced them was the enthusiasm from the blind community. Meta realized how much excitement there was around making their smart glasses accessible.” Engineers from both companies collaborated closely to integrate Be My Eyes into the Meta Ray-Ban ecosystem, ensuring a seamless experience.

Shaun highlights the practical limitations of the current Meta AI, particularly its cautious approach to safety-related descriptions. “Right now, if I ask it whether there’s something in front of me, it says, ‘I can’t give safety advice.’ That’s not helpful,” he notes. Mike acknowledges these concerns, stating that further improvements, including the integration of Be My AI for more advanced AI-driven visual descriptions, are already being discussed with Meta. “We’re working on bringing a fuller AI experience to the glasses, and Meta is open to evolving this further,” he confirms.

Looking to the future, Bryan emphasizes that this collaboration is about more than just one feature. “This isn’t just about these glasses today. It’s about a long-term relationship with Meta, ensuring blind people have a seat at the table when developing future accessibility solutions.” He also expresses optimism about Meta’s open-source AI models, which could be fine-tuned to provide better, more contextual responses tailored to visually impaired users’ needs.

Mike also shares a notable insight from inside Meta: “Apparently, this was the most requested project by Meta engineers. More people signed up to work on this than anything else they’ve done before.” He sees this as a positive sign that the tech giant is taking accessibility seriously.

As for when users can expect to try Be My Eyes on the Meta Ray-Bans, Mike confirms that beta testing is already underway with a few hundred testers. “We’re aiming to launch before the end of the year, hopefully in the next couple of months,” he says, while emphasizing that the team continues to refine the experience based on user feedback.

With the potential for deeper AI integration, improved accessibility features, and an expanding range of use cases, this partnership marks a significant step forward in wearable assistive technology. “This is only the beginning,” Bryan assures listeners. “We’re just getting started.”
