Envision & Meta Work Together To Bring ‘Ally’ To Life

Envision’s ‘Ally’ assistant is an AI-based tool designed to enhance accessibility for people with visual impairments by providing a conversational interface across multiple platforms. The assistant, currently in beta, is part of a collaboration with Meta that aims to integrate ‘Ally’ into wearable devices such as Meta’s ‘Project Aria’ smart glasses.

Karthik Karnan, co-founder of Envision, highlighted the purpose of ‘Ally’ during an interview with Double Tap on the 20th of September, prior to the official announcement of a partnership with Meta. He explained that the assistant is meant to simplify the way users interact with their devices:

“Ally at its heart is still Envision. You would still be able to read documents, detect objects, recognize faces… but instead of you having to learn how the app works and what button to press, you just talk to it. Your ally has a personality, so it’s not just some boring AI bot.”

Karnan also mentioned that ‘Ally’ is designed to work across platforms, including smartphones, browsers, and Envision glasses. The collaboration with Meta opens the door to integration with more wearable devices, but as Karnan pointed out, current wearables from companies like Meta remain limited because access for developers is restricted:

“The main problem with these wearables today is that they’re not open for developers yet. There is very limited capability that you can use these glasses for… As more wearables open up to developers, we can make full use of their capability.”

‘Ally’ Is Now In Public Beta

Envision’s ‘Ally’ aims to be an accessible, easy-to-use AI assistant, and integration into wearable technology like Meta’s smart glasses could push the boundaries of accessibility even further. The beta testing phase is ongoing, with Envision working to refine the assistant’s features based on user feedback.

“It’s going to be in public beta for a while… We just want to have as many people using the beta as possible and have conversations with the community to understand what features really matter to them.”

What Is ‘Project Aria’?

Project Aria, developed by Meta, is a research initiative designed to explore how wearable technology can capture and understand the world from a first-person perspective. The project involves the use of specially designed smart glasses that record data, including video, audio, eye-tracking, and localization information. This data helps researchers build AI models that can interpret the real world in real time, with the goal of creating more immersive and intuitive augmented reality (AR) experiences.

A key aim of Project Aria is to gather insights that will support the development of AR systems that are context-aware, enabling seamless interaction with digital and physical environments. It’s not a consumer product yet, but a research tool to help Meta understand the challenges of creating future AR platforms.

Meta Connect Event Coverage

This partnership between Envision and Meta could lead to significant advancements in the accessibility of wearable tech, particularly for blind people. You can watch the Meta Connect event live on the 25th of September at 10am PT/1pm ET/6pm UK, and you can get a full recap of the event with special guests on Double Tap, airing 26th September on AMI-audio, podcast, and YouTube.
