Envision’s Karthik Karnan Talks Wearables and Introduces Ally
In a recent episode of Double Tap, aired on 20th September, hosts Steven and Shaun welcomed Karthik Karnan from Envision to talk about the latest developments in wearable technology and the company’s new AI-powered assistant, Ally. The conversation touched on the evolution of Envision, the functionality of Ally, and the current state of accessible technology for blind and visually impaired users.
Envision’s Journey to Wearables
Karthik Karnan began by recounting Envision’s history, starting with the company’s early efforts to build accessible tools for visually impaired users. Envision launched in 2017 as a smartphone app before making waves with its innovative glasses, which brought AI-powered accessibility features to the blind and low-vision community.
“We’ve been really stuck on this wearables idea,” Karnan explained. “We started off in 2017 as an app, but even the earliest designs of Envision were always focused on how we could build something that could be worn.”
The Envision Glasses were among the first wearables to combine Google Glass technology with the company’s own software, delivering a more accessible experience for users.
What is Ally?
The highlight of the conversation was the introduction of *Ally*, Envision’s new AI-powered assistant. Karnan described Ally as a conversational personal assistant that simplifies how users interact with technology, cutting down on the need for complex menus and buttons.
“Ally is our way of reimagining all of what Envision’s been doing,” Karnan explained. “It’s a very easy-to-use conversational personal assistant that you can use on any platform today. Instead of learning how the app works, you just talk to Ally, and it talks back.”
Ally builds on Envision’s existing features, allowing users to read documents, detect objects, recognize faces, and perform other tasks simply by talking to it. The goal is to make interactions more natural and seamless, reducing the friction that comes with learning app interfaces or navigating complex systems.
Multi-Platform Flexibility
One of Ally’s standout features is its flexibility across platforms. Whether using iOS, Android, or even a web browser, users can interact with Ally anywhere. Karnan also revealed that Ally works as a Chrome extension, providing accessibility support for inaccessible web forms, unlabeled images, and more.
“Let’s say you come across a form that’s inaccessible. You could basically ask Ally to fill out the form for you and submit it,” said Karnan. “It can confirm the details and submit it on your behalf.”
Ally’s integration extends to the Envision Glasses and potentially other wearables in the future, allowing it to follow users across multiple devices.
Personalisation and Memory
One unique aspect of Ally is its personalisation. Users can modify Ally’s personality and even teach it preferences, such as dietary restrictions, that Ally can then draw on in future interactions. As Karnan explained:
“Ally learns a thing or two about you as you keep using it. You can tell it things like, ‘I’m vegan’ or ‘I’m cutting back on sugar,’ and when you’re at a restaurant, Ally will remember and suggest what to eat based on those preferences.”
This personalisation extends to recalling previously scanned documents or receipts, making it easier for users to fill out forms or retrieve information across different platforms.
Security and Privacy
As AI technology evolves, concerns about data privacy are common. Karnan addressed this by highlighting Envision’s commitment to data security:
“We are a European company, fully GDPR compliant. We take data privacy very seriously,” he stated. “We don’t sell your data, and when we use data to train models, we anonymize it and only store what’s absolutely necessary.”
He reassured listeners that Ally’s memory is secure, and any data shared with Ally, such as scanned documents or personal details, stays private.
The Future of Wearables and AI
Karnan was also excited about the future of wearables. With Meta, Apple, and other companies pushing the boundaries of augmented reality and AI-powered glasses, he believes there is room for growth in the accessibility space. However, he pointed out that many of today’s wearables still lack the developer support needed to make them fully accessible:
“The main problem we see with wearables like Meta’s is that they aren’t open to developers yet. For these devices to reach their full potential, we need the ability to integrate accessibility tools like Envision.”
Despite this, Envision continues to innovate, constantly improving its glasses and software with the goal of making daily life easier for blind and visually impaired users.
Beta Version Available Now
Envision’s *Ally* is set to change the way visually impaired users interact with technology. With its conversational AI, cross-platform functionality, and personalised experience, Ally promises to simplify tasks like reading, form-filling, and object recognition.
For those interested in trying Ally, Karnan encouraged users to sign up for the public beta: “We’re in public beta, and we want as many people as possible to try Ally and give us feedback.”
As wearables continue to evolve, Karnan remains optimistic about their potential. Whether through glasses, apps, or future devices, Envision aims to stay at the forefront of accessible technology.