Be My Eyes Partners With Microsoft To Help Train AI Models

In this episode of Double Tap, Steven Scott speaks with Mike Buckley of Be My Eyes and Jenny Lay-Flurrie of Microsoft about a groundbreaking partnership aimed at improving AI systems with more inclusive and representative data. The collaboration focuses on training AI models on datasets drawn from the lived experiences of blind and low vision people, addressing gaps in understanding and representation within current AI technologies.

Mike explains that Be My Eyes will contribute unique datasets from its platform, deliberately excluding AI session data to protect user privacy. Instead, the partnership will draw on anonymized video of real-world calls between blind or low vision users and sighted volunteers. These recordings capture verbal cues and imagery not readily available on the internet, making them a rich source for refining AI models. Transparency and user choice are at the forefront, with users given the option to opt out.

Jenny highlights Microsoft’s commitment to accessibility, explaining how this data will enhance tools like Seeing AI and Azure Copilots, enabling AI to better understand disability-related contexts. By integrating more disability-specific data, Microsoft aims to reduce ableist bias and make AI solutions more effective for disabled users.

The discussion also addresses the broader implications of AI, including potential breakthroughs in areas like speech recognition and online accessibility. Jenny stresses the importance of using AI responsibly and inclusively to tackle long-standing accessibility barriers, with the ultimate goal of creating equitable experiences for all.

Listeners can learn more about this partnership and stay updated on its progress through Be My Eyes’ communication channels.
