
For Mike May, who is blind, navigating new spaces can be a challenge. A few weeks ago, he went to a work event at a brewery and had a hard time figuring out where to go.
Thankfully, he had a pair of Envision smart glasses on him, which use artificial intelligence to help people who are blind or visually impaired better understand their surroundings. Using a small camera on the side, the glasses can scan objects, people and text, then relay that information via a small built-in speaker. Envision can tell you if someone is approaching, for instance, or describe what's in a room.
May was using a feature on the glasses called Ally, which lets him start video calls with friends and family to get help.
"I called up one of my colleagues, Evelyn, and said, 'What do you see?' and she described the environment to me," said May, chief evangelist at accessible navigation company Goodmaps. "She told me where the tables were and just gave me the lay of the land."
Envision Glasses are built on the enterprise edition of Google Glass. (Yes, Google Glass is still alive.) Google unveiled the original smart glasses back in 2013, touting them as a way for users to take calls, send texts, snap pictures and look at maps, among other things, right from the headset. But after a limited -- and unsuccessful -- release, they never hit store shelves.
A few years later, Google started working on an enterprise edition of the glasses, which is what Envision is built on. Their wearable design makes them ideal for capturing and relaying information as a user would see it.
"What Evision Glasses essentially does is takes in all the visual information that's around, tries to process that information, and then speaks it out to the user," says Karthik Kannan, Envision's co-founder.