Project Guideline is a research initiative aimed at helping people with visual impairments gain more independence. The project uses on-device machine learning on Google Pixel phones so that blind and low-vision individuals can navigate the physical world on their own, combining a waist-mounted phone, a guideline painted on the path, and audio cues to guide users safely.
The technology behind Project Guideline brings together several components: ARCore tracks the user's position, a segmentation model based on DeepLabV3+ detects the guideline, and a depth ML model identifies obstacles. With this approach, users can independently follow outdoor paths marked with a painted line, a notable advance in assistive technology.
The system is implemented in C++ and integrates libraries such as MediaPipe. ARCore estimates the user's position and orientation, while the segmentation model outlines the guideline's trajectory. A control system then selects target points on the line and produces a navigation signal that accounts for the user's position, velocity, and direction. Obstacle detection adds an extra layer of safety.
Project Guideline is a significant step forward for accessibility in computer vision. The initiative combines machine learning, augmented reality technology, and audio feedback to address the challenges faced by people with visual impairments.
By open-sourcing Project Guideline, the researchers are emphasizing inclusivity and innovation. This initiative sets the stage for future advancements in assistive technology and illuminates the path toward a more accessible and inclusive future.
If you’re interested in learning more, check out the project’s GitHub repository and the accompanying blog post.