GoodMaps Indoor Navigation: Leveraging Computer Vision to Foster Indoor Navigation

Open Access
Article
Conference Proceedings
Authors: Jennifer Palilonis, Charlie Meredith

Abstract: People around the world have come to rely on digital maps, accessible with the tap of a smartphone screen, to guide them to places, to orient themselves in unfamiliar areas, and to get directions to desired destinations. In a world overflowing with technological advancement, we expect to be shown, in real time, how to get from point A to point B. However, navigating indoors with assistive technology can prove more challenging than navigating outdoors, because indoor spaces have not traditionally been mapped the way companies like Google and Apple have mapped outdoor spaces. Additionally, most indoor mapping and navigation services are neither accurate nor accessible enough to meet the needs of a variety of users, including individuals with disabilities. Founded by the American Printing House for the Blind (APH) in 2019, GoodMaps addresses these challenges with its Indoor Navigation smartphone app, leveraging artificial intelligence (AI) and crowd-sourced data to scale its app and map coverage efficiently. GoodMaps uses AI, primarily through computer vision algorithms, to create highly accurate indoor maps and to enable precise navigation. The app provides detailed turn-by-turn directions through complex indoor spaces such as airports and university campuses. GoodMaps’ computer vision technology interprets real-time data from a device's camera and sensors, guiding users through the environment even without visual sightlines. This is particularly valuable for people with visual impairments or mobility challenges who need assistance navigating intricate indoor spaces. Currently, AI-driven translation capabilities enable the app to support four languages, with six more planned by early 2025, automating 90% of destination translations for non-English users. Looking ahead, GoodMaps aims to harness AI further to adapt dynamically to changes in indoor environments. AI-powered image recognition will enable automated detection of environmental updates via user device cameras, while crowd-sourced contributions will provide real-time feedback similar to Waze. This combination of AI and community-driven updates will streamline map maintenance, improve accessibility, and set a new standard for scalable indoor navigation solutions. GoodMaps is also partnering with Intel to deliver a high-quality indoor wayfinding solution for people who are blind or visually impaired; safely and effectively navigating indoor spaces yields greater independence and confidence when traveling. Intel continues to investigate volumetric mapping algorithms and advances in artificial intelligence to improve the precision and accuracy of GoodMaps’ commercial indoor wayfinding service. This paper explains GoodMaps’ user-centered design process, chronicling how user experience research has informed the development of AI-driven computer vision models that address user requirements.
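The camera-based positioning described above can be illustrated with a minimal image-retrieval sketch. This is an assumption for illustration only, not GoodMaps' actual (proprietary) pipeline: a feature vector extracted from the current camera frame is matched against feature vectors recorded for previously mapped reference locations, and the best match gives the user's position. All location names and feature values below are hypothetical.

```python
import math

# Illustrative sketch of image-retrieval-based indoor positioning.
# NOT GoodMaps' actual algorithm: a real system would extract features
# with a learned vision model and fuse matches with sensor data.

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def localize(query_features, reference_map):
    """Return the mapped location whose stored features best match the query frame."""
    return max(reference_map, key=lambda loc: cosine_similarity(query_features, reference_map[loc]))

# Hypothetical reference map built during a mapping scan:
# location name -> feature vector for that viewpoint.
reference_map = {
    "Gate A1": [0.9, 0.1, 0.3],
    "Restroom corridor": [0.2, 0.8, 0.5],
    "Main entrance": [0.4, 0.4, 0.9],
}

# A camera frame resembling the "Gate A1" viewpoint localizes there.
print(localize([0.85, 0.15, 0.25], reference_map))  # prints "Gate A1"
```

In practice the feature vectors would be high-dimensional embeddings and the search would use an approximate nearest-neighbor index, but the retrieve-and-match structure is the same.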

Keywords: Indoor Navigation, Computer Vision

DOI: 10.54941/ahfe1005900
