The National Park Service recently commissioned a pilot program to produce a mobile guide for vision-impaired visitors at the Herbert Hoover National Historic Site visitor center in West Branch, Iowa. iBeacons were used to trigger audio descriptions of nearby exhibits as users moved through the space. We would like to share our experience and lessons learned, including how we solved the challenges of designing a UI exclusively for accessibility and of triggering content via iBeacons.
Designing the UX for an audio-description-only interactive became a discipline unto itself: it meant letting go of conventional design aspirations in favor of a singular focus on VoiceOver interaction. So what did we learn? We'll share insights from the design, testing, and iteration process as we refined the guide to find an appropriate balance between manual navigation and automatic triggering. We'll also share our approach to the technical riddle of making iBeacons work reliably in a room where exhibits are spaced only 2-3 meters apart. As of May 2015 the project is still in beta; it will be finished in June, and we will have ample feedback and post-release lessons to share in time for November.
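To give a flavor of the triggering problem, the sketch below shows one way closely spaced beacons might be disambiguated. This is an illustrative approach, not the project's actual implementation: raw iBeacon distance estimates are noisy, so the idea is to smooth them and require a clear margin (hysteresis) before switching exhibits, preventing the audio from flapping between two beacons 2-3 meters apart. All type names and thresholds here are hypothetical.

```swift
import Foundation

// Hypothetical sketch: deciding which exhibit's audio description to
// trigger when beacons sit only 2-3 meters apart. Accuracy readings
// (rough distance in meters) are smoothed with an exponential moving
// average, and we only switch exhibits when a new beacon is clearly
// closer than the current one.

struct BeaconReading {
    let exhibitID: String
    let accuracy: Double   // estimated distance in meters; negative = unknown
}

final class ExhibitTrigger {
    private var smoothed: [String: Double] = [:]
    private(set) var currentExhibit: String?
    private let alpha = 0.3          // smoothing factor (illustrative)
    private let switchMargin = 0.75  // meters closer required to switch

    // Feed one ranging pass; returns an exhibit ID only when it changes.
    func update(with readings: [BeaconReading]) -> String? {
        for r in readings where r.accuracy >= 0 {
            let prev = smoothed[r.exhibitID] ?? r.accuracy
            smoothed[r.exhibitID] = prev * (1 - alpha) + r.accuracy * alpha
        }
        guard let nearest = smoothed.min(by: { $0.value < $1.value }) else {
            return nil
        }
        if currentExhibit == nearest.key {
            return nil  // already announcing this exhibit
        }
        if let current = currentExhibit {
            // Only switch if the new beacon is clearly closer.
            let currentDist = smoothed[current] ?? .infinity
            guard currentDist - nearest.value > switchMargin else { return nil }
        }
        currentExhibit = nearest.key
        return nearest.key
    }
}
```

In a real app the readings would come from Core Location's beacon-ranging callback; the smoothing factor and switch margin would need tuning against the actual room layout.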