Project team: Pierre Karashchuk, Corten Singer, Tomas Vega
Project background: Supported by Maker Pass resources and guidance at the CITRIS Invention Lab
A winner of the 2017 Lemelson-MIT Student Prize and a finalist in Fast Company’s 2017 World Changing Ideas Awards, WheelSense is a modular, open-source system that empowers visually impaired and movement-restricted wheelchair users to explore the world around them. Rooted in deep engagement with user needs, WheelSense takes a novel approach to wheelchair navigation assistance, using haptic and auditory feedback to relay spatial information directly to the user rather than handing control to an autonomous system. WheelSense provides three primary features: frontal drop-off detection (e.g., staircase detection) through auditory feedback; backward obstacle-avoidance assistance through auditory feedback delivered at a rate inversely proportional to the distance of the nearest obstacle; and lateral ramp-edge detection through haptic feedback on the user’s armrest.
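As a rough sketch of how the backward obstacle-avoidance feedback described above could work on an Arduino, the loop below beeps a piezo buzzer with pauses proportional to the measured distance, so the beep rate rises as an obstacle gets closer. The sensor model, pin assignments, and constants here are illustrative assumptions, not the team’s actual firmware.

// Minimal sketch, assuming a Sharp GP2Y0A21-style analog IR sensor on A0
// and a piezo buzzer on pin 8; all values are illustrative assumptions.
const int SENSOR_PIN = A0;        // rear-facing IR distance sensor
const int BUZZER_PIN = 8;         // piezo buzzer for auditory feedback
const float MAX_RANGE_CM = 80.0;  // ignore obstacles beyond this distance

void setup() {
  pinMode(BUZZER_PIN, OUTPUT);
}

// Convert the analog reading to an approximate distance in centimeters.
// This sensor's output voltage falls roughly in inverse proportion to distance.
float readDistanceCm() {
  int raw = analogRead(SENSOR_PIN);
  float volts = raw * (5.0 / 1023.0);
  return 27.0 / volts;  // rough datasheet approximation
}

void loop() {
  float d = readDistanceCm();
  if (d < MAX_RANGE_CM) {
    tone(BUZZER_PIN, 1000, 50);      // short 1 kHz beep
    delay(20 + (int)(d * 10));       // shorter gaps as the obstacle gets closer
  } else {
    delay(100);                      // nothing nearby: stay quiet
  }
}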
Built from a common Arduino microcontroller and off-the-shelf infrared distance sensors, with custom parts produced on a 3D printer, WheelSense is centered on a computer-mediated decision-making process in which the user has the ultimate say, balancing enhanced functionality with total user control. By open-sourcing their work and designing for low-cost 3D printing, the team aims to broaden access to personalized assistive technology.
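That division of labor, in which the electronics detect and the user decides, also shows up in the lateral ramp-edge feature: a downward-facing sensor watches the floor, and a small motor in the armrest vibrates when the floor “drops away.” The sketch below is a minimal illustration under assumed wiring (sensor on A1, motor on pin 9) and thresholds; it is not the project’s published design.

// Minimal illustration, assuming a downward-facing analog IR sensor on A1
// and a vibration motor driven through a transistor on pin 9; pins and
// thresholds are assumptions for this sketch, not the published design.
const int EDGE_SENSOR_PIN = A1;  // downward-facing IR sensor near a wheel
const int MOTOR_PIN = 9;         // vibration motor embedded in the armrest
const int FLOOR_READING = 400;   // typical analog reading with floor present
const int EDGE_MARGIN = 150;     // drop in reading that signals a ramp edge

void setup() {
  pinMode(MOTOR_PIN, OUTPUT);
}

void loop() {
  int raw = analogRead(EDGE_SENSOR_PIN);
  // A much weaker reflection than usual means the floor has "disappeared,"
  // i.e., the wheel is approaching a ramp edge or drop-off on that side.
  bool edgeDetected = raw < (FLOOR_READING - EDGE_MARGIN);
  // The system only warns; steering stays entirely with the user.
  digitalWrite(MOTOR_PIN, edgeDetected ? HIGH : LOW);
  delay(50);
}

Because a sketch like this only drives a warning signal and never touches the wheelchair’s controls, the decision about where to move remains with the user.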
Want to learn more about WheelSense? Check out:
“Los Gatos: New wheelchair technology,” The Mercury News
Topics: 3D printing, Assistive technology