Alsamsam, M. – VEMI Lab, University of Maine

Accessible maps for the future of inclusive ridesharing
/vemi/publication/accessible-maps-for-the-future-of-inclusive-ridesharing/
Sun, 22 Sep 2024 16:25:06 +0000
For people who are blind or have low vision (BLV), ridesharing provides an important means of independence and mobility. A common challenge, however, is locating the vehicle when it arrives at an unanticipated location. Although coordinating with the driver for assistance is serviceable in the near term, new solutions will be necessary once a human is no longer available in future automated vehicles. This paper therefore presents and evaluates a multisensory smartphone-based map system designed to enable nonvisual tracking of summoned vehicles. Results from a user study with BLV users (N=12) suggest that vibro-audio maps (VAMs) promote superior spatial confidence and reasoning compared to the nonvisual audio interfaces in current ridesharing apps, while also being desirable and easy to use. A subsequent expert evaluation, based on improvements suggested during the user study, indicates the practical utility of VAMs for addressing both current and future wayfinding challenges for BLV travelers.
Citation:
Fink, P.D.S., Milne, H., Caccese, A., Alsamsam, M., Loranger, J., Colley, M., & Giudice, N.A. (2024). Accessible maps for the future of inclusive ridesharing. In Proceedings of the 16th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutomotiveUI ’24), 106-115. https://doi.org/10.1145/3640792.3675736
Does it press? Investigating the efficacy of an ultrasonic haptic button interface for non-visual driving applications
/vemi/publication/does-it-press-investigating-the-efficacy-of-an-ultrasonic-haptic-button-interface-for-non-visual-driving-applications/
Tue, 18 Jul 2023 18:13:33 +0000

Ultrasonic haptic (UH) feedback employs mid-air ultrasound waves detectable by the palm of the hand. This interface presents a novel opportunity to provide non-visual input and output (I/O) functionality in interactive applications, such as vehicle controls that allow the user to keep their eyes on the road. However, more work is needed to evaluate the usability of such an interface. In this study, 16 blindfolded participants completed tasks involving finding and counting UH buttons, associating buttons with audio cues, learning spatial arrangements, and determining button states. Results showed that users were generally successful with 2–4 arranged buttons and could associate them with audio cues with an average accuracy of 77.1%. Participants were also able to comprehend button spatial arrangements with 77.8% accuracy and complete reconstruction tasks, suggesting the development of reasonably accurate spatial representations. These results demonstrate that UH feedback can support real-world I/O functionality and serve to guide future exploration in this area.

Keywords: Ultrasonic haptic feedback, Mid-air haptics, Ultrasonic buttons, Non-visual interface

Citation: Alsamsam, M., Fink, P.D.S., Brown, J.R., Dimitrov, V., & Giudice, N.A. (2023). Does it press? Investigating the efficacy of an ultrasonic haptic button interface for non-visual driving applications. In G. Praetorius, C. Sellberg, & R. Patriarca (Eds.), Human Factors in Transportation. Proceedings of the International Conference on Applied Human Factors and Ergonomics (AHFE ’23), vol. 95, 343-353. AHFE International, USA. https://doi.org/10.54941/ahfe1003819
