Palani, H.P. – VEMI Lab, University of Maine

Comparing Map Learning between Touchscreen-Based Visual and Haptic Displays: A Behavioral Evaluation with Blind and Sighted Users

The ubiquity of multimodal smart devices affords new opportunities for eyes-free applications for conveying graphical information to both sighted and visually impaired users. Using previously established haptic design guidelines for generic rendering of graphical content on touchscreen interfaces, the current study evaluates the learning and mental representation of digital maps, representing a key real-world translational eyes-free application. Two experiments involving 12 blind participants and 16 sighted participants compared cognitive map development and test performance on a range of spatio-behavioral tasks across three information-matched learning-mode conditions: (1) our prototype vibro-audio map (VAM), (2) traditional hardcopy-tactile maps, and (3) visual maps. Results demonstrated that when perceptual parameters of the stimuli were matched between modalities during haptic and visual map learning, test performance was highly similar (functionally equivalent) between the learning modes and participant groups. These results suggest equivalent cognitive map formation between both blind and sighted users and between maps learned from different sensory inputs, providing compelling evidence supporting the development of amodal spatial representations in the brain. The practical implications of these results include empirical evidence supporting a growing interest in the efficacy of multisensory interfaces as a primary interaction style for people both with and without vision. Findings challenge the long-held assumption that blind people exhibit deficits on global spatial tasks compared to their sighted peers, with results also providing empirical support for the methodological use of sighted participants in studies pertaining to technologies primarily aimed at supporting blind users.

Citation:

Palani, H.P., Fink, P.D.S., & Giudice, N.A. (2021). Comparing Map Learning between Touchscreen-Based Visual and Haptic Displays: A Behavioral Evaluation with Blind and Sighted Users. Multimodal Technologies and Interaction, 6(1):1. DOI:

Design Guidelines for Schematizing and Rendering Haptically Perceivable Graphical Elements on Touchscreen Devices

This paper explores the viability of new touchscreen-based haptic/vibrotactile interactions as a primary modality for perceiving visual graphical elements in eyes-free situations. For touchscreen-based haptic information extraction to be both accurate and meaningful, the onscreen graphical elements should be schematized and downsampled to: (1) maximize the perceptual specificity of touch-based sensing and (2) account for the technical characteristics of touchscreen interfaces. To this end, six human behavioral studies were conducted with 64 blind and 105 blindfolded-sighted participants. Experiments 1–3 evaluated three key rendering parameters that are necessary for supporting touchscreen-based vibrotactile perception of graphical information, with results providing empirical guidance on both minimally detectable and functionally discriminable line widths, inter-line spacing, and angular separation that should be maintained. Experiments 4–6 evaluated perceptually-motivated design guidelines governing visual-to-vibrotactile schematization required for tasks involving information extraction, learning, and cognition of multi-line paths (e.g., transit-maps and corridor-intersections), with results providing clear guidance as to the stimulus parameters maximizing accuracy and temporal performance. The six empirically-validated guidelines presented here, based on results from 169 participants, provide designers and content providers with much-needed guidance on effectively incorporating perceptually salient touchscreen-based haptic feedback as a primary interaction style for interfaces supporting nonvisual and eyes-free information access.

Citation:

Palani, H.P., Fink, P.D.S., & Giudice, N.A. (2020). Design Guidelines for Schematizing and Rendering Haptically Perceivable Graphical Elements on Touchscreen Devices. International Journal of Human–Computer Interaction, 36(15), 1393-1414. DOI: 10.1080/10447318.2020.1752464

Download PDF

Design Guidelines and Recommendations for Multimodal, Touchscreen-based Graphics

With content rapidly moving to the electronic space, access to graphics for individuals with visual impairments is a growing concern. Recent research has demonstrated the potential for representing basic graphical content on touchscreens using vibrations and sounds, yet few guidelines or processes exist to guide the design of multimodal, touchscreen-based graphics. In this work, we seek to address this gap by synergizing our collective research efforts over the past eight years and implementing our findings into a compilation of recommendations, which we validate through an iterative design process and user study. We start by reviewing previous work and then collate findings into a set of design guidelines for generating basic elements of touchscreen-based multimodal graphics. We then use these guidelines to generate exemplary graphics in mathematics, specifically bar charts and geometry concepts. We discuss the iterative design process of moving from guidelines to actual graphics and highlight challenges. We then present a formal user study with 22 participants with visual impairments, comparing learning performance when using touchscreen-rendered graphics to embossed graphics. We conclude with qualitative feedback from participants on the touchscreen-based approach and offer areas of future investigation as these recommendations are expanded to include more complex graphical concepts.

Citation:

Gorlewicz, J.L., Tennison, J.L., Uesbeck, P.M., Richard, M.E., Palani, H.P., Stefik, A., Smith, D.W., & Giudice, N.A. (2020). Design Guidelines and Recommendations for Multimodal, Touchscreen-Based Graphics. ACM Transactions on Accessible Computing (TACCESS), 13(3), 1-30.

Download PDF

The Graphical Access Challenge for People with Visual Impairments: Positions and Pathways Forward

Graphical access is one of the most pressing challenges for individuals who are blind or visually impaired. This chapter discusses some of the factors underlying the graphics access challenge, reviews prior approaches to addressing this long-standing information access barrier, and describes some promising new solutions. We specifically focus on touchscreen-based smart devices, a relatively new class of information access technologies, which our group believes represent an exemplary model of user-centered, needs-based design. We highlight both the challenges and the vast potential of these technologies for alleviating the graphics accessibility gap and share the latest results in this line of research. We close with recommendations on ideological shifts in mindset about how we approach solving this vexing access problem, which will complement both technological and perceptual advancements that are rapidly being uncovered through a growing research community in this domain.

Keywords: haptics, touchscreen-based accessibility, vibrotactile displays, multimodal interfaces, information-access technologies

Citation: Gorlewicz, J. L., Tennison, J. L., Palani, H. P., & Giudice, N. A. (2018). The Graphical Access Challenge for People with Visual Impairments: Positions and Pathways Forward. In Interactive Multimedia. IntechOpen (online publication ahead of print).

Download PDF

Touchscreen-based Haptic Information Access for Assisting Blind and Visually-Impaired Users: Perceptual Parameters and Design Guidelines

Touchscreen-based smart devices, such as smartphones and tablets, offer great promise for providing blind and visually-impaired (BVI) users with a means for accessing graphics non-visually. However, they also offer novel challenges as they were primarily developed for use as a visual interface. This paper studies key usability parameters governing accurate rendering of haptically-perceivable graphical materials. Three psychophysically-motivated usability studies, incorporating 46 BVI participants, were conducted that identified three key parameters for accurate rendering of vibrotactile lines. Results suggested that the best performance and greatest perceptual salience is obtained with vibrotactile feedback based on: (1) a minimum width of 1mm for detecting lines, (2) a minimum gap of 4mm for discriminating lines rendered parallel to each other, and (3) a minimum angular separation (i.e., chord length) of 4mm for discriminating oriented lines. Findings provide foundational guidelines for converting/rendering visual graphical materials on touchscreen-based interfaces for supporting haptic/vibrotactile information access.
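The three thresholds reported above can be collected into a small configuration sketch for interface designers. This is an illustrative summary, not code from the published system; the constant and function names are ours, while the millimeter values come from the study's results.

```python
# Minimum rendering thresholds for vibrotactile lines on touchscreens,
# as reported in the study (all values in millimeters).
VIBROTACTILE_THRESHOLDS_MM = {
    "line_width": 1.0,         # minimum width for reliable line detection
    "inter_line_gap": 4.0,     # minimum gap between parallel lines
    "angular_separation": 4.0, # minimum chord length between oriented lines
}

def meets_guidelines(line_width_mm, gap_mm, chord_mm):
    """Check a candidate rendering against the minimum thresholds."""
    t = VIBROTACTILE_THRESHOLDS_MM
    return (line_width_mm >= t["line_width"]
            and gap_mm >= t["inter_line_gap"]
            and chord_mm >= t["angular_separation"])

print(meets_guidelines(1.0, 4.0, 4.0))  # True
print(meets_guidelines(0.5, 4.0, 4.0))  # False: line too thin to detect
```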

Keywords: Assistive Technology, Haptic information access, Haptic interaction, Multimodal interface, Design Guidelines

Citation:

Palani, H.P., Tennison, J.L., Giudice, G.B., & Giudice, N.A. (2018). Touchscreen-based haptic information access for assisting blind and visually-impaired users: Perceptual parameters and design guidelines. In: Ahram T., Falcão C. (eds.) Advances in Usability, User Experience and Assistive Technology, part of the International Conference on Applied Human Factors and Ergonomics (AHFE’18). Advances in Intelligent Systems and Computing, vol 798, (Pp. 837-847). Springer, Cham.

Download PDF

Haptic Information Access using Touchscreen Devices: Guidelines for Accurate Perception of Angular Magnitude and Line Orientation

The overarching goal of our research program is to address the longstanding issue of non-visual graphical accessibility for blind and visually impaired (BVI) people through development of a robust, low-cost solution. This paper contributes to our research agenda aimed at studying key usability parameters governing accurate rendering and perception of haptically-accessed graphical materials via commercial touchscreen-based smart devices, such as smart phones and tablets. The current work builds on the findings from our earlier studies by empirically investigating the minimum angular magnitude that must be maintained for accurate detection and angular judgment of oriented vibrotactile lines. To assess the minimum perceivable angular magnitude (i.e., chord length) between oriented lines, a psychophysically-motivated usability experiment was conducted that compared accuracy in oriented line detection across four angles (2°, 5°, 9°, and 22°) and two radii (1-inch and 2-inch). Results revealed that a minimum 4mm chord length must be maintained between oriented lines for supporting accurate haptic perception via vibrotactile cuing. Findings provide foundational guidelines for converting/rendering oriented lines on touchscreen devices for supporting haptic information access based on vibrotactile stimuli.
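The relation between an angular separation and its chord length, the perceptual unit used above, follows from basic circle geometry: c = 2r·sin(θ/2). A minimal sketch (the function name is ours, and the example values are purely illustrative, not the study's stimuli):

```python
import math

def chord_length_mm(angle_deg, radius_mm):
    """Chord length between the endpoints of two oriented lines separated
    by angle_deg on a circle of radius radius_mm: c = 2 * r * sin(theta/2)."""
    return 2.0 * radius_mm * math.sin(math.radians(angle_deg) / 2.0)

# Sanity check: a 60-degree separation yields a chord equal to the radius.
print(round(chord_length_mm(60, 25.4), 1))  # 25.4
```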

Keywords: Assistive Technology, Haptic information access, Haptic interaction, Multimodal interface, Design Guidelines

Citation:

Palani, H.P., Giudice, G.B., and Giudice, N.A. (2018). Haptic Information Access on Touchscreen Devices: Guidelines for accurate perception and judgment of line orientation. Proceedings of the 20th annual conference on Human-Computer Interaction (HCI International’18). Las Vegas, NV. July 15-18 (corresponding author).

Download PDF

Principles for Designing Large-Format Refreshable Haptic Graphics Using Touchscreen Devices: An Evaluation of Nonvisual Panning Methods

Touchscreen devices, such as smartphones and tablets, represent a modern solution for providing graphical access to people with blindness and visual impairment (BVI). However, a significant problem with these solutions is their limited screen real estate, which necessitates panning or zooming operations for accessing large-format graphical materials such as maps. Non-visual interfaces cannot directly employ traditional panning or zooming techniques due to various perceptual and cognitive limitations (e.g., constraints of the haptic field of view and disorientation due to loss of one’s reference point after performing these operations). This article describes the development of four novel non-visual panning methods designed from the outset with consideration of these perceptual and cognitive constraints. Two studies evaluated the usability of these panning methods in comparison with a non-panning control condition. Results demonstrated that the exploration, learning, and subsequent spatial behaviors were similar between panning and non-panning conditions, with one panning mode, based on a two-finger drag technique, revealing the overall best performance. Findings provide compelling evidence that incorporating panning operations on touchscreen devices – the fastest growing computational platform among the BVI demographic – is a viable, low-cost, and immediate solution for providing BVI people with access to a broad range of large-format digital graphical information.
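The best-performing mode described above separates one-finger exploration from two-finger panning so that exploration never silently shifts the user's frame of reference. The sketch below illustrates that interaction model only; the class and method names are ours and are not taken from the published implementation.

```python
class PanningViewport:
    """Illustrative model of two-finger drag panning: a single finger
    explores the haptic rendering in place, while a two-finger drag
    shifts the viewport over a large-format graphic such as a map."""

    def __init__(self, x=0.0, y=0.0):
        self.x = x  # viewport origin in graphic coordinates
        self.y = y

    def on_touch_move(self, touch_points, dx, dy):
        # Single-finger movement is exploration: the viewport stays put,
        # preserving the user's reference point.
        if len(touch_points) < 2:
            return (self.x, self.y)
        # A two-finger drag pans the underlying graphic beneath the fingers.
        self.x -= dx
        self.y -= dy
        return (self.x, self.y)

vp = PanningViewport()
vp.on_touch_move([(10, 10)], 5, 0)                   # one finger: no pan
print(vp.on_touch_move([(10, 10), (40, 10)], 5, 0))  # two fingers: (-5.0, 0.0)
```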

Citation:

Giudice, N.A., & Palani, H.P. (2017). Principles for Designing Large-Format Refreshable Haptic Graphics Using Touchscreen Devices: An Evaluation of Nonvisual Panning Methods. ACM Transactions on Accessible Computing (TACCESS), 9, 9:1-9:25.

Download PDF

Usability Parameters for Touchscreen-based Haptic Perception

Despite the advancements in touchscreen technologies, there is a surprising dearth of research on touchscreen-based haptic perception and guidance on best practices for haptic interface design employing these devices. We address these shortcomings by investigating several key usability parameters and spatio-cognitive abilities pertinent to haptic information access via touchscreen devices. Two preliminary psychophysically-inspired usability studies investigated the haptic thresholds for detecting (Exp 1) and tracing (Exp 2) graphical stimuli rendered on a touchscreen interface. We found that a minimum of 1mm width is necessary for detecting lines using haptic feedback (i.e., vibrotactile or electrostatic stimulation) and a width of at least 3mm should be maintained for effective line tracing. Results provide foundational guidelines for designing information content that is optimized for rendering on touchscreen displays. Findings also demonstrate the importance of and need for further investigations into the usability parameters and cognitive abilities required for the design of effective haptic interfaces.
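The thresholds above are physical sizes, but touchscreen rendering is specified in pixels, so a designer must convert via the device's pixel density. A minimal sketch of that conversion; the 326-dpi figure is a hypothetical device, and the function name is ours.

```python
MM_PER_INCH = 25.4

def mm_to_px(mm, dpi):
    """Convert a physical size in millimeters to device pixels
    for a screen with the given dots-per-inch density."""
    return mm / MM_PER_INCH * dpi

# On a hypothetical 326-dpi tablet, the widths needed to honor the
# reported 1mm detection and 3mm tracing minimums:
detect_px = mm_to_px(1.0, 326)  # ~12.8 px
trace_px = mm_to_px(3.0, 326)   # ~38.5 px
print(round(detect_px, 1), round(trace_px, 1))  # 12.8 38.5
```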

Citation:

Palani, H.P., and Giudice, N.A. (2016). Usability parameters for touchscreen-based haptic perception. Work in progress paper presented at the IEEE Haptics Symposium. April, Philadelphia PA, USA.

Download PDF

Evaluation of Non-Visual Zooming Operations using Touch-Screen Devices

Citation: 

Palani, H.P., Giudice, U., and Giudice, N.A. (2016). Evaluation of non-visual zooming operations on touchscreen devices. In M. Antona & C. Stephanidis (Eds.), Proceedings of the 10th International Conference on Universal Access in Human-Computer Interaction (UAHCI), Part of HCI International 2016. Toronto, Canada. July 17-22 (pp. 162-174). Springer International. (corresponding author).

Download PDF

Evaluation of Non-Visual Panning Operations using Touch-Screen Devices

This paper summarizes the implementation, evaluation, and usability of non-visual panning operations for accessing graphics rendered on touch screen devices. Four novel non-visual panning techniques were implemented and experimentally evaluated on our experimental prototype, called a Vibro-Audio Interface (VAI), which provides completely non-visual access to graphical information using vibration, audio, and kinesthetic cues on a commercial touch screen device. This demonstration will provide an overview of our system’s functionalities and will discuss the necessity for developing non-visual panning operations enabling visually-impaired people access to large-format graphics (such as maps and floor plans).

Citation:

Palani, H.P., & Giudice, N.A. (2014). Evaluation of non-visual panning operations using touch-screen devices. Proceedings of the 16th International ACM SIGACCESS Conference on Computers & Accessibility (ASSETS’14). Pp. 293-294. ACM New York, NY, USA (corresponding author).

Download PDF
