Award Abstract # 1830146
NRI: FND: COLLAB: Intuitive, Wearable Haptic Devices for Communication with Ubiquitous Robots

NSF Org: CMMI - Div Of Civil, Mechanical, & Manufact Inn
Recipient: WILLIAM MARSH RICE UNIVERSITY
Initial Amendment Date: August 27, 2018
Latest Amendment Date: March 31, 2022
Award Number: 1830146
Award Instrument: Standard Grant
Program Manager: Harry Dankowicz
CMMI - Div Of Civil, Mechanical, & Manufact Inn
ENG - Directorate For Engineering
Start Date: September 1, 2018
End Date: August 31, 2022 (Estimated)
Total Intended Award Amount: $327,242.00
Total Awarded Amount to Date: $375,242.00
Funds Obligated to Date: FY 2018 = $327,242.00
FY 2019 = $8,000.00
FY 2020 = $16,000.00
FY 2021 = $16,000.00
FY 2022 = $8,000.00
History of Investigator:
  • Marcia O'Malley (Principal Investigator)
    omalleym@rice.edu
Recipient Sponsored Research Office: William Marsh Rice University
6100 MAIN ST
Houston
TX  US  77005-1827
(713)348-4820
Sponsor Congressional District: 09
Primary Place of Performance: William Marsh Rice University
Houston
TX  US  77005-1827
Primary Place of Performance Congressional District: 09
Unique Entity Identifier (UEI): K51LECU1G8N3
Parent UEI:
NSF Program(s): NRI-National Robotics Initiative
Primary Program Source: 01002223DB NSF RESEARCH & RELATED ACTIVITIES
01001819DB NSF RESEARCH & RELATED ACTIVITIES
01001920DB NSF RESEARCH & RELATED ACTIVITIES
01002021DB NSF RESEARCH & RELATED ACTIVITIES
01002122DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 116E, 8086, 9102, 9178, 9231, 9251
Program Element Code(s): 801300
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.041

ABSTRACT

This National Robotics Initiative (NRI) project will promote the progress of science and beneficially impact human health and quality of life by developing wearable soft robotic devices with distributed tactile stimulation that enable new forms of communication. Human-robot interactions will be commonplace in the near future. In applications such as self-driving cars and physically assistive devices, interaction will require effective and intuitive bidirectional communication. Transferring information through vision and sound can be slow and inappropriate in many circumstances. This project focuses on haptic (touch-based) robotics to enable communication in a salient but private manner that alleviates demands on other sensory channels. This project serves the national interest by advancing knowledge in the fields of human perception, psychology and neuroscience, while developing novel, convergent technology that integrates concepts across the fields of robotics, haptics, and control engineering. Project results will be disseminated through tactile haptic devices for education and publicly available software and data. The project aims to broaden participation of underrepresented groups in engineering through outreach programs, public lab tours, and the mentoring of female and minority graduate students, undergraduates, and high school students.

Wearable haptic systems have the potential to enable private, salient communication between humans and intelligent systems through an underutilized sensory channel (somatosensation). In this research, information will be transmitted through the haptic channel via wearable, ubiquitous, soft robotic devices that provide both passive and active touch interactions with the human user. This research comprises four main objectives. First, a characterization of human haptic perception on the forearm will set the requirements for the frequency, amplitude, directions, spacing, and temporal actuation patterns of a two-dimensional array of haptic stimulators able to convey a range of haptic cues. Second, the project will develop a wearable, soft, haptic device able to stimulate the skin of one forearm while also providing mechanical stimuli intended to be explored by the fingertips of the other hand. Third, the project will develop rendering algorithms for the haptic device that take into consideration human perceptual abilities for passive stimulation of the arm and active exploration by the fingertips. Fourth, the project team will create application scenarios to evaluate and refine the system. Wearable haptic systems have the potential to improve human health and well-being through a variety of applications, including physical cueing for rehabilitation and movement therapy; explosive ordnance defusing; feedback from assistive devices, including mobile robots in the home; tactile communication to enable design and e-commerce; immersion in virtual worlds for education; and the facilitation of remote interaction between people.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

(Showing: 1 - 10 of 17)
Zook, Zane A. and Ozor-Ilo, Ozioma O. and Zook, Gabriel T. and O'Malley, Marcia K. "Snaptics: Low-Cost Open-Source Hardware for Wearable Multi-Sensory Haptics" World Haptics Conference, 2021
Alexander, Stephen A. and Garcia, Roderico and O'Malley, Marcia K. "Enhancing Multi-Sensory Cue Salience and Perceptual Identification in a Wearable Haptic Device" World Haptics Conference, 2021
Low, Andrew K. and Zook, Zane A. and Fleck, Joshua J. and O'Malley, Marcia K. "Effects of Interfering Cue Separation Distance and Amplitude on the Haptic Detection of Skin Stretch" IEEE Transactions on Haptics, v.14, 2021 https://doi.org/10.1109/TOH.2021.3075387
Pezent, Evan and Fani, Simone and Clark, Janelle and Bianchi, Matteo and O'Malley, Marcia K. "Spatially Separating Haptic Guidance from Task Dynamics through Wearable Devices" IEEE Transactions on Haptics, 2019 https://doi.org/10.1109/TOH.2019.2919281
Dunkelberger, Nathan and Sullivan, Jennifer L. and Bradley, Joshua and Manickam, Indu and Dasarathy, Gautam and Baraniuk, Richard G. and O'Malley, Marcia K. "A Multi-sensory Approach to Present Phonemes as Language through a Wearable Haptic Device" IEEE Transactions on Haptics, 2020 https://doi.org/10.1109/TOH.2020.3009581
Fleck, Joshua J. and Zook, Zane A. and Tjandra, Tiffani W. and O'Malley, Marcia K. "A Cutaneous Haptic Cue Characterization Testbed" 2019 IEEE World Haptics Conference (WHC), 2019 https://doi.org/10.1109/WHC.2019.8816086
Zook, Zane A. and Fleck, Joshua J. and Tjandra, Tiffani W. and O'Malley, Marcia K. "Effect of Interference on Multi-Sensory Haptic Perception of Stretch and Squeeze" 2019 IEEE World Haptics Conference (WHC), 2019 https://doi.org/10.1109/WHC.2019.8816139
Zook, Zane and O'Malley, Marcia "Effect of Focus Direction and Agency on Tactile Perceptibility" Haptics: Science, Technology, Applications. EuroHaptics 2022. Lecture Notes in Computer Science, v.13235, 2022 https://doi.org/10.1007/978-3-031-06249-0_14
Macklin, Alix S. and Yau, Jeffrey M. and O'Malley, Marcia K. "Evaluating the Effect of Stimulus Duration on Vibrotactile Cue Localizability With a Tactile Sleeve" IEEE Transactions on Haptics, v.14, 2021 https://doi.org/10.1109/TOH.2021.3079727
Battaglia, Edoardo and Clark, Janelle and Bianchi, Matteo and Catalano, Manuel and Bicchi, Antonio and O'Malley, Marcia K. "Skin stretch haptic feedback to convey closure information in anthropomorphic, under-actuated upper limb soft prostheses" IEEE Transactions on Haptics, 2019 https://doi.org/10.1109/TOH.2019.2915075
Smith, Casimir and Pezent, Evan and O'Malley, Marcia K. "Spatially Separated Cutaneous Haptic Guidance for Training of a Virtual Sensorimotor Task" 2020 IEEE Haptics Symposium (HAPTICS), 2020 https://doi.org/10.1109/HAPTICS45997.2020.ras.HAP20.11.2032900c

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

Haptic devices allow touch-based information transfer between humans and intelligent systems, enabling communication in a salient but private manner that frees other sensory channels. For such devices to become ubiquitous, they must be intuitive, unobtrusive, and wearable. However, the amount of information that can be transmitted through touch is limited in large part by the location and distribution of human mechanoreceptors. Our goal was to create wearable soft devices with distributed multi-modal haptic actuation that are mounted on the body and can also be actively touched by the fingertips. This approach can provide low-resolution haptic feedback appropriate for human perceptual abilities on the body (passive feeling), while also taking advantage of fingertip perception to allow high-resolution information transfer when needed by the user (active touching). This project aimed to create new forms of communication and to increase the capabilities of wearable devices in ways that could impact human health and quality of life.

In this project, we developed several open-source tools to design, fabricate, and control wearable devices that provide haptic feedback to users. Snaptics (snaptics.org) is a low-cost platform designed for rapid prototyping of fully wearable multi-sensory haptic devices. Snaptics uses a modular framework that allows designers to quickly test and swap snaptic modules as desired. All modules are designed to be assembled from 3D-printed components and inexpensive off-the-shelf actuators to minimize cost to the designer. The platform aims to increase community engagement with, and the accessibility of, wearable haptics by lowering both the technical barrier to entry and the cost of creating a wearable haptic device. Syntacts (syntacts.org) is a haptic rendering framework for vibrotactile feedback. It eliminates the need for expensive haptic controllers or custom electronics by leveraging commercial off-the-shelf audio interfaces. As a complete package, Syntacts provides the software and hardware needed to interface with audio devices at low latency, synthesize complex waveforms, and amplify signals to appropriate levels.
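To illustrate how cues are rendered with Syntacts, the following is a minimal sketch in Python. It follows the example code published at syntacts.org; the class and property names used here (Session, Sine, ASR, length) are taken from that documentation and should be verified against the installed release, and the 175 Hz carrier and envelope timings are arbitrary example values.

# Minimal Syntacts sketch: render a short vibrotactile cue on one channel of an
# off-the-shelf audio interface. Names follow the syntacts.org examples; verify
# against the installed release. The cue parameters are arbitrary examples.
from time import sleep
from syntacts import Session, Sine, ASR

session = Session()
session.open()                        # open the default audio device

# 175 Hz sine carrier shaped by a 0.1 s attack / 0.2 s sustain / 0.1 s release envelope
cue = Sine(175) * ASR(0.1, 0.2, 0.1)

session.play(0, cue)                  # play the cue on channel 0 (one tactor)
sleep(cue.length)                     # wait for the cue to finish
session.close()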

We conducted psychophysical experiments with human subjects to better understand the perception of multi-sensory haptic cues that combine vibration, skin stretch, and squeeze. In two experiments, we compared the salience and usability of Snaptics devices with their research-grade counterparts; results indicate that users perceive cue sets and complete tasks equally well with either. Using a custom-designed test bed, we showed that cue amplitude and separation distance affect accurate perception of multi-sensory cues. We also demonstrated that users can detect discrete cues delivered simultaneously with continuous cues. We designed a custom vibrotactile sleeve (the VT Sleeve) and investigated how cue durations of 100, 200, and 400 ms affect the localizability of tactile cues presented along the forearm, measuring the response mean and the response variance (localization error) for each of six tactile cue sites. Tactile location had a significant effect on localizability, but the durations considered had no effect. We are currently investigating tactile durations outside the 100-400 ms range.
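As an illustration of the localizability summary described above (response mean and response variance per cue site), the Python sketch below computes those statistics from placeholder data; the response array and six-site layout are invented for illustration and are not the study's data.

# Sketch of the localizability summary: per-site response mean (bias) and
# response variance (precision). The data below are invented placeholders.
import numpy as np

# Each row: (true cue site 0-5 along the forearm, site reported by the participant)
responses = np.array([
    [0, 0], [0, 1], [1, 1], [1, 2], [2, 2], [2, 2],
    [3, 3], [3, 4], [4, 4], [4, 4], [5, 5], [5, 4],
])

for site in range(6):
    reported = responses[responses[:, 0] == site, 1]
    mean = reported.mean()        # response mean: where the cue was felt on average
    var = reported.var(ddof=1)    # response variance: spread of the localization responses
    print(f"site {site}: mean response {mean:.2f}, variance {var:.2f}")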

We designed hardware and conducted experiments to better understand the relationship between haptic cue perception and the contact mechanics occurring between the haptic device end effector and the skin. We used our novel grounded test bed to run formal psychophysical tests under position and force control, measuring the maximum comfort threshold, the minimum felt threshold, and the perceptual resolution. The elastic strain energy computed from the force data in the normal and shear experiments correlates strongly with the comfort and discrimination thresholds in the normal and shear directions. The resulting data inform device design and provide a framework for predicting achievable performance on an individualized basis, leading to more salient and effective haptic feedback devices.
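As a rough sketch of this kind of analysis, the Python below estimates elastic strain energy as the area under a force-displacement curve and correlates it with a comfort threshold across participants. The curves, thresholds, and variable names are placeholders invented for illustration, not the project's data or code.

# Sketch: elastic strain energy from force-displacement data, correlated with a
# psychophysical comfort threshold across participants. All values are placeholders.
import numpy as np
from scipy.stats import pearsonr

def strain_energy(displacement_m, force_n):
    # Approximate elastic strain energy (J) as the area under the force-displacement curve
    return np.trapz(force_n, displacement_m)

# Hypothetical per-participant loading curves: linear-spring placeholders
# (displacement in m over a 4 mm indentation, stiffness in N/m)
x = np.linspace(0, 0.004, 50)
participants = [(x, 500.0 * x), (x, 650.0 * x), (x, 400.0 * x)]
energies = np.array([strain_energy(d, f) for d, f in participants])

# Hypothetical comfort thresholds (N) measured for the same participants
comfort_threshold = np.array([1.9, 2.4, 1.5])

r, p = pearsonr(energies, comfort_threshold)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")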

Finally, we developed wearable haptic devices and explored, through pilot experiments, how such devices can be used to elicit predictable emotional responses in users, with the goal of incorporating them as tools for emotion regulation, an approach used to treat mental disorders. We also conducted pilot experiments to better understand the neural correlates of haptic perception. In an exploratory study, we evaluated Representational Similarity Analysis (RSA) of electroencephalography (EEG) signals as a general method, and we applied our EEG-RSA framework to measure changes in the neural representation of haptic cues after association training, in which subjects learned to map discrete phonemes to multi-sensory haptic cues. Our results suggest that training sharpens the sensory response to haptic cues, such that after training the neural representation of the cues begins to reflect the features of the cues themselves.
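The sketch below illustrates the general RSA computation referenced here: build a representational dissimilarity matrix (RDM) from trial-averaged EEG response patterns to each haptic cue and compare it with a model RDM built from cue features. It is a generic, textbook-style example with simulated data, not the project's EEG-RSA pipeline; the feature set and distance metrics are assumptions made for illustration.

# Generic RSA sketch: compare a neural RDM (from EEG response patterns) with a
# model RDM (from haptic cue features). Simulated data; not the project's pipeline.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

n_cues, n_channels = 8, 32
# Trial-averaged EEG response pattern for each haptic cue (cues x channels), simulated
eeg_patterns = rng.standard_normal((n_cues, n_channels))

# Assumed feature description of each cue, e.g. [vibration amplitude, stretch, squeeze]
cue_features = rng.standard_normal((n_cues, 3))

# Representational dissimilarity matrices in condensed form (pairwise distances between cues)
neural_rdm = pdist(eeg_patterns, metric="correlation")
model_rdm = pdist(cue_features, metric="euclidean")

# RSA statistic: rank correlation between the neural and model RDMs
rho, p = spearmanr(neural_rdm, model_rdm)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")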

Throughout the project, we developed educational materials that teach concepts through haptics, made our tools and results openly accessible (open-source software and open-access publications), and disseminated our findings through publications and presentations.

Last Modified: 10/07/2022
Modified by: Marcia K. O'Malley
