Medical Design and Outsourcing

Can You Feel What I’m Saying?

October 30, 2018 By Rice University

Imagine the panic. Fire alarms blare. Smoke fills the room, and you’re left only with the sense of touch, feeling desperately along walls as you try to find the doorway.

Now imagine technology guiding you by sense of touch. Your smartwatch, alerted by the same alarms, begins “speaking” through your skin, giving directions with coded vibrations, squeezes and tugs with meanings as clear as spoken words.

That scenario could play out in the future thanks to technology under development in the laboratory of Rice mechanical engineer Marcia O’Malley, who has spent more than 15 years studying how people can use haptic sense to interact with technology—be it robots, prosthetic limbs or stroke-rehabilitation software.

“Skin covers our whole body and has many kinds of receptors in it, and we see that as an underutilized channel of information,” said O’Malley, director of the Rice Robotics Initiative and Rice’s Mechatronics and Haptic Interfaces Laboratory (MAHI).

Emergency situations like the fire scenario described above are just one example. O’Malley said there are many “other situations where you might not want to look at a screen, or you already have a lot of things displayed visually. For example, a surgeon or a pilot might find it very useful to have another channel of communication.”

With new funding from the National Science Foundation, O’Malley and Stanford University collaborator Allison Okamura will soon begin designing and testing soft, wearable devices that allow direct touch-based communication from nearby robots. The funding, made possible by the National Robotics Initiative, is geared toward developing new forms of communication that bypass visual clutter and noise to convey information quickly and clearly.

“Some warehouses and factories already have more robots than human workers, and technologies like self-driving cars and physically assistive devices will make human-robot interactions far more common in the near future,” said O’Malley, Rice’s Stanley C. Moore Professor of Mechanical Engineering and professor of both computer science and electrical and computer engineering.

Soft, wearable devices could be part of a uniform, like a sleeve, glove, watchband or belt. By delivering a range of haptic cues, such as a hard or soft squeeze or a stretch of the skin in a particular direction and place, O’Malley said, it may be possible to build a significant “vocabulary” of sensations that carry specific meanings.
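
As a purely illustrative sketch of what such a cue vocabulary could look like in software, the snippet below maps a few invented cues to invented meanings; none of these pairings, nor the function name, come from the Rice or Stanford work.

```python
# Hypothetical cue-to-meaning table for robot-to-human haptic signaling.
# Every cue and every meaning here is an assumption made up for illustration.
CUE_MEANINGS = {
    "hard squeeze": "stop, obstacle ahead",
    "soft squeeze": "robot passing nearby",
    "skin stretch to the left": "robot approaching from your left",
    "skin stretch to the right": "robot approaching from your right",
}

def describe(cue: str) -> str:
    """Translate a felt cue into the message it is meant to carry."""
    return CUE_MEANINGS.get(cue, "unknown cue")

print(describe("soft squeeze"))  # -> robot passing nearby
```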

“I can see a car’s turn signal, but only if I’m looking at it,” O’Malley said. “We want technology that allows people to feel the robots around them and to clearly understand what those robots are about to do and where they are about to be. Ideally, if we do this correctly, the cues will be easy to learn and intuitive.”

For example, in a study presented this month at the International Symposium on Wearable Computers (ISWC) in Singapore, MAHI graduate student Nathan Dunkelberger showed that users needed less than two hours of training to learn to “feel” most words that were transmitted by a haptic armband. The MAHI-developed “multi-sensory interface of stretch, squeeze and integrated vibrotactile elements,” or MISSIVE, consists of two bands that fit around the upper arm. One of these can gently squeeze, like a blood-pressure cuff, and can also slightly stretch or tug the skin in one direction. The second band has vibrotactile motors—the same vibrating alarms used in most cellphones—at the front, back, left and right sides of the arm.
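
The article does not describe MISSIVE’s control software, but one way a single multi-sensory cue might be represented is sketched below; the MissiveCue class, its field names and its value ranges are assumptions for illustration, not the lab’s actual interface.

```python
from dataclasses import dataclass, field
from typing import Set

# A rough, assumed software model of one MISSIVE cue: a squeeze level and a
# skin-stretch direction for the first band, plus the set of vibrotactile
# motors (front, back, left, right) to fire on the second band.
@dataclass
class MissiveCue:
    squeeze: float = 0.0               # 0.0 (off) to 1.0 (firm, cuff-like squeeze)
    stretch_direction: str = "none"    # "none", "clockwise" or "counterclockwise" skin tug
    vibration_sites: Set[str] = field(default_factory=set)  # subset of the four motor positions

    def is_valid(self) -> bool:
        """Check that the cue only asks for sensations the two bands can produce."""
        return (0.0 <= self.squeeze <= 1.0
                and self.stretch_direction in ("none", "clockwise", "counterclockwise")
                and self.vibration_sites <= {"front", "back", "left", "right"})

# Example: a gentle squeeze combined with vibration at the front and left of the arm.
cue = MissiveCue(squeeze=0.4, vibration_sites={"front", "left"})
assert cue.is_valid()
```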

Using these cues in combination, MAHI created a vocabulary of 23 of the most common vocal sounds for English speakers. These sounds, called phonemes, are combined to make words. For example, the words “ouch” and “chow” contain the same two phonemes, “ow” and “ch,” in a different order. O’Malley said communicating with phonemes is faster than spelling words letter by letter, and subjects don’t need to know how a word is spelled, only how it’s pronounced.
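
A minimal sketch of that encoding idea, assuming invented cue descriptions (the article does not say which MISSIVE sensation stands for which phoneme):

```python
# Each phoneme gets its own haptic cue; a word is played as an ordered
# sequence of cues rather than being spelled letter by letter.
# The cue descriptions below are invented for illustration only.
PHONEME_CUES = {
    "ow": "squeeze + front vibration",
    "ch": "skin stretch + rear vibration",
    # ... the study covered 23 of the most common English phonemes
}

# The same phonemes in a different order make different words.
WORDS = {
    "ouch": ["ow", "ch"],
    "chow": ["ch", "ow"],
}

def render(word: str) -> list:
    """Return the ordered haptic cues that would be played for a known word."""
    return [PHONEME_CUES[p] for p in WORDS[word]]

print(render("ouch"))  # ['squeeze + front vibration', 'skin stretch + rear vibration']
print(render("chow"))  # the same two cues, in the opposite order
```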

Rice University graduate students Jenny Sullivan (left) and Nathan Dunkelberger demonstrate the MISSIVE haptic armband in the Mechatronics and Haptic Interfaces Laboratory. (Photo by Jeff Fitlow/Rice University)

Dunkelberger said English speakers use 39 phonemes, but for the proof-of-concept study, he and colleagues at MAHI used 23 of the most common. In tests, subjects were given limited training—just 1 hour, 40 minutes—which involved hearing the spoken phoneme while also feeling it displayed by MISSIVE. In later tests, subjects were asked to identify 150 spoken words consisting of two to six phonemes each. Those tested got 86 percent of the words correct.

“What this shows is that it’s possible, with a limited amount of training, to teach people a small vocabulary of words that they can recall with high accuracy,” O’Malley said. “And there are definitely things we could optimize. We could make the cues more salient. We could refine the training protocol. This was our prototype approach, and it worked pretty well.”

In the NSF project, she said, the team will focus not on conveying words but on conveying non-verbal information.

“There are many potential applications for wearable haptic feedback systems to allow for communication between individuals, between individuals and robots, or between individuals and virtual agents like Google Maps,” O’Malley said. “Imagine a smartwatch that can convey a whole language of cues to you directly, and privately, so that you don’t have to look at your screen at all!”

The ISWC study was supported by Facebook. Additional study co-authors include Jenny Sullivan, Joshua Bradley, Nickolas Walling, Indu Manickam, Gautam Dasarathy and Richard Baraniuk, all of Rice; and Ali Israr, Frances Lau, Keith Klumb, Brian Knott and Freddy Abnousi, all of Facebook.
