MIT Invents A Tactile Display
Written by Kay Ewbank   
Sunday, 16 June 2013

Researchers at MIT have designed a wearable tactile display that uses vibrations to provide information. This could prove a useful way to provide feedback for those with impaired hearing, or in noisy environments.

Lynette Jones, a senior research scientist in MIT’s Department of Mechanical Engineering, came up with the idea of using the skin as a sensitive medium for communication. Her vibrotactile solution, which can be thought of as a tactile Morse code, combines vibration motors embedded in a belt or jacket with a GPS link.

This could buzz to tell drivers whether to turn left or right, doing away with the need to look at maps. While most in-car GPS systems give spoken directions, a vibrating device would be a good alternative for anyone with impaired hearing, or for use in noisy environments.
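To give a flavour of how such a system might hang together, here is a minimal Python sketch of mapping a GPS bearing onto a left or right buzz. It is purely illustrative: the buzz() driver and the two-motor belt layout are assumptions, not a description of Jones' hardware or software.

```python
# Illustrative only: map a GPS turn direction onto left/right belt motors.
# The buzz() driver below is a hypothetical placeholder, not real hardware code.
import time

LEFT_MOTOR, RIGHT_MOTOR = 0, 1

def buzz(motor: int, seconds: float = 0.3) -> None:
    """Stand-in for whatever actually drives a pancake motor."""
    print(f"motor {motor} vibrating for {seconds}s")
    time.sleep(seconds)

def signal_turn(current_heading: float, target_heading: float) -> None:
    """Buzz the side of the body the wearer should turn towards."""
    delta = (target_heading - current_heading + 180) % 360 - 180  # wrap to (-180, 180]
    if abs(delta) < 10:        # already roughly on course, no cue needed
        return
    buzz(RIGHT_MOTOR if delta > 0 else LEFT_MOTOR)

# Heading north (0 degrees), next waypoint bears 90 degrees: buzz the right side.
signal_turn(0.0, 90.0)
```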

Explaining her rationale for this alternative, haptic approach, Jones says:

“If you compare the skin to the retina, you have about the same number of sensory receptors, you just have them over almost two square meters of space, unlike the eye where it’s all concentrated in an extremely small area. The skin is generally as useful as a very acute area. It’s just that you need to disperse the information that you’re presenting.”

One potential problem highlighted by Jones is varying sensitivity: people differ in how clearly they feel the vibrations. To investigate, Jones built an array that precisely tracks a motor’s vibrations through skin in three dimensions. The array consists of eight miniature accelerometers and a single pancake motor, the type of vibrating motor used in cellphones. She used it to measure motor vibrations at three locations: the palm of the hand, the forearm and the thigh. From her studies with eight healthy participants, Jones found that a motor’s mechanical vibrations through skin drop off quickly at all three locations, within 8 millimeters of where the vibrations originated.
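The paper itself isn’t summarised in code, but the shape of the analysis is easy to picture: accelerometers at known distances from the motor report how strong the vibration still is, and you look for the distance at which it falls below some fraction of its peak. The readings and the 10% threshold below are invented for illustration; only the form of the calculation is meant to match.

```python
# Illustrative sketch of a vibration drop-off calculation. The readings are
# made up; they are not Jones' measurements.

# (distance from the motor in mm, RMS acceleration in m/s^2)
readings = [(2, 1.00), (4, 0.55), (6, 0.28), (8, 0.09),
            (12, 0.05), (16, 0.03), (20, 0.02), (24, 0.015)]

def cutoff_distance(readings, fraction=0.1):
    """First distance at which the amplitude falls below `fraction` of the peak."""
    peak = max(amp for _, amp in readings)
    for dist, amp in sorted(readings):
        if amp < fraction * peak:
            return dist
    return None

print(cutoff_distance(readings))   # -> 8 with these invented numbers
```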

 

 

The test involved fitting the volunteers with a 3-by-3 array of pancake motors at each of the three locations: hand, forearm and thigh. The results showed that while the vibrations fell off sharply beyond 8 millimeters, most people continued to perceive them as far away as 24 millimeters. When asked to identify which motors within an array were vibrating, the volunteers were, as you might expect, most accurate for the array on the palm as opposed to the ones on the forearm or thigh. They were also better at picking out vibrations from the motors at the corners of the array than from the inner ones, leading Jones to conclude:

“there’s no point in making things much more compact, which may be a desirable feature from an engineering point of view, but from a human-use point of view, doesn’t make a difference.”

The fact that the volunteers found it difficult to work out precisely which motors were buzzing may be important, Jones says, if engineers plan to build tactile displays that incorporate different frequencies of vibration. For instance, the difference between two motors, one vibrating slightly faster than the other, may be indistinguishable on certain parts of the skin. Likewise, two motors spaced a certain distance apart may be distinguishable in one area but not another. The research is tackling issues such as:

“Should I have eight motors, or is four enough that 90 percent of the time, I’ll know that when this one’s on, it’s this one and not that one?”

Those sorts of questions need to be tackled in the context of what information you want to present.
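As a purely back-of-the-envelope illustration (not a calculation from the paper), if a vibration can still be felt up to about 24 millimeters from its motor, then motors much closer together than twice that distance risk being confused with their neighbours, which caps how many usefully distinct motors fit on, say, a forearm band:

```python
# Rough illustration only: how many motors fit on a band if each needs to sit
# roughly twice the 24 mm perceptual spread away from its neighbours?

def max_distinct_motors(band_length_mm: float, spread_mm: float = 24.0) -> int:
    spacing = 2 * spread_mm                     # crude "no overlap" spacing
    return int(band_length_mm // spacing) + 1   # motors at 0, spacing, 2*spacing, ...

print(max_distinct_motors(240))   # a 240 mm band -> 6 motors
```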

Jones sees promising applications for wearable tactile displays. In addition to helping drivers navigate, she says tactile stimuli may direct firefighters through burning buildings, or emergency workers through disaster sites. In more mundane scenarios, she says tactile displays may help joggers traverse an unfamiliar city, taking directions from a buzzing wristband, instead of having to look at a smartphone.

The results of Jones’ experiments will appear in the journal IEEE Transactions on Haptics.

 

More Information

MIT News

Related Articles

A sticky touch screen improves interaction

Make Android buzz with MOTIV

 

 
