The cognitive overload a driver experiences behind the wheel can be overwhelming. Between the devices that navigate, entertain, and connect us, it can be difficult to concentrate on the road. Current GPS systems give drivers visual and auditory feedback, but it isn't always easy to hear and follow directions while keeping your eyes on the road. AT&T Labs researchers are working to integrate tactile feedback into your drive.
How Did the Idea Hatch?
AT&T's "It Can Wait" campaign, which encourages drivers not to text behind the wheel, got researcher Kevin Li thinking about driver safety. He considered different ways to reduce the cognitive overload that drivers experience. In an increasingly mobile world, our senses of sight and hearing are already heavily loaded. Drawing on previous research into using the tactile channel for information delivery, Li realized it made sense to offload some information to our underutilized sense of touch. As he explored potential collaborations, he began working with Anind Dey, an Associate Professor in the Human-Computer Interaction Institute at Carnegie Mellon University who had previously evaluated new automotive interface simulators, and SeungJun Kim, a postdoc working with Dey. Jodi Forlizzi, also of Carnegie Mellon, joined the collaboration that produced a prototype of the haptics-enhanced steering wheel.
About the Project
This research led AT&T to create a haptics-enhanced steering wheel for navigation. When it's time to turn left, for example, the left side of the steering wheel produces a series of vibrations moving counterclockwise. The wheel has 20 actuators that can vibrate in any pattern to indicate directions, which helps remove the distraction that typical navigation screens can cause. Initial research shows that the wheel works well and is less distracting for drivers. Applying different types of vibrations to deliver richer information can help us understand what our connected devices are trying to tell us, without looking.
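To make the idea concrete, here is a minimal sketch of how directional cues might be sequenced across the wheel's 20 actuators. The actuator count comes from the article; the indexing scheme, pulse count, and function names are hypothetical, not AT&T's actual implementation.

```python
# Hypothetical sketch of directional cues on a 20-actuator steering wheel.
# Only the actuator count (20) comes from the article; indexing and timing
# choices here are illustrative assumptions.

NUM_ACTUATORS = 20  # evenly spaced around the wheel rim (per the article)

def turn_pattern(direction, steps=5):
    """Return the ordered actuator indices for a directional cue.

    Assume actuator 0 sits at the top of the wheel and indices increase
    clockwise. A left turn then fires actuators counterclockwise down the
    left side of the rim; a right turn fires them clockwise down the right.
    """
    if direction == "left":
        # counterclockwise from the top: 19, 18, 17, ...
        return [(NUM_ACTUATORS - 1 - i) % NUM_ACTUATORS for i in range(steps)]
    if direction == "right":
        # clockwise from the top: 1, 2, 3, ...
        return [(1 + i) % NUM_ACTUATORS for i in range(steps)]
    raise ValueError("direction must be 'left' or 'right'")

print(turn_pattern("left"))   # [19, 18, 17, 16, 15]
print(turn_pattern("right"))  # [1, 2, 3, 4, 5]
```

Firing the actuators one after another in order, rather than all at once, is what turns a buzz into a felt sense of motion around the rim.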
The haptics team at AT&T Labs is also looking at an array of connected wearable devices that alert users with tactile cues. These aren't limited to phones; they include accessories that connect to the phone via Bluetooth. For example, the team has patented the concept of a connected earring that vibrates when a message is waiting on the smartphone in your purse or pocket. It could also vibrate in different ways to convey more about the message, such as who it's from or whether it's urgent.
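One way to picture "vibrating in different ways" is a lookup from notification type to a distinct pulse pattern. This is a speculative sketch, assuming a simple encoding of pulse counts and durations; none of these pattern choices come from the article or the patent.

```python
# Speculative sketch: mapping notification types to tactile patterns for a
# wearable like the connected earring described in the article. The pulse
# durations (ms) and the categories themselves are illustrative assumptions.

PATTERNS = {
    # (category, urgent) -> list of (vibrate_ms, pause_ms) pulses
    ("message", False): [(100, 0)],                        # one short buzz
    ("message", True):  [(100, 50), (100, 50), (100, 0)],  # three quick buzzes
    ("call", False):    [(300, 0)],                        # one long buzz
    ("call", True):     [(300, 100), (300, 0)],            # two long buzzes
}

def cue_for(category, urgent=False):
    """Look up the tactile pattern for a notification, with a plain
    one-buzz fallback for unknown categories."""
    return PATTERNS.get((category, urgent), [(100, 0)])

print(cue_for("message", urgent=True))  # [(100, 50), (100, 50), (100, 0)]
```

The design idea is that a small, fixed vocabulary of patterns stays learnable: the wearer can tell an urgent message from a routine one by feel alone, without ever pulling out the phone.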
Work continues with collaborators at Carnegie Mellon University on using tactile feedback in the steering wheel to convey more complex information beyond single directional cues. One problem they currently hope to tackle is providing route overview information, not just next-turn directions.
About the Researcher
Kevin Li is a Researcher at AT&T Labs, working on new and innovative ways of interacting with mobile devices. Li is also an Adjunct Professor in the Department of Computer Science at Columbia University. His research encompasses mobile computing, novel interaction techniques, ubiquitous computing, and human-computer interaction. Li received his PhD in Computer Engineering from the University of California, San Diego and his BS in Electrical Engineering and Computer Science from the University of California, Berkeley.