Sensory, Light-Up Skin and the Dawn of Touchy-Feely Robots

Interactive e-skin can detect pressure changes

Image credit: Ali Javey and Chuan Wang / UC Berkeley

It doesn’t sound like much: some material that lights up when you touch it. Don’t we already have that? This morning I touched a button on my microwave to warm a cup of cold coffee, and bam, the insides lit up. My family was in on the touch-sensitive lamp fad, where tapping the lamp’s metal base cycled the power. When I touch my iPhone’s dimmed glass screen, it lights up. We’ve had touch-sensitive material that can trip mechanisms (and turn on or off lights) for decades.

But this is different: the underlying material, made from plastic, is as thin as a piece of paper, thin enough to draw comparisons to human skin and flexible enough to roll into a cylinder, like a rolled-up newspaper. Nor are we talking mere binary touch, where an embedded sensor registers only an on/off state: gently touch this new material, with its network of sensors, and tiny lights illuminate. Press harder and those lights grow brighter, scaling up in proportion to the pressure you apply.

It’s all the work of engineers at UC Berkeley, led by associate professor of electrical engineering and computer sciences Ali Javey, who’ve designed what they’re calling electronic skin, or e-skin: a 16-by-16-pixel gridlike mesh, each pixel harboring a transistor, an organic LED and a pressure sensor.
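The grid's behavior is easy to picture in code. Here's a minimal, purely illustrative Python sketch of the pressure-to-brightness idea, assuming a simple linear scale and an arbitrary full-scale sensor reading; the names, ranges and mapping are assumptions for illustration, not details from the Berkeley design:

```python
# Hypothetical sketch of the e-skin concept: a 16x16 grid where each
# "pixel" pairs a pressure sensor with an OLED whose brightness scales
# with applied pressure. All values and names here are illustrative.

GRID = 16
MAX_PRESSURE = 100.0  # assumed full-scale sensor reading


def brightness(pressure: float) -> float:
    """Map a raw pressure reading to an OLED brightness in [0.0, 1.0]."""
    clamped = max(0.0, min(pressure, MAX_PRESSURE))
    return clamped / MAX_PRESSURE


def render(pressure_map):
    """Convert a 16x16 grid of pressure readings into brightness levels."""
    return [[brightness(p) for p in row] for row in pressure_map]


# A gentle touch at one pixel, a firm press at another:
frame = [[0.0] * GRID for _ in range(GRID)]
frame[3][4] = 20.0   # gentle touch -> dim light
frame[8][8] = 90.0   # firm press  -> bright light
levels = render(frame)
```

The point the sketch captures is the analog response: rather than a switch that is on or off, each sensor reading passes through a continuous mapping, so a harder press at `frame[8][8]` yields a proportionally brighter pixel than the light touch at `frame[3][4]`.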

“We are not just making devices, we are building systems,” said Javey, speaking to UC Berkeley’s NewsCenter. “With the interactive e-skin, we have demonstrated an elegant system on plastic that can be wrapped around different objects to enable a new form of human-machine interfacing.”

Robots with tactile sensor systems aren’t news, of course, nor is synthetic skin, which we’ve been experimenting with since at least the 1970s, when Dr. John F. Burke created a material made from plastics, cow tissue and shark cartilage that approximated human skin well enough to be used to treat severe-burn victims. But artificial e-skin with pressure sensitivity is a more recent development. Javey and his team first announced their take on the technology in September 2010: thin rubber sheets harboring semiconductor nanowire transistors that approximated human skin, offering basic pressure-sensitivity functions that provided, according to Javey, “the ability to feel and touch objects.”

It’s a bigger deal than you’d think when talking about human-interactive robots. Imagine yourself without your skin or much of a nervous system, just bones and sinew, capable of detecting whether you’re touching something or not, but little more. Pinching someone would feel the same to your skinless, skeletal fingers whether you applied a little pressure or a lot. If a hulking rescue robot like DARPA’s Atlas (work with me and imagine a future version) comes lumbering into a disaster scenario, say a burning house, to carry you off to safety, it needs to be aware of your relative skeletomuscular fragility, taking care to grip you firmly, but not too firmly.

And that’s just the humanlike-robotics angle. The material has other, more immediate potential uses: light-up household wallpaper, customizable touchscreen displays, interactive surface laminates, even a kind of blood-pressure cuff worn like an armband that keeps tabs on your vitals on the go.

“Integrating sensors into a network is not new, but converting the data obtained into something interactive is the breakthrough,” said Chuan Wang, an assistant professor of electrical and computer engineering at Michigan State University, who worked on the material while a postdoctoral researcher in the UC Berkeley lab. “And unlike the stiff touchscreens on iPhones, computer monitors and ATMs, the e-skin is flexible and can be easily laminated on any surface.”

Next up: the UC Berkeley team wants to engineer the sensors to respond to changes in temperature and light.