Humanoids look like us, act somewhat like us, and are perhaps becoming smarter.
Nonetheless, we think of humanoids as soulless.
What they lack are genuine feelings of guilt, sadness, happiness, pain and suffering, the kind we see in movies like RoboCop.
Until now, or at least until the first robots that can feel pain. And pain is the flip side of feelings like joy or contentment.
In 2017, a team of engineers at the University of Minnesota developed a 3D printed stretchable “skin,” a fabric embedded with electronic sensors splayed out much like our nervous system.
A team of researchers at Osaka University in Japan is now working on synthetic skin that could one day help robots “feel” pain, and in turn empathize with their human companions.
The tech works by embedding sensors in soft, artificial skin that can distinguish gentle touch from more “painful” sensations, like being hit.
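The details of the Osaka team’s signal processing are not described here, but the basic idea of sorting sensor readings into gentle versus painful contact can be sketched with a simple threshold scheme. The thresholds and units below are hypothetical, chosen only for illustration.

```python
# Hypothetical thresholds, in arbitrary pressure units, separating
# "no contact", "gentle touch", and "painful" impact.
TOUCH_THRESHOLD = 0.5
PAIN_THRESHOLD = 5.0

def classify_contact(pressure: float) -> str:
    """Map a raw skin-sensor pressure reading to a contact category."""
    if pressure < TOUCH_THRESHOLD:
        return "none"
    if pressure < PAIN_THRESHOLD:
        return "gentle"
    return "pain"

readings = [0.1, 1.2, 7.8]
print([classify_contact(r) for r in readings])  # ['none', 'gentle', 'pain']
```

A robot’s control loop could then route the “pain” category to an expressive response, which is roughly what the Affetto head described below does with its facial expressions.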
Reported at the annual meeting of the American Association for the Advancement of Science, robots equipped with this skin could potentially signal emotions. Called an artificial “pain nervous system” by Minoru Asada, a member of the research team, this small development could ultimately lead to robots experiencing pain the way real people do.
The Japanese team has already developed an unsettlingly realistic robotic child’s head that changes facial expressions in reaction to touch and pain signals from the synthetic skin. Called “Affetto”, it has been shown to reliably pick up on a range of touch sensations.
According to neuroscientist Kingson Man of the University of Southern California, the skin, being soft rather than rigid, should allow for the “possibility of engagement in versatile and truly intelligent ways.”
Asada hopes this development could open the door for robots to recognize pain in others, a vitally important skill for robots designed to care for people, such as the elderly.
Electronic skin that will allow robots to feel heat, cold and pain could be “life changing” for people with prostheses and paralysis.
Professor Gordon Cheng led a project at the Technical University of Munich to cover a robot with 1,260 small hexagonal plates, giving it an electronic skin.
Individual sensor cells placed in a honeycomb arrangement on the upper body, arms, legs and soles of the feet of the robot meant it could measure proximity, pressure, temperature and acceleration.
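The Munich skin, as described, is essentially 1,260 identical cells, each reporting four quantities. A minimal sketch of that data model might look like the following; the field names, units, and values are assumptions for illustration, not the project’s actual interface.

```python
from dataclasses import dataclass

@dataclass
class SkinCell:
    """One hexagonal plate of the electronic skin (hypothetical model)."""
    proximity: float      # distance to nearest object, e.g. in cm
    pressure: float       # contact force, e.g. in newtons
    temperature: float    # surface temperature, e.g. in deg C
    acceleration: float   # local acceleration magnitude, e.g. in m/s^2

# The full skin is then a collection of cells, one per plate.
skin = [SkinCell(10.0, 0.0, 22.5, 9.8) for _ in range(1260)]
print(len(skin))  # 1260
```

Arranging the cells in a honeycomb, as the project did, maximizes coverage of a curved body surface while keeping each cell’s neighborhood uniform.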
A combined sense of the internal and the external is completely lost to robots, which generally rely on computer vision or surface mechanosensors to track their movements and their interaction with the outside world.
What if, instead, we could give robots an artificial nervous system?
A team led by Dr. Robert Shepherd at Cornell University did just that, with a seriously clever twist. Rather than mimicking the electric signals in our nervous system, his team turned to light. By embedding optical fibers inside a 3D printed stretchable material, the team engineered an “optical lace” that can detect pressure changes smaller than a fraction of a pound and pinpoint their location to a spot half the width of a tiny needle.
The invention isn’t just an artificial skin. Instead, the delicate fibers can be distributed both inside a robot and on its surface, giving it both a sense of tactile touch and—most importantly—an idea of its own body position in space.
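The core trick behind an optical sensing lattice like this is that pressing on a fiber deforms it and attenuates the light reaching a detector, so locating a touch reduces to finding which detector’s reading dropped most from its baseline. The sketch below is a toy version of that idea; the function, detector layout, and threshold are all hypothetical, not the Cornell team’s implementation.

```python
def locate_press(baseline, current, min_drop=0.05):
    """Return the index of the detector with the largest intensity
    drop relative to baseline, or None if no drop exceeds min_drop."""
    drops = [b - c for b, c in zip(baseline, current)]
    best = max(range(len(drops)), key=lambda i: drops[i])
    return best if drops[best] >= min_drop else None

baseline = [1.00, 1.00, 1.00, 1.00]   # light intensity with no contact
current  = [1.00, 0.62, 0.98, 1.00]   # detector 1 sees attenuated light
print(locate_press(baseline, current))  # 1
```

Because the fibers can run through the robot’s interior as well as along its surface, the same readout gives both touch sensing and a crude sense of body position, which is what the article means by proprioception.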