To allow robots of the future to handle delicate objects, researchers have designed artificial skin that can detect touches 1,000 times faster than humans.
While many of today's robots are adept at picking up heavy boxes in large warehouses, scientists are working towards machines that can handle everyday objects not only as well as humans, but better.
Two researchers from the National University of Singapore (NUS) and members of the Intel Neuromorphic Research Community (INRC) have revealed an artificial skin that can detect touch more than 1,000 times faster than the human sensory nervous system and identify the shape, texture and hardness of objects 10 times faster than the blink of an eye.
The artificial skin could be applied to robots working in factories and warehouses to give them greater tactile sensing to identify and grip unfamiliar objects with the right amount of pressure to prevent slipping.
Solving half the puzzle
The researchers believe that the ability to feel and perceive surroundings could also pave the way towards more natural human-robot interaction, such as in elderly care homes or in surgical tasks that can’t be achieved with today’s robots.
The artificial skin’s ability to process sensory data is down to Intel’s Loihi neuromorphic research chip. In their initial experiment, the researchers used a robotic hand fitted with the artificial skin to read braille.
The tactile data was passed to Loihi through the cloud to convert the micro bumps felt by the hand into semantic meaning. Intel’s chip achieved more than 92pc accuracy in classifying the braille letters, while using 20 times less power than a standard von Neumann processor.
“Making an ultra-fast artificial skin sensor solves about half the puzzle of making robots smarter,” said assistant professor Benjamin Tee from the NUS Department of Materials Science and Engineering.
“They also need an artificial brain that can ultimately achieve perception and learning as another critical piece in the puzzle. Our unique demonstration of an AI skin system with neuromorphic chips such as the Intel Loihi provides a major step forward towards power efficiency and scalability.”
Promise of neuromorphic systems
Following initial testing, the NUS researchers attempted to improve robotic perception capabilities by combining both vision and touch data in a spiking neural network. To do this, they tasked a robot to classify various opaque containers holding differing amounts of liquid using sensory inputs from the artificial skin and an event-based camera.
The results of these tests, presented at the Robotics: Science and Systems conference this week, showed that the combination of event-based vision and touch using a spiking neural network led to 10pc greater accuracy in object classification compared to a vision-only system.
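The accuracy gain from fusing two senses has a simple statistical intuition: two independent, noisy estimates of the same quantity average out to a less noisy one. The minimal Python sketch below illustrates this, using synthetic data rather than the team’s actual spiking-network code; the four "fill levels", the noise magnitude, and the nearest-centroid classifier are all illustrative assumptions.

```python
# Illustrative sketch (not the NUS/Intel implementation): fusing two
# noisy sensory channels can beat either channel alone at classification.
# The class label stands in for a container's fill level; each "sensor"
# reads the true level plus Gaussian noise (noise level is assumed).
import random

random.seed(42)
CLASSES = [0, 1, 2, 3]   # hypothetical fill levels
NOISE = 1.2              # assumed per-sensor noise standard deviation
N = 2000                 # number of simulated trials

def classify(reading):
    """Nearest-centroid decision: snap the reading to the closest level."""
    return min(CLASSES, key=lambda c: abs(reading - c))

vision_correct = fused_correct = 0
for _ in range(N):
    level = random.choice(CLASSES)
    vision = level + random.gauss(0, NOISE)  # noisy camera estimate
    touch = level + random.gauss(0, NOISE)   # noisy tactile estimate
    if classify(vision) == level:
        vision_correct += 1
    # Averaging independent channels shrinks the effective noise by sqrt(2)
    if classify((vision + touch) / 2) == level:
        fused_correct += 1

print(f"vision only : {vision_correct / N:.2%}")
print(f"vision+touch: {fused_correct / N:.2%}")
```

Running the sketch, the fused classifier reliably outscores the vision-only one, mirroring in miniature the roughly 10pc improvement the researchers reported.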
Harold Soh, also of NUS, said the team was “excited” by the results. “They show that a neuromorphic system is a promising piece of the puzzle for combining multiple sensors to improve robot perception,” he said.
“It’s a step toward building power-efficient and trustworthy robots that can respond quickly and appropriately in unexpected situations.”