A joint research team of three universities has developed an artificial sensory interface system that mimics human skin and nerves, the Korea Advanced Institute of Science and Technology (KAIST) said in a news release Monday.

From left, Professors Park Seung-joon of the Korea Advanced Institute of Science and Technology, Chun Seong-woo of Korea University, and Kim Jong-seok of Hanyang University have developed an artificial sensory system that can mimic human skin and nerves. (KAIST)

According to KAIST, artificial sensory systems, which are used in virtual and augmented reality, the metaverse, artificial skin for burn patients, and robotic prostheses, have been difficult to build because of the theoretical and engineering complexity involved in realizing them.

Fully realizing an artificial sensory system has been especially difficult because humans sense touch by combining information from various types of tactile receptors, which respond to different stimuli such as pressure and vibration.

To solve the problem, the research team fabricated a nanoparticle-based composite tactile sensor and connected it to a signal conversion system modeled on actual neural patterns. By combining these two technologies, the team implemented an artificial sensory interface system that closely mimics the human tactile recognition process.

Professors Park Seung-joon of the Department of Bio and Brain Engineering at KAIST, Chun Seong-woo at Korea University, and Kim Jong-seok at Hanyang University led the study.

The research team first produced an electronic skin made of a piezoelectric material and a piezoresistive material.

Through an appropriate combination of nanoparticles, the sensor can simultaneously reproduce the behavior of the slowly adapting mechanoreceptors that detect pressure in the skin and the fast adapting mechanoreceptors that sense vibration, KAIST said. The electric potential generated by the sensor is then converted into a pattern resembling the actual sensory signal by a circuit system the team built.
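To give a rough sense of this dual-channel conversion, the sketch below maps a sustained-pressure reading to a slowly adapting (SA)-like spike train and a vibration reading to a fast adapting (FA)-like one. It is a minimal illustration, not the team's published circuit: the channel names, gains, and Poisson-style rate coding are assumptions chosen for demonstration only.

```python
# Illustrative sketch (not the team's actual circuit): converting two sensor
# channels into receptor-like firing patterns. The channel names, thresholds,
# and rate-coding scheme are assumptions for demonstration only.
import numpy as np

def receptor_like_spikes(pressure, vibration, dt=0.001,
                         sa_gain=200.0, fa_gain=150.0, seed=0):
    """Return SA- and FA-like spike trains from normalized sensor signals.

    pressure  : piezoresistive channel, tracks sustained force (SA-like input)
    vibration : piezoelectric channel, tracks transients/vibration (FA-like input)
    """
    rng = np.random.default_rng(seed)
    # SA receptors fire roughly in proportion to sustained pressure.
    sa_rate = sa_gain * np.clip(pressure, 0.0, 1.0)
    # FA receptors respond to changes, i.e. vibration and transients.
    fa_rate = fa_gain * np.clip(np.abs(vibration), 0.0, 1.0)
    # Poisson-style rate coding: spike when a random draw falls under rate*dt.
    sa_spikes = rng.random(pressure.shape) < sa_rate * dt
    fa_spikes = rng.random(vibration.shape) < fa_rate * dt
    return sa_spikes.astype(int), fa_spikes.astype(int)

# Example: 1 s of a press-and-hold with a 50 Hz vibration superimposed.
t = np.arange(0, 1.0, 0.001)
pressure = np.clip(t * 5, 0, 1) * (t < 0.8)            # ramp up, release at 0.8 s
vibration = 0.3 * np.sin(2 * np.pi * 50 * t) * (t < 0.8)
sa, fa = receptor_like_spikes(pressure, vibration)
print(f"SA spikes: {sa.sum()}, FA spikes: {fa.sum()}")
```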

To simulate the in-vivo situation as closely as possible, the team extracted actual sensory nerves, measured the signals evoked by various kinds of stimulation, and modeled the resulting patterns as functions, it added.

After applying the system to an animal model, the research team confirmed that the signals generated by the artificial sensory system were transmitted without distortion in the living body and reproduced biosensory phenomena such as muscle reflexes.

The researchers also brought the sensory system, built with a fingerprint-like structure, into contact with more than 20 types of fabric. They confirmed that deep learning technology could not only classify the materials' textures with more than 99 percent accuracy but also predict textures much as humans do, based on the learned signals.
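For readers curious what such a texture classifier might look like in code, the following is a minimal sketch assuming two-channel tactile signal windows and 20 fabric classes; the architecture, window length, and class count are assumptions, not the team's published model.

```python
# Illustrative sketch only: a small 1-D convolutional classifier over tactile
# signal windows, in the spirit of the texture-recognition experiment.
import torch
import torch.nn as nn

class TextureClassifier(nn.Module):
    def __init__(self, n_channels=2, n_classes=20):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),           # collapse the time axis
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):                      # x: (batch, channels, time)
        return self.classifier(self.features(x).squeeze(-1))

# Example forward pass on a batch of 8 windows of SA/FA-like signals.
model = TextureClassifier()
signals = torch.randn(8, 2, 1000)              # placeholder sensor windows
logits = model(signals)                        # (8, 20) class scores
print(logits.argmax(dim=1))
```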

"This study is meaningful in that it is the world's first realization of a human-like sensory system based on pattern learning of actual neural signals," Professor Park said. "Through this research, it will become possible to realize more realistic senses in the future. We also expect that the biosignal simulation technique used in the study will create greater synergy when combined with various types of other sensory systems in the human body."

The results of the research were published in Nature Electronics.
