South Korean researchers develop robotic fingers with pressure sensitivity
By Lee Byung-chul, Yeom Hyun-a

The robotic finger, jointly developed by ETRI and Wonik Robotics. / ETRI
South Korean researchers have developed a robotic finger that can sense the pressure of an object it grips in real time, mimicking the sensitivity of a human finger. This advancement comes as robots are increasingly employed in manufacturing and service sectors, helping address safety concerns caused by inaccurate sensor feedback.
Kim Hye-jin, a senior researcher in the Intelligent Sensor Technology Laboratory at the Electronics and Telecommunications Research Institute (ETRI), and her team, together with collaborators from Wonik Robotics, announced on March 26 the development of a robotic finger that leverages air pressure to precisely detect pressure from any direction.
This innovation is poised for applications across multiple sectors due to its ability to handle objects of varying hardness. Traditional robotic fingers often struggle with signal distortion during gripping, causing excessive or insufficient force—an issue that limits their ability to perform human-like tasks.
The research team overcame previous technological constraints by incorporating pneumatic-based pressure sensors into 3D robotic fingers, enabling accurate pressure detection from various angles while maintaining dexterity comparable to a human hand.
To further enhance sensor precision, AI technology was integrated. The robotic fingers can evaluate object firmness in real time, with an LED that changes color based on pressure levels, allowing users to visually monitor the operation. Additionally, the fingers support vibration detection and wireless data communication.
Durability issues common in existing tactile sensors were also addressed. By shielding the sensor from direct contact with pressure points, the team significantly extended the device’s lifespan.
Researchers envision the robotic finger performing intricate tasks in manufacturing and service industries, handling various objects, and improving human-robot interaction. Future developments include enabling the system to detect temperature, humidity, light, and ultrasound.
Kim stated, “We have elevated robot-human interaction and established a foundation for deeper integration of robots into our societal and industrial fabric.”