Grasp Sense: Researchers teach robots to touch

Touching and grasping objects is a surprisingly complex process, and one at which contemporary robots are still clumsy. Principal investigator Jukka Häkkinen, PhD, and postdoctoral researcher Jussi Hakala, DSc (Tech), have developed an imaging method for measuring human touch.

“When humans grasp something, a very complicated subliminal calculation takes place about which muscles are needed in the process, as well as which neural pathways are used to control them and at what intensity. In the field of psychology, these brain mechanisms have been extensively studied,” says Jukka Häkkinen, a psychologist and principal investigator at the University of Helsinki. He is one half of the pair behind the Grasp Sense method.

With the help of thermal and depth cameras, Grasp Sense measures the heat signature that human touch leaves on the surface of objects. The data collected on human touch can be utilised in robotics. Thus far, grasping and touching objects has posed a challenge in the development of robots intended for use in, for example, logistics and healthcare.
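To illustrate the general idea in code, the minimal sketch below detects pixels in a thermal frame that have warmed noticeably above a pre-touch baseline and back-projects them into 3D using a registered depth frame. It uses synthetic data, and the function names, temperature threshold and camera intrinsics are assumptions for the example; it is not the Grasp Sense implementation.

# Illustrative sketch only, not the Grasp Sense pipeline.
# Assumes synthetic thermal and depth frames; with real hardware these would come
# from a thermal camera and a depth camera registered to the same viewpoint.
import numpy as np

def touch_mask(thermal_frame, baseline, delta_kelvin=0.5):
    """Mark pixels whose temperature rose noticeably above the pre-touch baseline."""
    return (thermal_frame - baseline) > delta_kelvin

def touch_points_3d(mask, depth_frame, fx=525.0, fy=525.0, cx=160.0, cy=120.0):
    """Back-project touched pixels to 3D points with a pinhole camera model
    (fx, fy, cx, cy are assumed intrinsics of the registered depth camera)."""
    v, u = np.nonzero(mask)
    z = depth_frame[v, u]
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.column_stack([x, y, z])

# Synthetic example: a 240x320 scene at 295 K with a warm fingertip-sized patch.
baseline = np.full((240, 320), 295.0)
thermal = baseline.copy()
thermal[100:110, 150:160] += 2.0          # heat left behind by a fingertip
depth = np.full((240, 320), 0.8)          # object surface 0.8 m from the camera

mask = touch_mask(thermal, baseline)
points = touch_points_3d(mask, depth)
print(f"{mask.sum()} touched pixels, first 3D contact point: {points[0]}")

In this toy example the warm patch shows up as a cluster of 3D contact points on the object's surface, which is the kind of touch data a grasping robot could learn from.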

“Robots need to know the object’s exact three-dimensional structure, material and weight distribution, whereas humans can grasp intuitively. Our goal is to transfer human skills to robots,” says Jussi Hakala, a postdoctoral researcher and the other developer of the Grasp Sense method. Hakala’s research has focused on 3D imaging and display technologies.

A central problem in robotics is whether a robot can maintain its hold on an object while avoiding crushing it. For the care robots of the future, this balance is becoming increasingly important.

“Their grip must be pleasant, unwavering and reliable,” notes Häkkinen.

Origins in eye tracking

Earlier, Häkkinen conducted a research project funded by the Academy of Finland that focused on measuring eye movements during grasping tasks.

“I examined how various grasping tasks affect where eye movements are directed. The term 'just-in-time selection' is used in connection with eye movements: the eyes focus on collecting exactly the information required for the next 500 milliseconds,” explains Häkkinen.

This led to the idea of also measuring the hand movements made during grasping tasks.

“Video-based methods are not accurate enough, so my first thought was to use finger paint,” Häkkinen laughs.

Later, he turned to the heat signatures left by touch and began to consider the many possible applications of the method.

Support for designing everyday objects and promoting hospital hygiene

In addition to robotics, the Grasp Sense method could be applied to the design of various everyday objects. Touch data might be useful in designing objects that must be pleasant, ergonomic and precise to use.

According to Häkkinen, the same technology could also be used to model hospital hygiene by installing thermal cameras on hospital ceilings. The resulting models would reveal the most touch-intensive surfaces, making it easier and more effective to keep them clean.
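As a rough illustration of that idea, the sketch below accumulates per-frame touch detections from a simulated ceiling-mounted thermal camera into a touch-frequency map whose highest values mark the most touch-intensive spots. The frame size, threshold and synthetic data are assumptions for the example, not details of the Grasp Sense system.

# Illustrative sketch only: build a touch-frequency map from simulated thermal frames.
import numpy as np

rng = np.random.default_rng(0)
height, width = 240, 320
baseline = np.full((height, width), 294.0)      # empty-ward temperature baseline
touch_counts = np.zeros((height, width), dtype=int)

for _ in range(1000):                           # e.g. one thermal frame per second
    frame = baseline + rng.normal(0.0, 0.1, size=(height, width))
    if rng.random() < 0.10:                     # a frequently touched spot (say, a door handle)
        frame[116:124, 156:164] += 2.0
    if rng.random() < 0.02:                     # an occasional touch somewhere else
        r, c = rng.integers(0, height - 8), rng.integers(0, width - 8)
        frame[r:r + 8, c:c + 8] += 2.0
    touch_counts += (frame - baseline) > 0.5    # count pixels that look freshly touched

# The most touch-intensive surface is where the count is highest.
row, col = np.unravel_index(np.argmax(touch_counts), touch_counts.shape)
print("Most-touched region is around pixel:", (row, col))

In a real deployment, the highest-count regions of such a map would point cleaning staff to the surfaces that are handled most often.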

Contact details:

Website: graspsense.com
Principal investigator Jukka Häkkinen, jukka.hakkinen@helsinki.fi, tel: +358 50 483 9483
Postdoctoral Researcher Jussi Hakala, jussi.h.hakala@helsinki.fi
Project Manager Kaisu Sutinen, kaisu.sutinen@helsinki.fi

Video: Helena Hiltunen