Title: A technical framework for human-like motion generation with autonomous anthropomorphic redundant manipulators
Publication Type: Conference Paper
Year of Publication: 2020
Conference Name: 2020 IEEE International Conference on Robotics and Automation (ICRA)
Pagination: 3853-3859
Authors: Averta, G., Caporale, D., Santina, C. D., Bicchi, A., Bianchi, M.
Abstract

The need for users' safety and technology acceptability has increased considerably with the deployment of co-bots that physically interact with humans in industrial settings and for people's assistance. A well-studied approach to meet these requirements is to ensure human-like robot motions. Classic solutions for anthropomorphic movement generation usually rely on optimization procedures, which build upon hypotheses devised from the neuroscientific literature, or capitalize on learning methods. However, these approaches come with limitations, e.g. limited motion variability or the need for high-dimensional datasets. In this work, we present a technique to directly embed human upper limb principal motion modes, computed through functional analysis, in the robot trajectory optimization. We report on the implementation with manipulators with redundant anthropomorphic kinematic architectures - although dissimilar with respect to the human model used for functional mode extraction - via Cartesian impedance control. In our experiments, we show how human trajectories mapped onto a robotic manipulator still exhibit the main characteristics of human-likeness, e.g. low jerk values. We discuss the results with respect to the state of the art, and their implications for advanced human-robot interaction in industrial co-botics and for human assistance.
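
The following is a minimal sketch, not the authors' implementation, of two concepts named in the abstract: extracting principal motion modes from recorded upper-limb joint trajectories (here approximated by PCA on time-discretized trajectories rather than a full functional-analysis basis) and computing a mean squared jerk score of the kind commonly used as a human-likeness indicator. All array shapes, function names, and the synthetic data are illustrative assumptions.

```python
import numpy as np

def principal_motion_modes(trajectories, n_modes=3):
    """Approximate functional principal motion modes via PCA on
    time-discretized joint trajectories (illustrative assumption,
    not the paper's functional-analysis pipeline).

    trajectories : array of shape (n_trials, n_samples, n_joints)
    Returns the mean trajectory and the first n_modes modes,
    each of shape (n_samples, n_joints).
    """
    n_trials, n_samples, n_joints = trajectories.shape
    X = trajectories.reshape(n_trials, n_samples * n_joints)
    mean = X.mean(axis=0)
    # SVD of the centered data matrix yields the principal components.
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    modes = vt[:n_modes].reshape(n_modes, n_samples, n_joints)
    return mean.reshape(n_samples, n_joints), modes

def mean_squared_jerk(trajectory, dt):
    """Mean squared jerk (third time derivative) of a sampled
    trajectory; lower values indicate smoother, more human-like motion."""
    jerk = np.diff(trajectory, n=3, axis=0) / dt**3
    return float(np.mean(jerk**2))

# Usage with synthetic data: 20 reaching trials, 100 samples, 7 joints.
rng = np.random.default_rng(0)
trials = rng.standard_normal((20, 100, 7)).cumsum(axis=1) * 0.01
mean_traj, modes = principal_motion_modes(trials, n_modes=2)
print(mean_squared_jerk(mean_traj, dt=0.01))
```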

URL: https://ieeexplore.ieee.org/abstract/document/9196937
DOI: 10.1109/ICRA40945.2020.9196937
Attachment: 09196937.pdf (3.81 MB)