Using articulated tools is still a challenging task for autonomous robots. One difficulty is automatically estimating the tool’s kinematic model. This model cannot be obtained from a single passive observation, because some information, such as a rotation axis (hinge), can only be detected while the tool is being used. Inspired by a baby using its hands while playing with an articulated toy, we employ a dual-arm robotic setup and propose an interactive manipulation strategy based on visuo-tactile servoing to estimate the tool’s kinematic model.
In our proposed method, one arm holds the tool’s handle stably while the other arm, equipped with a tactile finger, flips the movable part of the articulated tool. An innovative visuo-tactile servoing controller is introduced to implement the flipping task by integrating visual and tactile feedback in a compact control loop.
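As a rough illustration only (the paper’s controller is not given here), the sketch below shows how such a compact loop might combine the two feedback channels: a tactile term regulates the contact force along the sensed normal, while a visual term slides the fingertip tangentially to drive the flap toward a goal angle. All gains, the setpoint, and the callbacks `get_flap_angle`, `get_contact`, and `send_twist` are hypothetical.

```python
import numpy as np

# Illustrative gains and setpoint; not values from the paper.
K_V = 0.8      # visual gain [1/s]: drives the flap angle toward the goal
K_T = 0.002    # tactile gain [m/(s*N)]: regulates the normal contact force
F_DES = 1.5    # desired normal contact force on the flap [N]

def visuo_tactile_step(goal_angle, get_flap_angle, get_contact, send_twist):
    """One iteration of a combined visuo-tactile servo loop.

    get_flap_angle() -> flap angle from vision [rad], or None when occluded
    get_contact()    -> (normal force [N], contact normal 3-vector) from the finger
    send_twist(v)    -> command a Cartesian fingertip velocity [m/s]
    """
    force, normal = get_contact()
    normal = normal / np.linalg.norm(normal)

    # Tactile term: move along the contact normal to hold the force at F_DES.
    v_tactile = K_T * (F_DES - force) * normal

    # Visual term: slide tangentially to rotate the flap toward the goal angle
    # (assumes the hinge is roughly aligned with the world z-axis).
    angle = get_flap_angle()
    if angle is not None:
        tangent = np.cross(normal, np.array([0.0, 0.0, 1.0]))
        tangent = tangent / np.linalg.norm(tangent)
        v_visual = K_V * (goal_angle - angle) * tangent
    else:
        v_visual = np.zeros(3)  # pure force regulation while the flap is occluded

    send_twist(v_tactile + v_visual)
```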
To deal with the temporary invisibility of the movable part to the camera, a data fusion method that integrates the visual measurement of the movable part with the fingertip’s motion trajectory is used to optimally estimate the orientation of the tool’s movable part.
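One simple way to realize such a fusion, shown here purely as a sketch and not as the paper’s estimator, is a one-dimensional Kalman filter on the flap angle: the fingertip’s arc around the hinge provides the prediction step, and the visual measurement, whenever the flap is visible, provides the correction. The noise values are illustrative.

```python
class FlapAngleFuser:
    """1-D Kalman filter fusing finger-trajectory predictions with vision."""

    def __init__(self, angle0=0.0, var0=0.1, q=1e-4, r=1e-2):
        self.angle, self.var = angle0, var0
        self.q = q  # process noise: uncertainty of the trajectory-based increment
        self.r = r  # measurement noise: uncertainty of the visual angle estimate

    def predict(self, d_angle_from_finger):
        # While in contact, the fingertip's arc implies an incremental
        # flap rotation about the hinge; propagate it and grow uncertainty.
        self.angle += d_angle_from_finger
        self.var += self.q

    def update(self, visual_angle):
        # Correct with the visual measurement in frames where the flap is visible.
        k = self.var / (self.var + self.r)
        self.angle += k * (visual_angle - self.angle)
        self.var *= 1.0 - k
```

In this scheme, `predict()` runs every control cycle while `update()` runs only when vision returns a measurement, so the estimate degrades gracefully during brief occlusions.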
The tool’s key kinematic parameters are estimated by geometric calculation while the finger flips the movable part.
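As a sketch of the geometric idea (not the paper’s exact computation), the hinge axis can be recovered from the tracked flap orientations: each pair of consecutive orientations defines a relative rotation whose axis is the hinge direction, and averaging over the flipping motion gives a robust estimate. The function below is hypothetical and assumes the orientations are expressed in the handle’s frame.

```python
import numpy as np

def estimate_hinge_axis(rotations):
    """Estimate the fixed hinge axis from flap orientations R_0..R_n
    (3x3 rotation matrices in the handle frame) observed during the flip."""
    axes = []
    for R_prev, R_curr in zip(rotations[:-1], rotations[1:]):
        dR = R_curr @ R_prev.T  # relative rotation between consecutive samples
        # Rotation axis from the skew-symmetric part of dR (scaled by 2*sin(theta)).
        w = np.array([dR[2, 1] - dR[1, 2],
                      dR[0, 2] - dR[2, 0],
                      dR[1, 0] - dR[0, 1]])
        if np.linalg.norm(w) > 1e-6:  # skip near-identity increments
            axes.append(w / np.linalg.norm(w))
    axis = np.mean(axes, axis=0)
    return axis / np.linalg.norm(axis)
```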
We evaluate our method by flipping the pivoting cleaning head (flap) of a wiper and estimating the wiper’s kinematic parameters. We demonstrate that the flap of the wiper is flipped robustly, even when the flap is briefly invisible. The estimated orientation of the flap tracks the ground-truth data well, and the kinematic parameters of the wiper are estimated correctly.