We present an approach to dextrous robot grasping that combines a purely tactile-driven reactive algorithm with an implicit representation of grasp experience, yielding a method that can handle arbitrary, partially unknown grasp situations, i.e. situations in which object shape and position are only vaguely known. During the grasp movement, the obtained contact information is used to dynamically adapt the grasping control by targeting the best-matching posture from the experience base. Thus, the robot recalls and actuates a grasp that it has already performed successfully in a similar tactile context.
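As a rough illustration of this retrieval step, the sketch below selects the stored grasp posture whose recorded tactile context lies nearest to the current one. The function name, the (tactile, posture) pair format, and the Euclidean matching criterion are assumptions for illustration only, not the matching scheme actually used in the paper.

    import numpy as np

    def best_matching_posture(tactile_context, experience_base):
        # experience_base: list of (tactile_vector, grasp_posture) pairs
        # recorded from previously successful grasps (hypothetical format).
        distances = [np.linalg.norm(tactile_context - tactile)
                     for tactile, _ in experience_base]
        # Recall the posture whose tactile context best matches the current one.
        _, posture = experience_base[int(np.argmin(distances))]
        return posture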
To represent the experience efficiently, we introduce the Grasp Manifold, assuming that grasp postures form a smooth manifold in hand posture space. We present a simple way of approximating Grasp Manifolds using Self-Organising Maps (SOMs) and study the smoothness of the represented manifolds as well as their robustness against clustered training data.
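As a sketch of how such an approximation could be obtained, the following minimal Self-Organising Map fits a 2-D grid of nodes to a set of grasp posture vectors, so that the node weights sample the underlying posture manifold. The grid size, learning-rate and neighbourhood schedules, and all names are illustrative assumptions, not the authors' actual setup.

    import numpy as np

    def train_som(postures, grid=(10, 10), epochs=50,
                  lr0=0.5, sigma0=3.0, seed=0):
        # Fit a 2-D SOM to grasp postures (N x D array); each node's
        # weight vector becomes one sample of the approximated manifold.
        rng = np.random.default_rng(seed)
        rows, cols = grid
        dim = postures.shape[1]
        weights = rng.standard_normal((rows, cols, dim)) * 0.1
        # Grid coordinates of each node, used for the neighbourhood function.
        coords = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                      indexing="ij"), axis=-1)
        total = epochs * len(postures)
        step = 0
        for _ in range(epochs):
            for x in rng.permutation(postures):
                # Decay learning rate and neighbourhood width over time.
                frac = step / total
                lr = lr0 * (1.0 - frac)
                sigma = sigma0 * (1.0 - frac) + 0.5
                # Best-matching unit: node whose weight is closest to x.
                d = np.linalg.norm(weights - x, axis=-1)
                bmu = np.unravel_index(np.argmin(d), d.shape)
                # Gaussian neighbourhood on the grid pulls nearby nodes toward x.
                g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=-1)
                           / (2.0 * sigma ** 2))
                weights += lr * g[..., None] * (x - weights)
                step += 1
        return weights

After training, each node's weight vector can serve as a candidate grasp posture, and recalling a grasp reduces to finding the best-matching node for the current context.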