Tactile-Based Task Definition Through Edge Contact Formation Setpoints for Object Exploration and Manipulation
In autonomous robot tasks involving physical contact with the environment, dexterous manipulation remains challenging. Force-control approaches and force sensors are typically used to govern the robot's actions. However, the spatial resolution of force sensors is limited when an object is explored or manipulated by tracking salient tactile features, such as edges, while touching its surface. Instead, exploration and manipulation can be implemented via tactile servoing approaches that use the parameters of those edges. These parameters, obtained from an array of tactile sensors, are used to generate setpoints that drive a robot arm to minimize the gap between the desired and current parameters of a given edge. This letter describes a new common strategy for defining tactile setpoint signals for tactile servoing approaches in order to implement different contact-based tasks. These setpoints represent artificial constraints that comply with the natural constraints on the force and position of the robot end-effector imposed by the physical contact between the robot and the object. Sequences of setpoints for three different tasks are given as examples: alignment with an object, exploration of a linear object with variable stiffness, and manipulation of ellipsoidal objects by rolling. The tasks are validated in real experiments using a KUKA LWR 4+ robot arm and a Weiss WTS-0614 piezoresistive tactile sensor. The arm controller runs at 1 kHz and the tactile servoing control at 100 Hz, limited by the sampling rate of the sensing array.
keywords: Robotics, Sensor Data
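The tactile servoing idea summarized above — generating setpoints and driving the arm to close the gap between desired and measured edge parameters — can be sketched as a simple proportional loop. Everything below is an illustrative assumption (the edge-parameter layout, the numeric values, and the gains), not the controller described in the letter:

```python
# Illustrative sketch, not the letter's actual controller: one proportional
# tactile-servoing step that maps the error between desired and measured
# edge parameters to an end-effector velocity command.

def tactile_servo_step(desired, measured, gains):
    """Return v_i = K_i * (s_des_i - s_i): velocity proportional to edge error."""
    return [k * (d - m) for k, d, m in zip(gains, desired, measured)]

# Hypothetical edge parameters: [x (mm), y (mm), orientation (rad), force (N)]
desired  = [7.0, 3.0, 0.0, 1.5]   # setpoint (artificial constraint)
measured = [5.5, 3.4, 0.2, 1.0]   # estimated from the tactile sensing array
gains    = [0.5, 0.5, 0.8, 0.3]   # proportional gains, chosen arbitrarily

v = tactile_servo_step(desired, measured, gains)
```

In a real setup this step would run at the tactile sampling rate (100 Hz here), with the resulting velocity command forwarded to the faster 1 kHz arm controller.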