Vision-based framework to estimate robot configuration and kinematic constraints

Valerio Ortenzi, Naresh Marturi, Michael Mistry, Jeffrey A. Kuo, Rustam Stolkin

Research output: Contribution to journal › Article › peer-review


Abstract

This paper addresses the problem of estimating the configuration of robots that lack proprioceptive sensors and are subject to kinematic constraints while performing tasks. Our work is motivated by the use of unsensored (industrial) manipulators, currently tele-operated in rudimentary ways, in hazardous environments such as nuclear decommissioning. For such robots, basic proprioceptive sensors are often unavailable, and even if radiation-hardened sensors could be retrofitted, such manipulators are typically deployed on a mobile base and equipped with powerful end-effector tools for forceful contact tasks, which significantly perturb the robot base with respect to the scene. This work contributes a step towards enabling advanced control and increased autonomy in nuclear applications, but it could also be applied to mechanically compliant, under-actuated arms and hands, and to soft manipulators. Our proposed framework estimates the robot configuration by casting it as an optimisation problem over visually tracked information; detects contacts during task execution; and triggers an exploration task for each detected kinematic constraint, which is then modelled by comparing observed versus commanded velocity vectors. Unlike previous work, no additional sensors are required. We demonstrate our method on a Kuka iiwa 14 R820, reliably estimating and controlling robot motions, checking our estimates against ground-truth values, and accurately reconstructing kinematic constraints.
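The abstract names two concrete mechanisms: configuration estimation posed as an optimisation over visually tracked information, and constraint modelling from the mismatch between commanded and observed velocities. As a rough illustration only, the Python sketch below applies both ideas to a toy two-link planar arm; the arm model, the least-squares formulation, and every function name here are assumptions made for illustration, not the paper's actual formulation or code.

import numpy as np
from scipy.optimize import least_squares

def forward_kinematics(q, link_lengths=(0.4, 0.3)):
    # Elbow and end-effector positions of a planar two-link arm
    # (a stand-in for the real manipulator's kinematic model).
    l1, l2 = link_lengths
    elbow = np.array([l1 * np.cos(q[0]), l1 * np.sin(q[0])])
    tip = elbow + np.array([l2 * np.cos(q[0] + q[1]),
                            l2 * np.sin(q[0] + q[1])])
    return np.concatenate([elbow, tip])

def estimate_configuration(tracked_points, q_init):
    # Configuration estimation cast as optimisation: find the joint
    # angles whose predicted feature positions best match the
    # visually tracked ones.
    residual = lambda q: forward_kinematics(q) - tracked_points
    return least_squares(residual, q_init).x

def constraint_direction(v_commanded, v_observed, tol=1e-6):
    # Constraint modelling: the component of the commanded velocity
    # that the environment suppressed approximates the constraint
    # normal; returns None if the two velocities agree.
    blocked = v_commanded - v_observed
    norm = np.linalg.norm(blocked)
    return blocked / norm if norm > tol else None

# Example: recover the configuration from (noise-free) tracked points.
q_true = np.array([0.6, -0.4])
tracked = forward_kinematics(q_true)        # stands in for visual tracking
q_est = estimate_configuration(tracked, q_init=np.zeros(2))

# Example: contact with a horizontal surface removes the vertical
# component of the commanded end-effector velocity.
v_cmd = np.array([0.10, 0.05])
v_obs = np.array([0.10, 0.00])
normal = constraint_direction(v_cmd, v_obs)  # approximately [0, 1]

In this toy setting, the blocked component of the commanded velocity points along the constraint normal, which mirrors the abstract's idea of comparing observed versus commanded velocity vectors to model a detected contact constraint.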
Original language: English
Journal: IEEE/ASME Transactions on Mechatronics
Publication status: Published - 17 Aug 2018

Keywords

  • robots
  • robot kinematics
  • robot vision systems
