Ashraf Aboshosha, Universität Tübingen

Industrial applications require a high-level cognition system to guide mobile robots robustly. This article presents an integrated solution to this task. The proposed system tracks a visual guide robustly and provides an adaptive interaction that maintains the stability of the overall system against fluctuations of internal or external system parameters. Moreover, the guidance system counteracts noise in the vision and sensor inputs. The system relies on biologically inspired sensor integration: the outputs of the vision system and of distributed sensors are fused to obtain high-precision guidance. This study was implemented on the B21-RWI robot platform (Laboratory for Autonomous Mobile Robots, University of Tübingen).
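The abstract does not specify the fusion rule used to combine the vision and distributed-sensor outputs; a minimal sketch of one common approach, inverse-variance-weighted fusion, is shown below. All readings, variances, and names here are hypothetical illustrations, not the system's actual implementation.

```python
# Sketch of inverse-variance-weighted sensor fusion: each sensor's estimate
# is weighted by the inverse of its noise variance, so more precise sensors
# contribute more to the fused result. Values below are hypothetical.

def fuse(estimates):
    """Fuse (value, variance) pairs into one estimate with its variance."""
    inv_vars = [1.0 / var for _, var in estimates]
    total = sum(inv_vars)
    value = sum(v / var for v, var in estimates) / total
    return value, 1.0 / total  # fused value; fused variance shrinks

# Hypothetical readings: distance to the visual guide, in metres.
vision = (2.10, 0.04)   # camera-based estimate, noisier
ranger = (2.00, 0.01)   # range-sensor estimate, more precise

pos, var = fuse([vision, ranger])
```

The fused estimate lands closer to the more precise sensor, and its variance is smaller than either input's, which is one way such fusion yields the higher-precision guidance the abstract describes.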
Duration: 0:00:24