
Off-line localisation of a mobile robot using ultrasonic measurements

Published online by Cambridge University Press:  01 May 2000

Philippe Hoppenot
Affiliation:
CEMIF – Complex Systems Group, University of Evry, 40 rue du Pelvoux, 91020 Evry Cedex, France. E-mail: {hoppenot, ecolle, barat}@cemif.univ-evry.fr
Etienne Colle
Affiliation:
CEMIF – Complex Systems Group, University of Evry, 40 rue du Pelvoux, 91020 Evry Cedex, France. E-mail: {hoppenot, ecolle, barat}@cemif.univ-evry.fr
Christian Barat
Affiliation:
CEMIF – Complex Systems Group, University of Evry, 40 rue du Pelvoux, 91020 Evry Cedex, France. E-mail: {hoppenot, ecolle, barat}@cemif.univ-evry.fr

Abstract

In the context of assistance to disabled people for object manipulation and carrying, this paper focuses on localisation for mobile robot autonomy. In order to respect strong low-cost constraints, the perception system of the mobile robot uses sensors of low metrological quality: an ultrasonic ring and odometry. This poses new problems, particularly for localisation. Among the different localisation techniques, we present only off-line localisation. With such limited perception means, it is necessary to introduce a priori knowledge through sensor and environment models. To solve the localisation problem, the ultrasonic image is segmented by applying the Hough transform, which is well adapted to the characteristics of ultrasonic sensors. The segments are then matched with a model of the room, assumed to be rectangular. Several candidate positions are found. A first sorting step, based on a cost function, reduces the possibilities. The remaining ambiguities are removed by a neural network acting as a classifier that detects the door in the environment. Improvements of the method are proposed to take obstacles and non-rectangular rooms into account. Experimental results show that the localisation operates even in the presence of an obstacle.
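The segmentation step described in the abstract can be illustrated with a minimal sketch, not taken from the paper: a basic Hough transform that groups 2D ultrasonic impact points into candidate wall lines, prior to matching them against a rectangular room model. Function names, resolutions and vote thresholds below are assumptions for illustration only.

# Illustrative sketch (not the authors' implementation): Hough-transform line
# extraction from 2D ultrasonic impact points. Each line is parameterised as
# rho = x*cos(theta) + y*sin(theta); points vote in a (theta, rho) accumulator.
import numpy as np

def hough_lines(points, rho_res=0.05, theta_res=np.deg2rad(2), min_votes=5):
    """Return (theta, rho) pairs of dominant lines among 2D points (metres)."""
    pts = np.asarray(points, dtype=float)
    max_rho = np.hypot(pts[:, 0], pts[:, 1]).max() + rho_res
    thetas = np.arange(0.0, np.pi, theta_res)
    rhos = np.arange(-max_rho, max_rho, rho_res)
    acc = np.zeros((len(thetas), len(rhos)), dtype=int)

    # Voting: each point supports one rho bin for every theta value.
    for x, y in pts:
        r = x * np.cos(thetas) + y * np.sin(thetas)
        idx = np.digitize(r, rhos) - 1
        valid = (idx >= 0) & (idx < len(rhos))
        acc[np.arange(len(thetas))[valid], idx[valid]] += 1

    # Keep accumulator cells with enough supporting points
    # (no non-maximum suppression in this sketch, so nearby cells may repeat).
    ti, ri = np.where(acc >= min_votes)
    return [(thetas[t], rhos[r]) for t, r in zip(ti, ri)]

# Usage: noisy echoes from a wall at x = 1.0 m should yield theta near 0 rad
# and rho near 1.0 m among the returned candidates.
wall = [(1.0 + np.random.normal(0, 0.01), y) for y in np.linspace(-1, 1, 20)]
print(hough_lines(wall))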

Type
Research Article
Copyright
© 2000 Cambridge University Press
