Abstract
Global localization is a fundamental requirement for a mobile robot. Map-based global localization is a popular technique that gives a precise position by comparing a given geometric map with the current sensory data. However, it is quite time-consuming when 3D range data must be processed for 6D global localization. In contrast, appearance-based global localization, which matches a captured image against recorded images, is simple and suited to real-time processing, but it does not work in the dark or in environments where the lighting conditions change markedly. To cope with these problems, we have proposed a two-step strategy that combines map-based and appearance-based global localization: first, several candidate positions are selected by an appearance-based technique, and then the optimal position is determined by a map-based technique. Instead of camera images, we use reflectance images, which a laser range finder captures as a by-product of range sensing. In this paper, a new technique is proposed that combines this two-step algorithm with a sampling-based approach: a particle filter incorporating odometry data is adopted to track robot positions. The effectiveness of the proposed technique is demonstrated through experiments in real environments.
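The sampling-based tracking stage described above can be illustrated with a generic particle filter: particles initialized at the candidate positions are propagated by odometry, weighted by how well a range measurement matches the map, and resampled. The following is a minimal sketch, not the authors' implementation; the pose representation `(x, y, theta)`, the noise parameters, and the `expected_range_fn` map-lookup hook are all illustrative assumptions.

```python
import math
import random

def predict(particles, d_trans, d_rot, trans_noise=0.05, rot_noise=0.02):
    """Propagate each particle (x, y, theta) with a noisy odometry increment."""
    out = []
    for x, y, th in particles:
        th2 = th + d_rot + random.gauss(0.0, rot_noise)
        d = d_trans + random.gauss(0.0, trans_noise)
        out.append((x + d * math.cos(th2), y + d * math.sin(th2), th2))
    return out

def weight(particles, measured_range, expected_range_fn, sigma=0.2):
    """Gaussian likelihood of a range measurement under each particle's pose.

    expected_range_fn(pose) is a placeholder for the map-based prediction of
    the range reading (e.g., ray casting into the geometric map)."""
    w = []
    for p in particles:
        err = measured_range - expected_range_fn(p)
        w.append(math.exp(-0.5 * (err / sigma) ** 2))
    s = sum(w) or 1.0
    return [wi / s for wi in w]  # normalized so the weights sum to 1

def resample(particles, weights):
    """Low-variance (systematic) resampling: draw n particles in one sweep."""
    n = len(particles)
    step = 1.0 / n
    r = random.uniform(0.0, step)
    c, i, out = weights[0], 0, []
    for m in range(n):
        u = r + m * step
        while u > c and i < n - 1:
            i += 1
            c += weights[i]
        out.append(particles[i])
    return out
```

In the two-step scheme, the filter would be seeded with particles clustered around the appearance-based candidate positions rather than spread uniformly over the map, which is what keeps the map-based refinement tractable.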