Vision-Based Mobile Robot Self-localization and Mapping System for Indoor Environment

Authors

  • Lei Lei Tun Shwe, Ph.D. Candidate, Department of Mechatronics Engineering, Mandalay Technological University, Myanmar
  • Wut Yi Win, Professor and Head, Department of Mechatronics Engineering, Mandalay Technological University, Myanmar

Keywords

Artificial landmarks, Canny edge detection, Camera, Kalman filter, Encoders, Localization, Mapping.

Abstract

Localizing accurately while concurrently building a map of the environment is a key capability of a mobile robot system. In this work, the robot performs localization and mapping using artificial landmarks and a map-based approach: it builds a map of the environment while continuously determining its own location within that map. The system estimates the robot's position in indoor environments using a camera, three ultrasonic sensors, and wheel encoders. The main contributions of this paper are reduced computational time and improved mapping with a map-based system. Self-localization in an indoor environment is achieved through the construction of a map from sensor data and the recognition of artificial landmarks; the vision-based localization benefits from fusion with the ultrasonic sensors. From captured images, the system detects landmarks using the Canny edge detector and a chain-code approximation algorithm, which represents each landmark's contour by its edge points. A Kalman filter then estimates the position and orientation of the robot from the relative distances to walls and artificial landmarks. The resulting system maps an indoor environment and localizes with respect to that map in real time using artificial landmarks and sensors.
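The Kalman-filter pose estimation described in the abstract can be illustrated with a minimal one-dimensional sketch. This is an illustrative example, not the paper's implementation: the state is the robot's distance travelled along a corridor, the encoders supply the motion input, and an ultrasonic sensor measures the remaining distance to a wall at an assumed known position (here 10 m); the noise variances are likewise assumed for illustration.

```python
import numpy as np

WALL = 10.0      # assumed wall position along the corridor (m)
Q = 0.05 ** 2    # process noise variance (encoder slip), illustrative value
R = 0.20 ** 2    # measurement noise variance (ultrasonic), illustrative value

def kf_step(x, P, u, z):
    """One predict/update cycle of a 1-D Kalman filter."""
    # Predict: integrate the encoder motion, inflate the uncertainty.
    x_pred = x + u
    P_pred = P + Q
    # Update: measurement model z = WALL - x, so the Jacobian is H = -1.
    H = -1.0
    y = z - (WALL - x_pred)       # innovation
    S = H * P_pred * H + R        # innovation variance
    K = P_pred * H / S            # Kalman gain
    x_new = x_pred + K * y
    P_new = (1.0 - K * H) * P_pred
    return x_new, P_new

# Simulate 20 steps of 0.4 m each with noisy encoder and sonar readings.
rng = np.random.default_rng(0)
x_true, x_est, P = 0.0, 0.0, 1.0
for _ in range(20):
    x_true += 0.4
    u = 0.4 + rng.normal(0.0, 0.05)               # noisy encoder increment
    z = (WALL - x_true) + rng.normal(0.0, 0.20)   # noisy ultrasonic range
    x_est, P = kf_step(x_est, P, u, z)

print(f"error = {abs(x_est - x_true):.3f} m, variance = {P:.4f}")
```

After a few cycles the estimate variance settles near sqrt(Q*R), i.e. the filter weighs odometry and range measurements according to their noise levels; the paper's 2-D pose filter with landmark bearings follows the same predict/update structure with vector states.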

References

[1] I. J. Cox and G. Wilfong (Eds.), Autonomous Robot Vehicles, Springer-Verlag, New York, 1990.
[2] D. L. Hall and J. Llinas, “A challenge for the data fusion community I: Research imperatives for improved processing,” in Proc. 7th Natl. Symp. on Sensor Fusion, Albuquerque, NM, Mar. 1994.
[3] J. Leonard and H. Durrant-Whyte, “Mobile robot localization by tracking geometric beacons,” IEEE Transactions on Robotics and Automation, vol. 7, no. 3, pp. 376–382, Jun. 1991.
[4] W. Burgard, D. Fox, D. Hennig, and T. Schmidt, “Estimating the absolute position of a mobile robot using position probability grids,” in Proc. of the National Conference on Artificial Intelligence, 1996.
[5] G. Grisetti, D. L. Rizzini, C. Stachniss, E. Olson, and W. Burgard, “Online constraint network optimization for efficient maximum likelihood map learning,” in Proc. of the IEEE Int. Conf. on Robotics and Automation (ICRA), 2008.
[6] R. Kümmerle, G. Grisetti, H. Strasdat, K. Konolige, and W. Burgard, “g2o: A general framework for graph optimization,” in Proc. of the IEEE Int. Conf. on Robotics and Automation (ICRA), 2011.
[7] A. Howard, M. Mataric, and G. Sukhatme, “Relaxation on a mesh: a formalism for generalized localization,” in Proc. of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), 2001.
[8] C. Rafflin and A. Fournier, “Learning with a friendly interactive robot for service tasks in hospital environments,” Autonomous Robots, vol. 3, pp. 399–414, 1996.
[9] H. Hu and D. Gu, “Landmark-based navigation of autonomous robots in industry,” International Journal of Industrial Robot, vol. 27, no. 6, pp. 458–467, Nov. 2000.
[10] E. Jones, A. Vedaldi, and S. Soatto, “Inertial structure from motion with autocalibration,” in Proc. of the International Conference on Computer Vision – Workshop on Dynamical Vision, 2007.
[11] J. Kelly and G. S. Sukhatme, “Visual-inertial sensor fusion: Localization, mapping and sensor-to-sensor self-calibration,” Int. Journal of Robotics Research, vol. 30, no. 1, pp. 56–79, 2011.
[12] M. C. Silpa-Anan, S. Abdallah, and D. Wettergreen, “Development of autonomous underwater vehicle towards visual servo control,” in Proc. of the Australian Conference on Robotics and Automation, Melbourne, Australia, 2000.
[13] J. L. Crowley, “Mathematical foundations of navigation and perception for an autonomous mobile robot,” in Reasoning with Uncertainty in Robotics, 1995.
[14] I. Loevsky and I. Shimshoni, “Reliable and efficient landmark-based localization for mobile robots,” Robotics and Autonomous Systems, in press, 2010.
[15] J. Gonzalez, A. Stentz, and A. Ollero, “An iconic position estimator for a 2D laser range finder,” in Proc. of the IEEE International Conference on Robotics and Automation, pp. 2646–2651, 1992.
[16] F. Chenavier and J. Crowley, “Position estimation for a mobile robot using vision and odometry,” in Proc. of the IEEE International Conference on Robotics and Automation, pp. 2588–2593, 1992.
[17] K. Sugihara, “Some location problems for robot navigation using a single camera,” Computer Vision, Graphics, and Image Processing, vol. 42, no. 1, pp. 112–129, 1988.
[18] M. Grewal and A. Andrews, Kalman Filtering: Theory and Practice, Prentice-Hall, Englewood Cliffs, New Jersey, 1993.
[19] G. Hager and S. Atiya, “Real-time vision-based robot localization,” IEEE Trans. on Robotics and Automation, vol. 9, no. 6, pp. 785–800, 1993.

Published

2017-12-07

How to Cite

Shwe, L. L. T., & Win, W. Y. (2017). Vision-Based Mobile Robot Self-localization and Mapping System for Indoor Environment. American Scientific Research Journal for Engineering, Technology, and Sciences, 38(1), 306–324. Retrieved from https://asrjetsjournal.org/index.php/American_Scientific_Journal/article/view/3593

Issue

Section

Articles