This paper presents a fast approach to edge-based self-localization in RoboCup. The vision system extracts edges between the field and the field lines, borders, and goals following a grid-based approach, without processing whole images. These edges are then employed for the self-localization of the robot. Both image processing and self-localization run in real-time on a Sony Aibo, i.e., at the frame rate of the camera. The localization method was evaluated using a laser range sensor at the field border as a reference system.
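The grid-based edge extraction described above can be sketched as follows: instead of processing every pixel, only pixels along sparsely spaced scanlines are sampled, and a point is flagged wherever the intensity jumps sharply (e.g. between the green field and a white line). This is a minimal illustrative sketch, not the paper's implementation; the function name and the `step` and `threshold` parameters are hypothetical.

```python
from typing import List, Tuple

def grid_edge_points(image: List[List[int]], step: int = 8,
                     threshold: int = 40) -> List[Tuple[int, int]]:
    """Scan the image along a sparse grid of vertical scanlines and
    return (x, y) points where intensity changes sharply, i.e.
    candidate edge pixels (hypothetical helper, not the paper's code).

    `image` is a row-major grid of grayscale values; `step` is the
    horizontal spacing of the scanlines."""
    edges = []
    height = len(image)
    width = len(image[0]) if height else 0
    for x in range(0, width, step):      # visit only every step-th column
        for y in range(1, height):       # walk down the scanline
            if abs(image[y][x] - image[y - 1][x]) > threshold:
                edges.append((x, y))
    return edges

# Tiny synthetic example: dark "field" with one bright horizontal "line".
img = [[30] * 16 for _ in range(8)]
img[4] = [220] * 16  # bright row simulates a field line
print(grid_edge_points(img))  # → [(0, 4), (0, 5), (8, 4), (8, 5)]
```

Because only a fraction of the pixels are visited, this style of scanning is what makes frame-rate processing feasible on constrained hardware such as the Aibo.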
Röfer, T., & Jüngel, M. (2004). Fast and robust edge-based localization in the Sony Four-Legged Robot League. In Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science) (Vol. 3020, pp. 262–273). Springer-Verlag. https://doi.org/10.1007/978-3-540-25940-4_23