A novel information fusion method for vision perception and location of intelligent industrial robots


Abstract

An improved SURF (Speeded-Up Robust Features) algorithm is proposed to address the long processing time and low positioning precision of industrial robots. The determinant of the Hessian matrix is used to extract feature points from the target image, and a multi-scale spatial pyramid is constructed. The location and scale of each feature point are determined by neighbourhood non-maximum suppression. The orientation of each feature point is then encoded into an oriented descriptor using binary robust independent elementary features (BRIEF). Progressive sample consensus (PROSAC) performs a second, precise matching step and removes mismatched points based on the Hamming distance. An affine transformation model is then established to describe the relationship between the template and target images, and the centroid coordinates of the target are obtained from this affine transformation. Comparative tests demonstrate that the proposed method effectively improves the recognition rate and positioning accuracy of industrial robots: the average processing time is less than 0.2 s, the matching accuracy is 96%, and the positioning error of the robot is less than 1.5 mm. The proposed method therefore has practical application value.
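The abstract describes a feature-matching pipeline: keypoint detection, binary descriptors matched by Hamming distance, robust removal of mismatches, and an affine model used to locate the target centroid. The sketch below illustrates this flow with standard OpenCV building blocks; it is not the authors' code. File paths, the use of ORB (a FAST detector with BRIEF-style binary descriptors) in place of the paper's Hessian-based SURF detector, and RANSAC in place of PROSAC are all assumptions made for portability.

```python
import cv2
import numpy as np

# Minimal sketch of the matching/localisation pipeline outlined in the
# abstract, using common OpenCV primitives. Paths and parameter values are
# illustrative assumptions. SURF itself requires opencv-contrib
# (xfeatures2d); ORB is used here because it also produces binary
# descriptors that are matched with the Hamming distance, as in the paper.
template = cv2.imread("template.png", cv2.IMREAD_GRAYSCALE)  # assumed path
target = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)       # assumed path

detector = cv2.ORB_create(nfeatures=1000)
kp_t, des_t = detector.detectAndCompute(template, None)
kp_s, des_s = detector.detectAndCompute(target, None)

# Binary descriptors -> Hamming distance; crossCheck gives a coarse first filter.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_t, des_s), key=lambda m: m.distance)

src = np.float32([kp_t[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp_s[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# The paper removes mismatches with PROSAC; here RANSAC is substituted.
# (Recent OpenCV builds also expose cv2.USAC_PROSAC as a method flag.)
A, inliers = cv2.estimateAffine2D(
    src, dst, method=cv2.RANSAC, ransacReprojThreshold=3.0
)

# Map the template centre through the 2x3 affine model to locate the target
# centroid in the scene image.
h, w = template.shape
centre = np.array([w / 2.0, h / 2.0, 1.0])
cx, cy = A @ centre
print(f"Estimated target centroid: ({cx:.1f}, {cy:.1f}) px")
```

The estimated affine matrix plays the same role as the transformation model in the paper: once it is known, the template centroid is mapped into the scene to give the pick-up coordinates for the robot.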

Citation (APA)

Jin, S., Lin, Q., Yang, J., Bie, Y., Tian, M., & Li, Z. (2019). A novel information fusion method for vision perception and location of intelligent industrial robots. Elektronika Ir Elektrotechnika, 25(5), 4–10. https://doi.org/10.5755/j01.eie.25.5.20587
