One of the main problems encountered in manual assembly workstations is human error in performing the operations. Several approaches are currently used to address this problem, such as intensive training of personnel, poka-yoke devices, or invasive sensing systems (e.g. sensing gloves) used to monitor the process and detect incorrect procedures or errors in joining the parts. This paper proposes an innovative system based on the interaction between a force sensor and augmented reality (AR) equipment, used to give the worker the necessary information about the correct assembly sequence and to alert the operator when errors occur. The force sensor, placed under the workbench, monitors the assembly process by collecting force and torque data with respect to an XYZ reference system; a pattern recognition technique allows error identification and the selection of the appropriate recovery procedure. Two AR devices have been tested in this application, a video-mixing spatial display and an optical see-through apparatus, and the pros and cons of the two solutions are compared. The first device includes a CCD camera positioned over the workstation and an LCD display that supports the worker in the correct execution of assembly operations and provides instructions about recovery procedures. The second consists of a head-mounted display (HMD) capable of reflecting projected images in front of the worker's eyes, allowing a real-world view with the superimposition of virtual objects. The CCD camera is also used to identify errors that are not detectable by the force sensor. Finally, a case study concerning a typical assembly procedure is presented and discussed.
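The abstract does not specify which pattern recognition technique is applied to the force/torque data. As a minimal illustrative sketch only, the idea of matching a measured six-component force/torque signature against reference signatures of known outcomes could be realized as nearest-centroid classification; all labels, feature values, and function names below are hypothetical, not taken from the paper.

```python
import numpy as np

# Hypothetical reference signatures (feature vectors [Fx, Fy, Fz, Tx, Ty, Tz])
# for known assembly outcomes. Values are illustrative placeholders.
TEMPLATES = {
    "correct_insertion": np.array([0.5, 0.2, 12.0, 0.1, 0.1, 0.3]),
    "misaligned_part":   np.array([3.5, 2.8,  6.0, 1.2, 1.4, 0.4]),
    "missing_part":      np.array([0.1, 0.1,  1.0, 0.0, 0.0, 0.0]),
}

def classify_operation(sample: np.ndarray) -> str:
    """Return the label of the template closest (Euclidean distance)
    to the measured force/torque vector: nearest-centroid matching,
    one simple form of pattern recognition."""
    return min(TEMPLATES, key=lambda k: np.linalg.norm(sample - TEMPLATES[k]))

def check_step(sample: np.ndarray) -> tuple[str, bool]:
    """Classify a sample and flag whether a recovery procedure is needed."""
    label = classify_operation(sample)
    return label, label != "correct_insertion"
```

For example, `check_step(np.array([3.4, 2.9, 6.2, 1.1, 1.3, 0.5]))` would match the misaligned-part template and trigger a recovery procedure, which the AR device could then display to the worker.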
Mura, M. D., Dini, G., & Failli, F. (2016). An Integrated Environment Based on Augmented Reality and Sensing Device for Manual Assembly Workstations. In Procedia CIRP (Vol. 41, pp. 340–345). Elsevier B.V. https://doi.org/10.1016/j.procir.2015.12.128