Fauquet-Alekhine, Philippe and Bleuze, Julien (2024) Improving learning in robotics teleoperation: contribution of eye-tracking in digital ethnography. Current Journal of Applied Science and Technology, 43 (7). pp. 128-139. ISSN 2457-1024
Published Version (PDF, 700 kB), available under a Creative Commons Attribution licence.
Abstract
Aims: Digital ethnography has shown its added value for the analysis of work activity, with the aim of improving the methods used to develop the associated competencies. In particular, process-tracing methods based on first-person recording of the activity with a first-person view camera, combined with competencies-oriented and goal-oriented interviews, have demonstrated their effectiveness in medicine, the nuclear industry and education. However, the teleoperation of robots out of sight requires a specific first-person view camera: an eye-tracking device. This is because, during teleoperation, the pilot's head movements are almost non-existent while the eyes move a great deal. Yet the literature contains no analysis of this type of activity using eye-tracking for teleoperation in robotics. The objective of the present article is to fill this gap by presenting a pilot study characterizing the potential contribution of first-person view process tracing, combined with competencies-oriented and goal-oriented interviews, for robotics teleoperation out of sight.

Study Design: The pilot study involved two robot pilots individually performing a teleoperation task out of sight. The pilots were chosen for their difference in teleoperation experience. While performing the activity, they were equipped with an eye-tracking device, enabling the activity to be recorded from a first-person perspective. An interview based on the Square of PErceived Action model (SPEAC model) followed, in order to access what constitutes their competencies.

Place and Duration of Study: The experiments were undertaken in the simulation training center of the Groupe INTRA-Intervention Robotique sur Accidents, in France, during 2023.

Methodology: Two pilots had to individually teleoperate a robot from a control console, using the video feeds transmitted from the cameras on board the robot to several screens placed in front of them. The activity consisted of moving the robot through a maze while carrying a container, picking up a ring from the ground and placing it in the container, and then bringing the whole assembly out of the maze. The activity lasted about 10 to 20 minutes. Each pilot was equipped with an eye-tracking device that recorded their activity for deferred access, in order to identify the knowledge and know-how implemented. The interview was conducted using the SPEAC model. At the end of the interview, a matrix of competencies was built for each pilot. Software processing gave access to quantified data, in particular the visual fixation time each pilot spent taking information from the screens and the control console.

Results: The comparison of the matrices of competencies made it possible to measure the competence gap between an experienced pilot and a novice pilot, as well as to identify knowledge and know-how not yet taught in pilot training. The measurement of fixation times also revealed a difference that appears worth analyzing in more depth in a future study.

Conclusion: The results show that the method applied is well suited to the teleoperation of robots out of sight and provides relevant data to improve training.
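The abstract does not describe the software pipeline used to quantify fixation times, so the sketch below is an illustration only: a minimal Python example, with assumed data structures, area-of-interest names and made-up durations, showing one common way per-area fixation time could be aggregated from an eye-tracker's fixation log, e.g. to compare how long each pilot looked at the video screens versus the control console.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    start_ms: float  # fixation onset, milliseconds from recording start
    end_ms: float    # fixation offset
    aoi: str         # assumed area-of-interest label, e.g. "screens" or "console"

def fixation_time_by_aoi(fixations: list[Fixation]) -> dict[str, float]:
    """Sum fixation durations (in seconds) per area of interest."""
    totals: dict[str, float] = {}
    for f in fixations:
        totals[f.aoi] = totals.get(f.aoi, 0.0) + (f.end_ms - f.start_ms) / 1000.0
    return totals

# Hypothetical fixation logs for two pilots (values invented for the example).
novice = [Fixation(0, 800, "screens"), Fixation(900, 1500, "console"),
          Fixation(1600, 2900, "screens")]
expert = [Fixation(0, 1900, "screens"), Fixation(2000, 2300, "console")]

for name, data in (("novice", novice), ("expert", expert)):
    totals = fixation_time_by_aoi(data)
    grand_total = sum(totals.values())
    shares = {aoi: round(100 * t / grand_total, 1) for aoi, t in totals.items()}
    print(name, totals, shares)  # absolute seconds and percentage share per AOI
```

In practice, commercial eye-tracking software typically exports fixation events (detected from raw gaze samples by velocity or dispersion thresholds) already mapped to areas of interest, so an aggregation of this kind is the last step before comparing pilots.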
| Item Type: | Article |
|---|---|
| Additional Information: | © 2024 The Author(s) |
| Divisions: | Psychological and Behavioural Science |
| Subjects: | B Philosophy. Psychology. Religion > BF Psychology; T Technology; Q Science > QA Mathematics > QA75 Electronic computers. Computer science |
| Date Deposited: | 17 Jul 2024 15:36 |
| Last Modified: | 17 Oct 2024 16:57 |
| URI: | http://eprints.lse.ac.uk/id/eprint/124267 |