Laparoscopic Suture Gestures Recognition via Machine Learning: A Method for Validation of Kinematic Features Selection

University of Málaga
Medical Robotics Laboratory
Graphical abstract of the paper

Our approach uses the laparoscopic suturing manoeuvre with data from the JIGSAWS dataset, divided into the gestures shown above, to execute and validate a Feature Selection method with which to estimate the significance of each kinematic variable in the recognition task. The method is based on training and evaluating different configurations of Machine Learning models, mainly MLP and HMM.

Abstract

In minimally invasive surgery, the integration of robotics has been crucial, with a current focus on developing collaborative algorithms to reduce surgeons’ workload. Effective human-robot collaboration requires robots to perceive surgeons’ gestures during interventions in order to provide appropriate assistance. Research on this task has used both image data, mainly with Deep Learning and Convolutional Neural Networks, and kinematic data extracted from the surgeons’ instruments, processing kinematic sequences with Markov models, Recurrent Neural Networks and even unsupervised learning techniques. However, most studies that develop recognition models from kinematic data do not analyse the significance of each kinematic variable in the recognition task, an analysis that would allow informed decisions when training simpler models and when choosing the sensor systems of deployment platforms. To that end, this work models the laparoscopic suturing manoeuvre as a set of simpler gestures to be recognized and, applying the ReliefF algorithm to the JIGSAWS dataset’s kinematic data, presents a significance study of the different kinematic variables. To validate this study, three classification models based on the multilayer perceptron and on Hidden Markov Models have been trained using both the complete set of variables and a reduced selection that includes only the most significant ones. The results show that the aperture angle and orientation of the surgical tools retain enough information about the chosen gestures that accuracy never varies between equivalent models by more than 5.84%.
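To illustrate the feature-ranking step, the following is a minimal pure-Python sketch of a simplified ReliefF: for each instance, features that differ little among same-class neighbours (hits) and a lot among other-class neighbours (misses) receive high weights. The toy data (an "informative" feature versus a noise feature) and all names are illustrative assumptions, not the paper's actual JIGSAWS variables or implementation.

```python
import random

def relieff(X, y, k=3):
    """Simplified ReliefF: weight each feature by (average diff to the
    k nearest misses) minus (average diff to the k nearest hits),
    accumulated over all instances. Higher weight = more relevant."""
    n, d = len(X), len(X[0])
    # Per-feature value range, to normalise differences to [0, 1].
    ranges = []
    for j in range(d):
        col = [x[j] for x in X]
        span = max(col) - min(col)
        ranges.append(span if span > 0 else 1.0)

    def diff(j, a, b):
        return abs(a[j] - b[j]) / ranges[j]

    def dist(a, b):
        return sum(diff(j, a, b) for j in range(d))

    weights = [0.0] * d
    for i in range(n):
        hits = sorted((p for p in range(n) if p != i and y[p] == y[i]),
                      key=lambda p: dist(X[i], X[p]))[:k]
        misses = sorted((p for p in range(n) if y[p] != y[i]),
                        key=lambda p: dist(X[i], X[p]))[:k]
        for j in range(d):
            weights[j] -= sum(diff(j, X[i], X[p]) for p in hits) / (n * len(hits))
            weights[j] += sum(diff(j, X[i], X[p]) for p in misses) / (n * len(misses))
    return weights

# Toy example: feature 0 separates the two gesture classes,
# feature 1 is pure noise, so w[0] should clearly exceed w[1].
random.seed(0)
X = [[0.1 + 0.05 * random.random(), random.random()] for _ in range(20)] + \
    [[0.9 + 0.05 * random.random(), random.random()] for _ in range(20)]
y = [0] * 20 + [1] * 20
w = relieff(X, y)
print(w)
```

The full ReliefF additionally averages over randomly sampled instances and weights misses by class priors; this sketch keeps only the core hit/miss contrast that produces the ranking.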

Results of the Feature Selection

Results of the feature selection process
Kinematic variables considered in the Feature Selection process

Accuracies of different model configurations

Difference in accuracies with the complete and reduced sets of variables
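The validation protocol (train equivalent models on the complete and on the reduced variable sets, then compare their accuracies) can be sketched with a toy stand-in classifier. Here a 1-nearest-neighbour classifier replaces the paper's MLP/HMM models, and the synthetic data with two informative and three noise features is an illustrative assumption:

```python
import random

def nn_accuracy(train_X, train_y, test_X, test_y, cols):
    """1-NN accuracy using only the feature columns in `cols`
    (a stand-in for the paper's MLP/HMM classifiers)."""
    def dist(a, b):
        return sum((a[j] - b[j]) ** 2 for j in cols)
    correct = 0
    for x, label in zip(test_X, test_y):
        nearest = min(range(len(train_X)), key=lambda i: dist(train_X[i], x))
        correct += train_y[nearest] == label
    return correct / len(test_X)

# Synthetic stand-in data: features 0-1 are informative, 2-4 are noise.
random.seed(1)
def sample(c):
    base = [c, 1 - c]
    return ([b + 0.3 * random.random() for b in base]
            + [random.random() for _ in range(3)])

X = [sample(i % 2) for i in range(200)]
y = [i % 2 for i in range(200)]
train_X, train_y, test_X, test_y = X[:150], y[:150], X[150:], y[150:]

acc_full = nn_accuracy(train_X, train_y, test_X, test_y, cols=range(5))
acc_reduced = nn_accuracy(train_X, train_y, test_X, test_y, cols=[0, 1])
print(acc_full, acc_reduced)
```

If the reduced set retains the discriminative information, the two accuracies stay close; in the paper this gap never exceeds 5.84% between equivalent models.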

BibTeX


@ARTICLE{10799090,
  author={Herrera-López, Juan M. and Galán-Cuenca, Álvaro and Reina, Antonio J. and García-Morales, Isabel and Muñoz, Víctor F.},
  journal={IEEE Access},
  title={Laparoscopic Suture Gestures Recognition via Machine Learning: A Method for Validation of Kinematic Features Selection},
  year={2024},
  volume={12},
  number={},
  pages={190470-190486},
  keywords={Surgery;Kinematics;Needles;Hidden Markov models;Robots;Data models;Laparoscopes;Data mining;Image recognition;Vectors;Feature selection;hidden Markov models;laparoscopic suturing;neural networks;surgical gestures recognition;surgical robotics},
  doi={10.1109/ACCESS.2024.3516949}}