Estimation of Tool-Tissue Forces in Robot-Assisted Minimally Invasive Surgery
The advent of robots in surgery dates to the early 1990s: the first AESOP (a voice-controlled camera holder) prototype was used clinically in 1993, and in 1994 AESOP became the first surgical robot cleared by the US FDA. Since then, robot-assisted minimally invasive surgery (RAMIS) has gained immense popularity over the past two decades. Statistics show that more than 50% of errors made by surgical trainees are due to excessive force, which can lead to irreparable tissue damage. RAMIS reduces such force-related errors significantly and is far more accurate than open surgery or conventional laparoscopic minimally invasive surgery. Yet while the approach spares the patient excessive trauma, it has a key limitation: the surgeon does not manipulate the tool directly, so the tactile information that would normally come from tool-tissue contact is lost. Relaying tool-tissue force information back to the surgeon is known as haptic feedback, or force feedback. Because tool-tissue interaction forces strongly affect the efficacy of the surgery, it is imperative for RAMIS to overcome this obstacle in order to reach its full potential.
This information gap is the main reason RAMIS has not yet reached its apogee: the lack of haptic feedback, tactile sensing, and tool-tissue force sensing still casts doubt on the method. Several attempts have been made over the years to sensorize surgical instruments, but even the most developed sensing systems have drawbacks. They require altering the standard shape of the surgical tool (which may affect its handling and possibly the tool-tissue interaction forces themselves), are unsuitable in size or mounting location, demand heavy filtering of noisy signals, or cover an unsuitable measurement range. They also make the instruments bulky and complicate sterilization. The size of force sensors and incision ports, high-temperature sterilization, and the disposable nature of surgical tools have so far prevented the integration of end-effector/tissue force sensing in RAMIS.
Researchers soon concluded that adding sensors was not the way forward.
There have been numerous proposed solutions to this problem, but none has provided a complete remedy. A recent study, however, offers an answer to this long-standing issue by proposing a new algorithm that estimates tool-tissue interaction forces in robot-assisted minimally invasive surgery without any external force sensing (i.e., sensor-less). The authors apply a neural network to estimate the tool-tissue force during a grasp manoeuvre in RAMIS, with the eventual goal of providing haptic feedback to surgeons. A distinctive feature of the approach is that it treats the whole mechanism as a black box, so no detailed analysis of the gripper mechanism is required. The method relies on the current drawn by the gripper-actuating motors, together with a careful examination of the distinct stages of a grasping motion. In contrast to existing algorithms in the literature, the model requires no external sensors or equipment such as vision systems, and it predicts both small and large forces with sufficient accuracy.
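To make the idea concrete, the sketch below shows one way such a sensor-less estimator could be set up: a small feedforward network that maps motor-current features recorded during the grasp phase to an estimated grasp force. The feature set, network size, and placeholder data are illustrative assumptions only, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): a small feedforward network that maps
# motor-current features to an estimated grasp force. Feature choice, network
# size, and the synthetic placeholder data are assumptions for illustration.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical training features: motor current (A), its short-time derivative,
# and gripper jaw angle (rad) sampled during the grasp phase of the manoeuvre.
n = 2000
current = rng.uniform(0.0, 1.5, n)
d_current = rng.normal(0.0, 0.1, n)
jaw_angle = rng.uniform(0.0, 0.6, n)
X = np.column_stack([current, d_current, jaw_angle])

# Placeholder "measured" grasp force (N) standing in for reference sensor data
# that would normally be collected once, offline, to train the network.
y = 4.0 * current + 1.5 * d_current - 2.0 * jaw_angle + rng.normal(0.0, 0.05, n)

scaler = StandardScaler().fit(X)
model = MLPRegressor(hidden_layer_sizes=(32, 32), activation="relu",
                     max_iter=2000, random_state=0)
model.fit(scaler.transform(X), y)

# Online use: one new current/angle sample -> one force estimate for haptic display.
sample = scaler.transform([[0.8, 0.02, 0.3]])
print(f"estimated grasp force: {model.predict(sample)[0]:.2f} N")
```

The key design point this sketch reflects is that once trained, the estimator needs only quantities the robot already measures (motor current and joint state), so no force sensor has to survive sterilization or fit through an incision port.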
The model was tested both offline and online to verify its feasibility as a feedback system. The results showed that the algorithm can estimate the tool-tissue interaction force during a grasp online, and could therefore enable haptic feedback in robotic surgery. It predicts both large and relatively small forces, which was a weakness of the algorithm developed by Zhao and Nelson (2015). Their model also used motor current and was the first to tackle the problem in RAMIS without sensors, but it relied on Gaussian Process Regression (GPR), which fails when the target force falls outside the range of the training data. The new work handles this out-of-range problem by replacing GPR with a neural network. The approach was inspired by Yu et al. (2018), who proposed a cable-tension-based method to estimate external forces and used a back-propagation (BP) network to estimate resistance parameters such as friction, and by Aviles et al. (2015), whose seminal work combined vision-based methods with recurrent neural networks to estimate tool-tissue interaction forces.
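The toy comparison below illustrates the extrapolation issue described above: a GPR with a standard RBF kernel (kept at a fixed, short length scale here) trained only on small forces collapses toward its prior mean outside the training range, whereas a small neural network tends to continue the learned current-force trend. The data and models are synthetic stand-ins under simplified assumptions, not the original experiments.

```python
# Toy illustration (not the original experiments) of why RBF-kernel GPR struggles
# outside its training range while a neural network can still follow a roughly
# linear current-force trend. All data here are synthetic.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)

# Train only on small forces (motor current 0-1 A); test on a much larger current.
X_train = rng.uniform(0.0, 1.0, (500, 1))
y_train = 5.0 * X_train.ravel() + rng.normal(0.0, 0.05, 500)

gpr = GaussianProcessRegressor(
    kernel=RBF(length_scale=0.2, length_scale_bounds="fixed"),
    alpha=1e-2,
).fit(X_train, y_train)
mlp = MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000,
                   random_state=1).fit(X_train, y_train)

X_test = np.array([[2.0]])  # current well outside the training range
print("true force   :", 5.0 * X_test[0, 0])      # 10 N under the synthetic model
print("GPR estimate :", gpr.predict(X_test)[0])  # reverts toward the prior mean (~0)
print("MLP estimate :", mlp.predict(X_test)[0])  # typically continues the learned trend
```

This mirrors the qualitative argument in the text: outside the span of its training data the GPR's prediction falls back toward its prior, while the piecewise-linear network extrapolates the trend it learned, which is why the new algorithm can cope with both small and large forces.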