In a new paper published in Science Robotics, Intuitive Surgical CEO Gary Guthart and lead author Ken Goldberg explain the concept of augmented dexterity in surgical robotics and how it could lead to more efficient procedures.
Goldberg is a professor in UC Berkeley’s Department of Industrial Engineering and Operations Research and director of the university’s AUTOLab, which uses Intuitive’s da Vinci Research Kit (dVRK) for experiments in surgical robotics and telerobotics. He co-founded Ambi Robotics and Jacobi Robotics as well as the Berkeley AI Research (BAIR) Lab and IEEE Transactions on Automation Science and Engineering.
Guthart, the paper’s co-author, leads the surgical robotics field’s top company, which launched the latest-generation da Vinci 5 this year. The company is the world’s 19th-largest medical device manufacturer, according to Medical Design & Outsourcing’s 2024 Medtech Big 100 ranking by revenue.
Previously: The Intuitive da Vinci 5’s top design changes: ‘This is groundbreaking for robotic surgery’
What is augmented dexterity?
“A surgeon’s dexterity often separates the good surgeons from the great ones,” the pair wrote in their paper. “Fortunately, emerging advances in artificial intelligence and robotics now have potential to narrow this gap. … Augmented dexterity has potential to elevate good surgeons to the level of the best surgeons, which could support safer, faster, and more reliable surgery.”
Goldberg and Guthart define augmented dexterity as “systems where surgical subtasks are controlled by a robot under the close supervision of a human surgeon who is ready to take over at a moment’s notice.”
Those subtasks could include suturing, debridement and resection. Augmented reality overlays on the surgeon’s view of the operating site could let the surgeon guide a surgical robot as it places sutures or removes debris from a wound. The same technologies could also enable telesurgery and telementoring, expanding access to minimally invasive surgical robotics procedures for patients and doctors in remote areas.
Precision and latency are two major challenges for surgical robotics developers.
“Surgical robots are imprecise because the motors that drive them must remain outside the patient’s body. The metal cables that adjust tool positions are long and prone to backlash,” the pair wrote. “Human surgeons learn to compensate intuitively for these challenges, but we found that the dVRK was challenging to control accurately without human supervision. We recently showed that a deep neural network can learn how to compensate so that the robot could perform a common surgical training task called ‘peg transfer’ at accuracy and speeds on par with (and in some cases better than) an expert surgeon.”
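To give a rough sense of how that kind of learned compensation works, here is a minimal sketch in Python (illustrative only, not the model described in the paper): a small network is fit to the gap between commanded and tracked tool positions, and its prediction is subtracted from future commands. The names, shapes and synthetic data are assumptions made for the example.

```python
# Minimal sketch (not the authors' model): learn a residual correction that maps
# commanded tool positions to the offset the cable drive introduces, then
# pre-distort future commands to cancel it. Assumes a log of (commanded, measured)
# positions collected with an external tracker; names, shapes and the synthetic
# data below are illustrative.
import torch
import torch.nn as nn

class BacklashCompensator(nn.Module):
    """Small MLP that predicts the positional error for a commanded position."""
    def __init__(self, dim: int = 3, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, dim),  # predicted offset: measured - commanded
        )

    def forward(self, commanded: torch.Tensor) -> torch.Tensor:
        return self.net(commanded)

def train(model, commanded, measured, epochs: int = 200, lr: float = 1e-3):
    """Fit the model to the offset the cables introduce (measured - commanded)."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    target_offset = measured - commanded
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(commanded), target_offset)
        loss.backward()
        opt.step()
    return model

def compensated_command(model, desired: torch.Tensor) -> torch.Tensor:
    """Subtract the predicted error so the tool lands closer to the desired pose."""
    with torch.no_grad():
        return desired - model(desired)

if __name__ == "__main__":
    # Synthetic data standing in for tracker logs: a small hysteresis-like error.
    commanded = torch.rand(1024, 3)
    measured = commanded + 0.01 * torch.sin(6.0 * commanded)
    model = train(BacklashCompensator(), commanded, measured)
    print(compensated_command(model, torch.tensor([[0.5, 0.5, 0.5]])))
```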
Goldberg and his researchers in UC Berkeley’s AUTOLab have already developed suturing software with an algorithm that “analyzes a photo of the laceration, then computes and displays an overlay showing the precise placement of each suture to optimally distribute forces across the laceration.” However, they note it’s still a challenge to manage the surgical thread’s slack and transfer the surgical needle from one gripper tool to another.
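As a simplified illustration of what such an overlay computes (a hypothetical sketch, not the AUTOLab algorithm), the snippet below spaces sutures at equal arc length along a laceration centerline and orients each stitch perpendicular to the wound so tension is spread evenly; the function names and parameters are made up for the example.

```python
# Hypothetical sketch, not the AUTOLab algorithm: given a laceration centerline
# extracted from an image, place sutures at equal arc-length spacing and orient
# each one perpendicular to the local tangent so tension is spread along the
# wound. A real system must also handle segmentation, depth, tissue properties
# and thread management.
import numpy as np

def place_sutures(centerline: np.ndarray, spacing: float, bite: float):
    """centerline: (N, 2) points along the laceration (e.g., in mm).
    spacing: arc length between sutures; bite: half-width of each stitch.
    Returns (entry, exit) point pairs straddling the wound."""
    seg = np.diff(centerline, axis=0)
    seg_len = np.linalg.norm(seg, axis=1)
    arc = np.concatenate([[0.0], np.cumsum(seg_len)])   # cumulative arc length
    stations = np.arange(spacing / 2, arc[-1], spacing)  # where sutures go

    sutures = []
    for s in stations:
        i = int(np.clip(np.searchsorted(arc, s) - 1, 0, len(seg) - 1))
        t = (s - arc[i]) / seg_len[i]
        point = centerline[i] + t * seg[i]                # point on the wound
        tangent = seg[i] / seg_len[i]
        normal = np.array([-tangent[1], tangent[0]])      # perpendicular to wound
        sutures.append((point + bite * normal, point - bite * normal))
    return sutures

# Example: a gently curved laceration.
x = np.linspace(0, 50, 100)
wound = np.stack([x, 5 * np.sin(x / 10)], axis=1)
for entry, exit_pt in place_sutures(wound, spacing=8.0, bite=4.0):
    print(np.round(entry, 1), "->", np.round(exit_pt, 1))
```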
The pair also detailed how augmented dexterity could apply to debridement, or cleaning a wound of damaged tissue or foreign objects.
“Debridement can be very tedious and time-consuming; it’s very easy for a surgeon to overlook fragments, which can lead to infections,” they wrote. “We conjecture that augmented dexterity could be applied to debridement by using a surgical camera and robot system to systematically identify and remove fragments under close supervision of the surgeon who is ready to take over if the system misidentifies a discolored human tissue as a foreign fragment.”
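The supervision pattern they describe, in which the robot proposes each removal and the surgeon can approve it, skip a false positive or take over entirely, might look something like the simplified sketch below; the detector and removal functions are stand-ins, not real perception or control code.

```python
# Illustrative sketch of the supervision pattern: the robot proposes each fragment
# removal, and the surgeon can approve it, skip a false positive (for example,
# discolored tissue) or take over at any point. detect_fragments() and remove()
# are stand-ins for real perception and control.
from dataclasses import dataclass

@dataclass
class Fragment:
    position: tuple    # estimated location in the camera frame
    confidence: float  # detector confidence that this is a foreign fragment

def detect_fragments():
    """Stub detector; a real system would run on the surgical camera feed."""
    return [Fragment((12.0, 8.5), 0.97), Fragment((3.2, 4.1), 0.55)]

def remove(fragment: Fragment):
    print(f"Robot removing fragment at {fragment.position}")

def supervised_debridement(approve, min_confidence: float = 0.8):
    """approve(fragment) -> 'yes' | 'skip' | 'takeover', supplied by the surgeon."""
    for frag in detect_fragments():
        if frag.confidence < min_confidence:
            print(f"Low-confidence detection at {frag.position}; flagging for the surgeon")
        decision = approve(frag)
        if decision == "takeover":
            print("Surgeon takes manual control; autonomous subtask halted")
            return
        if decision == "yes":
            remove(frag)
        else:
            print(f"Skipped candidate at {frag.position} (possible tissue, not a fragment)")

# Example: approve only confident detections, skip the rest.
supervised_debridement(lambda f: "yes" if f.confidence >= 0.8 else "skip")
```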
“Limited augmented dexterity” has been used for both suturing and debridement in the lab, but Goldberg and Guthart said more research is needed.
You can read the full paper at Science Robotics or Goldberg’s website. For more details on his team’s research on autonomous robotic surgery, have a look at this May 2024 master’s thesis by AUTOLab alum Will Panitch.
Related: What Intuitive’s limited da Vinci 5 launch can teach other device developers