
WO2025230837A1 - Computer-assisted estimation of a target overlay - Google Patents

Computer-assisted estimation of a target overlay

Info

Publication number
WO2025230837A1
Authority
WO
WIPO (PCT)
Prior art keywords
anatomical structure
endoscope
instrument
distance
indicator
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2025/026442
Other languages
English (en)
Inventor
Holly Wang
Andrew J. Hazelton
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intuitive Surgical Operations Inc
Original Assignee
Intuitive Surgical Operations Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intuitive Surgical Operations Inc filed Critical Intuitive Surgical Operations Inc
Publication of WO2025230837A1

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00004Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B1/000095Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope for image enhancement
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00004Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00006Operational features of endoscopes characterised by electronic signal processing of control signals
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/012Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor characterised by internal passages or accessories therefor
    • A61B1/018Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor characterised by internal passages or accessories therefor for receiving instruments
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/70Manipulators specially adapted for use in surgery
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2051Electromagnetic tracking systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2059Mechanical position encoders
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2061Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046Tracking techniques
    • A61B2034/2065Tracking using image or pattern recognition
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/371Surgical systems with images on a monitor during operation with simultaneous use of two cameras
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/373Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners
    • A61B2090/3735Optical coherence tomography [OCT]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/374NMR or MRI
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/376Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/376Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B2090/3762Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT]
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37Surgical systems with images on a monitor during operation
    • A61B2090/378Surgical systems with images on a monitor during operation using ultrasound
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/25User interfaces for surgical systems
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361Image-producing devices, e.g. surgical cameras

Definitions

  • the present disclosure relates generally to medical systems. Specifically, the present disclosure relates to using a medical system in endoscopic applications for estimating placement of a target overlay.
  • a doctor may extend the tissue grabber and discover that the tissue grabber is not positioned to contact the target tissue. The doctor then retracts the tissue grabber, repositions the endoscope, and extends the tissue grabber again. Often, the doctor may have to extend and retract the tissue grabber several times before placing the tissue grabber on the target tissue. Thus, the doctor may make several failed tissue grabber extension attempts before the tissue grabber contacts the target tissue, which may lengthen the duration of the medical procedure, negatively affecting safety.
  • a system includes a memory and a controller communicatively coupled to the memory.
  • the system receives, from an endoscope, a video of an anatomical structure, determines a pixel of the video that shows a portion of the anatomical structure where an instrument positioned in the endoscope will contact the anatomical structure when the instrument extends from the endoscope, generates a first overlay comprising a first indicator, and presents the first overlay on the video such that the first indicator is positioned on the pixel.
  • a method includes receiving, from an endoscope, a video of an anatomical structure, determining a pixel of the video that shows a portion of the anatomical structure where an instrument positioned in the endoscope will contact the anatomical structure when the instrument extends from the endoscope, generating a first overlay comprising a first indicator, and presenting the first overlay on the video such that the first indicator is positioned on the pixel.
  • Figure 1 illustrates an example medical system.
  • Figures 2A and 2B illustrate an example medical instrument system in the system of Figure 1.
  • Figures 3A and 3B illustrate example operations performed by the system of Figure 1.
  • Figures 4A and 4B illustrate example operations performed by the system of Figure 1.
  • Figure 5A is a flowchart of an example method performed by the system of Figure 1.
  • Figures 5B through 5E illustrate example operations performed by the system of Figure 1.
  • Figures 6A through 6D illustrate example operations performed by the system of Figure 1.
  • Figures 7A through 7E illustrate example operations performed by the system of Figure 1.
  • Figure 8 illustrates an example operation performed by the system of Figure 1.
  • the present disclosure describes a medical system that guides a user operating or handling an instrument (e.g., a tissue grabber) to accurately predict where the instrument will engage a surface of an anatomical structure (e.g., a target tissue).
  • the system displays an indicator (e.g., a reticle) that shows where the instrument is expected to contact an anatomical structure when the instrument is extended or deployed from an endoscope.
  • the system receives, from the endoscope, a video of the anatomical structure and determines pixels of the video that show a portion of the anatomical structure where the instrument will contact the anatomical structure when the instrument is extended or deployed.
  • the system then generates an overlay with the indicator and presents the overlay on the video such that the indicator is positioned on the pixels.
  • the system may be expanded to provide any number of indicators that show where any number of instruments will engage or contact the surface of the anatomical structure.
  • the system provides several technical advantages. For example, the system may provide a more accurate determination of where the instrument positioned in the endoscope will contact the anatomical structure when the instrument extends from the endoscope. As a result, a doctor may have a better understanding of where the instrument will contact the anatomical structure before the instrument is extended from the endoscope. Additionally, the doctor may correctly position the endoscope so that the instrument contacts the desired or target tissue of the anatomical structure without resorting to trial and error. Thus, the system may reduce the duration of medical procedures, improve the health and safety of patients, and reduce recovery times and postoperative complications.
  • one or more components of a medical system may be implemented as a computer-assisted surgical system. It is understood, however, that the medical system may be implemented in any type of medical system (e.g., digital fiducial systems, anatomy detection systems, and clinical guidance systems).
  • Figure 1 illustrates an example computer-assisted surgical system 100 that implements some of the features described herein.
  • the surgical system 100 can be used, for example, in surgical, diagnostic, therapeutic, biopsy, or non-medical procedures.
  • the surgical system 100 (which may be a robotically-assisted surgical system) includes one or more manipulator assemblies 102 for operating one or more medical instrument systems 104 in performing various procedures on a patient P positioned on a table T in a medical environment.
  • the manipulator assembly 102 can drive catheter or end effector motion, can apply treatment to target tissue, and/or can manipulate control members.
  • the manipulator assembly 102 can be teleoperated, non-teleoperated, or a hybrid teleoperated and non-teleoperated assembly with select degrees of freedom of motion that can be motorized and/or teleoperated and select degrees of freedom of motion that can be non-motorized and/or non-teleoperated.
  • An operator input system 106, which can be inside or outside of the medical environment, generally includes one or more control devices for controlling the manipulator assembly 102.
  • the manipulator assembly 102 supports a medical instrument system 104 and can optionally include a plurality of actuators or motors that drive inputs on the medical instrument system 104 in response to commands from a control system 112.
  • the actuators can optionally include drive systems that when coupled to the medical instrument system 104 can advance the medical instrument system 104 into a natural or surgically created anatomic orifice.
  • Other drive systems can move the distal end of the medical instrument in multiple degrees of freedom, which can include three degrees of linear motion (e.g., linear motion along the x, y, and z Cartesian axes) and in three degrees of rotational motion (e.g., rotation about the x, y, and z Cartesian axes).
  • the manipulator assembly 102 can support various other systems for irrigation, treatment, or other purposes. Such systems can include fluid systems (e.g., reservoirs, heating/cooling elements, pumps, and valves), generators, lasers, interrogators, and ablation components.
  • the surgical system 100 also includes a display system 110 for displaying an image or representation of the surgical site and a medical instrument system 104.
  • the image or representation is generated by an imaging system 109, which may include an endoscopic imaging system.
  • the display system 110 and operator input system 106 may be oriented so that an operator O can control the medical instrument system 104 and the operator input system 106 with the perception of telepresence.
  • a graphical user interface can be displayable on the display system 110 and/or a display system of an independent planning workstation.
  • the imaging system 109 includes an endoscopic imaging system with components that are integrally or removably coupled to the medical instrument system 104.
  • a separate imaging device such as an endoscope, attached to a separate manipulator assembly can be used with the medical instrument system 104 to image the surgical site.
  • the imaging system 109 can be implemented as hardware, firmware, software, or a combination thereof, which interact with or are otherwise executed by one or more computer processors, which can include the controller 114 of the control system 112.
  • the surgical system 100 also includes a sensor system 108.
  • the sensor system 108 may include a position/location sensor system (e.g., an actuator encoder or an electromagnetic (EM) sensor system) and/or a shape sensor system (e.g., an optical fiber shape sensor) for determining the position, orientation, speed, velocity, pose, and/or shape of the medical instrument system 104.
  • These sensors may also detect a position, orientation, or pose of the patient P on the table T. For example, the sensors may detect whether the patient P is face-down or face-up. As another example, the sensors may detect a direction in which the head of the patient P is directed.
  • the sensor system 108 can also include temperature, pressure, force, or contact sensors, or the like.
  • the surgical system 100 can also include a control system 112, which includes at least one memory 116 and at least one controller 114 (which may include a processor) for effecting control between the medical instrument system 104, the operator input system 106, the sensor system 108, and the display system 110.
  • the control system 112 includes programmed instructions (e.g., a non-transitory machine- readable medium storing the instructions) to implement a procedure using the surgical system 100, including for navigation, steering, imaging, engagement feature deployment or retraction, applying treatment to target tissue (e.g., via the application of energy), or the like.
  • the control system 112 may further include a virtual visualization system to provide navigation assistance to the operator O when controlling medical instrument system 104 during an image-guided surgical procedure.
  • Virtual navigation using the virtual visualization system can be based upon reference to an acquired pre-operative or intra-operative dataset of anatomic passageways.
  • the virtual visualization system processes images of the surgical site imaged using imaging technology, such as computerized tomography (CT), magnetic resonance imaging (MRI), fluoroscopy, thermography, ultrasound, optical coherence tomography (OCT), thermal imaging, impedance imaging, laser imaging, nanotube X-ray imaging, and/or the like.
  • the control system 112 uses a pre-operative image to locate the target tissue (using vision imaging techniques and/or by receiving user input) and create a pre-operative plan, including an optimal first location for performing treatment.
  • the pre-operative plan can include, for example, a planned size to expand an expandable device, a treatment duration, a treatment temperature, and/or multiple deployment locations.
  • the controller 114 is any electronic circuitry, including, but not limited to, one or a combination of microprocessors, microcontrollers, application-specific integrated circuits (ASICs), application-specific instruction set processors (ASIPs), and/or state machines, that communicatively couples to the memory 116 and controls the operation of the control system 112.
  • the controller 114 may be 8-bit, 16-bit, 32-bit, 64-bit or of any other suitable architecture.
  • the controller 114 may include an arithmetic logic unit (ALU) for performing arithmetic and logic operations, processor registers that supply operands to the ALU and store the results of ALU operations, and a control unit that fetches instructions from memory and executes them by directing the coordinated operations of the ALU, registers and other components.
  • the controller 114 may include other hardware that operates software to control and process information.
  • the controller 114 executes software stored on the memory 116 to perform any of the functions described herein.
  • the controller 114 controls the operation and administration of the control system 112 by processing information (e.g., information received from the manipulator assembly 102, the operator input system 106, and the memory 116).
  • the controller 114 is not limited to a single processing device and may encompass multiple processing devices contained in the same device or computer or distributed across multiple devices or computers.
  • the controller 114 is considered to perform a set of functions or actions if the multiple processing devices collectively perform the set of functions or actions, even if different processing devices perform different functions or actions in the set.
  • the memory 116 may store, either permanently or temporarily, data, operational software, or other information for the controller 114.
  • the memory 116 may include any one or a combination of volatile or non-volatile local or remote devices suitable for storing information.
  • the memory 116 may include random access memory (RAM), read only memory (ROM), magnetic storage devices, optical storage devices, or any other suitable information storage device or a combination of these devices.
  • the software represents any suitable set of instructions, logic, or code embodied in a computer-readable storage medium.
  • the software may be embodied in the memory 116, a disk, a CD, or a flash drive.
  • the software may include an application executable by the controller 114 to perform one or more of the functions described herein.
  • the memory 116 is not limited to a single memory and may encompass multiple memories contained in the same device or computer or distributed across multiple devices or computers.
  • the memory 116 is considered to store a set of data, operational software, or information if the multiple memories collectively store the set of data, operational software, or information, even if different memories store different portions of the data, operational software, or information in the set.
  • FIG. 2A illustrates an example medical instrument system 104 in the surgical system 100.
  • the medical instrument system 104 is used in an image-guided medical procedure.
  • the medical instrument system 104 may be used for non-teleoperational exploratory procedures or in procedures involving traditional manually operated medical instruments, such as endoscopy.
  • the medical instrument system 104 includes an elongate flexible device 220, such as a flexible catheter or endoscope (e.g., gastroscope, bronchoscope), coupled to a drive unit 222.
  • the elongate flexible device 220 includes a flexible body 224 having a proximal end 226 and a distal end, or tip portion, 228.
  • the flexible body 224 has an approximately 14-20 millimeter outer diameter. Other flexible body outer diameters may be larger or smaller.
  • the flexible body 224 has an appropriate length to reach certain portions of the anatomy, such as the lungs, sinuses, throat, or the upper or lower gastrointestinal region, when the flexible body 224 is inserted into a patient’s oral or nasal cavity or other surgically created orifices.
  • the medical instrument system 104 includes a tracking system 230 for determining the position, orientation, speed, velocity, pose, and/or shape of the distal end 228 and/or of one or more segments 232 along the flexible body 224 using one or more sensors and/or imaging devices.
  • the tracking system 230 is implemented as hardware, firmware, software, or a combination thereof, which interact with or are otherwise executed by one or more computer processors, which may include the controller 114 of control system 112.
  • the tracking system 230 tracks the distal end 228 and/or one or more of the segments 232 using a shape sensor 234. In some embodiments, the tracking system 230 tracks the distal end 228 using a position sensor system 236, such as an electromagnetic (EM) sensor system. In some examples, the position sensor system 236 measures six degrees of freedom (e.g., three position coordinates x, y, and z and three orientation angles indicating pitch, yaw, and roll of a base point) or five degrees of freedom (e.g., three position coordinates x, y, and z and two orientation angles indicating pitch and yaw of a base point).
  • the flexible body 224 includes one or more channels 238 sized and shaped to receive one or more medical instruments 240.
  • the flexible body 224 includes two channels 238 for separate instruments 240; however, a different number of channels 238 can be provided.
  • Figure 2B illustrates an example portion of the medical instrument system 104 of Figure 2A. As seen in Figure 2B, the medical instrument 240 extends through the flexible body 224.
  • the medical instrument 240 can be used for procedures and aspects of procedures, such as surgery, biopsy, ablation, mapping, imaging, illumination, irrigation, or suction.
  • the medical instrument 240 is deployed through the channel 238 of the flexible body 224 and is used at a target location within the anatomy.
  • the medical instrument 240 includes, for example, image capture devices, biopsy instruments, ablation instruments, catheters, laser ablation fibers, and/or other surgical, diagnostic, or therapeutic tools.
  • the medical tools include end effectors having a single working member such as a scalpel, a blunt blade, a lens, an optical fiber, an electrode, and/or the like.
  • Other end effectors include, for example, forceps, graspers, balloons, needles, scissors, clip appliers, tissue helixes, and/or the like.
  • Other end effectors further include electrically activated end effectors such as electrosurgical electrodes, transducers, sensors, imaging devices, argon plasma coagulator probes, and/or the like.
  • the medical instrument 240 is advanced from the opening of the channel 238 to perform the procedure and then retracted back into the channel when the procedure is complete.
  • the medical instrument 240 is removed from the proximal end 226 of the flexible body 224 or from another optional instrument port (not shown) along the flexible body 224.
  • the medical instrument 240 may be used with an image capture device (e.g., an endoscopic camera) also within the elongate flexible device 220. Alternatively, the medical instrument 240 may itself be the image capture device.
  • the medical instrument 240 additionally houses cables, linkages, or other actuation controls (not shown) that extend between the proximal and distal ends to controllably bend the distal end of the medical instrument 240.
  • the flexible body 224 also houses cables, linkages, or other steering controls (not shown) that extend between the drive unit 222 and the distal end 228 to controllably bend the distal end 228 as shown, for example, by the broken dashed line depictions 242 of the distal end 228.
  • at least four cables are used to provide independent “up-down” steering to control a pitch motion of the distal end 228 and “left-right” steering to control a yaw motion of the distal end 228.
  • the drive unit 222 can include drive inputs that removably couple to and receive power from drive elements, such as actuators, of the teleoperational assembly.
  • the medical instrument system 104 includes gripping features, manual actuators, or other components for manually controlling the motion of the medical instrument system 104.
  • the information from the tracking system 230 can be sent to a navigation system 244, where the information is combined with information from the visualization system 246 and/or the preoperatively obtained models to provide the physician or other operator with real-time position information.
  • Figures 3A through 8 illustrate example operations performed by a computer system in a medical system (e.g., the surgical system 100 of Figure 1 ).
  • the computer system (which may be implemented in the control system 112 of the surgical system 100 using the controller 114 and the memory 116) guides a surgical instrument positioned within an endoscope to a target point on an anatomical structure.
  • Figures 3A and 3B illustrate example operations 300A and 300B performed by the computer system.
  • the computer system receives a video 322 captured by a camera 320 of an endoscope 302.
  • the endoscope 302 has a first channel 304 for receiving an instrument 310A.
  • the instrument 310A may be a surgical instrument (e.g., a tissue grabber or helix).
  • the camera 320 is positioned at the front of the endoscope 302.
  • the video 322 may be captured from the perspective of the front of the endoscope 302.
  • the endoscope 302 may be set at a pose (e.g., position and orientation) such that the video 322 shows the surface of a region of an anatomical structure (e.g., at a surgical site).
  • the computer system generates a first overlay 326 including a first indicator 328 (e.g., a reticle).
  • the computer system then presents the first overlay 326 on the video 322 such that the first indicator 328 is positioned on a pixel 324A of the video 322.
  • the pixel 324A may show a portion of the anatomical structure that the computer system expects the instrument 310A to contact when the instrument 310A is extended from the endoscope 302 towards the anatomical structure.
  • the computer system thus displays the first indicator 328 that shows where the instrument 310A is expected to contact the anatomical structure when the instrument 310A is extended or deployed from the endoscope 302.
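  • As a minimal sketch of this overlay step (assuming an OpenCV/NumPy environment; the frame size, color, and reticle styling are illustrative and not specified by the disclosure), the indicator may be composited onto a video frame as follows:

```python
# Minimal sketch of compositing a reticle-style indicator onto a video frame.
# Assumes OpenCV and NumPy; the frame, colors, and reticle size are illustrative only.
import cv2
import numpy as np

def draw_reticle(frame, pixel, color=(255, 255, 255), radius=12):
    """Return a copy of `frame` with a reticle centered on `pixel` = (x, y)."""
    x, y = pixel
    out = frame.copy()
    cv2.circle(out, (x, y), radius, color, thickness=2)           # outer ring
    cv2.line(out, (x - radius, y), (x + radius, y), color, 1)     # horizontal crosshair
    cv2.line(out, (x, y - radius), (x, y + radius), color, 1)     # vertical crosshair
    return out

# Example: mark a predicted contact pixel on a stand-in 640x480 frame.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
annotated = draw_reticle(frame, pixel=(320, 240))
```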
  • the computer system may determine the pixel 324A by determining a distance between the endoscope 302 and the anatomical structure. For example, the computer system may use stereo vision and/or light projections to determine a distance between the endoscope 302 and the anatomical structure. The computer system then uses the distance between the endoscope 302 and the anatomical structure, a distance between a channel (e.g., the first channel 304 of the endoscope 302) and the camera 320, and a distance between an image sensor of the camera 320 and a front surface of the camera 320 to triangulate the pixel 324A of the video that shows the portion of the anatomical structure that the instrument 310A is expected to contact. As the user moves the endoscope 302 within the surgical space, the computer system updates the distance between the endoscope 302 and the anatomical structure, which changes the pixel 324A.
  • Figure 3B illustrates an example operation 300B performed by the computer system.
  • the computer system receives the video 322 captured by the camera 320 of the endoscope 302.
  • the endoscope 302 has a second channel 306 for receiving the instrument 310B (e.g., a tissue grabber or helix).
  • the camera 320 is positioned at the front of the endoscope 302.
  • the video 322 may be captured from the perspective of the front of the endoscope 302.
  • the endoscope 302 may be set at a pose (e.g., position and orientation) such that the video 322 shows the surface of a region of an anatomical structure (e.g., at a surgical site).
  • the instrument 310B may be positioned in the second channel 306 when the instrument 310A is positioned in the first channel 304.
  • the computer system generates a second overlay 336 including a second indicator 338.
  • the second overlay 336 is presented on the video 322 such that the second indicator 338 is positioned on a pixel 324B.
  • the computer system thus displays the second indicator 338 (e.g., a reticle) that shows where the instrument 310B is expected to contact the anatomical structure 401 when the instrument 310B is extended or deployed from the endoscope 302.
  • the computer system may determine the pixel 324B using a triangulation technique based on a distance between the endoscope and the anatomical structure, a distance between the second channel 306 of the endoscope 302 and the camera 320, and a distance between the image sensor of the camera 320 and the front surface of the camera 320.
  • the computer system updates the distance between the endoscope 302 and the anatomical structure, which changes the pixel 324B.
  • the computer system generates and presents an overlay that shows both indicators 328 and 338 simultaneously if both instruments 310A and 310B are positioned in their respective channels 304 and 306.
  • Figures 4A and 4B illustrate example operations 400A and 400B performed by the computer system.
  • the computer system performs the operations 400A and 400B to determine a distance between the surface of the anatomical structure 401 and the endoscope 302.
  • the computer system receives a video 420 of the anatomical structure 401.
  • the video 420 may be captured by one or more cameras of the endoscope 302.
  • the computer system presents an indicator 422 in the video 420 such that the indicator 422 is positioned on a portion of the surface of the anatomical structure 401 .
  • the computer system presents the indicator 422 in the video 420 when the endoscope 302 is within a first distance (di) from the surface of the anatomical structure 401.
  • the indicator 422 may be any shape or design (e.g., a circle).
  • the indicator 422 may be a first color (e.g., white).
  • the indicator 422 indicates a portion of the anatomical structure 401 where a tool located in the first channel 304 of the endoscope 302 will engage the anatomical structure 401 when the tool is extended from the endoscope 302.
  • the color of the indicator 422 may indicate to the user that the tool may now be extended or deployed from the endoscope 302 as the target point has been determined.
  • the indicator 422 also allows the user to determine, before the tool is extended from the endoscope 302, whether the tool will contact a desired portion of the anatomical structure when the tool is extended from the endoscope 302.
  • the computer system receives a video 430 of the anatomical structure 401.
  • the video 430 may be captured by one or more cameras of the endoscope 302.
  • the computer system presents an indicator 432 in the video 430 such that indicator 432 is positioned on a portion of the surface of the anatomical structure 401 .
  • the computer system presents the indicator 432 in the video 430 when the instrument 310A extends from the endoscope 302 and engages the surface of the anatomical structure 401 .
  • the indicator 432 may be any shape (e.g., a circle).
  • the indicator 432 may be a second color.
  • the indicator 432 may be a green circle.
  • the color of the indicator 432 may indicate that the instrument 310A is contacting the anatomical structure 401 and that the user may use the instrument 310A to operate on the surface of the anatomical structure 401 .
  • when the instrument 310A is a tissue grabber, helix, or stapler, the user may trigger the instrument 310A to grab tissue of the anatomical structure 401.
  • the instrument 310A contacts the surface of the anatomical structure 401 at a point 417. At the point 417, the instrument 310A engages the surface of the anatomical structure 401 to perform one or more operations at the target point or indicator 432.
  • the indicator 432 thus indicates an estimate of where the instrument 310A will contact the anatomical structure 401.
  • as the endoscope 302 is repositioned, the target point and the indicator 432 change accordingly.
  • the user need not engage in trial and error and deploy, retract, and reposition the instrument 310A several times until the instrument 310A contacts a desired portion of the anatomical structure 401.
  • the indicator 432 gives confidence to the user that the estimated target point is where the instrument 310A will contact the surface of the anatomical structure 401 .
  • more accurate measurements of the distance between the endoscope 302 and the anatomical structure 401 allow the computer system to determine a pixel 324A of the video 322 that shows a portion of the anatomical structure 401 where the surgical instrument 310A will contact the anatomical structure 401 when the instrument 310A is extended from the endoscope 302.
  • the user may use this information to plan and execute procedures with a high level of confidence, ensuring that critical structures are identified and treated appropriately.
  • More accurate spatial information also helps guide the placement of instruments and allows for effective navigation through narrow and confined spaces, which contributes to the efficient use of surgical resources, including time and equipment.
  • Figure 5A is a flowchart of an example method 500A performed by the computer system.
  • the computer system implements certain features that assist the user to guide an instrument through an endoscope to a target point on the anatomical structure, which may improve safety and efficacy of a medical procedure.
  • the computer system receives, from the endoscope 302, the video 322 of the anatomical structure 401 .
  • the endoscope 302 may include a plurality of lumens or channels for receiving a plurality of instruments. The instruments may then be extended from the lumens or channels towards the anatomical structure 401 . The instruments may also be retracted back towards the endoscope 302 away from the anatomical structure 401 .
  • the computer system determines the pixel 324A of the video 322 that shows a portion of the anatomical structure 401 where the instrument 310A positioned in the endoscope 302 will contact the anatomical structure 401 when the instrument 310A extends from the endoscope 302.
  • the computer system may use a triangulation technique to determine the pixel 324A.
  • the computer system may triangulate the position of the pixel 324A using a distance between the endoscope 302 and the anatomical structure 401 , a distance between the instrument 310A and a camera 320 positioned in the endoscope 302 (e.g., a front surface of the camera 320), and a distance between an image sensor in the camera 320 and the front surface of the camera 320.
  • the computer system generates a first overlay 326 that includes a first indicator 328 (e.g., a reticle).
  • the first indicator 328 may be a circle having a first color.
  • the computer system presents the first overlay 326 on the video 322 such that the first indicator 328 is positioned on the pixel 324A.
  • the first indicator 328 indicates the portion of the anatomical structure 401 where the instrument 310A is expected to contact the anatomical structure 401 when the instrument 310A is extended from the endoscope 302.
  • the color of the first indicator 328 may indicate to the user that the endoscope 302 is within a certain distance from the anatomical structure 401 such that the instrument 310A may be deployed from the endoscope 302 to contact the portion of the anatomical structure 401.
  • the user need not deploy, retract, and reposition the instrument 310A several times until the instrument 310A contacts the portion of the anatomical structure 401, thus reducing the time it takes to perform the surgical procedure.
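  • A compact sketch of method 500A is shown below, tying the steps together (the helper functions for ranging, triangulation, and overlay drawing are hypothetical stand-ins for the techniques sketched elsewhere in this description, and the geometry dictionary is an assumption):

```python
# Hypothetical end-to-end sketch of method 500A: estimate the scope-to-tissue distance,
# locate the pixel where the instrument is expected to contact the tissue, and present
# the indicator on the frame. Helper names are stand-ins, not APIs from the disclosure.
def annotate_frame(frame, geometry, estimate_distance_mm, project_contact_point, draw_reticle):
    z_mm = estimate_distance_mm(frame)                 # e.g., stereo vision or time-of-flight
    u, v = project_contact_point(geometry["dx_mm"], geometry["dy_mm"],
                                 geometry["f_mm"], z_mm,
                                 geometry["pixel_pitch_mm"], geometry["center_px"])
    return draw_reticle(frame, (int(round(u)), int(round(v))))
```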
  • Figures 5B and 5C illustrate example operations 500B and 500C performed by the computer system.
  • the computer system performs the operations 500B and 500C to determine the pixel 324A of the video 322 that shows a portion of the anatomical structure 401 where the instrument 310A positioned in the endoscope 302 will contact the anatomical structure 401 when the instrument 310A extends from the endoscope 302.
  • FIG. 5B shows the operation 500B with respect to the endoscope 302 in the x-z plane of a coordinate system.
  • the endoscope 302 includes the camera 320 (e.g., a pinhole camera).
  • the camera 320 includes an image sensor 511 positioned a distance f from a front surface of the camera 320 (e.g., an entrance pupil of the camera 320) along the z-axis (which may also be referred to as an optical distance).
  • the image sensor 511 is arranged parallel to the x-axis and centered on the x-axis.
  • the instrument 310A is positioned a distance Dx from the camera 320 along the x- axis.
  • the anatomical structure 401 is positioned a distance z from the endoscope 302.
  • the distance z may be the distance from the endoscope 302 to the anatomical structure 401 along the length of the instrument 310A if the instrument 310A were extended from the endoscope 302.
  • the computer system knows (e.g., from specifications) the distance Dx and the distance f for the endoscope 302 and the camera 320 as these distances are based on the design and/or geometry of the endoscope 302 and the camera 320.
  • a projection of a point on the anatomical structure 401 onto the image sensor 511 is shown by the line 515. As shown by the line 515, the projection onto the image sensor 511 is a distance x from the z-axis and parallel to the x-axis.
  • Figure 5C is a view of the endoscope 302 along the z-axis.
  • Dx and Dy are the distances between the instrument 310A and the front surface of the camera 320 along the x and y axes
  • x and y are the distances between the projection on the image sensor and the center of the image sensor 511 along the x and y axes.
  • z = [sqrt(Dx² + Dy²) × f] / sqrt(x² + y²).
  • the coordinates x and y of the projection on the image sensor are the coordinates of the pixel 324A in the video 322.
  • each point on the image sensor may be a pixel of the video 322.
  • the coordinates of the point on the image sensor (x, y) that show the projection are the coordinates of the pixel 324A.
  • the computer system can determine the x and y coordinates of the pixel 324A. The computer system may then determine the positioning of the indicator 328 in the overlay 326 such that the indicator 328 is positioned on the pixel 324A when the computer system presents the overlay 326 on the video 322.
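  • Rearranging the relationship above gives x = Dx·f/z and y = Dy·f/z (standard pinhole projection). A minimal sketch of that computation follows; the sensor-to-pixel conversion, pixel pitch, and example values are illustrative assumptions rather than parameters from the disclosure:

```python
# Sketch of the triangulation above: given the channel-to-camera offsets (Dx, Dy),
# the optical distance f, and the scope-to-tissue distance z, compute where the
# instrument axis projects onto the image. The pixel-pitch conversion is assumed.
def project_contact_point(dx_mm, dy_mm, f_mm, z_mm, pixel_pitch_mm, center_px):
    """Return (u, v) pixel coordinates of the predicted contact point."""
    x_mm = dx_mm * f_mm / z_mm            # sensor-plane offset along x (similar triangles)
    y_mm = dy_mm * f_mm / z_mm            # sensor-plane offset along y
    u = center_px[0] + x_mm / pixel_pitch_mm
    v = center_px[1] + y_mm / pixel_pitch_mm
    return u, v

# Example with assumed geometry: 3 mm channel offset, 2 mm optical distance,
# tissue 15 mm away, 5 micrometre pixels on a 640x480 sensor.
u, v = project_contact_point(3.0, 0.0, 2.0, 15.0, 0.005, (320.0, 240.0))
```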
  • Figures 5D and 5E illustrate example operations 500D and 500E performed by the computer system.
  • the computer system performs the operations 500D and 500E to determine the pixel 324A of the video 322 that shows the portion of the anatomical structure 401 that the instrument 310A is expected to contact when the instrument 310A is extended from the endoscope 302.
  • the computer system determines a product 554 of (i) the distance 550 between the instrument 310A and the camera 320 (e.g., Dx and/or Dy) and (ii) the distance 552 between the front surface of the camera 320 and the image sensor 511 of the camera 320 (e.g., f).
  • the computer system determines the pixel 324A based on a quotient of (i) the product 554 and (ii) the distance between the endoscope 302 and the anatomical structure 401 (e.g., z).
  • Figures 6A through 6D illustrate example operations 600A, 600B, 600C, and 600D performed by the computer system.
  • the computer system may perform one or more of the operations 600A, 600B, 600C, and 600D to determine a distance between the endoscope 302 and the anatomical structure 401 (e.g., z).
  • Figure 6A shows the operation 600A for determining a distance between the endoscope 302 and the anatomical structure 401.
  • the computer system uses stereo vision and/or a stereo camera in the endoscope 302 to determine the distance between the endoscope 302 and the anatomical structure 401.
  • the stereo camera may include multiple cameras 320 that capture images or videos of the anatomical structure 401 from different perspectives.
  • the cameras 320 may capture images of the same portion of the anatomical structure, but the portion may appear in different pixels of the images due to the cameras 320 capturing the images of the portion from different perspectives.
  • the computer system determines a first pixel in a first image captured by a first camera (e.g., left camera) of the stereo camera. The portion of the anatomical structure 401 appears in the first pixel.
  • the computer system determines a second pixel in a second image captured by a second camera (e.g., right camera) of the stereo camera.
  • the portion of the anatomical structure 401 appears in the second pixel.
  • the computer system determines the distance between the endoscope 302 and the anatomical structure 401.
  • the computer system may determine a difference between the first pixel and the second pixel. For example, the computer system may subtract the coordinates of the first pixel from the coordinates of the second pixel, and vice versa, to determine the difference.
  • the computer system may also determine a distance between the first camera and the second camera. For example, the computer system may reference a specification describing the structure or design for the stereo camera to determine this distance. The computer system may then triangulate the distance between the endoscope 302 and the anatomical structure 401 using the difference between the first pixel and the second pixel and the distance between the first camera and the second camera.
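  • A minimal sketch of that stereo triangulation follows; the baseline, focal length, and pixel values are illustrative assumptions, not parameters from the disclosure:

```python
# Sketch of stereo triangulation: the same tissue point appears at different columns in
# the left and right images; depth follows from the disparity, the camera baseline, and
# the focal length expressed in pixels. All numbers are illustrative assumptions.
def stereo_depth_mm(u_left_px, u_right_px, baseline_mm, focal_px):
    """Return the camera-to-point distance from the horizontal pixel disparity."""
    disparity_px = u_left_px - u_right_px
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    return baseline_mm * focal_px / disparity_px

# Example: 3 mm baseline, 800 px focal length, 120 px disparity -> 20 mm.
z_mm = stereo_depth_mm(u_left_px=400.0, u_right_px=280.0, baseline_mm=3.0, focal_px=800.0)
```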
  • Figure 6B shows an operation 600B for determining a distance between the endoscope 302 and the anatomical structure 401.
  • the computer system uses photometric stereo mapping to determine the distance between the endoscope 302 and the anatomical structure 401 .
  • the computer system directs multiple light sources at the anatomical structure 401 .
  • the computer system uses the camera 320 to capture an image of the anatomical structure 401 .
  • the computer system then switches to another light source and directs that light source at the anatomical structure 401.
  • the computer system then captures another image of the anatomical structure 401 .
  • the computer system may repeat this process and capture any number of images of the anatomical structure 401 when illuminated by a different light source.
  • the computer system may analyze the shadows in the different captured images of the anatomical structure 401 to determine the distance between the endoscope and the anatomical structure 401 . For example, the computer system may know the locations of each light source. The computer system may analyze the images to determine the brighter portions of the images and the darker portions of the images. The computer system may determine that the brighter portions are closer to their respective light sources and that the darker portions are further away from their respective light sources.
  • the computer system may determine the distance between the endoscope 302 and the portion of the anatomical structure.
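  • One simplified way to turn those brightness cues into a distance estimate is to assume a point light source at the scope tip, uniform tissue reflectance, and inverse-square falloff, so dimmer patches are scored as farther away. A sketch under those simplifying assumptions (the calibration constant is hypothetical, and this is a much cruder model than full photometric stereo):

```python
# Crude sketch of brightness-based ranging: under an assumed point source at the scope
# tip, uniform reflectance, and inverse-square falloff, brightness ~ k / distance**2,
# so relative distance ~ sqrt(k / brightness). The constant k is a hypothetical calibration.
import numpy as np

def relative_distance_from_brightness(patch, k=1.0):
    """Return a relative distance score for an image patch (larger = farther)."""
    mean_brightness = float(np.mean(patch))
    if mean_brightness <= 0:
        return float("inf")                 # no returned light: treat as very far
    return (k / mean_brightness) ** 0.5

# A dimmer patch is scored as farther than a brighter one lit by the same source.
near = relative_distance_from_brightness(np.full((8, 8), 200.0))
far = relative_distance_from_brightness(np.full((8, 8), 50.0))
assert far > near
```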
  • Figure 6C shows an operation 600C for determining a distance between the endoscope 302 and the anatomical structure 401 .
  • the computer system determines the distance between the endoscope 302 and the anatomical structure 401 using structured light.
  • the computer system directs a fixed pattern of light (e.g., a light pattern or light grid) onto the anatomical structure 401 .
  • the computer system knows the fixed pattern.
  • the shape of the anatomical structure 401 distorts this pattern of light.
  • the camera 320 captures an image of the anatomical structure 401 with the distorted pattern of light projected onto the anatomical structure 401 .
  • the computer system analyzes the distorted pattern of light on the anatomical structure 401 to determine the distance between the endoscope 302 and the anatomical structure 401. For example, the computer system may compare the distorted pattern of light with the fixed pattern of light to determine the distortions introduced by the anatomical structure 401. The computer system may then extract shape and distance information about the anatomical structure 401 using the distortions introduced by the anatomical structure 401. In some instances, the computer system may reconstruct or model the shape and distance of the anatomical structure using the distortions.
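  • A minimal sketch of one structured-light depth measurement, taken relative to a calibrated reference plane, is shown below; the plane-referenced triangulation formula and all parameter values are illustrative assumptions:

```python
# Sketch of plane-referenced structured-light ranging: a projected dot observed at a
# different column than it would occupy on a calibrated reference plane indicates a
# different depth. Geometry and calibration values are illustrative assumptions.
def structured_light_depth_mm(observed_col_px, reference_col_px,
                              baseline_mm, focal_px, reference_depth_mm):
    """Return the depth of the surface point from the shift of one projected dot."""
    shift_px = observed_col_px - reference_col_px
    # Plane-referenced triangulation: 1/z = 1/z_ref + shift / (f * B)
    inv_z = 1.0 / reference_depth_mm + shift_px / (focal_px * baseline_mm)
    return 1.0 / inv_z

# Example: a 40 px shift with a 3 mm baseline and 800 px focal length moves the
# estimate from the 20 mm reference plane to roughly 15 mm.
z_mm = structured_light_depth_mm(360.0, 320.0, 3.0, 800.0, 20.0)
```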
  • Figure 6D shows an operation 600D for determining a distance between the endoscope 302 and the anatomical structure 401.
  • the computer system determines the distance between the endoscope 302 and the anatomical structure 401 using time-of-flight.
  • the computer system directs a light pulse towards the anatomical structure 401 .
  • the computer system measures an amount of time for the anatomical structure 401 to reflect the light pulse back to the endoscope 302 (e.g., to a light sensor on the endoscope 302).
  • the computer system uses the measured time to determine the distance between the endoscope 302 and the anatomical structure 401 . For example, the further away the anatomical structure 401 is from the endoscope 302, the longer it will take for the anatomical structure 401 to reflect the light pulse back to the endoscope 302.
  • the computer system uses a time-of-flight sensor to measure the time to receive the reflected light pulse.
  • the sensor includes an illumination source and a camera.
  • the illumination source sends the pulse of light towards the anatomical structure 401 , and the sensor measures the time until the pulse is reflected and observed by the camera 320.
  • the distance is the measured round-trip time multiplied by the speed of light and divided by two, accounting for the pulse traveling to the anatomical structure 401 and back.
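  • A minimal sketch of that time-of-flight conversion (the example pulse timing is an illustrative assumption):

```python
# Sketch of the time-of-flight conversion: the pulse travels to the tissue and back,
# so the one-way distance is half the round-trip time multiplied by the speed of light.
SPEED_OF_LIGHT_MM_PER_S = 2.998e11   # speed of light in millimetres per second

def tof_distance_mm(round_trip_s):
    """Return the scope-to-tissue distance for a measured round-trip time in seconds."""
    return round_trip_s * SPEED_OF_LIGHT_MM_PER_S / 2.0

# Example: a round trip of ~133 picoseconds corresponds to roughly 20 mm.
d_mm = tof_distance_mm(133e-12)
```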
  • the computer system may not present the indicators on a display until the distance between the endoscope 302 and the anatomical structure 401 falls below a threshold (e.g., a few millimeters).
  • the computer system may continue to determine and update the distance between the endoscope 302 and the anatomical structure 401 as the endoscope 302 moves.
  • the computer system may compare the distance to the threshold. If the distance exceeds the threshold, the computer system may refrain from presenting the overlay on the video, which causes the indicator to not be visible. When the distance falls below the threshold, the computer system may present the overlay on the video to show the indicator.
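  • A minimal sketch of that distance gate (the threshold value is an assumption, and the overlay-drawing helper is the sketch shown earlier):

```python
# Sketch of gating the overlay on the estimated distance: the indicator is only drawn
# once the scope-to-tissue distance falls below a threshold. The threshold is assumed.
DISPLAY_THRESHOLD_MM = 5.0

def maybe_present_overlay(frame, pixel, distance_mm, draw_reticle):
    """Return the frame with the indicator only when the scope is close enough."""
    if distance_mm > DISPLAY_THRESHOLD_MM:
        return frame                       # too far: suppress the overlay entirely
    return draw_reticle(frame, pixel)      # close enough: present the indicator
```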
  • Figures 7A through 7E illustrate example operations 700A, 700B, 700C, 700D, and 700E performed by the computer system.
  • the endoscope 302 includes the first channel 304 and the second channel 306.
  • the endoscope 302 may include more channels or lumens for receiving instruments.
  • the instrument 310A has been placed in the first channel 304.
  • the instrument 310A is placed a distance d2 from the distal end of the endoscope 302.
  • the camera 320 of the endoscope 302 captures the video 322 of the anatomical structure 401 .
  • the video 322 is presented on the display system 110.
  • the computer system presents an indicator 422 on the display system 110.
  • the indicator 422 may be a circle.
  • the color of the indicator 422 may be white.
  • the color of the indicator 422 may indicate to the user that the endoscope 302 is too far from the anatomical structure 401 to accurately determine where on the anatomical structure 401 the instrument 310A is expected to contact when the instrument 310A is extended from the endoscope 302.
  • the endoscope 302 is moved closer to the anatomical structure 401.
  • the computer system presents an indicator 424 on the display system 110.
  • the indicator 424 may be a circle.
  • the color of the indicator 424 may be yellow.
  • the color of the indicator 424 may indicate to the user that the endoscope 302 is still too far from the anatomical structure 401 to establish an accurate estimate of where on the anatomical structure 401 the instrument 310A is expected to contact when the instrument 310A is extended from the endoscope 302.
  • the yellow color of the indicator 424 may indicate that the user is closer to establishing an accurate estimate.
  • the endoscope 302 is moved even closer to the anatomical structure 401.
  • the computer system presents an indicator 432 on the display system 110.
  • the indicator 432 may be a circle.
  • the color of the indicator 432 may be green or blue.
  • the color of the indicator 432 may indicate to the user that the endoscope 302 is close enough to the anatomical structure 401 to establish an accurate estimate of where on the anatomical structure 401 the instrument 310A is expected to contact when the instrument 310A is extended from the endoscope 302.
  • the user understands that the instrument 310A is likely to contact the portion of the anatomical structure 401 indicated by the indicator 432 when the instrument 310A is extended from the endoscope 302.
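One way the white/yellow/green indicator states described above could be driven by the measured distance; the numeric distance bands are illustrative assumptions, not values from the disclosure:

```python
def indicator_color(distance_mm: float) -> str:
    """Map endoscope-to-tissue distance to the indicator color states above."""
    if distance_mm > 20.0:
        return "white"   # too far to estimate the contact point
    if distance_mm > 10.0:
        return "yellow"  # closer, but the estimate is not yet accurate
    return "green"       # close enough for an accurate estimate

for d in (30.0, 15.0, 6.0):
    print(d, indicator_color(d))
```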
  • the computer system generates an alert 720 when a distance between the endoscope 302 and the anatomical structure 401 allows the instrument 310A to contact the anatomical structure 401 when the instrument 310A is extended or deployed.
  • the alert 720 may be a visual alert or an auditory alert or a haptic alert.
  • the computer system may display the visual alert on the display system 110.
  • the auditory alert may use sound to alert the user to a notification.
  • the tone, volume, and duration of the sound may convey different levels of urgency.
  • the haptic alert may be haptic feedback, such as vibrations or tactile sensations that are exhibited by the endoscope 302.
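A small sketch of how the alert 720 could be represented and triggered once the structure is within instrument reach; the field names, reach value, and alert text are assumptions for illustration only:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Alert:
    kind: str      # "visual", "auditory", or "haptic"
    message: str
    urgency: str   # rendered as color, tone/volume/duration, or vibration strength

def reachability_alert(distance_mm: float, instrument_reach_mm: float) -> Optional[Alert]:
    """Return an alert once the anatomical structure is within instrument reach."""
    if distance_mm <= instrument_reach_mm:
        return Alert(kind="visual",
                     message="Instrument can contact the anatomical structure",
                     urgency="normal")
    return None

print(reachability_alert(distance_mm=8.0, instrument_reach_mm=10.0))
```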
  • the user need not deploy and retract the instrument from the endoscope a number of times or make several attempts to determine the portion of the anatomical structure that the instrument will contact.
  • the user will have confidence that the estimated target point is where the instrument will contact the surface of the anatomical structure, which reduces the number of deployment and retraction attempts needed to determine where the instrument will land on the surface of the anatomical structure.
  • the instrument 310A is deployed from the endoscope 302 to engage the surface of the anatomical structure 401.
  • the engagement point 715 may be at an edge of the indicator 442. In other examples, the engagement point 715 may be at the center of the indicator 442.
  • the engagement point 715 may be a series of points within the indicator 442.
  • the instrument 310A extends from the distal end or distal tip of the endoscope 302 to reach the engagement point 715.
  • the shape of the indicator 442 may change (e.g., to a square). The square shape may indicate to the user that the instrument 310A is contacting the anatomical structure 401.
  • the computer system adjusts the indicator 442 to indicate a depth of bite 730 of the instrument 310A on the anatomical structure 401. For example, the computer system may determine how far the instrument 310A has extended from the endoscope 302 and the amount of pressure that the instrument 310A is exerting on the anatomical structure 401. The computer system may then use this information to determine how deep of a bite the instrument 310A would make on the anatomical structure 401 if the instrument 310A were actuated. The computer system then adjusts the indicator 442 (e.g., the shape and/or color of the indicator 442) to indicate the depth of the bite. For example, one color or shape may indicate a shallow bite, while another color or shape may indicate a deep bite. A simple sketch of such an estimate is shown below.
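A simple sketch of such a depth-of-bite estimate; the linear extension/pressure model, its constants, and the shallow/deep threshold are illustrative assumptions rather than the disclosed method:

```python
def estimated_bite_depth_mm(extension_mm: float, pressure_n: float,
                            compliance_mm_per_n: float = 0.5) -> float:
    """Rough depth-of-bite estimate from instrument extension and applied pressure."""
    return max(0.0, 0.1 * extension_mm + compliance_mm_per_n * pressure_n)

def bite_indicator_style(depth_mm: float, shallow_threshold_mm: float = 2.0):
    """Pick an indicator (color, shape) pair for shallow versus deep bites."""
    return ("green", "square") if depth_mm < shallow_threshold_mm else ("red", "square")

depth = estimated_bite_depth_mm(extension_mm=8.0, pressure_n=1.5)
print(round(depth, 2), bite_indicator_style(depth))  # 1.55 ('green', 'square')
```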
  • Figure 8 illustrates an example operation 800 performed by the computer system.
  • the instrument 310A may be a tissue grabber (or another instrument) that is deployed from the endoscope 302 to grab or pull tissue.
  • the tissue is pulled a distance ds by the instrument 310A.
  • the tissue pull distance 810 may indicate the amount of tissue that is displaced or drawn away from its original position when it is manipulated by the instrument 310A.
  • the tissue is pulled to a new position 802.
  • the indicator 805 may be a star.
  • the star shape may indicate to the user that the instrument 310A is contacting the anatomical structure 401 .
  • the computer system may adjust the indicator 805 (e.g., the size, color, or shape of the indicator 805) to indicate the tissue pull distance 810.
  • the computer system may use a first color for the indicator 805 when the tissue has been pulled a short distance.
  • the computer may change the color of the indicator 805 when the tissue has been pulled a greater distance. In this manner, the computer system uses the indicator 805 to indicate how far the tissue has been pulled by the instrument 310A.
  • the computer system uses a triangulation technique to determine the pull distance. For example, the computer system may determine the position of the instrument 310A on the anatomical structure 401. The instrument 310A is then actuated to grab tissue, and the tissue is then pulled towards the endoscope 302 by pulling the instrument 310A back. As the tissue is pulled towards the endoscope 302, the computer system may update the position or location of the instrument 310A. The computer system may then use the change in position of the instrument 310A as the pull distance. In the example of Figure 8, the computer system determines the pull distance to be ds. A simple position-difference sketch is shown below.
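A simple position-difference sketch of the pull-distance estimate described above; the point representation and example coordinates are assumptions:

```python
import math
from dataclasses import dataclass

@dataclass
class Point3D:
    x: float
    y: float
    z: float

def pull_distance_mm(grab_position: Point3D, current_position: Point3D) -> float:
    """Distance the tissue has been drawn from its original position, taken as
    the tracked change in the instrument tip position."""
    return math.dist((grab_position.x, grab_position.y, grab_position.z),
                     (current_position.x, current_position.y, current_position.z))

# Tissue grabbed at (0, 0, 12) mm and pulled back to (0, 0, 7) mm: a 5 mm pull.
print(pull_distance_mm(Point3D(0, 0, 12), Point3D(0, 0, 7)))
```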
  • a machine learning model may be trained using annotated data to predict where the instrument (e.g., the tissue grabber) will engage the anatomical structure based on a variety of factors or variables.
  • the machine learning model may be built using data collection, data annotation, data preprocessing, feature extraction, model selection, and model training.
  • the data collection step may include gathering a dataset of endoscopic videos or images of previous successful digital overlay attempts.
  • the machine learning model may be trained using the videos or images to estimate the target point on the anatomical structure where the instrument (e.g., helix) will contact the anatomical structure.
  • the computer system may then apply the machine learning model to the video captured by the camera of the endoscope to predict where the instrument will contact the anatomical structure when the instrument is extended from the endoscope.
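A minimal, self-contained stand-in for the training and prediction steps described above, using a nearest-neighbour lookup in place of the (unspecified) disclosed model; the feature keys and example values are assumptions:

```python
import math

def train_target_point_model(annotated_frames):
    """'Training' here just stores (feature vector, annotated target pixel) pairs."""
    return [(frame["features"], frame["target_pixel"]) for frame in annotated_frames]

def predict_target_pixel(model, features):
    """Return the target pixel of the most similar annotated frame."""
    _, pixel = min(model, key=lambda pair: math.dist(pair[0], features))
    return pixel

dataset = [
    {"features": [0.2, 0.8], "target_pixel": (120, 340)},
    {"features": [0.7, 0.1], "target_pixel": (400, 210)},
]
model = train_target_point_model(dataset)
print(predict_target_pixel(model, [0.25, 0.75]))  # -> (120, 340)
```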
  • the computer system may use a neural network to determine where sutures should be applied on the anatomical structure to be consistent with existing best medical practices.
  • the computer system then includes in the overlay indicators that indicate the positioning of the sutures.
  • the computer system may then position the overlay (e.g., on a display) over a video so the healthcare provider can see where the sutures should be applied on the display in a specific orientation, pattern, or even step-by-step layout.
  • the overlay may present visual indicators on the video to indicate where the sutures should be applied.
  • the computer system may also present audio and textual messages or indicators that inform the healthcare provider where to place the sutures.
  • the healthcare provider may then operate a tool to apply a suture at a position indicated in the overlay.
  • the computer system may use the neural network to analyze the video to determine a position and direction of the applied suture.
  • the computer system may then update the locations and directions of subsequent sutures in the overlay to account for the changes caused by the applied suture. In this manner, the computer system guides the suturing or stapling process.
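A toy sketch of this suture-guidance loop (plan positions, then re-plan after each applied suture); the fixed-spacing planner and constant tissue shift stand in for the neural-network behavior described above and are purely illustrative:

```python
def plan_sutures(structure_outline, spacing=5):
    """Place planned suture points at a fixed index spacing along an outline."""
    return [structure_outline[i] for i in range(0, len(structure_outline), spacing)]

def update_plan(remaining, applied_suture, shift=(0.0, 1.0)):
    """After a suture is applied, drop it from the plan and shift the remaining
    positions to account for the tissue displacement it caused."""
    dx, dy = shift
    return [(x + dx, y + dy) for (x, y) in remaining if (x, y) != applied_suture]

outline = [(float(x), 0.0) for x in range(40)]
plan = plan_sutures(outline)
plan = update_plan(plan, applied_suture=plan[0])
print(plan[:3])  # next planned positions, shifted after the first suture
```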
  • the computer system collects data about the procedure.
  • the computer system may track the pull locations.
  • the computer system may collect images or pictures (e.g., of the anatomical structure) that show the effect of the procedure on the anatomical structure.
  • the computer system may collect follow-up data for the patient that shows the effectiveness of the procedure.
  • the computer system uses the collected data to train or update the artificial intelligence (e.g., the neural network) used by the computer system during the pre-operative and intra-operative stages. In this manner, the computer system uses the information from the procedure to inform subsequent procedures.
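One plausible way to record these procedure events for later retraining, written as JSON lines; the file name, event types, and record layout are assumptions:

```python
import json
from datetime import datetime, timezone

def log_procedure_event(log_path: str, event_type: str, payload: dict) -> None:
    """Append one procedure event (pull location, image reference, follow-up
    outcome, ...) so it can later feed model retraining."""
    record = {"time": datetime.now(timezone.utc).isoformat(),
              "type": event_type,
              "data": payload}
    with open(log_path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

log_procedure_event("procedure_log.jsonl", "pull_location",
                    {"x": 120, "y": 340, "pull_mm": 4.8})
```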
  • the user may use a manual process for determining where the instrument will contact the anatomical structure when the instrument is extended from the endoscope.
  • the user may first position a light source (e.g., a laser) in a channel of the endoscope.
  • the laser may project a beam onto the anatomical structure from the channel.
  • the user may view the image or video from the camera of the endoscope to determine where the beam is projected onto the anatomical structure.
  • the user may stop moving the endoscope and extract the laser from the channel.
  • while the endoscope is held steady, the user may insert the instrument into the channel and then extend the instrument from the channel.
  • the present disclosure describes a medical system that guides a user operating or handling an instrument to accurately predict where the instrument will engage a surface of an anatomical structure.
  • the system displays an indicator (e.g., a reticle) that shows where the instrument is expected to contact an anatomical structure when the instrument is extended or deployed from an endoscope.
  • the system receives, from the endoscope, a video of the anatomical structure and determines pixels of the video that show a portion of the anatomical structure where the instrument will contact the anatomical structure when the instrument is extended or deployed.
  • the system then generates an overlay with the indicator and presents the overlay on the video such that the indicator is positioned on the pixels.
  • the system may be expanded to provide any number of indicators that show where any number of instruments will engage or contact the surface of the anatomical structure.
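A compact sketch of positioning an indicator over the pixels where contact is expected: build a mask the size of the video frame and mark a small disc around the estimated contact pixel. The frame size, radius, and coordinates are illustrative assumptions:

```python
def render_overlay(frame_shape, target_pixel, radius=6):
    """Boolean mask marking the pixels around the estimated contact point,
    so a reticle-style indicator can be drawn there on the video."""
    rows, cols = frame_shape
    tr, tc = target_pixel
    return [[(r - tr) ** 2 + (c - tc) ** 2 <= radius ** 2 for c in range(cols)]
            for r in range(rows)]

mask = render_overlay((480, 640), target_pixel=(240, 320))
print(sum(sum(row) for row in mask))  # number of highlighted pixels (113)
```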
  • spatially relative terms such as “beneath”, “below”, “lower”, “above”, “upper”, “proximal”, “distal”, and the like may be used to describe one element’s or feature’s relationship to another element or feature as illustrated in the figures.
  • These spatially relative terms are intended to encompass different positions (i.e., locations) and orientations (i.e., rotational placements) of the elements or their operation in addition to the position and orientation shown in the figures. For example, if the content of one of the figures is turned over, elements described as “below” or “beneath” other elements or features would then be “above” or “over” the other elements or features.
  • the exemplary term “below” can encompass both positions and orientations of above and below.
  • a device may be otherwise oriented (rotated 90 degrees or at other orientations) and the spatially relative descriptors used herein interpreted accordingly.
  • descriptions of movement along and around various axes include various spatial element positions and orientations.
  • the singular forms “a”, “an”, and “the” are intended to include the plural forms as well, unless the context indicates otherwise.
  • the terms “comprises”, “comprising”, “includes”, and the like specify the presence of stated features, steps, operations, elements, and/or components but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, and/or groups.
  • Components described as coupled may be electrically or mechanically directly coupled, or they may be indirectly coupled via one or more intermediate components.
  • the term “position” refers to the location of an element or a portion of an element in a three-dimensional space (e.g., three degrees of translational freedom along Cartesian x-, y-, and z-coordinates).
  • the term “orientation” refers to the rotational placement of an element or a portion of an element (three degrees of rotational freedom - e.g., roll, pitch, and yaw).
  • shape refers to a set of positions or orientations measured along an element.
  • proximal refers to a direction toward the base of the computer-assisted device along its kinematic chain and distal refers to a direction away from the base along the kinematic chain.
  • aspects of this disclosure are described in reference to computer-assisted systems and devices, which may include systems and devices that are teleoperated, remote-controlled, autonomous, semiautonomous, robotic, and/or the like. Further, aspects of this disclosure are described in terms of an embodiment using a medical system, such as the DA VINCI SURGICAL SYSTEM or ION SYSTEM commercialized by Intuitive Surgical, Inc. of Sunnyvale, California. Knowledgeable persons will understand, however, that aspects disclosed herein may be embodied and implemented in various ways, including robotic and, if applicable, non-robotic embodiments. Techniques described with reference to surgical instruments and surgical methods may be used in other contexts.
  • the instruments, systems, and methods described herein may be used for humans, animals, portions of human or animal anatomy, industrial systems, general robotic, or teleoperational systems.
  • the instruments, systems, and methods described herein may be used for non-medical purposes including industrial uses, general robotic uses, sensing or manipulating non-tissue work pieces, cosmetic improvements, imaging of human or animal anatomy, gathering data from human or animal anatomy, setting up or taking down systems, training medical or non-medical personnel, and/or the like.
  • Additional example applications include use for procedures on tissue removed from human or animal anatomies (with or without return to a human or animal anatomy) and for procedures on human or animal cadavers. Further, these techniques can also be used for medical treatment or diagnosis procedures that include, or do not include, surgical aspects.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Animal Behavior & Ethology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Radiology & Medical Imaging (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Biophysics (AREA)
  • Robotics (AREA)
  • Signal Processing (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Gynecology & Obstetrics (AREA)
  • Endoscopes (AREA)

Abstract

The present disclosure relates to a system and method for guiding an instrument toward a target point on a surface of an anatomical structure. The system receives, from an endoscope, a video of an anatomical structure, determines a pixel of the video that shows a portion of the anatomical structure where an instrument positioned in the endoscope contacts the anatomical structure when the instrument extends from the endoscope, generates a first overlay comprising a first indicator, and presents the first overlay on the video such that the first indicator is positioned over the pixel.
PCT/US2025/026442 2024-04-29 2025-04-25 Estimation assistée par ordinateur de réseau de recouvrement cible Pending WO2025230837A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202463639970P 2024-04-29 2024-04-29
US63/639,970 2024-04-29

Publications (1)

Publication Number Publication Date
WO2025230837A1 true WO2025230837A1 (fr) 2025-11-06

Family

ID=95784207

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2025/026442 Pending WO2025230837A1 (fr) 2024-04-29 2025-04-25 Estimation assistée par ordinateur de réseau de recouvrement cible

Country Status (1)

Country Link
WO (1) WO2025230837A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200054400A1 (en) * 2017-04-26 2020-02-20 Olympus Corporation Image processing apparatus, operating method of image processing apparatus, and computer-readable recording medium
JP2020516408A (ja) * 2017-04-13 2020-06-11 ブイ.ティー.エム.(バーチャル テープ メジャー)テクノロジーズ リミテッド 内視鏡測定の方法および器具
WO2023052938A1 (fr) * 2021-09-29 2023-04-06 Cilag Gmbh International Procédés et systèmes de commande d'instruments chirurgicaux coopératifs
KR20230113360A (ko) * 2020-11-30 2023-07-28 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 로봇 보조형 시스템을 위한 사용자 인터페이스에서합성 표시자들을 제공하는 시스템
WO2024013651A1 (fr) * 2022-07-13 2024-01-18 Auris Health, Inc. Entraînement de scope flexible dynamique et ses procédés d'utilisation

Similar Documents

Publication Publication Date Title
US20240374334A1 (en) Systems and methods for adaptive input mapping
JP6793780B2 (ja) カテーテルの位置付け及び挿入のためのグラフィカル・ユーザインターフェイス
CN112004496B (zh) 与细长装置有关的系统和方法
CN109069217B (zh) 图像引导外科手术中的姿势估计以及透视成像系统的校准的系统和方法
JP7662221B2 (ja) 解剖学的境界を規定するためのグラフィカルユーザインターフェイス
KR102475654B1 (ko) 중재 시술 계획을 위한 시스템 및 방법
KR20190014112A (ko) 영상 안내식 시술 중에 복수의 모드에서 안내 정보를 디스플레이하기 위한 그래픽 사용자 인터페이스
KR20220065894A (ko) 수술중 세그먼트화를 위한 시스템 및 방법
US20240238049A1 (en) Medical instrument guidance systems and associated methods
US20250245826A1 (en) Systems and methods for connecting segmented structures
WO2019032450A1 (fr) Systèmes et procédés de rendu d'alertes sur un écran d'un système de téléopération
WO2025230837A1 (fr) Estimation assistée par ordinateur de réseau de recouvrement cible
CN120435263A (zh) 用于引导式工具改变弹性的系统和方法
US20240070875A1 (en) Systems and methods for tracking objects crossing body wallfor operations associated with a computer-assisted system
US11850004B2 (en) Systems and methods for determining an arrangement of explanted tissue and for displaying tissue information
US20250318816A1 (en) Systems and methods for arranging channels of an elongate flexible device
US20250235287A1 (en) Computer-assisted distance measurement in a surgical space
WO2025212319A1 (fr) Annotation assistée par ordinateur de structures de sous-surface sur une surface
US20240350121A1 (en) Systems and methods for three-dimensional imaging
WO2025101531A1 (fr) Pré-rendu graphique de contenu virtuel pour systèmes chirurgicaux
WO2025064440A1 (fr) Enregistrement et suivi assistés par ordinateur de modèles d'objets anatomiques
KR20230110583A (ko) 기구 롤에 대한 시각화 조절
CN119584912A (zh) 形状定位柔性器械
JPWO2022112969A5 (fr)

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 25727117

Country of ref document: EP

Kind code of ref document: A1