WO2025188850A1 - Autonomous lithotripsy device and methods of displaying associated corrective actions - Google Patents

Autonomous lithotripsy device and methods of displaying associated corrective actions

Info

Publication number
WO2025188850A1
WO2025188850A1 (PCT/US2025/018498)
Authority
WO
WIPO (PCT)
Prior art keywords
catheter
image
ROL
size
lithotripsy
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2025/018498
Other languages
English (en)
Other versions
WO2025188850A8 (fr)
Inventor
Lampros Athanasiou
Franklin King
Nobuhiko Hata
Satoshi Kobayashi
Fumitaro Masaki
Daniel Arthur WOLLIN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Brigham and Womens Hospital Inc
Canon USA Inc
Original Assignee
Brigham and Womens Hospital Inc
Canon USA Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Brigham and Womens Hospital Inc, Canon USA Inc filed Critical Brigham and Womens Hospital Inc
Publication of WO2025188850A1 publication Critical patent/WO2025188850A1/fr
Publication of WO2025188850A8 publication Critical patent/WO2025188850A8/fr
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B 18/18 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by applying electromagnetic radiation, e.g. microwaves
    • A61B 18/20 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by applying electromagnetic radiation, e.g. microwaves using laser
    • A61B 18/22 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by applying electromagnetic radiation, e.g. microwaves using laser the beam being directed along or through a flexible conduit, e.g. an optical fibre; Couplings or hand-pieces therefor
    • A61B 18/26 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body by applying electromagnetic radiation, e.g. microwaves using laser the beam being directed along or through a flexible conduit, e.g. an optical fibre; Couplings or hand-pieces therefor for producing a shock wave, e.g. laser lithotripsy
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A61B 34/32 Surgical robots operating autonomously
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods
    • A61B 17/00234 Surgical instruments, devices or methods for minimally invasive surgery
    • A61B 2017/00292 Surgical instruments, devices or methods for minimally invasive surgery mounted on or guided by flexible, e.g. catheter-like, means
    • A61B 2017/003 Steerable
    • A61B 2017/00305 Constructional details of the flexible means
    • A61B 2017/00314 Separate linked members
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 17/00 Surgical instruments, devices or methods
    • A61B 17/00234 Surgical instruments, devices or methods for minimally invasive surgery
    • A61B 2017/00292 Surgical instruments, devices or methods for minimally invasive surgery mounted on or guided by flexible, e.g. catheter-like, means
    • A61B 2017/003 Steerable
    • A61B 2017/00318 Steering mechanisms
    • A61B 2017/00323 Cables or rods
    • A61B 2017/00327 Cables or rods with actuating members moving in opposite directions
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B 2018/00315 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body for treatment of particular body parts
    • A61B 2018/00505 Urinary tract
    • A61B 2018/00511 Kidney
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B 2018/00571 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body for achieving a particular surgical effect
    • A61B 2018/00577 Ablation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B 2018/00636 Sensing and controlling the application of energy
    • A61B 2018/00904 Automatic detection of target tissue
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B 2018/00982 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body combined with or comprising means for visual or photographic inspections inside the body, e.g. endoscopes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2051 Electromagnetic tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/30 Surgical robots
    • A61B 2034/301 Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/30 Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
    • A61B 2090/306 Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure using optical fibres
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/30 Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
    • A61B 2090/309 Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure using white LEDs
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A61B 2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A61B 2090/376 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
    • A61B 2090/3762 Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT]
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B 1/00 - A61B 50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/361 Image-producing devices, e.g. surgical cameras

Definitions

  • Endoscopes are generally composed of a passive proximal section and an active distal section.
  • the proximal passive section can be rigid, semi-rigid, or flexible.
  • the active distal section includes a steerable tip that is manually controlled or remotely actuated by control wires connected to actuation wheels located on the handle of the device.
  • Endoscopes are typically equipped with an imaging device (a camera), a light source, irrigation and/or suction channels, and at least one instrument channel for passing interventional tools.
  • Typical endoscopic instruments have at least three degrees of freedom (3DOF) which allow for insertion, rotation and grasping operations.
  • a physician may insert a ureteroscope and guide the ureteroscope to a position identified as being located near a urinary stone.
  • US 2021/0298590 to Ayvali discusses an application that records urinary stone locations, identifies target papilla, and records position(s) of target papilla.
  • US 2004/0249267 to Gilboa discusses a system to facilitate navigation to a target within a branched structure, e.g., the bronchial tree, to guide a medical tool to the target.
  • Ureteroscopy for transurethral lithotripsy may be performed with a robotic ureteroscope or continuum robot that includes a bending section with a flexible body.
  • A physician may control the bending section of the robotic ureteroscope via a joystick to navigate through the urinary system.
  • A type of continuum robot is discussed by US 11,685,046 to Takagi et al. Gauhar, V., et al., Robotic Retrograde Intrarenal Surgery: A Journey from “Back to the Future”, Journal of Clinical Medicine (Vol. 11, Issue 18) 2022, discusses robotic platforms available for flexible ureteroscopy, including a robotic device that uses a manual ureteroscope for recording and replaying input.
  • Kidney stones are mineral deposits found in the renal pelvis and calyces. They affect around 5% of women and 12% of men in America, can cause nausea, vomiting, pain, and hematuria, and the patient may develop an infection. Diagnosis of kidney stones is based on radiological imaging, and treatment varies from mild measures, such as analgesics or antibiotics if infection occurs, to more intense interventions such as lithotripsy. Lithotripsy is the physical destruction of kidney stones and is a more permanent treatment. Depending on the stone size, different lithotripsy techniques have been developed.
  • Extracorporeal shockwave therapy (ESWT) is a non-invasive lithotripsy method in which shock waves generated outside the body are focused upon the stone at a rate of one or more per second. The stone is transformed into fragments which are small enough to pass through the urethra.
  • ESWT lithotripsy can be applied to kidney stones smaller than 2 cm, while laser lithotripsy was introduced for stones larger than 2 cm.
  • Laser lithotripsy is a minimally invasive procedure typically performed by a urologist for treatment of a urinary tract stone by fragmentation. The ureteroscope is passed through the urethra and bladder and up the ureter to the point where the stone is located.
  • Laser lithotripsy may use a ureteroscope that includes a flexible laser fiber and a camera that can show live images. The camera is used to visually target the stone and the laser is used to physically break the stone into fragments. The fragments can then be removed using a basket-like instrument.
  • Laser lithotripsy has several advantages, such as reduced recovery time, effectiveness, and versatility, since it can be applied to multiple types of stones. On the other hand, disadvantages may include difficulty in retrieving fragments and the risk of complications during the procedure. Those risks involve injury to the surrounding tissues, infections, or even damage to the ureter. A factor that can cause clinical complications and prolong operation time is the retro-pulsive movement of the stone during laser ablation.
  • A lithotripsy apparatus (Shelton) is discussed that includes a lithotripsy wave guide shaft configured to transmit energy to a urinary tract stone, in which a lithotripter collects signal data, provides feedback to a user, and determines: whether the lithotripsy wave guide shaft is in contact with a tissue; whether the lithotripsy wave guide shaft is in contact with a stone; a type of stone; whether a user is applying force in excess of a predetermined threshold; and physical characteristics of a stone.
  • In the Shelton apparatus, however, the laser must touch the tissue. Also, Shelton does not recognize when the tissue is being traumatized.
  • A ureteroscope is discussed that includes an elongate flexible shaft; a camera at a distal end of the shaft; and an image processing module coupled with the ureteroscope that includes a console with a processor for receiving imaging data including a first image that includes a plurality of objects, including a first subset of the plurality of objects that obstructs visibility of one or more objects of a second subset of the plurality of objects.
  • The processor generates a second image including the second subset of the plurality of objects visibly unobstructed, and renders the first image or the second image on a display.
  • Gupta et al., Multi-class motion-based semantic segmentation for ureteroscopy and laser lithotripsy, discusses automated segmentation of kidney stones and the laser fiber for performing automated quantitative analysis, particularly stone-size estimation, that a surgeon can use to decide if a stone requires further fragmentation. Stone detection is a crucial step of this autonomous process and involves the automatic segmentation thereof.
  • Gupta discusses factors such as turbid fluid inside the cavity, specularities, motion blur due to kidney movements and camera motion, bleeding, and stone debris as impacting the quality of vision within the kidney, that may lead to extended operative times.
  • Gupta shows the stone in a ureteroscopic video in order to perform accurate measurements. However, Gupta does not segment an image during lithotripsy.
  • An aspect of the present disclosure provides a method for performing laser lithotripsy, with the method including inserting a catheter into a lumen; navigating the catheter through the lumen along an insertion trajectory; obtaining at least one image of one or more objects within the lumen; segmenting the at least one image; detecting at least one object of the one or more objects having a size exceeding a threshold as at least one target object; and defining a region of lasing (ROL) of the at least one target object, aligning a tip of the catheter with the ROL, and performing lithotripsy.
  • A further aspect of the present disclosure provides a method for performing lithotripsy, the method including inserting a catheter into a lumen; navigating the catheter through the lumen along an insertion trajectory; obtaining at least one image of one or more objects within the lumen; segmenting the at least one image; identifying at least two targets among the segmented at least one image; identifying at least one parameter of the targets; prioritizing the targets based on at least one parameter of each target; defining an ROL for a highest priority target; aligning a tip of the catheter with the ROL; and performing lithotripsy.
  • Yet another aspect of the present disclosure provides an information processing apparatus to control a steerable catheter, the information processing apparatus including at least one memory configured to store instructions; and at least one processor configured to execute the stored instructions to cause the steerable catheter to: obtain at least one image of one or more objects within a lumen; segment the at least one image; detect at least one object of the one or more objects having a size exceeding a threshold as at least one target object; and define an ROL of the at least one target object, align a catheter tip with the ROL, activate a laser on a distal end of the steerable catheter, and perform lithotripsy.
  • A further aspect of the present disclosure provides an autonomous navigation robot system that includes a steerable catheter; a camera at the distal end of the steerable catheter; one or more actuators configured to automatically move the steerable catheter; and a controller that is configured to: obtain, from the camera, at least one image of an object within a lumen; segment the at least one image; determine a size of the object; and in response to the size of the object exceeding or being equal to a predetermined size: detect an ROL of the object, align the catheter tip with the ROL, activate a laser on a distal end of the steerable catheter, and perform lithotripsy.
  • FIG. 1 illustrates a simplified representation of a medical environment, such as an operating room, where a robotic catheter system can be used.
  • FIG. 2 illustrates a functional block diagram of the robotic catheter system.
  • FIG. 3 represents the catheter and bending thereof.
  • FIG. 4 is a block diagram illustrating components of the robotic catheter system.
  • FIG. 5 is a block diagram illustrating components of the system controller and/or the display controller.
  • FIG. 6 illustrates a method of performing autonomous lithotripsy according to an embodiment.
  • FIGS. 7A-7D illustrate kidney and stone phantoms for validating data for an autonomous stone ablation method according to an embodiment.
  • FIGS. 8A-8C are images derived from the recorded videos of extraction of a phantom stone according to an embodiment.
  • FIG. 9 is an image dataset used to train a U-net algorithm according to an embodiment.
  • FIG. 10 illustrates a method of autonomous ablation that is performed upon reaching the stone according to an embodiment.
  • FIGS. 11A and 11B provide results of stone target evaluation according to an embodiment.
  • FIGS. 12A and 12B present application examples of the stone targeting procedure according to an embodiment.
  • FIG. 13 schematically presents autonomous movement of the catheter, according to an embodiment.
  • DETAILED DESCRIPTION: Aspects of the present disclosure can be understood by reading the following detailed description in light of the accompanying figures. It is noted that, in accordance with standard practice, the various features of the drawings are not drawn to scale and do not represent actual components. Details such as dimensions of the various features may be arbitrarily increased or reduced for ease of illustration. In addition, reference numerals, labels and/or letters are repeated in the various examples to depict similar components and/or functionality. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.
  • a numeric value may have a value that is +/- 0.1% of the stated value (or range of values), +/-1% of the stated value (or range of values), +/-2% of the stated value (or range of values), +/-5% of the stated value (or range of values), +/-10% of the stated value (or range of values), etc. Any numerical range, if recited herein, is intended to be inclusive of end values and includes all sub-ranges subsumed therein, unless specifically stated otherwise.
  • the term “substantially” is meant to allow for deviations from the descriptor that do not negatively affect the intended purpose. For example, deviations that are from limitations in measurements, differences within manufacture tolerance, or variations of less than 5% can be considered within the scope of substantially the same.
  • the specified descriptor can be an absolute value (e.g. substantially spherical, substantially perpendicular, substantially concentric, etc.) or a relative term (e.g. substantially similar, substantially the same, etc.).
  • Real time refers to a level of computer responsiveness that a user senses as sufficiently immediate or that enables the computer to keep up with some external process.
  • real-time refers to the actual time during which something takes place and the computer may at least partly process the data in real time (as it comes in).
  • Real-time processing relates to a system in which input data is processed within milliseconds so that it is available virtually immediately as feedback, e.g., in missile guidance, an airline booking system, or real-time stock market quotes (RTQs).
  • The present disclosure generally relates to medical devices, and it exemplifies embodiments of an endoscope or catheter, and more particularly a steerable catheter controlled by a medical continuum robot (MCR).
  • the embodiments of the endoscope or catheter and portions thereof are described in terms of their state in a three-dimensional space.
  • the term “position” refers to the location of an object or a portion of an object in a three-dimensional space (e.g., three degrees of translational freedom along Cartesian X, Y, Z coordinates);
  • the term “orientation” refers to the rotational placement of an object or a portion of an object (three degrees of rotational freedom--e.g., roll, pitch, and yaw);
  • the term “posture” refers to the position of an object or a portion of an object in at least one degree of translational freedom and to the orientation of that object or portion of object in at least one degree of rotational freedom (up to six total degrees of freedom);
  • the term “shape” refers to a set of postures, positions, and/or orientations measured along the elongated body of the object.
  • proximal and distal are used with reference to the manipulation of an end of an instrument extending from the user to a surgical or diagnostic site.
  • proximal refers to the portion (e.g., a handle) of the instrument closer to the user
  • distal refers to the portion (tip) of the instrument further away from the user and closer to a surgical or diagnostic site.
  • spatial terms such as “vertical”, “horizontal”, “up”, and “down” may be used herein with respect to the drawings.
  • surgical instruments are used in many orientations and positions, and these terms are not intended to be limiting and/or absolute.
  • the term “catheter” generally refers to a flexible and thin tubular instrument made of medical grade material designed to be inserted through a narrow opening into an anatomical bodily lumen (e.g., an airway or a vessel) to perform a broad range of medical functions.
  • steerable catheter refers to a medical instrument comprising an elongated flexible shaft having at least one tool channel spanning through a plurality of bendable segments that are actuated by an actuator that applies an actuation force via drive wires arranged along a wall of the shaft.
  • endoscope refers to a rigid or flexible medical instrument which uses light guided by an optical probe to look inside a body cavity or organ. A medical procedure in which an endoscope is inserted through a natural opening is called an endoscopy.
  • Specialized endoscopes are generally named for how or where the endoscope is intended to be used, such as the sigmoidoscope (rectum), cystoscope (bladder), nephroscope (kidney), bronchoscope (bronchi), laryngoscope (larynx), otoscope (ear), arthroscope (joint), laparoscope (abdomen), and gastrointestinal endoscopes.
  • the terms “optical fiber”, “fiber optic”, or simply “fiber” refer to an elongated, flexible, light-conducting waveguide capable of conducting light from one end to another end due to the effect known as total internal reflection.
  • FIG. 1 illustrates a simplified representation of a medical environment, such as an operating room, where a robotic catheter system 100 can be used.
  • FIG. 2 illustrates a functional block diagram of the robotic catheter system 100.
  • FIG. 3 represents the catheter and bending thereof.
  • FIG. 4 illustrates a logical block diagram of the robotic catheter system 100.
  • the system 100 includes a system console 102 operatively connected to a steerable catheter / ureteroscope 104 via a robotic platform 106.
  • the robotic platform 106 includes one or more than one robotic arm 108 and a linear translation stage 110.
  • A user 112 (e.g., a physician) operates the robotic catheter system 100 through a user interface.
  • the user interface may include at least one of a main display 118 (a first user interface unit), a secondary display 120 (a second user interface unit), and a handheld controller 124 (a third user interface unit).
  • the main display 118 may include, for example, a large display screen attached to the system console 102 or mounted on a wall of the operating room and may be, for example, designed as part of the robotic catheter system 100 or be part of the operating room equipment.
  • The system may also include a secondary display 120, which is a compact (portable) display device configured to be removably attached to the robotic platform 106. Examples of the secondary display 120 include a portable tablet computer or a mobile communication device (a cellphone).
  • the steerable catheter 104 is actuated via an actuator unit 122.
  • the actuator unit 122 is removably attached to the linear translation stage 110 of the robotic platform 106.
  • the handheld controller 124 may include a gamepad-like controller with a joystick having shift levers and/or push buttons.
  • the actuator unit 122 may be a one-handed controller or a two-handed controller.
  • the actuator unit 122 is enclosed in a housing having a shape of a catheter handle.
  • One or more access ports 126 are provided in or around the catheter handle.
  • the access port 126 is used for inserting and/or withdrawing end effector tools and/or fluids when performing an interventional procedure of the patient 114.
  • the system console 102 includes a system controller 128, a display controller 130, and the main display 118.
  • the main display 118 may include a conventional display device such as a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, a quantum dot light-emitting diode (QLED) display or the like.
  • the main display 118 provides a graphical user interface (GUI) configured to display one or more views. These views may include a live view image 132, an intraoperative image 134, a preoperative image 136, and other procedural information 138. Other views that may be displayed include a model view, a navigational information view, and/or a composite view.
  • the live image view 132 may be an image from a camera at the tip of the catheter. This view may also include, for example, information about the perception and navigation of the catheter 104.
  • the preoperative image 136 may include pre-acquired three-dimensional (3D) or two-dimensional (2D) medical images of the patient 114 acquired by conventional imaging modalities such as computed tomography (CT), magnetic resonance imaging (MRI), or ultrasound imaging.
  • the intraoperative image 134 may include images used for an image-guided procedure; such images may be acquired by fluoroscopy or CT imaging modalities. The intraoperative image 134 may be augmented, combined, or correlated with information obtained from a sensor, camera image, or catheter data.
  • the sensor may be located at the distal end of the catheter.
  • the catheter tip tracking sensor 140 may be, for example, an electromagnetic (EM) sensor. If an EM sensor is used, a catheter tip position detector 142 may be included in the robotic catheter system 100; this catheter tip position detector 142 would include an EM field generator operatively connected to the system controller 128.
  • FIG. 2 illustrates that the robotic catheter system 100 includes the system controller 128 operatively connected to the display controller 130, which is connected to the display unit 118, and to the handheld controller 124.
  • the system controller 128 is also connected to the actuator unit 122 via the robotic platform 106, which includes the linear translation stage 110.
  • the actuator unit 122 includes a plurality of motors that control a plurality of drive wires 160. These drive wires travel through the steerable catheter 104.
  • One or more access ports 126 may be located on the catheter.
  • the catheter includes a proximal section 148, located between the actuator unit and the proximal bending section 152, through which the drive wires 160 pass; some of these drive wires terminate at and actuate the proximal bending section.
  • Three of the six drive wires 160 continue through the distal bending section 156 where they actuate this section and allow for a range of movement.
  • at least two bendable sections (152 and 156) may be provided.
  • Other embodiments as described herein can have three bendable sections.
  • a single bending section may be provided, or alternatively, four or more bendable sections may be present in the catheter.
  • FIG. 3 shows an exemplary embodiment of a steerable catheter 104.
  • the steerable catheter 104 includes a non-steerable proximal section 148, a steerable distal section, and a catheter tip 158.
  • the proximal section 148 and distal bendable section (including 152, 154 and 156) are joined to each other by a plurality of drive wires 160 arranged along the wall of the catheter.
  • the proximal section 148 is configured with thru-holes or grooves or conduits to pass drive wires 160 from the distal section to the actuator unit 122.
  • the distal section is comprised of a plurality of bending segments including at least a distal segment 156, a middle segment 154, and a proximal segment 152 to form a multi-section catheter.
  • Each bending segment is bent by actuation of at least some of the plurality of drive wires 160 (driving members).
  • the posture of the catheter may be supported by non-illustrated supporting wires (support members) also arranged along the wall of the catheter (see US 2021/0308423).
  • the proximal ends of drive wires 160 are connected to individual actuators or motors of the actuator unit 122, while the distal ends of the drive wires 160 are selectively anchored to anchor members in the different bending segments of the distal bendable section.
  • Each bending segment is formed by a plurality of ring-shaped components (rings) with thru-holes, grooves, or conduits along the wall of the rings.
  • the ring-shaped components are defined as wire-guiding members 162 or anchor members 164 depending on their function within the catheter.
  • Anchor members 164 are ring-shaped components onto which the distal end of one or more drive wires 160 are attached.
  • Wire-guiding members 162 are ring-shaped components through which some drive wires 160 slide through (without being attached thereto).
  • Detail “A” in FIG. 3 illustrates an exemplary embodiment of a ring-shaped component (a wire-guiding member 162 or an anchor member 164).
  • Each ring-shaped component includes a central opening which forms the tool channel 168, and plural conduits 166 (grooves, sub-channels, or thru-holes) arranged lengthwise equidistant from the central opening along the annular wall of each ring-shaped component.
  • an inner cover such as is described in US 2021/0369085 and US 2022/0126060, may be included to provide a smooth inner channel and provide protection.
  • the non-steerable proximal section 148 is a flexible tubular shaft and can be made of extruded polymer material.
  • the tubular shaft of the proximal section 148 also has a central opening or tool channel 168 and plural conduits 166 along the wall of the shaft surrounding the tool channel 168.
  • the actuator unit 122 includes one or more servo motors or piezoelectric actuators. The actuator unit 122 bends one or more of the bending segments of the catheter by applying a pushing and/or pulling force to the drive wires 160. As shown in FIG. 3, each of the three bendable segments of the steerable catheter 104 has a plurality of drive wires 160.
  • the steerable catheter 104 may have nine driving wires arranged along the wall of the catheter. Each bendable segment of the catheter is bent by the actuator unit 122 by pushing or pulling at least one of these nine drive wires 160. Force is applied to each individual drive wire in order to manipulate/steer the catheter to a desired pose.
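  • As an illustration only, the following minimal Python sketch shows the constant-curvature relation commonly used to convert a desired bend of one wire-driven segment into per-wire push/pull displacements; the wire radius, wire angular positions, function name, and units are assumptions and are not taken from this disclosure.

        import math

        def wire_displacements(theta, phi_bend, wire_angles, r):
            """Approximate length change for each drive wire of one bending segment
            under a constant-curvature assumption.
            theta       : desired bending angle of the segment (rad)
            phi_bend    : direction of the bending plane (rad)
            wire_angles : angular positions of the wires around the segment wall (rad)
            r           : radial distance of the wires from the segment centerline (m)
            Negative values mean the wire must be pulled; positive values pushed."""
            return [-r * theta * math.cos(phi - phi_bend) for phi in wire_angles]

        # Example: three wires spaced 120 degrees apart at a 1.5 mm radius,
        # segment bent 30 degrees toward phi = 0.
        print(wire_displacements(math.radians(30), 0.0,
                                 [0.0, 2 * math.pi / 3, 4 * math.pi / 3], 0.0015))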
  • the actuator unit 122 assembled with steerable catheter 104 is mounted on the linear translation stage 110.
  • Linear translation stage 110 includes a slider and a linear motor. In other words, the linear translation stage 110 is motorized, and can be controlled by the system controller 128 to insert and remove the steerable catheter 104 to/from the patient’s bodily lumen.
  • the catheter as described herein is the steerable multi-section catheter described in one or more of U.S. Patents 11,007,641; 11,051,892; 11,096,552; 11,278,366; 11,559,190; 11,622,828; 11,730,551; and 11,786,106; U.S.
  • An imaging device 170 that can be inserted through the tool channel 168 includes an endoscope camera (videoscope) along with illumination optics (e.g., optical fibers or LEDs).
  • the illumination optics provides light to irradiate the lumen and/or a lesion target which is a region of interest within the patient 114.
  • End effector tools refer to endoscopic surgical tools including clamps, graspers, scissors, staplers, ablation or biopsy needles, and other similar tools, which serve to manipulate body parts (organs or tumorous tissue) during examination or surgery.
  • the imaging device 170 may be what is commonly known as a chip-on-tip camera and may be color or black-and-white.
  • a laser may be inserted through the tool channel 168.
  • a tracking sensor 140 (e.g., an EM tracking sensor) is attached to the catheter tip 158.
  • The steerable catheter 104 and the tracking sensor 140 can be tracked by the tip position detector 142.
  • The tip position detector detects a position of the tracking sensor 140 and outputs the detected positional information to the system controller 128.
  • The system controller 128 receives the positional information from the tip position detector and continuously records and displays the position of the steerable catheter 104 with respect to the patient’s coordinate system.
  • FIG.4 is a block diagram illustrating components of the robotic catheter system 100.
  • the system controller 128 may execute software programs and controls the display controller 130 to display a navigation screen (e.g., a live view image 132) on the main display 118 and/or the secondary display 120.
  • the display controller 130 may include a graphics processing unit (GPU) or a video display controller (VDC).
  • system controller 128 and the display controller 130 may be configured separately. Alternatively, the system controller 128 and the display controller 130 can be configured as one device. In either case, the system controller 128 and the display controller 130 comprise substantially the same components.
  • the system controller 128 and display controller 130 may include a central processing unit (CPU) 182 comprised of one or more processors (microprocessors), a random access memory (RAM) 184 module, an input/output (I/O) 186 interface, a read only memory (ROM) 180, and data storage memory (e.g., a hard disk drive (HDD) 188 or solid state drive (SSD))).
  • the ROM 180 and/or HDD 188 store the operating system (OS) software, and software programs necessary for executing the functions of the robotic catheter system 100 as a whole.
  • the RAM 184 is used as a workspace memory.
  • the CPU 182 executes the software programs developed in the RAM 184.
  • the I/O 186 inputs, for example, positional information to the display controller 130, and outputs information for displaying the navigation screen to the one or more displays (main display 118 and/or secondary display 120).
  • The navigation screen may be a GUI generated by a software program, but it may also be generated by firmware or a combination of software and firmware.
  • The system controller 128 may control the steerable catheter 104 based on any known kinematic algorithm applicable to continuum or snake-like catheter robots. For example, the system controller controls the steerable catheter 104 based on an algorithm known as the follow-the-leader (FTL) algorithm. By applying the FTL algorithm, the most distal segment 156 of the steerable section is actively controlled with forward kinematic values, while the middle segment 154 and the proximal segment 152 (the following sections) of the steerable catheter 104 move at a first position in the same way as the distal section moved at the first position, or at a second position near the first position.
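  • As an illustration of the follow-the-leader idea, the short Python sketch below queues the commands applied to the distal segment and replays them on the following segments once the catheter has advanced by the corresponding segment lengths; the class name, segment length, command format, and controller interface are assumptions, not taken from the disclosure.

        from collections import deque

        class FollowTheLeader:
            """Minimal FTL bookkeeping for a multi-segment catheter: followers replay
            the command that the distal segment received at the same insertion depth."""

            def __init__(self, num_segments, segment_length):
                self.num_segments = num_segments
                self.segment_length = segment_length
                self.history = deque()              # (insertion_depth, distal_command)

            def update(self, insertion_depth, distal_command):
                self.history.append((insertion_depth, distal_command))
                commands = [distal_command]          # index 0 = most distal segment
                for k in range(1, self.num_segments):
                    target_depth = insertion_depth - k * self.segment_length
                    # Replay the command issued when the tip was at target_depth.
                    past = [c for d, c in self.history if d <= target_depth]
                    commands.append(past[-1] if past else (0.0, 0.0))
                return commands

        ftl = FollowTheLeader(num_segments=3, segment_length=10.0)   # mm, assumed
        print(ftl.update(25.0, (0.4, 0.1)))   # (bend angle rad, bend plane rad), assumed units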
  • The display controller 130 acquires position information of the steerable catheter 104 from the system controller 128.
  • the display controller 130 may acquire the position information directly from the tip position detector.
  • the steerable catheter 104 may be a single-use or limited-use catheter device. In other words, the steerable catheter 104 can be attachable to, and detachable from, the actuator unit 122 to be disposable.
  • The display controller 130 may generate and output a live-view image, other view(s), or a navigation screen to the main display 118 and/or the secondary display 120. This view can optionally be registered with a 3D model of a patient’s anatomy (a branching structure) and the position information of at least a portion of the catheter (e.g., the position of the catheter tip 158) by executing pre-programmed software routines.
  • one or more end effector tools can be inserted through the access port 126 at the proximal end of the catheter, and such tools can be guided through the tool channel 168 of the catheter body to perform an intraluminal procedure from the distal end of the catheter.
  • the tool may be a medical tool such as an endoscope camera, forceps, a needle or other biopsy or ablation tools.
  • the tool may be described as an operation tool or working tool. The working tool is inserted or removed through the working tool access port 126.
  • an embodiment of using a steerable catheter to guide a tool to a target is explained.
  • the tool may include an endoscope camera or an end effector tool, which can be guided through a steerable catheter under the same principles.
  • A procedure usually includes a planning procedure, a registration procedure, a targeting procedure, and an operation procedure.
  • Use of a laser as an operation tool involves risk of unintentional trauma. Such risk may be caused by the laser when the stones are out of target due to retro-pulsive movement of the stones.
  • an autonomous stone ablation method that is performed during lithotripsy is provided. As discussed herein, the method segments the stone during lithotripsy in order to autonomously guide the catheter / ureteroscope 104.
  • FIG. 6 illustrates a method of performing autonomous lithotripsy according to an embodiment.
  • Phantom data is input for U-net segmentation in Step S603.
  • The phantom data is acquired using two different components: a renal pelvis phantom illustrated in FIGS. 7A and 7C and a stone phantom illustrated in FIGS. 7B and 7D.
  • In Step S605 of FIG. 6, data is received from the 3D Slicer software and, in Step S607, data is received from the robotic system.
  • The 3D Slicer software is described by A. Fedorov, et al., 3D Slicer as an image computing platform for the Quantitative Imaging Network. Magnetic Resonance Imaging 30(9), 1323-1341 (2012).
  • Data received in Steps S605 and S607 is integrated in Step S609 and output for performing autonomous lithotripsy in Step S611.
  • A segmentation algorithm is trained using generated phantom data to identify stones.
  • The segmentation algorithm and its trained model are integrated with the 3D Slicer software and the above-described robotic system.
  • The autonomous stone targeting method was tested with live captured images.
  • FIGS. 7A-7D illustrate kidney and stone phantoms for validating data for an autonomous stone ablation method according to an embodiment.
  • FIGS. 7A and 7C illustrate a kidney phantom for data validation for the autonomous stone ablation method.
  • The phantom is composed of two different components: a renal pelvis phantom and a stone phantom (FIGS. 7B and 7D).
  • The pelvis phantom was created using a CT scan acquired under IRB approval protocol #2021P001848, in a Toshiba scanner at Brigham and Women’s Hospital, based on a patient diagnosed with blood in the urine.
  • The patient’s renal pelvis was manually delineated from the CT scan through 3D Slicer software’s modules of “Thresholding” and “Grow from Seeds.” See, Pinter, A, et al. “Polymorph segmentation representation for medical image computing”, Computer Methods and Programs in Biomedicine, Volume 171, p19-26, 2019. The 3D volume of the renal pelvis was 3D printed using ABS plastic. The printed mold was then filled with a silicone rubber compound, and a patient-specific renal pelvis phantom was created. Similar to the renal pelvis phantom, the stone phantoms illustrated in FIGS. 7B and 7D were also created.
  • FIGS. 8A-8C are images derived from the recorded videos of extraction of the phantom stone according to an embodiment.
  • FIG. 9 is an image dataset used to train a U-net algorithm according to an embodiment. Training of the stone segmentation algorithm may be performed before inserting the catheter tip into the bodily lumen in Step S1001 of FIG. 10.
  • The U-net algorithm in FIG. 9 is visualized using the Visualkeras Python package.
  • The U-net consists of a U-shaped architecture with a contracting path and an expansive path of convolutions.
  • The left side of FIG. 9 is a contracting path and the right side of FIG. 9 is an expansive path.
  • For the contracting path, a block repeated four times was used: a 3x3 convolution followed by a dropout, a second 3x3 convolution, and a 2x2 max-pooling. Then a 3x3 convolution followed by a dropout and a second 3x3 convolution completed the contracting path.
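  • The contracting path just described can be written compactly in Keras. The sketch below follows that description (four repetitions of 3x3 convolution, dropout, second 3x3 convolution, and 2x2 max-pooling, then a final convolution-dropout-convolution block); the expansive path, input size, filter counts, and dropout rate are assumptions, since they are not detailed in this excerpt.

        import tensorflow as tf
        from tensorflow.keras import layers, Model

        def build_unet(input_shape=(256, 256, 1), base_filters=16, dropout=0.1):
            """U-Net sketch: contracting path as described in the text,
            mirrored expansive path with skip connections (assumed)."""
            inputs = layers.Input(input_shape)
            skips, x, filters = [], inputs, base_filters
            for _ in range(4):  # contracting path: conv -> dropout -> conv -> maxpool
                x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
                x = layers.Dropout(dropout)(x)
                x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
                skips.append(x)
                x = layers.MaxPooling2D(2)(x)
                filters *= 2
            # final conv -> dropout -> conv block that completes the contracting path
            x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
            x = layers.Dropout(dropout)(x)
            x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
            for skip in reversed(skips):  # expansive path (assumed, not in the excerpt)
                filters //= 2
                x = layers.Conv2DTranspose(filters, 2, strides=2, padding="same")(x)
                x = layers.concatenate([x, skip])
                x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
                x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
            outputs = layers.Conv2D(1, 1, activation="sigmoid")(x)  # binary stone mask
            return Model(inputs, outputs)

        model = build_unet()
        model.compile(optimizer="adam", loss="binary_crossentropy")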
  • FIG.10 illustrates a method of autonomous ablation that is performed upon reaching the stone according to an embodiment.
  • autonomous lithotripsy may commence after insertion of a catheter tip into a bodily lumen and navigation of the catheter through the lumen along an insertion trajectory.
  • Initially, a laser that is configured to perform lithotripsy is turned off, i.e., is in a deactivated state.
  • Live imaging may be performed, and autonomous lithotripsy may be performed in Step S1005.
  • In Step S1007, at least one image may be obtained from the live imaging by the imaging device (camera) 170 within the bodily lumen.
  • In Step S1009, segmentation of the at least one live image may be performed using a convolutional neural network (CNN).
  • The at least one image may be segmented by fully convolutional network (FCN) segmentation or U-Net segmentation.
  • In Step S1011, an object is detected in the at least one live image, a diameter or size of the object may be determined, and a determination may be made as to whether a maximum size of the object exceeds or is equal to a predetermined size, for example approximately 1.5 mm.
  • The diameter of the object may be determined based on the segmented at least one image, and segmenting the at least one image may be automated.
  • The predetermined size may be set by the user prior to the lithotripsy or may be taken from a lookup table.
  • The predetermined size may be defined as the size of a kidney stone for which laser lithotripsy is indicated. In some embodiments, the predetermined size may take into account properties of the kidney stone such as stone density, stone shape, and Guy's stone score.
  • The size of the object may be determined based on at least one physical parameter of the object, with the physical parameter being one or more of a diameter, a radius, a circumference, or an area of the object. If in Step S1013 the maximum size of the object is less than the predetermined size, the method returns to Step S1001, with the laser in the deactivated state.
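  • For illustration, one way to compute such physical parameters from a binary segmentation mask is sketched below; the pixel scale (mm_per_pixel) and the bounding-box approximation of the maximum extent are assumptions, not values from the disclosure.

        import numpy as np

        def stone_size_mm(mask, mm_per_pixel):
            """Estimate stone size from a binary segmentation mask.
            Returns (area in mm^2, equivalent-circle diameter in mm, max extent in mm)."""
            ys, xs = np.nonzero(mask)
            if xs.size == 0:
                return 0.0, 0.0, 0.0
            area_mm2 = xs.size * mm_per_pixel ** 2
            equiv_diam = 2.0 * np.sqrt(area_mm2 / np.pi)
            # Coarse maximum extent: diagonal of the mask's bounding box.
            max_extent = np.hypot(xs.max() - xs.min() + 1,
                                  ys.max() - ys.min() + 1) * mm_per_pixel
            return area_mm2, equiv_diam, max_extent

        mask = np.zeros((256, 256), dtype=np.uint8)
        mask[100:140, 90:130] = 1                              # toy 40x40-pixel "stone"
        area, d_eq, d_max = stone_size_mm(mask, mm_per_pixel=0.05)  # scale is assumed
        needs_lasing = max(d_eq, d_max) >= 1.5                 # 1.5 mm threshold from the text
        print(area, d_eq, d_max, needs_lasing)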
  • If in Step S1013 the maximum size of the object exceeds or is equal to the predetermined size, a region of lasing (ROL) of the object, i.e., the kidney stone, is detected.
  • The ROL may be a perimeter around a center of gravity of the segmented object and an overlapping area of previously detected stone objects.
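  • A minimal sketch of one possible realization of such an ROL is shown below: a disc around the center of gravity of the current stone mask, intersected with the union of previously detected stone masks. The disc radius and the intersection rule are assumptions for illustration only.

        import numpy as np

        def region_of_lasing(current_mask, previous_masks, radius_px=10):
            """Return (center of gravity, ROL mask) for the current stone mask.
            radius_px and the overlap rule are illustrative assumptions."""
            ys, xs = np.nonzero(current_mask)
            if xs.size == 0:
                return None, np.zeros(current_mask.shape, dtype=bool)
            cog = (float(ys.mean()), float(xs.mean()))          # (row, col)
            yy, xx = np.mgrid[0:current_mask.shape[0], 0:current_mask.shape[1]]
            disc = (yy - cog[0]) ** 2 + (xx - cog[1]) ** 2 <= radius_px ** 2
            if previous_masks:
                history = np.zeros(current_mask.shape, dtype=bool)
                for m in previous_masks:                        # overlap with prior detections
                    history |= m.astype(bool)
            else:
                history = current_mask.astype(bool)             # first detection: use current mask
            return cog, disc & history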
  • In Step S1015, a tip of the catheter may be aligned with the ROL.
  • A notification may be output to autonomously perform the lithotripsy, with the laser maintaining a minimum predefined distance from kidney tissue.
  • Upon detecting the ROL, the catheter tip automatically moves into alignment with the ROL.
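  • For illustration, a simple proportional visual-servoing step that converts the pixel offset between the ROL center and the image center into incremental bending commands might look as follows; the gain, sign conventions, tolerance, and the assumption that the image axes map directly onto the two bending directions are illustrative only.

        def alignment_command(cog, image_shape, gain=0.002):
            """Return (d_pitch, d_yaw, aligned) from the pixel error between the
            ROL center of gravity and the image center. All constants are assumed."""
            cy, cx = image_shape[0] / 2.0, image_shape[1] / 2.0
            err_row, err_col = cog[0] - cy, cog[1] - cx      # pixel error
            d_pitch = -gain * err_row                        # incremental bend (rad), assumed sign
            d_yaw = gain * err_col                           # incremental bend (rad), assumed sign
            aligned = abs(err_row) < 5 and abs(err_col) < 5  # 5-pixel tolerance, assumed
            return d_pitch, d_yaw, aligned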
  • In Step S1017, the laser is activated and lithotripsy is performed on the object.
  • The laser may then be deactivated.
  • Another image of the object may then be obtained, that image may be segmented, and an updated size of the object may be determined.
  • Another ROL of the object may then be detected, the catheter tip may be aligned with that ROL, the laser may be activated, and automated lithotripsy may be performed.
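  • Putting these steps together, a structural sketch of the S1001-S1017 loop is shown below; camera, robot, and laser are hypothetical hardware interfaces, segment() is a thin wrapper around the segmentation model, and the other helpers are the sketches given earlier, so this is an outline of the control flow rather than a runnable clinical implementation.

        def segment(model, frame):
            # Assumes a 2D grayscale frame normalized to [0, 1]; 0.5 threshold is assumed.
            return (model.predict(frame[None, ..., None])[0, ..., 0] > 0.5).astype("uint8")

        def autonomous_lithotripsy_loop(camera, robot, laser, model,
                                        mm_per_pixel=0.05, size_threshold_mm=1.5):
            """Sketch of the autonomous ablation loop; hardware objects are hypothetical."""
            previous_masks = []
            laser.deactivate()                                   # S1001/S1003: laser off
            while True:                                          # would stop on user command
                frame = camera.grab()                            # S1007: live image
                mask = segment(model, frame)                     # S1009: CNN segmentation
                _, d_eq, d_max = stone_size_mm(mask, mm_per_pixel)   # S1011: size estimate
                if max(d_eq, d_max) < size_threshold_mm:         # S1013: below threshold
                    laser.deactivate()
                    continue                                     # keep observing
                cog, rol = region_of_lasing(mask, previous_masks)    # define ROL
                previous_masks.append(mask)
                d_pitch, d_yaw, aligned = alignment_command(cog, mask.shape)  # S1015
                robot.bend_distal(d_pitch, d_yaw)                # hypothetical interface
                if aligned and robot.distance_to_tissue() > robot.MIN_TISSUE_DISTANCE:
                    laser.activate()                             # S1017: lase the target
                    laser.fire()
                    laser.deactivate()                           # re-image and re-evaluate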
  • Stone ablation is performed by autonomously guiding the ureteroscope during lithotripsy.
  • The catheter 104 may be inserted into the kidney manually, robotically, and/or autonomously. Navigation of the catheter 104 is performed to an identified stone. Upon reaching the stone, the imaging device (camera) 170 is used to view the stone, and the user 112 may enable autonomous targeting of the stone and enable the laser for ablation. After autonomous targeting is enabled, the stone is automatically segmented and the tip of the catheter 104 automatically targets the stone. The U-net network was trained using 884 (1024-120 for testing) frames coming from three recorded videos using an OVM6946 camera and phantom data for components of the renal pelvis phantom, as described in FIGS. 7A and 7C.
  • the network was trained on 500 epochs and dynamic augmentations were applied.
  • With dynamic augmentation, random transformations (rotation, shifting, shearing, blurring, added noise, etc.) are applied to increase the training data size during the course of model training.
  • At each epoch, a new randomly augmented dataset is created; therefore, the size of the training data is larger when compared to static augmentation (in which each epoch pulls the data from the same augmented dataset).
  • The model thus has less risk of overfitting. See GM, H., Mori, K., Verma, S., Athanasiou, L.: The influence of image cropping sizes on mammographic breast cancer classification using CNN.
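  • A minimal Keras sketch of such dynamic augmentation is shown below: each epoch draws freshly randomized geometric transforms (rotation, shifting, and shearing are shown; blurring and added noise would need a custom preprocessing function), and masks receive the same transforms as images through a shared seed. The parameter values, batch size, and the images/masks arrays are assumptions.

        from tensorflow.keras.preprocessing.image import ImageDataGenerator

        # images, masks: float32 arrays of shape (N, H, W, 1), prepared beforehand (assumed).
        # model: the U-Net sketched earlier.
        data_gen_args = dict(rotation_range=20, width_shift_range=0.1,
                             height_shift_range=0.1, shear_range=0.1, fill_mode="nearest")
        image_gen = ImageDataGenerator(**data_gen_args)
        mask_gen = ImageDataGenerator(**data_gen_args)

        seed = 42  # identical seed so geometric transforms match between image and mask
        image_flow = image_gen.flow(images, batch_size=8, seed=seed)
        mask_flow = mask_gen.flow(masks, batch_size=8, seed=seed)
        train_flow = zip(image_flow, mask_flow)   # yields (augmented image batch, mask batch)

        model.fit(train_flow, steps_per_epoch=len(images) // 8, epochs=500)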
  • FIGS. 11A and 11B provide results of stone target evaluation according to an embodiment.
  • FIG. 11A illustrates the Euclidean distance between the center of gravity (COG) of the detected and annotated stones (triangles), and the mean of the maximum distances (circles) from each perimeter point of the annotation to all the other perimeter points of the same annotation.
  • The out-of-target frames are highlighted with a vertical line.
  • FIG. 11B illustrates a regression analysis of detected versus predicted-annotated stone percentage areas.
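  • For illustration, the evaluation quantities described for FIGS. 11A and 11B can be computed from a detected mask and an annotated mask as sketched below; the simple border-pixel perimeter extraction is an assumption, not the authors' exact method.

        import numpy as np

        def cog(mask):
            ys, xs = np.nonzero(mask)
            return np.array([ys.mean(), xs.mean()])

        def evaluation_metrics(detected, annotated):
            """Return COG distance, mean of the maximum perimeter-point distances of the
            annotation, and the two percentage areas used in the regression analysis."""
            cog_dist = float(np.linalg.norm(cog(detected) - cog(annotated)))
            m = annotated.astype(bool)
            # Perimeter: foreground pixels with at least one background 4-neighbour.
            interior = np.zeros_like(m)
            interior[1:-1, 1:-1] = (m[1:-1, 1:-1] & m[:-2, 1:-1] & m[2:, 1:-1]
                                    & m[1:-1, :-2] & m[1:-1, 2:])
            per = np.argwhere(m & ~interior)
            dists = np.linalg.norm(per[:, None, :] - per[None, :, :], axis=-1)
            mean_max_dist = float(dists.max(axis=1).mean())
            total = detected.size
            return (cog_dist, mean_max_dist,
                    100.0 * detected.sum() / total, 100.0 * annotated.sum() / total)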
  • FIGS. 12A and 12B present application examples of the stone targeting procedure according to an embodiment.
  • FIG. 12A illustrates application examples of the out of target stones in frames 24, 25 and 77.
  • FIG. 12B illustrates application examples of three successfully targeted stones.
  • FIG. 13 schematically presents autonomous movement of the catheter, according to an embodiment.
  • The upper region of FIG. 13 provides an application example of an autonomous robotic integrated system, with the left image showing the live image captured from the catheter’s camera, the center image being the live image with the binary U-net result overlaid, and the right image being the binary U-net result.
  • FIG. 13 illustrates a demonstration of the catheter autonomous movement, with the green arrow pointing to the stone phantom and the orange arrow pointing to the catheter tip, with the catheter autonomously following the stone.
  • The autonomous movement of the catheter is illustrated in FIG. 13, with four different images of a live demonstration being presented.
  • The phantom stone was placed at the entrance of the pelvis phantom because the catheter movement cannot be depicted inside the phantom, as the material is not transparent.
  • One or more stone fragments are detected and one or more potential targets are identified, with small targets being ignored.
  • The target identification may define the targets by prioritizing them based on size, distance, and/or a combination thereof, as in the sketch below. Tracing is performed and at least one target is broken into first-generation fragments. The first-generation fragments may be reprioritized and broken into second-generation fragments. The defining is repeated for the creation of each fragment generation.
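  • A minimal sketch of such size-and-distance prioritization over detected fragments is given below; the scoring weights, the reuse of the earlier 1.5 mm cutoff, and the use of scipy connected-component labelling are assumptions. After the highest-priority target is broken, the same routine can be re-run on the new segmentation mask to reprioritize the next fragment generation.

        import numpy as np
        from scipy import ndimage

        def prioritize_targets(mask, tip_rc, mm_per_pixel, min_diameter_mm=1.5,
                               w_size=1.0, w_dist=0.5):
            """Label stone fragments in a binary mask and rank them as targets.
            Fragments below min_diameter_mm are ignored; remaining fragments are scored
            by a weighted combination of size and distance to the catheter tip
            projection tip_rc = (row, col). Weights and scoring rule are illustrative."""
            labels, n = ndimage.label(mask)
            targets = []
            for k in range(1, n + 1):
                ys, xs = np.nonzero(labels == k)
                area_mm2 = xs.size * mm_per_pixel ** 2
                diameter = 2.0 * np.sqrt(area_mm2 / np.pi)
                if diameter < min_diameter_mm:
                    continue                                  # small fragments are ignored
                frag_cog = np.array([ys.mean(), xs.mean()])
                dist_mm = np.linalg.norm(frag_cog - np.asarray(tip_rc)) * mm_per_pixel
                score = w_size * diameter - w_dist * dist_mm  # larger and closer ranks higher
                targets.append({"label": k, "diameter_mm": diameter,
                                "distance_mm": dist_mm, "score": score})
            return sorted(targets, key=lambda t: t["score"], reverse=True)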
  • Disclosed is a method for autonomous targeting of the kidney stones based on integration of a kidney phantom and a robotic system.
  • The autonomous laser lithotripsy provides benefits that include reduced trauma during the procedure, which may otherwise occur due to stone movement and inexperienced users.
  • The autonomous catheter operated to automatically follow the kidney stone during lithotripsy, based on the accuracy of the U-net results.
  • Some peripheral tissue parts may be detected as stones.
  • Detection of some peripheral tissue parts as being stones occurred in only three out of 120 frames, which is considered a successful targeting result. In a clinical scenario, manually controlling the level of autonomy may prevent detection of some peripheral tissue parts as being stones, and may further help the user in cases where more than one stone is in the camera’s field of view.
  • an information processing apparatus that controls a steerable catheter, with the information processing apparatus including at least one memory configured to store instructions and at least one processor configured to execute the stored instructions to cause the steerable catheter to obtain at least one image of an object within a lumen; segment the at least one image; determine a size of the object; and in response to the size of the object exceeding or being equal to a predetermined size: detect an ROL of the object, align the catheter tip with the ROL, activate a laser on a distal end of the steerable catheter, and perform lithotripsy.
  • an autonomous navigation robot system that includes a steerable catheter, a camera at the distal end of the steerable catheter, one or more actuators configured to automatically move the steerable catheter, and a controller, which is configured to obtain, from the camera, at least one image of an object within a lumen; segment the at least one image; determine a size of the object; and in response to the size of the object exceeding or being equal to a predetermined size detect an ROL of the object, align the catheter tip with the ROL, activate a laser on a distal end of the steerable catheter, and perform lithotripsy.
  • At least certain aspects of the exemplary embodiments described herein can be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs or executable code) recorded on a memory such as an SSD or a storage medium (which may also be referred to as a non-transitory computer-readable storage medium) to perform the functions of one or more of the block diagrams, systems, or flowcharts described above.
  • the detector interface also provides communication interfaces to input and output devices.
  • the detector may include, for example a photomultiplier tube (PMT), a photodiode, an avalanche photodiode detector (APD), a charge-coupled device (CCD), multi-pixel photon counters (MPPC), or other.
  • the function of detector may be realized by computer executable instructions (e.g., one or more programs) recorded on a storage/RAM.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Surgery (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Optics & Photonics (AREA)
  • Molecular Biology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Electromagnetism (AREA)
  • Robotics (AREA)
  • Otolaryngology (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Laser Surgery Devices (AREA)
  • Endoscopes (AREA)

Abstract

Disclosed are a device and a method for autonomous stone ablation during laser lithotripsy, the method comprising: inserting a catheter into a lumen; navigating the catheter through the lumen along an insertion trajectory; obtaining at least one image of an object within the lumen; segmenting the at least one image; determining a size of the object; and, in response to the size exceeding or being equal to a predetermined size: defining a region of lasing of the object, aligning a tip of the catheter with the region of lasing, and performing lithotripsy.
PCT/US2025/018498 2024-03-06 2025-03-05 Autonomous lithotripsy device and methods of displaying associated corrective actions Pending WO2025188850A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202463561996P 2024-03-06 2024-03-06
US63/561,996 2024-03-06

Publications (2)

Publication Number Publication Date
WO2025188850A1 (fr) 2025-09-12
WO2025188850A8 WO2025188850A8 (fr) 2025-10-02

Family

ID=96991457

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2025/018498 Pending WO2025188850A1 (fr) Autonomous lithotripsy device and methods of displaying associated corrective actions

Country Status (1)

Country Link
WO (1) WO2025188850A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0329492A2 (fr) * 1988-02-18 1989-08-23 Bjorn A. J. Angelsen Catheter equipped with a laser
US20160051133A1 (en) * 2013-08-07 2016-02-25 Olympus Corporation Endoscope system and operation method for endoscope system
US20160135894A1 (en) * 2013-11-11 2016-05-19 Gyrus Acmi, Inc. (D.B.A. Olympus Surgical Technologies America) Aiming beam detection for safe laser lithotripsy
US20210196398A1 (en) * 2019-12-31 2021-07-01 Auris Health, Inc. Anatomical feature identification and targeting
CN113877007A (zh) * 2020-07-02 2022-01-04 唐科俊 Auxiliary-treatment-type on-site urinary catheterization system and method

Also Published As

Publication number Publication date
WO2025188850A8 (fr) 2025-10-02

Similar Documents

Publication Publication Date Title
US12465431B2 (en) Alignment techniques for percutaneous access
JP7536752B2 (ja) Systems and methods for endoscope-assisted percutaneous medical procedures
JP2024153875A (ja) Methods and systems for mapping and navigation
KR102676381B1 (ko) Method for percutaneous surgery
JP2025000994A (ja) Systems, methods, and workflows for concomitant procedures
KR20210062043A (ko) Systems and methods for concurrent medical procedures
US12156704B2 (en) Intraluminal navigation using ghost instrument information
JP7566165B2 (ja) Intraluminal navigation using virtual satellite targets
US20250275824A1 (en) Apparatus and methods for targeted navigation
WO2025117336A1 (fr) Steerable catheters and wire force differences
US20250143812A1 (en) Robotic catheter system and method of replaying targeting trajectory
US20240000530A1 (en) Robotic and manual aspiration catheters
WO2025188850A1 (fr) Autonomous lithotripsy device and methods of displaying associated corrective actions
US20240127399A1 (en) Visualization adjustments for instrument roll
US20250170363A1 (en) Robotic catheter tip and methods and storage mediums for controlling and/or manufacturing a catheter having a tip
US20230381399A1 (en) Catheter tip
US20250107854A1 (en) Bronchoscope graphical user interface with improved navigation
US20250169684A1 (en) Distributed Bending and Mode Control for Bendable Medical Devices
WO2025059207A1 (fr) Medical apparatus having a support structure and method of use thereof
WO2025117590A1 (fr) Bendable medical apparatus comprising modular actuators
WO2024081745A2 (fr) Localization and targeting of small pulmonary lesions
KR20250025379A (ko) System and method for a robotic endoscope integrating tool-in-lesion tomosynthesis
JP2025037832A (ja) Distribution of bending of a bendable medical instrument
JP2025526776A (ja) User interface for navigating anatomical channels in medical procedures
JPWO2022112969A5 (fr)

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 25768835

Country of ref document: EP

Kind code of ref document: A1