US20200268459A1 - Flexible instrument insertion using an adaptive insertion force threshold - Google Patents
Flexible instrument insertion using an adaptive insertion force threshold
- Publication number
- US20200268459A1 (application US16/773,740)
- Authority
- US
- United States
- Prior art keywords
- sensor
- elongate body
- endoscope
- status
- expected
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00002—Operational features of endoscopes
- A61B1/00004—Operational features of endoscopes characterised by electronic signal processing
- A61B1/00006—Operational features of endoscopes characterised by electronic signal processing of control signals
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00147—Holding or positioning arrangements
- A61B1/00149—Holding or positioning arrangements using articulated arms
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/005—Flexible endoscopes
- A61B1/0051—Flexible endoscopes with controlled bending of insertion part
- A61B1/0057—Constructional details of force transmission elements, e.g. control wires
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/267—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the respiratory tract, e.g. laryngoscopes, bronchoscopes
- A61B1/2676—Bronchoscopes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/10—Computer-aided planning, simulation or modelling of surgical operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B2017/00017—Electrical control of surgical instruments
- A61B2017/00115—Electrical control of surgical instruments with audible or visual output
- A61B2017/00128—Electrical control of surgical instruments with audible or visual output related to intensity or progress of surgical action
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B17/00234—Surgical instruments, devices or methods for minimally invasive surgery
- A61B2017/00292—Surgical instruments, devices or methods for minimally invasive surgery mounted on or guided by flexible, e.g. catheter-like, means
- A61B2017/003—Steerable
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B17/00234—Surgical instruments, devices or methods for minimally invasive surgery
- A61B2017/00292—Surgical instruments, devices or methods for minimally invasive surgery mounted on or guided by flexible, e.g. catheter-like, means
- A61B2017/003—Steerable
- A61B2017/00318—Steering mechanisms
- A61B2017/00323—Cables or rods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B2017/00477—Coupling
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2051—Electromagnetic tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2061—Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
- A61B2034/301—Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/06—Measuring instruments not otherwise provided for
- A61B2090/064—Measuring instruments not otherwise provided for for measuring force, pressure or mechanical tension
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B34/71—Manipulators operated by drive cable mechanisms
Description
- This description generally relates to surgical robotics, and particularly to controlling insertion of a surgical instrument into an anatomical lumen of a patient.
- Robotic technologies have a range of applications.
- robotic arms help complete tasks that a human would normally perform.
- factories use robotic arms to manufacture automobiles and consumer electronics products.
- scientific facilities use robotic arms to automate laboratory procedures such as transporting microplates.
- physicians and/or surgeons have started using robotic arms to help perform surgical procedures.
- physicians and/or surgeons use robotic arms to control surgical instruments such as endoscopes.
- An endoscope is able to perform surgical procedures in a minimally invasive manner.
- the endoscope can be directed to a target location of a patient, such as the lung or blood vessel.
- the robotic arm applies a force to insert the endoscope into an open access point of a patient, e.g., mouth, anus, urethra, and advance it to the target location within the patient's lumen.
- the endoscope may brush, rub, and push against internal anatomy that may be fragile and subject to tearing if too much insertion force is applied.
- the endoscope typically may buckle in response to slack or insertion resistance in the endoscope and incidental force from coming in contact with patient anatomy.
- the present disclosure describes the determination of an insertion force threshold to regulate an insertion force of an instrument within a patient's lumen in order to prevent buckling of the instrument or possible injury to the patient.
- the insertion force threshold may be dynamically determined based on real time data captured from the instrument and data associated with the patient as the instrument moves to an operative site. Additionally or alternatively, the insertion force threshold may be at least partially pre-determined and tagged to different portions of a pre-operative model.
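- As a rough illustration of how such a threshold might be applied, the sketch below combines a pre-operative, per-segment baseline threshold with real-time measurements; the function names, the linear weighting, and the specific signals (measured curvature and distal friction force) are illustrative assumptions rather than the algorithm disclosed here.

```python
# Illustrative sketch: adapt a per-segment insertion force limit using live sensor data.
# The baseline thresholds, gains, and signal names are assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class SegmentInfo:
    baseline_threshold_n: float   # pre-operative threshold tagged to this lumen segment (N)
    lumen_diameter_mm: float      # nominal diameter from the pre-operative model

def adaptive_threshold(segment: SegmentInfo,
                       measured_curvature_inv_m: float,
                       distal_friction_n: float,
                       curvature_gain: float = 0.5,
                       friction_gain: float = 0.8,
                       min_threshold_n: float = 0.2) -> float:
    """Lower the allowable insertion force as measured curvature and distal friction grow."""
    threshold = (segment.baseline_threshold_n
                 - curvature_gain * measured_curvature_inv_m
                 - friction_gain * distal_friction_n)
    return max(threshold, min_threshold_n)

def regulate_insertion(commanded_force_n: float, threshold_n: float) -> float:
    """Clamp the commanded insertion force to the adaptive threshold."""
    return min(commanded_force_n, threshold_n)

# Example: a narrow, curved segment yields a lower force limit than the baseline.
segment = SegmentInfo(baseline_threshold_n=3.0, lumen_diameter_mm=4.0)
limit = adaptive_threshold(segment, measured_curvature_inv_m=2.0, distal_friction_n=0.5)
print(regulate_insertion(commanded_force_n=2.5, threshold_n=limit))  # clamped to 1.6 N
```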
- FIG. 1A illustrates a surgical robotic system according to one embodiment.
- FIGS. 1B-1F show various perspective views of a robotic platform coupled to the surgical robotic system shown in FIG. 1A , according to one embodiment.
- FIG. 2 illustrates a command console for a surgical robotic system according to one embodiment.
- FIG. 3A illustrates multiple degrees of motion of an endoscope according to one embodiment.
- FIG. 3B is a top view of an endoscope according to one embodiment.
- FIG. 3C is a cross sectional isometric view of the leader of the endoscope according to one embodiment.
- FIG. 4A is an isometric view of an instrument device manipulator of a surgical robotic system according to one embodiment.
- FIG. 4B is an exploded isometric view of the instrument device manipulator shown in FIG. 4A according to one embodiment.
- FIG. 4C is an isometric view of an independent drive mechanism of the instrument device manipulator shown in FIG. 4A according to one embodiment.
- FIG. 4D illustrates a conceptual diagram that shows how forces may be measured by a strain gauge of the independent drive mechanism shown in FIG. 4C according to one embodiment.
- FIG. 5A is a flowchart of a process for determining movements of an endoscope from a sequence of recorded images according to one embodiment.
- FIG. 5B is a diagram of an electromagnetic tracking system according to one embodiment.
- FIG. 6A illustrates the distal end of an endoscope within an anatomical lumen according to one embodiment.
- FIG. 6B illustrates the endoscope shown in FIG. 6A in use at an operative site according to one embodiment.
- FIG. 6C illustrates the endoscope shown in FIG. 6B with an aspiration needle according to one embodiment.
- FIGS. 7A and 7B illustrate an example of endolumenal buckling that occurs when an endoscope is inserted into a patient's lung toward an operative site according to one embodiment.
- FIGS. 8A and 8B illustrate examples of sensor regions used to place sensors according to one embodiment.
- FIGS. 9A-9L illustrate examples of endolumenal buckling detection based on a comparison between measured status and expected status according to one embodiment.
- FIG. 10 is a flowchart of a process for detecting endolumenal buckling based on a comparison between measured status and expected status according to one embodiment.
- FIGS. 11A-11H illustrate examples of endolumenal buckling detection based on status changes before and after (or during) a command, according to one embodiment.
- FIG. 12 is a flowchart of a process for detecting endolumenal buckling based on status changes indicated by sensor data according to one embodiment.
- FIGS. 13A-13F are examples of detecting buckling of an endoscope outside a patient according to one embodiment.
- FIG. 14 is a flowchart of a process for detecting buckling outside a patient using transmitter-receiver pairs according to one embodiment.
- FIG. 15 illustrates another example of detecting buckling of an endoscope outside a patient according to one embodiment.
- FIGS. 16A-C illustrate examples of adaptive insertion force thresholds used at different locations of an endoscope with different patients according to an embodiment.
- FIG. 17 is a flowchart of a process for inserting an endoscope using an adaptive insertion force threshold according to one embodiment.
- FIG. 1A illustrates a surgical robotic system 100 according to one embodiment.
- the surgical robotic system 100 includes a base 101 coupled to one or more robotic arms, e.g., robotic arm 102 .
- the base 101 is communicatively coupled to a command console, which is further described with reference to FIG. 2 in Section I.B. Command Console.
- the base 101 can be positioned such that the robotic arm 102 has access to perform a surgical procedure on a patient, while a user such as a physician may control the surgical robotic system 100 from the comfort of the command console.
- the base 101 may be coupled to a surgical operating table or bed for supporting the patient.
- though not shown in FIG. 1A , the base 101 may include subsystems such as control electronics, pneumatics, power sources, optical sources, and the like.
- the robotic arm 102 includes multiple arm segments 110 coupled at joints 111 , which provides the robotic arm 102 multiple degrees of freedom, e.g., seven degrees of freedom corresponding to seven arm segments.
- the base 101 may contain a source of power 112 , pneumatic pressure 113 , and control and sensor electronics 114 —including components such as a central processing unit, data bus, control circuitry, and memory—and related actuators such as motors to move the robotic arm 102 .
- the electronics 114 in the base 101 may also process and transmit control signals communicated from the command console.
- the base 101 includes wheels 115 to transport the surgical robotic system 100 .
- Mobility of the surgical robotic system 100 helps accommodate space constraints in a surgical operating room as well as facilitate appropriate positioning and movement of surgical equipment. Further, the mobility allows the robotic arms 102 to be configured such that the robotic arms 102 do not interfere with the patient, physician, anesthesiologist, or any other equipment. During procedures, a user may control the robotic arms 102 using control devices such as the command console.
- the robotic arm 102 includes set up joints that use a combination of brakes and counter-balances to maintain a position of the robotic arm 102 .
- the counter-balances may include gas springs or coil springs.
- the brakes, e.g., fail safe brakes, may include mechanical and/or electrical components.
- the robotic arms 102 may be gravity-assisted passive support type robotic arms.
- Each robotic arm 102 may be coupled to an instrument device manipulator (IDM) 117 using a mechanism changer interface (MCI) 116 .
- the IDM 117 can be removed and replaced with a different type of IDM, for example, a first type of IDM manipulates an endoscope, while a second type of IDM manipulates a laparoscope.
- the MCI 116 includes connectors to transfer pneumatic pressure, electrical power, electrical signals, and optical signals from the robotic arm 102 to the IDM 117 .
- the MCI 116 can be a set screw or base plate connector.
- the IDM 117 manipulates surgical instruments such as the endoscope 118 using techniques including direct drive, harmonic drive, geared drives, belts and pulleys, magnetic drives, and the like.
- the MCI 116 is interchangeable based on the type of IDM 117 and can be customized for a certain type of surgical procedure.
- the robotic arm 102 can include joint level torque sensing and a wrist at a distal end, such as the KUKA AG® LBR5 robotic arm.
- the endoscope 118 is a tubular and flexible surgical instrument that is inserted into the anatomy of a patient to capture images of the anatomy (e.g., body tissue).
- the endoscope 118 includes one or more imaging devices (e.g., cameras or sensors) that capture the images.
- the imaging devices may include one or more optical components such as an optical fiber, fiber array, or lens.
- the optical components move along with the tip of the endoscope 118 such that movement of the tip of the endoscope 118 results in changes to the images captured by the imaging devices.
- the endoscope 118 is further described with reference to FIGS. 3A-3C in Section I.C. Endoscope.
- Robotic arms 102 of the surgical robotic system 100 manipulate the endoscope 118 using elongate movement members.
- the elongate movement members may include pull wires, also referred to as pull or push wires, cables, fibers, or flexible shafts.
- the robotic arms 102 actuate multiple pull wires coupled to the endoscope 118 to deflect the tip of the endoscope 118 .
- the pull wires may include both metallic and non-metallic materials such as stainless steel, Kevlar, tungsten, carbon fiber, and the like.
- the endoscope 118 may exhibit nonlinear behavior in response to forces applied by the elongate movement members. The nonlinear behavior may be based on stiffness and compressibility of the endoscope 118 , as well as variability in slack or stiffness between different elongate movement members.
- the surgical robotic system 100 includes a controller 120 , for example, a computer processor.
- the controller 120 includes image registration module 130 , and a store 135 .
- the surgical robotic system 100 uses the image registration module 130 for determining movement of the endoscope, which is further described in Section I.C.2. Optical Flow and I.C.3. EM Registration.
- some or all functionality of the controller 120 is performed outside the surgical robotic system 100 , for example, on another computer system or server communicatively coupled to the surgical robotic system 100 .
- FIGS. 1B-1F show various perspective views of the surgical robotic system 100 coupled to a robotic platform 150 (or surgical bed), according to various embodiments.
- FIG. 1B shows a side view of the surgical robotic system 100 with the robotic arms 102 manipulating the endoscope 118 to insert it inside a patient's body while the patient is lying on the robotic platform 150 .
- FIG. 1C shows a top view of the surgical robotic system 100 and the robotic platform 150 , in which the endoscope 118 manipulated by the robotic arms is inserted inside the patient's body.
- FIG. 1D shows a perspective view of the surgical robotic system 100 and the robotic platform 150 , in which the endoscope 118 is controlled to be positioned horizontally parallel with the robotic platform.
- FIG. 1E shows another perspective view of the surgical robotic system 100 and the robotic platform 150 , in which the endoscope 118 is controlled to be positioned relatively perpendicular to the robotic platform.
- the angle between the horizontal surface of the robotic platform 150 and the endoscope 118 is 75 degrees.
- FIG. 1F shows the perspective view of the surgical robotic system 100 and the robotic platform 150 shown in FIG. 1E in more detail; the angle between the endoscope 118 and the virtual line 160 connecting one end 180 of the endoscope and the robotic arm 102 that is positioned relatively farther away from the robotic platform is 90 degrees.
- FIG. 2 illustrates a command console 200 for a surgical robotic system 100 according to one embodiment.
- the command console 200 includes a console base 201 , display modules 202 , e.g., monitors, and control modules, e.g., a keyboard 203 and joystick 204 .
- one or more functions of the command console 200 may be integrated into a base 101 of the surgical robotic system 100 or another system communicatively coupled to the surgical robotic system 100 .
- a user 205 e.g., a physician, remotely controls the surgical robotic system 100 from an ergonomic position using the command console 200 .
- the console base 201 may include a central processing unit, a memory unit, a data bus, and associated data communication ports that are responsible for interpreting and processing signals such as camera imagery and tracking sensor data, e.g., from the endoscope 118 shown in FIG. 1 . In some embodiments, both the console base 201 and the base 101 perform signal processing for load-balancing.
- the console base 201 may also process commands and instructions provided by the user 205 through the control modules 203 and 204 .
- the control modules may include other devices, for example, computer mice, trackpads, trackballs, control pads, video game controllers, and sensors (e.g., motion sensors or cameras) that capture hand gestures and finger gestures.
- the user 205 can control a surgical instrument such as the endoscope 118 using the command console 200 in a velocity mode or position control mode.
- in velocity mode, the user 205 directly controls pitch and yaw motion of a distal end of the endoscope 118 based on direct manual control using the control modules.
- movement on the joystick 204 may be mapped to yaw and pitch movement in the distal end of the endoscope 118 .
- the joystick 204 can provide haptic feedback to the user 205 .
- the joystick 204 vibrates to indicate that the endoscope 118 cannot further translate or rotate in a certain direction.
- the command console 200 can also provide visual feedback (e.g., pop-up messages) and/or audio feedback (e.g., beeping) to indicate that the endoscope 118 has reached maximum translation or rotation.
- in position control mode, the command console 200 uses a three-dimensional (3D) map of a patient and pre-determined computer models of the patient to control a surgical instrument, e.g., the endoscope 118 .
- the command console 200 provides control signals to robotic arms 102 of the surgical robotic system 100 to manipulate the endoscope 118 to a target location. Due to the reliance on the 3D map, position control mode requires accurate mapping of the anatomy of the patient.
- users 205 can manually manipulate robotic arms 102 of the surgical robotic system 100 without using the command console 200 .
- the users 205 may move the robotic arms 102 , endoscopes 118 , and other surgical equipment to access a patient.
- the surgical robotic system 100 may rely on force feedback and inertia control from the users 205 to determine appropriate configuration of the robotic arms 102 and equipment.
- the display modules 202 may include electronic monitors, virtual reality viewing devices, e.g., goggles or glasses, and/or other means of display devices.
- the display modules 202 are integrated with the control modules, for example, as a tablet device with a touchscreen.
- the user 205 can both view data and input commands to the surgical robotic system 100 using the integrated display modules 202 and control modules.
- the display modules 202 can display 3D images using a stereoscopic device, e.g., a visor or goggle.
- the 3D images provide an “endo view” (i.e., endoscopic view), which is a computer 3D model illustrating the anatomy of a patient.
- the “endo view” provides a virtual environment of the patient's interior and an expected location of an endoscope 118 inside the patient.
- a user 205 compares the “endo view” model to actual images captured by a camera to help mentally orient and confirm that the endoscope 118 is in the correct—or approximately correct—location within the patient.
- the “endo view” provides information about anatomical structures, e.g., the shape of an intestine or colon of the patient, around the distal end of the endoscope 118 .
- the display modules 202 can simultaneously display the 3D model and computerized tomography (CT) scans of the anatomy around the distal end of the endoscope 118 . Further, the display modules 202 may overlay pre-determined optimal navigation paths of the endoscope 118 on the 3D model and CT scans.
- a model of the endoscope 118 is displayed with the 3D models to help indicate a status of a surgical procedure.
- the CT scans identify a lesion in the anatomy where a biopsy may be necessary.
- the display modules 202 may show a reference image captured by the endoscope 118 corresponding to the current location of the endoscope 118 .
- the display modules 202 may automatically display different views of the model of the endoscope 118 depending on user settings and a particular surgical procedure. For example, the display modules 202 show an overhead fluoroscopic view of the endoscope 118 during a navigation step as the endoscope 118 approaches an operative region of a patient.
- FIG. 3A illustrates multiple degrees of motion of an endoscope 118 according to one embodiment.
- the endoscope 118 is an embodiment of the endoscope 118 shown in FIG. 1 .
- the tip 301 of the endoscope 118 is oriented with zero deflection relative to a longitudinal axis 306 (also referred to as a roll axis 306 ).
- a surgical robotic system 100 deflects the tip 301 on a positive yaw axis 302 , negative yaw axis 303 , positive pitch axis 304 , negative pitch axis 305 , or roll axis 306 .
- the tip 301 or body 310 of the endoscope 118 may be elongated or translated in the longitudinal axis 306 , x-axis 308 , or y-axis 309 .
- the endoscope 118 includes a reference structure 307 to calibrate the position of the endoscope 118 .
- the surgical robotic system 100 measures deflection of the endoscope 118 relative to the reference structure 307 .
- the reference structure 307 is located on a proximal end of the endoscope 118 and may include a key, slot, or flange.
- the reference structure 307 is coupled to a first drive mechanism for calculating movement and is coupled to a second drive mechanism, e.g., the IDM 117 , to perform a surgical procedure.
- FIG. 3B is a top view of an endoscope 118 according to one embodiment.
- the endoscope 118 includes a leader 315 tubular component nested or partially nested inside and longitudinally-aligned with a sheath 311 tubular component, such that the leader telescopes out of the sheath.
- the sheath 311 includes a proximal sheath section 312 and distal sheath section 313 .
- the leader 315 has a smaller outer diameter than the sheath 311 and includes a proximal leader section 316 and distal leader section 317 .
- the sheath base 314 and the leader base 318 actuate the distal sheath section 313 and the distal leader section 317 , respectively, for example, based on control signals from a user of a surgical robotic system 100 .
- the sheath base 314 and the leader base 318 are, e.g., part of the IDM 117 shown in FIG. 1 .
- Both the sheath base 314 and the leader base 318 include drive mechanisms (e.g., the independent drive mechanism further described with reference to FIG. 4A-D in Section II.C.4. Instrument Device Manipulator) to control pull wires coupled to the sheath 311 and leader 315 .
- the sheath base 314 generates tensile loads on pull wires coupled to the sheath 311 to deflect the distal sheath section 313 .
- the leader base 318 generates tensile loads on pull wires coupled to the leader 315 to deflect the distal leader section 317 .
- Both the sheath base 314 and leader base 318 may also include couplings for the routing of pneumatic pressure, electrical power, electrical signals, or optical signals from IDMs to the sheath 311 and leader 315 , respectively.
- a pull wire may include a steel coil pipe along the length of the pull wire within the sheath 311 or the leader 315 , which transfers axial compression back to the origin of the load, e.g., the sheath base 314 or the leader base 318 , respectively.
- the endoscope 118 can navigate the anatomy of a patient with ease due to the multiple degrees of freedom provided by pull wires coupled to the sheath 311 and the leader 315 .
- four or more pull wires may be used in the sheath 311 and/or the leader 315 , providing eight or more degrees of freedom.
- up to three pull wires may be used, providing up to six degrees of freedom.
- the sheath 311 and leader 315 may be rotated up to 360 degrees along a longitudinal axis 306 , providing more degrees of motion.
- the combination of rotational angles and multiple degrees of freedom provides a user of the surgical robotic system 100 with a user friendly and instinctive control of the endoscope 118 .
- FIG. 3C is a cross sectional isometric view of the leader 315 of the endoscope 118 according to one embodiment.
- the leader 315 includes an imaging device 349 (e.g., image sensor, still or video camera, 2D or 3D detector array, charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) camera, imaging fiber bundle, etc.), light sources 350 (e.g., white light source, laser diode, light-emitting diode (LED), optic fiber illuminator, etc.), and at least one working channel 343 for other components.
- the leader 315 includes a pocket hole to accommodate insertion of a component into a working channel 343 .
- FIG. 4A is an isometric view of an instrument device manipulator 117 of the surgical robotic system 100 according to one embodiment.
- the robotic arm 102 is coupled to the IDM 117 via an articulating interface 401 .
- the IDM 117 is coupled to the endoscope 118 .
- the articulating interface 401 may transfer pneumatic pressure, power signals, control signals, and feedback signals to and from the robotic arm 102 and the IDM 117 .
- the IDM 117 may include a gear head, motor, rotary encoder, power circuits, and control circuits.
- a tool base 403 for receiving control signals from the IDM 117 is coupled to the proximal end of the endoscope 118 . Based on the control signals, the IDM 117 manipulates the endoscope 118 by actuating output shafts, which are further described below with reference to FIG. 4B .
- FIG. 4B is an exploded isometric view of the instrument device manipulator shown in FIG. 4A according to one embodiment.
- the endoscope 118 has been removed from the IDM 117 to reveal the output shafts 405 , 406 , 407 , and 408 .
- FIG. 4C is an isometric view of an independent drive mechanism of the instrument device manipulator 117 shown in FIG. 4A according to one embodiment.
- the independent drive mechanism can tighten or loosen the pull wires 421 , 422 , 423 , and 424 (e.g., independently from each other) of an endoscope by rotating the output shafts 405 , 406 , 407 , and 408 of the IDM 117 , respectively.
- the output shafts 405 , 406 , 407 , and 408 transfer force down pull wires 421 , 422 , 423 , and 424 , respectively, through angular motion.
- the pull wires 421 , 422 , 423 , and 424 transfer force back to the output shafts.
- the IDM 117 and/or the surgical robotic system 100 can measure the transferred force using a sensor, e.g., a strain gauge further described below.
- FIG. 4D illustrates a conceptual diagram that shows how forces may be measured by a strain gauge 434 of the independent drive mechanism shown in FIG. 4C according to one embodiment.
- a force 431 may be directed away from the output shaft 405 coupled to the motor mount 433 of the motor 437 . Accordingly, the force 431 results in horizontal displacement of the motor mount 433 . Further, the strain gauge 434 horizontally coupled to the motor mount 433 experiences strain in the direction of the force 431 .
- the strain may be measured as a ratio of the horizontal displacement of the tip 435 of strain gauge 434 to the overall horizontal width 436 of the strain gauge 434 .
- the IDM 117 includes additional sensors, e.g., inclinometers or accelerometers, to determine an orientation of the IDM 117 .
- the surgical robotic system 100 can calibrate readings from the strain gauge 434 to account for gravitational load effects. For example, if the IDM 117 is oriented on a horizontal side of the IDM 117 , the weight of certain components of the IDM 117 may cause a strain on the motor mount 433 . Accordingly, without accounting for gravitational load effects, the strain gauge 434 may measure strain that did not result from strain on the output shafts.
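- A minimal sketch of the strain-to-force conversion and gravity compensation described above is shown below; the calibration constants and the idea of looking up an orientation-dependent gravity offset are assumptions for illustration.

```python
# Illustrative sketch: convert a strain-gauge reading to a pull-wire force and
# subtract an orientation-dependent gravitational offset. All constants are assumed.

def strain_from_displacement(tip_displacement_mm: float, gauge_width_mm: float) -> float:
    """Strain expressed as the ratio of tip displacement to overall gauge width."""
    return tip_displacement_mm / gauge_width_mm

def pull_wire_force(strain: float,
                    force_per_unit_strain_n: float,
                    gravity_strain_offset: float) -> float:
    """Force after subtracting a gravity offset that, in a real system, would be looked up
    from a calibration table indexed by the IDM orientation reported by its accelerometer."""
    return force_per_unit_strain_n * (strain - gravity_strain_offset)

strain = strain_from_displacement(tip_displacement_mm=0.02, gauge_width_mm=10.0)
print(pull_wire_force(strain, force_per_unit_strain_n=5000.0, gravity_strain_offset=0.0005))
# 7.5 N with the assumed constants
```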
- as the endoscope tip moves within the patient, the movement is reflected in changes from one image to the next. These changes may be detected using optical flow techniques that register one image to another, from which a movement may be estimated.
- FIG. 5A is a flowchart of a process for determining movements of an endoscope from a sequence of recorded images according to one embodiment.
- the process 500 may include different or additional steps than those described in conjunction with FIG. 5A in some embodiments, or perform steps in different orders than the order described in conjunction with FIG. 5A .
- the image registration module 130 of the surgical robotic system 100 shown in FIG. 1 determines movement of an endoscope tip based on changes in properties of a sample of images (e.g., grayscale or color) captured by an image sensor coupled to the endoscope tip, e.g., the imaging device 349 of endoscope 118 shown in FIG. 3C . Because the image sensor is coupled to the endoscope 118 , the image registration module 130 assumes that changes between a pair of images of the sample are due to a shift in perspective of the image sensor corresponding to a movement of the endoscope tip, e.g., translation, rotation, and/or scaling in a pitch or yaw axis.
- the image registration module 130 can filter the sample of images, for example, by removing every other image of the sample to help reduce the time required to process the sample.
- the image registration module 130 extracts the sample of images from a video captured by the image sensor. Image registration does not require the source and target images to be subsequent frames of the camera. However, the accuracy of the motion estimated by image registration tends to be greater as the time period between images decreases. Thus, the image registration module 130 generates more accurate motion estimates (e.g., nearly continuous measurement of parameters associated with movement of the endoscope) by registering many images in sequence.
- the image registration module 130 receives 510 a sample of images and analyzes pairs of images of the sample using an optical flow technique.
- the image that occurs first is referred to as the source image and the image that occurs second is referred to as the target image.
- the order of the first and second images is arbitrary and determines the direction of the estimated translation, e.g., moving forward or backward in time.
- each image is a two-dimensional pixel array of N pixel values corresponding to light intensities (e.g., for grayscale images), vectors representing intensities of different colors of light (e.g., for color images), etc.
- the image registration module 130 can transform the two-dimensional pixel array into a corresponding 1-dimensional array with N elements for processing.
- the image registration module 130 generates 520 a difference array D and generates 530 a gradient array G based on the pair of images.
- the image registration module 130 generates a difference array and gradient array for each pair of images of the sample.
- the difference array D is based on the difference between a pixel value of the target image and a corresponding pixel value of the source image.
- the gradient array G is based on a weighted average of the rate of change (e.g., derivative) of a pixel value of the target image and the rate of change of a corresponding pixel value of the source image.
- the rate of change of a pixel in the x-dimension G x is based on the difference between the pixel and each of two or more adjacent pixels in the x-direction.
- the rate of change of the pixel in the y-dimension G y is based on the difference between the pixel and each of two or more adjacent pixels in the y-direction.
- the gradient array may be a weighted average of the rates of change in the x and y dimensions, e.g., equally weighted.
- the image registration module 130 determines a motion of the endoscope based on the difference array D and the gradient array G.
- the motion can be represented by a vector p.
- the vector p often comprises a set of model parameters, and the identities of these parameters may be varied in order to detect different properties of motion.
- the solved p represents a motion (e.g., translation, rotation) of the endoscope.
- the image registration module 130 can repeat the steps 520 - 540 of the process 500 for multiple pairs of images of the sample. Thus, the image registration module 130 generates a set of motion vectors corresponding to each processed pair of images.
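- A minimal sketch of steps 520-540 is shown below for a pure-translation motion model: build the difference array D and the gradient array G from a source/target image pair, then solve G p ≈ D in the least-squares sense for the motion vector p. The equal weighting of the two images' gradients and the translation-only model are illustrative assumptions.

```python
# Illustrative sketch of steps 520-540 for a translation-only motion model.
# D holds per-pixel target-minus-source differences; G holds per-pixel x/y gradients
# averaged (equally weighted) over the source and target images; p solves G p ~= D.

import numpy as np

def difference_array(source: np.ndarray, target: np.ndarray) -> np.ndarray:
    return (target - source).reshape(-1)                       # N-element array D

def gradient_array(source: np.ndarray, target: np.ndarray) -> np.ndarray:
    gy_s, gx_s = np.gradient(source)                           # rates of change per image
    gy_t, gx_t = np.gradient(target)
    gx = 0.5 * (gx_s + gx_t)                                   # equally weighted average
    gy = 0.5 * (gy_s + gy_t)
    return np.stack([gx.reshape(-1), gy.reshape(-1)], axis=1)  # N x 2 array G

def estimate_motion(source: np.ndarray, target: np.ndarray) -> np.ndarray:
    D = difference_array(source, target)
    G = gradient_array(source, target)
    p, *_ = np.linalg.lstsq(G, D, rcond=None)                  # p = [dx, dy]
    return p

# Example: the same smooth scene sampled one pixel further along x.
y, x = np.mgrid[0:32, 0:33].astype(float)
scene = np.exp(-((x - 16.0) ** 2 + (y - 16.0) ** 2) / 40.0)
source, target = scene[:, :-1], scene[:, 1:]
print(estimate_motion(source, target))                         # approximately [1.0, 0.0]
```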
- FIG. 5B is a diagram of an electromagnetic tracking system according to one embodiment.
- the spatial sensor 550 coupled to the tip of the endoscope 118 is an EM sensor 550 that detects an electromagnetic field (EMF) generated by one or more EMF generators 600 in proximity to the endoscope 118 .
- the strength of the detected EMF is a function of the position and/or orientation of the endoscope 118 .
- a number of EMF generators 600 are located externally to a patient.
- the EMF generators 600 emit EM fields that are picked up by the EM sensor 550 .
- the different EMF generators 600 may be modulated in a number of different ways so that when their emitted fields are captured by the EM sensor 550 and are processed by the controller 120 (or any computer system external to the surgical robotic system 100 ), their signals are separable. Further, the EMF generators 600 may be oriented relative to each other in Cartesian space at non-zero, non-orthogonal angles so that changes in orientation of the EM sensor 550 will result in the EM sensor 550 receiving at least some signal from at least one of the EMF generators 600 at any instant in time.
- the controller 120 registers EM data captured by the EM sensor 550 to an image of the patient captured with a different technique other than EM (or whatever mechanism is used to capture the alignment sensor's data), such as a computed tomography (CT) scan, to establish a reference frame for the EM data.
- the distal end of the endoscope may be tracked by EM sensors located in the tip.
- the relative location within the patient may be determined by comparing a pre-operative model generated from CT data to the absolute location measured by the EM tracking system.
- data points derived from the EM data are initially located far from the position of the endoscope tip moving along a planned navigation path expected from the 3D model.
- This position difference between the EM data and the 3D model reflects the lack of registration between the EM coordinates and the 3D model coordinates.
- the controller 120 may determine and adjust the points on the 3D model based on correlation between the 3D model itself, image data received from the imaging device (e.g., cameras) on the tip and robot data from robot commands (e.g., provided to the robotic arms of the surgical robotic system 100 ).
- the controller 120 uses the 3D transformation between these points and collected EM data points to determine the initial registration of the EM coordinate system to the 3D model coordinate system. After registering EM data with the 3D model, the data points derived from EM data fall along the planned navigation path derived from the 3D model, and each data point among the data points reflects a measurement of the position of endoscope tip in the coordinate system of the 3D model.
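- One common way to compute such an initial registration from corresponding point sets is a least-squares rigid (Kabsch) fit; the patent does not name a particular solver, so the sketch below is illustrative, and how the point correspondences are chosen (planned path, image data, robot commands) is outside its scope.

```python
# Illustrative least-squares rigid registration of EM points to model points.

import numpy as np

def register_rigid(em_points: np.ndarray, model_points: np.ndarray):
    """Kabsch fit: rotation R and translation t mapping (N, 3) EM points onto model points."""
    em_c = em_points.mean(axis=0)
    mo_c = model_points.mean(axis=0)
    H = (em_points - em_c).T @ (model_points - mo_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))            # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mo_c - R @ em_c
    return R, t

def to_model_frame(R: np.ndarray, t: np.ndarray, em_point: np.ndarray) -> np.ndarray:
    """Express a single EM measurement in the 3D-model coordinate system."""
    return R @ em_point + t

# Example: recover a known rotation/translation from noiseless corresponding points.
rng = np.random.default_rng(0)
em = rng.normal(size=(20, 3))
true_R = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
model = em @ true_R.T + np.array([5.0, -2.0, 1.0])
R, t = register_rigid(em, model)
print(np.allclose(R, true_R), np.allclose(t, [5.0, -2.0, 1.0]))   # True True
```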
- FIGS. 6A-C illustrate example surgical procedures using an endoscope, e.g., endoscope 118 shown in FIG. 3A .
- FIG. 6A illustrates the distal end of the endoscope 118 within an anatomical lumen 602 according to one embodiment.
- the endoscope 118 includes a sheath 311 and navigates through the anatomical lumen 602 inside a patient toward an operative site 603 for a surgical procedure.
- FIG. 6B illustrates the endoscope 118 shown in FIG. 6A in use at the operative site 603 according to one embodiment.
- after reaching the operative site 603 , the endoscope 118 extends a distal leader section 317 , longitudinally aligned with the sheath 311 , in the direction marked by arrow 605 .
- the endoscope can also articulate the distal leader section 317 to direct surgical tools toward the operative site 603 .
- FIG. 6C illustrates the endoscope 118 shown in FIG. 6B with an aspiration needle 1007 according to one embodiment.
- the distal leader section 317 articulates in the direction marked by arrow 606 to convey the aspiration needle 1007 to target the lesion.
- the distal leader section 317 is integrated with the sheath 311 (not shown in FIG. 6 ).
- the distal leader section 317 navigates with the sheath 311 through the anatomical lumen 602 inside a patient toward an operative site 603 for a surgical procedure. After reaching the operative site 603 , surgical tools can be directed to the operative site 603 via the distal leader section 317 .
- the distal leader section 317 can be deployed through a working channel that is off-axis (neutral axis) of the sheath 311 , which allows the distal leader section 317 to operate without obscuring an image sensor (not shown in FIG. 6 ) coupled to the end of the sheath 311 (or any other location of the endoscope 118 ).
- This arrangement allows the image sensor to capture images inside the anatomical lumen while the endoscope 118 articulates the distal leader section 317 and keeps the sheath 311 stationary.
- further details of the distal leader section 317 , which may also be referred to as a flexure section, are disclosed in U.S. patent application Ser. No. 14/201,610, filed Mar. 7, 2014, and U.S. patent application Ser. No. 14/479,095, filed Sep. 5, 2014, the entire contents of which are incorporated by reference.
- endolumenal buckling is a phenomenon whereby a flexible instrument (e.g., endoscope) navigated within anatomical lumens towards an operative site or a surgical site prolapses in an undesired direction within the anatomical lumen in response to an insertion force.
- FIGS. 7A and 7B illustrate an example of endolumenal buckling occurring when an endoscope is inserted into a patient's lung 700 to an operative site 710 .
- the endoscope 118 is inserted into a patient's mouth, down the patient's trachea, and into the patient's lung 700 .
- the endoscope bends normally towards the operative site 710 located in a left upper lobe of the lung 700 .
- the sheath 740 of the endoscope is navigated to the left main bronchus first, and then the leader 730 navigates through the tertiary bronchi towards the operative site 710 .
- Improper placement of the sheath 740 relative to the operative site 710 may also result in undesirable buckling of the endoscope.
- the leader 730 will not be supported when attempting to insert into the upper lobe of patient's lung 700 in order to reach the operative site 710 .
- the insertion force on the sheath 740 is directed “downward”, i.e., towards the lower lobes of the patient's lung 700 , in the opposite direction of the upper lobes, where the operative site 710 is located.
- the insertion force vector on the leader 730 may be more aligned with the direction of the operative site 710 .
- greater insertion may be achieved with lower amounts of insertion force applied to the sheath 740 , in addition to a reduction in prolapsing or buckling by the leader 730 .
- Endolumenal buckling may occur in a variety of ways.
- the tip of the leader of the endoscope may become stuck or nearly stuck, and a portion of the leader or sheath may bend with a great amount of curvature as the endoscope is further inserted into the patient.
- the buckled portion stores potential energy and generates an opposing force that attempts to push the endoscope backward.
- a first region may cover the volume near the tip of the leader.
- a second region covers a portion of the leader in a range from an end of the sheath within the patient to the edge of the first region.
- a third region may cover the end of the sheath where the leader extends from as well as the portion of the sheath proximal to its end (also referred to as the distal sheath section).
- one or more sensors can be placed in any one of several locations.
- sensor locations include outer surface of the sheath or the leader, walls of the sheath or the leader, inner surface of sheath's lumen, inner surface of conduits of the leader or the sheath, one or more locations on pull wires of the leader or the sheath, another suitable location within the sensor region to place sensors, or some combination thereof.
- FIGS. 8A-B illustrate examples of sensor regions used to place sensors according to one embodiment.
- T 1 860 A and T 2 860 B are consecutive, or are separated by a time interval.
- a region of interest (ROI) 810 is selected and zoomed in.
- the ROI 810 includes the leader 730 and a portion of the sheath 740 .
- the zoomed-in ROIs without lung structures are shown at the bottom of FIG. 8A and FIG. 8B , respectively.
- Sensor region A 820 includes the tip of the leader 730 and a small portion proximal to the tip.
- the sensor region B 830 covers a portion of the leader 730 in the range from the end of the sheath 740 within the patient to the tip of the leader 730 .
- the sensor region C 840 includes the end of the sheath and a small portion of the distal sheath section.
- One or more different types of sensors can be placed in each sensor region.
- one or more position sensors, one or more force sensors, one or more shape sensors or some combination thereof can be placed in each sensor region.
- types of sensors include a position sensor (e.g., EM sensor, optical sensor, accelerometer, gyroscope, magnetometer, another suitable type of sensor that detects motion, or some combination thereof), a force sensor (e.g., resistance sensor, pressure sensor, strain gauge, torque sensor, friction sensor, another suitable type of sensor that detects various types of forces, or some combination thereof), an image sensor (e.g., CCD, CMOS, NMOS, another suitable type of sensor that detects and conveys the information that constitutes an image, or some combination thereof), a shape sensor (e.g., optical fiber shape sensor, another suitable type of sensor that detects boundary, outline or surface of an object, or some combination thereof).
- Sensor data captured from one or more sensor regions can be compared with expected data (also referred to as historical data or reference data) to determine if buckling has occurred.
- the expected data describes data associated with various characteristics caused by a motion of the endoscope during a navigation. Examples of the expected data include data associated with various expected statuses caused by the motion of the endoscope, sensor data captured from one or more different sensor regions, different types of sensor data captured from the same sensor region, different types of sensor data captured from one or more different sensor regions, or some combination thereof. More specifically, expected data includes data associated with various possible states/statuses caused by the motion of the endoscope.
- expected statuses include expected position of the tip or distal end of the sheath, expected position of a portion of the leader or sheath, expected bending shape of the leader or sheath, expected force generated by the expected bending of the leader or sheath, expected force detected by the tip of the leader or sheath, or any other measurable or derivable quantity relating to the state of the endoscope which may include, but is not limited to, shape, distance, length, slope, gradient, curvature, angle, etc., or some combination thereof.
- the sensor data (also referred to as measured data) collected from the sensors in the instrument during operation indicates a measured status based on an actual motion of the corresponding sensor regions where those sensors are placed.
- Examples of the measured statuses include a similar list of statuses as the list of expected statuses provided in the immediately previous paragraph.
- for example, sensor data collected from an imaging device on the tip (also referred to as optical flow data) and sensor data collected from an EM sensor located on the tip can both indicate a measured status (e.g., a position of the tip).
- the surgical robotic system 100 determines a measured status indicating a relative location of the tip within the patient. When the measured status indicated by the sensor data does not match or correlate to the expected status indicated by the expected data, the surgical robotics system 100 determines that endolumenal buckling has occurred. Examples are further described in Section II.A.1.
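- A minimal sketch of this measured-versus-expected comparison is given below; the particular status fields (tip position and insertion force) and the tolerance values are assumptions chosen only to show the structure of the check.

```python
# Illustrative comparison of a measured status against an expected status.
# The status fields and threshold values are assumptions, not prescribed values.

import numpy as np

def buckling_from_expected(measured_tip_pos: np.ndarray,
                           expected_tip_pos: np.ndarray,
                           measured_insertion_force_n: float,
                           expected_insertion_force_n: float,
                           position_tol_mm: float = 3.0,
                           force_tol_n: float = 0.5) -> bool:
    """Flag buckling when the tip position or insertion force deviates from expectation."""
    position_mismatch = np.linalg.norm(measured_tip_pos - expected_tip_pos) > position_tol_mm
    force_mismatch = (measured_insertion_force_n - expected_insertion_force_n) > force_tol_n
    return bool(position_mismatch or force_mismatch)

# Tip fell 8 mm short of the expected position while the insertion force ran high.
print(buckling_from_expected(np.array([10.0, 0.0, 0.0]), np.array([18.0, 0.0, 0.0]),
                             measured_insertion_force_n=2.4,
                             expected_insertion_force_n=1.5))   # True
```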
- Sensor data captured from one or more sensor regions can be compared with sensor data from the same and/or different sensor regions to determine if endolumenal buckling has occurred. For example, if sensor data captured from the one or more sensor regions indicates that the corresponding sensor regions of the endoscope have undergone a first status change (e.g., a status change indicating a force change in the first region), and sensor data from a different sensor region, or a different type of sensor data from the same sensor region indicates that the corresponding sensor region or sensor types has undergone a second status change (e.g., a status change indicating a force change in the third region, or a status change indicating that the tip has not moved in the first region), the surgical robotics system 100 determines that endolumenal buckling has occurred. Examples are further described in Section II.A.2.
- a status change indicates that some quantity measurable or derivable from the sensor data, which may include measured and expected sensor data, has changed by more or less than a threshold, often measured over some period of time (e.g., between T 1 and T 2 ).
- a first type of status change is a position change of some portion of the endoscope being less than a position threshold, representing a range of motion where the portion of the endoscope has not moved an appreciable distance, generally in response to an endoscope insertion command.
- a first example of the first type status change is where the tip of the leader or the end of the sheath within the patient has not moved or has moved less than a threshold amount in response to the command. For example, when an endoscope enters into an organ with a complex tubular network (e.g., a tubular network with variable bending, or with variable diameter), a certain insertion force is applied to the endoscope in order to move the endoscope to a target location.
- the surgical robotics system 100 may determine that endolumenal buckling has occurred based on this status change alone, or in combination with other types of status change, as further described in Section II.A.2.
- a second example is where a portion of the leader or a portion of the sheath does not move to an expected position, in response to the command.
- a third example is where a portion of the sheath (e.g., the end of sheath, a distal sheath section) has been retracted in response to the command.
- a second type of status change is a force change above a threshold, in response to a command, that is detected at the tip of the leader, a portion of the distal leader section, the end of the sheath, or a portion of the distal sheath section.
- a third type of status change identifies an unwanted motion, generally bending, along the leader or the sheath, generally in response to an endoscope insertion command.
- One example of the third type of status change includes a bending change (e.g., a slope change, a gradient change, a curvature change, etc.) among two or more points along the leader or the sheath that equals or exceeds a bending threshold, representing a situation where the leader or the sheath has appreciably bent in an unexpected manner in response to the command.
- Another example of the third type of status change includes a distance change between two points along the leader or the sheath that is less than a distance threshold, representing a situation where the distance between the two points has been shortened unexpectedly in response to the command.
- Another example of the third type of status change occurs in instances such as when navigating the endoscope through a turn in the patient's endolumenal network, where bending is expected but does not occur along the section of the endoscope where it is expected to occur.
- a lack of a bending change as measured by sensors along some points of the endoscope may suggest that bending has instead occurred elsewhere along the endoscope.
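- The paired status-change logic above can be expressed as in the sketch below; the specific pairing (tip motion well below the commanded insertion while the distal-sheath force rises) and the thresholds are illustrative assumptions used only to show the structure of the check.

```python
# Illustrative pairing of status changes from two sensor regions over an interval T1..T2.
# The chosen pair and the thresholds are assumptions, not prescribed values.

def status_change_buckling(tip_displacement_mm: float,
                           commanded_insertion_mm: float,
                           sheath_force_change_n: float,
                           motion_fraction_threshold: float = 0.2,
                           force_change_threshold_n: float = 0.8) -> bool:
    """Flag buckling when the tip barely advances while the distal-sheath force spikes."""
    tip_stalled = tip_displacement_mm < motion_fraction_threshold * commanded_insertion_mm
    force_spiked = sheath_force_change_n > force_change_threshold_n
    return tip_stalled and force_spiked

# Commanded 10 mm of insertion; the tip advanced only 1 mm while the distal-sheath
# force rose by 1.2 N over the same interval.
print(status_change_buckling(1.0, 10.0, 1.2))   # True
```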
- FIGS. 9A-9L illustrate examples of endolumenal buckling detection based on a comparison between measured status and expected status according to one embodiment.
- a sensor A, such as a position or force sensor, is placed in the first sensor region (e.g., the tip of the leader).
- FIGS. 9A-9B show a measured position A 915 A and an expected position A 915 B indicated by the sensor A 910 .
- the endoscope is inserted to a measured position A 915 A.
- a measured force in FIG. 9A (e.g., a friction force generated between the tip and the lung structure) may be greater than the expected force in FIG. 9B based on the command input, thereby indicating that buckling has occurred.
- a sensor C and a sensor D are placed in the second sensor region (e.g., a portion of the leader).
- both sensors C and D are position sensors.
- in response to a command to move the second region to expected positions C and D, the sensor C detects a measured position C and the sensor D detects a measured position D.
- the measured position C and measured position D are compared with the expected position C and the expected position D. The comparison indicates whether the measured positions (based on the raw data or some derivation thereof such as the distance between them) deviate from the expected positions more than a threshold (not matching) or less than a threshold (matching).
- if the measured and expected positions match, the surgical robotics system determines that buckling has not occurred, and that it has occurred if they do not.
- derived parameters used for detecting buckling include a slope, a distance, curvature, a gradient, another suitable parameter derived from the two positions, or some combination thereof.
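- The derived parameters mentioned above can be computed from discrete sensor positions as sketched below; the slope convention and the three-point circumscribed-circle curvature estimate (which needs one additional position) are illustrative choices rather than prescribed formulas.

```python
# Illustrative derived parameters from discrete position-sensor readings.

import numpy as np

def distance(p1: np.ndarray, p2: np.ndarray) -> float:
    return float(np.linalg.norm(p2 - p1))

def slope(p1: np.ndarray, p2: np.ndarray) -> float:
    """Slope of the segment in the x-z plane (rise over run), chosen for illustration."""
    return float((p2[2] - p1[2]) / (p2[0] - p1[0]))

def curvature(p1: np.ndarray, p2: np.ndarray, p3: np.ndarray) -> float:
    """Curvature (1/radius) of the circle through three sensor positions."""
    a, b, c = distance(p2, p3), distance(p1, p3), distance(p1, p2)
    area = 0.5 * float(np.linalg.norm(np.cross(p2 - p1, p3 - p1)))
    return 4.0 * area / (a * b * c) if a * b * c else 0.0

pC = np.array([0.0, 0.0, 0.0])
pD = np.array([10.0, 0.0, 2.0])
pE = np.array([20.0, 0.0, 8.0])        # an extra position, needed for the curvature estimate
print(distance(pC, pD), slope(pC, pD), curvature(pC, pD, pE))
```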
- sensors C and D are force sensors.
- the sensor C detects a measured force A (e.g., a first torque) and the sensor D detects a measured force B (e.g., a second torque) in FIG. 9C .
- the measured force A and measured force B are compared with the expected force A and the expected force B. The comparison indicates whether the measured forces (based on the raw data or some derivation thereof) deviate from the expected forces more than a threshold (not matching) or less than a threshold (matching). If the measured and expected forces match, the surgical robotic system 100 determines that buckling has not occurred, and that it has occurred if they do not.
- the sensor C and the sensor D have different sensor types.
- the sensor C is a position sensor and the sensor D is a force sensor.
- the sensor C detects a measured position C and the sensor D detects a measured force B.
- the measured position C is compared with the expected position C and the measured force B is compared with the expected force B.
- the comparisons indicate whether the measured position C deviates from the expected position C by more than a threshold (not matching) or less than a threshold (matching), and whether the measured force B deviates from the expected force B by more than a threshold (not matching) or less than a threshold (matching). If the measured and the expected match, the surgical robotic system determines that buckling has not occurred, and that it has occurred if they do not.
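- The per-sensor comparison in FIGS. 9C-9D generalizes naturally across sensor types. The sketch below assumes each sensor reading is paired with an expected value derived from the robotic command and a per-sensor match threshold; the record layout, units, and numbers are hypothetical.

```python
import numpy as np

# Hypothetical per-sensor records: each entry pairs a measured value with the
# expected value implied by the robotic command and a per-sensor match threshold.
readings = [
    {"sensor": "C", "type": "position", "measured": np.array([10.2, 4.9, 31.0]),
     "expected": np.array([12.0, 5.0, 33.0]), "threshold": 3.0},   # mm
    {"sensor": "D", "type": "force", "measured": 1.8,
     "expected": 0.6, "threshold": 0.5},                           # N
]

def deviation(measured, expected):
    # Euclidean distance for vector (position) data, absolute difference for scalars.
    diff = np.atleast_1d(np.asarray(measured, dtype=float) - np.asarray(expected, dtype=float))
    return float(np.linalg.norm(diff))

def buckling_suspected(readings):
    # Buckling is suspected if any sensor deviates from its expected status by more
    # than its associated threshold (i.e., measured and expected do not match).
    return any(deviation(r["measured"], r["expected"]) > r["threshold"] for r in readings)

print(buckling_suspected(readings))  # True for the sample values above
```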
- a sensor B is placed in the third sensor region (e.g., a portion of the distal sheath section).
- the measured position E is compared with the expected position E shown in FIG. 9F .
- because the measured position E shown in FIG. 9E has moved backward 960 , indicating that the measured position E does not match the expected position E, the surgical robotic system determines that buckling has occurred.
- the sensor B can also be a force sensor. For example, in response to a command to move the endoscope, the endoscope has an expected force C in the third region.
- the sensor B detects a measured force C (e.g., a friction between the third sensor region and the leader), and the measured force C is compared with the expected force C.
- the measured force is greater than the expected force C in FIG. 9F , indicating that the measured force C does not match the expected force C, so the surgical robotic system determines that buckling has occurred.
- the example embodiments illustrated in this section may be variously combined with each other to provide other possible sensor setups for an endoscope, as well as buckling detection processes that use the detection of status changes in more than one region at a time to identify or verify that buckling has occurred.
- expected vs. measured data from sensor A in the first sensor region A can be combined with expected vs. measured data from sensor B in the third sensor region as shown in FIGS. 9G-H .
- the sensor C and the sensor D can have the same or different sensor types.
- the shape of the leader can be detected using multiple position sensors as shown in FIGS. 9I-9J or by a shape sensing optical fiber as shown in FIGS. 9K-9L .
- a shape sensing optical fiber may include a segment of a fiber Bragg grating (FBG).
- the FBG reflects certain wavelengths of light, while transmitting other wavelengths.
- the surgical robotics system generates reflection spectrum data based on the wavelengths of light reflected by the FBG.
- the system can analyze the reflection spectrum data to generate position and orientation data of the endoscope in two or three dimensional space. In particular, as the endoscope bends, the shape sensing optical fiber embedded inside also bends.
- the specific wavelengths of light reflected by the FBG change based on the shape of the shape sensing optical fiber (e.g., a “straight” endoscope is in a different shape than a “curved” endoscope).
- the system can determine, for example, how many degrees the endoscope has bent in one or more directions (e.g., in response to commands from the surgical robotic system) by identifying differences in the reflection spectrum data.
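- For a rough sense of how reflection spectrum data maps to bend, the sketch below uses the standard strain-only FBG relation (relative wavelength shift ≈ (1 − p_e) × strain, with p_e ≈ 0.22 for silica fiber) and a small-deflection approximation; temperature compensation, multi-core geometry, and the system's actual calibration are omitted, and all parameter values are illustrative.

```python
import math

def bend_angle_from_fbg(wavelength_nm, base_wavelength_nm,
                        core_offset_mm, segment_length_mm,
                        photoelastic_coeff=0.22):
    """Rough bend estimate for one FBG segment, ignoring temperature effects.

    The relative Bragg wavelength shift relates to axial strain by
    d_lambda / lambda = (1 - p_e) * strain; for a fiber routed at a known offset
    from the instrument's neutral bending axis, strain / offset gives the local
    curvature, and curvature * segment length gives the bend angle.
    """
    rel_shift = (wavelength_nm - base_wavelength_nm) / base_wavelength_nm
    strain = rel_shift / (1.0 - photoelastic_coeff)
    curvature_per_mm = strain / core_offset_mm        # 1/mm
    return math.degrees(curvature_per_mm * segment_length_mm)

# Example: a 0.1 nm shift on a 1550 nm grating, 0.5 mm fiber offset, 30 mm segment.
print(bend_angle_from_fbg(1550.1, 1550.0, 0.5, 30.0))  # a fraction of a degree
```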
- Endolumenal buckling is detected based on a comparison between the measured shape and the expected shape as provided by the shape sensing optical fiber or the discrete sensors.
- a function can be used to estimate the shape of the leader (or sheath), e.g., linear interpolation (e.g., polynomial interpolation) or non-linear interpolation (e.g., spline interpolation), curve fitting based on one or more fitting functions, linear or non-linear regression analysis, or some combination thereof.
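- As one possible realization of the interpolation approach, the sketch below fits a cubic spline through discrete position-sensor readings (using SciPy, assumed available) and compares the resulting shape against an expected shape; the sensor spacing, tolerance, and coordinates are made up for illustration.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def estimate_shape(arc_lengths, positions, samples=50):
    """Fit a smooth curve through discrete sensor positions along the leader.

    arc_lengths: 1-D array of sensor locations along the instrument (mm).
    positions:   (N, 3) array of measured 3-D positions at those locations.
    Returns a densely sampled (samples, 3) estimate of the leader shape.
    """
    spline = CubicSpline(arc_lengths, positions, axis=0)
    s = np.linspace(arc_lengths[0], arc_lengths[-1], samples)
    return spline(s)

def shapes_match(measured_shape, expected_shape, tol_mm=4.0):
    # Compare point-wise deviation between measured and expected shapes.
    deviations = np.linalg.norm(measured_shape - expected_shape, axis=1)
    return float(deviations.max()) <= tol_mm

# Example with three discrete position sensors along a 100 mm leader section.
s = np.array([0.0, 50.0, 100.0])
measured = estimate_shape(s, np.array([[0, 0, 0], [2, 0, 48], [3, 0, 97]]))
expected = estimate_shape(s, np.array([[0, 0, 0], [0, 0, 50], [0, 0, 100]]))
print(shapes_match(measured, expected))  # False -> possible buckling
```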
- a shape sensing optical fiber 950 is placed along the leader (or sheath, not shown).
- the shape sensing sensor can be placed in conduits, alongside the pull wires, within the walls along the length of the leader (or the sheath).
- alternatively, the shape sensing sensor can be placed outside those conduits but still within the walls along the length of the leader (or the sheath).
- FIG. 10 is a flowchart of a general process 1000 for detecting endolumenal buckling based on a comparison between measured status and expected status according to one embodiment.
- a controller of a surgical robotics system, for example, the controller 120 of the surgical robotics system 100 shown in FIG. 1 , uses the process 1000 to detect endolumenal buckling.
- the process 1000 may include different or additional steps than those described in conjunction with FIG. 10 in some embodiments, or perform steps in different orders than the order described in conjunction with FIG. 10 .
- the controller 120 receives 1010 sensor data generated from a first sensor placed in a portion of the endoscope located within a patient lumen, and the sensor data indicates a measured status based on an actual motion of the portion of the endoscope.
- the portion of the endoscope can be any of the three sensor regions mentioned above, as shown in FIGS. 8A-8B . Examples are described in FIGS. 9A-9L .
- the controller 120 receives 1020 expected data describing data associated with an expected status caused by an expected motion of the endoscope.
- the expected data is robotic command data generated from an instrument device manipulator (IDM) physically coupled to the endoscope, where the robotic command data is configured to control the IDM to cause the portion of the endoscope to move within the patient towards an expected position.
- the robotic command data indicates the expected status based on the expected motion.
- the controller 120 compares 1030 the measured status with the expected status. Responsive to the measured status deviating from the expected status more or less than an associated threshold, the controller 120 determines 1040 that the endoscope has buckled. In some embodiments, the threshold indicates a match between the measured status and the expected status.
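- A minimal sketch of one pass through process 1000 follows; the callables standing in for the sensor interface and the command-derived expectation, as well as the threshold value, are assumptions rather than the specification's interfaces.

```python
import numpy as np

def process_1000_step(read_sensor, read_command_expectation, threshold):
    """One iteration of a buckling check in the spirit of the flowchart:
    (1010) receive the measured status from a sensor inside the patient lumen,
    (1020) receive the expected status implied by the robotic command data,
    (1030) compare them, and (1040) report buckling when they fail to match.
    """
    measured = np.atleast_1d(np.asarray(read_sensor(), dtype=float))                 # step 1010
    expected = np.atleast_1d(np.asarray(read_command_expectation(), dtype=float))    # step 1020
    deviation = float(np.linalg.norm(measured - expected))                           # step 1030
    return deviation > threshold   # step 1040: True -> endoscope considered buckled

# Example: a position sensor reports 38 mm of advance where 45 mm was commanded.
print(process_1000_step(lambda: 38.0, lambda: 45.0, threshold=5.0))  # True
```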
- Previously, buckling was described as being detected based on a difference between expected vs. measured behavior. This section describes how buckling can be detected based on a change in endoscope state between two points in time, generally during the carrying out of a motion command by the endoscope (e.g., insertion).
- FIGS. 11A-11H illustrate examples of endolumenal buckling detection based on before and after (or during) a command, according to one embodiment.
- Status change detection for each sensor region is similar to the examples described in FIGS. 9A-9H , with the exception that instead of using expected data and measured data to detect status change, measured data at two different points in time is used instead.
- a sensor A 1125 is placed in a sensor region A 1120 (e.g., tip of the endoscope).
- at a first time T 1 , the sensor A 1125 detects a measured status A (e.g., a position A, or a force A depending on the sensor type of sensor A).
- at a second time T 2 , the sensor A 1125 detects a measured status B (e.g., a position B, or a force B). If the difference between the measured statuses at T 1 and T 2 triggers one of the thresholds for one of the status changes (e.g., an increase in force, an insufficient change of position) for sensor A located near the tip, the system determines that buckling has occurred.
- while a single status change can be sufficient to detect buckling, in some instances the identification of two or more status changes helps determine or verify that buckling has occurred.
- These detected status changes may originate from different sensors of the same or different type in the same or different regions. For example, if another sensor of a different type (e.g., a force sensor) is placed in the sensor region A 1120 and that other sensor also detects a corresponding status change, then it may be better determined or verified that buckling has occurred.
- FIGS. 11C-11H illustrate examples of two status changes being detected in two different regions. Examples include various combinations of sensors in region A, B, and C.
- FIGS. 11C and 11D illustrate detecting buckling based on status changes in regions A and B.
- FIGS. 11E and 11F illustrate detecting buckling based on status changes in regions A and C, and
- FIGS. 11G and 11H illustrate detecting buckling based on status changes in regions B and C.
- buckling may be detected based on status changes in all three regions.
- FIG. 12 is a flowchart of a process 1200 for detecting endolumenal buckling based on status changes indicated by sensor data according to one example embodiment.
- the process 1200 may include different or additional steps than those described in conjunction with FIG. 12 in some embodiments, or perform steps in different orders than the order described in conjunction with FIG. 12 .
- a controller 120 of a surgical robotics system receives 1210 first sensor data generated from a first sensor placed in a portion of the endoscope located within a patient lumen, the first sensor data indicating motion of the portion of the endoscope.
- the first sensor is located in one of the three sensor regions (e.g., sensor regions A-C).
- the first sensor is located in the sensor region C.
- examples of the first sensor include a position sensor (e.g., an EM sensor), an image sensor, a force sensor, or a resistance sensor.
- the controller 120 receives 1220 second sensor data generated from a second sensor located at a distal tip of the endoscope, the second sensor data indicating motion of the distal tip of the endoscope.
- the second sensor is an imaging device mounted on the distal tip (e.g., the imaging device 349 on the endoscope 118 in FIG. 3C ).
- the second sensor data (also referred to as optical flow data) is images captured by the imaging device. As described in Section I.C.2., the second sensor data is used to estimate motion of the endoscope based on changes between a pair of images.
- the controller 120 evaluates 1230 the first sensor data to determine whether the portion of the endoscope has undergone a first status change (e.g., any type of status change mentioned above).
- the controller 120 evaluates 1240 the second sensor data to determine whether the distal tip of the endoscope has undergone a second status change (e.g., the tip does not move). Responsive to determining that the first sensor data indicates that the distal portion of the endoscope has had the first status change and that the second sensor data indicates that the distal tip of the endoscope has had the second status change, the controller 120 determines 1250 the endoscope has buckled.
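- A sketch of process 1200 is shown below. It assumes OpenCV is available and uses Farneback dense optical flow as a generic stand-in for the image-based tip motion estimate described above; the thresholds and function names are illustrative.

```python
import numpy as np
import cv2  # OpenCV, used here only as a generic optical-flow stand-in

def tip_motion_from_images(prev_gray, curr_gray):
    """Estimate apparent tip motion (pixels) from a pair of grayscale frames."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    return float(np.linalg.norm(flow.mean(axis=(0, 1))))  # mean flow magnitude

def process_1200_step(insertion_delta_mm, prev_frame, curr_frame,
                      min_tip_motion_px=1.0, min_body_advance_mm=2.0):
    """Sketch of process 1200: buckling is reported when the body of the endoscope
    registers an insertion (first status change) while the distal tip barely moves
    in the camera images (second status change)."""
    body_advanced = insertion_delta_mm >= min_body_advance_mm          # first sensor
    tip_stationary = tip_motion_from_images(prev_frame, curr_frame) < min_tip_motion_px
    return body_advanced and tip_stationary
```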
- FIGS. 13A-13F are examples of detecting buckling of an endoscope outside a patient according to one embodiment.
- sensors 1340 are placed on both leader base 1310 and sheath base 1320 .
- Two sensors constitute a transmitter-receiver pair.
- the transmitter transmits a light beam 1345 of infrared light or visible light
- the receiver coaxial with the transmitter or adjacent to the transmitter detects the light beam 1345 .
- the transmitter 1340 is placed opposite to the receiver 1343 as shown in FIG. 13A , or vice versa.
- the transmitter 1340 is placed around an exit 1315 of the proximal leader section 1330 on the leader base 1310 at a distance 1350 between the transmitter and the exit.
- the corresponding receiver 1343 is placed around an entrance 1325 of the proximal leader section 1330 on the sheath base 1320 at the same distance between the receiver and the entrance 1325 .
- the distance 1350 is within a threshold, representing a suitable distance range for detecting buckling.
- the transmitter-receiver pair may be placed on the same side of the proximal leader section, as shown in FIG. 13C .
- the transmitter-receiver pair is placed around the exit 1315 and a reflector 1360 is placed around the entrance 1325 to reflect a light beam transmitted from the transmitter to the corresponding receiver.
- the transmitter 1340 is placed at a distance A 1350 and a receiver 1343 is placed at a distance B 1355 .
- the distances A 1350 and B 1355 are within the threshold for detecting buckling. When buckling occurs, a buckled portion of the proximal leader section fully or partially blocks the light beam, and no light signal is detected by the receiver, or the light signal detected by the receiver is reduced accordingly.
- More than one set of transmitter-receiver pairs may be used to detect buckling at different directions. For example, multiple transmitters are placed around the exit 1315 , each at a distance from the exit 1315 within the threshold. The multiple transmitter-receiver pairs may be distributed to generate light beams parallel to each other, or they may be distributed to generate crossed light beams to better cover the cylindrical surface area around the endoscope. In some embodiments, the transmitted light beams are focused light, such as laser beams, though they may also be dispersed in nature and matched with receivers configured to receive the type of light emitted.
- FIG. 14 is a flowchart of a process 1400 for detecting buckling outside a patient based using transmitter-receiver pairs according to one embodiment.
- a controller of a surgical robotics system, for example, the controller 120 of the surgical robotics system 100 shown in FIG. 1 , uses the process 1400 to detect buckling.
- the process 1400 may include different or additional steps than those described in conjunction with FIG. 14 in some embodiments, or perform steps in different orders than the order described in conjunction with FIG. 14 .
- the controller 120 provides 1410 one or more commands from the surgical robotic system 100 to one or more actuators, for example, the sheath base 1320 and leader base 1310 shown in FIGS. 13A-13F , to move the endoscope 118 for a surgical procedure.
- the controller 120 receives receiver data generated from at least one transmitter-receiver pair placed along a length of the endoscope outside the patient, the transmitter-receiver pair configured to transmit a light beam from a transmitter to a receiver, the receiver data indicating whether the receiver has received the light beam transmitted from the transmitter.
- the transmitter is placed on the sheath base and the receiver is placed on the leader base as shown in FIG. 13B and FIGS. 13D-13F .
- responsive to the receiver data indicating that the receiver has not received the light beam (or that the received light is reduced), the controller 120 determines that the endoscope has buckled.
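- The receiver-side logic can be as simple as the sketch below, which treats a drop in received optical power below a fraction of its baseline as a blocked beam; the power units, blocked fraction, and data layout are assumptions.

```python
def beam_buckling_check(received_power_mw, baseline_power_mw, blocked_fraction=0.5):
    """If a buckled portion of the proximal leader section intrudes into the beam,
    the receiver sees the light fully or partially blocked; here the beam is
    considered blocked when received power falls below a fraction of its baseline."""
    return received_power_mw < blocked_fraction * baseline_power_mw

def any_beam_blocked(pair_readings, blocked_fraction=0.5):
    """pair_readings: iterable of (received_power_mw, baseline_power_mw) tuples,
    one per transmitter-receiver pair placed around the proximal leader section."""
    return any(beam_buckling_check(rx, base, blocked_fraction) for rx, base in pair_readings)

# Example: two pairs; the second pair sees its beam mostly blocked.
print(any_beam_blocked([(0.95, 1.0), (0.2, 1.0)]))  # True -> report buckling
```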
- FIG. 15 illustrates another example of detecting buckling of an endoscope outside a patient according to one embodiment.
- the sensor region 1540 located around the connection 1525 of the leader base 1520 is in contact with a proximal leader section 1530 .
- Sensors include strain gauges or load cells in rigid connection with the proximal leader section 1530 . Examples of strain configuration are described in U.S. application Ser. No.
- the controller 120 generates feedback for a user indicating that the endoscope has buckled and provides the feedback to users. For example, the controller 120 generates a message or a warning indicating that the endoscope has buckled. This message or warning may be provided for display on a graphical user interface (GUI), for example one or more monitors being used by the operator to control the operation.
- the controller 120 can also generate a recommendation to users. To do this, the controller 120 determines one or more modifications to a command to move the endoscope. The modification is based at least in part on the sensor data. For example, the controller 120 may adjust the command to smooth the buckled portion of the endoscope. Examples of commands include moving the endoscope backward, adjusting movement of the tip, adjusting the insertion force provided by the IDM, another suitable command that adjusts the endoscope's movements, stopping movement of the endoscope, or some combination thereof.
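- A sketch of how a buckling determination might be turned into user feedback and command modifications is shown below; the inputs and the specific corrective commands are illustrative placeholders, not the system's actual command set.

```python
def buckling_recommendation(buckled, tip_advancing, insertion_force_n, force_limit_n=3.0):
    """Turn a buckling determination into feedback and command modifications,
    in the spirit of the examples above (retract, adjust the tip, reduce the
    IDM insertion force, or stop)."""
    if not buckled:
        return {"message": None, "commands": []}
    commands = ["stop insertion"]
    if insertion_force_n > force_limit_n:
        commands.append("reduce insertion force")
    if not tip_advancing:
        commands.append("retract endoscope slightly to relieve the buckled portion")
    return {"message": "Warning: endoscope buckling detected", "commands": commands}

print(buckling_recommendation(True, tip_advancing=False, insertion_force_n=3.5)["commands"])
```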
- the first sensor region can be the tip of the endoscope or a small region around the end of the sheath.
- the second sensor region can be a portion of the sheath.
- the third sensor region may be omitted, or interpreted as another region along the sheath located further from the sheath tip than the second region.
- a surgical robotic system 100 uses one or more robotic arms 102 to control an endoscope 118 in a patient for surgical procedures.
- the robotic arms apply an insertion force to insert and advance the endoscope to an operative site.
- the force required to further advance the endoscope will change over time depending on a variety of factors including the location of the operative site, the path taken within the patient cavity to get there, the size of the endoscope, etc.
- the amount of force that may be safely applied without injuring the patient lumen will vary. For example, within a single lung network in a patient, a single force threshold limit that may be set to avoid injury is not applicable for all lobes.
- a dynamic force insertion threshold is needed to allow operations to be performed safely while still preventing the application of a level of force above that dynamic threshold.
- the surgical robotics system makes use of an adaptive insertion force threshold to regulate insertion force for different locations within a patient's lumen to avoid unsafe further insertion into the patient.
- the adaptive insertion force threshold is determined based on endoscopic data and patient data.
- the endoscopic data describes data associated with the endoscope during a navigation.
- Examples of the endoscopic data include a friction force between a sheath and a leader, a friction force between the sheath and internal anatomy, a friction force between the leader and the internal anatomy, a current location of the endoscope, a target location of the endoscope, insertion length of the sheath, insertion length of the leader, a distance between the sheath and the leader (e.g., a difference between the insertion length of the sheath and the insertion length of the leader, a distance between a distal end of the sheath and the tip of the endoscope), motion of the leader (e.g., translation, rotation, bending, etc.), motion of the sheath (e.g., translation, rotation, bending, etc.), motion of the tip (e.g., translation, rotation, deflection, etc.), a contact interaction between the tip and a portion of a tissue within a patient (e.g., contacting
- the endoscope data can be obtained from one or more sensors placed on the endoscope.
- a position sensor or an image sensor on the tip of the endoscope can obtain a current location of the endoscope, and motions of the tip.
- a force sensor on the tip can obtain a contacting force between the tip and a portion of a tissue within a patient, or other types of force between the tip and contacting tissue (e.g., friction, pressure, etc.).
- One or more sensors of different sensor types (e.g., a position sensor, a force sensor, a shape sensor, etc.) may be used in combination to obtain the endoscopic data.
- Patient data describes data associated with a patient into whom the endoscope is inserted.
- examples of patient data include medical data (e.g., medical diagnosis, medical treatment, disease, medical history, other suitable medical data affecting navigation, or some combination thereof), general information (e.g., gender, age, habits, etc.), or some combination thereof.
- the patient data may be stored in a database included in and accessible by the robotic surgical system.
- the adaptive insertion force threshold is determined by a function associated with the endoscopic data and patient data.
- the adaptive insertion force threshold is determined based on a nonlinear function associated with a relationship among an insertion force threshold, endoscopic data and patient data.
- By inputting the endoscopic data and patient data, the function generates an insertion force threshold.
- the adaptive insertion force threshold is determined based on optimizing a metric.
- the metric accounts for an effect of applying an insertion force within a safety range.
- the safety range describes a range within which the insertion force doesn't damage contacting tissues or organs within the patient. For example, an optimization function is used to find a maximum insertion force within the safety range.
- the insertion force threshold is determined based on a machine learning algorithm. For example, historical endoscope data and patient data regarding prior similar operations may be passed as a training data set into a machine learning model, and various parameters for determining the insertion force threshold are generated. The parameters may correspond to the types of patient and endoscopic data introduced above; however, additional or different parameters may also be used. In some embodiments, patient data can be used as constraints on the functions in the above embodiments. For example, if a patient has asthma, the walls of the airways become inflamed and oversensitive. Consequently, the force insertion threshold may be set to a lower value than it would be for a patient without asthma.
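- The following sketch shows one plausible hand-built form such a function or learned model could take, with endoscopic factors raising the allowance needed to advance and patient constraints (such as asthma) lowering the ceiling; every coefficient and force value here is a placeholder, not a clinically derived number.

```python
def adaptive_insertion_force_threshold(endoscopic, patient,
                                       base_threshold_n=3.0,
                                       safety_max_n=5.0):
    """Heuristic sketch of an adaptive insertion force threshold (newton values
    are illustrative only).

    endoscopic: dict with e.g. 'insertion_length_mm', 'leader_bend_deg'
    patient:    dict with e.g. 'age', 'asthma' (bool)
    """
    threshold = base_threshold_n
    # Longer insertions and sharper bends add friction, so allow a bit more force.
    threshold += 0.002 * endoscopic.get("insertion_length_mm", 0.0)
    threshold += 0.01 * endoscopic.get("leader_bend_deg", 0.0)
    # Patient data acts as a constraint: inflamed, oversensitive airways (asthma)
    # and advanced age pull the ceiling down.
    if patient.get("asthma", False):
        threshold *= 0.7
    if patient.get("age", 0) >= 75:
        threshold *= 0.85
    # Clip the result to the safety range.
    return max(0.5, min(threshold, safety_max_n))

print(adaptive_insertion_force_threshold(
    {"insertion_length_mm": 600, "leader_bend_deg": 90},
    {"age": 68, "asthma": True}))  # about 3.6 N with these made-up weights
```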
- the insertion force threshold may also be determined based on a look-up table.
- the look-up table includes data describing a plurality of insertion force thresholds having various characteristics. For example, the look-up table describes a plurality of insertion force thresholds associated with different endoscope's locations of a patient or of a group of patients.
- the look-up table may be obtained by statistical analysis of various endoscope data and various patient data, machine learning applied to various endoscope data and various patient data, data mining of various endoscope data and various patient data, or by any other suitable method.
- Various types of look-up tables may be stored by the surgical robotics system in different embodiments.
- Example types of look-up tables stored by the controller include: a probability distribution of a likelihood of insertion force thresholds relative to different locations of the endoscope, clusters of insertion force thresholds having different characteristics, or other suitable information (e.g., numbers, density, classification).
- the look-up table is obtained from procedures performed on patients having different characteristics (e.g., gender, age) by one or more robotic surgical systems.
- the look-up table may identify characteristics of insertion force thresholds obtained from a patient or from a threshold number or percentage of patients.
- a look-up table is generated for each patient. Based on patient data and endoscopic data, an insertion force threshold can be determined.
- a look-up table is generated for different types of patients.
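- A per-patient table with a population-level fallback might look like the sketch below; the keys, values, and units are placeholders.

```python
# A per-patient look-up table keyed by the endoscope's location (e.g., which lobe
# of the lung it is navigating). The numbers are placeholders; a real table would
# come from statistical analysis or learning over prior procedures.
lookup_table = {
    ("patient_123", "left_upper_lobe"):  3.6,   # insertion force threshold, N
    ("patient_123", "right_lower_lobe"): 2.9,
    ("default",     "left_upper_lobe"):  3.2,
    ("default",     "right_lower_lobe"): 2.7,
}

def threshold_from_table(patient_id, location, table=lookup_table):
    """Fall back to a population-level ('default') entry when no patient-specific
    threshold has been tabulated for this location."""
    return table.get((patient_id, location), table[("default", location)])

print(threshold_from_table("patient_123", "right_lower_lobe"))  # 2.9
print(threshold_from_table("patient_456", "left_upper_lobe"))   # 3.2 (default)
```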
- FIGS. 16A-C illustrate examples of adaptive insertion force thresholds used at different locations of an endoscope with different patients according to an embodiment.
- FIG. 16A shows two examples of inserting an endoscope to an operative site. The first example shows the endoscope is inserted into an operative site A 1610 A located in the left upper lobe of lung 1600 . The second example shows the endoscope is inserted into an operative site B 1610 B located in the right lower lobe of the lung 1600 . As shown in FIG. 16A , the two examples have different endoscope data.
- the two examples have different locations of the endoscope, different insertion lengths of the sheath 1630 , different lengths of the leader 1620 , different distances between the sheath 1630 and the leader 1620 , different motions of the endoscope (e.g., the leader 1620 A bends more than the leader 1620 B), etc.
- Different endoscope data results in different insertion force thresholds.
- the first example needs more insertion force to overcome a force (e.g., torque, friction) generated due to bending.
- different patients may have different insertion force thresholds at the same operative site.
- the insertion force threshold to allow insertion of the endoscope while preventing injury may not be a value that can be precisely determined based on available data. Consequently, the system may instead determine an insertion force threshold region with a size determined based on any of the techniques described previously.
- An insertion force threshold region indicates a probability distribution (e.g., a cluster or density) of a likelihood of insertion force threshold being safe (i.e., not harming the patient) relative to a location of the endoscope (e.g., a location proximal to the operative site), or statistical data of insertion force threshold relative to the location of the endoscope.
- the insertion force threshold region indicates a plurality of possible insertion force thresholds relative to a plurality of possible locations during a navigation to an operative site.
- FIGS. 16B-16C illustrate region 1645 A from a first patient 1640 and an insertion force threshold region 1655 A from a second patient 1650 , both associated with operative site A 1610 A, and similar insertion force threshold regions 1645 B and 1655 B for the first and second patients with respect to a second operative site 1610 B.
- the surgical robotic system actively determines the insertion force threshold during a navigation.
- the insertion force thresholds may be pre-determined and tagged to different portions of a pre-operative model as part of a robotic pre-operative planning stage.
- the surgical robotic system compares the insertion force with the determined insertion force threshold.
- the insertion force can be detected by one or more force sensors coupled to a robotic arm of the surgical robotic system.
- the surgical robotic system sends visual and/or audio feedback to a user via the system GUI, for example, a warning indicating that the insertion force is very close to, or is approaching, the insertion force threshold.
- Different colors such as green, yellow, and red, may be used to indicate relative distance to the insertion force threshold.
- upon reaching the insertion force threshold, the surgical robotic system generates a recommendation to the user.
- the surgical robotic system determines one or more modifications to a command to insert the endoscope.
- the modification is based at least in part on the endoscopic data and patient data.
- Examples of commands include ceasing one or more insertion forces from the surgical robotic system, reducing the insertion force, another suitable command that adjusts the insertion force, or some combination thereof.
- FIG. 17 is a flowchart of a process 1700 for inserting an endoscope using an adaptive insertion force threshold according to one embodiment.
- a controller of a surgical robotics system, for example, the controller 120 of the surgical robotics system 100 shown in FIG. 1 , uses the process 1700 to insert the endoscope using the adaptive insertion force threshold.
- the process 1700 may include different or additional steps than those described in conjunction with FIG. 17 in some embodiments, or perform steps in different orders than the order described in conjunction with FIG. 17 .
- the controller 120 receives 1710 endoscopic data from an endoscope of a robotic surgical system, the endoscope data based in part on a current location of the endoscope.
- the controller 120 can obtain sensor data as endoscopic data from one or more sensors placed on the endoscope (e.g., sheath, leader, or tip).
- the controller 120 accesses 1720 patient data associated with a patient, the patient data based in part on medical data associated with the patient.
- the controller 120 can access a patient data database stored in the robotic surgical system.
- the controller 120 can obtain the patient data by accessing one or more external databases via a network.
- the controller 120 determines 1730 an adaptive force insertion threshold based on the endoscopic data and the patient data. For example, the controller 120 determines the adaptive force insertion threshold based on one or more functions or models, a look-up table, or based on insertion force threshold region.
- the controller 120 receives 1740 an insertion force detected by one or more force sensors coupled to a robotic arm of the robotic surgical system, the insertion force applied by the arm to the endoscope.
- one or more force sensors can be placed on one or more arm segments of the robotic arm, one or more joints of the robotic arm, a connection between the robotic arm and an IDM, other suitable locations affecting movement of the robotic arm, or some combination thereof.
- the controller 120 compares 1750 the insertion force with the adaptive insertion force threshold. Responsive to the insertion force exceeding the adaptive force threshold, the controller 120 sends 1760 an endoscope command recommendation to the robotic surgical system. For example, if the insertion force exceeds the adaptive force threshold, the controller 120 sends a message or a warning indicating that the insertion force exceeds the insertion force threshold. The controller 120 determines one or more modifications to a command to adjust the insertion force.
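- Pulling the steps together, one iteration of process 1700 could be sketched as follows; the data structures, the recommendation strings, and the injected threshold function are illustrative assumptions, not the specification's interfaces.

```python
def process_1700_step(endoscopic, patient, measured_insertion_force_n,
                      determine_threshold):
    """Sketch of one pass through the insertion-regulation flow:
    1710/1720 collect endoscopic and patient data, 1730 determine the adaptive
    threshold, 1740/1750 compare the measured insertion force against it, and
    1760 return a recommendation when the threshold is exceeded.

    `determine_threshold` is any callable mapping (endoscopic, patient) to a
    force threshold, e.g. a fitted model or a look-up table.
    """
    threshold = determine_threshold(endoscopic, patient)          # step 1730
    if measured_insertion_force_n > threshold:                    # step 1750
        return {                                                  # step 1760
            "warning": "insertion force exceeds adaptive threshold",
            "recommendations": ["reduce insertion force", "pause insertion"],
            "threshold_n": threshold,
        }
    return {"warning": None, "threshold_n": threshold}

# Example: the arm-mounted force sensors report 4.2 N against a 3.0 N threshold.
print(process_1700_step({}, {}, 4.2, lambda endo, pat: 3.0)["warning"])
```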
- any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment.
- the appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
- Some embodiments may be described using the terms “coupled” and “connected” along with their derivatives.
- some embodiments may be described using the term “coupled” to indicate that two or more elements are in direct physical or electrical contact.
- the term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other.
- the embodiments are not limited in this context unless otherwise explicitly stated.
- the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion.
- a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
- “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
Abstract
Description
- This application is a divisional of U.S. patent application Ser. No. 15/392,868, filed Dec. 28, 2016, which is related to U.S. patent application Ser. No. 15/392,917, filed on Dec. 28, 2016, each of which is incorporated herein by reference in its entirety for all purposes.
- This description generally relates to surgical robotics, and particularly to controlling insertion of a surgical instrument into an anatomical lumen of a patient.
- Robotic technologies have a range of applications. In particular, robotic arms help complete tasks that a human would normally perform. For example, factories use robotic arms to manufacture automobiles and consumer electronics products. Additionally, scientific facilities use robotic arms to automate laboratory procedures such as transporting microplates. Recently, physicians and/or surgeons have started using robotic arms to help perform surgical procedures. For instance, physicians and/or surgeons use robotic arms to control surgical instruments such as endoscopes.
- An endoscope is able to perform surgical procedures in a minimally invasive manner. The endoscope can be directed to a target location of a patient, such as the lung or a blood vessel. The robotic arms apply a force to insert the endoscope into an open access point of a patient, e.g., mouth, anus, urethra, to the target location within the patient lumen. As the endoscope is inserted deeper into the patient anatomy, the endoscope may brush, rub, and push against internal anatomy that may be fragile and subject to tearing if too much insertion force is applied. Moreover, as the endoscope moves to the target location, the endoscope may buckle in response to slack or insertion resistance in the endoscope and incidental force from coming in contact with patient anatomy. When the endoscope buckles, the physicians and/or surgeons continue to push the scope and increase insertion force beyond normal levels in order to advance the endoscope. This creates a danger of the buckled portion of the endoscope storing up undesirable potential energy, which may be unwound in an uncontrollable way within the patient lumen/cavity or may damage the endoscope.
- The present disclosure describes the determination of an insertion force threshold to regulate an insertion force of an instrument within a patient's lumen in order to prevent buckling of the instrument or possible injury to the patient. The insertion force threshold may be dynamically determined based on real time data captured from the instrument and data associated with the patient as the instrument moves to an operative site. Additionally or alternatively, the insertion force threshold may be at least partially pre-determined and tagged to different portions of a pre-operative model.
- Other aspects include methods, components, devices, systems, improvements, processes, applications, computer readable mediums, and other technologies related to any of the above.
- FIG. 1A illustrates a surgical robotic system according to one embodiment.
- FIGS. 1B-1F show various perspective views of a robotic platform coupled to the surgical robotic system shown in FIG. 1A, according to one embodiment.
- FIG. 2 illustrates a command console for a surgical robotic system according to one embodiment.
- FIG. 3A illustrates multiple degrees of motion of an endoscope according to one embodiment.
- FIG. 3B is a top view of an endoscope according to one embodiment.
- FIG. 3C is a cross sectional isometric view of the leader of the endoscope according to one embodiment.
- FIG. 4A is an isometric view of an instrument device manipulator of a surgical robotic system according to one embodiment.
- FIG. 4B is an exploded isometric view of the instrument device manipulator shown in FIG. 4A according to one embodiment.
- FIG. 4C is an isometric view of an independent drive mechanism of the instrument device manipulator shown in FIG. 4A according to one embodiment.
- FIG. 4D illustrates a conceptual diagram that shows how forces may be measured by a strain gauge of the independent drive mechanism shown in FIG. 4C according to one embodiment.
- FIG. 5A is a flowchart of a process for determining movements of an endoscope from a sequence of recorded images according to one embodiment.
- FIG. 5B is a diagram of an electromagnetic tracking system according to one embodiment.
- FIG. 6A illustrates the distal end of an endoscope within an anatomical lumen according to one embodiment.
- FIG. 6B illustrates the endoscope shown in FIG. 6A in use at an operative site according to one embodiment.
- FIG. 6C illustrates the endoscope shown in FIG. 6B with an aspiration needle according to one embodiment.
- FIGS. 7A and 7B illustrate an example of endolumenal buckling occurring when an endoscope is inserted into a patient's lung to an operative site according to one embodiment.
- FIGS. 8A and 8B illustrate examples of sensor regions used to place sensors according to one embodiment.
- FIGS. 9A-9L illustrate examples of endolumenal buckling detection based on a comparison between measured status and expected status according to one embodiment.
- FIG. 10 is a flowchart of a process for detecting endolumenal buckling based on a comparison between measured status and expected status according to one embodiment.
- FIGS. 11A-11H illustrate examples of endolumenal buckling detection based on before and after (or during) a command, according to one embodiment.
- FIG. 12 is a flowchart of a process for detecting endolumenal buckling based on status changes indicated by sensor data according to one embodiment.
- FIGS. 13A-13F are examples of detecting buckling of an endoscope outside a patient according to one embodiment.
- FIG. 14 is a flowchart of a process for detecting buckling outside a patient using transmitter-receiver pairs according to one embodiment.
- FIG. 15 illustrates another example of detecting buckling of an endoscope outside a patient according to one embodiment.
- FIGS. 16A-C illustrate examples of adaptive insertion force thresholds used at different locations of an endoscope with different patients according to an embodiment.
- FIG. 17 is a flowchart of a process for inserting an endoscope using an adaptive insertion force threshold according to one embodiment.
- The figures depict embodiments of the present invention for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.
- The methods and apparatus disclosed herein are well suited for use with one or more endoscope components or steps as described in U.S. application Ser. No. 14/523,760, filed on Oct. 24, 2014, published as U.S. Pat. Pub. No. US 2015/0119637, entitled “SYSTEM FOR ROBOTIC-ASSISTED ENDOLUMENAL SURGERY AND RELATED METHODS,” the full disclosure of which is incorporated herein by reference. The aforementioned application describes system components, endolumenal systems, virtual rail configurations, mechanism changer interfaces, instrument device manipulators (IDMs), endoscope tool designs, control consoles, endoscopes, instrument device manipulators, endolumenal navigation, and endolumenal procedures suitable for combination in accordance with embodiments disclosed herein. The principles described in the above application are also applicable to catheter designs. Generally, although the following sections of this description describe endoscope embodiments, this is merely one example, and the description that follows can also be implemented and/or used in conjunction with catheters as well, or more generally any flexible instrument comprising an elongate body.
-
FIG. 1A illustrates a surgicalrobotic system 100 according to one embodiment. The surgicalrobotic system 100 includes a base 101 coupled to one or more robotic arms, e.g.,robotic arm 102. Thebase 101 is communicatively coupled to a command console, which is further described with reference toFIG. 2 in Section I.B. Command Console. The base 101 can be positioned such that therobotic arm 102 has access to perform a surgical procedure on a patient, while a user such as a physician may control the surgicalrobotic system 100 from the comfort of the command console. In some embodiments, thebase 101 may be coupled to a surgical operating table or bed for supporting the patient. Though not shown inFIG. 1 for purposes of clarity, thebase 101 may include subsystems such as control electronics, pneumatics, power sources, optical sources, and the like. Therobotic arm 102 includesmultiple arm segments 110 coupled atjoints 111, which provides therobotic arm 102 multiple degrees of freedom, e.g., seven degrees of freedom corresponding to seven arm segments. The base 101 may contain a source ofpower 112,pneumatic pressure 113, and control andsensor electronics 114—including components such as a central processing unit, data bus, control circuitry, and memory—and related actuators such as motors to move therobotic arm 102. Theelectronics 114 in thebase 101 may also process and transmit control signals communicated from the command console. - In some embodiments, the
base 101 includeswheels 115 to transport the surgicalrobotic system 100. Mobility of the surgicalrobotic system 100 helps accommodate space constraints in a surgical operating room as well as facilitate appropriate positioning and movement of surgical equipment. Further, the mobility allows therobotic arms 102 to be configured such that therobotic arms 102 do not interfere with the patient, physician, anesthesiologist, or any other equipment. During procedures, a user may control therobotic arms 102 using control devices such as the command console. - In some embodiments, the
robotic arm 102 includes set up joints that use a combination of brakes and counter-balances to maintain a position of the robotic arm 102. The counter-balances may include gas springs or coil springs. The brakes, e.g., fail safe brakes, may include mechanical and/or electrical components. Further, the robotic arms 102 may be gravity-assisted passive support type robotic arms. - Each
robotic arm 102 may be coupled to an instrument device manipulator (IDM) 117 using a mechanism changer interface (MCI) 116. TheIDM 117 can be removed and replaced with a different type of IDM, for example, a first type of IDM manipulates an endoscope, while a second type of IDM manipulates a laparoscope. TheMCI 116 includes connectors to transfer pneumatic pressure, electrical power, electrical signals, and optical signals from therobotic arm 102 to theIDM 117. TheMCI 116 can be a set screw or base plate connector. TheIDM 117 manipulates surgical instruments such as theendoscope 118 using techniques including direct drive, harmonic drive, geared drives, belts and pulleys, magnetic drives, and the like. TheMCI 116 is interchangeable based on the type ofIDM 117 and can be customized for a certain type of surgical procedure. Therobotic arm 102 can include a joint level torque sensing and a wrist at a distal end, such as the KUKA AG® LBR5 robotic arm. - The
endoscope 118 is a tubular and flexible surgical instrument that is inserted into the anatomy of a patient to capture images of the anatomy (e.g., body tissue). In particular, theendoscope 118 includes one or more imaging devices (e.g., cameras or sensors) that capture the images. The imaging devices may include one or more optical components such as an optical fiber, fiber array, or lens. The optical components move along with the tip of theendoscope 118 such that movement of the tip of theendoscope 118 results in changes to the images captured by the imaging devices. Theendoscope 118 is further described with reference toFIGS. 3A-3C in Section I.C. Endoscope. -
Robotic arms 102 of the surgicalrobotic system 100 manipulate theendoscope 118 using elongate movement members. The elongate movement members may include pull wires, also referred to as pull or push wires, cables, fibers, or flexible shafts. For example, therobotic arms 102 actuate multiple pull wires coupled to theendoscope 118 to deflect the tip of theendoscope 118. The pull wires may include both metallic and non-metallic materials such as stainless steel, Kevlar, tungsten, carbon fiber, and the like. Theendoscope 118 may exhibit nonlinear behavior in response to forces applied by the elongate movement members. The nonlinear behavior may be based on stiffness and compressibility of theendoscope 118, as well as variability in slack or stiffness between different elongate movement members. - The surgical
robotic system 100 includes acontroller 120, for example, a computer processor. Thecontroller 120 includesimage registration module 130, and a store 135. The surgicalrobotic system 100 uses theimage registration module 130 for determining movement of the endoscope, which is further described in Section I.C.2. Optical Flow and I.C.3. EM Registration. In some embodiments, some or all functionality of thecontroller 120 is performed outside the surgicalrobotic system 100, for example, on another computer system or server communicatively coupled to the surgicalrobotic system 100. -
FIGS. 1B-1F show various perspective views of the surgical robotic system 100 coupled to a robotic platform 150 (or surgical bed), according to various embodiments. Specifically, FIG. 1B shows a side view of the surgical robotic system 100 with the robotic arms 102 manipulating the endoscope 118 to insert the endoscope inside a patient's body, and the patient is lying on the robotic platform 150. FIG. 1C shows a top view of the surgical robotic system 100 and the robotic platform 150, and the endoscope 118 manipulated by the robotic arms is inserted inside the patient's body. FIG. 1D shows a perspective view of the surgical robotic system 100 and the robotic platform 150, and the endoscope 118 is controlled to be positioned horizontally parallel with the robotic platform. FIG. 1E shows another perspective view of the surgical robotic system 100 and the robotic platform 150, and the endoscope 118 is controlled to be positioned relatively perpendicular to the robotic platform. In more detail, in FIG. 1E, the angle between the horizontal surface of the robotic platform 150 and the endoscope 118 is 75 degrees. FIG. 1F shows the perspective view of the surgical robotic system 100 and the robotic platform 150 shown in FIG. 1E, and in more detail, the angle between the endoscope 118 and the virtual line 160 connecting one end 180 of the endoscope and the robotic arm 102 that is positioned relatively farther away from the robotic platform is 90 degrees. -
FIG. 2 illustrates acommand console 200 for a surgicalrobotic system 100 according to one embodiment. Thecommand console 200 includes aconsole base 201,display modules 202, e.g., monitors, and control modules, e.g., akeyboard 203 and joystick 204. In some embodiments, one or more of thecommand module 200 functionality may be integrated into abase 101 of the surgicalrobotic system 100 or another system communicatively coupled to the surgicalrobotic system 100. Auser 205, e.g., a physician, remotely controls the surgicalrobotic system 100 from an ergonomic position using thecommand console 200. - The
console base 201 may include a central processing unit, a memory unit, a data bus, and associated data communication ports that are responsible for interpreting and processing signals such as camera imagery and tracking sensor data, e.g., from theendoscope 118 shown inFIG. 1 . In some embodiments, both theconsole base 201 and the base 101 perform signal processing for load-balancing. Theconsole base 201 may also process commands and instructions provided by theuser 205 through thecontrol modules 203 and 204. In addition to thekeyboard 203 and joystick 204 shown inFIG. 2 , the control modules may include other devices, for example, computer mice, trackpads, trackballs, control pads, video game controllers, and sensors (e.g., motion sensors or cameras) that capture hand gestures and finger gestures. - The
user 205 can control a surgical instrument such as theendoscope 118 using thecommand console 200 in a velocity mode or position control mode. In velocity mode, theuser 205 directly controls pitch and yaw motion of a distal end of theendoscope 118 based on direct manual control using the control modules. For example, movement on the joystick 204 may be mapped to yaw and pitch movement in the distal end of theendoscope 118. The joystick 204 can provide haptic feedback to theuser 205. For example, the joystick 204 vibrates to indicate that theendoscope 118 cannot further translate or rotate in a certain direction. Thecommand console 200 can also provide visual feedback (e.g., pop-up messages) and/or audio feedback (e.g., beeping) to indicate that theendoscope 118 has reached maximum translation or rotation. - In position control mode, the
command console 200 uses a three-dimensional (3D) map of a patient and pre-determined computer models of the patient to control a surgical instrument, e.g., theendoscope 118. Thecommand console 200 provides control signals torobotic arms 102 of the surgicalrobotic system 100 to manipulate theendoscope 118 to a target location. Due to the reliance on the 3D map, position control mode requires accurate mapping of the anatomy of the patient. - In some embodiments,
users 205 can manually manipulaterobotic arms 102 of the surgicalrobotic system 100 without using thecommand console 200. During setup in a surgical operating room, theusers 205 may move therobotic arms 102,endoscopes 118, and other surgical equipment to access a patient. The surgicalrobotic system 100 may rely on force feedback and inertia control from theusers 205 to determine appropriate configuration of therobotic arms 102 and equipment. - The
display modules 202 may include electronic monitors, virtual reality viewing devices, e.g., goggles or glasses, and/or other means of display devices. In some embodiments, thedisplay modules 202 are integrated with the control modules, for example, as a tablet device with a touchscreen. Further, theuser 205 can both view data and input commands to the surgicalrobotic system 100 using the integrateddisplay modules 202 and control modules. - The
display modules 202 can display 3D images using a stereoscopic device, e.g., a visor or goggle. The 3D images provide an "endo view" (i.e., endoscopic view), which is a computer 3D model illustrating the anatomy of a patient. The "endo view" provides a virtual environment of the patient's interior and an expected location of an endoscope 118 inside the patient. A user 205 compares the "endo view" model to actual images captured by a camera to help mentally orient and confirm that the endoscope 118 is in the correct (or approximately correct) location within the patient. The "endo view" provides information about anatomical structures, e.g., the shape of an intestine or colon of the patient, around the distal end of the endoscope 118. The display modules 202 can simultaneously display the 3D model and computerized tomography (CT) scans of the anatomy around the distal end of the endoscope 118. Further, the display modules 202 may overlay pre-determined optimal navigation paths of the endoscope 118 on the 3D model and CT scans. - In some embodiments, a model of the
endoscope 118 is displayed with the 3D models to help indicate a status of a surgical procedure. For example, the CT scans identify a lesion in the anatomy where a biopsy may be necessary. During operation, thedisplay modules 202 may show a reference image captured by theendoscope 118 corresponding to the current location of theendoscope 118. Thedisplay modules 202 may automatically display different views of the model of theendoscope 118 depending on user settings and a particular surgical procedure. For example, thedisplay modules 202 show an overhead fluoroscopic view of theendoscope 118 during a navigation step as theendoscope 118 approaches an operative region of a patient. -
FIG. 3A illustrates multiple degrees of motion of anendoscope 118 according to one embodiment. Theendoscope 118 is an embodiment of theendoscope 118 shown inFIG. 1 . As shown inFIG. 3A , thetip 301 of theendoscope 118 is oriented with zero deflection relative to a longitudinal axis 306 (also referred to as a roll axis 306). To capture images at different orientations of thetip 301, a surgicalrobotic system 100 deflects thetip 301 on apositive yaw axis 302,negative yaw axis 303,positive pitch axis 304,negative pitch axis 305, or rollaxis 306. Thetip 301 orbody 310 of theendoscope 118 may be elongated or translated in thelongitudinal axis 306,x-axis 308, or y-axis 309. - The
endoscope 118 includes areference structure 307 to calibrate the position of theendoscope 118. For example, the surgicalrobotic system 100 measures deflection of theendoscope 118 relative to thereference structure 307. Thereference structure 307 is located on a proximal end of theendoscope 118 and may include a key, slot, or flange. Thereference structure 307 is coupled to a first drive mechanism for calculating movement and is coupled to a second drive mechanism, e.g., theIDM 117, to perform a surgical procedure. -
FIG. 3B is a top view of anendoscope 118 according to one embodiment. Theendoscope 118 includes aleader 315 tubular component nested or partially nested inside and longitudinally-aligned with asheath 311 tubular component, such that the leader telescopes out of the sheath. Thesheath 311 includes aproximal sheath section 312 anddistal sheath section 313. Theleader 315 has a smaller outer diameter than thesheath 311 and includes aproximal leader section 316 anddistal leader section 317. Thesheath base 314 and theleader base 318 actuate thedistal sheath section 313 and thedistal leader section 317, respectively, for example, based on control signals from a user of a surgicalrobotic system 100. Thesheath base 314 and theleader base 318 are, e.g., part of theIDM 117 shown inFIG. 1 . - Both the
sheath base 314 and the leader base 318 include drive mechanisms (e.g., the independent drive mechanism further described with reference to FIGS. 4A-D in Section II.C.4. Instrument Device Manipulator) to control pull wires coupled to the sheath 311 and leader 315. For example, the sheath base 314 generates tensile loads on pull wires coupled to the sheath 311 to deflect the distal sheath section 313. Similarly, the leader base 318 generates tensile loads on pull wires coupled to the leader 315 to deflect the distal leader section 317. Both the sheath base 314 and leader base 318 may also include couplings for the routing of pneumatic pressure, electrical power, electrical signals, or optical signals from IDMs to the sheath 311 and leader 315, respectively. A pull wire may include a steel coil pipe along the length of the pull wire within the sheath 311 or the leader 315, which transfers axial compression back to the origin of the load, e.g., the sheath base 314 or the leader base 318, respectively. - The
endoscope 118 can navigate the anatomy of a patient with ease due to the multiple degrees of freedom provided by pull wires coupled to thesheath 311 and theleader 315. For example, four or more pull wires may be used in either thesheath 311 and/or theleader 315, providing eight or more degrees of freedom. In other embodiments, up to three pull wires may be used, providing up to six degrees of freedom. Thesheath 311 andleader 315 may be rotated up to 360 degrees along alongitudinal axis 306, providing more degrees of motion. The combination of rotational angles and multiple degrees of freedom provides a user of the surgicalrobotic system 100 with a user friendly and instinctive control of theendoscope 118. -
FIG. 3C is a cross sectional isometric view of theleader 315 of theendoscope 118 according to one embodiment. Theleader 315 includes an imaging device 349 (e.g., image sensor, still or video camera, 2D or 3D detector array, charge-coupled device (CCD) or complementary metal-oxide semiconductor (CMOS) camera, imaging fiber bundle, etc.), light sources 350 (e.g., white light source, laser diode, light-emitting diode (LED), optic fiber illuminator, etc.), and at least one workingchannel 343 for other components. For example, other components include camera wires, an insufflation device, a suction device, electrical wires, fiber optics, an ultrasound transducer, position sensing components, electromagnetic (EM) sensing components, and optical coherence tomography (OCT) sensing components. In some embodiments, theleader 315 includes a pocket hole to accommodate insertion of a component into a workingchannel 343. -
FIG. 4A is an isometric view of aninstrument device manipulator 117 of the surgicalrobotic system 100 according to one embodiment. Therobotic arm 102 is coupled to theIDM 117 via an articulatinginterface 401. TheIDM 117 is coupled to theendoscope 118. The articulatinginterface 401 may transfer pneumatic pressure, power signals, control signals, and feedback signals to and from therobotic arm 102 and theIDM 117. TheIDM 117 may include a gear head, motor, rotary encoder, power circuits, and control circuits. Atool base 403 for receiving control signals from theIDM 117 is coupled to the proximal end of theendoscope 118. Based on the control signals, theIDM 117 manipulates theendoscope 118 by actuating output shafts, which are further described below with reference toFIG. 4B . -
FIG. 4B is an exploded isometric view of the instrument device manipulator shown in FIG. 4A according to one embodiment. In FIG. 4B, the endoscope 118 has been removed from the IDM 117 to reveal the output shafts 405, 406, 407, and 408. -
FIG. 4C is an isometric view of an independent drive mechanism of the instrument device manipulator 117 shown in FIG. 4A according to one embodiment. The independent drive mechanism can tighten or loosen the pull wires 421, 422, 423, and 424 (e.g., independently from each other) of an endoscope by rotating the output shafts 405, 406, 407, and 408 of the IDM 117, respectively. Just as the output shafts 405, 406, 407, and 408 transfer force down the pull wires 421, 422, 423, and 424, respectively, through angular motion, the pull wires 421, 422, 423, and 424 transfer force back to the output shafts. The IDM 117 and/or the surgical robotic system 100 can measure the transferred force using a sensor, e.g., a strain gauge further described below. -
FIG. 4D illustrates a conceptual diagram that shows how forces may be measured by astrain gauge 434 of the independent drive mechanism shown inFIG. 4C according to one embodiment. Aforce 431 may be directed away from theoutput shaft 405 coupled to themotor mount 433 of themotor 437. Accordingly, theforce 431 results in horizontal displacement of themotor mount 433. Further, thestrain gauge 434 horizontally coupled to themotor mount 433 experiences strain in the direction of theforce 431. The strain may be measured as a ratio of the horizontal displacement of thetip 435 ofstrain gauge 434 to the overallhorizontal width 436 of thestrain gauge 434. - In some embodiments, the
IDM 117 includes additional sensors, e.g., inclinometers or accelerometers, to determine an orientation of theIDM 117. Based on measurements from the additional sensors and/or thestrain gauge 434, the surgicalrobotic system 100 can calibrate readings from thestrain gauge 434 to account for gravitational load effects. For example, if theIDM 117 is oriented on a horizontal side of theIDM 117, the weight of certain components of theIDM 117 may cause a strain on themotor mount 433. Accordingly, without accounting for gravitational load effects, thestrain gauge 434 may measure strain that did not result from strain on the output shafts. - As the endoscope moves, the movement is reflected in changes from one image to the next. These changes may be detected using optical flow techniques that register one image to another, from which a movement may be estimated.
-
FIG. 5A is a flowchart of a process for determining movements of an endoscope from a sequence of recorded images according to one embodiment. Theprocess 500 may include different or additional steps than those described in conjunction withFIG. 5A in some embodiments, or perform steps in different orders than the order described in conjunction withFIG. 5A . - The
image registration module 130 of the surgicalrobotic system 100 shown inFIG. 1 determines movement of an endoscope tip based on changes in properties of a sample of images (e.g., grayscale or color) captured by an image sensor coupled to the endoscope tip, e.g., theimaging device 349 ofendoscope 118 shown inFIG. 3C . Because the image sensor is coupled to theendoscope 118, theimage registration module 130 assumes that changes between a pair of images of the sample are due to a shift in perspective of the image sensor corresponding to a movement of the endoscope tip, e.g., translation, rotation, and/or scaling in a pitch or yaw axis. - The
image registration module 130 can filter the sample of images, for example, by removing every other image of the sample to help reduce the time required to process the sample. In some embodiments, theimage registration module 130 extracts the sample of images from a video captured by the image sensor. Image registration does not require the source and target images to be subsequent frames of the camera. However, the accuracy of the motion estimated by image registration tends to be greater as the time period between images decreases. Thus, theimage registration module 130 generates more accurate motion estimates (e.g., nearly continuous measurement of parameters associated with movement of the endoscope) by registering many images in sequence. - To determine translation movement, the
image registration module 130 receives 510 a sample of images and analyzes pairs of images of the sample using an optical flow technique. In a pair of images, the image that occurs first is referred to as the source image and the image that occurs second is referred to as the target image. The order of the first and second images is arbitrary. Thus, the direction of translation (e.g., moving forward or backward in time) is determined based on which image is considered the source and which image is considered the target. In one embodiment, each image is a two-dimensional pixel array of N pixel values corresponding to light intensities (e.g., for grayscale images), vectors representing intensities of different colors of light (e.g., for color images), etc. The image registration module 130 can transform the two-dimensional pixel array into a corresponding one-dimensional array with N elements for processing. - The
image registration module 130 generates 520 a difference array D and generates 530 a gradient array G based on the pair of images. In some embodiments, theimage registration module 130 generates a difference array and gradient array for each pair of images of the sample. The difference array D is based on the difference between a pixel value of the target image and a corresponding pixel value of the source image. The gradient array G is based on a weighted average of the rate of change (e.g., derivative) of a pixel value of the target image and the rate of change of a corresponding pixel value of the source image. In embodiments with a two-dimensional (e.g., x and y dimensions) pixel array, the rate of change of a pixel in the x-dimension Gx is based on the difference between the pixel and each of two or more adjacent pixels in the x-direction. Similarly, the rate of change of the pixel in the y-dimension Gy is based on the difference between the pixel and each of two or more adjacent pixels in the y-direction. The gradient array may be a weighted average of the rates of change in the x and y dimensions, e.g., equally weighted. Theimage registration module 130 can decompose the 2D gradient array into two sub-arrays, Gx and Gy, corresponding to partial derivatives in the x and y directions, respectively. Accordingly, theimage registration module 130 represents G as an N×2 matrix: G=(Gx Gy), where Gx and Gy each include N components. - The
image registration module 130 determines a motion of the endoscope based on the difference array D and the gradient array G. The motion can be represented by a vector p. The vector p often comprises a set of model parameters, and the identities of these parameters may be varied in order to detect different properties of motion. In general, p may be modeled as satisfying a linear equation of the form Ap = v, wherein A is a matrix determined by G and the form of p, and v is a vector corresponding to D. The value of p in the above equation may be solved by methods such as least-squares fitting, in which p may be estimated as p = (A^T A)^(-1) A^T v, where A^T represents the transpose of A and (A^T A)^(-1) represents the inverse of the product of A^T with A. The solved p represents a motion (e.g., translation, rotation) of the endoscope. The image registration module 130 can repeat the steps 520-540 of the process 500 for multiple pairs of images of the sample. Thus, the image registration module 130 generates a set of motion vectors corresponding to each processed pair of images. -
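By way of illustration only, the least-squares step above can be sketched as follows. This is not part of the disclosed embodiments: it assumes grayscale frames stored as NumPy arrays and uses a translation-only motion model p = [dx, dy]; other parameterizations of p described above would change only how the matrix A is formed.

    # Hedged sketch of the registration step, assuming grayscale frames as
    # 2-D NumPy arrays and a translation-only motion model p = [dx, dy].
    import numpy as np

    def estimate_translation(source, target):
        src = source.astype(float)
        tgt = target.astype(float)

        # Difference array D: per-pixel difference between target and source.
        D = (tgt - src).ravel()

        # Gradient array G: equally weighted average of the spatial derivatives
        # of the source and target images in the x and y directions.
        gy_s, gx_s = np.gradient(src)
        gy_t, gx_t = np.gradient(tgt)
        Gx = 0.5 * (gx_s + gx_t)
        Gy = 0.5 * (gy_s + gy_t)

        # For pure translation, A = (Gx Gy) and v = D, so that A p = v.
        A = np.column_stack((Gx.ravel(), Gy.ravel()))
        v = D

        # Least-squares estimate p = (A^T A)^(-1) A^T v.
        p, *_ = np.linalg.lstsq(A, v, rcond=None)
        return p  # [dx, dy] in pixels

Applying such a routine to many consecutive frame pairs yields the set of motion vectors referred to above.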
FIG. 5B is a diagram of an electromagnetic tracking system according to one embodiment. The spatial sensor 550 coupled to the tip of the endoscope 118 is an EM sensor 550 that detects an electromagnetic field (EMF) generated by one or more EMF generators 600 in proximity to the endoscope 118. The strength of the detected EMF is a function of the position and/or orientation of the endoscope 118. In one embodiment, a number of EMF generators 600 are located externally to a patient. The EMF generators 600 emit EM fields that are picked up by the EM sensor 550. The different EMF generators 600 may be modulated in a number of different ways so that when their emitted fields are captured by the EM sensor 550 and are processed by the controller 120 (or any computer system external to the surgical robotic system 100), their signals are separable. Further, the EMF generators 600 may be oriented relative to each other in Cartesian space at non-zero, non-orthogonal angles so that changes in orientation of the EM sensor 550 will result in the EM sensor 550 receiving at least some signal from at least one of the EMF generators 600 at any instant in time. - The
controller 120 registers EM data captured by theEM sensor 550 to an image of the patient captured with a different technique other than EM (or whatever mechanism is used to capture the alignment sensor's data), such as a computed tomography (CT) scan, to establish a reference frame for the EM data. In some embodiments, the distal end of the endoscope may be tracked by EM sensors located in the tip. The relative location within the patient may be determined by comparing a pre-operative model generated from CT data to the absolute location measured by the EM tracking system. - For example, before registering EM data with a 3D model generated from the CT data, data points derived from the EM data are initially located far from the position of the endoscope tip moving along a planned navigation path expected from the 3D model. This position difference between the EM data and the 3D model reflects the lack of registration between the EM coordinates and the 3D model coordinates. The
controller 120 may determine and adjust the points on the 3D model based on correlation between the 3D model itself, image data received from the imaging device (e.g., cameras) on the tip and robot data from robot commands (e.g., provided to the robotic arms of the surgical robotic system 100). Thecontroller 120 uses the 3D transformation between these points and collected EM data points to determine the initial registration of the EM coordinate system to the 3D model coordinate system. After registering EM data with the 3D model, the data points derived from EM data fall along the planned navigation path derived from the 3D model, and each data point among the data points reflects a measurement of the position of endoscope tip in the coordinate system of the 3D model. -
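As an illustrative sketch only (the disclosure does not prescribe a particular algorithm), one common way to compute a rigid 3D transformation between collected EM data points and corresponding 3D model points is a Procrustes/Kabsch fit. Point correspondences are assumed to be available, e.g., from points along the planned navigation path.

    # Illustrative sketch: rigid registration of EM points to 3-D model points.
    import numpy as np

    def register_points(em_pts, model_pts):
        """em_pts, model_pts: (N, 3) arrays of corresponding points.
        Returns R (3x3) and t (3,) such that model ~= R @ em + t."""
        em_c = em_pts.mean(axis=0)
        mdl_c = model_pts.mean(axis=0)
        H = (em_pts - em_c).T @ (model_pts - mdl_c)
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = mdl_c - R @ em_c
        return R, t

Once R and t are estimated, subsequent EM measurements can be mapped into the coordinate system of the 3D model so that they fall along the planned navigation path as described above.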
FIGS. 6A-C illustrate example surgical procedures using an endoscope, e.g.,endoscope 118 shown inFIG. 3A .FIG. 6A illustrates the distal end of theendoscope 118 within ananatomical lumen 602 according to one embodiment. Theendoscope 118 includes asheath 311 and navigates through theanatomical lumen 602 inside a patient toward anoperative site 603 for a surgical procedure. -
FIG. 6B illustrates theendoscope 118 shown inFIG. 6A in use at theoperative site 603 according to one embodiment. After reaching theoperative site 603, theendoscope 118 extends adistal leader section 317, longitudinally aligned with thesheath 311, in the direction marked byarrow 605. The endoscope can also articulate thedistal leader section 317 to direct surgical tools toward theoperative site 603. -
FIG. 6C illustrates theendoscope 118 shown inFIG. 6B with an aspiration needle 1007 according to one embodiment. In cases where theoperative site 603 includes a lesion for biopsy, thedistal leader section 317 articulates in the direction marked byarrow 606 to convey the aspiration needle 1007 to target the lesion. - In some embodiments, the
distal leader section 317 is integrated with the sheath 311 (not shown inFIG. 6 ). Thedistal leader section 317 navigates with thesheath 311 through theanatomical lumen 602 inside a patient toward anoperative site 603 for a surgical procedure. After reaching theoperative site 603, surgical tools can be directed to theoperative site 603 via thedistal leader section 317. - In some embodiments, the
distal leader section 317 can be deployed through a working channel that is off-axis (neutral axis) of thesheath 311, which allows thedistal leader section 317 to operate without obscuring an image sensor (not shown inFIG. 6 ) coupled to the end of the sheath 311 (or any other location of the endoscope 118). This arrangement allows the image sensor to capture images inside the anatomical lumen while theendoscope 118 articulates thedistal leader section 317 and keeps thesheath 311 stationary. - The construction, composition, capabilities, and use of
distal leader section 317, which may also be referred to as a flexure section, are disclosed in U.S. patent application Ser. No. 14/201,610, filed Mar. 7, 2014, and U.S. patent application Ser. No. 14/479,095, filed Sep. 5, 2014, the entire contents of which are incorporated by reference. - As introduced above, endolumenal buckling is a phenomenon whereby a flexible instrument (e.g., endoscope) navigated within anatomical lumens towards an operative site or a surgical site prolapses in an undesired direction within the anatomical lumen in response to an insertion force.
-
FIGS. 7A and 7B illustrate an example of endolumenal buckling occurring when an endoscope is inserted into a patient's lung 700 to an operative site 710. The endoscope 118 is inserted into a patient's mouth, down the patient's trachea, and into the patient's lung 700. As shown in FIG. 7A, the endoscope bends normally towards the operative site 710 located in a left upper lobe of the lung 700. The sheath 740 of the endoscope is navigated to the left main bronchus first, and then the leader 730 navigates through tertiary bronchi towards the operative site 710. As shown in FIG. 7B, as the leader 730 is navigating towards the operative site 710, a distal leader section of the leader 730 gets stuck or blocked and therefore does not move forward. As more insertion force is applied, a portion of the endoscope buckles 720 rather than advancing the leader further. - Improper placement of the
sheath 740 relative to theoperative site 710 may also result in undesirable buckling of the endoscope. For example, if thesheath 740 is inserted and advanced only to the trachea, theleader 730 will not be supported when attempting to insert into the upper lobe of patient'slung 700 in order to reach theoperative site 710. In this example, the insertion force on thesheath 740 is directed “downward”, i.e., towards the lower lobes of the patient'slung 700, in the opposite direction of the upper lobes, where theoperative site 710 is located. In contrast, when thesheath 740 is positioned deeper into the lung, i.e, closer to the operative site, so thesheath 740 is directed in a more “upward” position, or at least a more “neutral” position, the insertion force vector on theleader 730 is may be more aligned with the direction of theoperative site 710. In the latter example, greater insertion may be achieved with lower amounts of insertion force applied to thesheath 740, in addition to a reduction in prolapsing or buckling by theleader 730. - II.A. Detecting Endolumenal Buckling within a Patient Lumen
- Endolumenal buckling may occur in a variety of ways. For example, the tip of the leader of the endoscope may become stuck or nearly stuck, and a portion of the leader or sheath may bends with a great amount of curvature as the endoscope is further inserted into the patient. The bucked portion stores potential energy and generates an opposing force that attempts to push the endoscope backward.
- Accordingly, there are a number of regions of interest where it may be advantageous to place sensors to detect buckling. As an example, three main regions of arbitrary “size” can be defined. A first region may cover the volume near the tip of the leader. A second region covers a portion of the leader in a range from an end of the sheath within the patient to the edge of the first region. A third region may cover the end of the sheath where the leader extends from as well as the portion of the sheath proximal to its end (also referred to as the distal sheath section).
- For each sensor region, one or more sensors can be placed in any one of several locations. Examples of sensor locations include outer surface of the sheath or the leader, walls of the sheath or the leader, inner surface of sheath's lumen, inner surface of conduits of the leader or the sheath, one or more locations on pull wires of the leader or the sheath, another suitable location within the sensor region to place sensors, or some combination thereof.
-
FIGS. 8A-B illustrate examples of sensor regions used to place sensors according to one embodiment.FIG. 8A shows theleader 730 bends normally towards theoperative site 710 at time T=T 1 860A, andFIG. 8B shows theleader 730 buckles when theleader 730 is inserted more at time T=T 2 860B.T 1 860A andT 2 860B are consecutive, or are separated with a time interval. As shown inFIGS. 8A and 8B , a region of interest (ROI) 810 is selected and zoomed in. TheROI 810 includes theleader 730 and a portion of thesheath 740. The zoomed-in ROIs without lung structures are shown at bottom ofFIG. 8A andFIG. 8B , respectively.Sensor region A 820 includes the tip of theleader 730 and a small portion proximal to the tip. Thesensor region B 830 covers a portion of theleader 730 in the range from the end of thesheath 740 within the patient to the tip of theleader 730. Thesensor region C 840 includes the end of the sheath and a small portion of the distal sheath section. - One or more different types of sensors can be placed in each sensor region. For example, one or more position sensors, one or more force sensors, one or more shape sensors or some combination thereof can be placed in each sensor region. Examples of types of sensors include a position sensor (e.g., EM sensor, optical sensor, accelerometer, gyroscope, magnetometer, another suitable type of sensor that detects motion, or some combination thereof), a force sensor (e.g., resistance sensor, pressure sensor, strain gauge, torque sensor, friction sensor, another suitable type of sensor that detects various types of forces, or some combination thereof), an image sensor (e.g., CCD, CMOS, NMOS, another suitable type of sensor that detects and conveys the information that constitutes an image, or some combination thereof), a shape sensor (e.g., optical fiber shape sensor, another suitable type of sensor that detects boundary, outline or surface of an object, or some combination thereof).
- Sensor data captured from one or more sensor regions can be compared with expected data (also referred to as historical data or reference data) to determine if buckling has occurred. The expected data describes data associated with various characteristics caused by a motion of the endoscope during a navigation. Examples of the expected data include data associated with various expected statuses caused by the motion of the endoscope, sensor data captured from one or more different sensor regions, different types of sensor data captured from the same sensor region, different types of sensor data captured from one or more different sensor regions, or some combination thereof. More specifically, expected data includes data associated with various possible states/statuses caused by the motion of the endoscope. Examples of expected statuses include expected position of the tip or distal end of the sheath, expected position of a portion of the leader or sheath, expected bending shape of the leader or sheath, expected force generated by the expected bending of the leader or sheath, expected force detected by the tip of the leader or sheath, or any other measurable or derivable quantity relating to the state of the endoscope which may include, but is not limited to, shape, distance, length, slope, gradient, curvature, angle, etc., or some combination thereof.
- The sensor data (also referred to measured data) collected from the sensors in the instrument during operation indicates a measured status based on an actual motion of the corresponding sensor regions where those sensors are placed. Examples of the measured statuses include a similar list of statuses as the list of expected statuses provided in the immediately previous paragraph. For example, sensor data collected from an imaging device on the tip (also referred to as optical flow data), or sensor data collected from an EM sensor located on the tip both can indicates a measured state (e.g., a position of the tip). In some embodiments, by comparing “endo view” with the sensor data, the surgical
robotic system 100 determines a measured status indicating a relative location of the tip within the patient. When the measured status indicated by the sensor data does not match or correlate to the expected status indicated by the expected data, thesurgical robotics system 100 determines that endolumenal buckling has occurred. Examples are further described in Section II.A.1. - Sensor data captured from one or more sensor regions can be compared with sensor data from the same and/or different sensor regions to determine if endolumenal buckling has occurred. For example, if sensor data captured from the one or more sensor regions indicates that the corresponding sensor regions of the endoscope have undergone a first status change (e.g., a status change indicating a force change in the first region), and sensor data from a different sensor region, or a different type of sensor data from the same sensor region indicates that the corresponding sensor region or sensor types has undergone a second status change (e.g., a status change indicating a force change in the third region, or a status change indicating that the tip has not moved in the first region), the
surgical robotics system 100 determines that endolumenal buckling has occurred. Examples are further described in Section II.A.2. - Generally, a status change indicates that some quantity measureable or derivable from the sensor data, which may include measured and expected sensor data, has changed one of more or less than a threshold, often measured over some period of time (e.g., T1 and T2). There are a number of different types of status changes.
- A first type of status change is a position change of some portion of the endoscope being less than a position threshold, representing a range of motion where the portion of the endoscope has not moved an appreciable distance, generally in response to an endoscope insertion command. A first example of the first type status change is where the tip of the leader or the end of the sheath within the patient has not moved or has moved less than a threshold amount in response to the command. For example, when an endoscope enters into an organ with a complex tubular network (e.g., a tubular network with variable bending, or with variable diameter), a certain insertion force is applied to the endoscope in order to move the endoscope to a target location. If the status change indicates that the tip of the leader or the end of the sheath within the patient has moved less than a threshold amount in response to the command, the
surgical robotics system 100 may determine that endolumenal buckling has occurred based or this status change alone, or in combination with other types of status change, as further described in Section II.A.2. A second example is where a portion of the leader or a portion of the sheath does not move to an expected position, in response to the command. A third example is where a portion of the sheath (e.g., the end of sheath, a distal sheath section) has been retracted in response to the command. - A second type of status change is a force change above a threshold in response to a command that is detected at the tip of the leader, a portion of the distal leader section, the end of sheath, a portion of the distal sheath section.
- A third type of status change identifies an unwanted motion, generally bending, along the leader or the sheath, generally in response to an endoscope insertion command. One example of the third type status change include a bending change (e.g., a slope change, a gradient change, a curvature change, etc.) among two or more points along the leader or the sheath equals or exceeds a bending threshold, representing a situation where the leader or the sheath has appreciably bent in an unexpected manner in response to the command. Another example of the third type status change include a distance change between two points along the leader or the sheath less than a distance threshold, representing a situation where the distance between the two points has been shortened unexpectedly, in response to the command. Another example of the third type of status change occurs in instances such as when navigating the endoscope through a turn in the patient's endolumenal network is such that bending is expected but where that bending does not occur along the section of the endoscope where it is expected to occur. Thus, a lack of a bending change as measured by sensors along some points of the endoscope may suggest that bending has instead occurred elsewhere along the endoscope.
- Although the above description describes the sensors as being associated with regions, this region association does not need to be explicitly made use of in the data processing system that uses the sensor data to determine whether buckling has occurred. In such an implementation, assignment of sensors to regions merely serves as a convenient way for distinguishing different sensors placed within the instrument, and in practice other differentiable characteristics may be used such as position along the sheath or leader, etc.
-
FIGS. 9A-9L illustrate examples of endolumenal buckling detection based on a comparison between measured status and expected status according to one embodiment. As discussed above, one or more different types of sensors can be placed in the same sensor region to detect endolumenal buckling. As shown inFIGS. 9A-9B , a sensor A, such as position or force sensor, is placed in the first sensor region (e.g., tip of the endoscope).FIGS. 9A-9B show a measured position A 915A and an expected position A 915B indicated by thesensor A 910. For example, in response to an insertion command to move the endoscope to an expected position A 915B, the endoscope is inserted to a measured position A 915A. Compared with the expected position A shown inFIG. 9B , the measured position A shown inFIG. 9A is still or has moved only slightly, thereby indicating that buckling has occurred. Similarly, a measured force inFIG. 9A (e.g., a friction force generated between the tip and the lung structure) may be greater than the expected force inFIG. 9B based on the command input, thereby indicating that buckling has occurred. - As shown in
FIGS. 9C-9D , a sensor C and a sensor D are placed in the second sensor region (e.g., a portion of the leader). In a first embodiment, both sensors C and D are position sensors. InFIG. 9C , in response to a command to move the second region to an expected positions C and D, the sensor C detects a measured position C and the sensor D detects a measured position D. The measured position C and measured position D are compared with the expected position C and the expected position D. The comparison indicates whether the measured positions (based on the raw data or some derivation thereof such as the distance between them) deviate from the expected positions more than a threshold (not matching) or less than a threshold (matching). If measured and expected match, the surgical robotics system determines that buckling has not occurred, and that it has occurred if they do not. Examples of derived parameters used for detecting buckling include a slope, a distance, curvature, a gradient, another suitable parameter derived from the two positions, or some combination thereof. - In a second embodiment, sensors C and D are force sensors. In response to a command to insert the endoscope having an expected forces A and B in the second region, the sensor C detects a measured force A (e.g., a first torque) and the sensor D detects a measured force B (e.g., a first torque) in
FIG. 9C . The measured force A and measured force B are compared with the expected force A and the expected force B. The comparison indicates whether the measured forces (based on the raw data or some derivation thereof) deviate from the expected forces more than a threshold (not matching) or less than a threshold (matching). If the measured and the expected match forces, the surgicalrobotic system 100 determines that buckling has not occurred, and that it has occurred if they do not. - In a third embodiment, the sensor C and the sensor D have different sensor types. For example, the sensor C is a position sensor and the sensor D is a force sensor. In response to a command to insert the endoscope having an expected position C and an expected force B in the second region, the sensor C detects a measured position C and the sensor D detects a measured force B. The measured position C is compared with the expected position C and the measured force B is compared with the expected force B. The comparisons indicate whether the measured position C deviates from expected position C more than a threshold (not matching) or less than a threshold (matching), and whether the measured force B deviates from the expected force B more than a threshold (not matching), or less than a threshold (matching). If the measured and the expected match, the surgical robotic system determines that buckling has not occurred, and that it has occurred if they do not
- As shown in
FIGS. 9E-9F , a sensor B is placed in the third sensor region (e.g., a portion of the distal sheath section). In response to a command to move the endoscope to an expected position E in the third region, the measured position E is compared with the expected position E shown inFIG. 9F . The measured position E shown inFIG. 9E has moved backward 960 indicating that the measured position E does not match the expected position E, the surgical robotic system determines buckling has occurred. The sensor B can also be a force sensor. For example, in response to a command to move the endoscope, the endoscope has an expected force C in the third region. The sensor B detects a measured force C (e.g., a friction between the third sensor region and the leader), and the measured force C is compared with the expected force C. The measured force is greater than the expected force C inFIG. 9F indicating that the measured force C does not match the expected C, the surgical robotic system determines that buckling has occurred. - The example embodiments illustrated in this section may be variously combined with each other to provide other possible sensor setups for an endoscope, as well as buckling detection processes that use the detection of status changes in more than region at a time to identify or verify that buckling has occurred. For example, expected vs. measured data from sensor A in the first sensor region A can be combined with expected vs. measured data from sensor B in the third sensor region as shown in
FIGS. 9G-H . Similar toFIGS. 9C-9D , the sensor C and the sensor D can have the same or different sensor types. - The shape of the leader (or sheath) can be detected using multiple position sensors as shown in
FIGS. 9I-9J or by a shape sensing optical fiber as shown inFIGS. 9K-9L . A shape sensing optical fiber may include a segment of a fiber Bragg grating (FBG). The FBG reflects certain wavelengths of light, while transmitting other wavelengths. The surgical robotics system generates reflection spectrum data based on the wavelengths of light reflected by the FBG. The system can analyze the reflection spectrum data to generate position and orientation data of the endoscope in two or three dimensional space. In particular, as the endoscope bends, the shape sensing optical fiber embedded inside also bends. The specific wavelengths of light reflected by the FBG changes based on the shape of the shape sensing optical fiber (e.g., a “straight” endoscope is in a different shape than a “curved” endoscope). Thus, the system can determine, for example, how many degrees the endoscope has bent in one or more directions (e.g., in response to commands from the surgical robotic system) by identifying differences in the reflection spectrum data. - Endolumenal bucking is detected based on a comparison between the measured shape and the expected shape as provided by the shape sensing optical sensor or the discrete sensors. A function can be used to estimate the shape of the leader (or sheath), e.g., linear (e.g., polynomial interpolation) or non-linear interpolations (e.g., spline interpolation), curve fitting based on one more fitting functions, linear or non-linear regression analysis, or some combination thereof.
- As shown in
FIGS. 9K-9L , a shape sensingoptical fiber 950 is placed along the leader (or sheath, not shown). For example, the shape sensing sensor can be placed in conduits with the pull wires inside the length of walls of the leader (or the sheath). The shape sensing sensor can be placed in the outside of conduits but inside the length of walls of the leader (or the sheath). -
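As a hedged illustration of the shape comparison above, the following sketch assumes several discrete position samples along the leader (whether from position sensors or from a sampled fiber shape), fits a spline through them, and compares total curvature against the expected shape. The SciPy spline is only one of the fitting choices listed above, and the tolerance value is a hypothetical threshold.

    # Hedged sketch: shape-based buckling check from ordered 3-D points.
    import numpy as np
    from scipy.interpolate import splprep, splev

    def total_curvature(points_3d, samples=200):
        """points_3d: (N, 3) ordered positions along the leader, N >= 4."""
        tck, _ = splprep(points_3d.T, s=0)
        u = np.linspace(0.0, 1.0, samples)
        d1 = np.array(splev(u, tck, der=1)).T   # first derivative along the curve
        d2 = np.array(splev(u, tck, der=2)).T   # second derivative along the curve
        kappa = np.linalg.norm(np.cross(d1, d2), axis=1) / np.linalg.norm(d1, axis=1) ** 3
        return float(np.trapz(kappa, u))

    def shape_indicates_buckling(measured_pts, expected_pts, tolerance=0.5):
        # Tolerance is a hypothetical threshold on the curvature difference.
        return abs(total_curvature(measured_pts) - total_curvature(expected_pts)) > tolerance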
FIG. 10 is a flowchart of ageneral process 1000 for detecting endolumenal buckling based on a comparison between measured status and expected status according to one embodiment. A controller of a surgical robotics system, for example, thecontroller 120 of thesurgical robotics system 100 shown inFIG. 1 , uses theprocess 1000 to detect endolumenal buckling. Theprocess 1000 may include different or additional steps than those described in conjunction withFIG. 10 in some embodiments, or perform steps in different orders than the order described in conjunction withFIG. 10 . - The
controller 120 receives 1010 sensor data generated from a first sensor placed in a portion of the endoscope located within a patient lumen, and the sensor data indicates a measured status based on an actual motion of the portion of the endoscope. The portion of the endoscope can be the three sensor regions mentioned above as shown inFIGS. 8A-8B . Examples are described inFIGS. 9A-9L . Thecontroller 120 receives 1020 expected data describing data associated with an expected status caused by an expected motion of the endoscope. In some embodiments, the expected data is robotic command data generated from an instrument device manipulator (IDM) physically coupled to the endo scope, where the robotic command data is configured to control the IDM to cause the portion of the endoscope to move within the patient towards an expected position. The robotic command data indicates the expected status based on the expected motion. Thecontroller 130 compares 1030 the measured status with the expected status. Responsive to the measured status deviating from the expected status more or less than an associated threshold, thecontroller 130 determines 1040 that the endoscope has buckled. In some embodiments, the threshold indicates a match between the measured status and the expected status. - In the prior section, buckling was described as being detected based on a difference between expected vs. measured behavior. This section describes how buckling can be detected on a change in endoscope state between two points in time, generally during the carrying out of a motion command by the endoscope (e.g., insertion).
-
FIGS. 11A-11H illustrate examples of endolumenal buckling detection based on before and after (or during) a command, according to one embodiment. Status change detection for each sensor region is similar to the examples described inFIGS. 9A-9H , with the exception that instead of using expected data and measured data to detect status change, measured data at two different points in time is used instead. - As a first example, as shown in
FIGS. 11A-B , asensor A 1125 is placed in a sensor region A 1120 (e.g., tip of the endoscope). At T=T1, thesensor A 1125 detects a measured status A (e.g., a position A, or a force A depending on sensor type of sensor A). At T=T2, thesensor A 1125 detects a measured status B (e.g., a position B, or a force B). If the measured status at T1 and T2 triggers one of the thresholds of one of the status changes (e.g., increase in force, insufficient change of position) for sensor A located near the tip, the system determines that buckling has occurred. - Although a status change can be sufficient to detect buckling, in some instances the identification of two or more status changes helps determine or verify that buckling has occurred. These detected status changes may originate from different sensors of the same or different type in the same or different regions. For example, if another sensor with different type (e.g., a force sensor) is placed in the
sensor region A 1120, if that other sensor also detects a corresponding status change, then it may be better determined or verified that buckling has occurred. - Similarly, one or more sensors, of the same sensor type, or of different sensor types can be placed in more than one sensor region to evaluate if the endoscope has undergone corresponding status changes associated with respective sensor region. By combining at least two status changes detected from different regions based on measured data at two different points in time, the system will have a better ability to detect buckling as it occurs.
FIGS. 11C-11H illustrate examples of two status changes being detected in two different regions. Examples include various combinations of sensors in region A, B, and C.FIGS. 11C and 11D illustrate detecting buckling based on status changes in regions A and B.FIGS. 11E and 11F illustrate detecting buckling based on status changes in regions A and C, andFIGS. 11G and 11H illustrate detecting buckling based on status changed in regions B and C. Although not shown, buckling may be detected based on status changes in all three regions. -
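A minimal sketch of combining two measured status changes, for example from region A (tip) and region C (distal sheath section) at times T1 and T2, is given below. The quantities and thresholds are placeholders; any of the sensor types described above could supply them, and the check mirrors the flow formalized in FIG. 12.

    # Minimal sketch: buckling inferred from two status changes in two regions.
    def status_change(value_t1, value_t2, threshold):
        """True if the quantity changed by more than the threshold between T1 and T2."""
        return abs(value_t2 - value_t1) > threshold

    def buckling_from_two_regions(tip_position_t1, tip_position_t2,
                                  sheath_force_t1, sheath_force_t2,
                                  position_threshold_mm=2.0, force_threshold_n=1.5):
        # Region A: the tip has not advanced appreciably despite the command,
        # while region C sees a force rise above its threshold.
        tip_stalled = not status_change(tip_position_t1, tip_position_t2, position_threshold_mm)
        force_rise = status_change(sheath_force_t1, sheath_force_t2, force_threshold_n)
        return tip_stalled and force_rise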
FIG. 12 is a flowchart of aprocess 1200 for detecting endolumenal buckling based on status changes indicated by sensor data according to one example embodiment. Theprocess 1200 may include different or additional steps than those described in conjunction withFIG. 12 in some embodiments, or perform steps in different orders than the order described in conjunction withFIG. 12 . - A
controller 120 of a surgical robotics system receives 1210 first sensor data generated from a first sensor placed in a portion of the endoscope located within a patient lumen, the first sensor data indicating motion of the portion of the endoscope. In some embodiments, the first sensor is located in one of the three sensor regions (e.g., sensor regions A-C). For example, the first sensor is located in the sensor region C. Examples of the first sensor include a position sensor (e.g., EM sensor), an image sensor, a force sensor, or a resistance sensor. - The
controller 120 receives 1220 second sensor data generated from a second sensor located at a distal tip of the endoscope, the second sensor data indicating motion of the distal tip of the endoscope. In some embodiments, the second sensor is an imaging device mounted on the distal tip (e.g., the imaging device 349 on the endoscope 118 in FIG. 3C). The second sensor data (also referred to as optical flow data) comprises images captured by the imaging device. As described in Section I.C.2., the second sensor data is used to estimate motion of the endoscope based on changes between a pair of images. - The
controller 120 evaluates 1230 the first sensor data to determine whether the portion of the endoscope has undergone a first status change (e.g., any type of status change mentioned above). Thecontroller 120 evaluates 1240 the second sensor data to determine whether the distal tip of the endoscope has undergone a second status change (e.g., the tip does not move). Responsive to determining that the first sensor data indicates that the distal portion of the endoscope has had the first status change and that the second sensor data indicates that the distal tip of the endoscope has had the second status change, thecontroller 120 determines 1250 the endoscope has buckled. - Buckling of the endoscope may occur outside a patient. For example, a buckling may occur along a proximal leader section between the leader base and sheath base.
FIGS. 13A-13F are examples of detecting buckling of an endoscope outside a patient according to one embodiment. As shown inFIG. 13A ,sensors 1340 are placed on bothleader base 1310 andsheath base 1320. Two sensors constitute a transmitter-receiver pair. For example, the transmitter transmits a light beam 1345 of infrared light or visible light, and the receiver coaxial with the transmitter or adjacent to the transmitter detects the light beam 1345. Thetransmitter 1340 is placed opposite to thereceiver 1343 as shown inFIG. 13A , or vice versa. - The
transmitter 1340 is placed around anexit 1315 of the proximal leader section 1330 on theleader base 1310 at adistance 1350 between the transmitter and the exit. The correspondingreceiver 1343 is placed around anentrance 1325 of the proximal leader section 1330 on thesheath base 1320 at the same distance between the receiver and theentrance 1325. Thedistance 1350 is within a threshold, representing a suitable distance range for detecting buckling. When buckling occurs, as shown inFIGS. 13D-13F , a buckled portion of the proximal leader section fully or partially blocks the light beam, and no light signal is detected by the receiver, or the light signal detected by the receiver is reduced accordingly. - The transmitter-receiver pair may be placed on the same side of the proximal leader section, as shown in
FIG. 13C . For example, the transmitter-receiver pair is placed around theexit 1315 and areflector 1360 is placed around theentrance 1325 to reflect a light beam transmitted from the transmitter to the corresponding receiver. As shown inFIG. 13C , thetransmitter 1340 is placed at adistance A 1350 and areceiver 1343 is placed at adistance B 1355. The distances A 1350 andB 1355 are within the threshold for detecting buckling. When buckling occurs, a buckled portion of the proximal leader section fully or partially block the light beam, and no light signal is detected by the receiver, or the light signal detected by the receiver is reduced accordingly. - More than one set of transmitter-receiver pairs may be used to detect buckling at different directions. For example, multiple transmitters are placed around the
exit 1315, at a distance between each transmitter and the exit 1315. The multiple transmitter-receiver pairs may be distributed to generate light beams parallel to each other, or they may be distributed to generate crossed light beams to better cover the cylindrical surface area around the endoscope. In some embodiments, the transmitted light beams are focused light, such as laser beams, though they may also be dispersed in nature and matched with receivers configured to receive the type of light emitted. -
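The optical check itself can be sketched as follows; this is an illustration only. If the intensity at a receiver drops below a fraction of its unobstructed baseline, the beam is treated as blocked by a buckled portion of the proximal leader section. The baseline intensities and the blocking fraction are hypothetical calibration values.

    # Minimal sketch: light-beam blockage check across transmitter-receiver pairs.
    def beam_blocked(received_intensity, baseline_intensity, blocked_fraction=0.5):
        return received_intensity < blocked_fraction * baseline_intensity

    def buckled_outside_patient(pair_readings, baselines):
        """pair_readings / baselines: one intensity per transmitter-receiver pair."""
        return any(beam_blocked(r, b) for r, b in zip(pair_readings, baselines))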
FIG. 14 is a flowchart of aprocess 1400 for detecting buckling outside a patient based using transmitter-receiver pairs according to one embodiment. A controller of a surgical robotics system, for example, thecontroller 120 of thesurgical robotics system 100 shown inFIG. 1 , uses theprocess 1400 to detect buckling. Theprocess 1400 may include different or additional steps than those described in conjunction withFIG. 14 in some embodiments, or perform steps in different orders than the order described in conjunction withFIG. 14 . - The
controller 120 provides 1410 one or more commands from the surgicalrobotic system 100 to one or more actuators, for example, thesheath base 1320 andleader base 1310 shown inFIGS. 13A-13F , to move theendoscope 118 for a surgical procedure. - The
controller 120 receives receiver data generated from at least one transmitter-receiver pair placed along a length of the endoscope outside the patient, the transmitter-receiver pair configured to transmit a light beam from a transmitter to a receiver, the receiver data indicating whether the receiver has had received light beam transmitted from the transmitter. For example, the transmitter is placed on the sheath base and the receiver is placed on the leader base as shown inFIG. 13B andFIGS. 13D-13F . - Responsive to the receiver data indicating that the light from the transmitter has been blocked, the
controller 120 determines that the endoscope has buckled. - Rather than using optical sensors, in an alternate implementation one or more force sensors can be placed in a sensor region around an entrance on a sheath base to detect buckling outside the patient.
FIG. 15 illustrates another example of detecting buckling of an endoscope outside a patient according to one embodiment. As shown in FIG. 15, the sensor region 1540 located around the connection 1525 of the leader base 1520 is in contact with a proximal leader section 1530. When buckling occurs along the proximal leader section, the force between the sensor and the contacted portion of the proximal leader section increases. Sensors may include strain gauges or load cells in rigid connection with the proximal leader section 1530. Examples of strain gauge configurations are described in U.S. application Ser. No. 14/542,403, filed on Nov. 14, 2014, published as U.S. Pat. Pub. No. US 2015/0119638, entitled “INSTRUMENT DEVICE MANIPULATOR WITH TENSION SENSING APPARATUS,” the full disclosure of which is incorporated herein by reference. - The
controller 120 generates feedback for a user indicating that the endoscope has buckled and provides the feedback to users. For example, thecontroller 120 generates a message or a warning indicating that the endoscope has buckled. This message or warning may be provided for display on a graphical user interface (GUI), for example one or more monitors being used by the operator to control the operation. Thecontroller 120 can also generate a recommendation to users. To do this, thecontroller 120 determines one or more modifications to a command to move the endoscope. The modification is based on at least in part on the sensor data. For example, thecontroller 120 may adjust the command to smooth the buckled portion of the endoscope. Examples of command include moving the endoscope backward, adjusting movement of the tip, adjusting insertion force provided by the IDM, another suitable command that adjusts endoscope's movements, stopping movement of the endoscope, or some combination thereof. - Although the above description is generally described with respect to examples that focus on the leader, endolumenal buckling may also occur along the sheath. Similar methods to those described above for the leader can also be applied to the sheath. For example, the first sensor region can be the tip of the endoscope or a small region around the end of the sheath. The second sensor region can be a portion of the sheath. The third sensor region may be omitted, or interpreted as another region along the sheath located further from the sheath tip than the second region.
- As mentioned earlier, a surgical
robotic system 100 uses one or morerobotic arms 102 to control anendoscope 118 in a patient for surgical procedures. The robotic arms apply an insertion force to insert and advance the endoscope to an operative site. As the endoscope is advanced, the force required to further advance the endoscope will change over time depending on a variety of factors including the location of the operative site, the path taken within the patient cavity to get there, the size of the endoscope, etc. Correspondingly, depending at least on the path chosen, the amount of force that may be safely applied without injuring the patient lumen will vary. For example, within a single lung network in a patient, a single force threshold limit that may be set to avoid injury is not applicable for all lobes. Generally the upper lobes need more insertion force than the lower lobes due to bending in the endoscope to enter those areas. As such, a dynamic force insertion threshold is needed to allow operations to be performed safely while still preventing the application of a level of force above that dynamic threshold. - As described herein, the surgical robotics system makes use of an adaptive insertion force threshold to regulate insertion force for different locations within a patient's lumen to avoid unsafe further insertion to the patient. The adaptive insertion force threshold is determined based on endoscopic data and patient data.
- The endoscopic data describes data associated with the endoscope during a navigation. Examples of the endoscopic data include a friction force between a sheath and a leader, a friction force between the sheath and internal anatomy, a friction force between the leader and the internal anatomy, a current location of the endoscope, a target location of the endoscope, insertion length of the sheath, insertion length of the leader, a distance between the sheath and the leader (e.g., a difference between the insertion length of the sheath and the insertion length of the leader, a distance between a distal end of the sheath and the tip of the endoscope), motion of the leader (e.g., translation, rotation, blending, etc.), motion of the sheath (e.g., translation, rotation, blending, etc.), motion of the tip (e.g., translation, rotation, deflection, etc.), a contact interaction between the tip and a portion of a tissue within a patient (e.g., contacting force), force on the leader within the patient, force on the sheath within the patient, force on the tip, another suitable data affecting movements of the endoscope, or some combination thereof.
- The endoscope data can be obtained from one or more sensors placed on the endoscope. For example, a position sensor or an image sensor on the tip of the endoscope can obtain a current location of the endoscope, and motions of the tip. A force sensor on the tip can obtain a contacting force between the tip and a portion of a tissue within a patient, or other types of force between the tip and contacting tissue (e.g., friction, pressure, etc.). One or more sensors of different sensor types (e.g., position sensor, force sensor, shape sensor, etc.) can be placed on a portion of leader or sheath to detect length, motions, or different types of force associated with the leader or the sheath. Examples are described in Section II. above.
- Patient data describes associated with a patient inserted by the endoscope. Examples of patent data include medical data (e.g., medical diagnosis, medical treatment, disease, medical history, other suitable medical data affecting navigation, or some combination thereof), general information (e.g., gender, age, habit, etc.), or some combination thereof. The patient data may be stored in a database included in and accessible by the robotic surgical system.
- As introduced above, the adaptive insertion force threshold is determined by a function associated with the endoscopic data and patient data. In a first embodiment, the adaptive insertion force threshold is determined based on a nonlinear function associated with a relationship among an insertion force threshold, endoscopic data and patient data. By inputting the endoscopic data and patient data, the function generates an insertion force threshold. In a second embodiment, the adaptive insertion force threshold is determined based on optimizing a metric. The metric accounts for an effect of applying an insertion force within a safety range. The safety range describes a range that the insertion force doesn't damage contacting tissues or organs within the patient. For example, an optimization function is used to find a maximum insertion force within the safety range. In a third embodiment, the insertion force threshold is determined based on a machine learning algorithm. For example, by historical endoscope data and patient data regarding prior similar operations may be passed as a training data set into a machine learning model, and various parameters for determining the insertion force threshold is generated. The parameters may be the same parameters as there are types of patient and endoscopic data introduced above, however additional or different parameters may also be used. In some embodiments, patient data can be used as constraints to functions in above embodiments. For example, if a patient has an asthma disease, the walls of airways become inflamed and oversensitive. Consequently, the force insertion threshold may be set to a lower value than it would be for a patient without asthma.
- The insertion force threshold may also be determined based on a look-up table. The look-up table includes data describing a plurality of insertion force thresholds having various characteristics. For example, the look-up table describes a plurality of insertion force thresholds associated with different endoscope's locations of a patient or of a group of patients. The look-up table may be obtained by statistical analysis of various endoscope data and various patient data, machine learning applied to various endoscope data and various patient data, data mining of various endoscope data and various patient data, or by any other suitable method. Various types of look-up tables may be stored by the surgical robotics system in different embodiments. Example types of look-up tables stored by the controller include: a probability distribution of a likelihood of insertion force thresholds relative to different locations of the endoscope, clusters of insertion force thresholds having different characteristics, or other suitable information (e.g., numbers, density, classification). In one example, the look-up table is obtained from application of patients having different characteristics (e.g., gender, age) by one or more robotic surgical systems. The look-up table may identify characteristics of insertion force thresholds obtained from a patient or from a threshold number or percentage of patients. In some embodiments, a look-up table is generated for each patient. Based on patient data and endoscopic data, an insertion force threshold can be determined. In some embodiments, a look-up table is generated for different types of patients.
-
FIGS. 16A-C illustrate examples of adaptive insertion force thresholds used at different locations of an endoscope with different patients according to an embodiment.FIG. 16A shows two examples of inserting an endoscope to an operative site. The first example shows the endoscope is inserted into anoperative site A 1610A located in the left upper lobe oflung 1600. The second example shows the endoscope is inserted into anoperative site B 1610B located in the right lower lobe of thelung 1600. As shown inFIG. 16A , the two examples have different endoscope data. For example, the two examples have different locations of the endoscope, different insertion lengths of the sheath 1630, different lengths of the leader 1620, different distances between the sheath 1630 and the leader 1620, different motions of the endoscope (e.g., theleader 1620A bends more than theleader 1620B), etc. Different endoscope data results in different insertion force thresholds. For example, the first example needs more insertion force to overcome a force (e.g., torque, friction) generated due to bending. Moreover, different patients may have different insertion force thresholds at the same operative site. - As shown in
FIGS. 16B-16C , the insertion force threshold to allow insertion of the endoscope while preventing injury may not be a value that can be precisely determined based on available data. Consequently, the system may instead determine an insertion force threshold with size determined based on any of the techniques described previously. An insertion force threshold region indicates a probability distribution (e.g., a cluster or density) of a likelihood of insertion force threshold being safe (i.e., not harming the patient) relative to a location of the endoscope (e.g., a location proximal to the operative site), or statistical data of insertion force threshold relative to the location of the endoscope. In some embodiments, the insertion force threshold region indicates a plurality of possible insertion force thresholds relative to a plurality of possible locations during a navigation to an operative site. -
FIGS. 16B-16C illustrateregion 1645A from afirst patient 1640 and an insertionforce threshold region 1655A from a second patient 1650, both associated withoperative site A 1610A, and similar insertion 1645B and 1655B for the first and second patients with respect to aforce threshold regions second operative site 1610B. These figures illustrate the possible differences between threshold regions between patients for similar operative sites and procedures, and also the variance between operative sites for similar procedures. - In some embodiments, the surgical robotic system actively determines the insertion force threshold during a navigation. In some embodiments, the insertion force thresholds may be pre-determined and tagged to different portions of a pre-operative model as part of a robotic pre-operative planning stage.
- The surgical robotic system compares the insertion force with the determined insertion force threshold. The insertion force can be detected by one or more force sensors coupled to a robotic arm of the surgical robotic system. When the insertion force comes within a predefined range of the insertion force threshold or reaches the insertion force threshold, the surgical robotics system sends visual and/or audio feedback to a user via the system GUI, for example, a warning indicating that the insertion force is very close to, or has reached, the insertion force threshold. Different colors, such as green, yellow, and red, may be used to indicate the relative distance to the insertion force threshold. In other embodiments, upon reaching the insertion force threshold, the surgical robotic system generates a recommendation to the user. To do this, the surgical robotic system determines one or more modifications to a command to insert the endoscope. The modification is based at least in part on the endoscopic data and the patient data. Examples of commands include ceasing one or more insertion forces applied by the surgical robotics system, reducing the insertion force, another suitable command that adjusts the insertion force, or some combination thereof.
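The compare-and-warn behavior just described can be sketched as follows. The green/yellow/red bands, the 90% warning margin, and the recommendation strings are illustrative assumptions rather than the disclosed implementation.

```python
# Hypothetical sketch of comparing a measured insertion force against the
# adaptive threshold and mapping the result to GUI feedback and a command
# recommendation. Margins, colors, and messages are assumptions.

def assess_insertion_force(force_n: float, threshold_n: float,
                           warn_fraction: float = 0.9):
    """Return (status_color, message, recommended_command)."""
    if force_n >= threshold_n:
        return ("red",
                "Insertion force exceeds the adaptive threshold",
                "cease insertion and reduce applied force")
    if force_n >= warn_fraction * threshold_n:
        return ("yellow",
                "Insertion force is approaching the adaptive threshold",
                "reduce insertion speed")
    return ("green", "Insertion force within safe range", "continue insertion")


print(assess_insertion_force(1.2, 3.0))  # ('green', ...)
print(assess_insertion_force(2.8, 3.0))  # ('yellow', ...)
print(assess_insertion_force(3.2, 3.0))  # ('red', ...)
```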
-
FIG. 17 is a flowchart of a process 1700 for inserting an endoscope using an adaptive insertion force threshold according to one embodiment. A controller of a surgical robotics system, for example, the controller 120 of the surgical robotics system 100 shown in FIG. 1, uses the process 1700 to insert the endoscope using the adaptive insertion force threshold. The process 1700 may include different or additional steps than those described in conjunction with FIG. 17 in some embodiments, or may perform the steps in a different order than the order described in conjunction with FIG. 17. A minimal illustrative sketch of one iteration of this process appears after the step descriptions below. - The
controller 120 receives 1710 endoscopic data from an endoscope of a robotic surgical system, the endoscope data based in part on a current location of the endoscope. For example, the controller 120 can obtain sensor data as endoscopic data from one or more sensors placed on the endoscope (e.g., sheath, leader, or tip). - The
controller 120 accesses 1720 patient data associated with a patient, the patient data based in part on medical data associated with the patient. For example, the controller 120 can access a patient data database stored in the robotic surgical system. The controller 120 can obtain the patient data by accessing one or more external databases via a network. - The
controller 120 determines 1730 an adaptive insertion force threshold based on the endoscopic data and the patient data. For example, the controller 120 determines the adaptive insertion force threshold based on one or more functions or models, a look-up table, or an insertion force threshold region. - The
controller 120 receives 1740 an insertion force detected by one or more force sensors coupled to a robotic arm of the robotic surgical system, the insertion force applied by the arm to the endoscope. For example, one or more force sensors can be placed on one or more arm segments of the robotic arm, one or more joints of the robotic arm, a connection between the robotic arm and an IMD, another suitable location affecting movement of the robotic arm, or some combination thereof. - The
controller 120 compares 1750 the insertion force with the adaptive insertion force threshold. Responsive to the insertion force exceeding the adaptive force threshold, the controller 120 sends 1760 an endoscope command recommendation to the robotic surgical system. For example, if the insertion force exceeds the adaptive force threshold, the controller 120 sends a message or a warning indicating that the insertion force exceeds the insertion force threshold. The controller 120 determines one or more modifications to a command to adjust the insertion force. - Upon reading this disclosure, those of skill in the art will appreciate still additional alternative structural and functional designs through the disclosed principles herein. Thus, while particular embodiments and applications have been illustrated and described, it is to be understood that the disclosed embodiments are not limited to the precise construction and components disclosed herein. Various modifications, changes and variations, which will be apparent to those skilled in the art, may be made in the arrangement, operation and details of the method and apparatus disclosed herein without departing from the spirit and scope defined in the appended claims.
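Taken together, steps 1710 through 1760 can be pictured as a single control-loop iteration, as in the minimal sketch below. The function names, data fields, and the constant placeholder threshold are hypothetical stand-ins for the steps described above, not the controller's actual interfaces.

```python
# Hypothetical end-to-end sketch of one iteration of process 1700. All names,
# data fields, and the constant threshold are illustrative assumptions.

def determine_adaptive_threshold(endoscopic_data: dict, patient_data: dict) -> float:
    """Placeholder for step 1730; a real controller might use a model,
    a look-up table, or an insertion force threshold region as described above."""
    return 3.0  # newtons, illustrative only


def process_1700_iteration(endoscopic_data: dict, patient_data: dict,
                           insertion_force_n: float) -> dict:
    """One pass through steps 1710-1760, given data already read from the
    endoscope sensors, the patient database, and the arm force sensors."""
    threshold_n = determine_adaptive_threshold(endoscopic_data, patient_data)  # 1730
    if insertion_force_n > threshold_n:                                        # 1750
        return {  # 1760: recommendation sent back to the robotic surgical system
            "warning": "insertion force exceeds adaptive threshold",
            "command": "reduce insertion force",
            "threshold_n": threshold_n,
        }
    return {"command": "continue insertion", "threshold_n": threshold_n}


# Illustrative call with made-up sensor readings and patient data.
print(process_1700_iteration(
    endoscopic_data={"location": "main_bronchus", "leader_bend_deg": 40},
    patient_data={"age_group": "adult"},
    insertion_force_n=3.4,
))
```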
- As used herein any reference to “one embodiment” or “an embodiment” means that a particular element, feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. The appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment.
- Some embodiments may be described using the expression “coupled” and “connected” along with their derivatives. For example, some embodiments may be described using the term “coupled” to indicate that two or more elements are in direct physical or electrical contact. The term “coupled,” however, may also mean that two or more elements are not in direct contact with each other, but yet still co-operate or interact with each other. The embodiments are not limited in this context unless otherwise explicitly stated.
- As used herein, the terms “comprises,” “comprising,” “includes,” “including,” “has,” “having” or any other variation thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, article, or apparatus that comprises a list of elements is not necessarily limited to only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Further, unless expressly stated to the contrary, “or” refers to an inclusive or and not to an exclusive or. For example, a condition A or B is satisfied by any one of the following: A is true (or present) and B is false (or not present), A is false (or not present) and B is true (or present), and both A and B are true (or present).
- In addition, use of the “a” or “an” are employed to describe elements and components of the embodiments herein. This is done merely for convenience and to give a general sense of the invention. This description should be read to include one or at least one and the singular also includes the plural unless it is obvious that it is meant otherwise.
Claims (21)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US16/773,740 US20200268459A1 (en) | 2016-12-28 | 2020-01-27 | Flexible instrument insertion using an adaptive insertion force threshold |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US15/392,868 US10543048B2 (en) | 2016-12-28 | 2016-12-28 | Flexible instrument insertion using an adaptive insertion force threshold |
| US16/773,740 US20200268459A1 (en) | 2016-12-28 | 2020-01-27 | Flexible instrument insertion using an adaptive insertion force threshold |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/392,868 Continuation US10543048B2 (en) | 2016-12-28 | 2016-12-28 | Flexible instrument insertion using an adaptive insertion force threshold |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20200268459A1 true US20200268459A1 (en) | 2020-08-27 |
Family
ID=62625238
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/392,868 Active US10543048B2 (en) | 2016-12-28 | 2016-12-28 | Flexible instrument insertion using an adaptive insertion force threshold |
| US16/773,740 Abandoned US20200268459A1 (en) | 2016-12-28 | 2020-01-27 | Flexible instrument insertion using an adaptive insertion force threshold |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US15/392,868 Active US10543048B2 (en) | 2016-12-28 | 2016-12-28 | Flexible instrument insertion using an adaptive insertion force threshold |
Country Status (1)
| Country | Link |
|---|---|
| US (2) | US10543048B2 (en) |
Cited By (121)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10898277B2 (en) | 2018-03-28 | 2021-01-26 | Auris Health, Inc. | Systems and methods for registration of location sensors |
| US10898275B2 (en) | 2018-05-31 | 2021-01-26 | Auris Health, Inc. | Image-based airway analysis and mapping |
| US10898286B2 (en) * | 2018-05-31 | 2021-01-26 | Auris Health, Inc. | Path-based navigation of tubular networks |
| US10903725B2 (en) | 2016-04-29 | 2021-01-26 | Auris Health, Inc. | Compact height torque sensing articulation axis assembly |
| US10932691B2 (en) | 2016-01-26 | 2021-03-02 | Auris Health, Inc. | Surgical tools having electromagnetic tracking components |
| US10932861B2 (en) | 2016-01-14 | 2021-03-02 | Auris Health, Inc. | Electromagnetic tracking surgical system and method of controlling the same |
| US10959792B1 (en) | 2019-09-26 | 2021-03-30 | Auris Health, Inc. | Systems and methods for collision detection and avoidance |
| US11007021B2 (en) | 2013-03-15 | 2021-05-18 | Auris Health, Inc. | User interface for active drive apparatus with finite range of motion |
| US11020016B2 (en) | 2013-05-30 | 2021-06-01 | Auris Health, Inc. | System and method for displaying anatomy and devices on a movable display |
| US11026758B2 (en) | 2017-06-28 | 2021-06-08 | Auris Health, Inc. | Medical robotics systems implementing axis constraints during actuation of one or more motorized joints |
| US11037464B2 (en) | 2016-07-21 | 2021-06-15 | Auris Health, Inc. | System with emulator movement tracking for controlling medical devices |
| US11051681B2 (en) | 2010-06-24 | 2021-07-06 | Auris Health, Inc. | Methods and devices for controlling a shapeable medical device |
| USD924410S1 (en) | 2018-01-17 | 2021-07-06 | Auris Health, Inc. | Instrument tower |
| US11109920B2 (en) | 2018-03-28 | 2021-09-07 | Auris Health, Inc. | Medical instruments with variable bending stiffness profiles |
| US11109928B2 (en) | 2019-06-28 | 2021-09-07 | Auris Health, Inc. | Medical instruments including wrists with hybrid redirect surfaces |
| US11129602B2 (en) | 2013-03-15 | 2021-09-28 | Auris Health, Inc. | Systems and methods for tracking robotically controlled medical instruments |
| USD932628S1 (en) | 2018-01-17 | 2021-10-05 | Auris Health, Inc. | Instrument cart |
| US11141048B2 (en) | 2015-06-26 | 2021-10-12 | Auris Health, Inc. | Automated endoscope calibration |
| US11147633B2 (en) | 2019-08-30 | 2021-10-19 | Auris Health, Inc. | Instrument image reliability systems and methods |
| US11160615B2 (en) | 2017-12-18 | 2021-11-02 | Auris Health, Inc. | Methods and systems for instrument tracking and navigation within luminal networks |
| US11179213B2 (en) | 2018-05-18 | 2021-11-23 | Auris Health, Inc. | Controllers for robotically-enabled teleoperated systems |
| US11179212B2 (en) | 2018-09-26 | 2021-11-23 | Auris Health, Inc. | Articulating medical instruments |
| US11197728B2 (en) | 2018-09-17 | 2021-12-14 | Auris Health, Inc. | Systems and methods for concomitant medical procedures |
| US11202683B2 (en) | 2019-02-22 | 2021-12-21 | Auris Health, Inc. | Surgical platform with motorized arms for adjustable arm supports |
| US11207141B2 (en) | 2019-08-30 | 2021-12-28 | Auris Health, Inc. | Systems and methods for weight-based registration of location sensors |
| US11213363B2 (en) | 2013-03-14 | 2022-01-04 | Auris Health, Inc. | Catheter tension sensing |
| US11213356B2 (en) | 2010-09-17 | 2022-01-04 | Auris Health, Inc. | Systems and methods for positioning an elongate member inside a body |
| US11234780B2 (en) | 2019-09-10 | 2022-02-01 | Auris Health, Inc. | Systems and methods for kinematic optimization with shared robotic degrees-of-freedom |
| US11241203B2 (en) | 2013-03-13 | 2022-02-08 | Auris Health, Inc. | Reducing measurement sensor error |
| US11254009B2 (en) | 2018-12-20 | 2022-02-22 | Auris Health, Inc. | Systems and methods for robotic arm alignment and docking |
| US11278703B2 (en) | 2014-04-21 | 2022-03-22 | Auris Health, Inc. | Devices, systems, and methods for controlling active drive systems |
| US11280690B2 (en) | 2017-10-10 | 2022-03-22 | Auris Health, Inc. | Detection of undesirable forces on a robotic manipulator |
| US11282251B2 (en) | 2018-05-02 | 2022-03-22 | Covidien Lp | System and method for constructing virtual radial ultrasound images from CT data and performing a surgical navigation procedure using virtual ultrasound images |
| US11278357B2 (en) | 2017-06-23 | 2022-03-22 | Auris Health, Inc. | Robotic systems for determining an angular degree of freedom of a medical device in luminal networks |
| US11298195B2 (en) | 2019-12-31 | 2022-04-12 | Auris Health, Inc. | Anatomical feature identification and targeting |
| US11324554B2 (en) | 2016-04-08 | 2022-05-10 | Auris Health, Inc. | Floating electromagnetic field generator system and method of controlling the same |
| US11337602B2 (en) | 2016-12-28 | 2022-05-24 | Auris Health, Inc. | Endolumenal object sizing |
| US11344377B2 (en) | 2014-10-09 | 2022-05-31 | Auris Health, Inc. | Systems and methods for aligning an elongate member with an access site |
| US11350998B2 (en) | 2014-07-01 | 2022-06-07 | Auris Health, Inc. | Medical instrument having translatable spool |
| US11357586B2 (en) | 2020-06-30 | 2022-06-14 | Auris Health, Inc. | Systems and methods for saturated robotic movement |
| US11369386B2 (en) | 2019-06-27 | 2022-06-28 | Auris Health, Inc. | Systems and methods for a medical clip applier |
| US11376085B2 (en) | 2013-03-15 | 2022-07-05 | Auris Health, Inc. | Remote catheter manipulator |
| US11382650B2 (en) | 2015-10-30 | 2022-07-12 | Auris Health, Inc. | Object capture with a basket |
| US11403759B2 (en) | 2015-09-18 | 2022-08-02 | Auris Health, Inc. | Navigation of tubular networks |
| US11399905B2 (en) | 2018-06-28 | 2022-08-02 | Auris Health, Inc. | Medical systems incorporating pulley sharing |
| US11413428B2 (en) | 2013-03-15 | 2022-08-16 | Auris Health, Inc. | Catheter insertion system and method of fabrication |
| US11426095B2 (en) | 2013-03-15 | 2022-08-30 | Auris Health, Inc. | Flexible instrument localization from both remote and elongation sensors |
| US11432892B1 (en) * | 2021-03-02 | 2022-09-06 | Mazor Robotics Ltd. | Systems and methods for cutting an anatomical element |
| US11439419B2 (en) | 2019-12-31 | 2022-09-13 | Auris Health, Inc. | Advanced basket drive mode |
| US11452844B2 (en) | 2013-03-14 | 2022-09-27 | Auris Health, Inc. | Torque-based catheter articulation |
| US11464586B2 (en) | 2009-04-29 | 2022-10-11 | Auris Health, Inc. | Flexible and steerable elongate instruments with shape control and support elements |
| US11464591B2 (en) | 2015-11-30 | 2022-10-11 | Auris Health, Inc. | Robot-assisted driving systems and methods |
| US11472030B2 (en) | 2017-10-05 | 2022-10-18 | Auris Health, Inc. | Robotic system with indication of boundary for robotic arm |
| US11490782B2 (en) | 2017-03-31 | 2022-11-08 | Auris Health, Inc. | Robotic systems for navigation of luminal networks that compensate for physiological noise |
| US11497568B2 (en) | 2018-09-28 | 2022-11-15 | Auris Health, Inc. | Systems and methods for docking medical instruments |
| US11504195B2 (en) | 2013-03-15 | 2022-11-22 | Auris Health, Inc. | Active drive mechanism for simultaneous rotation and translation |
| US11503986B2 (en) | 2018-05-31 | 2022-11-22 | Auris Health, Inc. | Robotic systems and methods for navigation of luminal network that detect physiological noise |
| US11511079B2 (en) | 2014-07-01 | 2022-11-29 | Auris Health, Inc. | Apparatuses and methods for monitoring tendons of steerable catheters |
| US11510736B2 (en) | 2017-12-14 | 2022-11-29 | Auris Health, Inc. | System and method for estimating instrument location |
| US11517717B2 (en) | 2013-03-14 | 2022-12-06 | Auris Health, Inc. | Active drives for robotic catheter manipulators |
| US11529129B2 (en) | 2017-05-12 | 2022-12-20 | Auris Health, Inc. | Biopsy apparatus and system |
| US11534249B2 (en) | 2015-10-30 | 2022-12-27 | Auris Health, Inc. | Process for percutaneous operations |
| US11564759B2 (en) | 2016-08-31 | 2023-01-31 | Auris Health, Inc. | Length conservative surgical instrument |
| US11571229B2 (en) | 2015-10-30 | 2023-02-07 | Auris Health, Inc. | Basket apparatus |
| US11576738B2 (en) | 2018-10-08 | 2023-02-14 | Auris Health, Inc. | Systems and instruments for tissue sealing |
| USD978941S1 (en) | 2018-01-17 | 2023-02-21 | Auris Health, Inc. | Robotic arm |
| US11602372B2 (en) | 2019-12-31 | 2023-03-14 | Auris Health, Inc. | Alignment interfaces for percutaneous access |
| US11617627B2 (en) | 2019-03-29 | 2023-04-04 | Auris Health, Inc. | Systems and methods for optical strain sensing in medical instruments |
| US11642242B2 (en) | 2013-08-13 | 2023-05-09 | Auris Health, Inc. | Method and apparatus for light energy assisted surgery |
| US11660147B2 (en) | 2019-12-31 | 2023-05-30 | Auris Health, Inc. | Alignment techniques for percutaneous access |
| US11660153B2 (en) | 2013-03-15 | 2023-05-30 | Auris Health, Inc. | Active drive mechanism with finite range of motion |
| US11666393B2 (en) | 2017-06-30 | 2023-06-06 | Auris Health, Inc. | Systems and methods for medical instrument compression compensation |
| US11690977B2 (en) | 2014-05-15 | 2023-07-04 | Auris Health, Inc. | Anti-buckling mechanisms for catheters |
| US11701192B2 (en) | 2016-08-26 | 2023-07-18 | Auris Health, Inc. | Steerable catheter with shaft load distributions |
| US11701492B2 (en) | 2020-06-04 | 2023-07-18 | Covidien Lp | Active distal tip drive |
| US11701783B2 (en) | 2017-10-10 | 2023-07-18 | Auris Health, Inc. | Surgical robotic arm admittance control |
| US11712154B2 (en) | 2016-09-30 | 2023-08-01 | Auris Health, Inc. | Automated calibration of surgical instruments with pull wires |
| US11712173B2 (en) | 2018-03-28 | 2023-08-01 | Auris Health, Inc. | Systems and methods for displaying estimated location of instrument |
| US11717147B2 (en) | 2019-08-15 | 2023-08-08 | Auris Health, Inc. | Medical device having multiple bending sections |
| US11723730B2 (en) | 2015-04-01 | 2023-08-15 | Auris Health, Inc. | Microsurgical tool for robotic applications |
| US11723636B2 (en) | 2013-03-08 | 2023-08-15 | Auris Health, Inc. | Method, apparatus, and system for facilitating bending of an instrument in a surgical or medical robotic environment |
| US11737835B2 (en) | 2019-10-29 | 2023-08-29 | Auris Health, Inc. | Braid-reinforced insulation sheath |
| US11737845B2 (en) | 2019-09-30 | 2023-08-29 | Auris Inc. | Medical instrument with a capstan |
| US11744670B2 (en) | 2018-01-17 | 2023-09-05 | Auris Health, Inc. | Surgical platform with adjustable arm supports |
| US11759605B2 (en) | 2014-07-01 | 2023-09-19 | Auris Health, Inc. | Tool and method for using surgical endoscope with spiral lumens |
| US11771521B2 (en) | 2015-09-09 | 2023-10-03 | Auris Health, Inc. | Instrument device manipulator with roll mechanism |
| US11771309B2 (en) | 2016-12-28 | 2023-10-03 | Auris Health, Inc. | Detecting endolumenal buckling of flexible instruments |
| US11779400B2 (en) | 2018-08-07 | 2023-10-10 | Auris Health, Inc. | Combining strain-based shape sensing with catheter control |
| USD1004782S1 (en) | 2018-01-17 | 2023-11-14 | Auris Health, Inc. | Instrument handle |
| US11819636B2 (en) | 2015-03-30 | 2023-11-21 | Auris Health, Inc. | Endoscope pull wire electrical circuit |
| US11826117B2 (en) | 2018-06-07 | 2023-11-28 | Auris Health, Inc. | Robotic medical systems with high force instruments |
| US11839969B2 (en) | 2020-06-29 | 2023-12-12 | Auris Health, Inc. | Systems and methods for detecting contact between a link and an external object |
| US11850008B2 (en) | 2017-10-13 | 2023-12-26 | Auris Health, Inc. | Image-based branch detection and mapping for navigation |
| US11857277B2 (en) | 2019-02-08 | 2024-01-02 | Auris Health, Inc. | Robotically controlled clot manipulation and removal |
| US11864849B2 (en) | 2018-09-26 | 2024-01-09 | Auris Health, Inc. | Systems and instruments for suction and irrigation |
| US11864842B2 (en) | 2018-09-28 | 2024-01-09 | Auris Health, Inc. | Devices, systems, and methods for manually and robotically driving medical instruments |
| US11872007B2 (en) | 2019-06-28 | 2024-01-16 | Auris Health, Inc. | Console overlay and methods of using same |
| US11883121B2 (en) | 2004-03-05 | 2024-01-30 | Auris Health, Inc. | Robotic catheter system |
| US11896335B2 (en) | 2018-08-15 | 2024-02-13 | Auris Health, Inc. | Medical instruments for tissue cauterization |
| US11896330B2 (en) | 2019-08-15 | 2024-02-13 | Auris Health, Inc. | Robotic medical system having multiple medical instruments |
| US11925332B2 (en) | 2018-12-28 | 2024-03-12 | Auris Health, Inc. | Percutaneous sheath for robotic medical systems and methods |
| US11931901B2 (en) | 2020-06-30 | 2024-03-19 | Auris Health, Inc. | Robotic medical system with collision proximity indicators |
| USD1021103S1 (en) | 2018-01-17 | 2024-04-02 | Auris Health, Inc. | Controller |
| US11950872B2 (en) | 2019-12-31 | 2024-04-09 | Auris Health, Inc. | Dynamic pulley system |
| US11957446B2 (en) | 2017-12-08 | 2024-04-16 | Auris Health, Inc. | System and method for medical instrument navigation and targeting |
| US11974948B2 (en) | 2013-06-11 | 2024-05-07 | Auris Health, Inc. | Method, apparatus, and a system for robotic assisted surgery |
| US11986257B2 (en) | 2018-12-28 | 2024-05-21 | Auris Health, Inc. | Medical instrument with articulable segment |
| US12076100B2 (en) | 2018-09-28 | 2024-09-03 | Auris Health, Inc. | Robotic systems and methods for concomitant endoscopic and percutaneous medical procedures |
| US12089912B2 (en) | 2013-03-15 | 2024-09-17 | Auris Health, Inc. | User input devices for controlling manipulation of guidewires and catheters |
| US12114838B2 (en) | 2018-08-24 | 2024-10-15 | Auris Health, Inc. | Manually and robotically controllable medical instruments |
| US12138003B2 (en) | 2019-06-25 | 2024-11-12 | Auris Health, Inc. | Medical instruments including wrists with hybrid redirect surfaces |
| US12251176B2 (en) | 2005-07-01 | 2025-03-18 | Auris Health, Inc. | Robotic catheter system and methods |
| US12256923B2 (en) | 2020-08-13 | 2025-03-25 | Covidien Lp | Endoluminal robotic systems and methods for suturing |
| US12303220B2 (en) | 2022-01-26 | 2025-05-20 | Covidien Lp | Autonomous endobronchial access with an EM guided catheter |
| US12324645B2 (en) | 2019-09-26 | 2025-06-10 | Auris Health, Inc. | Systems and methods for collision avoidance using object models |
| US12357409B2 (en) | 2019-11-21 | 2025-07-15 | Auris Health, Inc. | Systems and methods for draping a surgical system |
| US12364557B2 (en) | 2018-06-27 | 2025-07-22 | Auris Health, Inc. | Alignment and attachment systems for medical instruments |
| US12370002B2 (en) | 2020-03-30 | 2025-07-29 | Auris Health, Inc. | Workspace optimization for robotic surgery |
| US12383352B2 (en) | 2020-08-13 | 2025-08-12 | Covidien Lp | Endoluminal robotic (ELR) systems and methods |
| US12414686B2 (en) | 2020-03-30 | 2025-09-16 | Auris Health, Inc. | Endoscopic anatomical feature tracking |
| US12478444B2 (en) | 2019-03-21 | 2025-11-25 | The Board Of Trustees Of The Leland Stanford Junior University | Systems and methods for localization based on machine learning |
Families Citing this family (109)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8414505B1 (en) | 2001-02-15 | 2013-04-09 | Hansen Medical, Inc. | Catheter driver system |
| US12290277B2 (en) | 2007-01-02 | 2025-05-06 | Aquabeam, Llc | Tissue resection with pressure sensing |
| US9232959B2 (en) | 2007-01-02 | 2016-01-12 | Aquabeam, Llc | Multi fluid tissue resection methods and devices |
| EP3622910B1 (en) | 2008-03-06 | 2024-07-10 | AquaBeam LLC | Tissue ablation and cautery with optical energy carried in fluid stream |
| US20120191079A1 (en) | 2011-01-20 | 2012-07-26 | Hansen Medical, Inc. | System and method for endoluminal and translumenal therapy |
| US20130030363A1 (en) | 2011-07-29 | 2013-01-31 | Hansen Medical, Inc. | Systems and methods utilizing shape sensing fibers |
| US9387048B2 (en) | 2011-10-14 | 2016-07-12 | Intuitive Surgical Operations, Inc. | Catheter sensor systems |
| US9452276B2 (en) | 2011-10-14 | 2016-09-27 | Intuitive Surgical Operations, Inc. | Catheter with removable vision probe |
| US20130303944A1 (en) | 2012-05-14 | 2013-11-14 | Intuitive Surgical Operations, Inc. | Off-axis electromagnetic sensor |
| US10238837B2 (en) | 2011-10-14 | 2019-03-26 | Intuitive Surgical Operations, Inc. | Catheters with control modes for interchangeable probes |
| EP3351196A1 (en) | 2012-02-29 | 2018-07-25 | Procept Biorobotics Corporation | Automated image-guided tissue resection and treatment |
| US10383765B2 (en) | 2012-04-24 | 2019-08-20 | Auris Health, Inc. | Apparatus and method for a global coordinate system for use in robotic surgery |
| US20130317519A1 (en) | 2012-05-25 | 2013-11-28 | Hansen Medical, Inc. | Low friction instrument driver interface for robotic systems |
| US20140148673A1 (en) | 2012-11-28 | 2014-05-29 | Hansen Medical, Inc. | Method of anchoring pullwire directly articulatable region in catheter |
| US10231867B2 (en) | 2013-01-18 | 2019-03-19 | Auris Health, Inc. | Method, apparatus and system for a water jet |
| US9668814B2 (en) | 2013-03-07 | 2017-06-06 | Hansen Medical, Inc. | Infinitely rotatable tool with finite rotating drive shafts |
| US9566414B2 (en) | 2013-03-13 | 2017-02-14 | Hansen Medical, Inc. | Integrated catheter and guide wire controller |
| US9326822B2 (en) | 2013-03-14 | 2016-05-03 | Hansen Medical, Inc. | Active drives for robotic catheter manipulators |
| US9498601B2 (en) | 2013-03-14 | 2016-11-22 | Hansen Medical, Inc. | Catheter tension sensing |
| US9629595B2 (en) | 2013-03-15 | 2017-04-25 | Hansen Medical, Inc. | Systems and methods for localizing, tracking and/or controlling medical instruments |
| US9452018B2 (en) | 2013-03-15 | 2016-09-27 | Hansen Medical, Inc. | Rotational support for an elongate member |
| US9763741B2 (en) | 2013-10-24 | 2017-09-19 | Auris Surgical Robotics, Inc. | System for robotic-assisted endolumenal surgery and related methods |
| EP2923669B1 (en) | 2014-03-24 | 2017-06-28 | Hansen Medical, Inc. | Systems and devices for catheter driving instinctiveness |
| JP6689832B2 (en) | 2014-09-30 | 2020-04-28 | オーリス ヘルス インコーポレイテッド | Configurable robotic surgery system with provisional trajectory and flexible endoscope |
| WO2016164824A1 (en) | 2015-04-09 | 2016-10-13 | Auris Surgical Robotics, Inc. | Surgical system with configurable rail-mounted mechanical arms |
| US9622827B2 (en) | 2015-05-15 | 2017-04-18 | Auris Surgical Robotics, Inc. | Surgical robotics system |
| EP3337419B1 (en) * | 2015-08-19 | 2020-08-12 | Brainlab AG | Reference array holder |
| US11241559B2 (en) | 2016-08-29 | 2022-02-08 | Auris Health, Inc. | Active drive for guidewire manipulation |
| US10543048B2 (en) | 2016-12-28 | 2020-01-28 | Auris Health, Inc. | Flexible instrument insertion using an adaptive insertion force threshold |
| WO2018131134A1 (en) * | 2017-01-13 | 2018-07-19 | オリンパス株式会社 | Flexible tube insertion device |
| US10792466B2 (en) | 2017-03-28 | 2020-10-06 | Auris Health, Inc. | Shaft actuating handle |
| US10285574B2 (en) | 2017-04-07 | 2019-05-14 | Auris Health, Inc. | Superelastic medical instrument |
| JP7314052B2 (en) | 2017-04-07 | 2023-07-25 | オーリス ヘルス インコーポレイテッド | Patient introducer alignment |
| EP3624668A4 (en) | 2017-05-17 | 2021-05-26 | Auris Health, Inc. | EXCHANGEABLE WORK CHANNEL |
| WO2018220797A1 (en) * | 2017-06-01 | 2018-12-06 | オリンパス株式会社 | Flexible tube insertion assistance device and flexible tube insertion device |
| EP3644885B1 (en) | 2017-06-28 | 2023-10-11 | Auris Health, Inc. | Electromagnetic field generator alignment |
| EP4437999A3 (en) | 2017-06-28 | 2024-12-04 | Auris Health, Inc. | Instrument insertion compensation |
| CN118121324A (en) | 2017-06-28 | 2024-06-04 | 奥瑞斯健康公司 | System for detecting electromagnetic distortion |
| US11058493B2 (en) | 2017-10-13 | 2021-07-13 | Auris Health, Inc. | Robotic system configured for navigation path tracing |
| US11344372B2 (en) * | 2017-10-24 | 2022-05-31 | SpineGuard Vincennes | Robotic surgical system |
| FR3072559B1 (en) | 2017-10-24 | 2023-03-24 | Spineguard | MEDICAL SYSTEM COMPRISING A ROBOTIZED ARM AND A MEDICAL DEVICE INTENDED TO PENETRATE INTO AN ANATOMICAL STRUCTURE |
| JP7362610B2 (en) | 2017-12-06 | 2023-10-17 | オーリス ヘルス インコーポレイテッド | System and method for correcting uncommanded instrument rotation |
| WO2019113389A1 (en) | 2017-12-08 | 2019-06-13 | Auris Health, Inc. | Directed fluidics |
| US10470830B2 (en) | 2017-12-11 | 2019-11-12 | Auris Health, Inc. | Systems and methods for instrument based insertion architectures |
| WO2019143458A1 (en) | 2018-01-17 | 2019-07-25 | Auris Health, Inc. | Surgical robotics systems with improved robotic arms |
| JP7301884B2 (en) | 2018-02-13 | 2023-07-03 | オーリス ヘルス インコーポレイテッド | Systems and methods for driving medical instruments |
| CN108420538B (en) * | 2018-04-27 | 2020-08-25 | 微创(上海)医疗机器人有限公司 | Surgical Robot System |
| US10905499B2 (en) | 2018-05-30 | 2021-02-02 | Auris Health, Inc. | Systems and methods for location sensor-based branch prediction |
| US10667875B2 (en) | 2018-06-27 | 2020-06-02 | Auris Health, Inc. | Systems and techniques for providing multiple perspectives during medical procedures |
| CN110755152A (en) * | 2018-07-26 | 2020-02-07 | 赛诺微医疗科技(浙江)有限公司 | Microwave ablation catheter, manipulator for controlling microwave ablation catheter and manipulator control system |
| US10639114B2 (en) | 2018-08-17 | 2020-05-05 | Auris Health, Inc. | Bipolar medical instrument |
| JP7128963B2 (en) * | 2018-10-25 | 2022-08-31 | キヤノン ユーエスエイ,インコーポレイテッド | Medical device with reflow trap anchor and method of use thereof |
| WO2020131529A1 (en) | 2018-12-20 | 2020-06-25 | Auris Health, Inc. | Shielding for wristed instruments |
| US11161244B2 (en) * | 2019-01-22 | 2021-11-02 | Mitsubishi Electric Research Laboratories, Inc. | System and method for automatic error recovery in robotic assembly |
| CN113347938A (en) | 2019-01-25 | 2021-09-03 | 奥瑞斯健康公司 | Vascular sealer with heating and cooling capabilities |
| US10945904B2 (en) | 2019-03-08 | 2021-03-16 | Auris Health, Inc. | Tilt mechanisms for medical systems and applications |
| WO2020197671A1 (en) | 2019-03-22 | 2020-10-01 | Auris Health, Inc. | Systems and methods for aligning inputs on medical instruments |
| JP7116925B2 (en) * | 2019-03-22 | 2022-08-12 | 株式会社エビデント | Observation device operating method, observation device, and program |
| WO2020197625A1 (en) | 2019-03-25 | 2020-10-01 | Auris Health, Inc. | Systems and methods for medical stapling |
| KR20210149805A (en) * | 2019-04-08 | 2021-12-09 | 아우리스 헬스, 인코포레이티드 | Systems, Methods, and Workflows for Concurrent Procedures |
| WO2020263520A1 (en) | 2019-06-26 | 2020-12-30 | Auris Health, Inc. | Systems and methods for robotic arm alignment and docking |
| CN120678525A (en) * | 2019-07-15 | 2025-09-23 | 西门子医疗血管介入机器人公司 | Data capture and adaptive guidance for robotic surgery using elongated medical devices |
| USD975275S1 (en) | 2019-08-15 | 2023-01-10 | Auris Health, Inc. | Handle for a medical instrument |
| USD978348S1 (en) | 2019-08-15 | 2023-02-14 | Auris Health, Inc. | Drive device for a medical instrument |
| IL290896B2 (en) | 2019-08-30 | 2025-07-01 | Brainlab Ag | Image based motion control correction |
| JP7494290B2 (en) | 2019-09-03 | 2024-06-03 | オーリス ヘルス インコーポレイテッド | Electromagnetic Distortion Detection and Compensation |
| WO2021119207A1 (en) * | 2019-12-12 | 2021-06-17 | Intuitive Surgical Operations, Inc. | Control and feedback based on insertion force |
| US20240225542A1 (en) * | 2020-10-06 | 2024-07-11 | Helo Corp. | Personal Healthcare Device |
| CN113116522A (en) * | 2021-03-31 | 2021-07-16 | 常州朗合医疗器械有限公司 | Magnetic navigation trachea positioning robot |
| US20220361956A1 (en) * | 2021-05-03 | 2022-11-17 | Westface Medical Inc. | Guiding Medical Instruments During Medical Procedures |
| CN118251181A (en) | 2021-09-28 | 2024-06-25 | 脊柱防护公司 | Universal adapter for handheld surgical systems |
| EP4221613A1 (en) | 2021-09-29 | 2023-08-09 | Cilag GmbH International | Surgical system for altering the body's sensing of food |
| EP4408330A1 (en) | 2021-09-29 | 2024-08-07 | Cilag GmbH International | Coordinated instrument control systems |
| JP2024536178A (en) | 2021-09-29 | 2024-10-04 | シラグ・ゲーエムベーハー・インターナショナル | Surgical devices, systems and methods for controlling one visualization with another visualization - Patents.com |
| WO2023052938A1 (en) | 2021-09-29 | 2023-04-06 | Cilag Gmbh International | Methods and systems for controlling cooperative surgical instruments |
| EP4221630A1 (en) | 2021-09-29 | 2023-08-09 | Cilag GmbH International | Surgical devices, systems, and methods using multi-source imaging |
| US20230107857A1 (en) | 2021-09-29 | 2023-04-06 | Cilag Gmbh International | Surgical sealing devices for a natural body orifice |
| JP2024536155A (en) | 2021-09-29 | 2024-10-04 | シラグ・ゲーエムベーハー・インターナショナル | Surgical system for independently ventilating two separate anatomical spaces - Patents.com |
| JP2024536176A (en) | 2021-09-29 | 2024-10-04 | シラグ・ゲーエムベーハー・インターナショナル | Surgical devices, systems and methods using multi-source imaging - Patents.com |
| US20230102358A1 (en) | 2021-09-29 | 2023-03-30 | Cilag Gmbh International | Surgical devices, systems, and methods using fiducial identification and tracking |
| WO2023052949A1 (en) | 2021-09-29 | 2023-04-06 | Cilag Gmbh International | Surgical devices, systems, and methods using fiducial identification and tracking |
| EP4384097A1 (en) | 2021-09-29 | 2024-06-19 | Cilag GmbH International | Surgical sealing devices for a natural body orifice |
| WO2023052930A1 (en) | 2021-09-29 | 2023-04-06 | Cilag Gmbh International | Surgical systems with devices for both intraluminal and extraluminal access |
| WO2023052960A1 (en) | 2021-09-29 | 2023-04-06 | Cilag Gmbh International | Surgical devices, systems, and methods using fiducial identification and tracking |
| US12364545B2 (en) | 2021-09-29 | 2025-07-22 | Cilag Gmbh International | Surgical devices, systems, and methods using fiducial identification and tracking |
| EP4267015B1 (en) | 2021-09-29 | 2025-01-29 | Cilag GmbH International | Surgical anchoring systems for endoluminal access |
| US12137986B2 (en) | 2021-09-29 | 2024-11-12 | Cilag Gmbh International | Methods for controlling cooperative surgical instruments |
| WO2023052962A1 (en) | 2021-09-29 | 2023-04-06 | Cilag Gmbh International | Methods and systems for controlling cooperative surgical instruments |
| WO2023052951A1 (en) | 2021-09-29 | 2023-04-06 | Cilag Gmbh International | Surgical systems with intraluminal and extraluminal cooperative instruments |
| EP4216846B1 (en) | 2021-09-29 | 2025-07-09 | Cilag GmbH International | Surgical systems with port devices for instrument control |
| WO2023052931A1 (en) | 2021-09-29 | 2023-04-06 | Cilag Gmbh International | Surgical sealing systems for instrument stabilization |
| WO2023052936A1 (en) | 2021-09-29 | 2023-04-06 | Cilag Gmbh International | Systems for controlling cooperative surgical instruments with variable surgical site access trajectories |
| US20230098670A1 (en) | 2021-09-29 | 2023-03-30 | Cilag Gmbh International | Surgical devices, systems, and methods using multi-source imaging |
| WO2023052943A1 (en) | 2021-09-29 | 2023-04-06 | Cilag Gmbh International | Surgical devices, systems, and methods for control of one visualization with another |
| WO2023052941A1 (en) | 2021-09-29 | 2023-04-06 | Cilag Gmbh International | Surgical system for delivering energy to tissue in an anatomic space and monitoring a tissue parameter in a different anatomic space |
| US12295667B2 (en) | 2021-09-29 | 2025-05-13 | Cilag Gmbh International | Surgical devices, systems, and methods using multi-source imaging |
| WO2023052953A1 (en) | 2021-09-29 | 2023-04-06 | Cilag Gmbh International | Surgical systems and methods for selectively pressurizing a natural body lumen |
| US12290319B2 (en) | 2021-09-29 | 2025-05-06 | Cilag Gmbh International | Methods for controlling cooperative surgical instruments |
| WO2023052935A1 (en) | 2021-09-29 | 2023-04-06 | Cilag Gmbh International | Surgical devices, systems, and methods for control of one visualization with another |
| WO2023052955A1 (en) | 2021-09-29 | 2023-04-06 | Cilag Gmbh International | Instrument control surgical imaging systems |
| US12376910B2 (en) | 2021-09-29 | 2025-08-05 | Cilag Gmbh International | Methods for controlling cooperative surgical instruments |
| WO2023052934A1 (en) | 2021-09-29 | 2023-04-06 | Cilag Gmbh International | Methods and systems for controlling cooperative surgical instruments |
| US20230110791A1 (en) | 2021-09-29 | 2023-04-13 | Cilag Gmbh International | Instrument Control Imaging Systems for Visualization of Upcoming Surgical Procedure Steps |
| WO2023052947A1 (en) | 2021-09-29 | 2023-04-06 | Cilag Gmbh International | Surgical devices, systems, and methods for control of one visualization with another |
| WO2023052929A1 (en) | 2021-09-29 | 2023-04-06 | Cilag Gmbh International | Surgical devices, systems, and methods using multi-source imaging |
| US20230181275A1 (en) | 2021-12-15 | 2023-06-15 | Cilag Gmbh International | Robotic surgical instruments having onboard generators |
| CN115837114A (en) * | 2022-12-05 | 2023-03-24 | 深圳市爱博医疗机器人有限公司 | Intelligent force feedback guide wire control system and corresponding control method |
| DE102023129189A1 (en) * | 2023-10-24 | 2025-04-24 | Karl Storz Se & Co. Kg | Medical imaging method and medical imaging system |
| DE102023129187A1 (en) * | 2023-10-24 | 2025-04-24 | Karl Storz Se & Co. Kg | Medical imaging method and medical imaging system |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20090137952A1 (en) * | 2007-08-14 | 2009-05-28 | Ramamurthy Bhaskar S | Robotic instrument systems and methods utilizing optical fiber sensor |
| US20110208000A1 (en) * | 2009-06-23 | 2011-08-25 | Olympus Medical Systems Corp. | Medical system |
| US20110319815A1 (en) * | 2010-06-24 | 2011-12-29 | Hansen Medical, Inc. | Fiber optic instrument sensing system |
| US20150297864A1 (en) * | 2014-04-21 | 2015-10-22 | Hansen Medical, Inc. | Devices, systems, and methods for controlling active drive systems |
| US20170281049A1 (en) * | 2014-12-19 | 2017-10-05 | Olympus Corporation | Insertion/removal supporting apparatus and insertion/removal supporting method |
Family Cites Families (270)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US2556601A (en) | 1947-02-10 | 1951-06-12 | Niles Bement Pond Co | Multiple tapping head |
| US2566183A (en) | 1947-05-29 | 1951-08-28 | Skilsaw Inc | Portable power-driven tool |
| US2730699A (en) | 1952-02-01 | 1956-01-10 | Gen Dynamics Corp | Telemetering system |
| US2884808A (en) | 1957-10-23 | 1959-05-05 | Mueller Co | Drive for drilling machine |
| US3294183A (en) | 1964-09-30 | 1966-12-27 | Black & Decker Mfg Co | Power driven tools |
| US3472083A (en) | 1967-10-25 | 1969-10-14 | Lawrence S Schnepel | Torque wrench |
| US3513724A (en) | 1968-07-17 | 1970-05-26 | Monogram Ind Inc | Speed reduction mechanism |
| US3595074A (en) | 1968-10-30 | 1971-07-27 | Clarence Johnson | Torque transducer |
| JPS5025234B1 (en) | 1970-02-20 | 1975-08-21 | ||
| JPS4921672Y1 (en) | 1970-08-21 | 1974-06-10 | ||
| US3734207A (en) | 1971-12-27 | 1973-05-22 | M Fishbein | Battery powered orthopedic cutting tool |
| US3921536A (en) | 1975-01-30 | 1975-11-25 | Hall Ski Lift Company Inc | Cable grip tester |
| DE2524605A1 (en) | 1975-06-03 | 1976-12-23 | Heinz Peter Dipl Brandstetter | DEVICE FOR MEASURING MECHANICAL WORK AND POWER |
| SE414272B (en) | 1978-10-17 | 1980-07-21 | Viggo Ab | CANNEL OR CATETER DEVICE |
| US4241884A (en) | 1979-03-20 | 1980-12-30 | George Lynch | Powered device for controlling the rotation of a reel |
| AT365363B (en) | 1979-09-20 | 1982-01-11 | Philips Nv | RECORDING AND / OR PLAYING DEVICE |
| CH643092A5 (en) | 1980-02-18 | 1984-05-15 | Gruenbaum Heinrich Leuzinger | DEVICE FOR MEASURING TORQUE EXTENDED BY AN ELECTRIC MOTOR. |
| US4357843A (en) | 1980-10-31 | 1982-11-09 | Peck-O-Matic, Inc. | Tong apparatus for threadedly connecting and disconnecting elongated members |
| JPS57144633A (en) | 1981-03-05 | 1982-09-07 | Inoue Japax Res Inc | Wire electrode feeder |
| US4507026A (en) | 1982-09-29 | 1985-03-26 | Boeing Aerospace Company | Depth control assembly |
| US4555960A (en) | 1983-03-23 | 1985-12-03 | Cae Electronics, Ltd. | Six degree of freedom hand controller |
| US4688555A (en) | 1986-04-25 | 1987-08-25 | Circon Corporation | Endoscope with cable compensating mechanism |
| US4784150A (en) | 1986-11-04 | 1988-11-15 | Research Corporation | Surgical retractor and blood flow monitor |
| US4745908A (en) | 1987-05-08 | 1988-05-24 | Circon Corporation | Inspection instrument fexible shaft having deflection compensation means |
| US4907168A (en) | 1988-01-11 | 1990-03-06 | Adolph Coors Company | Torque monitoring apparatus |
| US4857058A (en) | 1988-07-11 | 1989-08-15 | Payton Hugh W | Support patch for intravenous catheter |
| US4945790A (en) | 1989-08-07 | 1990-08-07 | Arthur Golden | Multi-purpose hand tool |
| US5350101A (en) | 1990-11-20 | 1994-09-27 | Interventional Technologies Inc. | Device for advancing a rotatable tube |
| US5234428A (en) | 1991-06-11 | 1993-08-10 | Kaufman David I | Disposable electrocautery/cutting instrument with integral continuous smoke evacuation |
| JPH05146975A (en) | 1991-11-26 | 1993-06-15 | Bridgestone Corp | Multi-shaft automatic nut runner |
| US5256150A (en) | 1991-12-13 | 1993-10-26 | Endovascular Technologies, Inc. | Large-diameter expandable sheath and method |
| US5207128A (en) | 1992-03-23 | 1993-05-04 | Weatherford-Petco, Inc. | Tong with floating jaws |
| US5709661A (en) | 1992-04-14 | 1998-01-20 | Endo Sonics Europe B.V. | Electronic catheter displacement sensor |
| GB2280343A (en) | 1993-07-08 | 1995-01-25 | Innovative Care Ltd | A laser targeting device for use with image intensifiers |
| US5524180A (en) | 1992-08-10 | 1996-06-04 | Computer Motion, Inc. | Automated endoscope system for optimal positioning |
| US5368564A (en) | 1992-12-23 | 1994-11-29 | Angeion Corporation | Steerable catheter |
| US5779623A (en) | 1993-10-08 | 1998-07-14 | Leonard Medical, Inc. | Positioner for medical instruments |
| US6154000A (en) | 1994-09-07 | 2000-11-28 | Omnitek Research & Development, Inc. | Apparatus for providing a controlled deflection and/or actuator apparatus |
| US5559294A (en) | 1994-09-15 | 1996-09-24 | Condux International, Inc. | Torque measuring device |
| DE19625850B4 (en) | 1995-06-27 | 2008-01-31 | Matsushita Electric Works, Ltd., Kadoma | planetary gear |
| US5855583A (en) | 1996-02-20 | 1999-01-05 | Computer Motion, Inc. | Method and apparatus for performing minimally invasive cardiac procedures |
| US6436107B1 (en) | 1996-02-20 | 2002-08-20 | Computer Motion, Inc. | Method and apparatus for performing minimally invasive surgical procedures |
| US5792135A (en) | 1996-05-20 | 1998-08-11 | Intuitive Surgical, Inc. | Articulated surgical instrument for performing minimally invasive surgery with enhanced dexterity and sensitivity |
| US5767840A (en) | 1996-06-28 | 1998-06-16 | International Business Machines Corporation | Six-degrees-of-freedom movement sensor having strain gauge mechanical supports |
| DE19649082C1 (en) | 1996-11-27 | 1998-01-08 | Fraunhofer Ges Forschung | Remote control unit for implement with holder and two hexapods |
| US6331181B1 (en) | 1998-12-08 | 2001-12-18 | Intuitive Surgical, Inc. | Surgical robotic tools, data architecture, and use |
| US6272371B1 (en) | 1997-01-03 | 2001-08-07 | Biosense Inc. | Bend-responsive catheter |
| TW403051U (en) | 1997-05-29 | 2000-08-21 | Seiko Epson Corp | Recording medium of control program for printing device and recorded printing device |
| US6231565B1 (en) | 1997-06-18 | 2001-05-15 | United States Surgical Corporation | Robotic arm DLUs for performing surgical tasks |
| EP1015944B1 (en) | 1997-09-19 | 2013-02-27 | Massachusetts Institute Of Technology | Surgical robotic apparatus |
| US5921968A (en) | 1997-11-25 | 1999-07-13 | Merit Medical Systems, Inc. | Valve apparatus with adjustable quick-release mechanism |
| US20080177285A1 (en) | 1998-02-24 | 2008-07-24 | Hansen Medical, Inc. | Surgical instrument |
| IL123646A (en) | 1998-03-11 | 2010-05-31 | Refael Beyar | Remote control catheterization |
| US6171234B1 (en) | 1998-09-25 | 2001-01-09 | Scimed Life Systems, Inc. | Imaging gore loading tool |
| US6620173B2 (en) | 1998-12-08 | 2003-09-16 | Intuitive Surgical, Inc. | Method for introducing an end effector to a surgical site in minimally invasive surgery |
| US6394998B1 (en) | 1999-01-22 | 2002-05-28 | Intuitive Surgical, Inc. | Surgical tools for use in minimally invasive telesurgical applications |
| US6084371A (en) | 1999-02-19 | 2000-07-04 | Lockheed Martin Energy Research Corporation | Apparatus and methods for a human de-amplifier system |
| WO2000053077A2 (en) | 1999-03-07 | 2000-09-14 | Discure Ltd. | Method and apparatus for computerized surgery |
| US6289579B1 (en) | 1999-03-23 | 2001-09-18 | Motorola, Inc. | Component alignment and transfer apparatus |
| EP1206299A1 (en) | 1999-08-27 | 2002-05-22 | Wollschläger, Helmut | Device for handling a catheter |
| US8004229B2 (en) | 2005-05-19 | 2011-08-23 | Intuitive Surgical Operations, Inc. | Software center and highly configurable robotic systems for surgery and other uses |
| US6427783B2 (en) | 2000-01-12 | 2002-08-06 | Baker Hughes Incorporated | Steerable modular drilling assembly |
| WO2001051993A1 (en) | 2000-01-14 | 2001-07-19 | Advanced Micro Devices, Inc. | System, method and photomask for compensating aberrations in a photolithography patterning system |
| US6858005B2 (en) | 2000-04-03 | 2005-02-22 | Neo Guide Systems, Inc. | Tendon-driven endoscope and methods of insertion |
| DE10025285A1 (en) * | 2000-05-22 | 2001-12-06 | Siemens Ag | Fully automatic, robot-assisted camera guidance using position sensors for laparoscopic interventions |
| US20020100254A1 (en) | 2000-10-12 | 2002-08-01 | Dsd Communications, Inc. | System and method for targeted advertising and marketing |
| EP1199622B1 (en) | 2000-10-20 | 2007-12-12 | Deere & Company | Operating element |
| US6676557B2 (en) | 2001-01-23 | 2004-01-13 | Black & Decker Inc. | First stage clutch |
| US6487940B2 (en) | 2001-01-23 | 2002-12-03 | Associated Toolmakers Incorporated | Nut driver |
| US8414505B1 (en) | 2001-02-15 | 2013-04-09 | Hansen Medical, Inc. | Catheter driver system |
| AU2002244016A1 (en) | 2001-02-15 | 2002-10-03 | Cunningham, Robert | Flexible surgical instrument |
| US7766894B2 (en) | 2001-02-15 | 2010-08-03 | Hansen Medical, Inc. | Coaxial catheter system |
| US6612143B1 (en) | 2001-04-13 | 2003-09-02 | Orametrix, Inc. | Robot and method for bending orthodontic archwires and other medical devices |
| US6640412B2 (en) | 2001-04-26 | 2003-11-04 | Endovascular Technologies, Inc. | Method for loading a stent using a collapsing machine |
| US7635342B2 (en) | 2001-05-06 | 2009-12-22 | Stereotaxis, Inc. | System and methods for medical device advancement and rotation |
| ES2314062T3 (en) | 2001-05-06 | 2009-03-16 | Stereotaxis, Inc. | SYSTEM TO ADVANCE A CATETER. |
| US7766856B2 (en) | 2001-05-06 | 2010-08-03 | Stereotaxis, Inc. | System and methods for advancing a catheter |
| US20060199999A1 (en) | 2001-06-29 | 2006-09-07 | Intuitive Surgical Inc. | Cardiac tissue ablation instrument with flexible wrist |
| CA2351993C (en) | 2001-06-29 | 2003-02-18 | New World Technologie Inc. | Torque tool |
| US20040243147A1 (en) | 2001-07-03 | 2004-12-02 | Lipow Kenneth I. | Surgical robot and robotic controller |
| US7044936B2 (en) | 2002-08-21 | 2006-05-16 | Arrow International Inc. | Catheter connector with pivot lever spring latch |
| US7660623B2 (en) | 2003-01-30 | 2010-02-09 | Medtronic Navigation, Inc. | Six degree of freedom alignment display for medical procedures |
| EP1442720A1 (en) | 2003-01-31 | 2004-08-04 | Tre Esse Progettazione Biomedica S.r.l | Apparatus for the maneuvering of flexible catheters in the human cardiovascular system |
| US7246273B2 (en) | 2003-02-28 | 2007-07-17 | Sony Corporation | Method of, apparatus and graphical user interface for automatic diagnostics |
| US20050004579A1 (en) | 2003-06-27 | 2005-01-06 | Schneider M. Bret | Computer-assisted manipulation of catheters and guide wires |
| US9002518B2 (en) | 2003-06-30 | 2015-04-07 | Intuitive Surgical Operations, Inc. | Maximum torque driving of robotic surgical tools in robotic surgical systems |
| AU2004299000B8 (en) | 2003-12-11 | 2010-07-22 | Cook Medical Technologies Llc | Hemostatic valve assembly |
| US8287584B2 (en) | 2005-11-14 | 2012-10-16 | Sadra Medical, Inc. | Medical implant deployment tool |
| US7204168B2 (en) | 2004-02-25 | 2007-04-17 | The University Of Manitoba | Hand controller and wrist device |
| US8052636B2 (en) | 2004-03-05 | 2011-11-08 | Hansen Medical, Inc. | Robotic catheter system and methods |
| WO2005087128A1 (en) | 2004-03-05 | 2005-09-22 | Hansen Medical, Inc. | Robotic catheter system |
| DE102004020465B3 (en) | 2004-04-26 | 2005-09-01 | Aumann Gmbh | Wire tension regulator for winding machine has braking wheel which may be driven by electric motor and braked by disk brake applied by moving coil actuator |
| US7974674B2 (en) | 2004-05-28 | 2011-07-05 | St. Jude Medical, Atrial Fibrillation Division, Inc. | Robotic surgical system and method for surface modeling |
| IL162318A (en) | 2004-06-03 | 2011-07-31 | Tal Wenderow | Transmission for a remote catheterization system |
| US8005537B2 (en) | 2004-07-19 | 2011-08-23 | Hansen Medical, Inc. | Robotically controlled intravascular tissue injection system |
| US7314097B2 (en) | 2005-02-24 | 2008-01-01 | Black & Decker Inc. | Hammer drill with a mode changeover mechanism |
| US20060237205A1 (en) | 2005-04-21 | 2006-10-26 | Eastway Fair Company Limited | Mode selector mechanism for an impact driver |
| US7789874B2 (en) | 2005-05-03 | 2010-09-07 | Hansen Medical, Inc. | Support assembly for robotic catheter system |
| US8104479B2 (en) | 2005-06-23 | 2012-01-31 | Volcano Corporation | Pleated bag for interventional pullback systems |
| US20070005002A1 (en) | 2005-06-30 | 2007-01-04 | Intuitive Surgical Inc. | Robotic surgical instruments for irrigation, aspiration, and blowing |
| JP2009500086A (en) | 2005-07-01 | 2009-01-08 | ハンセン メディカル,インク. | Robotic guide catheter system |
| JP4763420B2 (en) | 2005-10-27 | 2011-08-31 | オリンパスメディカルシステムズ株式会社 | Endoscope operation assistance device |
| US20070149946A1 (en) | 2005-12-07 | 2007-06-28 | Viswanathan Raju R | Advancer system for coaxial medical devices |
| EP1962711B1 (en) | 2005-12-20 | 2012-02-29 | Intuitive Surgical Operations, Inc. | Instrument interface of a robotic surgical system |
| US9266239B2 (en) | 2005-12-27 | 2016-02-23 | Intuitive Surgical Operations, Inc. | Constraint based control in a minimally invasive surgical apparatus |
| JP4789000B2 (en) | 2006-02-16 | 2011-10-05 | Smc株式会社 | Automatic reduction ratio switching device |
| US9675375B2 (en) | 2006-03-29 | 2017-06-13 | Ethicon Llc | Ultrasonic surgical system and method |
| EP2177174B1 (en) | 2006-05-17 | 2013-07-24 | Hansen Medical, Inc. | Robotic instrument system |
| KR101477738B1 (en) | 2006-06-13 | 2014-12-31 | 인튜어티브 서지컬 인코포레이티드 | Minimally invasive surgical system |
| KR20090051029A (en) | 2006-06-14 | 2009-05-20 | 맥도널드 디트윌러 앤드 어소시에이츠 인코포레이티드 | Surgical manipulator with right-angle pulley drive mechanisms |
| US8303449B2 (en) | 2006-08-01 | 2012-11-06 | Techtronic Power Tools Technology Limited | Automatic transmission for a power tool |
| JP4755047B2 (en) | 2006-08-08 | 2011-08-24 | テルモ株式会社 | Working mechanism and manipulator |
| US7699809B2 (en) | 2006-12-14 | 2010-04-20 | Urmey William F | Catheter positioning system |
| US20080262480A1 (en) | 2007-02-15 | 2008-10-23 | Stahler Gregory J | Instrument assembly for robotic instrument system |
| US20080214925A1 (en) | 2007-03-01 | 2008-09-04 | Civco Medical Instruments Co., Inc. | Device for precision positioning of instruments at a mri scanner |
| US7695154B2 (en) | 2007-04-05 | 2010-04-13 | Dpm Associates, Llc | Illuminating footwear accessory |
| US20080262301A1 (en) | 2007-04-20 | 2008-10-23 | Wilson-Cook Medical Inc. | Steerable overtube |
| US8364312B2 (en) | 2007-06-06 | 2013-01-29 | Cycogs, Llc | Modular rotary multi-sensor sensor ring |
| US9301807B2 (en) | 2007-06-13 | 2016-04-05 | Intuitive Surgical Operations, Inc. | Surgical system counterbalance |
| US20090082722A1 (en) | 2007-08-21 | 2009-03-26 | Munger Gareth T | Remote navigation advancer devices and methods of use |
| US7998020B2 (en) | 2007-08-21 | 2011-08-16 | Stereotaxis, Inc. | Apparatus for selectively rotating and/or advancing an elongate device |
| AU2008291475B2 (en) | 2007-08-28 | 2014-02-06 | Marel A/S | Gripping device, for example for a robot |
| JP2009139187A (en) | 2007-12-05 | 2009-06-25 | Sumitomo Heavy Ind Ltd | Torque measuring device |
| US8473031B2 (en) | 2007-12-26 | 2013-06-25 | Intuitive Surgical Operations, Inc. | Medical robotic system with functionality to determine and display a distance indicated by movement of a tool robotically manipulated by an operator |
| US8708952B2 (en) | 2008-01-16 | 2014-04-29 | Catheter Robotics, Inc. | Remotely controlled catheter insertion system |
| US9179912B2 (en) | 2008-02-14 | 2015-11-10 | Ethicon Endo-Surgery, Inc. | Robotically-controlled motorized surgical cutting and fastening instrument |
| US8317745B2 (en) | 2008-03-27 | 2012-11-27 | St. Jude Medical, Atrial Fibrillation Division, Inc. | Robotic catheter rotatable device cartridge |
| US7938809B2 (en) | 2008-04-14 | 2011-05-10 | Merit Medical Systems, Inc. | Quick release hemostasis valve |
| EP3406291B8 (en) | 2008-05-06 | 2020-01-15 | Corindus, Inc. | Catheter system |
| US8006590B2 (en) | 2008-05-12 | 2011-08-30 | Longyear Tm, Inc. | Open-faced rod spinner |
| JP2010035768A (en) | 2008-08-04 | 2010-02-18 | Olympus Medical Systems Corp | Active drive type medical apparatus |
| JP2010046384A (en) | 2008-08-25 | 2010-03-04 | Terumo Corp | Medical manipulator and experimental device |
| US8390438B2 (en) | 2008-09-24 | 2013-03-05 | St. Jude Medical, Atrial Fibrillation Division, Inc. | Robotic catheter system including haptic feedback |
| US8720448B2 (en) | 2008-11-07 | 2014-05-13 | Hansen Medical, Inc. | Sterile interface apparatus |
| US8095223B2 (en) | 2008-11-26 | 2012-01-10 | B. Braun Medical, Inc. | Apparatus and method for inserting a catheter |
| US8602031B2 (en) | 2009-01-12 | 2013-12-10 | Hansen Medical, Inc. | Modular interfaces and drive actuation through barrier |
| ITBO20090004U1 (en) | 2009-02-11 | 2010-08-12 | Tre Esse Progettazione Biomedica S R L | ROBOTIC MANIPULATOR FOR DISTANCE MANEUVERING OF STEERABLE CATHETERS IN THE HUMAN CARDIOVASCULAR SYSTEM. |
| US8694129B2 (en) | 2009-02-13 | 2014-04-08 | Cardiac Pacemakers, Inc. | Deployable sensor platform on the lead system of an implantable device |
| CN102405022B (en) | 2009-03-14 | 2015-02-04 | 瓦索斯蒂奇股份有限公司 | Vessel access and closure device |
| EP2233103B1 (en) | 2009-03-26 | 2017-11-15 | W & H Dentalwerk Bürmoos GmbH | Medical, in particular dental handpiece |
| US10537713B2 (en) | 2009-05-25 | 2020-01-21 | Stereotaxis, Inc. | Remote manipulator device |
| WO2011005335A1 (en) | 2009-07-10 | 2011-01-13 | Tyco Healthcare Group Lp | Shaft constructions for medical devices with an articulating tip |
| US20110015484A1 (en) | 2009-07-16 | 2011-01-20 | Alvarez Jeffrey B | Endoscopic robotic catheter system |
| US20110015648A1 (en) | 2009-07-16 | 2011-01-20 | Hansen Medical, Inc. | Endoscopic robotic catheter system |
| US8277417B2 (en) | 2009-09-23 | 2012-10-02 | James J. Fedinec | Central venous catheter kit with line gripping and needle localizing devices |
| CN102612350B (en) | 2009-10-01 | 2015-11-25 | 马科外科公司 | Surgical systems used to place prosthetic components and/or limit movement of surgical tools |
| JP5770200B2 (en) | 2009-11-12 | 2015-08-26 | コーニンクレッカ フィリップス エヌ ヴェ | Steering system and catheter system |
| JP5750116B2 (en) | 2009-11-16 | 2015-07-15 | コーニンクレッカ フィリップス エヌ ヴェ | Human-Robot Shared Control for Endoscope-Assisted Robot |
| US8932211B2 (en) | 2012-06-22 | 2015-01-13 | Macroplata, Inc. | Floating, multi-lumen-catheter retractor system for a minimally-invasive, operative gastrointestinal treatment |
| DE102010031274B4 (en) | 2009-12-18 | 2023-06-22 | Robert Bosch Gmbh | Hand tool with gear cooling |
| US20110152880A1 (en) | 2009-12-23 | 2011-06-23 | Hansen Medical, Inc. | Flexible and steerable elongate instruments with torsion control |
| US8220688B2 (en) | 2009-12-24 | 2012-07-17 | Ethicon Endo-Surgery, Inc. | Motor-driven surgical cutting instrument with electric actuator directional control assembly |
| EP3659661B8 (en) | 2010-03-02 | 2025-02-12 | Siemens Healthineers Endovascular Robotics, Inc. | Robotic catheter system with variable drive mechanism |
| US9610133B2 (en) | 2010-03-16 | 2017-04-04 | Covidien Lp | Wireless laparoscopic camera |
| US9950139B2 (en) | 2010-05-14 | 2018-04-24 | C. R. Bard, Inc. | Catheter placement device including guidewire and catheter control elements |
| US8672837B2 (en) | 2010-06-24 | 2014-03-18 | Hansen Medical, Inc. | Methods and devices for controlling a shapeable medical device |
| US8961533B2 (en) | 2010-09-17 | 2015-02-24 | Hansen Medical, Inc. | Anti-buckling mechanisms and methods |
| EP2627278B1 (en) | 2010-10-11 | 2015-03-25 | Ecole Polytechnique Fédérale de Lausanne (EPFL) | Mechanical manipulator for surgical instruments |
| KR101894093B1 (en) | 2010-11-15 | 2018-08-31 | 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 | Decoupling instrument shaft roll and end effector actuation in a surgical instrument |
| DE102011003118A1 (en) | 2011-01-25 | 2012-07-26 | Krones Aktiengesellschaft | closing |
| DE102011011497A1 (en) | 2011-02-17 | 2012-08-23 | Kuka Roboter Gmbh | Surgical instrument |
| CA2833387A1 (en) | 2011-05-03 | 2012-11-08 | Shifamed Holdings, Llc | Steerable delivery sheaths |
| WO2013009252A2 (en) | 2011-07-11 | 2013-01-17 | Medical Vision Research & Development Ab | Status control for electrically powered surgical tool systems |
| JP5931497B2 (en) | 2011-08-04 | 2016-06-08 | オリンパス株式会社 | Surgery support apparatus and assembly method thereof |
| FR2979532B1 (en) | 2011-09-07 | 2015-02-20 | Robocath | Module and method for driving elongate flexible medical members, and associated robotic system |
| WO2013040498A1 (en) | 2011-09-16 | 2013-03-21 | Translucent Medical, Inc. | System and method for virtually tracking a surgical tool on a movable display |
| WO2013043804A1 (en) | 2011-09-20 | 2013-03-28 | Corindus, Inc. | Catheter force measurement apparatus and method |
| US9504604B2 (en) | 2011-12-16 | 2016-11-29 | Auris Surgical Robotics, Inc. | Lithotripsy eye treatment |
| US20140142591A1 (en) | 2012-04-24 | 2014-05-22 | Auris Surgical Robotics, Inc. | Method, apparatus and a system for robotic assisted surgery |
| US10383765B2 (en) | 2012-04-24 | 2019-08-20 | Auris Health, Inc. | Apparatus and method for a global coordinate system for use in robotic surgery |
| DE102012207060A1 (en) | 2012-04-27 | 2013-10-31 | Deutsches Zentrum für Luft- und Raumfahrt e.V. | Robot assembly for use in medical fields |
| US20130317519A1 (en) | 2012-05-25 | 2013-11-28 | Hansen Medical, Inc. | Low friction instrument driver interface for robotic systems |
| JP2014004310A (en) | 2012-05-31 | 2014-01-16 | Canon Inc | Medical instrument |
| US9072536B2 (en) | 2012-06-28 | 2015-07-07 | Ethicon Endo-Surgery, Inc. | Differential locking arrangements for rotary powered surgical instruments |
| US8671817B1 (en) | 2012-11-28 | 2014-03-18 | Hansen Medical, Inc. | Braiding device for catheter having acuately varying pullwires |
| JP2014134530A (en) * | 2012-12-14 | 2014-07-24 | Panasonic Corp | Force measurement device, force measurement method, force measurement program, force measurement integrated electronic circuit and master-slave device |
| US10231867B2 (en) | 2013-01-18 | 2019-03-19 | Auris Health, Inc. | Method, apparatus and system for a water jet |
| DE102013002813B4 (en) | 2013-02-19 | 2017-11-09 | Rg Mechatronics Gmbh | Holding device with at least one jaw for a robotic surgical system |
| DE102013002818A1 (en) | 2013-02-19 | 2014-08-21 | Rg Mechatronics Gmbh | Holding device for a surgical instrument and a lock and method for operating a robot with such a holding device |
| US9668814B2 (en) | 2013-03-07 | 2017-06-06 | Hansen Medical, Inc. | Infinitely rotatable tool with finite rotating drive shafts |
| US10080576B2 (en) | 2013-03-08 | 2018-09-25 | Auris Health, Inc. | Method, apparatus, and a system for facilitating bending of an instrument in a surgical or medical robotic environment |
| US9867635B2 (en) | 2013-03-08 | 2018-01-16 | Auris Surgical Robotics, Inc. | Method, apparatus and system for a water jet |
| US10149720B2 (en) | 2013-03-08 | 2018-12-11 | Auris Health, Inc. | Method, apparatus, and a system for facilitating bending of an instrument in a surgical or medical robotic environment |
| US20140276389A1 (en) | 2013-03-13 | 2014-09-18 | Sean Walker | Selective grip device for drive mechanism |
| US20140277334A1 (en) | 2013-03-14 | 2014-09-18 | Hansen Medical, Inc. | Active drives for robotic catheter manipulators |
| US11213363B2 (en) | 2013-03-14 | 2022-01-04 | Auris Health, Inc. | Catheter tension sensing |
| US9173713B2 (en) | 2013-03-14 | 2015-11-03 | Hansen Medical, Inc. | Torque-based catheter articulation |
| US9326822B2 (en) | 2013-03-14 | 2016-05-03 | Hansen Medical, Inc. | Active drives for robotic catheter manipulators |
| US9498601B2 (en) | 2013-03-14 | 2016-11-22 | Hansen Medical, Inc. | Catheter tension sensing |
| US20140276936A1 (en) | 2013-03-15 | 2014-09-18 | Hansen Medical, Inc. | Active drive mechanism for simultaneous rotation and translation |
| US9452018B2 (en) | 2013-03-15 | 2016-09-27 | Hansen Medical, Inc. | Rotational support for an elongate member |
| US20140276394A1 (en) | 2013-03-15 | 2014-09-18 | Hansen Medical, Inc. | Input device for controlling a catheter |
| US20140276647A1 (en) | 2013-03-15 | 2014-09-18 | Hansen Medical, Inc. | Vascular remote catheter manipulator |
| US9408669B2 (en) | 2013-03-15 | 2016-08-09 | Hansen Medical, Inc. | Active drive mechanism with finite range of motion |
| US11020016B2 (en) | 2013-05-30 | 2021-06-01 | Auris Health, Inc. | System and method for displaying anatomy and devices on a movable display |
| US10744035B2 (en) | 2013-06-11 | 2020-08-18 | Auris Health, Inc. | Methods for robotic assisted cataract surgery |
| JP6037964B2 (en) | 2013-07-26 | 2016-12-07 | オリンパス株式会社 | Manipulator system |
| US10426661B2 (en) | 2013-08-13 | 2019-10-01 | Auris Health, Inc. | Method and apparatus for laser assisted cataract surgery |
| US9713509B2 (en) | 2013-10-24 | 2017-07-25 | Auris Surgical Robotics, Inc. | Instrument device manipulator with back-mounted tool attachment mechanism |
| US9763741B2 (en) | 2013-10-24 | 2017-09-19 | Auris Surgical Robotics, Inc. | System for robotic-assisted endolumenal surgery and related methods |
| CN103735313B (en) | 2013-12-11 | 2016-08-17 | 中国科学院深圳先进技术研究院 | Surgical robot and status monitoring method thereof |
| US9539020B2 (en) | 2013-12-27 | 2017-01-10 | Ethicon Endo-Surgery, Llc | Coupling features for ultrasonic surgical instrument |
| US9987094B2 (en) | 2014-02-07 | 2018-06-05 | Covidien Lp | Input device assemblies for robotic surgical systems |
| JP6664331B2 (en) | 2014-02-21 | 2020-03-13 | インテュイティブ サージカル オペレーションズ, インコーポレイテッド | Mechanical joints and related systems and methods |
| US10569052B2 (en) | 2014-05-15 | 2020-02-25 | Auris Health, Inc. | Anti-buckling mechanisms for catheters |
| US9561083B2 (en) | 2014-07-01 | 2017-02-07 | Auris Surgical Robotics, Inc. | Articulating flexible endoscopic tool with roll capabilities |
| US20170007337A1 (en) | 2014-07-01 | 2017-01-12 | Auris Surgical Robotics, Inc. | Driver-mounted torque sensing mechanism |
| US20160270865A1 (en) | 2014-07-01 | 2016-09-22 | Auris Surgical Robotics, Inc. | Reusable catheter with disposable balloon attachment and tapered tip |
| US9788910B2 (en) | 2014-07-01 | 2017-10-17 | Auris Surgical Robotics, Inc. | Instrument-mounted tension sensing mechanism for robotically-driven medical instruments |
| US10792464B2 (en) | 2014-07-01 | 2020-10-06 | Auris Health, Inc. | Tool and method for using surgical endoscope with spiral lumens |
| US10159533B2 (en) | 2014-07-01 | 2018-12-25 | Auris Health, Inc. | Surgical system with configurable rail-mounted mechanical arms |
| US9744335B2 (en) | 2014-07-01 | 2017-08-29 | Auris Surgical Robotics, Inc. | Apparatuses and methods for monitoring tendons of steerable catheters |
| JP6689832B2 (en) | 2014-09-30 | 2020-04-28 | オーリス ヘルス インコーポレイテッド | Configurable robotic surgical system with virtual rail and flexible endoscope |
| US10314463B2 (en) | 2014-10-24 | 2019-06-11 | Auris Health, Inc. | Automated endoscope calibration |
| DE102014222293A1 (en) | 2014-10-31 | 2016-05-19 | Siemens Aktiengesellschaft | Method for automatically monitoring the penetration behavior of a trocar held by a robot arm and monitoring system |
| US9949719B2 (en) * | 2014-12-16 | 2018-04-24 | General Electric Company | Breast imaging method and system |
| JP6342794B2 (en) | 2014-12-25 | 2018-06-13 | 新光電気工業株式会社 | Wiring board and method of manufacturing wiring board |
| JP6733660B2 (en) * | 2015-03-25 | 2020-08-05 | ソニー株式会社 | Medical support arm device |
| US20160287279A1 (en) | 2015-04-01 | 2016-10-06 | Auris Surgical Robotics, Inc. | Microsurgical tool for robotic applications |
| WO2016164824A1 (en) | 2015-04-09 | 2016-10-13 | Auris Surgical Robotics, Inc. | Surgical system with configurable rail-mounted mechanical arms |
| US9622827B2 (en) | 2015-05-15 | 2017-04-18 | Auris Surgical Robotics, Inc. | Surgical robotics system |
| JP6157792B2 (en) | 2015-06-01 | 2017-07-05 | オリンパス株式会社 | Medical manipulator |
| KR102569960B1 (en) | 2015-09-09 | 2023-08-24 | 아우리스 헬스, 인크. | Instrument device manipulator for a surgical robotics system |
| KR102661990B1 (en) | 2015-09-18 | 2024-05-02 | 아우리스 헬스, 인크. | Exploration of tubular networks |
| US10441371B2 (en) | 2015-10-02 | 2019-10-15 | Vanderbilt University | Concentric tube robot |
| US10639108B2 (en) | 2015-10-30 | 2020-05-05 | Auris Health, Inc. | Process for percutaneous operations |
| US9955986B2 (en) | 2015-10-30 | 2018-05-01 | Auris Surgical Robotics, Inc. | Basket apparatus |
| US9949749B2 (en) | 2015-10-30 | 2018-04-24 | Auris Surgical Robotics, Inc. | Object capture with a basket |
| CN113303915B (en) * | 2015-11-12 | 2024-04-12 | 柯惠Lp公司 | Robotic surgical system and method of monitoring applied force |
| CN105559850B (en) | 2015-12-17 | 2017-08-25 | 天津工业大学 | Surgical drill apparatus with force-sensing function for robot-assisted surgery |
| US10932861B2 (en) | 2016-01-14 | 2021-03-02 | Auris Health, Inc. | Electromagnetic tracking surgical system and method of controlling the same |
| US10932691B2 (en) | 2016-01-26 | 2021-03-02 | Auris Health, Inc. | Surgical tools having electromagnetic tracking components |
| US20200281665A1 (en) | 2016-03-04 | 2020-09-10 | Covidien Lp | Electromechanical surgical systems and robotic surgical instruments thereof |
| US11324554B2 (en) | 2016-04-08 | 2022-05-10 | Auris Health, Inc. | Floating electromagnetic field generator system and method of controlling the same |
| US10454347B2 (en) | 2016-04-29 | 2019-10-22 | Auris Health, Inc. | Compact height torque sensing articulation axis assembly |
| US11037464B2 (en) | 2016-07-21 | 2021-06-15 | Auris Health, Inc. | System with emulator movement tracking for controlling medical devices |
| US10398517B2 (en) * | 2016-08-16 | 2019-09-03 | Ethicon Llc | Surgical tool positioning based on sensed parameters |
| US11241559B2 (en) | 2016-08-29 | 2022-02-08 | Auris Health, Inc. | Active drive for guidewire manipulation |
| EP3506836B1 (en) | 2016-08-31 | 2024-10-02 | Auris Health, Inc. | Length conservative surgical instrument |
| US9931025B1 (en) | 2016-09-30 | 2018-04-03 | Auris Surgical Robotics, Inc. | Automated calibration of endoscopes with pull wires |
| US10286556B2 (en) | 2016-10-16 | 2019-05-14 | The Boeing Company | Method and apparatus for compliant robotic end-effector |
| US10244926B2 (en) | 2016-12-28 | 2019-04-02 | Auris Health, Inc. | Detecting endolumenal buckling of flexible instruments |
| US10136959B2 (en) | 2016-12-28 | 2018-11-27 | Auris Health, Inc. | Endolumenal object sizing |
| US10543048B2 (en) | 2016-12-28 | 2020-01-28 | Auris Health, Inc. | Flexible instrument insertion using an adaptive insertion force threshold |
| US10792466B2 (en) | 2017-03-28 | 2020-10-06 | Auris Health, Inc. | Shaft actuating handle |
| JP7282685B2 (en) | 2017-03-31 | 2023-05-29 | オーリス ヘルス インコーポレイテッド | A robotic system for navigation of luminal networks with compensation for physiological noise |
| JP7314052B2 (en) | 2017-04-07 | 2023-07-25 | オーリス ヘルス インコーポレイテッド | Patient introducer alignment |
| US10285574B2 (en) | 2017-04-07 | 2019-05-14 | Auris Health, Inc. | Superelastic medical instrument |
| EP3621520B1 (en) | 2017-05-12 | 2025-09-24 | Auris Health, Inc. | Biopsy apparatus and system |
| EP3624668A4 (en) | 2017-05-17 | 2021-05-26 | Auris Health, Inc. | Exchangeable work channel |
| US10022192B1 (en) | 2017-06-23 | 2018-07-17 | Auris Health, Inc. | Automatically-initialized robotic systems for navigation of luminal networks |
| EP3644885B1 (en) | 2017-06-28 | 2023-10-11 | Auris Health, Inc. | Electromagnetic field generator alignment |
| EP4437999A3 (en) | 2017-06-28 | 2024-12-04 | Auris Health, Inc. | Instrument insertion compensation |
| US11026758B2 (en) | 2017-06-28 | 2021-06-08 | Auris Health, Inc. | Medical robotics systems implementing axis constraints during actuation of one or more motorized joints |
| CN118121324A (en) | 2017-06-28 | 2024-06-04 | 奥瑞斯健康公司 | System for detecting electromagnetic distortion |
| US10426559B2 (en) | 2017-06-30 | 2019-10-01 | Auris Health, Inc. | Systems and methods for medical instrument compression compensation |
| US10464209B2 (en) | 2017-10-05 | 2019-11-05 | Auris Health, Inc. | Robotic system with indication of boundary for robotic arm |
| US10145747B1 (en) | 2017-10-10 | 2018-12-04 | Auris Health, Inc. | Detection of undesirable forces on a surgical robotic arm |
| US10016900B1 (en) | 2017-10-10 | 2018-07-10 | Auris Health, Inc. | Surgical robotic arm admittance control |
| US11058493B2 (en) | 2017-10-13 | 2021-07-13 | Auris Health, Inc. | Robotic system configured for navigation path tracing |
| US10555778B2 (en) | 2017-10-13 | 2020-02-11 | Auris Health, Inc. | Image-based branch detection and mapping for navigation |
| JP7362610B2 (en) | 2017-12-06 | 2023-10-17 | オーリス ヘルス インコーポレイテッド | System and method for correcting uncommanded instrument rotation |
| EP3684281B1 (en) | 2017-12-08 | 2025-03-12 | Auris Health, Inc. | System for medical instrument navigation and targeting |
| WO2019113389A1 (en) | 2017-12-08 | 2019-06-13 | Auris Health, Inc. | Directed fluidics |
| US10470830B2 (en) | 2017-12-11 | 2019-11-12 | Auris Health, Inc. | Systems and methods for instrument based insertion architectures |
| CN110869173B (en) | 2017-12-14 | 2023-11-17 | 奥瑞斯健康公司 | System and method for estimating instrument positioning |
| EP3684283A4 (en) | 2017-12-18 | 2021-07-14 | Auris Health, Inc. | Methods and systems for monitoring and navigation of instruments in luminal networks |
| WO2019143458A1 (en) | 2018-01-17 | 2019-07-25 | Auris Health, Inc. | Surgical robotics systems with improved robotic arms |
| KR102264368B1 (en) | 2018-01-17 | 2021-06-17 | 아우리스 헬스, 인코포레이티드 | Surgical platform with adjustable arm support |
| JP7301884B2 (en) | 2018-02-13 | 2023-07-03 | オーリス ヘルス インコーポレイテッド | Systems and methods for driving medical instruments |
2016
- 2016-12-28: US US15/392,868, issued as US10543048B2 (en), status: Active

2020
- 2020-01-27: US US16/773,740, published as US20200268459A1 (en), status: Abandoned
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20090137952A1 (en) * | 2007-08-14 | 2009-05-28 | Ramamurthy Bhaskar S | Robotic instrument systems and methods utilizing optical fiber sensor |
| US20110208000A1 (en) * | 2009-06-23 | 2011-08-25 | Olympus Medical Systems Corp. | Medical system |
| US20110319815A1 (en) * | 2010-06-24 | 2011-12-29 | Hansen Medical, Inc. | Fiber optic instrument sensing system |
| US20150297864A1 (en) * | 2014-04-21 | 2015-10-22 | Hansen Medical, Inc. | Devices, systems, and methods for controlling active drive systems |
| US20170281049A1 (en) * | 2014-12-19 | 2017-10-05 | Olympus Corporation | Insertion/removal supporting apparatus and insertion/removal supporting method |
Cited By (190)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US11883121B2 (en) | 2004-03-05 | 2024-01-30 | Auris Health, Inc. | Robotic catheter system |
| US12251176B2 (en) | 2005-07-01 | 2025-03-18 | Auris Health, Inc. | Robotic catheter system and methods |
| US11464586B2 (en) | 2009-04-29 | 2022-10-11 | Auris Health, Inc. | Flexible and steerable elongate instruments with shape control and support elements |
| US11051681B2 (en) | 2010-06-24 | 2021-07-06 | Auris Health, Inc. | Methods and devices for controlling a shapeable medical device |
| US11857156B2 (en) | 2010-06-24 | 2024-01-02 | Auris Health, Inc. | Methods and devices for controlling a shapeable medical device |
| US12310669B2 (en) | 2010-09-17 | 2025-05-27 | Auris Health, Inc. | Systems and methods for positioning an elongate member inside a body |
| US11213356B2 (en) | 2010-09-17 | 2022-01-04 | Auris Health, Inc. | Systems and methods for positioning an elongate member inside a body |
| US11723636B2 (en) | 2013-03-08 | 2023-08-15 | Auris Health, Inc. | Method, apparatus, and system for facilitating bending of an instrument in a surgical or medical robotic environment |
| US11241203B2 (en) | 2013-03-13 | 2022-02-08 | Auris Health, Inc. | Reducing measurement sensor error |
| US12156755B2 (en) | 2013-03-13 | 2024-12-03 | Auris Health, Inc. | Reducing measurement sensor error |
| US12420063B2 (en) | 2013-03-14 | 2025-09-23 | Auris Health, Inc. | Torque-based catheter articulation |
| US11213363B2 (en) | 2013-03-14 | 2022-01-04 | Auris Health, Inc. | Catheter tension sensing |
| US11517717B2 (en) | 2013-03-14 | 2022-12-06 | Auris Health, Inc. | Active drives for robotic catheter manipulators |
| US11452844B2 (en) | 2013-03-14 | 2022-09-27 | Auris Health, Inc. | Torque-based catheter articulation |
| US11376085B2 (en) | 2013-03-15 | 2022-07-05 | Auris Health, Inc. | Remote catheter manipulator |
| US11007021B2 (en) | 2013-03-15 | 2021-05-18 | Auris Health, Inc. | User interface for active drive apparatus with finite range of motion |
| US11969157B2 (en) | 2013-03-15 | 2024-04-30 | Auris Health, Inc. | Systems and methods for tracking robotically controlled medical instruments |
| US12232711B2 (en) | 2013-03-15 | 2025-02-25 | Auris Health, Inc. | Systems and methods for tracking robotically controlled medical instruments |
| US11426095B2 (en) | 2013-03-15 | 2022-08-30 | Auris Health, Inc. | Flexible instrument localization from both remote and elongation sensors |
| US11413428B2 (en) | 2013-03-15 | 2022-08-16 | Auris Health, Inc. | Catheter insertion system and method of fabrication |
| US11504195B2 (en) | 2013-03-15 | 2022-11-22 | Auris Health, Inc. | Active drive mechanism for simultaneous rotation and translation |
| US11660153B2 (en) | 2013-03-15 | 2023-05-30 | Auris Health, Inc. | Active drive mechanism with finite range of motion |
| US12114943B2 (en) | 2013-03-15 | 2024-10-15 | Auris Health, Inc. | Remote catheter manipulator |
| US12089912B2 (en) | 2013-03-15 | 2024-09-17 | Auris Health, Inc. | User input devices for controlling manipulation of guidewires and catheters |
| US11129602B2 (en) | 2013-03-15 | 2021-09-28 | Auris Health, Inc. | Systems and methods for tracking robotically controlled medical instruments |
| US11020016B2 (en) | 2013-05-30 | 2021-06-01 | Auris Health, Inc. | System and method for displaying anatomy and devices on a movable display |
| US11974948B2 (en) | 2013-06-11 | 2024-05-07 | Auris Health, Inc. | Method, apparatus, and a system for robotic assisted surgery |
| US11642242B2 (en) | 2013-08-13 | 2023-05-09 | Auris Health, Inc. | Method and apparatus for light energy assisted surgery |
| US11278703B2 (en) | 2014-04-21 | 2022-03-22 | Auris Health, Inc. | Devices, systems, and methods for controlling active drive systems |
| US11690977B2 (en) | 2014-05-15 | 2023-07-04 | Auris Health, Inc. | Anti-buckling mechanisms for catheters |
| US12343483B2 (en) | 2014-05-15 | 2025-07-01 | Auris Health, Inc. | Anti-buckling mechanisms for catheters |
| US11350998B2 (en) | 2014-07-01 | 2022-06-07 | Auris Health, Inc. | Medical instrument having translatable spool |
| US11511079B2 (en) | 2014-07-01 | 2022-11-29 | Auris Health, Inc. | Apparatuses and methods for monitoring tendons of steerable catheters |
| US11759605B2 (en) | 2014-07-01 | 2023-09-19 | Auris Health, Inc. | Tool and method for using surgical endoscope with spiral lumens |
| US12447308B2 (en) | 2014-07-01 | 2025-10-21 | Auris Health, Inc. | Multiple-pull-wire robotic instrument articulation |
| US12220189B2 (en) | 2014-10-09 | 2025-02-11 | Auris Health, Inc. | Systems and methods for aligning an elongate member with an access site |
| US11344377B2 (en) | 2014-10-09 | 2022-05-31 | Auris Health, Inc. | Systems and methods for aligning an elongate member with an access site |
| US11819636B2 (en) | 2015-03-30 | 2023-11-21 | Auris Health, Inc. | Endoscope pull wire electrical circuit |
| US11723730B2 (en) | 2015-04-01 | 2023-08-15 | Auris Health, Inc. | Microsurgical tool for robotic applications |
| US11141048B2 (en) | 2015-06-26 | 2021-10-12 | Auris Health, Inc. | Automated endoscope calibration |
| US12075974B2 (en) | 2015-06-26 | 2024-09-03 | Auris Health, Inc. | Instrument calibration |
| US11771521B2 (en) | 2015-09-09 | 2023-10-03 | Auris Health, Inc. | Instrument device manipulator with roll mechanism |
| US11403759B2 (en) | 2015-09-18 | 2022-08-02 | Auris Health, Inc. | Navigation of tubular networks |
| US12089804B2 (en) | 2015-09-18 | 2024-09-17 | Auris Health, Inc. | Navigation of tubular networks |
| US11382650B2 (en) | 2015-10-30 | 2022-07-12 | Auris Health, Inc. | Object capture with a basket |
| US11559360B2 (en) | 2015-10-30 | 2023-01-24 | Auris Health, Inc. | Object removal through a percutaneous suction tube |
| US12433696B2 (en) | 2015-10-30 | 2025-10-07 | Auris Health, Inc. | Tool positioning for medical instruments with working channels |
| US11571229B2 (en) | 2015-10-30 | 2023-02-07 | Auris Health, Inc. | Basket apparatus |
| US11534249B2 (en) | 2015-10-30 | 2022-12-27 | Auris Health, Inc. | Process for percutaneous operations |
| US11464591B2 (en) | 2015-11-30 | 2022-10-11 | Auris Health, Inc. | Robot-assisted driving systems and methods |
| US10932861B2 (en) | 2016-01-14 | 2021-03-02 | Auris Health, Inc. | Electromagnetic tracking surgical system and method of controlling the same |
| US11911113B2 (en) | 2016-01-14 | 2024-02-27 | Auris Health, Inc. | Electromagnetic tracking surgical system and method of controlling the same |
| US12064229B2 (en) | 2016-01-26 | 2024-08-20 | Auris Health, Inc. | Surgical tools having electromagnetic tracking components |
| US10932691B2 (en) | 2016-01-26 | 2021-03-02 | Auris Health, Inc. | Surgical tools having electromagnetic tracking components |
| US11324554B2 (en) | 2016-04-08 | 2022-05-10 | Auris Health, Inc. | Floating electromagnetic field generator system and method of controlling the same |
| US12310673B2 (en) | 2016-04-08 | 2025-05-27 | Auris Health, Inc. | Floating electromagnetic field generator system and method of controlling the same |
| US10903725B2 (en) | 2016-04-29 | 2021-01-26 | Auris Health, Inc. | Compact height torque sensing articulation axis assembly |
| US11037464B2 (en) | 2016-07-21 | 2021-06-15 | Auris Health, Inc. | System with emulator movement tracking for controlling medical devices |
| US11676511B2 (en) | 2016-07-21 | 2023-06-13 | Auris Health, Inc. | System with emulator movement tracking for controlling medical devices |
| US12295692B2 (en) | 2016-08-26 | 2025-05-13 | Auris Health, Inc. | Steerable catheter with shaft load distributions |
| US11701192B2 (en) | 2016-08-26 | 2023-07-18 | Auris Health, Inc. | Steerable catheter with shaft load distributions |
| US11564759B2 (en) | 2016-08-31 | 2023-01-31 | Auris Health, Inc. | Length conservative surgical instrument |
| US11712154B2 (en) | 2016-09-30 | 2023-08-01 | Auris Health, Inc. | Automated calibration of surgical instruments with pull wires |
| US12290239B2 (en) | 2016-09-30 | 2025-05-06 | Auris Health, Inc. | Automated calibration of surgical instruments with pull wires |
| US11911011B2 (en) | 2016-12-28 | 2024-02-27 | Auris Health, Inc. | Endolumenal object sizing |
| US11337602B2 (en) | 2016-12-28 | 2022-05-24 | Auris Health, Inc. | Endolumenal object sizing |
| US11771309B2 (en) | 2016-12-28 | 2023-10-03 | Auris Health, Inc. | Detecting endolumenal buckling of flexible instruments |
| US11490782B2 (en) | 2017-03-31 | 2022-11-08 | Auris Health, Inc. | Robotic systems for navigation of luminal networks that compensate for physiological noise |
| US12053144B2 (en) | 2017-03-31 | 2024-08-06 | Auris Health, Inc. | Robotic systems for navigation of luminal networks that compensate for physiological noise |
| US11529129B2 (en) | 2017-05-12 | 2022-12-20 | Auris Health, Inc. | Biopsy apparatus and system |
| US11759266B2 (en) | 2017-06-23 | 2023-09-19 | Auris Health, Inc. | Robotic systems for determining a roll of a medical device in luminal networks |
| US12295672B2 (en) | 2017-06-23 | 2025-05-13 | Auris Health, Inc. | Robotic systems for determining a roll of a medical device in luminal networks |
| US11278357B2 (en) | 2017-06-23 | 2022-03-22 | Auris Health, Inc. | Robotic systems for determining an angular degree of freedom of a medical device in luminal networks |
| US11832907B2 (en) | 2017-06-28 | 2023-12-05 | Auris Health, Inc. | Medical robotics systems implementing axis constraints during actuation of one or more motorized joints |
| US11026758B2 (en) | 2017-06-28 | 2021-06-08 | Auris Health, Inc. | Medical robotics systems implementing axis constraints during actuation of one or more motorized joints |
| US12076098B2 (en) | 2017-06-30 | 2024-09-03 | Auris Health, Inc. | Systems and methods for medical instrument compression compensation |
| US11666393B2 (en) | 2017-06-30 | 2023-06-06 | Auris Health, Inc. | Systems and methods for medical instrument compression compensation |
| US11472030B2 (en) | 2017-10-05 | 2022-10-18 | Auris Health, Inc. | Robotic system with indication of boundary for robotic arm |
| US12145278B2 (en) | 2017-10-05 | 2024-11-19 | Auris Health, Inc. | Robotic system with indication of boundary for robotic arm |
| US11796410B2 (en) | 2017-10-10 | 2023-10-24 | Auris Health, Inc. | Robotic manipulator force determination |
| US11701783B2 (en) | 2017-10-10 | 2023-07-18 | Auris Health, Inc. | Surgical robotic arm admittance control |
| US11280690B2 (en) | 2017-10-10 | 2022-03-22 | Auris Health, Inc. | Detection of undesirable forces on a robotic manipulator |
| US11850008B2 (en) | 2017-10-13 | 2023-12-26 | Auris Health, Inc. | Image-based branch detection and mapping for navigation |
| US11957446B2 (en) | 2017-12-08 | 2024-04-16 | Auris Health, Inc. | System and method for medical instrument navigation and targeting |
| US11510736B2 (en) | 2017-12-14 | 2022-11-29 | Auris Health, Inc. | System and method for estimating instrument location |
| US11160615B2 (en) | 2017-12-18 | 2021-11-02 | Auris Health, Inc. | Methods and systems for instrument tracking and navigation within luminal networks |
| USD978941S1 (en) | 2018-01-17 | 2023-02-21 | Auris Health, Inc. | Robotic arm |
| USD1095845S1 (en) | 2018-01-17 | 2025-09-30 | Auris Health, Inc. | Instrument handle |
| USD932628S1 (en) | 2018-01-17 | 2021-10-05 | Auris Health, Inc. | Instrument cart |
| USD1021103S1 (en) | 2018-01-17 | 2024-04-02 | Auris Health, Inc. | Controller |
| US11744670B2 (en) | 2018-01-17 | 2023-09-05 | Auris Health, Inc. | Surgical platform with adjustable arm supports |
| US12310804B2 (en) | 2018-01-17 | 2025-05-27 | Auris Health, Inc. | Surgical platform with adjustable arm supports |
| USD1004782S1 (en) | 2018-01-17 | 2023-11-14 | Auris Health, Inc. | Instrument handle |
| USD924410S1 (en) | 2018-01-17 | 2021-07-06 | Auris Health, Inc. | Instrument tower |
| USD994890S1 (en) | 2018-01-17 | 2023-08-08 | Auris Health, Inc. | Instrument tower |
| USD1069125S1 (en) | 2018-01-17 | 2025-04-01 | Auris Health, Inc. | Instrument cart |
| USD1015541S1 (en) | 2018-01-17 | 2024-02-20 | Auris Health, Inc. | Instrument handle |
| USD1094724S1 (en) | 2018-01-17 | 2025-09-23 | Auris Health, Inc. | Set of instrument cart arms |
| USD1069126S1 (en) | 2018-01-17 | 2025-04-01 | Auris Health, Inc. | Instrument tower |
| USD991459S1 (en) | 2018-01-17 | 2023-07-04 | Auris Health, Inc. | Instrument cart element |
| US11712173B2 (en) | 2018-03-28 | 2023-08-01 | Auris Health, Inc. | Systems and methods for displaying estimated location of instrument |
| US11950898B2 (en) | 2018-03-28 | 2024-04-09 | Auris Health, Inc. | Systems and methods for displaying estimated location of instrument |
| US12396808B2 (en) | 2018-03-28 | 2025-08-26 | Auris Health, Inc. | Medical instruments with variable bending stiffness profiles |
| US11576730B2 (en) | 2018-03-28 | 2023-02-14 | Auris Health, Inc. | Systems and methods for registration of location sensors |
| US10898277B2 (en) | 2018-03-28 | 2021-01-26 | Auris Health, Inc. | Systems and methods for registration of location sensors |
| US12226168B2 (en) | 2018-03-28 | 2025-02-18 | Auris Health, Inc. | Systems and methods for registration of location sensors |
| US11109920B2 (en) | 2018-03-28 | 2021-09-07 | Auris Health, Inc. | Medical instruments with variable bending stiffness profiles |
| US11282251B2 (en) | 2018-05-02 | 2022-03-22 | Covidien Lp | System and method for constructing virtual radial ultrasound images from CT data and performing a surgical navigation procedure using virtual ultrasound images |
| US11179213B2 (en) | 2018-05-18 | 2021-11-23 | Auris Health, Inc. | Controllers for robotically-enabled teleoperated systems |
| US11918316B2 (en) | 2018-05-18 | 2024-03-05 | Auris Health, Inc. | Controllers for robotically enabled teleoperated systems |
| US12453612B2 (en) | 2018-05-18 | 2025-10-28 | Auris Health, Inc. | Controllers for robotically enabled teleoperated systems |
| US12364552B2 (en) | 2018-05-31 | 2025-07-22 | Auris Health, Inc. | Path-based navigation of tubular networks |
| US11503986B2 (en) | 2018-05-31 | 2022-11-22 | Auris Health, Inc. | Robotic systems and methods for navigation of luminal network that detect physiological noise |
| US11864850B2 (en) | 2018-05-31 | 2024-01-09 | Auris Health, Inc. | Path-based navigation of tubular networks |
| US10898286B2 (en) * | 2018-05-31 | 2021-01-26 | Auris Health, Inc. | Path-based navigation of tubular networks |
| US11759090B2 (en) | 2018-05-31 | 2023-09-19 | Auris Health, Inc. | Image-based airway analysis and mapping |
| US10898275B2 (en) | 2018-05-31 | 2021-01-26 | Auris Health, Inc. | Image-based airway analysis and mapping |
| US11826117B2 (en) | 2018-06-07 | 2023-11-28 | Auris Health, Inc. | Robotic medical systems with high force instruments |
| US12364557B2 (en) | 2018-06-27 | 2025-07-22 | Auris Health, Inc. | Alignment and attachment systems for medical instruments |
| US11399905B2 (en) | 2018-06-28 | 2022-08-02 | Auris Health, Inc. | Medical systems incorporating pulley sharing |
| US12285229B2 (en) | 2018-06-28 | 2025-04-29 | Auris Health, Inc. | Medical systems incorporating pulley sharing |
| US12390286B2 (en) | 2018-08-07 | 2025-08-19 | Auris Health, Inc. | Instrument shape determination |
| US11779400B2 (en) | 2018-08-07 | 2023-10-10 | Auris Health, Inc. | Combining strain-based shape sensing with catheter control |
| US11896335B2 (en) | 2018-08-15 | 2024-02-13 | Auris Health, Inc. | Medical instruments for tissue cauterization |
| US12114838B2 (en) | 2018-08-24 | 2024-10-15 | Auris Health, Inc. | Manually and robotically controllable medical instruments |
| US11197728B2 (en) | 2018-09-17 | 2021-12-14 | Auris Health, Inc. | Systems and methods for concomitant medical procedures |
| US11779421B2 (en) | 2018-09-26 | 2023-10-10 | Auris Health, Inc. | Articulating medical instruments |
| US11864849B2 (en) | 2018-09-26 | 2024-01-09 | Auris Health, Inc. | Systems and instruments for suction and irrigation |
| US11179212B2 (en) | 2018-09-26 | 2021-11-23 | Auris Health, Inc. | Articulating medical instruments |
| US12076100B2 (en) | 2018-09-28 | 2024-09-03 | Auris Health, Inc. | Robotic systems and methods for concomitant endoscopic and percutaneous medical procedures |
| US11497568B2 (en) | 2018-09-28 | 2022-11-15 | Auris Health, Inc. | Systems and methods for docking medical instruments |
| US12226175B2 (en) | 2018-09-28 | 2025-02-18 | Auris Health, Inc. | Systems and methods for docking medical instruments |
| US11864842B2 (en) | 2018-09-28 | 2024-01-09 | Auris Health, Inc. | Devices, systems, and methods for manually and robotically driving medical instruments |
| US12376926B2 (en) | 2018-10-08 | 2025-08-05 | Cilag GmbH International | Systems and instruments for tissue sealing |
| US11576738B2 (en) | 2018-10-08 | 2023-02-14 | Auris Health, Inc. | Systems and instruments for tissue sealing |
| US11801605B2 (en) | 2018-12-20 | 2023-10-31 | Auris Health, Inc. | Systems and methods for robotic arm alignment and docking |
| US11254009B2 (en) | 2018-12-20 | 2022-02-22 | Auris Health, Inc. | Systems and methods for robotic arm alignment and docking |
| US12157238B2 (en) | 2018-12-20 | 2024-12-03 | Auris Health, Inc. | Systems and methods for robotic arm alignment and docking |
| US11925332B2 (en) | 2018-12-28 | 2024-03-12 | Auris Health, Inc. | Percutaneous sheath for robotic medical systems and methods |
| US11986257B2 (en) | 2018-12-28 | 2024-05-21 | Auris Health, Inc. | Medical instrument with articulable segment |
| US11857277B2 (en) | 2019-02-08 | 2024-01-02 | Auris Health, Inc. | Robotically controlled clot manipulation and removal |
| US12472020B2 (en) | 2019-02-08 | 2025-11-18 | Auris Health, Inc. | Robotically controlled clot manipulation and removal |
| US11202683B2 (en) | 2019-02-22 | 2021-12-21 | Auris Health, Inc. | Surgical platform with motorized arms for adjustable arm supports |
| US12251178B2 (en) | 2019-02-22 | 2025-03-18 | Auris Health, Inc. | Surgical platform with motorized arms for adjustable arm supports |
| US12478444B2 (en) | 2019-03-21 | 2025-11-25 | The Board Of Trustees Of The Leland Stanford Junior University | Systems and methods for localization based on machine learning |
| US11617627B2 (en) | 2019-03-29 | 2023-04-04 | Auris Health, Inc. | Systems and methods for optical strain sensing in medical instruments |
| US12138003B2 (en) | 2019-06-25 | 2024-11-12 | Auris Health, Inc. | Medical instruments including wrists with hybrid redirect surfaces |
| US11369386B2 (en) | 2019-06-27 | 2022-06-28 | Auris Health, Inc. | Systems and methods for a medical clip applier |
| US11877754B2 (en) | 2019-06-27 | 2024-01-23 | Auris Health, Inc. | Systems and methods for a medical clip applier |
| US11957428B2 (en) | 2019-06-28 | 2024-04-16 | Auris Health, Inc. | Medical instruments including wrists with hybrid redirect surfaces |
| US11872007B2 (en) | 2019-06-28 | 2024-01-16 | Auris Health, Inc. | Console overlay and methods of using same |
| US12329485B2 (en) | 2019-06-28 | 2025-06-17 | Auris Health, Inc. | Console overlay and methods of using same |
| US11109928B2 (en) | 2019-06-28 | 2021-09-07 | Auris Health, Inc. | Medical instruments including wrists with hybrid redirect surfaces |
| US11896330B2 (en) | 2019-08-15 | 2024-02-13 | Auris Health, Inc. | Robotic medical system having multiple medical instruments |
| US11717147B2 (en) | 2019-08-15 | 2023-08-08 | Auris Health, Inc. | Medical device having multiple bending sections |
| US11944422B2 (en) | 2019-08-30 | 2024-04-02 | Auris Health, Inc. | Image reliability determination for instrument localization |
| US11207141B2 (en) | 2019-08-30 | 2021-12-28 | Auris Health, Inc. | Systems and methods for weight-based registration of location sensors |
| US11147633B2 (en) | 2019-08-30 | 2021-10-19 | Auris Health, Inc. | Instrument image reliability systems and methods |
| US11771510B2 (en) | 2019-09-10 | 2023-10-03 | Auris Health, Inc. | Systems and methods for kinematic optimization with shared robotic degrees-of-freedom |
| US11234780B2 (en) | 2019-09-10 | 2022-02-01 | Auris Health, Inc. | Systems and methods for kinematic optimization with shared robotic degrees-of-freedom |
| US12357405B2 (en) | 2019-09-10 | 2025-07-15 | Auris Health, Inc. | Systems and methods for kinematic optimization with shared robotic degrees-of-freedom |
| US12324645B2 (en) | 2019-09-26 | 2025-06-10 | Auris Health, Inc. | Systems and methods for collision avoidance using object models |
| US10959792B1 (en) | 2019-09-26 | 2021-03-30 | Auris Health, Inc. | Systems and methods for collision detection and avoidance |
| US11701187B2 (en) | 2019-09-26 | 2023-07-18 | Auris Health, Inc. | Systems and methods for collision detection and avoidance |
| US11737845B2 (en) | 2019-09-30 | 2023-08-29 | Auris Health, Inc. | Medical instrument with a capstan |
| US11737835B2 (en) | 2019-10-29 | 2023-08-29 | Auris Health, Inc. | Braid-reinforced insulation sheath |
| US12357409B2 (en) | 2019-11-21 | 2025-07-15 | Auris Health, Inc. | Systems and methods for draping a surgical system |
| US11602372B2 (en) | 2019-12-31 | 2023-03-14 | Auris Health, Inc. | Alignment interfaces for percutaneous access |
| US12414823B2 (en) | 2019-12-31 | 2025-09-16 | Auris Health, Inc. | Anatomical feature tracking |
| US12465431B2 (en) | 2019-12-31 | 2025-11-11 | Auris Health, Inc. | Alignment techniques for percutaneous access |
| US12220150B2 (en) | 2019-12-31 | 2025-02-11 | Auris Health, Inc. | Aligning medical instruments to access anatomy |
| US11298195B2 (en) | 2019-12-31 | 2022-04-12 | Auris Health, Inc. | Anatomical feature identification and targeting |
| US12318102B2 (en) | 2019-12-31 | 2025-06-03 | Auris Health, Inc. | Advanced basket drive mode |
| US11660147B2 (en) | 2019-12-31 | 2023-05-30 | Auris Health, Inc. | Alignment techniques for percutaneous access |
| US11439419B2 (en) | 2019-12-31 | 2022-09-13 | Auris Health, Inc. | Advanced basket drive mode |
| US11950872B2 (en) | 2019-12-31 | 2024-04-09 | Auris Health, Inc. | Dynamic pulley system |
| US12370002B2 (en) | 2020-03-30 | 2025-07-29 | Auris Health, Inc. | Workspace optimization for robotic surgery |
| US12414686B2 (en) | 2020-03-30 | 2025-09-16 | Auris Health, Inc. | Endoscopic anatomical feature tracking |
| US12208220B2 (en) | 2020-06-04 | 2025-01-28 | Covidien Lp | Active distal tip drive |
| US11701492B2 (en) | 2020-06-04 | 2023-07-18 | Covidien Lp | Active distal tip drive |
| US12311530B2 (en) | 2020-06-29 | 2025-05-27 | Auris Health, Inc. | Systems and methods for detecting contact between a link and an external object |
| US11839969B2 (en) | 2020-06-29 | 2023-12-12 | Auris Health, Inc. | Systems and methods for detecting contact between a link and an external object |
| US12268460B2 (en) | 2020-06-30 | 2025-04-08 | Auris Health, Inc. | Systems and methods for saturated robotic movement |
| US11357586B2 (en) | 2020-06-30 | 2022-06-14 | Auris Health, Inc. | Systems and methods for saturated robotic movement |
| US11931901B2 (en) | 2020-06-30 | 2024-03-19 | Auris Health, Inc. | Robotic medical system with collision proximity indicators |
| US12383352B2 (en) | 2020-08-13 | 2025-08-12 | Covidien Lp | Endoluminal robotic (ELR) systems and methods |
| US12256923B2 (en) | 2020-08-13 | 2025-03-25 | Covidien Lp | Endoluminal robotic systems and methods for suturing |
| US20220280168A1 (en) * | 2021-03-02 | 2022-09-08 | Mazor Robotics Ltd. | Systems and methods for cutting an anatomical element |
| US11432892B1 (en) * | 2021-03-02 | 2022-09-06 | Mazor Robotics Ltd. | Systems and methods for cutting an anatomical element |
| US12303220B2 (en) | 2022-01-26 | 2025-05-20 | Covidien Lp | Autonomous endobronchial access with an EM guided catheter |
Also Published As
| Publication number | Publication date |
|---|---|
| US20180177556A1 (en) | 2018-06-28 |
| US10543048B2 (en) | 2020-01-28 |
Similar Documents
| Publication | Title |
|---|---|
| US11771309B2 (en) | Detecting endolumenal buckling of flexible instruments |
| US20200268459A1 (en) | Flexible instrument insertion using an adaptive insertion force threshold |
| AU2017388217B2 (en) | Apparatus for flexible instrument insertion |
| US11759090B2 (en) | Image-based airway analysis and mapping |
| US20240065780A1 (en) | Vector-based luminal network branch mapping |
| US12075974B2 (en) | Instrument calibration |
| US20240164634A1 (en) | Endolumenal object sizing |
| US10813539B2 (en) | Automated calibration of surgical instruments with pull wires |
| US11944422B2 (en) | Image reliability determination for instrument localization |
| EP4312856A1 (en) | Vision-based 6DoF camera pose estimation in bronchoscopy |
Legal Events
| Code | Title | Description |
|---|---|---|
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| AS | Assignment | Owner name: AURIS HEALTH, INC., CALIFORNIA. Free format text: CHANGE OF NAME;ASSIGNOR:AURIS SURGICAL ROBOTICS, INC;REEL/FRAME:064502/0533. Effective date: 20180316 |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
| STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
| STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |