
WO2025074225A1 - Stereoscopic endoscope camera tool depth estimation and point cloud generation for patient anatomy positional registration during lung navigation - Google Patents

Stereoscopic endoscope camera tool depth estimation and point cloud generation for patient anatomy positional registration during lung navigation

Info

Publication number
WO2025074225A1
Authority
WO
WIPO (PCT)
Prior art keywords
camera
sensor
images
tool
body lumen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/IB2024/059544
Other languages
English (en)
Inventor
Vaclav Grym
Guy Alexandroni
Jonathan Scott Thomson
Daniel Ovadia
Shaked Pessah
Nathan J. Knutson
Darrick Chekas
Badr Elmaanaoui
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Covidien LP
Original Assignee
Covidien LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US18/896,859 (US20250107701A1)
Application filed by Covidien LP
Publication of WO2025074225A1
Legal status: Pending

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/05Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by the image sensor, e.g. camera, being in the distal end portion
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002Operational features of endoscopes
    • A61B1/00004Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B1/000096Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope using artificial intelligence
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00064Constructional details of the endoscope body
    • A61B1/00071Insertion part of the endoscope body
    • A61B1/0008Insertion part of the endoscope body characterised by distal tip features
    • A61B1/00097Sensors
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00163Optical arrangements
    • A61B1/00193Optical arrangements adapted for stereoscopic vision
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00163Optical arrangements
    • A61B1/00194Optical arrangements adapted for three-dimensional imaging
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/593Depth or shape recovery from multiple images from stereo images
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/267Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the respiratory tract, e.g. laryngoscopes, bronchoscopes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101Computer-aided simulation of surgical operations
    • A61B2034/105Modelling of the patient, e.g. for ligaments or bones
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10068Endoscopic image

Definitions

  • patient data may include X-ray data, computed tomography (CT) scan data, magnetic resonance imaging (MRI) data, or other imaging data that allows the clinician to view the internal anatomy of a patient.
  • CT: computed tomography
  • MRI: magnetic resonance imaging
  • the imaging data is also utilized to identify targets of interest and to develop strategies for accessing the targets of interest for surgical treatment. Further, the imaging data has been used to create a three-dimensional (3D) model of the patient’s body to guide navigation of the medical device to a target of interest within a patient’s body.
  • 3D: three-dimensional
  • the techniques of this disclosure generally relate to stereoscopic endoscope camera and sensor tools and methods of using the camera and sensor tools to estimate image depth and generate a point cloud for registering patient anatomical positional information to images from a different imaging modality during lung navigation.
  • the present disclosure provides a camera and sensor tool.
  • the camera and sensor tool includes a sensor pack at a distal end portion of the camera and sensor tool.
  • the sensor pack includes a structural member, a cable assembly, one or more cameras, an electromagnetic (EM) sensor assembly, an inertial measurement unit (IMU), an illumination source, and one or more lenses.
  • EM: electromagnetic
  • IMU: inertial measurement unit
  • the one or more cameras are coupled to the structural member and electrically coupled to the cable assembly.
  • the electromagnetic (EM) sensor assembly is coupled to the structural member and electrically coupled to the cable assembly.
  • the inertial measurement unit (IMU) is coupled to the structural member and electrically coupled to the cable assembly.
  • the one or more lenses are optically coupled to apertures of the one or more cameras.
  • Implementations of the camera and sensor tool may include one or more of the following features.
  • the structural member may be a length of flat wire or rigid wire.
  • the sensor pack and the cable assembly may be encased within a sheath.
  • the present disclosure provides a method of registering stereo images of at least one body lumen to a three-dimensional (3D) model.
  • the method includes illuminating, by a camera and sensor tool disposed within an endoscopic catheter, a feature of the at least one body lumen; capturing, by one or more cameras of the camera and sensor tool, stereoscopic images of the feature of the at least one body lumen; and matching points between the stereoscopic images, yielding matched points.
  • the method also includes estimating depth information based on the matched points, converting the depth information to a point cloud volume based on intrinsic parameters of the one or more cameras, and registering the point cloud volume to the 3D model of the at least one body lumen.
  • the method may include generating the 3D model based on preoperative radiographic images of the at least one body lumen.
  • the at least one body lumen forms at least a portion of a bronchial tree.
  • the method may include rectifying the stereoscopic images before matching points between the stereoscopic images.
  • Rectifying the stereoscopic images may include applying an image rectification algorithm to the stereoscopic images.
  • the image rectification algorithm may be at least one of an epipolar rectification algorithm, Hartley’s rectification algorithm, a polar rectification algorithm, a recursive rectification algorithm, a 3D rotation rectification algorithm, a non-parametric rectification algorithm, a shear-based rectification algorithm, or an automatic rectification algorithm.
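  • For illustration only (not part of the claimed method): a minimal Python sketch of the depth-to-point-cloud conversion and registration steps described above, assuming pinhole camera intrinsics (fx, fy, cx, cy) and using Open3D's point-to-point ICP for the rigid registration. All function names and parameter values here are hypothetical.

```python
# Minimal sketch (assumptions noted above): back-project a depth map to a
# point cloud using pinhole intrinsics, then rigidly register it to
# preoperative 3D model surface points with ICP.
import numpy as np
import open3d as o3d

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Convert an (H, W) depth map in mm to an (N, 3) point array."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth.astype(np.float64)
    x = (u - cx) * z / fx          # pinhole model: X = (u - cx) * Z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[z.reshape(-1) > 0]  # drop pixels with no valid depth

def register_to_model(points, model_points, voxel=1.0):
    """Align the camera-frame point cloud to the model via ICP."""
    src = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(points))
    tgt = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(model_points))
    src = src.voxel_down_sample(voxel)
    tgt = tgt.voxel_down_sample(voxel)
    result = o3d.pipelines.registration.registration_icp(
        src, tgt, max_correspondence_distance=5.0,
        estimation_method=o3d.pipelines.registration
        .TransformationEstimationPointToPoint())
    return result.transformation   # 4x4 rigid transform: camera -> model frame
```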
  • Implementations of the system may include one or more of the following features.
  • the structural member may be a length of flat wire or rigid wire.
  • the sensor pack and the cable assembly may be encased within a sheath.
  • the sensor pack may include an electromagnetic (EM) sensor assembly coupled to the structural member and electrically coupled to the cable assembly.
  • the sensor pack may include an inertial measurement unit (IMU) coupled to the structural member and electrically coupled to the cable assembly.
  • EM: electromagnetic
  • IMU: inertial measurement unit
  • FIG. 1 is a block diagram that illustrates a system for acquiring and processing 3D CBCT scans of a patient and a location board;
  • FIG. 2 is a circuit block diagram that illustrates the workstation of the system of FIG. 1;
  • FIG. 3 is a block diagram that illustrates a camera and sensor tool for use in a catheter or working channel;
  • FIG. 4 is a side view of a camera and sensor tool;
  • FIG. 5 is a side view of another example of a camera and sensor tool;
  • FIG. 6 is a perspective view of a distal end portion of the other example of the camera and sensor tool of FIG. 5;
  • FIG. 7 is a top view of the distal end portion and length portions of the other example of the camera and sensor tool of FIG. 5;
  • FIGS. 8A and 8B are block diagrams that illustrate examples of dual camera configurations;
  • FIG. 9 is a block diagram that illustrates an example of a camera module including staggered cameras and stereoscopic lenses;
  • FIG. 10 shows block diagrams that illustrate examples of camera modules including a single camera and stereoscopic lenses;
  • FIG. 11 shows block diagrams that illustrate an example of a camera module including a single camera and an adaptive lens; and
  • FIG. 12 is a block diagram that illustrates articulation of a camera tool to increase the field of view of the camera.
  • stereoscopic images may be obtained with two cameras offset by a known distance and positioned to capture overlapping fields of view, with one camera and a specially designed lens setup, or with one camera that can mechanically change its view position.
  • various camera module 321 configurations, such as those illustrated in FIGS. 8A-12, may be used to obtain stereoscopic views.
  • the system 100 may be further configured to construct radiographic-based volumetric data of a target area from intraprocedural 2D radiographic images, e.g., intraprocedural fluoroscopic and/or CBCT images, to confirm navigation of a navigation catheter 102, e.g., an extended working channel (EWC) or a smart extended working channel (sEWC), to a desired location near the target area, where a tool, e.g., a camera and sensor tool 101e, may be placed through and extending out of the navigation catheter 102.
  • the imaging system 124 of the system 100 may include one or more of a C-arm fluoroscope, a 3D cone-beam computed tomography (CBCT) imaging system, and a 3D fluoroscopic imaging system.
  • CBCT: 3D cone-beam computed tomography
  • the bronchoscope adapter 109 is configured either to allow motion of the navigation catheter 102 through the working channel of the bronchoscope 108 (which may be referred to as an unlocked state of the bronchoscope adapter 109) or to prevent motion of the navigation catheter 102 through the working channel of the bronchoscope (which may be referred to as a locked state of the bronchoscope adapter 109).
  • the camera and sensor tool 101e may be inserted into the working channel of the bronchoscope 108 in place of the navigation catheter 102.
  • the procedures may involve one or more of a locatable guide 101a, a microwave ablation tool 101b, a biopsy needle 101c, a forceps 101d, or a camera and sensor tool 101e.
  • the locatable guide (LG) 101a which may be a catheter, and which may include a sensor 104a similar to the sensor 104b, is inserted into the navigation catheter 102 and locked into position such that the sensor 104a extends a predetermined distance beyond the distal end portion of the navigation catheter 102.
  • the camera and sensor tool 101e of the disclosure includes a sensor pack 113, which may incorporate miniaturized LEDs, and a catheter assembly 111, which may include one or more camera wires 311, one or more EM sensor wires 312, one or more IMU sensor wires 314, one or more pull wires, and/or one or more illumination optical fibers 316 (in aspects that do not incorporate miniaturized LEDs).
  • the tools 101a-101e may include a fixing member 103a-103e such that when the fixing member 103a-103e of the tools 101a-101e engages, e.g., snaps in, with the proximal end portion of the handle 106 of the catheter guide assembly 110, the tools 101a-101e extend a predetermined distance 107 beyond a distal tip or end portion of the navigation catheter 102.
  • the predetermined distance 107 may be based on the length of the navigation catheter 102 and a length between the end portion of the handle 105a-105e or the fixing member 103a-103e and the distal end portion of the LG 101a or the other medical tools 101b-101e.
  • the handles 105a-105e may include control objects, e.g., a button or a lever, for controlling operation of the medical tools 101a-101e.
  • the position of the fixing member 103a-103e along the length of the medical tools 101a-101e may be adjustable so that the user can adjust the distance by which the distal end portion of the LG 101a or the medical tools 101b-101e extends beyond the distal end portion of the navigation catheter 102.
  • the position and orientation of the LG sensor 104a or the sensor pack 113 of the camera and sensor tool 101e relative to a reference coordinate system within an electromagnetic field can be derived using an application executed by the computer system 122.
  • the navigation catheter 102 may function as the LG 101a, in which case the LG 101a may not be used. In other aspects, the navigation catheter 102 and the LG 101a may be used together.
  • Catheter guide assemblies 110 are currently marketed and sold by Medtronic PLC under the brand names SUPERDIMENSION® Procedure Kits or EDGE™ Procedure Kits, and are contemplated as useable with the disclosure.
  • the location board 120 may also include fiducials, which may be embedded or otherwise incorporated into the location board 120, and which are designed and/or arranged to appear in radiographic images for the purpose of creating a 3D reconstruction from the radiographic images. Since the fiducials may be radiographically dense, the fiducials create artifacts on the radiographic images, e.g., intraoperative CBCT images.
  • the system 100 further includes a computer system 122 on which software and/or hardware are used to facilitate identification of a target, planning a pathway to the target, navigating a medical tool to the target, and/or confirmation and/or determination of placement of the navigation catheter 102, or a suitable tool therethrough, relative to the target.
  • the location board 120 is positioned beneath patient P.
  • the location board 120 generates an electromagnetic field around at least a portion of the patient P within which the position of the LG sensor 104a, the navigation catheter sensor 104b, the PST 118, and the distal portion of the camera and sensor tool 101e can be determined through use of a tracking module 116.
  • An additional electromagnetic sensor 126 may also be incorporated into the end of the navigation catheter 102.
  • the additional electromagnetic sensor 126 may be a five degree-of-freedom sensor or a six degree-of-freedom sensor.
  • One or more of the reference sensors of the PST 118 are attached to the chest of the patient P.
  • the instant disclosure is not so limited and may be used in conjunction with flexible sensors, shape sensors such as fiber Bragg gratings, ultrasonic sensors, or any other suitable sensor that does not emit harmful radiation. Additionally, the methods described herein may be used in conjunction with robotic systems such that robotic actuators drive the navigation catheter 102 or bronchoscope 108 proximate the target.
  • tools such as a locatable guide 101a, a therapy tool (e.g., a microwave ablation tool 101b or a forceps 101d), or a biopsy tool (e.g., a biopsy needle 101c) may be inserted into and fixed in place relative to the navigation catheter 102 to place one of the tools 101b-101e proximate the target or a desired location using position information from the navigation catheter 102.
  • the position information from the sensors 104b and/or 126 of the navigation catheter 102 may be used to calculate the position of the distal tip or distal end portion of any of the tools 101b-101d.
  • the tools 101a-101e may be designed to extend a predetermined distance from the distal end of the navigation catheter 102, and at least the distal portions of the tools 101a-101e that extend from the navigation catheter 102 are designed to be rigid or substantially rigid.
  • the predetermined distance may be different depending on one or more of the design of the tools 101a-101e, the stiffnesses of the tools 101a-101e, or how each of the tools 101a-101e interacts with different types of tissue.
  • the tools 101a-101e may be designed or characterized to set the predetermined distance to ensure deflection is managed (e.g., minimized) so that the virtual tools and environment displayed to a clinician are an accurate representation of the actual clinical tools and environment.
  • Calculating the position of the distal end portion of any of the tools 101a-101e may include distally projecting the position information from the sensors 104b and/or 126 according to tool information.
  • the tool information may include one or more of the shape of the tool, the type of tool, the stiffness of the tool, the type or characteristics of the tissue to be treated by the tool, or the dimensions of the tool.
  • the computer system 122 utilizes previously acquired CT image data for generating and viewing a 3D model or rendering of patient P’s airways, enables the identification of a target (automatically, semi-automatically, or manually), and allows for determining a pathway through patient P’s airways to tissue located at and around the target. More specifically, CT images acquired from CT scans are processed and assembled into a 3D CT volume, which is then utilized to generate a 3D model of patient P’s airways. The 3D model may be displayed on a display associated with the computer system 122, or in any other suitable fashion.
  • the computer system 122 may be used to display various views of the 3D model or enhanced two-dimensional images generated from the 3D model.
  • the enhanced two-dimensional images may possess some 3D capabilities because they are generated from 3D data.
  • the 3D model may be manipulated to facilitate identification of a target on the 3D model or two-dimensional images, and selection of a suitable pathway through patient P’s airways to access tissue located at the target can be made. Once selected, the pathway plan, the 3D model, and the images derived therefrom can be saved and exported to a navigation system for use during the navigation phase(s).
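  • As an illustrative aside only (assuming the CT volume is available as a NumPy array in Hounsfield units with known voxel spacing), a minimal sketch of extracting an airway surface model with a fixed air threshold and marching cubes; production planning software uses far more robust airway segmentation than this single threshold.

```python
# Minimal sketch (illustrative only): extract an airway surface mesh from a
# CT volume by thresholding air-filled voxels and running marching cubes.
import numpy as np
from skimage import measure

def airway_surface(ct_volume, spacing, hu_threshold=-950.0):
    """ct_volume: (Z, Y, X) array in Hounsfield units; spacing: voxel size in mm."""
    air_mask = (ct_volume < hu_threshold).astype(np.float32)  # air-filled lumens
    verts, faces, normals, _ = measure.marching_cubes(
        air_mask, level=0.5, spacing=spacing)                 # mesh vertices in mm
    return verts, faces
```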
  • the ILLUMISITE software suite currently sold by Medtronic PLC includes one such planning software.
  • FIG. 2 is a schematic diagram of the computer system 122 of FIG. 1 configured for implementing the methods of the disclosure, including the method of FIG. 13.
  • the computer system 122 may include a workstation. In some aspects, the computer system 122 may be coupled with the imaging system, directly or indirectly, e.g., by wireless communication.
  • the computer system 122 may include a memory 202, a processor 204, a display 206 and an input device 210.
  • the processor 204 may include one or more hardware processors.
  • the computer system 122 may optionally include an output module 212 and a network interface 208.
  • the memory 202 may store an application 218 and sensor pack data 214 including data from the one or more EM sensors 104, 126 disposed at the distal portion of the navigation catheter 102, the EM sensors of the PST 118, and data from the sensor pack 113 of the camera and sensor tool 101e.
  • the application 218 may include instructions executable by the processor 204 for executing the methods of the disclosure including the method of FIG. 13.
  • the application 218 may further include a user interface 216.
  • the image data may include preoperative CT image data, intraoperative 3D fluoroscopic image data, preoperative or intraoperative CBCT image data, and/or 3D reconstruction data.
  • the processor 204 may be coupled with the memory 202, the display 206, the input device 210, the output module 212, the network interface 208, and the imaging system.
  • the computer system 122 may be a stationary computer system, such as a personal computer, or a portable computer system such as a tablet computer.
  • the computer system 122 may include multiple computers.
  • the memory 202 may include any non-transitory computer-readable storage media for storing data and/or software including instructions that are executable by the processor 204 and which control the operation of the computer system 122, process data from one or more EM sensors 104, 126 disposed in or on the navigation catheter 102, e.g., at a distal end portion of the navigation catheter 102, to track the position of the navigation catheter 102 and calculate or project the position of a distal end portion of a medical tool at a fixed position within the navigation catheter 102, and, in some aspects, may also control the operation of the imaging system.
  • the imaging system may be used to capture a series of preoperative CT images of a portion of a patient’s body, e.g., the lungs, as the portion of the patient’s body moves, e.g., as the lungs move during a respiratory cycle.
  • the imaging system may include a CBCT imaging system or a 3D fluoroscopic imaging system that captures a series of images from which a 3D reconstruction is generated and/or captures a live 2D view to confirm placement of the navigation catheter 102 and/or the medical tool, e.g., the camera and sensor tool 101e.
  • the memory 202 may include one or more storage devices such as solid-state storage devices, e.g., flash memory chips.
  • the memory 202 may include one or more mass storage devices connected to the processor 204 through a mass storage controller (not shown) and a communications bus (not shown).
  • computer-readable media can be any available media that can be accessed by the processor 204. That is, computer readable storage media may include non-transitory, volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data.
  • computer-readable storage media may include RAM, ROM, EPROM, EEPROM, flash memory or other solid-state memory technology, CD-ROM, DVD, Blu-Ray or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which may be used to store the desired information, and which may be accessed by computer system 122.
  • the application 218 may, when executed by the processor 204, cause the display 206 to present the user interface 216.
  • the user interface 216 may be configured to present to the user a single screen including a three-dimensional (3D) view of a 3D model of a target from the perspective of a tip of a medical tool, a live two-dimensional (2D) view showing the medical tool, and a target mark, which corresponds to the 3D model of the target, overlaid on the live 2D view.
  • the user interface 216 may be further configured to display the target mark in different colors depending on whether the medical tool tip is aligned with the target in three dimensions.
  • the network interface 208 may be configured to connect to a network such as a local area network (LAN) consisting of a wired network and/or a wireless network, a wide area network (WAN), a wireless mobile network, a Bluetooth network, and/or the Internet.
  • the network interface 208 may be used to connect the computer system 122 and the imaging system 124.
  • the network interface 208 may be also used to receive the sensor pack data 214.
  • the input device 210 may be any device by which a user may interact with the computer system 122, such as, for example, a mouse, keyboard, foot pedal, touch screen, and/or voice interface.
  • the output module 212 may include any connectivity port or bus, such as, for example, parallel ports, serial ports, universal serial busses (USB), or any other similar connectivity port known to those skilled in the art.
  • the distal sensor pack 113 may include a camera module 321 (which may include, for example, a miniature camera with a CMOS imaging sensor) and an electromagnetic (EM) sensor 323.
  • the EM sensor 323 may include multiple sensor coils, e.g., 3 to 6 sensor coils.
  • the camera module 321 and the EM sensor 323 may be coupled to a structural wire 324 or other structure and/or material suitable for providing mechanical structure and/or stiffness.
  • the structural wire 324 may be a rigid or semi-rigid wire, such as a metal flat wire.
  • the camera module 321 and the EM sensor 323 may be coupled to the structural wire 324 via an adhesive 322 or any other suitable method for attaching the camera module 321 and the EM sensor 323 to the structural wire 324.
  • the distal sensor pack 113 may also include a flexible printed circuit board (PCB) 327, which may be mechanically coupled to the structural wire, and which may be electrically coupled to the camera module 321, which may include one or more miniature cameras, and the EM sensor 323.
  • An inertial measurement unit (IMU) sensor 325, e.g., an MC3672 accelerometer, may be electrically coupled to the flexible PCB 327.
  • FIG. 4 is a side view of a camera and sensor tool 101e.
  • the camera and sensor tool 101e includes a sensor pack 113 at a distal end portion of the camera and sensor tool 101e.
  • the sensor pack 113 includes, in order proximally from a distal end portion, one or more cameras, an electromagnetic (EM) sensor assembly, and an inertial measurement unit (IMU) sensor coupled to a length of a structural wire 324.
  • the structural wire 324 provides structure to the sensor pack 113 and may be bent at an angle or into a shape suitable for navigating a body lumen.
  • the camera module 321, including one or more cameras, the EM sensor 323, and the IMU sensor 325 are electrically coupled to a cable assembly 329, which extends to the proximal end portion of the camera and sensor tool 101e.
  • the sensor pack 113 and the cable assembly 329 may both be encased within a single sheath or separate sheaths 310a, 310b.
  • the sensor pack 113 may be encased within a sheath 310a made of a different material from the sheath 310b encasing the cable assembly 329.
  • the camera wire electrically coupling the camera module 321 to the cable assembly 329 may be disposed on or adjacent to the EM sensor 323 and the IMU sensor 325.
  • FIG. 5 is a side view of another example of a camera and sensor tool 101e.
  • the other example of the camera and sensor tool 101e includes a sensor pack 113 at a distal end portion of the camera and sensor tool 101e.
  • the sensor pack 113 includes, in order proximally from a distal end portion, a camera module 321, which may include one or more cameras, an electromagnetic (EM) sensor 323, and an inertial measurement unit (IMU) sensor 325 coupled to a length of structural wire 324.
  • the structural wire 324 provides structure to the sensor pack 113 and may be bent at an angle or into a shape suitable for navigating a body lumen.
  • the camera module 321, the EM sensor 323, and the IMU sensor 325 are electrically coupled to a cable assembly 329, e.g., a multi-conductor cable, which extends to the proximal end portion of the camera and sensor tool 101e.
  • the sensor pack 113 and the cable assembly 329 may both be encased within a single sheath or separate sheaths 310a, 310b.
  • the sensor pack 113 may be encased within a sheath 310a made of a different material from the sheath 310b encasing the cable assembly 329.
  • the camera module wires electrically coupling the camera module 321 to the cable assembly 329 may be disposed on or adjacent to the EM sensor 323 and the IMU sensor 325.
  • the camera wire 311 may be disposed opposite from or nearly opposite from the structural wire 324, e.g., a rigid or semi-rigid flat wire.
  • the other example of the camera and sensor tool 101e also includes one or more optical fibers 316, which may be optically coupled to illumination sources of a console to which the camera and sensor tool 101e connects.
  • FIG. 7 is a top view of the distal end portion and length portions of the other example of the camera and sensor tool 101e of FIG. 5.
  • the sheath 310a encasing the sensor pack 113 and the sheath 310b encasing the length of the cable assembly 329 may be or may approximately be the same diameter.
  • the cable assembly 329 is more flexible than the sensor pack 113.
  • the portion of the sensor pack 113 including the camera module 321, which may include multiple cameras, may be bent at an angle with respect to the remaining portion of the sensor pack 113.
  • FIG. 9 is a block diagram that illustrates an example of the camera module 321 including staggered cameras 821, 822 and stereoscopic lenses 901-903.
  • a pair of lenses 901, 902 are arranged at a distal end portion of the sensor pack 113, an aperture of a first camera 821 is arranged adjacent to a first lens 901 of the pair of lenses, and a second camera 822 is arranged near a proximal end portion of the first camera 821 and axially offset from the first camera 821.
  • the first camera 821 and the second camera 822 are arranged in a staggered configuration.
  • a third lens 903 may be coupled to the aperture of the second camera 822.
  • a second lens 902 of the pair of lenses focuses light passing through the second lens 902 onto the third lens 903, which transmits light to the aperture of the second camera 822.
  • an optical fiber may be coupled between the second lens 902 and the third lens 903, and may be configured to transmit light (e.g., light reflected from tissue) between the second lens 902 and the third lens 903.
  • Hartley’s rectification algorithm
  • Hartley’s rectification algorithm involves resampling the two stereoscopic images using a pair of 3x3 rectifying transformations to make the epipolar lines aligned and horizontal.
  • Hartley’s rectification algorithm ensures that rectified images are oriented such that they share the same row coordinates for matching points.
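  • For illustration, a minimal sketch of Hartley-style uncalibrated rectification using OpenCV. The matched point arrays pts_l and pts_r are assumed to come from an earlier matching step; nothing in this sketch is specific to this disclosure.

```python
# Minimal sketch (assumptions noted above): uncalibrated (Hartley-style)
# rectification. pts_l and pts_r are matched (N, 2) float32 point arrays.
import cv2

def hartley_rectify(img_l, img_r, pts_l, pts_r):
    # Estimate the fundamental matrix from the point matches.
    F, inliers = cv2.findFundamentalMat(pts_l, pts_r, cv2.FM_RANSAC)
    h, w = img_l.shape[:2]
    # Two 3x3 homographies that map epipolar lines to shared horizontal rows.
    ok, H1, H2 = cv2.stereoRectifyUncalibrated(pts_l, pts_r, F, (w, h))
    rect_l = cv2.warpPerspective(img_l, H1, (w, h))
    rect_r = cv2.warpPerspective(img_r, H2, (w, h))
    return rect_l, rect_r
```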
  • Another example of a stereoscopic image rectification algorithm is the polar rectification algorithm. Instead of making the epipolar lines horizontal, the polar rectification algorithm transforms the stereoscopic images into polar coordinates.
  • the polar rectification algorithm may be advantageous in cases where the camera’s epipoles are inside the image area.
  • Another example of a stereoscopic image rectification algorithm is a recursive rectification algorithm.
  • the recursive rectification algorithm recursively applies rectification, thereby refining the results iteratively.
  • the recursive rectification algorithm may provide more accurate rectification.
  • Another example of a stereoscopic image rectification algorithm is a 3D rotation rectification algorithm, which is based on 3D rotation.
  • the 3D rotation rectification algorithm involves rotating two stereoscopic image planes to bring them into a common plane, thereby simplifying the stereo correspondence problem.
  • the 3D rotation rectification algorithm may be used when accurate external calibration information (e.g., information regarding rotation and translation between cameras) is available.
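  • For illustration, a minimal sketch of this calibrated, rotation-based rectification using OpenCV's stereoRectify, assuming the intrinsic matrices K1, K2, distortion vectors d1, d2, and the inter-camera rotation R and translation T are known from calibration; the parameter names are placeholders.

```python
# Minimal sketch (assumptions noted above): calibrated rectification when the
# rotation R and translation T between the two cameras are known.
import cv2

def calibrated_rectify(K1, d1, K2, d2, R, T, image_size):
    # stereoRectify rotates both image planes into a common plane (R1, R2) and
    # returns projection matrices P1, P2 plus the disparity-to-depth matrix Q.
    R1, R2, P1, P2, Q, roi1, roi2 = cv2.stereoRectify(
        K1, d1, K2, d2, image_size, R, T)
    map1x, map1y = cv2.initUndistortRectifyMap(
        K1, d1, R1, P1, image_size, cv2.CV_32FC1)
    map2x, map2y = cv2.initUndistortRectifyMap(
        K2, d2, R2, P2, image_size, cv2.CV_32FC1)
    # Apply with cv2.remap(img, map1x, map1y, cv2.INTER_LINEAR) per image.
    return (map1x, map1y), (map2x, map2y), Q
```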
  • Another example of a stereoscopic image rectification algorithm is a non-parametric rectification algorithm. Instead of relying on camera calibration and parametric models, the non-parametric rectification algorithm uses image features and content directly to compute rectification transformations. The non-parametric rectification algorithm may be more flexible and adaptive.
  • Another example of a stereoscopic image rectification algorithm is a shear-based rectification algorithm.
  • the shear-based rectification algorithm may apply shearing transformations to the stereoscopic images to achieve rectification.
  • the shear-based rectification algorithm may be viewed as a simplified version of the standard epipolar rectification algorithm.
  • the shear-based rectification may be computationally efficient and may be used in real-time.
  • FIG. 13 is a flowchart of an example of a method 1300 of generating a point cloud volume from stereoscopic images of at least one body lumen acquired by a camera and sensor tool 101e and registering the point cloud volume to a three-dimensional (3D) model.
  • the method 1300 may include applying a machine learning-based stereoscopic image rectification algorithm to the stereoscopic images acquired by the camera and sensor tool 101e of the disclosure.
  • the method 1300 includes illuminating, by a camera and sensor tool 101e disposed within an endoscopic catheter, a feature of the at least one body lumen at block 1302.
  • the camera and sensor tool 101e may be disposed within an extended working channel of the endoscopic catheter.
  • the method 1300 also includes capturing, by one or more cameras of the camera and sensor tool 101e, stereoscopic images of the feature of the at least one body lumen at block 1304.
  • the camera and sensor tool 101e may capture the stereoscopic images in any of a number of ways.
  • For example, the camera may capture images while being moved, e.g., rotated; two cameras may be positioned in a compact way allowing them to fit into a very thin endoluminal medical device; and/or the one or more cameras may be integrated into an endoluminal medical device in a way suitable for an endoluminal procedure.
  • the method 1300 includes matching points between the stereoscopic images.
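  • For illustration only, one classical way to implement this matching step (ORB features with a Lowe ratio test); the disclosure does not mandate any particular detector or matcher, and the parameter values below are illustrative.

```python
# Minimal sketch (illustrative only): match points between the left and right
# stereoscopic images using ORB features and a ratio test.
import cv2
import numpy as np

def match_points(img_l, img_r, n_features=1000):
    orb = cv2.ORB_create(nfeatures=n_features)
    kp_l, des_l = orb.detectAndCompute(img_l, None)
    kp_r, des_r = orb.detectAndCompute(img_r, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    matches = matcher.knnMatch(des_l, des_r, k=2)
    # Keep matches whose best neighbor is clearly better than the second best.
    good = [p[0] for p in matches
            if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
    pts_l = np.float32([kp_l[m.queryIdx].pt for m in good])
    pts_r = np.float32([kp_r[m.trainIdx].pt for m in good])
    return pts_l, pts_r
```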
  • Neural networks may be used to estimate depth from the stereoscopic images using stereo disparity estimation.
  • In stereo vision, two images of the same scene are captured, e.g., by two cameras or by one camera and two lenses, from slightly different viewpoints.
  • the two captured images, which may be referred to as left and right images, are used to compute a disparity map.
  • the disparity map encodes the pixel-wise differences or disparities between corresponding points in the left and right images.
  • the disparities are inversely proportional to depth: larger disparities correspond to objects closer to the camera, while smaller disparities correspond to objects farther away from the camera.
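  • For illustration, a minimal sketch that computes a dense disparity map with semi-global block matching and converts it to depth via the standard relation Z = f * B / d (focal length f in pixels, baseline B, disparity d in pixels); the matcher parameters are illustrative, not part of this disclosure.

```python
# Minimal sketch (illustrative only): dense disparity from rectified grayscale
# images, then depth from Z = f * B / d. Note larger d gives smaller Z (closer).
import cv2
import numpy as np

def disparity_to_depth(rect_l, rect_r, focal_px, baseline_mm):
    sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=5)
    disp = sgbm.compute(rect_l, rect_r).astype(np.float32) / 16.0  # fixed-point
    depth = np.zeros_like(disp)
    valid = disp > 0
    depth[valid] = focal_px * baseline_mm / disp[valid]  # depth in mm
    return depth
```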
  • the described techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit.
  • Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Artificial Intelligence (AREA)
  • Endoscopes (AREA)

Abstract

Endoscopic systems and methods employ a multi-view camera tool within the body, including within the airways of a lung, to capture images, and use a combination of positional information from EM and/or IMU sensors of the multi-view camera tool to estimate image depth and generate a three-dimensional (3D) point cloud volume of the patient's anatomy. The 3D point cloud volume is generated from a known observation point using stereoscopic images captured by the camera tool. This 3D structure generation may use a stereo image rectification algorithm. A machine learning algorithm may also be applied in place of, or in combination with, an image rectification algorithm to improve computational efficiency.
PCT/IB2024/059544 2023-10-01 2024-09-30 Stereoscopic endoscope camera tool depth estimation and point cloud generation for patient anatomy positional registration during lung navigation Pending WO2025074225A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202363541898P 2023-10-01 2023-10-01
US63/541,898 2023-10-01
US18/896,859 2024-09-25
US18/896,859 US20250107701A1 (en) 2023-10-01 2024-09-25 Stereoscopic endoscope camera tool depth estimation and point cloud generation for patient anatomy positional registration during lung navigation

Publications (1)

Publication Number Publication Date
WO2025074225A1 (fr) 2025-04-10

Family

ID=93150371

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2024/059544 Pending WO2025074225A1 (fr) Stereoscopic endoscope camera tool depth estimation and point cloud generation for patient anatomy positional registration during lung navigation

Country Status (1)

Country Link
WO (1) WO2025074225A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180368920A1 (en) * 2017-06-23 2018-12-27 Auris Health, Inc. Robotic systems for determining a roll of a medical device in luminal networks
US20230075251A1 (en) * 2020-06-02 2023-03-09 Noah Medical Corporation Systems and methods for a triple imaging hybrid probe
WO2022098665A1 (fr) * 2020-11-05 2022-05-12 Covidien Lp Position synthétique dans l'espace d'un instrument endoluminal
WO2023037367A1 (fr) * 2021-09-09 2023-03-16 Magnisity Ltd. Dispositif endoluminal autodirecteur utilisant une carte luminale déformable dynamique

Similar Documents

Publication Publication Date Title
US11798178B2 (en) Fluoroscopic pose estimation
US20220346886A1 (en) Systems and methods of pose estimation and calibration of perspective imaging system in image guided surgery
CN103209656B Visualization of registered subsurface anatomy
JP7433932B2 Quantitative three-dimensional imaging and printing of surgical implants
JP6976266B2 Method and system for using multi-view pose estimation
US20250177056A1 (en) Three-dimensional reconstruction of an instrument and procedure site
KR101572487B1 System and method for non-invasive registration between a patient and a three-dimensional medical image
US11896441B2 (en) Systems and methods for measuring a distance using a stereoscopic endoscope
CN113164149A Method and system for multi-view pose estimation using digital computed tomography
CA3088277A1 System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
US20230363830A1 (en) Auto-navigating digital surgical microscope
US20230030343A1 (en) Methods and systems for using multi view pose estimation
KR20160057024A Markerless three-dimensional object tracking apparatus and method therefor
US20250107701A1 (en) Stereoscopic endoscope camera tool depth estimation and point cloud generation for patient anatomy positional registration during lung navigation
WO2025074225A1 Stereoscopic endoscope camera tool depth estimation and point cloud generation for patient anatomy positional registration during lung navigation
Lin et al. Dense surface reconstruction with shadows in mis
Wang et al. Towards video guidance for ultrasound, using a prior high-resolution 3D surface map of the external anatomy
US20250040995A1 Updating ENB to CT registration using intra-op camera
WO2025032436A1 Updating an electromagnetic navigation bronchoscopy to computed tomography registration using an intraoperative camera
Wang Navigating in Patient Space Using Camera Pose Estimation Relative to the External Anatomy

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24791094

Country of ref document: EP

Kind code of ref document: A1