WO2025171214A2 - Directional filter and/or translation constraints for coordinate registration - Google Patents
- Publication number
- WO2025171214A2
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- coordinate system
- curve segment
- candidate
- computing
- imaging
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/30—Determination of transform parameters for the alignment of images, i.e. image registration
- G06T7/33—Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B2017/00681—Aspects not otherwise provided for
- A61B2017/00694—Aspects not otherwise provided for with means correcting for movement of or for synchronisation with the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B2017/00681—Aspects not otherwise provided for
- A61B2017/00725—Calibration or performance testing
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B17/00—Surgical instruments, devices or methods
- A61B2017/00743—Type of operation; Specification of treatment sites
- A61B2017/00809—Lung operations
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2061—Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2065—Tracking using image or pattern recognition
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/376—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
- A61B2090/3762—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30004—Biomedical image processing
- G06T2207/30021—Catheter; Guide wire
Definitions
- a combination of one or more localization sensors disposed at a flexible elongate device and intraoperative imaging can greatly aid in planning and navigating a minimally invasive procedure.
- combining sensor data with intraoperative images can enable accurate determination of a position, orientation, and/or pose of the flexible elongate device within the patient anatomy.
- accurately registering sensor data with imaging data in space and time remains a challenge using current techniques.
- the instructions further cause the one or more processors to obtain a position of the tip in the imaging coordinate system and compute a translational transformation between the imaging coordinate system and the sensing coordinate system based on the position of the tip in the sensing coordinate system and the position of the tip in the imaging coordinate system.
- the instructions further cause the one or more processors to compute, based at least in part on the obtained three-dimensional representation, one or more curve segments indicative of the shape of the flexible elongate device in the imaging coordinate system and compute a rotational transformation between the imaging coordinate system and the sensing coordinate system based at least in part on the shape data and the one or more curve segments indicative of the shape of the flexible elongate device in the imaging coordinate system.
- the instructions cause the one or more processors to register the imaging coordinate system with the sensing coordinate system based on the translational transformation and the rotational transformation.
- a system comprises one or more processors and a tangible, non-transitory, computer readable medium storing instructions that, when executed by the one or more processors, cause the one or more processors to obtain, based on image data received from an imaging system, a three-dimensional representation of an anatomical structure and a portion of a flexible elongate device disposed in the anatomical structure, the portion including a tip, wherein the three-dimensional representation is in an imaging coordinate system associated with the imaging system.
- the instructions further cause the one or more processors to receive, from a sensing unit, shape data indicative of a shape of the flexible elongate device including a position of the tip, wherein the shape data is in a sensing coordinate system associated with the sensing unit.
- the instructions further cause the one or more processors to obtain a position of the tip in the imaging coordinate system and compute a translational transformation between the imaging coordinate system and the sensing coordinate system based on the position of the tip in the sensing coordinate system and the position of the tip in the imaging coordinate system.
- the instructions further cause the one or more processors to compute, based at least in part on the obtained three-dimensional representation, one or more curve segments indicative of the shape of the flexible elongate device in the imaging coordinate system and compute a rotational transformation between the imaging coordinate system and the sensing coordinate system based at least in part on the shape data and the one or more curve segments indicative of the shape of the flexible elongate device in the imaging coordinate system.
- the instructions cause the one or more processors to register the imaging coordinate system with the sensing coordinate system based on the translational transformation and the rotational transformation.
- a method comprises obtaining, by one or more processors and based on image data received from an imaging system, a three-dimensional representation of an anatomical structure and a portion of a flexible elongate device disposed in the anatomical structure, the portion including a tip, wherein the three-dimensional representation is in an imaging coordinate system associated with the imaging system.
- the method further comprises receiving, from a sensing unit, shape data indicative of a shape of the flexible elongate device including a position of the tip, wherein the shape data is in a sensing coordinate system associated with the sensing unit.
- the method further comprises obtaining a position of the tip in the imaging coordinate system and computing a translational transformation between the imaging coordinate system and the sensing coordinate system based on the position of the tip in the sensing coordinate system and the position of the tip in the imaging coordinate system.
- the method further comprises computing, based at least in part on the obtained three-dimensional representation, one or more curve segments indicative of the shape of the flexible elongate device in the imaging coordinate system and computing a rotational transformation between the imaging coordinate system and the sensing coordinate system based at least in part on the shape data and the one or more curve segments indicative of the shape of the flexible elongate device in the imaging coordinate system.
- the method comprises registering, by the one or more processors, the imaging coordinate system with the sensing coordinate system based on the translational transformation and the rotational transformation.
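The two-part registration summarized above (first a translation computed from matched tip positions, then a separate rotational fit) can be illustrated with a minimal Python sketch. The patent itself specifies no implementation; every function name here is hypothetical, and bare 3-tuples stand in for real shape data:

```python
def tip_translation(tip_imaging, tip_sensing):
    # Translation that carries the sensed tip position onto the tip
    # position extracted from the imaging data (both 3-vectors).
    return tuple(i - s for i, s in zip(tip_imaging, tip_sensing))

def apply_translation(points, t):
    # Shift every shape-data point into the imaging coordinate system.
    # Rotation about the now-matched tip is handled in a separate step.
    return [tuple(c + d for c, d in zip(p, t)) for p in points]

# After translation the two tip positions coincide exactly.
tip_img, tip_sns = (10.0, -2.0, 5.0), (1.0, 1.0, 1.0)
t = tip_translation(tip_img, tip_sns)          # t == (9.0, -3.0, 4.0)
shape = apply_translation([tip_sns, (0.5, 1.0, 1.2)], t)
```

Pinning the tip by a pure translation before any rotational fit is the mechanism the disclosure credits with reducing sliding error, since the subsequent rotation cannot slide the tip away from its imaged position.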
- a tangible, non-transitory, computer readable medium stores instructions that, when executed by one or more processors, cause the one or more processors to receive, from a sensing unit, shape data indicative of a shape of the flexible elongate device, wherein the shape data is in a sensing coordinate system associated with the sensing unit.
- the instructions further cause the one or more processors to obtain, based on image data received from an imaging system, a three-dimensional representation of an anatomical structure and a portion of a flexible elongate device disposed in the anatomical structure, wherein the three-dimensional representation is in an imaging coordinate system associated with the imaging system.
- the instructions further cause the one or more processors to compute, based at least in part on the obtained three-dimensional representation, one or more curve segments indicative of the shape of the flexible elongate device in the imaging coordinate system and compute a transformation between the imaging coordinate system and the sensing coordinate system based at least in part on the shape data and the one or more curve segments indicative of the shape of the flexible elongate device in the imaging coordinate system.
- Computing the one or more curve segments includes: computing a first curve segment and a candidate second curve segment, computing a first direction vector representative of direction of the first curve segment at a terminal point of the first curve segment, and computing a filter region based at least in part on a maximum angular deviation from the first direction vector.
- Computing the one or more curve segments further includes: identifying a proximal point of the candidate second curve segment, determining that the proximal point of the candidate second curve segment is within the filter region, and designating the candidate second curve segment as an accepted candidate second curve segment based at least in part on determining that the proximal point of the candidate second curve segment lies within the filter region. Still further, the instructions cause the one or more processors to register the imaging coordinate system with the sensing coordinate system based on the transformation.
- a system comprises one or more processors and a tangible, non-transitory, computer readable medium storing instructions that, when executed by one or more processors, cause the one or more processors to receive, from a sensing unit, shape data indicative of a shape of the flexible elongate device, wherein the shape data is in a sensing coordinate system associated with the sensing unit.
- the instructions further cause the one or more processors to obtain, based on image data received from an imaging system, a three-dimensional representation of an anatomical structure and a portion of a flexible elongate device disposed in the anatomical structure, wherein the three-dimensional representation is in an imaging coordinate system associated with the imaging system.
- the instructions further cause the one or more processors to compute, based at least in part on the received three-dimensional representation, one or more curve segments indicative of the shape of the flexible elongate device in the imaging coordinate system and compute a transformation between the imaging coordinate system and the sensing coordinate system based at least in part on the shape data and the one or more curve segments indicative of the shape of the flexible elongate device in the imaging coordinate system.
- Computing the one or more curve segments includes: computing a first curve segment and a candidate second curve segment, computing a first direction vector representative of direction of the first curve segment at a terminal point of the first curve segment, and computing a filter region based at least in part on a maximum angular deviation from the first direction vector.
- Computing the one or more curve segments further includes: identifying a proximal point of the candidate second curve segment, determining that the proximal point of the candidate second curve segment is within the filter region, and designating the candidate second curve segment as an accepted candidate second curve segment based at least in part on determining that the proximal point of the candidate second curve segment lies within the filter region. Still further, the instructions cause the one or more processors to register the imaging coordinate system with the sensing coordinate system based on the transformation.
- a method comprises receiving, by one or more processors and from a sensing unit, shape data indicative of a shape of the flexible elongate device, wherein the shape data is in a sensing coordinate system associated with the sensing unit.
- the method further comprises obtaining, by the one or more processors and based on image data received from an imaging system, a three-dimensional representation of an anatomical structure and a portion of a flexible elongate device disposed in the anatomical structure, wherein the three-dimensional representation is in an imaging coordinate system associated with the imaging system.
- the method further comprises computing, by the one or more processors and based at least in part on the received three-dimensional representation, one or more curve segments indicative of the shape of the flexible elongate device in the imaging coordinate system, and computing a transformation between the imaging coordinate system and the sensing coordinate system based at least in part on the shape data and the one or more curve segments indicative of the shape of the flexible elongate device in the imaging coordinate system.
- Computing the one or more curve segments includes: computing a first curve segment and a candidate second curve segment, computing a first direction vector representative of direction of the first curve segment at a terminal point of the first curve segment, and computing a filter region based at least in part on a maximum angular deviation from the first direction vector.
- Computing the one or more curve segments further includes: identifying a proximal point of the candidate second curve segment, determining that the proximal point of the candidate second curve segment is within the filter region, and designating the candidate second curve segment as an accepted candidate second curve segment based at least in part on determining that the proximal point of the candidate second curve segment lies within the filter region. Still further, the method comprises registering, by the one or more processors, the imaging coordinate system with the sensing coordinate system based on the transformation.
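The filter region described above amounts to a cone test: a candidate second curve segment is accepted when the vector from the first segment's terminal point to the candidate's proximal point deviates from the reference direction vector by no more than the maximum angular deviation. A hypothetical Python sketch (not from the patent; names and the degree-based threshold are illustrative):

```python
import math

def within_filter_region(terminal_point, direction, proximal_point, max_angle_deg):
    """True if `proximal_point` lies inside a cone of half-angle
    `max_angle_deg` about the (nonzero) tangent `direction`, with its
    apex at the first segment's terminal point."""
    v = [p - t for p, t in zip(proximal_point, terminal_point)]
    norm_v = math.sqrt(sum(c * c for c in v))
    norm_d = math.sqrt(sum(c * c for c in direction))
    if norm_v == 0.0:
        return True  # coincident points: trivially within the region
    cos_angle = sum(a * b for a, b in zip(v, direction)) / (norm_v * norm_d)
    cos_angle = max(-1.0, min(1.0, cos_angle))  # guard rounding error
    return math.degrees(math.acos(cos_angle)) <= max_angle_deg
```

For example, with the reference direction along +z and a 30° maximum deviation, a proximal point at (1, 0, 2) is accepted (≈26.6° off-axis) while one at (1, 0, 0.1) is rejected (≈84° off-axis).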
- FIG. 1A depicts an example system for navigating during a medical procedure within an operating environment.
- FIG. 1B is a simplified diagram of a flexible elongate device disposed within an anatomical structure.
- FIG. 2 is a schematic representation of an example operating environment including a robotic-assisted system for controlling a flexible elongate device and an intraoperative imaging system.
- FIG. 3 is a schematic representation of an example timeline for obtaining intraoperative images and/or shape sensor data.
- FIG. 4A depicts an example flexible elongate device representation extracted from imaging data in a joint coordinate system with shape sensor data prior to registration.
- FIG. 4B depicts a rigid body in two example coordinate systems.
- FIG. 4C schematically illustrates an example coordinate transformation process.
- FIGS. 5A and 5B schematically illustrate example sources of error in coordinate registration.
- FIG. 6B schematically illustrates example candidate segments of a flexible elongate device extracted from the image in FIG. 6A.
- FIG. 6D schematically illustrates example techniques for selecting between alternative candidate centerline curve segments based on candidate segments of a flexible elongate device of FIG. 6B.
- FIG. 8A is a simplified diagram of a medical instrument system according to some examples.
- the distal portion or distal end of an instrument is closer to a procedural site than a proximal portion or proximal end of the instrument when the instrument is being used as designed to perform a procedure.
- the systems and methods discussed herein reduce the sliding error by separating computation of the 6-DoF transformation into two parts.
- One part includes computing a translational transformation between tip position in the imaging data and tip position in shape data.
- the tip position in the imaging data is obtained from an operator via a graphical user interface.
- Another part includes computing a rotational transformation between the shape data and the centerline curves after registering the tip position between the two coordinate systems.
- the rotational transformation may be computed using optimization over the three-dimensional rotational angle space to minimize a cost function indicative of quality of overlap (and, thus, registration quality).
- the cost function may place higher weight on aligning points closer to the tip than on aligning points farther from the tip.
- an additional 6-DoF transformation may be computed to fine-tune registration using constrained optimization.
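One way to realize the tip-weighted rotational fit is sketched below. For brevity the search runs over a single rotation angle about the z-axis rather than the full three-dimensional rotation space the disclosure describes, and the exponential decay is only a stand-in for "higher weight closer to the tip"; all names and parameters are illustrative, not from the patent:

```python
import math

def rotate_z(p, theta):
    # Rotate a 3-D point about the z-axis by `theta` radians.
    c, s = math.cos(theta), math.sin(theta)
    x, y, z = p
    return (c * x - s * y, s * x + c * y, z)

def weighted_cost(theta, sensed, imaged, decay=0.5):
    # Points are ordered tip-first; weights fall off with index so the
    # alignment near the tip dominates the cost.
    cost = 0.0
    for i, (p, q) in enumerate(zip(sensed, imaged)):
        w = math.exp(-decay * i)
        r = rotate_z(p, theta)
        cost += w * sum((a - b) ** 2 for a, b in zip(r, q))
    return cost

def best_rotation(sensed, imaged, steps=3600):
    # Coarse 1-DoF grid search; a real implementation would optimize
    # over the full 3-D rotation-angle space, e.g. with a nonlinear solver.
    return min((weighted_cost(2 * math.pi * k / steps, sensed, imaged),
                2 * math.pi * k / steps) for k in range(steps))[1]
```

Because the cost is lowest where rotated sensed points overlap the imaged centerline, minimizing it yields the rotational transformation, and the weighting keeps the fit anchored near the registered tip.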
- An example coordinate registration problem is discussed below with reference to FIG. 4A.
- a general discussion of coordinate transformations is below with reference to FIGS. 4B and 4C.
- the sliding error and an example technique to reduce it by separating translational and rotational transformations are further discussed below, e.g., with reference to FIGS. 5A-D.
- a graphical user interface on a display device may render the flexible elongate device from the shape data transformed onto the imaging coordinate system such that the transformed shape data is overlaid over the image of the flexible elongate device.
- An operator may then analyze the image to provide an indication of registration quality. If the registration quality is poor, the graphical user interface may prompt the operator to correct the indication of the tip position in the imaging data (such as by providing a new indication of the position of the tip in the imaging data).
- the systems and methods discussed herein include synchronizing shape data and imaging data by selecting shape data (e.g., from a time sequence of shape data) corresponding to an imaging time at which the imaging data was captured.
- shape data corresponding to a time sequence may be stored in a shape data buffer.
- At least a subset of the shape data buffer may be selected for processing based on an indication of imaging time.
- the indication of imaging time may be obtained from the imaging system or from an operator.
- Shape data may be selected from the shape data buffer based at least in part on finding the time when the shape data remains substantially static, which may be indicative of a subject holding their breath during the imaging time. Synchronization of shape data and imaging data is discussed in more detail below, e.g., with reference to FIGS. 2 and 3.
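Selecting breath-hold shape data from the buffer amounts to finding the stretch of buffered shapes with the least frame-to-frame motion. A minimal sketch under assumed data shapes (hypothetical names; real shape data would carry timestamps and pose, not bare points):

```python
def most_static_window(buffer, window):
    """Start index of the `window`-frame span over which buffered
    shapes move least. `buffer` is a time sequence of shapes; each
    shape is a list of 3-D points sampled along the device."""
    def motion(a, b):
        # Sum of squared point displacements between consecutive shapes.
        return sum(sum((p - q) ** 2 for p, q in zip(pa, pb))
                   for pa, pb in zip(a, b))
    best, best_i = float("inf"), 0
    for i in range(len(buffer) - window + 1):
        total = sum(motion(buffer[j], buffer[j + 1])
                    for j in range(i, i + window - 1))
        if total < best:
            best, best_i = total, i
    return best_i
```

The stillest window is taken as the breath-hold interval, and the shape data within it is paired with the imaging time for registration.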
- the present disclosure outlines a method for selecting a set of centerline curve segments from a plurality of candidate centerline curve segments.
- Using a plurality of centerline curve segments may result in a higher quality registration than using only the centerline curve segment including the tip.
- including a candidate centerline curve segment that does not belong to the flexible elongate device may degrade the quality of registration.
- Candidate centerline curve segments to add to a selected centerline curve segment may be rejected when they do not fall within a filter region determined by the geometry of the shape of the selected centerline segment. More specifically, a reference direction vector tangent to the selected centerline curve segment at the segment’s terminal point may define a filter region.
- the filter region may encompass a region defined by a set of directions within a range of the reference direction vector.
- Proximal points (e.g., nearest to the terminal point of the selected centerline curve segment) of respective candidate centerline curve segments may determine whether a given candidate centerline curve segment is accepted.
- Aggregation of candidate centerline curve segments into the one or more centerline curve segments that represent the flexible elongate device in imaging coordinates is discussed in more detail below, e.g., with reference to FIGS. 6A-D.
- the system 100 includes a processing unit 125 and a display unit 130 in communicative connection with each other.
- the imaging unit 110 and the sensing unit 120 are depicted as being distinct from the system 100, in other examples, the system 100 may include the imaging unit 110 and/or the sensing unit 120.
- one or more processors of the processing unit 125 of the system 100 may be configured to receive images and/or processed image information from the imaging unit 110 and to receive data from the one or more sensors by way of the sensing unit 120.
- the instructions may cause the processing unit 125 to perform image processing operations on the images received from the imaging unit 110 and/or to perform computations (e.g., for coordinate registration) based on the data received by way of the sensing unit 120.
- the instructions may cause the processing unit 125 to cause the display unit 130 to display, via a GUI, information based on the processing of images received from the imaging unit 110 and the processing of data received by way of the sensing unit 120.
- the processing unit 125 may send the information, or send data representing the entire GUI including the information, to the display unit 130.
- An operator (e.g., a physician, another medical practitioner, or a fully-automated robotic surgery system) may perform a medical procedure (e.g., endoscopy, biopsy, pharmacological treatment, and/or treatment, such as ablation). During the procedure, the operator may control a flexible elongate device 140 inserted through an orifice O (or through a suitable surgical incision) into an anatomical structure A of a patient P disposed at a table T.
- the medical procedure may include navigating the flexible elongate device 140 (indicated with solid lines outside and dashed lines inside the patient P) toward a region of interest, or ROI, R within the anatomical structure A with the aid of information displayed at the display unit 130.
- the ROI R may be a designated procedure site for visual examination, biopsy, treatment, or any other medical procedure.
- the ROI R may be referred to as a region or a target.
- One or more fiducials may be disposed at (e.g., physically contacting, integrated within, fixedly attached to, or removably attached to in a manner that, during operation/use, forms a rigid relationship with) the flexible elongate device 140.
- the fiducials are configured to be visible in images obtained by the imaging unit 110 to thereby enhance visibility of the flexible elongate device 140 and/or to aid in identifying certain points along the device 140.
- the fiducials may include elements of a variety of materials and/or structures such as metals, plastics, etched glass, dyes, radioactive or fluorescent markings, confined fluids (e.g., bubbles), etc.
- the fiducial 142 may be integrated (e.g., etched, deposited, painted, or otherwise fixedly attached) onto the flexible elongate device 140. Additionally or alternatively, the fiducials may include elements removably disposed at the flexible elongate device 140. For example, the fiducials may be integrated onto a removable structure, such as a sleeve or a stylet, which in turn may be removably attached to the flexible elongate device 140.
- One or more sensors may also be disposed at the flexible elongate device. The sensors may be mechanical sensors, optical sensors, electromagnetic (EM) sensors, or any other suitable sensors. The sensors may be integrated into the flexible elongate device 140, or removably attached to the flexible elongate device 140. The sensors may be configured to communicate with the sensing unit 120.
- the sensors may be fiberoptic sensors disposed along the length of the flexible elongate device.
- the fiberoptic sensors may include Bragg gratings and/or materials to enhance non-linear scattering.
- the fiberoptic sensors may be configured to change spectral reflectivity based on material strain. Such sensors may scatter light emitted, for example, by the sensing unit 120 in a manner that indicates locations and degrees of bends in the flexible elongate device 140.
- IMUs and/or IMMUs may generate signals indicative of motion of the flexible elongate device (e.g., caused by motion of an anatomical structure due to breathing and/or other factors, and/or independent motion of the flexible elongate device within the anatomical structure).
- the sensing unit 120 may combine indications of orientation (e.g., up to three degrees of freedom) from IMUs with indications of position from other (e.g., EM) sensors to generate more complete data indicative of pose of the flexible elongate device.
- the sensing unit 120 may combine data from sensors in multiple sensor coordinate systems.
- the sensing unit 120 and/or the processing unit 125 may register multiple sensor coordinate systems with each other.
- the system 100 may use any suitable combinations of the sensors described above to obtain shape data indicative of the shape of the flexible elongate device 140 (e.g., pose, including location and orientation, at different points along the length of the flexible elongate device 140).
- FIG. 1B is a simplified diagram of the flexible elongate device 140 disposed within the anatomical structure A.
- FIG. 1B is included to give an expanded and more detailed view of a portion of the operating environment 101 disposed within the field of view F.
- the anatomical structure A may be a lung of the patient P.
- the flexible elongate device 140 may be inserted into and navigated by the operator toward the region R (e.g., target of the medical procedure), for example, for the purpose of investigating or treating a pathology in the region R.
- the techniques described in the present disclosure can facilitate the navigation process by generating and displaying timely and accurate imaging and sensing of the flexible elongate device 140 and imaging of the anatomical structure A.
- These techniques incorporate location, pose, and/or shape data in one coordinate system obtained using the sensors and location, pose, and/or shape data obtained using imaging in another coordinate system.
- the combined data may improve speed, accuracy, reliability and/or safety of a medical procedure.
- the processing unit 125 may combine the data from the imaging unit 110 and/or the sensing unit 120 to determine the position, orientation, and/or pose of at least a portion of the flexible elongate device 140 within reference to the anatomical structure A of the patient P and, particularly, the ROI or target R.
- the techniques of this disclosure can improve speed and/or accuracy of the determination.
- the processing unit 125 may generate a graphical user interface (GUI) or update GUI data for display on the display device 130 to aid an operator with the medical procedure.
- the processing unit 125 may generate data and/or control signals for a control unit of a robotic system configured to manipulate and/or navigate the flexible elongate device 140.
- the processing unit 125 may be configured to generate, based on the combined imaging and sensing data, one or more alerts.
- the alerts may include, for example, an alert indicating proximity to the region R, an alert indicating a potential navigation error, an alert indicating that confidence in the location of the tip of the flexible elongate device 140 has fallen below a threshold level, etc.
- combining intra-operative data from the imaging unit 110 and the sensing unit 120 entails registering two coordinate systems with respect to each other, e.g., as described with reference to FIGS. 4A-C and 5A-D below.
- a suitably configured imaging unit may generate the intra-operative imaging data.
- the intra-operative imaging data may include, for example, computed tomography (CT), particularly cone-beam computed tomography (CBCT) data.
- the imaging unit may include a C-arm CBCT imaging system, and/or the intra-operative imaging data may include magnetic-resonance imaging (MRI) data.
- the intraoperative imaging data may be obtained using thermography, ultrasound, optical coherence tomography (OCT), thermal imaging, impedance imaging, laser imaging, nanotube X-ray imaging or any other suitable imaging technique.
- Intra-operative imaging data may include fluoroscopy X-ray data (e.g., generated by a C-arm X-ray device), and, particularly, tomosynthesis data (reconstructed from 2D X-ray images, such as fluoroscopy images, into 3D volume).
- FIG. 2 is a schematic representation of an example operating environment 201 (which may be operating environment 101) including a robotic system 205 (which may be system 100) for performing minimally invasive operating procedures.
- the operating environment 201 includes an X-ray imaging system 210 (which may be the imaging unit 110).
- the X-ray imaging system 210 can be referred to as the imaging system 210.
- the imaging system 210 includes a motion actuation unit 211 in mechanical connection with a C-arm 212.
- the C-arm 212 is in mechanical connection with an X-ray source 213 and an X-ray imaging detector 214.
- the imaging system 210 includes an operator interface unit 215, a communication interface 216, and a processing unit 218.
- the processing unit 218 is in communicative connection with the motion actuation unit 211, the X-ray source 213, the X-ray imaging detector 214, the operator interface unit 215, and the communication interface 216.
- a sensing unit included in the robotic system 205 may be detached from the manipulator assembly 234. Still in other examples, a sensing unit (e.g., sensing unit 120) may be separate from the robotic system 205 but in communicative connection with the processing unit 238.
- the sensing unit (e.g., the sensing unit 235 disposed at the manipulator assembly 234) is configured to sense the shape of the flexible elongate device at different points in time (e.g., periodically and/or on demand) and send shape data to the processing unit 238.
- the processing unit 238 may include buffer memory or, simply, a buffer to store shape data over a limited time interval.
- the operating environment 201 includes, disposed on top of the table T, the patient P within whose anatomical structure A a portion of the flexible elongate device 240 is disposed during the minimally invasive procedure.
- the operating environment 201 further includes a medical professional M who may operate the robotic system 205 and/or the imaging system 210 to conduct a minimally invasive medical procedure.
- the medical professional M may be referred to as the operator M of the robotic system 205 and/or imaging system 210.
- the motion actuation unit 211 rotates the C-arm 212 while the X-ray source 213 emits X-rays that pass through the anatomical region A and are detected by the X-ray imaging detector 214, generating projection data at a set of angles.
- the processing unit 218 reconstructs projection data to generate image data.
- the image data includes a 3D representation of the anatomical structure A and a portion of the flexible elongate device 240 disposed in the anatomical structure A.
- the processing unit 238 of the robotic system 205 may receive the imaging data from the imaging system 210 by way of a communication link 250 connecting the communication interfaces 216 and 236.
- the communication link may be wired (e.g., electrical or fiberoptic) or wireless (e.g., radio) and use any suitable standard protocol (e.g., ethernet, WiFi, Bluetooth, etc.) or a custom protocol.
- FIG. 3 is a schematic representation of an example timeline 300 for obtaining intraoperative images.
- the example timeline 300 may represent the imaging cycle and the transfer of image data in the example operating environment 201 above.
- the timeline 300 includes a time axis 310 depicting the passage of time, a shape sensor recording process 320 (e.g., performed by the robotic system 205) and steps 331 -335 of obtaining intraoperative images by an imaging system (e.g., the imaging system 210).
- the shape sensor recording process 320 is ongoing, continually generating shape data with a suitable sampling interval of 0.2, 0.5, 1, 2, 5, 10, or 20 sec, or any other suitable periodicity.
- a processing unit (e.g., the processing unit 238) of a robotic system (e.g., the robotic system 205) may store at least some of the shape data (e.g., received from the sensing unit 235) in a buffer of a memory device within or communicatively connected to the processing unit. From the shape data stored in the buffer, the processing unit may select a portion that corresponds to the imaging data (e.g., the time when projection data is acquired during a spin of the C-arm 212).
- the imaging system may start a spin step 333 at time t1 and end the spin step at time t2.
- the time interval between the time t1 and the time t2 may be referred to as an imaging time window.
- the imaging system may continue with a reconstruction step 334 to generate image data from projection data and a transfer step 335 to send the image data to the robotic system.
- the length of each of the steps 331-335 may be variable and depend on settings of the imaging system.
- Time t3 marks the end of the shape data in the buffer designated for processing to select the portion corresponding to the imaging data.
- the time t3 may be based on an ending trigger generated by the operator at the robotic system or generated and sent to the robotic system by the imaging system.
- the ending trigger is generated after a reconstruction step 334 during a transfer step 335.
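The buffering and window-selection steps above can be sketched as follows. This is a minimal illustration under stated assumptions: the class and parameter names are hypothetical, and real shape samples would be full pose sets along the device rather than single points.

```python
from bisect import bisect_left, bisect_right

class ShapeBuffer:
    """Bounded buffer of timestamped shape samples (hypothetical sketch).

    Each sample is (timestamp, shape), where `shape` stands in for an
    ordered list of 3D points along the flexible elongate device.
    """
    def __init__(self, max_samples=1000):
        self.max_samples = max_samples
        self.samples = []  # kept in timestamp order

    def record(self, timestamp, shape):
        self.samples.append((timestamp, shape))
        # Discard the oldest sample once the buffer is full.
        if len(self.samples) > self.max_samples:
            self.samples.pop(0)

    def select_window(self, t_start, t_end):
        """Return the samples acquired during the imaging window [t_start, t_end]."""
        times = [t for t, _ in self.samples]
        lo = bisect_left(times, t_start)
        hi = bisect_right(times, t_end)
        return self.samples[lo:hi]

buf = ShapeBuffer()
for t in range(10):                      # samples recorded at t = 0..9
    buf.record(float(t), shape=[(0.0, 0.0, float(t))])
window = buf.select_window(2.0, 5.0)     # imaging spin from t1 = 2 to t2 = 5
print(len(window))                       # 4 samples: t = 2, 3, 4, 5
```

In practice the window bounds would come from the start/end triggers (times t1 and t2) rather than being chosen by the caller.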
- FIG. 4A depicts, in a joint coordinate system 401, an example flexible elongate device representation 402 extracted from imaging data, and shape data 403 prior to registration.
- the joint coordinate system 401 may use imaging system coordinates for the representation 402 and sensing coordinates for the shape data 403. Under the condition that a transformation mapping the imaging coordinate system to the sensing coordinate system is isometric, there is an isometric mapping that can overlay the representation 402 onto the shape data 403 and enable coordinate registration between the imaging coordinate system and the sensing coordinate system.
- the shape data 403 may take the form of an ordered set of poses corresponding to points along the flexible elongate device, an ordered set of points in 3D, or another suitable form.
- the shape data 403 may be represented by a set of splines that allow generating an ordered set of points at any spacing.
- the representation 402 extracted from the imaging data may be a point cloud, a subset of points in a 3D grid with optional weights at the points, or another suitable representation.
- the representation 402 may also be a set of parametric polynomial curves and/or splines. The parametric curves or splines may, for example, enable an ordered point form of at least portions of the representation 402.
- the representation 402 may be disjoint or segmented. Regions 404a and 404b may exist in an image where identifying and segmenting points corresponding to the flexible elongate device is difficult. For example, computed probabilities of belonging to the representation 402 for points in the regions 404a and 404b may be below a threshold probability. Ordering segments of the representation 402 is discussed in more detail with reference to FIGS. 6B-D.
- the representation 402 may be a set of parametric splines tracing a centerline curve in 3D. That is, an ordered set of points (x[n],y[n],z[n]) form of the representation 402 may be generated by sampling spline curves x(s), y(s), and z(s). The domain of the parameter s may be restricted to intervals that define the segments of the representation 402.
- the shape data 403 may be represented by parametric spline curves x'(s), y'(s), and z'(s).
- a mapping by a rigid transformation may exist that maps any point (x(s),y(s),z(s)) onto a corresponding point (x'(s),y'(s),z'(s)) for any s within the domain of s.
- Such a mapping, which enables coordinate registration, is discussed in more detail with reference to FIGS. 4B and C.
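The parametric-curve point generation described above can be illustrated with a small sketch. The curves x(s), y(s), z(s) here are hypothetical stand-ins for fitted splines, and the segment intervals are arbitrary:

```python
import math

# Hypothetical parametric centerline: a gentle helix-like curve segment.
def x(s): return math.cos(s)
def y(s): return math.sin(s)
def z(s): return 0.5 * s

def sample_curve(s_start, s_end, n):
    """Generate an ordered point set (x[n], y[n], z[n]) by sampling the
    parametric curves at n evenly spaced values of s."""
    step = (s_end - s_start) / (n - 1)
    return [(x(s_start + i * step), y(s_start + i * step), z(s_start + i * step))
            for i in range(n)]

# Restricting the domain of s to disjoint intervals yields the individual
# segments of the representation.
segment_a = sample_curve(0.0, 1.0, 5)    # first segment
segment_b = sample_curve(1.5, 2.5, 5)    # disjoint second segment
print(len(segment_a), len(segment_b))    # 5 5
```

Because the parameterization is shared, the same sampling scheme can produce ordered point sets at any spacing, which is what the registration steps below operate on.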
- FIG. 4B depicts a rigid body 412 in two example coordinate systems to illustrate the concept of registration between two coordinate systems.
- a first coordinate system 414 may correspond to coordinates of shape data obtained or generated, for example, by a sensing unit (e.g., the sensing unit 120 or the sensing unit 235).
- the sensing unit may be a part of a manipulator assembly (e.g., manipulator assembly 234) configured to actuate and/or manipulate a flexible elongate device (e.g., the flexible elongate device 140 or 240).
- the first coordinate system 414 may therefore represent the coordinate system of the sensing unit.
- a second coordinate system 416 may correspond to coordinates of images obtained or generated by an imaging system (e.g., the imaging unit 110 or the imaging system 210).
- the first coordinate system 414 may be referred to as the sensing coordinate system.
- the second coordinate system 416 may be referred to as the imaging coordinate system.
- the rigid body 412 may represent a portion (e.g., a segment of length) of the flexible elongate device (e.g., flexible elongate device 140 or 240). Although the flexible elongate device is flexible, a short portion (e.g., an infinitesimal segment) may be approximated as a rigid body. Furthermore, for the purpose of registration between the sensing coordinate system 414 and the imaging coordinate system 416, the flexible elongate device may be considered a rigid body if the coordinate systems 414 and 416 are isometric with each other.
- a state (e.g., position and orientation) of the rigid body 412 in 3D space can be described using the first coordinate system 414 and/or the second coordinate system 416.
- a rigid body without symmetries (e.g., the body 412) has six degrees of freedom.
- in the first coordinate system 414, the rigid body 412 may have coordinates (x, y, z, θ, φ, α), where (x, y, z) describes the position of, for example, a center 418 of the body 412 with respect to the origin O.
- the position coordinates may be designated for any point within the rigid body 412, or, in fact, any point in a rigidly defined geometric relationship to the body 412.
- the angles θ and φ can describe the orientation of the body 412 with the aid of an orientation vector 419, which, in the example of FIG. 4B, originates in the center 418 and goes through the middle of one of the facets of the rigid body 412.
- θ may be an elevation angle with respect to the z-axis, φ may be an azimuthal angle measured parallel to the xy-plane, and α may be the angle of rotation of the rigid body 412 around the orientation vector 419.
- alternatively, the three orientation coordinates may be roll, pitch and yaw of the rigid body 412 with respect to any suitable reference direction.
- analogously, coordinates (x', y', z', θ', φ', α') of the second coordinate system 416 describe position (x', y', z') with respect to origin O' and orientation (θ', φ', α').
- Registering the first coordinate system 414 with the second coordinate system 416, at least in the vicinity of the rigid body 412, includes finding a mapping (e.g., a transformation, a mathematical relationship, etc.) at least between the coordinates (x, y, z, θ, φ, α) and the coordinates (x', y', z', θ', φ', α').
- the mapping defines a corresponding point (u, v, w) in the vicinity of position (x, y, z) for any point (u’, v’, w’) in the vicinity of position (x’, y’, z’).
- the mapping from the first coordinate system 414 to the second coordinate system 416 may include one or more scaling factors for the axes.
- the mapping may include three translation variables, three rotation variables, and/or three scaling variables.
- each of the variables may depend on position and the mapping may include deformations.
- each of the coordinate systems 414 and 416 is independently calibrated to have accurate and consistent scaling within a shared operating volume. The coordinate registration process may then be defined in terms of three translation constants and three rotation constants for the shared operating volume. In other examples, gradual variations in scaling within at least one of the coordinate systems 414 and/or 416 may necessitate use of up to three translation variables and up to three rotation variables, each a function of location within the shared operating volume.
- FIG. 4C schematically illustrates an example coordinate transformation process.
- a processing unit (e.g., the processing units 125, 238) may obtain or determine a position and an orientation of a rigid body (e.g., rigid body 412) within the first coordinate system 414 as (x, y, z, θ, φ, α) and within the second coordinate system 416 as (x', y', z', θ', φ', α').
- the processing unit may then generate a mapping, M, between the two coordinate systems 414 and 416.
- the processing unit may be configured to map a new position (u', v', w') within the second coordinate system 416 onto a corresponding position (u, v, w) within the first coordinate system 414. Additionally or alternatively, the processing unit may be configured to map coordinates from the first coordinate system 414 to the second coordinate system 416. The mapping may be valid only in a region around (x, y, z). By collecting rigid body coordinates in the two coordinate systems 414 and 416, the processing unit may extend the validity of the mapping over any portion of a shared operating volume of the two coordinate systems 414 and 416.
- the processing unit may implement the coordinate registration as a linear mapping:

  (u, v, w)^T = diag(S11, S22, S33) · Rz(γ) · Ry(β) · Rx(α) · (u', v', w')^T + (d1, d2, d3)^T

  where the rotation matrices are

  Rx(α) = [1, 0, 0; 0, cos α, −sin α; 0, sin α, cos α],
  Ry(β) = [cos β, 0, sin β; 0, 1, 0; −sin β, 0, cos β],
  Rz(γ) = [cos γ, −sin γ, 0; sin γ, cos γ, 0; 0, 0, 1].
- α, β and γ are rotation parameters (e.g., roll, pitch and yaw)
- S11, S22 and S33 are scaling parameters (which may be unity, as discussed above)
- d1, d2 and d3 are displacement factors.
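Under the assumption of constant mapping parameters, the linear mapping above can be sketched as follows (the function name and default values are illustrative, not from the source):

```python
import numpy as np

def registration_map(p_prime, alpha, beta, gamma,
                     scale=(1.0, 1.0, 1.0), d=(0.0, 0.0, 0.0)):
    """Map a point (u', v', w') from the second (imaging) coordinate system
    into the first (sensing) coordinate system via scaling, three rotations,
    and a displacement."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    cb, sb = np.cos(beta), np.sin(beta)
    cg, sg = np.cos(gamma), np.sin(gamma)
    Rx = np.array([[1, 0, 0], [0, ca, -sa], [0, sa, ca]])   # rotation about x
    Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])   # rotation about y
    Rz = np.array([[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]])   # rotation about z
    S = np.diag(scale)                                       # scaling (may be unity)
    return S @ Rz @ Ry @ Rx @ np.asarray(p_prime) + np.asarray(d)

# A 90-degree rotation about z maps (1, 0, 0) to (0, 1, 0) before displacement.
p = registration_map([1.0, 0.0, 0.0], alpha=0.0, beta=0.0, gamma=np.pi / 2,
                     d=(10.0, 0.0, 0.0))
print(np.round(p, 6))   # [10.  1.  0.]
```

The inverse mapping (first coordinate system to second) follows by inverting the rotations and scaling and negating the displacement.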
- the linear mapping may be a function of the input position coordinates (u’, v’, w’).
- the processing unit may store and/or access a lookup table to find an entry for mapping parameters corresponding to the input position coordinates. Because the lookup table can only have a limited number of recorded input coordinates (herein, recorded coordinates), the system may use the entry corresponding to the recorded coordinates nearest to the input coordinates. Alternatively, the system may interpolate mapping parameters corresponding to a set of recorded coordinates near the input coordinates. In other examples, the processing unit may store and/or access one or more polynomial, spline, or other suitable fit functions relating input coordinates to the mapping parameters.
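A minimal sketch of the nearest-entry lookup described above; the table contents and parameter structure are hypothetical (here each entry carries only a displacement vector):

```python
import math

# Hypothetical lookup table: recorded input coordinates -> mapping parameters.
LOOKUP = {
    (0.0, 0.0, 0.0): {"d": (1.0, 0.0, 0.0)},
    (10.0, 0.0, 0.0): {"d": (1.2, 0.1, 0.0)},
    (0.0, 10.0, 0.0): {"d": (0.9, -0.1, 0.0)},
}

def nearest_entry(point):
    """Return the mapping parameters recorded at the coordinates nearest
    to the input coordinates."""
    return min(LOOKUP.items(),
               key=lambda kv: math.dist(kv[0], point))[1]

params = nearest_entry((9.0, 1.0, 0.0))
print(params["d"])      # (1.2, 0.1, 0.0) -- the nearest recorded entry
```

An interpolating variant would instead blend the parameters of several nearby recorded coordinates, weighted by distance.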
- the mapping may relate the shape data 403 for the flexible elongate device obtained from a sensing system and the flexible elongate device representation 402 in the imaging coordinate system as discussed with reference to FIG. 4A. Furthermore, the system may use the mapping to update a target (e.g., target R) position within the instrument/robotic coordinate system, based on an identified target location within the imaging coordinate system.
- FIGS. 5A, B schematically illustrate example sources of error in coordinate registration.
- in a coordinate system 501 (e.g., the coordinate system 401), a portion of imaging data corresponding to a flexible elongate device is represented by centerline curve segments 502a-c.
- the coordinates of the centerline curve segments 502a-c in the coordinate system 501 may be imaging coordinates or coordinates obtained by any well- defined rigid transformation.
- Shape data 503, on the other hand, may be in sensing system coordinates or coordinates obtained by any well-defined rigid transformation.
- a well-defined rigid transformation simply means that the transformation is rigid, known and, therefore, reversible. Finding a rigid transformation that overlays the centerline curve segments 502a-c with the shape data 503 can therefore be used to register the imaging coordinate system with the sensing (e.g., robotic) coordinate system.
- it can be assumed that the aggregated centerline curve segments 502a-c and the shape data 503 are each represented by a respective collection of points (e.g., point clouds, ordered lists of points, etc.). Additionally, it can be assumed that for each point of the aggregated centerline curve segments 502a-c there is a corresponding point in the shape data 503.
- the centerline curve segments 502a-c need not have unique corresponding points in the shape data 503 or vice-versa.
- the system can compute the point-to-point correspondence using any suitable version of ICP or modified ICP algorithm, curve fitting of sets of points and subsequent sampling, and/or interpolation between points in point sets.
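A simple nearest-neighbor correspondence step, of the kind used inside ICP-style algorithms, might look like the sketch below. Real implementations use spatial indexing (e.g., k-d trees) rather than a linear scan, and iterate correspondence and transform estimation:

```python
import math

def nearest_correspondences(source, target):
    """For each point of the (aggregated) centerline segments, find the
    nearest point in the shape data. Correspondences need not be unique."""
    pairs = []
    for p in source:
        q = min(target, key=lambda t: math.dist(p, t))
        pairs.append((p, q))
    return pairs

centerline = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
shape = [(0.1, 0.0, 0.0), (1.1, 0.0, 0.0), (1.9, 0.0, 0.0)]
pairs = nearest_correspondences(centerline, shape)
print(pairs[0][1])   # (0.1, 0.0, 0.0)
```

The slip errors discussed next arise precisely because, along a straight or constant-curvature stretch, many such correspondences are nearly equally good.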
- in a region of nearly constant curvature, the point correspondence may shift or slip along the curve (e.g., shift 532), leading to a registration error.
- the registration error may result in an incorrect estimate of location of a tip (e.g., distal end, working end) of the flexible elongate device in the imaging coordinate system and, potentially, an error in estimating the relative position between the tip and a target (e.g., ROI R).
- a rigid transformation generating an overlap between the centerline curve segment 502c and a corresponding portion of the shape data 503 may be subject to another error.
- the straight overlap region may result in an optimization being insensitive to a linear shift 534. Therefore, in the presence of noise or errors in data, the optimization may settle on an incorrect position along the shift 534.
- using multiple centerline curve segments (e.g., at least two of the centerline curve segments 502a-c) may mitigate such errors.
- when the transformation between the imaging and sensing coordinate systems is slightly non-rigid (e.g., includes scaling or spatial nonuniformity), however, using multiple centerline curve segments may not improve registration. For example, an interplay between the shifts 532 and 534 may result in a registration error.
- the system may obtain a position of a tip 505 of the centerline curve segment 502a, as illustrated in FIG. 5C.
- Position of the tip 505 in imaging coordinates may be based on reconstructed imaging data, for example, by way of an operator (e.g., operator M in FIG. 2) selecting the position in imaging data displayed in a GUI on a display unit (e.g., display unit 130 or 230).
- Position of a tip 506 in shape data illustrated in FIG. 5C may be read directly from a sensing unit (e.g., sensing unit 120 or 235) in sensing coordinates.
- Subtracting the position of the tip 505 from the centerline curve segments 502a-c and subtracting the position of the tip 506 from the shape data 503 translates both the aggregate centerline curve segments 502a-c and the shape data 503 to the origin of the coordinate system 501, as illustrated in FIG. 5D.
- the translation transformations between FIG. 5C and D generate an accurate correspondence between points of the aggregate centerline curve segments 502a- c and the shape data 503.
- a rotational transformation 550 may then map the aggregate centerline curve segments 502a-c to the shape data 503.
- the translational and rotational transformations in FIGS. 5C and D can be combined to create a mapping between the imaging and the sensing coordinate system, as discussed, for example, with reference to FIGS. 4B and C.
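The tip-anchored translation followed by a rotation solve can be sketched as follows. Here the rotation is recovered with the Kabsch (SVD) method, which is one common choice rather than necessarily the method of the source; the function and variable names are illustrative:

```python
import numpy as np

def register_with_tip_constraint(centerline, shape, tip_img, tip_sense):
    """Translate both point sets so their tips coincide with the origin,
    then solve the remaining rotation with the Kabsch (SVD) method."""
    A = np.asarray(centerline) - np.asarray(tip_img)    # imaging points at origin
    B = np.asarray(shape) - np.asarray(tip_sense)       # sensing points at origin
    H = A.T @ B                                          # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))              # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T             # rotation mapping A -> B
    return R

# Synthetic check: shape data is the centerline rotated 90 degrees about z.
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
centerline = np.array([[0, 0, 0], [1, 0, 0], [2, 1, 0], [2, 1, 1]], float)
shape = centerline @ Rz.T
R = register_with_tip_constraint(centerline, shape, centerline[0], shape[0])
print(np.allclose(R, Rz))   # True
```

Fixing the translation from the tip positions removes the linear-shift ambiguity first, so the optimization only has to resolve the rotation.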
- the system may represent the image data corresponding to the flexible elongate device in other ways.
- the system may represent the flexible elongate device as segments of point sets segmented from image data.
- the points for example, may be associated with edges of the flexible elongate device, as discussed, for example with reference to FIGS. 6B and C.
- FIG. 6B schematically illustrates example candidate segments 650a-g of the flexible elongate device 640 segmented from the image 600 or the 3D image data associated with the image 600.
- the candidate segments 650a-g may be referred to, simply, as segments 650a-g.
- segments 650a-g are distinct from the centerline curve segments discussed with reference to FIG. 6D in that the segments 650a-g are volumetric features (e.g., point clouds representing volumetric features).
- the system may be configured to identify the candidate segments 650a-c that belong to the flexible elongate device 640 and reject the spurious candidate segments 650d-g.
- the system may represent each of the candidate segments 650a-c with a respective candidate centerline curve segment, as discussed with reference to FIGS. 6C, D.
- the system may be configured to identify the first segment 650a and to reject at least some of the other candidate segments 650b-g without computing at least some of the respective centerline curve segments.
- the system may identify the first segment 650a based on receiving an indication of tip position in imaging coordinates.
- the point cloud associated with the segment 650a may include the tip position or have points nearest to the tip position.
- the system may compute a filter region F1 to implement what can be referred to as a direction filter.
- the system may compute a direction vector at an end of the first segment 650a opposite from the tip 645.
- the direction vector 655a may be computed based on fitting the segment 650a with a centerline curve (e.g., centerline curve segment 660a in FIG. 6C) and taking the orientation in 3D of the centerline curve at the suitable terminal point.
- the filter region F1 may then be limited by an angular extent from the direction vector 655a.
- the filter region F1 may be bounded by a conical shell, isotropic around the direction vector 655a.
- alternatively, the region F1 may be anisotropic, with a shell surface of an elliptical cone.
- the centerline of the segment 650a may lie predominantly in a plane.
- the system may be configured to give the filter region F1 a larger in-plane angle than an orthogonal out-of-plane angle.
- the filter region may be bounded by a maximum distance from the terminal point of the segment 650a.
- the system may reject segments 650c and e-g as continuations of the first segment 650a because they lie entirely outside of the filter region F1.
- the system may designate segments 650b and d as suitable candidate second segments.
- the system may proceed to select candidate segment 650b as the second segment using suitable criteria (e.g., as discussed with reference to FIG. 6D) and proceed to find additional segments that represent the flexible elongate device 640.
- the system may repeat the filtering and selection process by treating segment 650b, in a sense, as the new “first segment,” computing the direction vector 655b, computing filter region F2, and identifying candidate segments 650c, f for expanding the segment-by-segment representation of the flexible elongate device 640. Selecting among candidate segments and identifying one or more centerline curve segments representing a flexible elongate device in image data, is discussed in more detail with reference to FIG. 6D.
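The directional filter described above reduces to a cone-plus-distance test. A minimal isotropic version might look like this; the angle and distance bounds are illustrative, and an anisotropic (elliptical-cone) variant would use separate in-plane and out-of-plane angle limits:

```python
import math

def in_filter_region(candidate_point, terminal_point, direction,
                     max_angle_deg=45.0, max_distance=50.0):
    """Accept a candidate point only if it lies within an angular extent of
    the direction vector and within a maximum distance of the terminal point."""
    v = [c - o for c, o in zip(candidate_point, terminal_point)]
    dist = math.sqrt(sum(x * x for x in v))
    if dist == 0 or dist > max_distance:
        return False
    dnorm = math.sqrt(sum(x * x for x in direction))
    cos_angle = sum(a * b for a, b in zip(v, direction)) / (dist * dnorm)
    return cos_angle >= math.cos(math.radians(max_angle_deg))

terminal = (0.0, 0.0, 0.0)
direction = (1.0, 0.0, 0.0)            # direction vector at the segment end
print(in_filter_region((10.0, 2.0, 0.0), terminal, direction))   # True
print(in_filter_region((0.0, 10.0, 0.0), terminal, direction))   # False (90 deg off-axis)
print(in_filter_region((100.0, 0.0, 0.0), terminal, direction))  # False (beyond max distance)
```

Candidate segments whose points all fail this test (like segments 650c, e-g above) are rejected as continuations.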
- FIG. 6D schematically illustrates example techniques for selecting between alternative candidate centerline curve segments based on candidate segments of a flexible elongate device of FIG. 6B.
- a system (e.g., the system 100) may compute candidate centerline curve segments for the candidate segments of FIG. 6B.
- the centerline curve segment 660a may be referred to as the first centerline curve segment 660a with reference to FIG. 6D.
- centerline curve segments may be referred to, more simply, as curve segments. Consequently, candidate centerline curve segments may be referred to as candidate curve segments, and, similarly, for other descriptors.
- Terminal points 670a and 670b of the first curve segment 660a are, respectively, the tip or the tip- proximal terminal point 670a and the tip-distal terminal point 670b.
- the system may identify the tip-proximal terminal point 670a based on receiving an indication of tip position from imaging data.
- Other terminal points 670c-g play various roles in assembling a set of curve segments to represent the flexible elongate device 640.
- the system may compute a first direction vector 680a and a respective filter region F3.
- the first direction vector 680a may be the same as the direction vector 655a, while the filter region F3 may be the same as the filter region F1.
- the system may compute or identify candidate second curve segments 660b and d by considering proximal points 670c, d, f and g of the remaining candidate curve segments 660d, b, c and e, respectively.
- the system may designate the remaining candidate curve segments 660b-e as candidate second curve segments.
- the proximal points 670c, d, f and g are closest, in their respective candidate curve segments 660d, b, c and e, to the distal terminal point 670b of the first curve segment 660a. Only two proximal points - 670c and d - are within the directional filter region F1.
- proximal points (e.g., to the terminal point 670b) need not be segment end points.
- the system may identify a proximal point to a reference point as a point at least as close to the reference point as all other points of the respective candidate curve segment.
- the system may identify a proximal point to a reference point as a terminal point at least as close to the reference point as an alternative terminal point of the respective candidate curve segment.
- the proximal points 670c, d, f and g with reference to the reference terminal point 670b fit both of the above descriptions of terminal points.
- of the candidate second curve segments 660b and d, the system may choose no more than one as the second curve segment. To that end, the system may compute direction vectors 685a and c at the respective proximal points 670d and c.
- the curve segment 660d may be referred to as the accepted candidate second curve segment 660d, and the curve segment 660b as an alternative accepted second curve segment 660b.
- likewise, the direction vector 685c may be referred to as a second direction vector, and the direction vector 685a as an alternative second direction vector.
- the direction vectors 685a and c are vectors tangent to the respective curve segments 660b and d at the respective proximal points 670d and c.
- the direction vectors 685a and c of the respective curve segments 660b and d may be vectors pointing to the reference terminal point 670b from the respective proximal points 670d and c.
- the system may compute a collinearity factor between the first direction vector 680a and the second direction vector 685c, and a collinearity factor between the first direction vector 680a and the alternative second direction vector 685a.
- the collinearity factor between the first direction vector 680a and the alternative second direction vector 685a may be referred to as an alternative collinearity factor.
- a collinearity factor depends on the magnitude of an acute angle formed between two vectors - the smaller the angle, the higher the collinearity factor, regardless of the chosen formula. Because the angle between the first direction vector 680a and the second direction vector 685c is larger than the angle between the first direction vector 680a and the alternative second direction vector 685a, the system can reject the candidate second curve segment 660d. In cases where the collinearity threshold is lower than the collinearity factor between the first direction vector 680a and the alternative second direction vector 685a, the system can designate the alternative candidate second curve segment 660b as the second curve segment 660b.
- the system may compute direction vectors and respective collinearity factors for selecting candidate second curve segments. That is, the system may outright reject a candidate (avoid accepting it as a candidate second curve segment) in the case that the respective collinearity factor falls below a threshold. In a sense, adding a threshold to a collinearity factor may be thought of as an alternative or additional filter criterion (along with directions and/or distances to proximal points).
- the system may select among alternative candidate second curve segments (e.g., curve segments 660b and d) based on the positions of respective proximal points (e.g., proximal points 670d and c of curve segments 660b and d, respectively) within a filter region (e.g., region F3).
- a candidate second curve segment with a proximal point closer to the directional center of the filter region (than a proximal point of another candidate second curve segment) may be accepted as the second curve segment.
- the system may reject the curve segment 660d (e.g., not accept the curve segment 660d as a candidate second curve segment) based on a low collinearity factor between the first direction vector 680a and the second direction vector 685c.
- the collinearity factor may be one of filter criteria.
- the system may repurpose the procedure described above to select a third curve segment from the candidate curve segments 660c and e. To that end, the system may compare the collinearity factor between direction vectors 680b and 685b to the collinearity factor between direction vectors 680b and 685d. The system can designate the candidate curve segment 660c, corresponding to the larger of the two collinearity factors, as the third curve segment, provided that the collinearity factor is above the threshold value.
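The collinearity-factor selection can be sketched as follows, using the cosine of the acute angle between direction vectors as the factor (one possible formula; the threshold value and candidate data are illustrative):

```python
import math

def collinearity(u, v):
    """Collinearity factor as the cosine of the acute angle between two
    direction vectors: the smaller the angle, the higher the factor."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    return abs(dot) / (nu * nv)

def select_next_segment(first_dir, candidates, threshold=0.8):
    """Among candidate curve segments, keep the one whose direction vector is
    most collinear with the first direction vector, if above the threshold."""
    best = max(candidates, key=lambda c: collinearity(first_dir, c["dir"]))
    if collinearity(first_dir, best["dir"]) < threshold:
        return None      # no acceptable continuation
    return best["name"]

first = (1.0, 0.0, 0.0)
candidates = [
    {"name": "660b", "dir": (0.95, 0.1, 0.0)},   # nearly collinear candidate
    {"name": "660d", "dir": (0.3, 0.9, 0.0)},    # sharply diverging candidate
]
print(select_next_segment(first, candidates))    # 660b
```

Repeating this selection segment by segment, as described above for the third curve segment, assembles the ordered set of curve segments representing the flexible elongate device.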
- FIGS. 7-9B depict diagrams of a medical system that may be used for manipulating a medical instrument that includes a flexible elongate device according to any of the methods and systems described above, in some examples.
- each reference above to the “system” may refer to a system (e.g., system 700) discussed below, or to a subsystem thereof.
- FIG. 7 is a simplified diagram of a medical system 700 according to some examples.
- the medical system 700 may include at least portions of the system 100 described with reference to FIG. 1 .
- the medical system 700 may be suitable for use in, for example, surgical, diagnostic (e.g., biopsy), or therapeutic (e.g., ablation, electroporation, etc.) procedures. While some examples are provided herein with respect to such procedures, any reference to medical or surgical instruments and medical or surgical methods is non-limiting.
- the systems, instruments, and methods described herein may be used for animals, human cadavers, animal cadavers, portions of human or animal anatomy, non-surgical diagnosis, as well as for industrial systems, general or special purpose robotic systems, general or special purpose teleoperational systems, or robotic medical systems.
- medical system 700 may include a manipulator assembly 702 that controls the operation of a medical instrument 704 in performing various procedures on a patient (e.g., patient P on table T, as in FIG. 1 ).
- the medical instrument 704 may include the flexible elongate device 140 of FIG. 1.
- Medical instrument 704 may extend into an internal site within the body of patient P via an opening in the body of patient P.
- the manipulator assembly 702 may be teleoperated, non-teleoperated, or a hybrid teleoperated and non-teleoperated assembly with one or more degrees of freedom of motion that may be motorized and/or one or more degrees of freedom of motion that may be non-motorized (e.g., manually operated).
- the manipulator assembly 702 may be mounted to and/or positioned near patient table T.
- a master assembly 706 allows an operator O (e.g., a surgeon, a clinician, a physician, or other user, as described above) to control the manipulator assembly 702.
- the master assembly 706 allows the operator O to view the procedural site or other graphical or informational displays.
- the manipulator assembly 702 may be excluded from the medical system 700 and the instrument 704 may be controlled directly by the operator O.
- the manipulator assembly 702 may be manually controlled by the operator O. Direct operator control may include various handles and operator interfaces for hand-held operation of the instrument 704.
- the master assembly 706 may be located at a surgeon's console which is in proximity to (e.g., in the same room as) the patient table T on which patient P is located, such as at the side of the patient table T. In some examples, the master assembly 706 is remote from the patient table T, such as in a different room or a different building from the patient table T.
- the master assembly 706 may include one or more control devices for controlling the manipulator assembly 702.
- the control devices may include any number of a variety of input devices, such as joysticks, trackballs, scroll wheels, directional pads, buttons, data gloves, trigger-guns, hand-operated controllers, voice recognition devices, motion or presence sensors, and/or the like.
- the manipulator assembly 702 supports the medical instrument 704 and may include a kinematic structure of links that provide a set-up structure.
- the links may include one or more non-servo controlled links (e.g., one or more links that may be manually positioned and locked in place) and/or one or more servo controlled links (e.g., one or more links that may be controlled in response to commands, such as from a control system 712).
- the manipulator assembly 702 may include a plurality of actuators (e.g., motors) that drive inputs on the medical instrument 704 in response to commands, such as from the control system 712.
- the actuators may include drive systems that move the medical instrument 704 in various ways when coupled to the medical instrument 704.
- one or more actuators may advance medical instrument 704 into a naturally or surgically created anatomical orifice.
- Actuators may control articulation of the medical instrument 704, such as by moving the distal end (or any other portion) of medical instrument 704 in multiple degrees of freedom.
- degrees of freedom may include three degrees of linear motion (e.g., linear motion along the X, Y, Z Cartesian axes) and three degrees of rotational motion (e.g., rotation about the X, Y, Z Cartesian axes).
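The six degrees of freedom described above (three translational, three rotational) are commonly represented together as a single 4x4 homogeneous transform. The following is a minimal illustrative sketch, not an implementation from this disclosure; the function name and the Z-Y-X rotation order are assumptions.

```python
import numpy as np

def pose(tx, ty, tz, roll, pitch, yaw):
    """Build a 4x4 homogeneous transform from three translations and
    three rotations about the X, Y, Z Cartesian axes (applied Z-Y-X).
    Illustrative only; not the patent's representation."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx          # the three rotational DOFs
    T[:3, 3] = (tx, ty, tz)           # the three translational DOFs
    return T

# A pure translation leaves the orientation block as the identity:
T = pose(1.0, 2.0, 3.0, 0.0, 0.0, 0.0)
```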
- One or more actuators may control rotation of the medical instrument about a longitudinal axis.
- Actuators can also be used to move an articulable end effector of medical instrument 704, such as for grasping tissue in the jaws of a biopsy device and/or the like, or may be used to move or otherwise control tools (e.g., imaging tools, ablation tools, biopsy tools, electroporation tools, etc.) that are inserted within the medical instrument 704.
- the control system 712 may include at least portions of the processing unit 125. Additionally or alternatively, the control system 712 may be in communicative connection with the processing unit 125. In some examples, the output of the processing unit 125 according to the techniques described above may cause the control system 712 to autonomously (without input from the operator O) control certain movements of the flexible elongate device 140.
- the medical system 700 may include a sensor system 708 (which may include at least a portion of the sensing unit 120) with one or more sub-systems for receiving information about the manipulator assembly 702 and/or the medical instrument 704.
- Such sub-systems may include a position sensor system (e.g., that uses electromagnetic (EM) sensors or other types of sensors that detect position or location); a shape sensor system for determining the position, orientation, speed, velocity, pose, and/or shape of a distal end and/or of one or more segments along a flexible body of the medical instrument 704; a visualization system for capturing images, such as from the distal end of medical instrument 704 or from some other location; and/or actuator position sensors such as resolvers, encoders, potentiometers, and the like that describe the rotation and/or orientation of the actuators controlling the medical instrument 704.
- the subsystems may include an imaging sub-system (e.g., using a color imaging device, an infrared imaging device, an ultrasound imaging device, an x-ray imaging device, a fluoroscopic imaging device, a computed tomography (CT) imaging device, a magnetic resonance imaging (MRI) imaging device, or some other type of imaging device), such as the imaging unit 110.
- the positions and orientations of sensors in the sensor system 708 may be determined in the sensor coordinate system.
- the sensor coordinate system is integrated with or identical to the coordinate system of the manipulator assembly 702.
- the medical system 700 may include a display system 710 (e.g., display device 130) for displaying an image or representation of the procedural site and the medical instrument 704.
- Display system 710 and master assembly 706 may be oriented so the operator O can control the medical instrument 704 and the master assembly 706 with the perception of telepresence.
- the display system 710 may include at least portions of the display unit 130.
- the medical instrument 704 may include a visualization system, which may include an image capture assembly that records a concurrent or real-time image of a procedural site and provides the image to the operator O through one or more displays of display system 710.
- the image capture assembly may include various types of imaging devices.
- the concurrent image may be, for example, a two-dimensional image or a 3D image captured by an endoscope positioned within the anatomical procedural site.
- the visualization system may obtain intra-operative images in image system coordinates, distinct from the sensor system coordinates.
- the visualization system may include endoscopic components that may be integrally or removably coupled to medical instrument 704.
- a separate endoscope attached to a separate manipulator assembly, may be used with medical instrument 704 to image the procedural site.
- the visualization system may be implemented as hardware, firmware, software or a combination thereof which interact with or are otherwise executed by one or more computer processors, such as of the control system 712.
- Display system 710 may also display an image of the procedural site and medical instruments, which may be captured by the visualization system.
- the medical system 700 provides a perception of telepresence to the operator O.
- images captured by an imaging device at a distal portion of the medical instrument 704 may be presented by the display system 710 to provide the perception of being at the distal portion of the medical instrument 704 to the operator O.
- the input to the master assembly 706 provided by the operator O may move the distal portion of the medical instrument 704 in a manner that corresponds with the nature of the input (e.g., distal tip turns right when a trackball is rolled to the right) and results in corresponding change to the perspective of the images captured by the imaging device at the distal portion of the medical instrument 704.
- the perception of telepresence for the operator O is maintained as the medical instrument 704 is moved using the master assembly 706.
- the operator O can manipulate the medical instrument 704 and hand controls of the master assembly 706 as if viewing the workspace in substantially true presence, simulating the experience of an operator that is physically manipulating the medical instrument 704 from within the patient anatomy.
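The telepresence correspondence described above amounts to re-expressing a master input, given in the displayed camera (view) frame, in the instrument tip frame so that, e.g., rolling a trackball right turns the tip right on screen. A minimal sketch under that assumption; the frame names and function are hypothetical, not from this disclosure.

```python
import numpy as np

def tip_command_from_master(master_delta_view, R_view_to_tip):
    """Rotate a master-input motion vector from the displayed view
    frame into the instrument tip frame, so the tip moves in the
    direction the operator sees. Illustrative assumption: the mapping
    is a pure rotation between the two frames."""
    return R_view_to_tip @ np.asarray(master_delta_view, dtype=float)

# With the view frame aligned to the tip frame, commands pass through:
cmd = tip_command_from_master([1.0, 0.0, 0.0], np.eye(3))
```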
- the display system 710 may present virtual images of a procedural site that are created using image data recorded pre-operatively (e.g., prior to the procedure performed by the medical instrument system 200) or intra-operatively (e.g., concurrent with the procedure performed by the medical instrument system 200), such as image data created using computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET), fluoroscopy, thermography, ultrasound, optical coherence tomography (OCT), thermal imaging, impedance imaging, laser imaging, nanotube X-ray imaging, and/or the like.
- the virtual images may include two-dimensional, 3D, or higher-dimensional images.
- display system 710 may display a virtual image that is generated based on tracking the location of medical instrument 704.
- the tracked location of the medical instrument 704 may be registered (e.g., dynamically referenced) with the model generated using the preoperative or intra-operative images, with different portions of the model corresponding with different locations of the patient anatomy.
- the registration is used to determine portions of the model corresponding with the location and/or perspective of the medical instrument 704 and virtual images are generated using the determined portions of the model. This may be done to present the operator O with virtual images of the internal procedural site from viewpoints of medical instrument 704 that correspond with the tracked locations of the medical instrument 704.
- the display system 710 may include the display unit 130 and may display images including the position, orientation, and/or pose of the medical instrument 704 in a joint coordinate system based on registering the sensor coordinate system with the imaging coordinate system according to the techniques described above with reference to FIGS. 2-6D.
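Registering the sensor coordinate system with the imaging coordinate system can be illustrated with a generic point-based rigid registration; the closed-form SVD (Kabsch) solution below is a standard sketch, not the specific technique of FIGS. 2-6D, and the point sets are synthetic.

```python
import numpy as np

def rigid_register(sensor_pts, imaging_pts):
    """Least-squares rotation R and translation t mapping sensor-frame
    points onto corresponding imaging-frame points (Kabsch/SVD)."""
    ps, pi = sensor_pts.mean(axis=0), imaging_pts.mean(axis=0)
    H = (sensor_pts - ps).T @ (imaging_pts - pi)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1 so R is a proper rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = pi - R @ ps
    return R, t

# Synthetic check: recover a known 90-degree rotation about Z plus a shift.
Rz = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])
pts = np.random.default_rng(0).normal(size=(20, 3))
R, t = rigid_register(pts, pts @ Rz.T + np.array([1., 2., 3.]))
```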
- the medical system 700 may also include the control system 712, which may include processing circuitry (e.g., the processing unit 125) that implements some or all of the methods or functionality discussed herein.
- the control system 712 may include at least one memory and at least one processor for controlling the operations of the manipulator assembly 702, the medical instrument 704, the master assembly 706, the sensor system 708, and/or the display system 710.
- Control system 712 may include instructions (e.g., a non-transitory machine-readable medium storing the instructions) that, when executed by the at least one processor, configure the processor to implement some or all of the methods or functionality discussed herein.
- While the control system 712 is shown as a single block in FIG. 7, the control system 712 may include two or more separate data processing circuits, with one portion of the processing being performed at the manipulator assembly 702, another portion of the processing being performed at the master assembly 706, and/or the like.
- control system 712 may include other types of processing circuitry, such as application-specific integrated circuits (ASICs) and/or field-programmable gate arrays (FPGAs).
- the control system 712 may be implemented using hardware, firmware, software, or a combination thereof.
- control system 712 may receive feedback from the medical instrument 704, such as force and/or torque feedback. Responsive to the feedback, the control system 712 may transmit signals to the master assembly 706. In some examples, the control system 712 may transmit signals instructing one or more actuators of the manipulator assembly 702 to move the medical instrument 704. In some examples, the control system 712 may transmit informational displays regarding the feedback to the display system 710 for presentation or perform other types of actions based on the feedback.
- The control system 712 may include a virtual visualization system to provide navigation assistance to operator O when controlling the medical instrument 704 during an image-guided medical procedure.
- Virtual navigation using the virtual visualization system may be based upon an acquired pre-operative or intra-operative dataset of anatomical passageways of the patient P.
- the control system 712 or a separate computing device may convert the recorded images, using programmed instructions alone or in combination with operator inputs, into a model of the patient anatomy.
- the model may include a segmented two-dimensional or 3D composite representation of a partial or an entire anatomical organ or anatomical region.
- An image data set may be associated with the composite representation.
- the virtual visualization system may obtain sensor data from the sensor system 708 that is used to compute an (e.g., approximate) location of the medical instrument 704 with respect to the anatomy of patient P.
- the sensor system 708 may be used to register and display the medical instrument 704 together with the pre-operatively or intra-operatively recorded images.
- PCT Publication WO 2016/191298 (published December 1, 2016 and titled “Systems and Methods of Registration for Image Guided Surgery”)
- the registration may be based on the techniques discussed above with reference to FIGS. 2-6D.
- the sensor system 708 may be used to compute the (e.g., approximate) location of the medical instrument 704 with respect to the anatomy of patient P.
- the location can be used to produce both macro-level (e.g., external) tracking images of the anatomy of patient P and virtual internal images of the anatomy of patient P.
- the system may include one or more electromagnetic (EM) sensors, fiber optic sensors, and/or other sensors to register and display a medical instrument together with pre- operatively recorded medical images.
- Medical system 700 may further include operations and support systems (not shown) such as illumination systems, steering control systems, irrigation systems, and/or suction systems.
- the medical system 700 may include more than one manipulator assembly and/or more than one master assembly. The exact number of manipulator assemblies may depend on the medical procedure and space constraints within the procedural room, among other factors. Multiple master assemblies may be co-located or they may be positioned in separate locations. Multiple master assemblies may allow more than one operator to control one or more manipulator assemblies in various combinations.
- FIG. 8A is a simplified diagram of a medical instrument system 800 according to some examples.
- the medical instrument system 800 includes a flexible elongate device 802 (e.g., device 140), also referred to as elongate device 802, a drive unit 804, and a medical tool 826 that are collectively an example of a medical instrument 704 of a medical system 700.
- the medical system 700 may be a teleoperated system, a non-teleoperated system, or a hybrid teleoperated and non-teleoperated system, as explained with reference to FIG. 7.
- a visualization system 831, a tracking system 830, and a navigation system 832 are also shown in FIG. 8A and are example components of the control system 712 of the medical system 700.
- the medical instrument system 800 may be used for non-teleoperational exploratory procedures or in procedures involving traditional manually operated medical instruments, such as endoscopy.
- the medical instrument system 800 may be used to gather (e.g., measure) a set of data points corresponding to locations within anatomical passageways of a patient, such as patient P.
- the elongate device 802 is coupled to the drive unit 804.
- the elongate device 802 includes a channel 821 through which the medical tool 826 may be inserted.
- the elongate device 802 navigates within patient anatomy to deliver the medical tool 826 to a procedural site.
- the elongate device 802 includes a flexible body 816 having a proximal end 817 and a distal end 818.
- the flexible body 816 may have an approximately 3 mm outer diameter. Other flexible body outer diameters may be larger or smaller.
- Medical instrument system 800 may include the tracking system 830 for determining the position, orientation, speed, velocity, pose, and/or shape of the flexible body 816 at the distal end 818 and/or of one or more segments 824 along flexible body 816, as will be described in further detail below.
- the tracking system 830 may include one or more sensors and/or imaging devices.
- the flexible body 816, such as the length between the distal end 818 and the proximal end 817, may include multiple segments 824.
- the tracking system 830 may be implemented using hardware, firmware, software, or a combination thereof. In some examples, the tracking system 830 is part of control system 712 shown in FIG. 7.
- the tracking system 830 may implement at least some of the techniques described with reference to FIGS. 1A-6, and, to that end, may include at least portions of or be in communicative connection with the processing unit 125 of FIG. 1A.
- Tracking system 830 may track the distal end 818 and/or one or more of the segments 824 of the flexible body 816 using a shape sensor 822.
- the shape sensor 822 may be omitted.
- the shape sensor 822 may include an optical fiber aligned with the flexible body 816 (e.g., provided within an interior channel of the flexibly body 816 or mounted externally along the flexible body 816).
- the optical fiber may have a diameter of approximately 800 μm. In other examples, the diameter may be larger or smaller.
- the optical fiber of the shape sensor 822 may form a fiber optic bend sensor for determining the shape of flexible body 816.
- Optical fibers including Fiber Bragg Gratings (FBGs) may be used to provide strain measurements in structures in one or more dimensions.
- Various systems and methods for monitoring the shape and relative position of an optical fiber in three dimensions are described in U.S. Patent Application Publication No. 2006/0013523 (filed July 13, 2005 and titled “Fiber optic position and shape sensing device and method relating thereto”); U.S. Patent No. 7,772,541 (filed on March 12, 2008 and titled “Fiber Optic Position and/or Shape Sensing Based on Rayleigh Scatter”); and U.S. Patent No. 8,773,650 (filed on Sept.
- Sensors in some examples may employ other suitable strain sensing techniques, such as Rayleigh scattering, Raman scattering, Brillouin scattering, and Fluorescence scattering.
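The strain measurements from such fiber sensors can, in principle, be turned into a shape by converting the strain at each sample point into a local curvature and integrating along the fiber. The following planar sketch is a highly simplified illustration, not the patent's method; the strain-to-curvature relation κ = ε / r (with core offset r) and all parameter values are assumptions.

```python
import numpy as np

def planar_shape_from_strain(strains, ds, core_offset):
    """Integrate a 2D fiber shape from per-segment axial strains.
    Assumed small-deflection model: curvature kappa = strain / core_offset;
    the heading angle is the running integral of curvature, and the
    position is the running integral of the heading direction."""
    kappa = np.asarray(strains, dtype=float) / core_offset  # 1/m
    theta = np.cumsum(kappa * ds)                           # heading (rad)
    x = np.cumsum(np.cos(theta) * ds)
    y = np.cumsum(np.sin(theta) * ds)
    return np.column_stack([x, y])

# Zero strain everywhere reconstructs a straight fiber along +X:
shape = planar_shape_from_strain([0.0] * 10, ds=0.01, core_offset=4e-5)
```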
- the shape of the flexible body 816 may be determined using other techniques. For example, a history of the position and/or pose of the distal end 818 of the flexible body 816 can be used to reconstruct the shape of flexible body 816 over an interval of time (e.g., as the flexible body 816 is advanced or retracted within a patient anatomy).
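The position-history approach above can be sketched as keeping a polyline of past distal-tip positions and treating its most recent stretch as the body shape. This assumes the body follows the tip's path (a follow-the-leader assumption); the class and its interface are hypothetical, not from this disclosure.

```python
import numpy as np

class TipHistoryShape:
    """Approximate the flexible-body shape as the polyline of past
    distal-tip positions recorded while the body is advanced."""
    def __init__(self):
        self._tip_positions = []

    def record(self, xyz):
        self._tip_positions.append(np.asarray(xyz, dtype=float))

    def shape(self, body_length):
        """Return the most recent stretch of the recorded path whose
        arc length (measured back from the tip) fits the body length."""
        pts = np.array(self._tip_positions)
        seg = np.linalg.norm(np.diff(pts, axis=0), axis=1)
        # arc[i] = path distance from point i to the current tip
        arc = np.concatenate([[0.0], np.cumsum(seg[::-1])])[::-1]
        return pts[arc <= body_length]

hist = TipHistoryShape()
for x in np.linspace(0.0, 0.05, 6):   # tip advanced 50 mm in 10 mm steps
    hist.record([x, 0.0, 0.0])
tail = hist.shape(0.025)              # keep roughly the last 25 mm
```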
- the tracking system 830 may alternatively and/or additionally track the distal end 818 of the flexible body 816 using a position sensor system 820.
- Position sensor system 820 may be a component of an EM sensor system with the position sensor system 820 including one or more position sensors. Although the position sensor system 820 is shown as being near the distal end 818 of the flexible body 816 to track the distal end 818, the number and location of the position sensors of the position sensor system 820 may vary to track different regions along the flexible body 816.
- the position sensors include conductive coils that may be subjected to an externally generated electromagnetic field. Each coil of position sensor system 820 may produce an induced electrical signal having characteristics that depend on the position and orientation of the coil relative to the externally generated electromagnetic field.
- the position sensor system 820 may measure one or more position coordinates and/or one or more orientation angles associated with one or more portions of flexible body 816.
- the position sensor system 820 may be configured and positioned to measure six degrees of freedom, e.g., three position coordinates X, Y, Z and three orientation angles indicating pitch, yaw, and roll of a base point. In some examples, the position sensor system 820 may be configured and positioned to measure five degrees of freedom, e.g., three position coordinates X, Y, Z and two orientation angles indicating pitch and yaw of a base point. Further description of a position sensor system, which may be applicable in some examples, is provided in U.S. Patent No. 6,380,732 (filed August 11, 1999 and titled “Six-Degree of Freedom Tracking System Having a Passive Transponder on the Object Being Tracked”), which is incorporated by reference herein in its entirety.
- a processing unit may enhance the accuracy of positions obtained by the sensor system 820 by combining data obtained by the sensor system 820 with data obtained by an external imaging system (e.g., by way of the imaging unit 110) according to the techniques of this disclosure described above with reference to FIGS. 2-6D.
- the tracking system 830 may alternately and/or additionally rely on a collection of pose, position, and/or orientation data stored for a point of an elongate device 802 and/or medical tool 826 captured during one or more cycles of alternating motion, such as breathing. This stored data may be used to develop shape information about the flexible body 816.
- a series of position sensors (not shown), such as EM sensors like the sensors in position sensor system 820 or some other type of position sensors may be positioned along the flexible body 816 and used for shape sensing.
- a history of data from one or more of these position sensors taken during a procedure may be used to represent the shape of elongate device 802, particularly if an anatomical passageway is generally static.
- Medical tool 826 may be, for example, an image capture probe, a biopsy tool (e.g., a needle, grasper, brush, etc.), an ablation tool (e.g., a laser ablation tool, radio frequency (RF) ablation tool, cryoablation tool, thermal ablation tool, heated liquid ablation tool, etc.), an electroporation tool, and/or another surgical, diagnostic, or therapeutic tool.
- the medical tool 826 may include an end effector having a single working member such as a scalpel, a blunt blade, an optical fiber, an electrode, and/or the like.
- Other types of end effectors may include, for example, forceps, graspers, scissors, staplers, clip appliers, and/or the like.
- the medical tool 826 may be a biopsy tool used to remove sample tissue or a sampling of cells from a target anatomical location.
- the biopsy tool is a flexible needle.
- the biopsy tool may further include a sheath that can surround the flexible needle to protect the needle and the interior surface of the channel 821 when the biopsy tool is within the channel 821.
- the medical tool 826 may be an image capture probe that includes a distal portion with a stereoscopic or monoscopic camera that may be placed at or near the distal end 818 of flexible body 816 for capturing images (e.g., still or video images).
- the captured images may be processed by the visualization system 831 for display and/or provided to the tracking system 830 to support tracking of the distal end 818 of the flexible body 816 and/or one or more of the segments 824 of the flexible body 816.
- the image capture probe may include a cable for transmitting the captured image data that is coupled to an imaging device at the distal portion of the image capture probe.
- the image capture probe may include a fiber-optic bundle, such as a fiberscope, that couples to a more proximal imaging device of the visualization system 831.
- the image capture probe may be single-spectral or multi-spectral, for example, capturing image data in one or more of the visible, near-infrared, infrared, and/or ultraviolet spectrums.
- the image capture probe may also include one or more light emitters that provide illumination to facilitate image capture.
- the image capture probe may use ultrasound, x-ray, fluoroscopy, CT, MRI, or other types of imaging technology.
- the image capture probe is inserted within the flexible body 816 of the elongate device 802 to facilitate visual navigation of the elongate device 802 to a procedural site and then is replaced within the flexible body 816 with another type of medical tool 826 that performs the procedure.
- the image capture probe may be within the flexible body 816 of the elongate device 802 along with another type of medical tool 826 to facilitate simultaneous image capture and tissue intervention, such as within the same channel or in separate channels.
- a medical tool 826 may be advanced from the opening of the channel 821 to perform the procedure (or some other functionality) and then retracted back into the channel 821 when the procedure is complete.
- the medical tool 826 may be removed from the proximal end 817 of the flexible body 816 or from another optional instrument port (not shown) along flexible body 816.
- the elongate device 802 may include integrated imaging capability rather than utilize a removable image capture probe.
- the imaging device (or fiber-optic bundle) and the light emitters may be located at the distal end 818 of the elongate device 802.
- the flexible body 816 may include one or more dedicated channels that carry the cable(s) and/or optical fiber(s) between the distal end 818 and the visualization system 831.
- the medical instrument system 800 can perform simultaneous imaging and tool operations.
- the medical tool 826 is capable of controllable articulation.
- the medical tool 826 may house cables (which may also be referred to as pull wires), linkages, or other actuation controls (not shown) that extend between its proximal and distal ends to controllably bend the distal end of medical tool 826, such as discussed herein for the flexible elongate device 802.
- the medical tool 826 may be coupled to a drive unit 804 and the manipulator assembly 702.
- the elongate device 802 may be excluded from the medical instrument system 800 or may be a flexible device that does not have controllable articulation. Steerable instruments or tools, applicable in some examples, are further described in detail in U.S. Patent No.
- the flexible body 816 of the elongate device 802 may also or alternatively house cables, linkages, or other steering controls (not shown) that extend between the drive unit 804 and the distal end 818 to controllably bend the distal end 818 as shown, for example, by dashed-line depictions 819 of the distal end 818 in FIG. 8A.
- at least four cables are used to provide independent up-down steering to control a pitch of the distal end 818 and left-right steering to control a yaw of the distal end 818.
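The four-cable arrangement above can be illustrated by mapping desired pitch and yaw bend angles to displacements of two antagonistic cable pairs. The linear model delta = angle x moment_arm below is an illustrative assumption, not the patent's control law, and the names and values are hypothetical.

```python
def cable_displacements(pitch, yaw, moment_arm):
    """Map desired pitch/yaw bend angles (radians) to displacements of
    four antagonistic cables (up/down pair for pitch, left/right pair
    for yaw). Assumed simple linear model: delta = angle * moment_arm;
    each cable in a pair moves opposite its antagonist."""
    up = pitch * moment_arm
    down = -pitch * moment_arm
    left = yaw * moment_arm
    right = -yaw * moment_arm
    return {"up": up, "down": down, "left": left, "right": right}

# A pure pitch command leaves the left/right (yaw) pair untouched:
d = cable_displacements(pitch=0.1, yaw=0.0, moment_arm=0.002)
```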
- the flexible elongate device 802 may be a steerable catheter.
- the drive unit 804 may include drive inputs that removably couple to and receive power from drive elements, such as actuators, of the teleoperational assembly.
- the elongate device 802 and/or medical tool 826 may include gripping features, manual actuators, or other components for manually controlling the motion of the elongate device 802 and/or medical tool 826.
- the elongate device 802 may be steerable or, alternatively, the elongate device 802 may be non-steerable with no integrated mechanism for operator control of the bending of distal end 818.
- one or more channels 821 (which may also be referred to as lumens), through which medical tools 826 can be deployed and used at a target anatomical location, may be defined by the interior walls of the flexible body 816 of the elongate device 802.
- the medical instrument system 800 may include a flexible bronchial instrument, such as a bronchoscope or bronchial catheter, e.g., for use in visual examination and diagnosis, biopsy, and/or treatment of a lung.
- the medical instrument system 800 may also be suited for navigation and treatment of other tissues, via natural or surgically created connected passageways, in any of a variety of anatomical systems, including the colon, the intestines, the kidneys and kidney calices, the brain, the heart, the circulatory system including vasculature, and/or the like.
- the information from the tracking system 830 may be sent to the navigation system 832, where the information may be combined with information from the visualization system 831 and/or pre-operatively obtained models to provide the physician, clinician, surgeon, or other operator with real-time position information.
- the tracking system 830, the navigation system 832, and the visualization system 831 may cooperatively implement, at least partially, the functionality of the system 100 in implementing the techniques described with reference to FIGS. 1A-6.
- the real-time position information may be displayed on the display system 710 for use in the control of the medical instrument system 800.
- the navigation system 832 may utilize the position information as feedback for positioning medical instrument system 800.
- FIGS. 9A and 9B are simplified diagrams of side views of a patient coordinate space including a medical instrument mounted on an insertion assembly according to some examples.
- a surgical environment 900 may include the patient P positioned on the patient table T.
- Patient P may be stationary within the surgical environment 900 in the sense that gross patient movement is limited by sedation, restraint, and/or other means. Cyclic anatomical motion, including respiration and cardiac motion, of patient P may continue.
- a medical instrument 904 is used to perform a medical procedure which may include, for example, surgery, biopsy, ablation, illumination, irrigation, suction, or electroporation.
- the medical instrument 904 may also be used to perform other types of procedures, such as a registration procedure to associate the position, orientation, and/or pose data captured by the sensor system 708 to a desired (e.g., anatomical or system) reference frame.
- the medical instrument 904 may be, for example, the medical instrument 704.
- the medical instrument 904 may include an elongate device 910 (e.g., a catheter) coupled to an instrument body 912.
- Elongate device 910 may be the elongate device 140 of FIG. 1.
- Elongate device 910 includes one or more channels sized and shaped to receive a medical tool.
- Elongate device 910 may also include one or more sensors (e.g., components of the sensor system 708).
- a shape sensor 914 may be fixed at a proximal point 916 on the instrument body 912.
- the proximal point 916 of the shape sensor 914 may be movable with the instrument body 912, and the location of the proximal point 916 with respect to a desired reference frame may be known (e.g., via a tracking sensor or other tracking device).
- the shape sensor 914 may measure a shape from the proximal point 916 to another point, such as a distal end 918 of the elongate device 910.
- the shape sensor 914 may be aligned with the elongate device 910 (e.g., provided within an interior channel or mounted externally).
- the shape sensor 914 may include optical fibers used to generate shape information for the elongate device 910.
- a series of position sensors (e.g., EM sensors) may be positioned along the flexible elongate device 910 and used for shape sensing.
- Position sensors may be used alternatively to the shape sensor 914 or with the shape sensor 914, such as to improve the accuracy of shape sensing or to verify shape information.
- Elongate device 910 may house cables, linkages, or other steering controls that extend between the instrument body 912 and the distal end 918 to controllably bend the distal end 918.
- at least four cables are used to provide independent up-down steering to control a pitch of distal end 918 and left-right steering to control a yaw of distal end 918.
- the instrument body 912 may include drive inputs that removably couple to and receive power from drive elements, such as actuators, of a manipulator assembly.
- the instrument body 912 may be coupled to an instrument carriage 906.
- the instrument carriage 906 may be mounted to an insertion stage 908 that is fixed within the surgical environment 900.
- the insertion stage 908 may be movable but have a known location (e.g., via a tracking sensor or other tracking device) within surgical environment 900.
- Instrument carriage 906 may be a component of a manipulator assembly (e.g., manipulator assembly 702) that couples to the medical instrument 904 to control insertion motion (e.g., motion along an insertion axis A) and/or motion of the distal end 918 of the elongate device 910 in multiple directions, such as yaw, pitch, and/or roll.
- the instrument carriage 906 or insertion stage 908 may include actuators, such as servomotors, that control motion of instrument carriage 906 along the insertion stage 908.
- a sensor device 920, which may be a component of the sensor system 708, may provide information about the position of the instrument body 912 as it moves relative to the insertion stage 908 along the insertion axis A.
- the sensor device 920 may include one or more resolvers, encoders, potentiometers, and/or other sensors that measure the rotation and/or orientation of the actuators controlling the motion of the instrument carriage 906, thus indicating the motion of the instrument body 912.
- the insertion stage 908 has a linear track as shown in FIGS. 9A and 9B.
- the insertion stage 908 may have a curved track or a combination of curved and linear track sections.
- FIG. 9A shows the instrument body 912 and the instrument carriage 906 in a retracted position along the insertion stage 908.
- the proximal point 916 is at a position L0 on the insertion axis A.
- the location of the proximal point 916 may be set to a zero value and/or other reference value to provide a base reference (e.g., corresponding to the origin of a desired reference frame) to describe the position of the instrument carriage 906 along the insertion stage 908.
- the distal end 918 of the elongate device 910 may be positioned just inside an entry orifice of patient P.
- the instrument body 912 and the instrument carriage 906 have advanced along the linear track of insertion stage 908, and the distal end 918 of the elongate device 910 has advanced into patient P.
- the proximal point 916 is at a position L1 on the insertion axis A.
- the position L1 of the proximal point 916 relative to the position L0 may be determined based on the rotation and/or orientation of the actuators measured by the sensor device 920 (indicating movement of the instrument carriage 906 along the insertion stage 908) and/or on one or more position sensors associated with the instrument carriage 906 and/or the insertion stage 908.
- the position L1 may further be used as an indicator of the distance or insertion depth to which the distal end 918 of the elongate device 910 is inserted into the passageway(s) of the anatomy of patient P.
- FIG. 10 depicts a flow chart for an example method 1000 for visualizing patient anatomy during a medical procedure.
- the method 1000 may be implemented by a system (e.g., system 100) which may include one or more processors (e.g., disposed at the processing unit 125). Furthermore, instructions for executing the method 1000 on the one or more processors may be stored on a tangible, non-transitory, computer readable medium.
- the method 1000 includes obtaining, based on image data received from an imaging system (e.g., imaging unit 110, imaging system 210, etc.), a three-dimensional (3D) representation (e.g., image 600) of an anatomical structure (e.g., anatomical structure A) and a portion of a flexible elongate device (e.g., flexible elongate device 640) disposed in the anatomical structure, the portion including a tip (e.g., tip 505 or 645).
- obtaining the 3D representation may include receiving the 3D representation from the imaging system.
- obtaining the 3D representation may include reconstructing, using one or more processors, the 3D representation based on image data (e.g., projection data) received from the imaging system.
- the 3D representation is in an imaging coordinate system associated with the imaging system.
- the system may receive the 3D representation from an imaging system (e.g., imaging unit 110, imaging system 210) via any suitable communication link (e.g., link 250).
- the system receives the 3D representation by way of digital storage that is outside of the imaging system.
- the 3D representation may be a 3D image or 2D slices that are reconstructed from projections (e.g., by the imaging system).
- the system may receive projection data directly and/or perform at least some of the reconstruction.
- the method 1000 includes receiving, from a sensing unit (e.g., sensing unit 120 or 235), shape data (e.g., shape data 403 or 503) indicative of a shape of the flexible elongate device including a position of the tip (e.g., tip 506).
- shape data is in a sensing coordinate system associated with the sensing unit.
- the method 1000 may include determining, by the one or more processors, that the shape data is received in an imaging time window as discussed, for example, with reference to FIG. 3.
- the low-motion period may be a period during which the patient holds their breath (e.g., per operator instructions) while the imaging data is being acquired.
- the method 1000 includes obtaining a position of the tip in the imaging coordinate system.
- Obtaining the position of the tip may include receiving the position based on a selection (e.g., by an operator) at a GUI of a display device. Additionally or alternatively, suitable image processing techniques may aid in the selection of the tip position.
- the method 1000 includes computing a translational transformation between the imaging coordinate system and the sensing coordinate system based on the position of the tip in the sensing coordinate system and the position of the tip in the imaging coordinate system (e.g., as discussed with reference to FIGS. 5C, D).
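- The translational step above amounts to a vector subtraction between the two tip positions. A minimal sketch (NumPy assumed; the function name is illustrative, not from the source):

```python
import numpy as np

def tip_translation(tip_sensing: np.ndarray, tip_imaging: np.ndarray) -> np.ndarray:
    """Translation that moves the sensed tip onto the imaged tip.

    Adding this offset to points in the sensing coordinate system
    aligns the two tip positions; rotation is handled separately.
    """
    return tip_imaging - tip_sensing

# Example: sensed tip at the sensing-frame origin, imaged tip offset.
t = tip_translation(np.array([0.0, 0.0, 0.0]), np.array([10.0, -2.0, 5.0]))
aligned = np.array([1.0, 1.0, 1.0]) + t  # a shape point after translation
```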
- the method 1000 includes computing (e.g., as discussed with reference to FIGS. 6C, D), based on the received 3D representation, one or more curve segments (e.g., segments 502a-c, 660a-c, etc.) indicative of the shape of the flexible elongate device in the imaging coordinate system.
- computing the one or more curve segments may include a set of operations.
- the system may compute a first curve segment and one or more candidate second curve segments.
- the system may compute a first direction vector representative of the direction of the first curve segment (e.g., pointing tangentially to the first curve segment) at a terminal point of the first curve segment.
- the system may compute a filter region based at least in part on selecting a threshold angular deviation from the first direction vector.
- the filter region may also be bound by a maximum distance from the terminal point of the first curve segment.
- the system may use the filter region to accept and/or reject candidate second segments. To do so, the system may identify, for each candidate second curve segment, a proximal point (of the candidate second curve segment with respect to the first curve segment).
- the proximal point may be a point that is closest (or at least as close as all other points of the candidate second curve) to a terminal point of the first curve segment.
- the proximal point may be a terminal point of the candidate second curve segment closest (or at least as close as the alternative terminal point) to the terminal point of the first curve segment.
- the system may accept candidate second curve segments (designate them as accepted candidate second curve segments) when the system determines that the proximal point of the candidate second curve is within the filter region.
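- The filter region described above can be modeled as a cone of a threshold half-angle about the first direction vector, capped at a maximum distance. A hedged sketch (NumPy assumed; the thresholds and names are illustrative):

```python
import numpy as np

def in_filter_region(terminal, direction, point, max_angle_deg, max_dist):
    """True if `point` lies within a cone of half-angle `max_angle_deg`
    about `direction`, anchored at `terminal`, and within `max_dist`
    of the terminal point."""
    v = point - terminal
    dist = np.linalg.norm(v)
    if dist == 0.0:
        return True          # coincident with the terminal point
    if dist > max_dist:
        return False         # beyond the distance bound
    cos_dev = np.dot(v / dist, direction / np.linalg.norm(direction))
    return cos_dev >= np.cos(np.radians(max_angle_deg))
```

A candidate second curve segment would then be accepted when `in_filter_region(...)` is true for its proximal point.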
- candidate second curve segments may be accepted based on the filter region, and the accepted second curve segment may be chosen from the accepted candidate second curve segments based on additional selection criteria.
- in some examples, no accepted second curve segment is designated. That is, the system may use only the first curve segment including the tip and the shape data to compute the rotational coordinate transformation.
- Selecting between one accepted candidate second curve segment and another accepted alternative candidate second curve segment can include several computations. For example, the system may compute, for each of the candidate second curve segments, a respective direction vector. In one example, the computed direction vectors are vectors tangent to the candidate second curve segments at the proximal points (proximal to the first curve segment, as discussed above). The system may then compute collinearity factors (as discussed with reference to FIG. 6D) between each of the direction vectors at the candidate second curve segments and the first direction vector at the first curve segment. The system may select the candidate second curve segment that has a direction vector most collinear with the first direction vector.
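- One simple collinearity factor is the magnitude of the cosine between unit tangents; using the absolute value is an assumption here (so that an anti-parallel tangent caused by segment orientation still scores highly), and the function name is illustrative:

```python
import numpy as np

def most_collinear(first_dir, candidate_dirs):
    """Index of the candidate direction vector most collinear with
    `first_dir`, scored by |cos(angle)| between unit vectors."""
    f = first_dir / np.linalg.norm(first_dir)
    scores = [abs(np.dot(f, d / np.linalg.norm(d))) for d in candidate_dirs]
    return int(np.argmax(scores))
```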
- the system may select among three, four, or any other suitable number of candidate second curve segments using the presently discussed techniques.
- the system may reject the selected candidate second curve segment.
- the system may designate the selected candidate second curve segment as the accepted second curve segment.
- the system may repeat the process discussed above to find a third curve segment. That is, candidate third curve segments may be filtered with a direction filter at a distal terminal point of the accepted second curve segment and an accepted third curve segment may be selected from the filtered options as discussed above. In a sense, the second curve segment becomes, for the selection of the third segment, what the first curve segment was for the selection of the second curve segment. The process may continue to string together any suitable number of curve segments.
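- The greedy chaining described above can be sketched generically; `accept` stands in for the direction filter and `select` for the collinearity-based choice (both names are stand-ins, not from the source):

```python
def chain_segments(first_segment, candidates, accept, select):
    """String segments together: filter remaining candidates with the
    direction filter at the current chain tail, select one, append it,
    and repeat until no candidate passes the filter."""
    chain = [first_segment]
    remaining = list(candidates)
    while remaining:
        passed = [s for s in remaining if accept(chain[-1], s)]
        if not passed:
            break
        chosen = select(chain[-1], passed)
        chain.append(chosen)
        remaining.remove(chosen)
    return chain
```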
- the method 1000 includes computing a rotational transformation (e.g., transformation 550) between the imaging coordinate system and the sensing coordinate system based on the shape data and the one or more curve segments indicative of the shape of the flexible elongate device in the imaging coordinate system.
- Computing the rotational transformation may be based at least in part on minimizing a cost function indicative of overlap of the shape data and the one or more curve segments in a shared coordinate system (e.g., as discussed with reference to FIG. 5D).
- the cost function may be based at least in part on a weighted L2 norm of distances between a set of points in the shape data and a respective set of points in the one or more curve segments in the shared coordinate system.
- the weights of the L2 norm may decrease away from the tip.
- performing translation to align tips prior to performing the rotational transformation can be thought of as assigning the discrepancy at the tip a weight that approaches infinity.
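- With correspondences given (e.g., from an ICP-style matching step) and the tips already aligned by translation, the weighted rotation can be solved in closed form with a weighted Kabsch/SVD step. This is one plausible realization of the minimization sketched above, not necessarily the one used in the source:

```python
import numpy as np

def fit_rotation_about_tip(shape_pts, curve_pts, weights, tip):
    """Rotation R minimizing sum_i w_i * ||R (p_i - tip) - (q_i - tip)||^2
    for corresponding point rows p_i (shape data) and q_i (curve segments)."""
    P = shape_pts - tip                  # tip-centered sensed points
    Q = curve_pts - tip                  # tip-centered imaged points
    H = P.T @ np.diag(weights) @ Q       # weighted cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    return Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # guard against reflection
```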
- the system may compute the rotational transformation based at least in part on an iterative closest point algorithm or a modified iterative closest point algorithm. Additionally or alternatively, the system may use a coherent point drift algorithm to compute the rotational transformation.
- segments added to the one or more curve segments may have deleterious effects on computing the transformation.
- a curve segment may include an imaging artifact that results in a lower quality transformation than could be obtained without the segment.
- the system may discard one or more curve segments to obtain a better transformation quality.
- the method 1000 includes registering the imaging coordinate system with the sensing coordinate system based on the translational and rotational transformations. Registering the imaging coordinate system with the sensing coordinate system may include assigning to at least some of the shape data respective coordinates in the imaging coordinate system.
- registering the imaging coordinate system with the sensing coordinate system may include assigning to at least some of the points in the imaging coordinate system respective coordinates in the sensing coordinate system.
- the system may segment a target (e.g., target R), find one or more positions associated with the target, and assign sensing (and, thereby, robotic) coordinates to the positions associated with the target.
- the method 1000 may include identifying a position of a target in the imaging coordinate system and computing the position of the target in the sensing coordinate system based at least in part on the computed translational and rotational transformations.
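- Once the rotation R and translation t are known, mapping points between the two frames is a pair of affine transforms. The convention assumed here (not stated in the source) is p_imaging = R p_sensing + t, with points stored as rows:

```python
import numpy as np

def to_imaging(points_sensing, R, t):
    """Assign imaging coordinates to sensing-frame points (rows)."""
    return points_sensing @ R.T + t

def to_sensing(points_imaging, R, t):
    """Inverse mapping, e.g., to express an imaged target in the
    sensing (and thereby robotic) coordinate system."""
    return (points_imaging - t) @ R
```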
- registration may generate coordinates in a registration coordinate system that is distinct from both the imaging and the sensing coordinate systems.
- the system may perform registration by assigning registration coordinates to one set of points from the sensing coordinates and another set of points from the imaging coordinates so as to reflect the relative positions among the two sets.
- the registration coordinate system may be chosen to have a coordinate origin located at the tip of the flexible elongate device and a z-axis along the pointing direction of the flexible elongate device at the tip.
- the origin of the registration coordinate system may be placed at the target.
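- Constructing the registration frame with its origin at the tip and its z-axis along the pointing direction leaves the in-plane orientation free; the completion below is an arbitrary but consistent choice (an assumption, as the source does not fix it):

```python
import numpy as np

def registration_frame(tip, tip_direction):
    """Origin and rotation matrix of a frame with z along the tip's
    pointing direction. Columns of the returned matrix are x, y, z."""
    z = tip_direction / np.linalg.norm(tip_direction)
    helper = np.array([1.0, 0.0, 0.0])
    if abs(np.dot(helper, z)) > 0.9:     # avoid a near-parallel helper
        helper = np.array([0.0, 1.0, 0.0])
    x = np.cross(helper, z)
    x /= np.linalg.norm(x)
    y = np.cross(z, x)
    return tip, np.column_stack([x, y, z])
```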
- the method 1000 further includes displaying on a display device (e.g., of display units 130 or 230) overlayed images of the shape data and the one or more curve segments.
- the overlayed images may aid in operating a robotic system (e.g., robotic system 205) to control the flexible elongate device, evaluating the quality of registration, and/or making other medical procedure decisions.
- the method 1000 further includes obtaining an indication of low registration quality.
- the system may compare the value of the minimized cost function to a threshold and find that the optimization failed.
- the system may receive the indication of low registration quality (e.g., from an operator) by way of a GUI.
- the system may generate (e.g., on a GUI) a prompt to select a new position of the tip.
- the system may generate a suggestion (e.g., on a GUI) to acquire new imaging data using the imaging device.
- the method 1000 may include recomputing the translational and rotational transformations as described above (e.g., computing new translational and/or rotational transformations), resulting in a new registration.
- the new registration may be based, for example, on a new position of the tip and/or new imaging data.
- the registration may be based at least in part on averaging (e.g., with weights selected by the system and/or the user) multiple registrations.
- FIGS. 11A and 11B depict a flow chart for an example method 1100 for visualizing patient anatomy during a medical procedure.
- the method 1100 may be implemented by a system (e.g., system 100) which may include one or more processors (e.g., disposed at the processing unit 125). Furthermore, instructions for executing the method 1100 on the one or more processors may be stored on a tangible, non-transitory, computer readable medium.
- FIG. 11A includes blocks 1110-1160 associated with higher-level elements of the method 1100, insofar as the block 1130 includes additional elements discussed in FIG. 11B with reference to a sub-method 1131 comprising blocks 1132-1137.
- the method 1100 includes receiving, by one or more processors and from a sensing unit, shape data indicative of a shape of the flexible elongate device, wherein the shape data is in a sensing coordinate system associated with the sensing unit.
- Block 1110 may include the same options as block 1010 in FIG. 10.
- the method 1100 includes obtaining, by the one or more processors and based on image data received from an imaging system, a three-dimensional (3D) representation of an anatomical structure and a portion of a flexible elongate device disposed in the anatomical structure, wherein the three-dimensional representation is in an imaging coordinate system associated with the imaging system.
- obtaining the 3D representation may include receiving the 3D representation from the imaging system.
- obtaining the 3D representation may include reconstructing, using one or more processors, the 3D representation based on image data (e.g., projection data) received from the imaging system.
- Block 1120 may include the same options as block 1020 in FIG. 10.
- the method 1100 includes computing, based at least in part on the received three-dimensional representation, one or more curve segments indicative of the shape of the flexible elongate device.
- the method 1100 includes an example sub-method 1131, which is illustrated in FIG. 11B.
- the sub-method 1131 includes the blocks 1132-1137.
- the sub-method 1131 includes computing a first curve segment and a candidate second curve segment.
- the sub-method 1131 includes computing a first direction vector representative of the direction of the first curve segment (e.g., pointing tangentially to the first curve segment) at a terminal point of the first curve segment.
- the sub-method 1131 includes computing a filter region based on the first curve segment, wherein computing the filter region is based at least in part on selecting a maximum angular deviation from the first direction vector.
- the sub-method 1131 includes identifying a proximal point of the candidate second curve segment.
- the proximal point can be identified as either i) a point that is at least as close to a terminal point of the first curve segment as all other points of the candidate second curve segment, or ii) a terminal point of the candidate second curve segment that is at least as close to the terminal point of the first curve segment as the alternative terminal point of the candidate second curve segment.
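- Both variants of the proximal point reduce to a nearest-point search, over all candidate points or over the candidate's two endpoints only. A minimal sketch (NumPy assumed; names are illustrative):

```python
import numpy as np

def proximal_point(candidate_pts, first_terminal, terminal_only=False):
    """Point of the candidate segment (rows of `candidate_pts`) closest
    to `first_terminal`. With `terminal_only=True`, only the candidate's
    two endpoints are considered."""
    pts = candidate_pts[[0, -1]] if terminal_only else candidate_pts
    dists = np.linalg.norm(pts - first_terminal, axis=1)
    return pts[int(np.argmin(dists))]
```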
- the sub-method 1131 includes determining that the proximal point of the candidate second curve is within the filter region.
- the sub-method 1131 includes designating the candidate second curve segment as an accepted candidate second curve segment based at least in part on determining that the proximal point of the candidate second curve segment lies within the filter region. It should be noted that the blocks of the sub-method 1131 may include the elements discussed above with reference to block 1050 in FIG. 10.
- the method 1100 may use data indicative of position of a tip of the flexible elongate device in the shape data and/or in the 3D representation based on image data. For example, the position of the tip in the 3D representation may aid in ordering points of the first segment.
- the method 1100 may include identifying one or more points as proximal to the tip to aid, for example, in orienting the first curve segment with respect to the tip of the elongate device. The orientation of the first curve segment may aid in identifying the terminal point of the first curve segment. Additionally or alternatively, the method 1100 may identify the terminal point of the first segment based on the relative position of points of other curve segments. Still, in some examples, the method 1100 may use input from an operator (e.g., via a user interface) to identify the first curve segment and/or the orientation of the first curve segment with respect to the tip of the flexible elongate device.
- the method 1100 includes computing a transformation between the imaging coordinate system and the sensing coordinate system.
- Computing the transformation is based on the shape data and the one or more curve segments indicative of the shape of the flexible elongate device in the imaging coordinate system.
- Computing the transformation may be based at least in part on minimizing a cost function indicative of overlap of the shape data and the one or more curve segments in a shared coordinate system (e.g., as discussed with reference to FIG. 5D).
- the cost function may be based at least in part on a weighted L2 norm of distances between a set of points in the shape data and a respective set of points in the one or more curve segments in the shared coordinate system.
- the weights of the L2 norm may decrease away from the tip (whether or not the position of the tip is known).
- the system may compute the transformation based at least in part on an iterative closest point algorithm or a modified iterative closest point algorithm. Additionally or alternatively, the system may use a coherent point drift algorithm to compute the transformation.
- segments added to the one or more curve segments, selected as discussed above, may have deleterious effects on computing the transformation. For example, a curve segment may include an imaging artifact that results in a lower quality transformation than could be obtained without the segment. In some examples, the system may discard one or more curve segments to obtain a better transformation quality.
- the method 1100 includes registering the imaging coordinate system with the sensing coordinate system based on the transformation.
- Block 1130 may include the same options as block 1070 in FIG. 10.
- the method 1100 includes similar techniques as method 1000 with the difference that computing the transformation need not be separated into translational and rotational transformations. That is, the optimization of the cost function may include translation.
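- When translation is folded into the optimization, as in method 1100, a weighted rigid fit solves rotation and translation jointly. A closed-form sketch using weighted centroids and a Kabsch/SVD step (one plausible realization, assuming given correspondences; not necessarily the source's algorithm):

```python
import numpy as np

def fit_rigid(shape_pts, curve_pts, weights):
    """R, t minimizing sum_i w_i * ||R p_i + t - q_i||^2 for
    corresponding point rows p_i (shape data) and q_i (curve segments)."""
    w = np.asarray(weights, float)
    w = w / w.sum()
    cp = w @ shape_pts                   # weighted centroid, sensing side
    cq = w @ curve_pts                   # weighted centroid, imaging side
    P, Q = shape_pts - cp, curve_pts - cq
    H = P.T @ np.diag(w) @ Q             # weighted cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t
```

Raising the weight on the tip pair drives this joint fit toward the tip-pinned solution of method 1000, mirroring the relationship described above.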
- the separation of the tip obtained from the imaging data and the tip obtained from the shape data may be given a suitably high weight, which may result in the registration result of method 1100 approaching the registration of method 1000.
- the potentially weaker constraint on the tip of the method 1100 may yield a better registration when the estimate of the tip in the imaging data has an uncertainty above a certain value.
- control system 712 may be implemented in software for execution on one or more processors of a computer system.
- the software may include code that when executed by the one or more processors, configures the one or more processors to perform various functionalities as discussed herein.
- the code may be stored in a non-transitory computer readable storage medium (e.g., a memory, magnetic storage, optical storage, solid-state storage, etc.).
- the computer readable storage medium may be part of a computer readable storage device, such as an electronic circuit, a semiconductor device, a semiconductor memory device, a read only memory (ROM), a flash memory, an erasable programmable read only memory (EPROM), a floppy diskette, a CD-ROM, an optical disk, a hard disk, or other storage device.
- the code may be downloaded via computer networks such as the Internet, Intranet, etc. for storage on the computer readable storage medium.
- the code may be executed by any of a wide variety of centralized or distributed data processing architectures.
- the programmed instructions of the code may be implemented as a number of separate programs or subroutines, or they may be integrated into a number of other aspects of the systems described herein.
- wireless connections may use wireless communication protocols such as Bluetooth, near-field communication (NFC), Infrared Data Association (IrDA), home radio frequency (HomeRF), IEEE 802.11, Digital Enhanced Cordless Telecommunications (DECT), and wireless medical telemetry service (WMTS).
Abstract
A system receives a three-dimensional representation, in imaging coordinates, of an anatomical structure and of a portion (optionally including a tip) of a flexible elongate device disposed in the anatomical structure. The system also receives, from a sensing unit, shape data, in sensing coordinates, indicative of a shape of the flexible elongate device, the shape data optionally including a position of the tip. The system further computes, based at least in part on the received three-dimensional representation, one or more curve segments indicative of the shape of the flexible elongate device in imaging coordinates. To register the imaging and sensing coordinate systems, the system computes a transformation between the imaging coordinate system and the sensing coordinate system based at least in part on the shape data and the curve segments.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202463552083P | 2024-02-09 | 2024-02-09 | |
| US63/552,083 | 2024-02-09 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| WO2025171214A2 true WO2025171214A2 (fr) | 2025-08-14 |
| WO2025171214A3 WO2025171214A3 (fr) | 2025-09-18 |
Family
ID=94820882
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2025/014941 Pending WO2025171214A2 (fr) | 2024-02-09 | 2025-02-07 | Filtre directionnel et/ou contraintes de translation pour enregistrement de coordonnées |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025171214A2 (fr) |
Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6380732B1 (en) | 1997-02-13 | 2002-04-30 | Super Dimension Ltd. | Six-degree of freedom tracking system having a passive transponder on the object being tracked |
| US20060013523A1 (en) | 2004-07-16 | 2006-01-19 | Luna Innovations Incorporated | Fiber optic position and shape sensing device and method relating thereto |
| US7316681B2 (en) | 1996-05-20 | 2008-01-08 | Intuitive Surgical, Inc | Articulated surgical instrument for performing minimally invasive surgery with enhanced dexterity and sensitivity |
| US7772541B2 (en) | 2004-07-16 | 2010-08-10 | Luna Innnovations Incorporated | Fiber optic position and/or shape sensing based on rayleigh scatter |
| US8773650B2 (en) | 2009-09-18 | 2014-07-08 | Intuitive Surgical Operations, Inc. | Optical position and/or shape sensing |
| US8900131B2 (en) | 2011-05-13 | 2014-12-02 | Intuitive Surgical Operations, Inc. | Medical system providing dynamic registration of a model of an anatomical structure for image-guided surgery |
| US9259274B2 (en) | 2008-09-30 | 2016-02-16 | Intuitive Surgical Operations, Inc. | Passive preload and capstan drive for surgical instruments |
| WO2016191298A1 (fr) | 2015-05-22 | 2016-12-01 | Intuitive Surgical Operations, Inc. | Systèmes et procédés d'alignement pour chirurgie guidée par image |
| WO2019018736A2 (fr) | 2017-07-21 | 2019-01-24 | Intuitive Surgical Operations, Inc. | Systèmes et procédés de dispositif allongé flexible |
Family Cites Families (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP2626027B1 (fr) * | 2007-08-14 | 2020-04-29 | Koninklijke Philips N.V. | Systèmes d'instruments robotisés utilisant des capteurs à fibres optiques |
| US20100056904A1 (en) * | 2008-09-02 | 2010-03-04 | Saunders John K | Image guided intervention |
| US9675304B2 (en) * | 2011-06-27 | 2017-06-13 | Koninklijke Philips N.V. | Live 3D angiogram using registration of a surgical tool curve to an X-ray image |
| EP2903552B1 (fr) * | 2012-10-01 | 2017-07-05 | Koninklijke Philips N.V. | Enregistrement de polyligne à trois dimensions à l'aide de contraintes de forme |
| US10762380B2 (en) * | 2013-07-23 | 2020-09-01 | Koninklijke Philips N.V. | Registration system for registering an imaging device with a tracking device |
| CN105979899B (zh) * | 2013-12-09 | 2019-10-01 | 直观外科手术操作公司 | 用于设备感知柔性工具配准的系统和方法 |
| EP3217911B1 (fr) * | 2014-11-13 | 2023-01-04 | Intuitive Surgical Operations, Inc. | Systèmes de filtration de données de localisation |
| US10568702B2 (en) * | 2017-01-19 | 2020-02-25 | St. Jude Medical, Cardiology Division, Inc. | System and method for re-registration of localization system after shift/drift |
| EP3545847A1 (fr) * | 2018-03-27 | 2019-10-02 | Koninklijke Philips N.V. | Dispositif d'évaluation pour évaluer la forme en s d'un instrument par rapport à son aptitude à l'enregistrement |
- 2025-02-07: WO PCT/US2025/014941 (WO2025171214A2), status: active, Pending
Also Published As
| Publication number | Publication date |
|---|---|
| WO2025171214A3 (fr) | 2025-09-18 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 25708611; Country of ref document: EP; Kind code of ref document: A2 |