
WO2024257035A1 - System and method for registration and calibration of an ultrasound probe for real-time navigation - Google Patents

System and method for registration and calibration of an ultrasound probe for real-time navigation

Info

Publication number
WO2024257035A1
Authority
WO
WIPO (PCT)
Prior art keywords
coordinate system
virtual space
tracking
navigation
response
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/IB2024/055835
Other languages
English (en)
Inventor
Osvaldo BARRERA
Joseph Brannan
Mark Stiger
Anthony Ross
Bradley JACOBSEN
Patrick Helm
Yvan Paitel
Kaustubh PATIL
Darion Peterson
Joshua Blauer
Trevor Huang
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Medtronic Navigation Inc
Original Assignee
Medtronic Navigation Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US18/676,263 (published as US20240415496A1)
Application filed by Medtronic Navigation Inc filed Critical Medtronic Navigation Inc
Publication of WO2024257035A1


Classifications

    • A: Human necessities; A61: Medical or veterinary science; hygiene; A61B: Diagnosis; surgery; identification
    • A61B 8/0841: Clinical applications involving detecting or locating foreign bodies or organic structures, for locating instruments
    • A61B 8/4263: Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient, using sensors not mounted on the probe
    • A61B 8/587: Testing, adjusting or calibrating the diagnostic device; calibration phantoms
    • A61B 34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2017/00725: Calibration or performance testing
    • A61B 2034/2051: Tracking techniques; electromagnetic tracking systems
    • A61B 2034/2055: Tracking techniques; optical tracking systems
    • A61B 2034/2068: Surgical navigation using pointers, e.g. pointers having reference marks for determining coordinates of body points
    • A61B 2090/378: Surgical systems with images on a monitor during operation, using ultrasound

Definitions

  • the present disclosure is generally directed to navigation and ultrasound imaging, and relates more particularly to calibrating an ultrasound probe for navigation.
  • Imaging devices and navigation systems may assist a surgeon or other medical provider in carrying out a surgical procedure.
  • Imaging may be used by a medical provider for visual guidance in association with diagnostic and/or therapeutic procedures.
  • Navigation systems may be used for tracking objects (e.g., instruments, imaging devices, etc.) associated with carrying out the surgical procedure.
  • Example aspects of the present disclosure include:
  • a system including: a processor; and a memory storing instructions that, when executed by the processor, cause the processor to: generate a navigation space based on one or more tracking signals; generate a virtual space including at least a portion of a calibration phantom based on one or more images, wherein the one or more images are generated in response to one or more signals transmitted by an imaging device; identify a set of coordinates in the virtual space in response to an event in which at least a portion of a tracked device in the calibration phantom is detected in the one or more images; and calibrate a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in response to the event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the set of coordinates and temporal information associated with the event.
  • calibrating the first coordinate system with respect to the second coordinate system is in response to one or more occurrences of the event; and calibrating the first coordinate system with respect to the second coordinate system is absent pausing a surgical procedure.
  • calibrating the first coordinate system with respect to the second coordinate system is based on: beam thickness (e.g., ultrasound beam thickness), beam shape (e.g., ultrasound beam shape), or both of the one or more signals transmitted by the imaging device; pose information of the portion of the tracked device in association with an intersection between the portion of the tracked device and a plane of the virtual space; and one or more properties of the portion of the tracked device.
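  • As a non-limiting illustration of how beam thickness and tip pose might be used to gate a calibration event, the following sketch (hypothetical names and an assumed probe-frame convention; not the disclosed implementation) accepts a detected tip position only if it lies within half the elevational beam thickness of the image plane:

      import numpy as np

      def accept_calibration_event(tip_in_probe_mm: np.ndarray,
                                   beam_thickness_mm: float) -> bool:
          """Accept a tip detection as a calibration sample only if the tip
          lies close enough to the image (scan) plane.

          Assumes the probe frame is defined with the image plane at y = 0,
          so the elevational offset of the tip is its y-coordinate."""
          elevational_offset = abs(tip_in_probe_mm[1])
          return elevational_offset <= 0.5 * beam_thickness_mm

      # Example: a tip 0.4 mm off-plane is accepted for a 2 mm thick beam.
      print(accept_calibration_event(np.array([12.0, 0.4, 35.0]), beam_thickness_mm=2.0))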
  • the calibration phantom includes: ultrasound conductive material; or a tissue phantom included in a body of a subject, wherein the tissue phantom and the subject are associated with a surgical procedure.
  • instructions are further executable by the processor to: output guidance information associated with positioning the imaging device, the tracked device, or both in association with calibrating the first coordinate system with respect to the second coordinate system.
  • the tracked device is included in at least a portion of an instrument
  • the instructions are further executable by the processor to: detect, in the one or more images, one or more landmarks corresponding to at least a portion of the tracked device, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the one or more landmarks.
  • calibrating the first coordinate system with respect to the second coordinate system includes verifying a registration accuracy between the first coordinate system and the second coordinate system.
  • the navigation space and the tracked device are associated with at least one of: an optical tracking system, an acoustic tracking system, an electromagnetic tracking system, a radar tracking system, a magnetic tracking system, an inertial measurement unit (IMU) based tracking system, and a computer vision based tracking system.
  • a system including: an imaging system including an imaging device; a tracking system including: a transmission device; and a tracked device; a calibration phantom; a processor; and a memory storing data that, when processed by the processor, cause the processor to: generate a navigation space based on one or more tracking signals emitted by the transmission device; generate a virtual space including at least a portion of the calibration phantom based on one or more images generated by the imaging system, wherein the one or more images are generated in response to one or more signals transmitted by the imaging device; identify a set of coordinates in the virtual space in response to an event in which at least a portion of the tracked device is detected in the one or more images; and calibrate a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in response to the event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the set of coordinates and temporal information associated with the event.
  • calibrating the first coordinate system with respect to the second coordinate system is in response to one or more occurrences of the event; and calibrating the first coordinate system with respect to the second coordinate system is absent pausing a surgical procedure.
  • the calibration phantom includes: ultrasound conductive material; or a tissue phantom included in a body of a subject, wherein the tissue phantom and the subject are associated with a surgical procedure.
  • calibrating the first coordinate system with respect to the second coordinate system is based on: beam thickness, beam shape, or both of the one or more signals transmitted by the imaging device; pose information of the portion of the tracked device in association with an intersection between the portion of the tracked device and a plane of the virtual space; and one or more properties of the portion of the tracked device.
  • a method including: generating a navigation space based on one or more tracking signals; generating a virtual space including at least a portion of a calibration phantom based on one or more images, wherein the one or more images are generated in response to transmitting one or more imaging signals; identifying a set of coordinates in the virtual space in response to an event in which at least a portion of a tracked device in the calibration phantom is detected in the one or more images; and calibrating a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in response to the event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the set of coordinates and temporal information associated with the event.
  • calibrating the first coordinate system with respect to the second coordinate system is in response to one or more occurrences of the event; and calibrating the first coordinate system with respect to the second coordinate system is absent pausing a surgical procedure.
  • the calibration phantom includes: ultrasound conductive material; or a tissue phantom included in a body of a subject, wherein the tissue phantom and the subject are associated with a surgical procedure.
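  • One possible realization of the calibration step described above (a minimal sketch, not the claimed implementation) is to pair, by event timestamp, tip coordinates identified in the virtual space with tip coordinates reported by the tracking system, and to compute a least-squares rigid transform between the two point sets via singular value decomposition:

      import numpy as np

      def rigid_transform(image_pts: np.ndarray, nav_pts: np.ndarray):
          """Least-squares rigid transform (rotation R, translation t) mapping
          image-space points onto navigation-space points (Kabsch method).
          Both arrays are N x 3, with rows paired by event timestamp."""
          src_c = image_pts.mean(axis=0)
          dst_c = nav_pts.mean(axis=0)
          H = (image_pts - src_c).T @ (nav_pts - dst_c)
          U, _, Vt = np.linalg.svd(H)
          D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
          R = Vt.T @ D @ U.T
          t = dst_c - R @ src_c
          return R, t

      # Each row pairs a tip location seen in the ultrasound image (converted to mm)
      # with the tracked tip location recorded at the same event time.
      image_pts = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 20.0, 0.0], [5.0, 5.0, 0.0]])
      nav_pts = np.array([[100.0, 50.0, 20.0], [100.0, 60.0, 20.0], [80.0, 50.0, 20.0], [95.0, 55.0, 20.0]])
      R, t = rigid_transform(image_pts, nav_pts)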
  • Fig. 1A illustrates an example of a system in accordance with aspects of the present disclosure.
  • Fig. 1B illustrates an example of a system in accordance with aspects of the present disclosure.
  • Fig. 1C illustrates an example of a system in accordance with aspects of the present disclosure.
  • Fig. 2A illustrates an example implementation of a system in accordance with aspects of the present disclosure.
  • Fig. 2B illustrates example aspects of a navigation space and a virtual space in accordance with aspects of the present disclosure.
  • Fig. 3A illustrates example views of an ultrasound beam in accordance with aspects of the present disclosure.
  • Fig. 3B illustrates example implementations of an instrument in accordance with aspects of the present disclosure.
  • Fig. 3C illustrates example implementations of an instrument in accordance with aspects of the present disclosure.
  • Fig. 3D illustrates example aspects of an instrument in accordance with aspects of the present disclosure.
  • Fig. 4 illustrates an example of a process flow in accordance with aspects of the present disclosure.
  • Fig. 5 illustrates an example of a process flow in accordance with aspects of the present disclosure.
  • the described methods, processes, and techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Alternatively or additionally, functions may be implemented using machine learning models, neural networks, artificial neural networks, or combinations thereof (alone or in combination with instructions).
  • Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
  • processors, such as one or more digital signal processors (DSPs), general purpose microprocessors (e.g., Intel Core i3, i5, i7, or i9 processors; Intel Celeron processors; Intel Xeon processors; Intel Pentium processors; AMD Ryzen processors; AMD Athlon processors; AMD Phenom processors; Apple A10 or A10X Fusion processors; Apple A11, A12, A12X, A12Z, or A13 Bionic processors; or any other general purpose microprocessors), graphics processing units (e.g., Nvidia GeForce RTX 2000-series processors, Nvidia GeForce RTX 3000-series processors, AMD Radeon RX 5000-series processors, AMD Radeon RX 6000-series processors, or any other graphics processing units), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry
  • proximal and distal are used in this disclosure with their conventional medical meanings, proximal being closer to the operator or user of the system, and further from the region of surgical interest in or on the patient, and distal being closer to the region of surgical interest in or on the patient, and further from the operator or user of the system.
  • an ultrasound probe may be calibrated/registered relative to a navigation means (e.g., a tracked sensor, etc.) in association with navigated image acquisition.
  • some systems may establish a transformation matrix that maps the six-dimensional (6D) pose (e.g., position and orientation information) of the tracked sensor to the 6D pose of an ultrasound probe.
  • Some systems may map the 6D pose of the tracked sensor to an image generated by the ultrasound probe or to the ultrasound beam of the ultrasound probe.
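  • As a minimal sketch of such a mapping (placeholder matrices and pixel scale are assumed; this is not the disclosed implementation), a pixel in the ultrasound image can be expressed in the navigation space by composing a calibration matrix (image frame to sensor frame) with the tracked pose of the sensor:

      import numpy as np

      def pixel_to_navigation(pixel_uv, mm_per_pixel,
                              T_image_to_sensor, T_sensor_to_nav):
          """Map an ultrasound pixel (u, v) to navigation-space coordinates.

          T_image_to_sensor : 4x4 calibration matrix (image frame -> sensor frame)
          T_sensor_to_nav   : 4x4 tracked pose of the sensor in navigation space
          """
          u, v = pixel_uv
          p_image = np.array([u * mm_per_pixel, v * mm_per_pixel, 0.0, 1.0])
          p_nav = T_sensor_to_nav @ T_image_to_sensor @ p_image
          return p_nav[:3]

      # Identity calibration and a pure translation of the sensor, as placeholders.
      T_cal = np.eye(4)
      T_pose = np.eye(4); T_pose[:3, 3] = [100.0, 50.0, 20.0]
      print(pixel_to_navigation((128, 256), mm_per_pixel=0.2,
                                T_image_to_sensor=T_cal, T_sensor_to_nav=T_pose))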
  • some calibration methods may be tedious, time consuming, and error prone.
  • some calibration phantoms utilized for calibrating the ultrasound probe to the navigation system are costly and are prone to decay with time (e.g., due to the degradation of hydrogels implemented in some calibration phantoms).
  • Instances may occur in which a surgical team is unaware that the ultrasound probe has lost calibration with the navigation system. In some cases, even if the surgical team is aware of the loss in calibration, the team may be unwilling to recalibrate the ultrasound probe (e.g., due to lack of time or resources). Undetected or unaddressed loss in calibration during a medical procedure (e.g., due to deformation, tool drop/hit, etc.) may result in surgical errors. In some other cases, metal and other materials present in the environment may cause distortion to an electromagnetic field generated by the navigation system in association with tracking an object, and such distortion may result in surgical errors.
  • systems and techniques described herein may support dynamic initial calibration and dynamic recalibration of ultrasound probes (also referred to herein as ultrasonic probes) for navigation.
  • the systems and techniques may incorporate an ultrasound probe connected to a main application/navigation system (also referred to herein as a navigated surgery system).
  • the systems and techniques may include electromagnetic navigation of the ultrasound probe using trackers/sensors coupled to the ultrasound probe and an emitter capable of emitting electromagnetic signals.
  • the systems and techniques may include an electromagnetic tracked device (e.g., an electromagnetic pointer (or stylus)) and a calibration phantom.
  • the calibration phantom may be a phantom with a configuration of rods or wires.
  • the calibration phantom may be a water bath or a tissue phantom, but is not limited thereto.
  • the example calibration phantoms described herein are stable, inexpensive compared to some other calibration phantoms, and electromagnetic friendly.
  • the calibration phantoms may be free of materials that may interfere with electromagnetic navigation and tracking.
  • the calibration phantom may be a gel (e.g., hydrogel) for ultrasound calibration with optical tracking.
  • Examples of the techniques described herein may include moving or positioning a tracked device inside the calibration phantom while observing the movement of the tracked device using ultrasound imaging generated by an ultrasound imaging device, in which the ultrasound imaging corresponds to or represents an ultrasound view of the ultrasound imaging device.
  • the techniques may include recording a video file of the ultrasound imaging concurrently with the tracking data and processing the video file (e.g., using a software script). Based on the processing of the video file, the techniques may include identifying a temporal instance at which a portion (e.g., tip) of the tracked device enters the ultrasound view.
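  • One way such a processing script could be organized (a sketch that assumes a placeholder tip detector and a time-sorted tracking log; it is not the disclosed software) is to step through the recorded video, note the timestamp of the first frame in which the tip is detected, and pair that instant with the closest tracking sample at or after it:

      import bisect
      import cv2  # OpenCV, used here only to read video frames

      def first_tip_entry(video_path, detect_tip, tracking_samples):
          """Return (timestamp_s, tracked_pose) for the first frame in which the
          tip is detected, or None if the tip never appears.

          detect_tip(frame) -> (u, v) or None is a placeholder detector.
          tracking_samples is a list of (timestamp_s, pose) sorted by time."""
          cap = cv2.VideoCapture(video_path)
          fps = cap.get(cv2.CAP_PROP_FPS) or 30.0
          frame_idx = 0
          try:
              while True:
                  ok, frame = cap.read()
                  if not ok:
                      return None
                  if detect_tip(frame) is not None:
                      t = frame_idx / fps
                      times = [s[0] for s in tracking_samples]
                      i = min(bisect.bisect_left(times, t), len(times) - 1)
                      return t, tracking_samples[i][1]
                  frame_idx += 1
          finally:
              cap.release()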
  • the systems and techniques may calibrate the ultrasound view with respect to the tracking space. Examples of the tracked device and the tracking space include an electromagnetic tracked device and an electromagnetic tracking space, but are not limited thereto. Aspects of the present disclosure support any type of tracked devices (e.g., sensors) and tracking spaces that may be implemented by a navigation system.
  • the systems and techniques described herein may support autonomous or semi-autonomous calibration, calibration verification with reduced calibration time compared to some other calibration techniques, and providing or outputting guidance information (e.g., tutorials, user prompts, etc.) for users on how to move or position a device (e.g., ultrasound imaging device, electromagnetic tracked device, an electromagnetic pointer (or stylus), etc.) in association with calibration.
  • the systems and techniques described herein support performing multiple calibrations (e.g., an initial calibration using a water bath, one or more subsequent calibrations using a tissue phantom, etc.), in which a system may perform the calibrations autonomously (or semi-autonomously).
  • the systems and techniques may perform the calibrations continuously (or semi-continuously) and/or in response to trigger criteria, aspects of which are described herein. It is to be understood that aspects described herein with respect to calibration may be applied to recalibration, and the terms calibration and recalibration may be used interchangeably herein.
  • Techniques described herein may be implemented in hardware, software, firmware, or any combination thereof that may automatically detect instrument landmarks on ultrasound images during a medical procedure.
  • the techniques may include detecting landmarks of an instrument (e.g., tip of a needle during placement, distinctive features of navigated catheters, tip of a registration stylus, etc.) during the medical procedure and, using the detected instrument landmarks, automatically calibrating (or adjusting the calibration of) the ultrasound imaging device to the navigation system.
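  • Landmark detection of this kind could, for example, be approximated with classical image processing (a simplified OpenCV sketch; systems described herein may instead use trained machine learning models) by thresholding the bright echo of the tip and taking the centroid of the largest blob:

      import cv2
      import numpy as np

      def detect_bright_tip(frame_gray: np.ndarray):
          """Return the (u, v) centroid of the brightest blob in a grayscale
          ultrasound frame, or None if nothing exceeds the threshold."""
          blurred = cv2.GaussianBlur(frame_gray, (5, 5), 0)
          _, mask = cv2.threshold(blurred, 220, 255, cv2.THRESH_BINARY)
          contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                         cv2.CHAIN_APPROX_SIMPLE)
          if not contours:
              return None
          largest = max(contours, key=cv2.contourArea)
          m = cv2.moments(largest)
          if m["m00"] == 0:
              return None
          return (m["m10"] / m["m00"], m["m01"] / m["m00"])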
  • the ultrasound imaging device may be, for example, an ultrasound probe.
  • aspects of the automatic calibration techniques may provide a time savings for the surgical team, an improved user experience, and increased accuracy over longer portions of medical procedures.
  • Other aspects of the calibration techniques provide cost savings through the use of, as a calibration phantom, an empty container (e.g., an empty box made of an electromagnetically friendly material) with crossed wires arranged in patterns.
  • the calibration phantom may be filled with water for the calibration procedure, thereby resulting in cost savings compared to other materials (e.g., gels, silicones, etc.).
  • using water in the calibration phantom may provide increased durability.
  • the electromagnetic tracking and calibration solutions described herein support directly using electromagnetic tools implemented in some existing medical procedures.
  • direct use of such existing electromagnetic tools may provide increased accuracy due to accurate tracking of electromagnetic tools by some navigation systems.
  • the calibration techniques described herein may be implemented using actual tissue of a subject as a calibration phantom.
  • the calibration techniques described herein may be implemented during a medical procedure associated with the actual tissue.
  • the calibration techniques and calibration software described herein may include using (e.g., automatically, or in response to a user request, etc.) the ultrasound images and corresponding electromagnetic tracking information to recalibrate the registration between the ultrasound space and the electromagnetic navigation space (also referred to herein as recalibrating the ultrasound tracking registration), without interrupting the medical procedure.
  • the systems and techniques described herein support recalibrating the ultrasound imaging system in the background based on automatic detection of target objects (e.g., instruments, tools, etc.) in the ultrasound images using AI/machine learning computer vision algorithms and object detection.
  • the systems and techniques support automatic registration which may be implemented continuously, based on each event in which a tracked instrument or tracked device is detected in the ultrasound view, and/or periodically (e.g., based on a temporal trigger).
  • the systems and techniques support automatic registration in response to other trigger criteria (e.g., in response to detection of a target instrument in an ultrasound image) at any point during a medical procedure.
  • the systems and techniques may include continuously verifying the registration accuracy between the ultrasound imaging system and the navigation system anytime the target instrument (e.g., surgical instrument, electromagnetic pointer, etc.) is detected in the ultrasound imaging.
  • the systems and techniques support alerting the user and/or taking corrective actions in response to registration discrepancies.
  • the system may provide the user with a list of corrective actions for improving calibration.
  • corrective actions may include real-time actionable feedback for users to move the target instrument in association with registration.
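  • A simplified sketch of such a verification loop (hypothetical tolerance and helper names, not the disclosed implementation): whenever the instrument tip is detected, map the tip position detected in the image through the current calibration into the navigation space, compare it with the tracked tip position, and alert or trigger recalibration if the discrepancy exceeds a tolerance:

      import numpy as np

      def verify_registration(detected_tip_mm, tracked_tip_nav,
                              R, t, tolerance_mm=2.0):
          """Compare a tip position detected in the image (expressed in mm in the
          image frame) against the tracked tip position, after mapping the
          detected tip through the current calibration (R, t).
          Returns (ok, error_mm)."""
          predicted_nav = R @ detected_tip_mm + t
          error_mm = float(np.linalg.norm(predicted_nav - tracked_tip_nav))
          return error_mm <= tolerance_mm, error_mm

      ok, err = verify_registration(np.array([5.0, 5.0, 0.0]),
                                    np.array([95.0, 55.0, 20.0]),
                                    R=np.eye(3), t=np.zeros(3))
      if not ok:
          print(f"Registration discrepancy {err:.1f} mm - consider recalibrating.")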
  • the systems and techniques may support dynamically and automatically detecting distortion in the navigated volume due to discrepancies between expected navigation and imaging data.
  • the discrepancies may be between pose information of a tracked object as indicated by the navigation system and pose information of the tracked object as indicated by the imaging data.
  • the systems described herein may support techniques for alerting (e.g., providing a notification to) a user of the discrepancies and compensating for the discrepancies.
  • the systems and techniques may include calibrating the navigation data to the imaging data (e.g., calibrating a navigation space to an ultrasound space) while compensating for the discrepancies.
  • aspects of the present disclosure support integration of a calibration phantom (e.g., water bath, hydrogel phantom, etc.) into the structure of a patient tracker.
  • the integration of the calibration phantom may support user recalibration of a navigated ultrasound probe before or during a medical procedure.
  • with the recalibration techniques described herein, the temporal duration associated with recalibration is reduced compared to some other recalibration techniques.
  • Implementations of the present disclosure provide technical solutions to one or more of the problems associated with other navigation systems and calibration techniques.
  • the systems and techniques described herein provide time savings, improved user experience, cost savings, and increased accuracy in comparison to other registration and calibration techniques.
  • the systems and techniques described herein support continuous registration during a surgical procedure and continuous registration verification, in which registration and registration verification may be autonomous or semi-autonomous.
  • aspects of the systems and techniques described herein support time efficient and cost-effective utilization of ultrasound images during surgery to navigate and display to medical personnel the locations of surgical devices (e.g., instruments, surgical tools, robotic end effectors, etc.) with respect to the patient anatomy.
  • the systems and techniques described herein provide a reliable accuracy of the calibration between imaging devices (e.g., an ultrasound image probe, other imaging probes, etc.) and a navigation system, and the reliable accuracy may support accurate navigation of images (e.g., ultrasound images, etc.) that are generated based on data captured by the imaging devices.
  • different imaging probes may be different based on manufacturer, configuration, probe type, and the like, and such imaging probes may require new calibration and can lose calibration during a medical procedure.
  • Aspects of the calibration techniques described herein are relatively time efficient, cost effective, user friendly, and provide increased accuracy compared to other techniques for calibrating or recalibrating imaging probes.
  • the time efficiency, cost effectiveness, user friendliness, and increased accuracy supported by the systems and techniques described herein may provide improved confidence for a surgeon in a navigated ultrasound space and support a reduction in surgical errors.
  • the registration and calibration techniques described herein may be implemented autonomously (e.g., without input from medical personnel) or semi-autonomously (e.g., with partial input from medical personnel).
  • aspects of the present disclosure relate to navigated and robotic surgery and to any type of surgery that may be associated with intra-surgical ultrasound imaging.
  • Aspects of the present disclosure support implementing any of the techniques described herein to any medical procedure (e.g., cranial, spinal, thoracic, abdominal, cardiac, ablation, laparoscopic, minimally invasive surgery, robotic surgery, etc.) associated with the use of intra-surgical ultrasound imaging.
  • the systems and techniques described herein may be implemented in association with initiatives related to data analytics, artificial intelligence, and machine learning, for example, with respect to data analytic scenarios for procedure and device optimization.
  • the techniques described herein may be implemented as a standalone application that uses a calibration phantom (e.g., a water bath, etc.) and an imaging system (e.g., ultrasound imaging, optical or electromagnetic tracking, 3D rendering software, and calibration software) or as an application integrated with an imaging system or navigation system.
  • the examples described herein with reference to the following figures may support multiple types, geometries, configurations, and sizes of calibration phantoms other than the examples illustrated and described herein.
  • FIG. 1A illustrates an example of a system 100 that supports aspects of the present disclosure.
  • the system 100 includes a computing device 102, one or more imaging devices 112, a robot 114, a navigation system 118, a database 130, and/or a cloud network 134 (or other network).
  • Systems according to other implementations of the present disclosure may include more or fewer components than the system 100.
  • the system 100 may omit and/or include additional instances of one or more components of the computing device 102, the imaging device(s) 112, the robot 114, navigation system 118, the database 130, and/or the cloud network 134.
  • the system 100 may omit any instance of the computing device 102, the imaging device(s) 112, the robot 114, navigation system 118, the database 130, and/or the cloud network 134.
  • the system 100 may support the implementation of one or more other aspects of one or more of the methods disclosed herein.
  • the computing device 102 includes a processor 104, a memory 106, a communication interface 108, and a user interface 110.
  • Computing devices according to other implementations of the present disclosure may include more or fewer components than the computing device 102.
  • the computing device 102 may be, for example, a control device including electronic circuitry associated with controlling any components of the system 100.
  • the processor 104 of the computing device 102 may be any processor described herein or any similar processor.
  • the processor 104 may be configured to execute instructions stored in the memory 106, which instructions may cause the processor 104 to carry out one or more computing steps utilizing or based on data received from the imaging devices 112, the robot 114, the navigation system 118, the database 130, and/or the cloud network 134.
  • the memory 106 may be or include RAM, DRAM, SDRAM, other solid-state memory, any memory described herein, or any other tangible, non-transitory memory for storing computer-readable data and/or instructions.
  • the memory 106 may store information or data associated with completing, for example, any step of the process flow 500 described herein, or of any other methods.
  • the memory 106 may store, for example, instructions and/or machine learning models that support one or more functions of the imaging devices 112, the robot 114, and the navigation system 118.
  • the memory 106 may store content (e.g., instructions and/or machine learning models) that, when executed by the processor 104, enable image processing 120, segmentation 122, transformation 124, and/or registration 128.
  • Such content, if provided as instructions, may, in some implementations, be organized into one or more applications, modules, packages, layers, or engines.
  • the memory 106 may store other types of content or data (e.g., machine learning models, artificial neural networks, deep neural networks, etc.) that can be processed by the processor 104 to carry out the various method and features described herein.
  • the data, algorithms, and/or instructions may cause the processor 104 to manipulate data stored in the memory 106 and/or received from or via the imaging devices 112, the robot 114, the navigation system 118, the database 130, and/or the cloud network 134.
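  • Purely for illustration (this is not the disclosed software architecture), such content could be organized as a small pipeline of engines corresponding to image processing 120, segmentation 122, transformation 124, and registration 128:

      from dataclasses import dataclass
      from typing import Any, Callable

      @dataclass
      class Pipeline:
          """Illustrative grouping of processing stages into named engines."""
          image_processing: Callable[[Any], Any]
          segmentation: Callable[[Any], Any]
          transformation: Callable[[Any], Any]
          registration: Callable[[Any], Any]

          def run(self, image: Any) -> Any:
              processed = self.image_processing(image)
              segmented = self.segmentation(processed)
              transformed = self.transformation(segmented)
              return self.registration(transformed)

      # Placeholder stages; real engines would wrap the functions described herein.
      pipeline = Pipeline(lambda x: x, lambda x: x, lambda x: x, lambda x: x)
      result = pipeline.run(image="frame-0")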
  • the computing device 102 may also include a communication interface 108.
  • the communication interface 108 may be used for receiving data or other information from an external source (e.g., the imaging devices 112, the robot 114, the navigation system 118, the database 130, the cloud network 134, and/or any other system or component separate from the system 100), and/or for transmitting instructions, data (e.g., image data, tracking data, navigation data, calibration data, registration data, etc.), or other information to an external system or device (e.g., another computing device 102, the imaging devices 112, the robot 114, the navigation system 118, the database 130, the cloud network 134, and/or any other system or component not part of the system 100).
  • the communication interface 108 may include one or more wired interfaces (e.g., a USB port, an Ethernet port, a FireWire port) and/or one or more wireless transceivers or interfaces (configured, for example, to transmit and/or receive information via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, NFC, ZigBee, and so forth).
  • the communication interface 108 may support communication between the device 102 and one or more other processors 104 or computing devices 102, whether to reduce the time needed to accomplish a computing-intensive task or for any other reason.
  • the computing device 102 may also include one or more user interfaces 110.
  • the user interface 110 may be or include a keyboard, mouse, trackball, monitor, television, screen, touchscreen, and/or any other device for receiving information from a user and/or for providing information to a user.
  • the user interface 110 may be used, for example, to receive a user selection or other user input regarding any step of any method described herein. Notwithstanding the foregoing, any required input for any step of any method described herein may be generated automatically by the system 100 (e.g., by the processor 104 or another component of the system 100) or received by the system 100 from a source external to the system 100.
  • the user interface 110 may support user modification (e.g., by a surgeon, medical personnel, a patient, etc.) of instructions to be executed by the processor 104 according to one or more implementations of the present disclosure, and/or to user modification or adjustment of a setting of other information displayed on the user interface 110 or corresponding thereto.
  • the computing device 102 may utilize a user interface 110 that is housed separately from one or more remaining components of the computing device 102.
  • the user interface 110 may be located proximate one or more other components of the computing device 102, while in other implementations, the user interface 110 may be located remotely from one or more other components of the computing device 102.
  • the imaging device 112 may be operable to image anatomical feature(s) (e.g., a bone, veins, tissue, etc.) and/or other aspects of patient anatomy to yield image data (e.g., image data depicting or corresponding to a bone, veins, tissue, etc.).
  • image data refers to the data generated or captured by an imaging device 112, including in a machine-readable form, a graphical/visual form, and in any other form.
  • the image data may include data corresponding to an anatomical feature of a patient, or to a portion thereof.
  • the image data may be or include a preoperative image, an intraoperative image, a postoperative image, or an image taken independently of any surgical procedure.
  • a first imaging device 112 may be used to obtain first image data (e.g., a first image) at a first time, and a second imaging device 112 may be used to obtain second image data (e.g., a second image) at a second time after the first time.
  • the imaging device 112 may be capable of taking a 2D image or a 3D image to yield the image data.
  • the imaging device 112 may be or include, for example, an ultrasound scanner (which may include, for example, a physically separate transducer and receiver, or a single ultrasound transceiver), an O-arm, a C-arm, a G-arm, or any other device utilizing X-ray-based imaging (e.g., a fluoroscope, a CT scanner, or other X-ray machine), a magnetic resonance imaging (MRI) scanner, an optical coherence tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermographic camera (e.g., an infrared camera), a radar system (which may include, for example, a transmitter, a receiver, a processor, and one or more antennae), or any other imaging device 112 suitable for obtaining images of an anatomical feature of a patient.
  • the imaging device 112 may be contained entirely within a single housing, or may include a transmitter/emitter and a receiver/detector that are in separate housings or are otherwise physically separated.
  • the imaging device 112 may include more than one imaging device 112.
  • For example, a first imaging device may provide first image data and/or a first image, and a second imaging device may provide second image data and/or a second image.
  • the same imaging device may be used to provide both the first image data and the second image data, and/or any other image data described herein.
  • the imaging device 112 may be operable to generate a stream of image data.
  • the imaging device 112 may be configured to operate with an open shutter, or with a shutter that continuously alternates between open and shut so as to capture successive images.
  • image data may be considered to be continuous and/or provided as an image data stream if the image data represents two or more frames per second.
  • the robot 114 may be any surgical robot or surgical robotic system.
  • the robot 114 may be or include, for example, the Mazor X™ Stealth Edition robotic guidance system.
  • the robot 114 may be configured to position the imaging device 112 at one or more precise position(s) and orientation(s), and/or to return the imaging device 112 to the same position(s) and orientation(s) at a later point in time.
  • the robot 114 may additionally or alternatively be configured to manipulate a surgical tool (whether based on guidance from the navigation system 118 or not) to accomplish or to assist with a surgical task.
  • the robot 114 may be configured to hold and/or manipulate an anatomical element during or in connection with a surgical procedure.
  • the robot 114 may include one or more robotic arms 116.
  • the robotic arm 116 may include a first robotic arm and a second robotic arm, though the robot 114 may include more than two robotic arms.
  • one or more of the robotic arms 116 may be used to hold and/or maneuver the imaging device 112.
  • Where the imaging device 112 includes two or more physically separate components (e.g., a transmitter and receiver), one robotic arm 116 may hold one such component and another robotic arm 116 may hold another such component.
  • Each robotic arm 116 may be positionable independently of the other robotic arm.
  • the robotic arms 116 may be controlled in a single, shared coordinate space, or in separate coordinate spaces.
  • the robot 114 together with the robotic arm 116, may have, for example, one, two, three, four, five, six, seven, or more degrees of freedom. Further, the robotic arm 116 may be positioned or positionable in any pose, plane, and/or focal point. The pose includes a position and an orientation. As a result, an imaging device 112, surgical tool, or other object held by the robot 114 (or, more specifically, by the robotic arm 116) may be precisely positionable in one or more needed and specific positions and orientations.
  • the robotic arm(s) 116 may include one or more sensors that enable the processor 104 (or a processor of the robot 114) to determine a precise pose in space of the robotic arm (as well as any object or element held by or secured to the robotic arm).
  • Reference markers (e.g., navigation markers) may be tracked by the navigation system 118, and the results of the tracking may be used by the robot 114 and/or by an operator of the system 100 or any component thereof.
  • the navigation system 118 can be used to track other components (e.g., imaging device 112, surgical tools, instruments 145 (later described with reference to Fig. 1B), etc.).
  • the system can operate without the use of the robot 114 (e.g., with the surgeon manually manipulating the imaging device 112 and/or one or more surgical tools, based on information and/or instructions generated by the navigation system 118, for example).
  • the navigation system 118 may provide navigation for a surgeon and/or a surgical robot during an operation.
  • the navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic StealthStation™ S8 surgical navigation system or any successor thereof.
  • the navigation system 118 may include one or more cameras or other sensor(s) for tracking one or more reference markers, navigated trackers (e.g., tracking devices 140, etc.) or other objects within the operating room or other room in which some or all of the system 100 is located.
  • the one or more cameras may be optical cameras, infrared cameras, or other cameras.
  • the navigation system 118 may include one or more tracking devices 140 (e.g., electromagnetic sensors, acoustic sensors, etc.).
  • the navigation system 118 may include one or more of an optical tracking system, an acoustic tracking system, an electromagnetic tracking system, a radar tracking system, an inertial measurement unit (IMU) based tracking system, and a computer vision based tracking system.
  • the navigation system 118 may include a corresponding transmission device 136 capable of transmitting signals associated with the tracking type.
  • the navigation system 118 may be capable of computer vision based tracking of objects present in images captured by the imaging device(s) 112.
  • the navigation system 118 may be used to track a position and orientation (e.g., a pose) of the imaging device 112, the robot 114 and/or robotic arm 116, and/or one or more surgical tools (e.g., instrument 145, etc.) (or, more particularly, to track a pose of a navigated tracker attached, directly or indirectly, in fixed relation to the one or more of the foregoing).
  • the instrument 145 may be an electromagnetic pointer (or stylus).
  • the navigation system 118 may include a display for displaying one or more images from an external source (e.g., the computing device 102, imaging device 112, or other source) or for displaying an image and/or video stream from the one or more cameras or other sensors of the navigation system 118.
  • the system 100 can operate without the use of the navigation system 118.
  • the navigation system 118 may be configured to provide guidance to a surgeon or other user of the system 100 or a component thereof, to the robot 114, or to any other element of the system 100 regarding, for example, a pose of one or more anatomical elements, whether or not a tool is in the proper trajectory, and/or how to move a tool into the proper trajectory to carry out a surgical task according to a preoperative or other surgical plan.
  • the processor 104 may utilize data stored in memory 106 as a neural network.
  • the neural network may include a machine learning architecture.
  • the neural network may be or include one or more classifiers.
  • the neural network may be or include any machine learning network such as, for example, a deep learning network, a convolutional neural network, a reconstructive neural network, a generative adversarial neural network, or any other neural network capable of accomplishing functions of the computing device 102 described herein.
  • Some elements stored in memory 106 may be described as or referred to as instructions or instruction sets, and some functions of the computing device 102 may be implemented using machine learning techniques.
  • the processor 104 may support machine learning model(s) 138 which may be trained and/or updated based on data (e.g., training data 144) provided or accessed by any of the computing device 102, the imaging device 112, the robot 114, the navigation system 118, the database 130, and/or the cloud network 134.
  • the machine learning model(s) 138 may be built and updated based on the training data 144 (also referred to herein as training data and feedback).
  • the neural network and machine learning model(s) 138 may support AI/machine learning computer vision algorithms and object detection in association with automatically detecting, identifying, and tracking target objects (e.g., instruments, tools, etc.) in one or more images 153 or a multimedia file 154.
  • the database 130 may store information that correlates one coordinate system to another (e.g., one or more robotic coordinate systems, an ultrasound space coordinate system, a patient coordinate system, and/or a navigation coordinate system, etc.).
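  • A minimal sketch of how such correlations between coordinate systems might be stored and queried (an assumed structure; the database 130 is not limited to this form): keep a table of named frame pairs mapped to 4 x 4 homogeneous transforms and compose them on lookup:

      import numpy as np

      class TransformStore:
          """Illustrative registry of transforms between named coordinate systems."""

          def __init__(self):
              self._transforms = {}  # (src, dst) -> 4x4 numpy array

          def put(self, src: str, dst: str, T: np.ndarray) -> None:
              self._transforms[(src, dst)] = T
              self._transforms[(dst, src)] = np.linalg.inv(T)

          def get(self, src: str, dst: str) -> np.ndarray:
              return self._transforms[(src, dst)]

          def compose(self, a: str, b: str, c: str) -> np.ndarray:
              # Transform from frame a to frame c via intermediate frame b.
              return self.get(b, c) @ self.get(a, b)

      store = TransformStore()
      store.put("ultrasound", "navigation", np.eye(4))
      store.put("navigation", "patient", np.eye(4))
      T_us_to_patient = store.compose("ultrasound", "navigation", "patient")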
  • the database 130 may additionally or alternatively store, for example, one or more surgical plans (including, for example, pose information about a target and/or image information about a patient’s anatomy at and/or proximate the surgical site, for use by the robot 114, the ultrasound space coordinate system, the navigation system 118, and/or a user of the computing device 102 or of the system 100); one or more images 153 useful in connection with a surgery to be completed by or with the assistance of one or more other components of the system 100; and/or any other useful information.
  • the database 130 may be configured to provide any such information to the computing device 102 or to any other device of the system 100 or external to the system 100, whether directly or via the cloud network 134.
  • the database 130 may include information associated with a calibration phantom 149 associated with a calibration procedure.
  • the database 130 may be or include part of a hospital image storage system, such as a picture archiving and communication system (PACS), a health information system (HIS), and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
  • the computing device 102 may communicate with a server(s) and/or a database (e.g., database 130) directly or indirectly over a communications network (e.g., the cloud network 134).
  • the communications network may include any type of known communication medium or collection of communication media and may use any type of protocols to transport data between endpoints.
  • the communications network may include wired communications technologies, wireless communications technologies, or any combination thereof.
  • Wired communications technologies may include, for example, Ethernet-based wired local area network (LAN) connections using physical transmission mediums (e.g., coaxial cable, copper cable/wire, fiber-optic cable, etc.).
  • Wireless communications technologies may include, for example, cellular or cellular data connections and protocols (e.g., digital cellular, personal communications service (PCS), cellular digital packet data (CDPD), general packet radio service (GPRS), enhanced data rates for global system for mobile communications (GSM) evolution (EDGE), code division multiple access (CDMA), single-carrier radio transmission technology (1xRTT), evolution-data optimized (EVDO), high speed packet access (HSPA), universal mobile telecommunications service (UMTS), 3G, long term evolution (LTE), 4G, and/or 5G, etc.), Bluetooth®, Bluetooth® low energy, Wi-Fi, radio, satellite, infrared connections, and/or ZigBee® communication protocols.
  • the Internet is an example of a communications network: an Internet Protocol (IP) network consisting of multiple computers, computing networks, and other communication devices located in multiple locations, whose components (e.g., computers, computing networks, communication devices) may be connected through one or more telephone systems and other means.
  • the communications network may include, without limitation, a standard Plain Old Telephone System (POTS), an Integrated Services Digital Network (ISDN), the Public Switched Telephone Network (PSTN), a Local Area Network (LAN), a Wide Area Network (WAN), a wireless LAN (WLAN), a Session Initiation Protocol (SIP) network, a Voice over Internet Protocol (VoIP) network, a cellular network, and any other type of packet-switched or circuit- switched network known in the art.
  • the communications network may include any combination of networks or network types.
  • the communications network may include any combination of communication mediums such as coaxial cable, copper cable/wire, fiber-optic cable, or antennas for communicating data (e.g., transmitting/receiving data).
  • the computing device 102 may be connected to the cloud network 134 via the communication interface 108, using a wired connection, a wireless connection, or both. In some implementations, the computing device 102 may communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud network 134.
  • the system 100 or similar systems may be used, for example, to carry out one or more aspects of the process flow 500 described herein.
  • the system 100 or similar systems may also be used for other purposes.
  • Fig. 1B illustrates an example of the system 100 that supports aspects of the present disclosure. Aspects of the example may be implemented by the computing device 102, imaging device(s) 112, robot 114 (e.g., a robotic system), and navigation system 118.
  • the navigation system 118 may provide navigation information based on an electromagnetic field generated by a transmission device 136.
  • the navigation information may include tracking information 167 (also referred to herein as tracking data) as described herein.
  • the transmission device 136 may include an array of transmission coils capable of generating or forming the electromagnetic field in response to respective currents driven through the transmission coils.
  • the navigation system 118 may include tracking devices 140 capable of sensing the electromagnetic field. Aspects of the navigation system 118 described herein may be implemented by navigation processing 129.
  • the system 100 may support tracking objects (e.g., an instrument 145, imaging device 112, etc.) in a trackable volume 150 using an electromagnetic field produced by the transmission device 136.
  • the transmission device 136 may include a transmitter antenna or transmitting coil array capable of producing the electromagnetic field.
  • the system 100 may track the pose (e.g., position, coordinates, orientation, etc.) of the objects in the tracking volume 150 relative to a subject 141.
  • the system 100 may display, via a user interface of the computing device 102, icons corresponding to any tracked objects. For example, the system 100 may superimpose such icons on and/or adjacent an image displayed on the user interface.
  • the transmission device 136 may be an electromagnetic localizer that is operable to generate electromagnetic fields.
  • the transmission device 136 may drive current through the transmission coils, thereby powering the coils to generate or form the electromagnetic field. As the current is driven through the coils, the electromagnetic field will extend away from the transmission coils and form a navigation domain (e.g., volume 150).
  • the volume 150 may include any portion (e.g., the spine, one or more vertebrae, the brain, an anatomical element, or a portion thereof, etc.) of the subject 141 and/or any portion of a calibration phantom 149-a.
  • the transmission coils may be powered through a controller device and/or power supply provided by the system 100.
  • the tracking devices 140 may include or be provided as sensors (also referred to herein as tracking sensors). The sensors may sense a selected portion or component of the electromagnetic field(s) generated by the transmission device 136.
  • the navigation system 118 may support registration (e.g., through registration 128) of the volume 150 to a virtual space 155.
  • the navigation system 118 may support superimposing an icon representing a tracked object (e.g., an instrument 145, a tracking device 140-b, a tracking device 146, etc.) on the image.
  • the system 100 may support the delivery of tracking information associated with the tracking devices 140 and/or tracking device 146 to the navigation system 118.
  • the tracking information may include, for example, data associated with magnetic fields sensed by the tracking devices 140.
  • the tracking devices 140 may communicate sensor information to the navigation system 118 for determining a position of the tracked portions relative to each other and/or for localizing an object (e.g., instrument 145, tracking device 146, etc.) relative to an image 153.
  • the navigation system 118 and/or transmission device 136 may include a controller that supports operating and powering the generation of electromagnetic fields.
  • the system 100 may generate a navigation space 119 based on one or more tracking signals transmitted by the transmission device 136.
  • the navigation space 119 may correspond to environment 142 or a portion thereof.
  • the navigation space 119 may correspond to a subject 141 (e.g., a patient) included in the environment 142 or an anatomical element (e.g., an organ, bone, tissue, etc.) of the subject 141.
  • the environment 142 may be, for example, an operating room, an exam room, or the like.
  • the tracking signals are not limited to electromagnetic tracking signals, and it is to be understood that the example aspects described with reference to Fig. 1B may be implemented using other types of tracking signals (e.g., optical tracking signals, acoustic tracking signals, etc.).
  • the system 100 may generate a virtual space 155 based on (e.g., in response to) signals transmitted by imaging device 112.
  • the virtual space 155 may correspond to a field of view 159 of the imaging device 112.
  • the system 100 may generate images 153 in response to signals transmitted by imaging device 112, and the images 153 may correspond to the field of view 159 of the imaging device 112.
  • the imaging device 112 is an ultrasound probe transmitting ultrasound signals, and the images 153 may be ultrasound images.
  • the images 153 may be static images or video images. In some aspects, the images 153 may be stored as a multimedia file 154 that includes video (or video and sound).
  • the imaging device 112 and the example signals transmitted by the imaging device 112 are not limited thereto, and it is to be understood that the example aspects described with reference to Fig. 1B may be implemented using other types of imaging devices 112 (e.g., X-ray, CT scanner, OCT scanner, etc.) and imaging systems described herein.
  • the system 100 may support acquiring image data to generate or produce images (e.g., images 153, multimedia file 154, etc.) of the subject 141.
  • the system 100 may detect or track the calibration phantom 149-a and other objects (e.g., tracking devices 140, instruments 145, tracking device 146, etc.) included in the volume 150 and the virtual space 155.
  • the calibration phantom 149-a may be located in the volume 150 (as generated by the navigation system 118) and the virtual space 155 (as generated by the computing device 102).
  • the calibration phantom 149-a is a tissue phantom, but is not limited thereto.
  • the system 100 may register and calibrate the imaging device 112 with respect to the navigation system 118.
  • the system 100 may identify a set of coordinates 157 in the virtual space 155 in response to an event in which the system 100 detects at least a portion of the instrument 145 in the image 153.
  • the system 100 may detect that the portion of the instrument 145 is located in the calibration phantom 149-a and intersects a surface of the virtual space 155 at the set of coordinates 157.
  • the system 100 may detect that the portion of the instrument 145 intersects the surface at an angle perpendicular to the surface.
  • the portion of the instrument 145 may be a tracking device 146 (e.g., an electromagnetic antenna of the instrument 145).
  • the virtual space 155 may be a 2D virtual space generated based on 2D images (e.g., ultrasound images, CT images, etc.) captured by the imaging device 112, and the surface may be a plane of the virtual space 155.
  • the virtual space 155 may be a 3D virtual space (e.g., a volume), and the surface of the virtual space 155 may be a planar surface or non-planar surface.
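  • As a hedged illustration of how the set of coordinates 157 might be located in an ultrasound frame, the sketch below finds the brightest (echogenic) blob and returns its centroid in pixel coordinates. The disclosure does not prescribe a particular detector; the function name, threshold, and minimum blob size are assumptions.

```python
import numpy as np

def detect_tip_coordinates(image, intensity_threshold=0.8, min_pixels=5):
    """Return the (row, col) centroid of the brightest blob in a 2D
    ultrasound frame, or None if no sufficiently bright region exists.

    A crude stand-in for detecting the echogenic cross-section of the
    instrument/tracking-device tip in the image plane.
    """
    img = image.astype(float)
    img = (img - img.min()) / (np.ptp(img) + 1e-9)    # normalize to [0, 1]
    mask = img >= intensity_threshold
    if mask.sum() < min_pixels:
        return None                                    # no event detected
    rows, cols = np.nonzero(mask)
    weights = img[rows, cols]
    centroid = (np.average(rows, weights=weights),
                np.average(cols, weights=weights))
    return centroid                                    # candidate coordinates 157 (pixels)

# Example with a synthetic frame containing one bright spot.
frame = np.zeros((480, 640))
frame[200:205, 300:305] = 1.0
print(detect_tip_coordinates(frame))                   # ~ (202.0, 302.0)
```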
  • the system 100 may calibrate the virtual space 155 to the navigation space 119 in response to the event in which the system 100 detects the instrument 145 (or at least a portion of the instrument 145) in the image 153.
  • the system 100 may calibrate a coordinate system 160 associated with the virtual space 155 with respect to a coordinate system 165 associated with the navigation space 119.
  • the system 100 may calibrate the coordinate system 160 with respect to the coordinate system 165 based on the set of coordinates 157 and temporal information 156 associated with the event in which the instrument 145 (or portion of the instrument 145) intersects a surface (e.g., a plane, a volume, etc.) of the virtual space 155 at the set of coordinates 157. Further, for example, the system 100 may calibrate the coordinate system 160 with respect to the coordinate system 165 based on tracking information associated with the tracking device 146, the tracking device 140-b, and the tracking device 140-a.
  • the instrument 145 may be registered to the navigation system 118 such that the navigation system 118 may track and determine pose information 161 of the instrument 145 based on tracking information 167 associated with the tracking device 146, the tracking device 140-a, and/or the tracking device 140-b and temporal information 166 corresponding to the tracking information 167. Further, for example, the navigation system 118 may track and determine pose information 161 of the imaging device 112 based on tracking information 167 associated with the tracking device 140-a and temporal information 166 corresponding to the tracking information 167.
  • the system 100 may calibrate the coordinate system 160 with respect to the coordinate system 165 based on the tracking information 167 (e.g., associated with the tracking device 146, the tracking device 140-b, and the tracking device 140-a), the temporal information 166 associated with the tracking information 167, the temporal information 156 associated with the event, and the coordinates 157 associated with the event.
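  • The disclosure does not mandate a specific solver for calibrating the coordinate system 160 to the coordinate system 165. One common choice, sketched below under that assumption, is a least-squares rigid fit (Kabsch/SVD) over point pairs collected from intersection events, where each pair couples the coordinates 157 in the virtual space with the tracked tip position in the navigation space at the matching timestamp.

```python
import numpy as np

def fit_rigid_transform(points_virtual, points_nav):
    """Least-squares rigid transform (Kabsch/SVD) mapping navigation-space
    points onto their matched virtual-space (image) points.

    points_virtual, points_nav: (N, 3) arrays of corresponding points,
    paired using the temporal information associated with each event.
    Returns a 4x4 homogeneous transform T with T @ [p_nav, 1] ~ [p_virtual, 1].
    """
    A = np.asarray(points_nav, dtype=float)
    B = np.asarray(points_virtual, dtype=float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)                   # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# Three or more non-collinear correspondences are needed for a unique fit.
nav = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 10.0, 0.0]])
virt = nav + np.array([1.0, 2.0, 3.0])          # pure translation example
print(fit_rigid_transform(virt, nav).round(3))
```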
  • the system 100 may detect, in an image 153 (or multimedia file 154), one or more landmarks corresponding to the instrument 145, a portion of the instrument 145, or the tracking device 146.
  • the landmarks may correspond to distinctive features (e.g., a tip, a shape of the tip, etc.) of the instrument 145 or the tracking device 146.
  • the system 100 may calibrate the coordinate system 160 with respect to the coordinate system 165 based on the one or more landmarks.
  • aspects of the present disclosure support calibrating the virtual space 155 to the navigation space 119 (e.g., calibrating the coordinate system 160 associated with the virtual space 155 to the coordinate system 165 associated with the navigation space 119) in response to one or more criteria.
  • the system 100 may calibrate the virtual space 155 to the navigation space 119 in response to each occurrence of the event.
  • the system 100 may calibrate the virtual space 155 to the navigation space 119 in response to each nth occurrence of the event (e.g., each third occurrence of the event, each fifth occurrence, etc.).
  • the system 100 may calibrate the virtual space 155 to the navigation space 119 in response to each nth occurrence of the event within a temporal duration (e.g., each third occurrence of the event, in which the third occurrence is X seconds or less after a first occurrence of the event (where X is an integer value)).
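  • The "each nth occurrence, optionally within a temporal duration" criterion can be expressed as a small trigger policy, as in the sketch below. The class name and the defaults (n = 3 events, 10-second window) are illustrative assumptions rather than values from the disclosure.

```python
from collections import deque
import time

class CalibrationTrigger:
    """Decide when to (re)calibrate: every nth intersection event,
    optionally requiring the last n events to fall within max_window_s."""
    def __init__(self, n=3, max_window_s=None):
        self.n = n
        self.max_window_s = max_window_s
        self.timestamps = deque(maxlen=n)

    def on_event(self, t=None):
        self.timestamps.append(time.monotonic() if t is None else t)
        if len(self.timestamps) < self.n:
            return False
        if (self.max_window_s is not None
                and self.timestamps[-1] - self.timestamps[0] > self.max_window_s):
            return False
        self.timestamps.clear()        # start counting the next n events
        return True

trigger = CalibrationTrigger(n=3, max_window_s=10.0)
for t in (0.0, 2.5, 4.0):              # three events within 10 s
    fire = trigger.on_event(t)
print(fire)                            # True -> run calibration now
```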
  • the system 100 may support calibrating the coordinate system 160 with respect to the coordinate system 165 without pausing a medical procedure (e.g., surgical procedure).
  • the system 100 may calibrate the coordinate system 160 with reference to the coordinate system 165 (e.g., recalibrate the registration between the virtual space 155 and the navigation space 119) in the background while medical personnel perform a medical procedure on the subject 141, without interrupting the medical procedure.
  • the system 100 may calibrate the coordinate system 160 with reference to the coordinate system 165 during the medical procedure, without prompting the medical personnel to pause the medical procedure, such that the medical personnel may proceed with the medical procedure without waiting for calibration to be completed.
  • the system 100 may calibrate the coordinate system 160 with reference to the coordinate system 165 without prompting the medical personnel to participate in a separate calibration operation.
  • the system 100 and techniques described herein may support calibrating the coordinate system 160 with respect to the coordinate system 165 based on any of: properties (e.g., beam thickness, beam shape, signal frequency, etc.) of signals transmitted by the imaging device 112, pose information of the instrument 145 (or pose information of the tracking device 146) in association with an intersection between the instrument 145 (or tracking device 146) and the surface of the virtual space 155, and properties (e.g., shape, etc.) of the tracking device 146, example aspects of which are later described with reference to Fig. 3.
  • calibrating the virtual space 155 to the navigation space 119 includes verifying a registration accuracy between the coordinate system 160 and the coordinate system 165.
  • the system 100 may calculate a registration accuracy between the coordinate system 160 and the coordinate system 165 and compare the registration accuracy to a target accuracy value.
  • the system 100 may perform one or more operations described herein in association with recalibrating the virtual space 155 to the navigation space 119.
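  • One way to quantify the registration accuracy mentioned above is a root-mean-square (RMS) residual over the collected point correspondences, compared against a target accuracy value. The sketch below assumes millimeter units and an arbitrary 2 mm target; neither is specified by the disclosure.

```python
import numpy as np

def registration_rms_error(T, points_nav, points_virtual):
    """RMS distance (same units as the inputs, e.g., mm) between virtual-space
    points and navigation-space points mapped through the calibration T."""
    A = np.hstack([points_nav, np.ones((len(points_nav), 1))])
    mapped = (T @ A.T).T[:, :3]
    return float(np.sqrt(np.mean(np.sum((mapped - points_virtual) ** 2, axis=1))))

# Example acceptance check against a target accuracy value.
TARGET_ACCURACY_MM = 2.0
rms = registration_rms_error(np.eye(4),
                             np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]]),
                             np.array([[0.5, 0.0, 0.0], [10.5, 0.0, 0.0]]))
needs_recalibration = rms > TARGET_ACCURACY_MM
print(rms, needs_recalibration)
```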
  • the system 100 may autonomously recalibrate the coordinate system 160 to the coordinate system 165.
  • the system 100 may generate and output a notification including user guidance information 175 (e.g., tutorials, user prompts, corrective actions, real-time actionable feedback, etc.) regarding how to move or position a device (e.g., the imaging device 112, the instrument 145, etc.) in association with the calibration process.
  • the notification may include a visual notification, an audible notification, a haptic notification, or a combination thereof.
  • Other additional and/or alternative aspects of calibrating the virtual space 155 to the navigation space 119 include automatically detecting distortion in the navigated volume 150 due to discrepancies between navigation data (e.g., tracking information 167 provided by navigation system 118) and imaging data (e.g., images 153, multimedia file 154, etc.).
  • the discrepancies may be between pose information 161 of a tracked object (e.g., tracking device 140-b, instrument 145, tracking device 146, etc.) as indicated by the navigation system 118 and pose information 162 of the tracked object as determined by the computing device 102 from the imaging data.
  • the system 100 may calculate the discrepancy and compare the discrepancy to a target discrepancy threshold value. In an example, in response to a comparison result in which the discrepancy is greater than the discrepancy threshold value, the system 100 may perform one or more operations described herein (e.g., autonomous recalibration, outputting a notification including user guidance information 175, etc.) in association with recalibrating the virtual space 155 to the navigation space 119. In some aspects, the system 100 may calibrate the navigation data to the imaging data (e.g., calibrate the navigation space 119 to the virtual space 155) while compensating for the discrepancies.
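  • The discrepancy check described above can be as simple as comparing the two position estimates of the same tracked object once both are expressed in a common coordinate system, as sketched below. The 3 mm threshold and the returned action strings are placeholders for the recalibration, notification, or compensation behaviors described herein.

```python
import numpy as np

def check_discrepancy(pos_from_navigation, pos_from_imaging, threshold_mm=3.0):
    """Compare the tracked object's position as reported by the navigation
    system with its position as derived from the imaging data (both already
    expressed in the same coordinate system). Returns the discrepancy (mm)
    and a suggested action."""
    d = float(np.linalg.norm(np.asarray(pos_from_navigation, float)
                             - np.asarray(pos_from_imaging, float)))
    if d > threshold_mm:
        return d, "recalibrate"        # or notify the user / apply compensation
    return d, "ok"

print(check_discrepancy([10.0, 5.0, 0.0], [10.5, 4.2, 0.3]))
```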
  • the system 100 may identify that tracking device 146 (e.g., electromagnetic antenna of an instrument 145) is intersecting the plane of an ultrasound imaging field 158 at a point 147 of intersection inside a circle 148, example aspects of which will later be described with reference to Fig. 2B.
  • the techniques described herein may provide continuous automatic registration, continuous semi-automatic registration, or a combination thereof, and the registration techniques may be implemented during a medical procedure.
  • Fig. 1C illustrates an example of the system 100 that supports aspects of the present disclosure. Aspects of the example in Fig. 1C include like aspects described with reference to Fig. 1B.
  • the calibration phantom 149-b may be an ultrasound transmitting volume (e.g., a water bath) implemented using an empty container (e.g., an empty box including an electromagnetically friendly material).
  • the calibration phantom 149-b may include ultrasound conductive material.
  • the container may be formed of low magnetic or non-magnetic materials so as to minimize distortion to electromagnetic fields.
  • the container may be formed of a material having a magnetic permeability of about 1.0 to about 1.1 (relative), and the material may have a relatively low electrical conductivity (e.g., an electrical conductivity less than a threshold value).
  • the material may be a stainless steel alloyed with other metallic elements to obtain specific properties (e.g., temperature and corrosion resistance, fracture tolerance, etc.).
  • Non-limiting examples of the material include Nickel/Chromium alloys (e.g., Series 300 alloys, type 304 stainless steel (annealed condition only), type 316 stainless steel), Cobalt/Chromium alloys (e.g., L605, MP35N), Titanium alloys (e.g., Ti6Al4V), plastics, and wood.
  • the calibration phantom 149-b may be integrated into the structure of a patient tracker. In some other aspects, the calibration phantom 149-b may be included in the environment 142 as a standalone structure that is separate from an operating table associated with the subject 141. Example aspects of the water bath are later described with reference to Fig. 2A.
  • According to example aspects of the present disclosure, the system 100 may support calibrating the virtual space 155 to the navigation space 119 using the calibration phantom 149-b and the techniques as described herein, in which the calibration phantom 149-b is substituted for the calibration phantom 149-a described with reference to Fig. 1B.
  • the system 100 may support calibrating the virtual space 155 to the navigation space 119 using both the calibration phantom 149-a and the calibration phantom 149-b.
  • the system 100 may support calibration outside the subject 141 using the calibration phantom 149-b (e.g., water bath) and further calibration (e.g., recalibration, calibration adjustment, etc.) using the calibration phantom 149-a (e.g., tissue of the subject 141), and the combination may provide an increase in accuracy compared to other calibration techniques.
  • the calibration phantom 149-a and the calibration phantom 149-b in association with a calibration process is later described with reference to Fig. 4.
  • Fig. 2A illustrates an example implementation 200 of the system 100. The implementation 200 includes an imaging device 112 (e.g., an electromagnetically tracked ultrasound probe), a calibration phantom 149-b (e.g., a water bath), and a transmission device (e.g., an electromagnetic emitter).
  • in the example, the transmission device is positioned outside of (e.g., behind) the calibration phantom 149-b.
  • a tracking device 146 may be positioned in an ultrasound imaging field 158 (later illustrated at Fig. 2B) associated with the imaging device 112 at multiple points by moving the tracking device 146 through the ultrasound imaging field.
  • the ultrasound imaging field 158 corresponds to the field of view 159 of the imaging device 112.
  • a navigation space 119 corresponding to the calibration phantom 149-b may be generated by the navigation system 118 as described herein, and the system 100 may display a virtual representation 201 of the navigation space 119 and the virtual space 155 via, for example, a user interface 110.
  • the tracking device 146 may be referred to as a navigated instrument.
  • the tracking device 146 is a navigation pointer (e.g., stylus).
  • Example aspects of the virtual representation 201 of the navigation space 119 and the virtual space 155 are later described with reference to Fig. 2B.
  • Fig. 2B illustrates an example of the virtual representation 201 of the navigation space 119 and the virtual space 155.
  • the virtual representation 201 may include a multi-dimensional representation corresponding to the volume of the calibration phantom 149-b.
  • the virtual representation 201 may include the imaging device 112, the instrument 145 (or portions of the instrument 145), and/or the tracking device 146.
  • the virtual representation 201 may include a pattern (e.g., represented by lines 202 and/or dots) that corresponds to the patterns described with reference to the container used for implementing the calibration phantom 149-b.
  • Each coordinate system 203 includes three dimensions: an X-axis, a Y-axis, and a Z-axis. Additionally or alternatively, coordinate system 203-a may be used to define surfaces, planes, or the volume of the calibration phantom 149-b and/or the navigation space 119.
  • the virtual representation 201 may include coordinate system 203-b that corresponds to the imaging device 112.
  • the axes of each coordinate system 203 may be disposed orthogonal (i.e., at 90 degrees) to one another. While the origin of a coordinate system 203 may be placed at any point on or near the components of the navigation system 118, for the purposes of description, the axes of the coordinate system 203 are always disposed along the same directions from figure to figure, whether the coordinate system 203 is shown or not. In some examples, reference may be made to dimensions, angles, directions, relative positions, and/or movements associated with one or more components of the imaging device 112 and/or the navigation system 118 with respect to a coordinate system 203.
  • a tracking device 146 (e.g., an electromagnetic antenna of an instrument 145) is intersecting the plane of the ultrasound imaging field 158 at a point 147 of intersection inside a circle 148.
  • the point 147 may be an echogenic dot.
  • the virtual representation 201 may include a window 204 displaying the imaging field 158 and information (e.g., point 147, circle 148, etc.) associated with the imaging field 158.
  • the system 100 may display guidance information 175 indicating pose information of the tracking device 146 with respect to the ultrasound imaging field 158.
  • the user guidance information 175 may include an indication of whether the tracking device 146 is outside the plane of the ultrasound imaging field 158 (e.g., ‘Out of Plane’) or intersecting the plane of the ultrasound imaging field 158 (e.g., at a point 147).
  • the guidance information 175 may include distance information (e.g., ‘Tip to Plane: 1.69 cm’) of the tracking device 146 with respect to the ultrasound imaging field 158.
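  • The ‘Out of Plane’ / ‘Tip to Plane’ readout shown above can be derived from the signed distance between the tracked tip and the image plane expressed in navigation coordinates. The following is a minimal sketch; the 1 mm in-plane tolerance and the exact message wording are assumptions chosen to mirror the example display.

```python
import numpy as np

def guidance_message(tip_nav_mm, plane_point_nav_mm, plane_normal_nav,
                     in_plane_tolerance_mm=1.0):
    """Build a user-guidance string from the signed tip-to-plane distance,
    with the image plane expressed in navigation coordinates."""
    n = np.asarray(plane_normal_nav, float)
    n = n / np.linalg.norm(n)
    d = float(np.dot(np.asarray(tip_nav_mm, float)
                     - np.asarray(plane_point_nav_mm, float), n))
    if abs(d) <= in_plane_tolerance_mm:
        return "In Plane"
    return f"Out of Plane - Tip to Plane: {abs(d) / 10.0:.2f} cm"

print(guidance_message([0.0, 0.0, 16.9], [0.0, 0.0, 0.0], [0.0, 0.0, 1.0]))
# -> "Out of Plane - Tip to Plane: 1.69 cm"
```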
  • the aspects described herein with reference to the plane of the ultrasound imaging field 158 support implementations applied to any surface of a virtual space 155 (e.g., a plane of the virtual space 155 for cases in which the virtual space 155 is a 2D virtual space, a planar surface or non-planar surface of the virtual space 155 for cases in which the virtual space 155 is a 3D virtual space, etc.). It is to be understood that the aspects described herein may be applied to an electromagnetic antenna, a navigation stylus, a pointer, or any navigated tools having a geometry and location that is defined, known, and trusted by the system 100.
  • the system 100 may record all navigation data (e.g., electromagnetic data), imaging data (e.g., ultrasound data), and corresponding temporal information (e.g., temporal information 156, temporal information 166) in a multimedia file 154.
  • the multimedia file 154 may be, for example, a movie file.
  • the system 100 may record timestamps corresponding to the navigation data and the imaging data.
  • the system 100 may identify when the tip of the tracking device 146 enters the ultrasound field of view 159 and intersects the plane of the ultrasound imaging field 158. Based on the identification of when the tip of the tracking device 146 enters the ultrasound field of view 159 and intersects the plane of the ultrasound imaging field 158, the system 100 may verify the calibration of the imaging device 112 with the electromagnetic navigation of the navigation system 118.
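  • Pairing each imaging-frame event with the tracking sample recorded closest in time is one way to use the recorded timestamps. The sketch below assumes monotonically increasing timestamps in seconds and an illustrative 50 ms maximum allowed skew.

```python
import bisect

def nearest_tracking_sample(event_time, tracking_times, max_skew_s=0.05):
    """Find the index of the tracking sample whose timestamp is closest to an
    imaging-event timestamp; return None if the gap exceeds max_skew_s.

    tracking_times must be sorted ascending (as recorded)."""
    i = bisect.bisect_left(tracking_times, event_time)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(tracking_times)]
    if not candidates:
        return None
    best = min(candidates, key=lambda j: abs(tracking_times[j] - event_time))
    return best if abs(tracking_times[best] - event_time) <= max_skew_s else None

times = [0.00, 0.02, 0.04, 0.06, 0.08]        # tracking timestamps (s)
print(nearest_tracking_sample(0.051, times))   # -> 3 (the 0.06 s sample is nearest)
```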
  • Fig. 3 A illustrates example views 300 and 301 of an ultrasound beam 305 transmitted by the imaging device 112 when viewed from different perspective views.
  • the ultrasound beam 305 is relatively thick (or wide) with respect to the Y-axis.
  • the ultrasound beam 305 is relatively narrow with respect to the Z-axis.
  • the thickness of the ultrasound beam 305 varies with depth, and the shape and focal point of the ultrasound beam 305 may be based on parameters (e.g., power, frequency, etc.) of the ultrasound beam 305.
  • the system 100 may calibrate the virtual space 155 to the navigation space 119 based on instances in which the instrument 145 (or tracking device 146) intersects a cross-sectional area 310 (e.g., an area in the XY plane) of the ultrasound beam 305 in a direction along the Z axis.
  • the system 100 may perform the calibration for instances in which the instrument 145 or tracking device 146 intersects a portion of the cross-sectional area 310 (e.g., the length of the instrument 145 or tracking device 146 is perpendicular (or nearly perpendicular) to the cross-sectional area 310), while factoring in values of parameters (e.g., thickness, depth, shape, focal point, etc.) of the ultrasound beam 305.
  • the system 100 may calibrate the virtual space 155 to the navigation space 119 for instances in which the instrument 145 (or tracking device 146) intersects the ultrasound beam 305 in the direction along the Z axis. That is, for example, the system 100 may perform the calibration for instances in which the instrument 145 or tracking device 146 intersects the ultrasound beam 305, while incorporating values of parameters (e.g., thickness, depth, shape, focal point, etc.) of the ultrasound beam 305.
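  • Deciding whether an intersection sample is usable for calibration, given the elevational beam thickness and the incidence angle, can be reduced to a simple acceptance test, as in the sketch below. The per-depth half-thickness input and the 15-degree limit are illustrative assumptions, not values from the disclosure.

```python
def accept_intersection(distance_to_midplane_mm, incidence_angle_deg,
                        beam_half_thickness_mm, max_angle_from_normal_deg=15.0):
    """Accept a candidate intersection sample only if the tracked tip lies
    within the elevational half-thickness of the beam at that depth and the
    instrument crosses the image plane close to perpendicular."""
    within_beam = abs(distance_to_midplane_mm) <= beam_half_thickness_mm
    near_perpendicular = abs(incidence_angle_deg) <= max_angle_from_normal_deg
    return within_beam and near_perpendicular

# The half-thickness could itself be a function of depth, e.g., taken from a
# lookup table characterizing the probe's elevational beam profile.
print(accept_intersection(0.8, 7.0, beam_half_thickness_mm=1.5))   # True
print(accept_intersection(2.4, 7.0, beam_half_thickness_mm=1.5))   # False
```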
  • Fig. 3B illustrates example elements 320 (e.g., element 320-a, element 320-b) that may be implemented at instrument 145 (e.g., at a tip 325 of the instrument 145, at a tip 325 of tracking device 146, etc.) in association with calibrating the virtual space 155 to the navigation space 119.
  • element 320-a may be a cylinder-shaped gel.
  • element 320-b may be a sphere-shaped gel.
  • Fig. 3C illustrates example shapes 330 (e.g., shapes 330-a through 330-c) that may be implemented at a tip 325 of the instrument 145, at a tip 325 of tracking device 146, or the like. Aspects of the present disclosure may include implementing any of the shapes 330, and the shapes 330 may support tip-image center alignment. That is, for example, each shape 330 may be symmetrical with respect to a center of the shape 330, which may support alignment of the center of the shape 330 and a center 335 of an ultrasound beam 305 emitted by the imaging device 112, an example of which is illustrated at Fig. 3D.
  • Fig. 4 illustrates an example of a process flow 400 in accordance with aspects of the present disclosure.
  • process flow 400 may implement aspects of a computing device 102, an imaging device 112, a robot 114, and a navigation system 118 described with reference to Figs. 1 through 3.
  • the operations may be performed in a different order than the order shown, or the operations may be performed in different orders or at different times. Certain operations may also be left out of the process flow 400, or other operations may be added to the process flow 400.
  • any of the operations of process flow 400 may be performed by any device (e.g., a computing device 102, an imaging device 112, a robot 114, navigation system 118, etc.) of the system 100 described herein.
  • in the example of the process flow 400, the system 100 may generate a navigation space 119 as described herein.
  • the system 100 may generate a virtual space 155 based on images captured by the imaging device 112 as described herein.
  • the system 100 may generate the virtual space 155 based on images representing the inside of calibration phantom 149-b (e.g., water bath).
  • the system 100 may initiate a calibration process 401 in accordance with aspects of the present disclosure. For example, at 420, the system 100 may initiate calibration of the coordinate system 160 associated with the virtual space 155 (and imaging device 112) with respect to the coordinate system 165 associated with the navigation space 119 (and navigation system 118).
  • the system 100 may provide user guidance information 175 (e.g., tutorials, user prompts, corrective actions, real-time actionable feedback, etc.) described herein regarding how to move or position a device (e.g., the imaging device 112, the instrument 145, etc.) in association with the calibration process.
  • the terms user guidance information 175 and calibration guidance information may be used interchangeably herein. Aspects of the present disclosure support implementations with or without providing user guidance information 175.
  • the system 100 may determine, from an image 153 (or multimedia file 154), whether an event has occurred in which the instrument 145 (or portion of the instrument 145) intersects a surface (e.g., a plane, a volume, etc.) of the virtual space 155.
  • the system 100 may analyze subsequent images 153 (or multimedia files 154) until the system 100 detects an event in which the instrument 145 (or portion of the instrument 145) intersects a surface (e.g., a plane, a volume, etc.) of the virtual space 155.
  • the system 100 may return to 425 and provide additional user guidance information 175 that prompts a user to position or orient the imaging device 112 and/or the instrument 145 to trigger such an event.
  • the system 100 may identify, from the image 153 (or multimedia file 154), the set of coordinates 157 at which the instrument 145 (or portion of the instrument 145) intersects a surface (e.g., a plane, a volume, etc.) of the virtual space 155.
  • the system 100 may identify the temporal information 156 associated with when the instrument 145 intersected the surface of the virtual space 155.
  • the system 100 may identify pose information 162 (in the virtual space 155) of the instrument 145 that corresponds to the temporal information 156.
  • the system 100 may calibrate the coordinate system 160 with respect to the coordinate system 165 based on the set of coordinates 157 and the temporal information 156 as described herein.
  • the system 100 may calibrate the coordinate system 160 with respect to the coordinate system 165 based on the set of coordinates 157, the temporal information 156, the pose information 162 (of the instrument 145 in the virtual space 155) corresponding to the temporal information 156, and the pose information 161 (of the instrument 145 in the navigation space 119) corresponding to the temporal information 156.
  • the system 100 may determine whether to repeat the calibration process 401, for example, based on a temporal duration (e.g., recalibration every X hours, every day, etc.).
  • the system 100 may repeat the calibration process 401, beginning at any operation (e.g., generating the virtual space 155 at 410, initiating calibration at 420, etc.) of the calibration process 401.
  • the system 100 may return to 410 and generate the virtual space 155, but while navigating the calibration phantom 149-a (e.g., tissue phantom) with the imaging device 112.
  • the system 100 may regenerate the virtual space 155 based on images captured by the imaging device 112 as described herein, but the images may be associated with or include calibration phantom 149-a (e.g., tissue phantom).
  • the system 100 may again return to 410 in response to a ‘Yes’ decision at any of 455 through 457 and generate the virtual space 155, while imaging the calibration phantom 149-a (e.g., tissue phantom) with the imaging device 112.
  • the system 100 may continue to provide navigation information (e.g., tracking information 167, etc.).
  • the system 100 may monitor for one or more events in which the instrument 145 (or portion of the instrument 145) intersects a surface (e.g., a plane, a volume, etc.) of the virtual space 155 as described herein.
  • the system 100 may determine, from the event detected at 460, whether recalibration is to be performed.
  • the system 100 may detect the amount of distortion in the navigated volume 150 (e.g., discrepancies between navigation data associated with the instrument 145 and imaging data associated with the instrument 145) based on the event detected at 460. Based on the amount of distortion detected by the system 100, the system 100 may return to 435 (e.g., for recalibration) or refrain from returning to 435 (e.g., abstain from performing recalibration).
  • the system 100 may determine (at 465) whether to perform recalibration, at any occurrence of an event detected at 460. For example, the system 100 may perform recalibration at each occurrence of an event detected at 460, at each nth occurrence of the event, or at each nth occurrence of the event within a temporal duration.
  • the example aspects of the process flow 400 described herein support automatic and continuous (or semi-continuous) recalibration by the system 100.
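  • Tying the preceding steps together, the outer loop for background recalibration might look like the sketch below. Each behavior (frame detection, tracked-tip lookup, discrepancy measurement, and the recalibration itself) is injected as a callable because the disclosure leaves those implementations open; the names and the 3 mm threshold are assumptions.

```python
def continuous_recalibration_loop(frames, detect_in_frame, tip_in_nav_at,
                                  discrepancy_mm, recalibrate, threshold_mm=3.0):
    """Illustrative outer loop for background recalibration: for each imaging
    frame, look for an intersection event, accumulate the matched point pair,
    and trigger recalibration only when the measured discrepancy exceeds the
    threshold. All behavior is injected via callables so the loop itself stays
    independent of any particular detector or tracker."""
    pairs = []
    for timestamp, frame in frames:
        coords = detect_in_frame(frame)
        if coords is None:
            continue                                  # no event in this frame
        pairs.append((coords, tip_in_nav_at(timestamp)))
        if discrepancy_mm(pairs[-1]) > threshold_mm:
            recalibrate(pairs)                        # runs in the background
    return pairs

# Trivial stand-ins, just to show the control flow.
pairs = continuous_recalibration_loop(
    frames=[(0.0, "frame0"), (0.1, "frame1")],
    detect_in_frame=lambda f: (100, 200),
    tip_in_nav_at=lambda t: (0.0, 0.0, 0.0),
    discrepancy_mm=lambda pair: 1.0,
    recalibrate=lambda pairs: None)
print(len(pairs))                                     # 2 events processed
```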
  • Fig. 5 illustrates an example of a process flow 500 in accordance with aspects of the present disclosure.
  • process flow 500 may implement aspects of a computing device 102, an imaging device 112, a robot 114, and a navigation system 118 described with reference to Figs. 1 through 3.
  • the operations may be performed in a different order than the order shown, or the operations may be performed in different orders or at different times. Certain operations may also be left out of the process flow 500, or other operations may be added to the process flow 500.
  • any of the operations of process flow 500 may be performed by any device (e.g., a computing device 102, an imaging device 112, a robot 114, navigation system 118, etc.) of the system 100 described herein.
  • the process flow 500 may include generating a navigation space based on one or more tracking signals.
  • the process flow 500 may include generating a virtual space including at least a portion of a calibration phantom based on one or more images, wherein the one or more images are generated in response to transmitting one or more imaging signals.
  • the calibration phantom includes an ultrasound conductive material.
  • the calibration phantom may include an ultrasound transmitting volume (e.g., water bath).
  • the calibration phantom includes a tissue phantom included in a body of a subject, wherein the tissue phantom and the subject are associated with a surgical procedure.
  • the virtual space corresponds to a field of view of the imaging device.
  • the process flow 500 may include identifying a set of coordinates in the virtual space in response to an event in which at least a portion of a tracked device in the calibration phantom is detected in the one or more images.
  • the set of coordinates are identified in the virtual space in response to at least the portion of the tracked device intersecting a plane of the virtual space at the set of coordinates.
  • the navigation space and the tracked device are associated with at least one of: an optical tracking system, an acoustic tracking system, an electromagnetic tracking system, a magnetic tracking system, a radar tracking system, an inertial measurement unit (IMU) based tracking system, and a computer vision based tracking system.
  • the process flow 500 may include calibrating a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in response to the event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the set of coordinates and temporal information associated with the event.
  • calibrating the first coordinate system with respect to the second coordinate system is in response to one or more occurrences of the event. In some aspects, calibrating the first coordinate system with respect to the second coordinate system is absent pausing the surgical procedure.
  • calibrating the first coordinate system with respect to the second coordinate system is based on: beam thickness (e.g., ultrasound beam thickness), beam shape (e.g., ultrasound beam shape), or both of the one or more signals transmitted by the imaging device; pose information of the portion of the tracked device in association with an intersection between the portion of the tracked device and a plane of the virtual space; and one or more properties of the portion of the tracked device.
  • the tracked device is comprised in at least a portion of an instrument
  • the process flow 500 may include detecting, in the one or more images, one or more landmarks corresponding to at least a portion of the tracked device, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the one or more landmarks.
  • calibrating the first coordinate system with respect to the second coordinate system includes verifying a registration accuracy between the first coordinate system and the second coordinate system.
  • the process flow 500 may include outputting guidance information associated with positioning the imaging device, the tracked device, or both in association with calibrating the first coordinate system with respect to the second coordinate system.
  • the process flow 500 may include detecting one or more discrepancies between first tracking data corresponding to the tracked device in association with the navigation space and second tracking data corresponding to the tracked device in association with the virtual space. In some aspects, the process flow 500 may include generating a notification associated with the one or more discrepancies, performing one or more operations associated with compensating for the one or more discrepancies, or both.
  • the process flow 500 (and/or one or more operations thereof) may be carried out or otherwise performed, for example, by at least one processor.
  • the at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above.
  • the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118).
  • a processor other than any processor described herein may also be used to execute the process flow 500.
  • the at least one processor may perform operations of the process flow 500 by executing elements stored in a memory such as the memory 106.
  • the elements stored in memory and executed by the processor may cause the processor to execute one or more operations of a function as shown in the process flow 500.
  • One or more portions of the process flow 500 may be performed by the processor executing any of the contents of memory, such as image processing 120, a segmentation 122, a transformation 124, and/or a registration 128.
  • the present disclosure encompasses methods with fewer than all of the steps identified herein (and the corresponding description of respective process flows), as well as methods that include additional steps beyond those identified in the figures and process flows described herein.
  • the present disclosure also encompasses methods that include one or more steps from one method described herein, and one or more steps from another method described herein. Any correlation described herein may be or include a registration or any other correlation.
  • Example aspects of the present disclosure include the following.
  • A system including: a processor; and a memory storing instructions that, when executed by the processor, cause the processor to: generate a navigation space based on one or more tracking signals; generate a virtual space including at least a portion of a calibration phantom based on one or more images, wherein the one or more images are generated in response to one or more signals transmitted by an imaging device; identify a set of coordinates in the virtual space in response to an event in which at least a portion of a tracked device in the calibration phantom is detected in the one or more images; and calibrate a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in response to the event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the set of coordinates and temporal information associated with the event.
  • calibrating the first coordinate system with respect to the second coordinate system is in response to one or more occurrences of the event; and calibrating the first coordinate system with respect to the second coordinate system is absent pausing a surgical procedure.
  • calibrating the first coordinate system with respect to the second coordinate system is based on: beam thickness, beam shape, or both of the one or more signals transmitted by the imaging device; pose information of the portion of the tracked device in association with an intersection between the portion of the tracked device and a plane of the virtual space; and one or more properties of the portion of the tracked device.
  • the calibration phantom includes: a water bath; or a tissue phantom included in a body of a subject, wherein the tissue phantom and the subject are associated with a surgical procedure.
  • instructions are further executable by the processor to: output guidance information associated with positioning the imaging device, the tracked device, or both in association with calibrating the first coordinate system with respect to the second coordinate system.
  • the tracked device is included in at least a portion of an instrument
  • the instructions are further executable by the processor to: detect, in the one or more images, one or more landmarks corresponding to at least a portion of the tracked device, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the one or more landmarks.
  • calibrating the first coordinate system with respect to the second coordinate system includes verifying a registration accuracy between the first coordinate system and the second coordinate system.
  • the navigation space and the tracked device are associated with at least one of: an optical tracking system, an acoustic tracking system, an electromagnetic tracking system, a radar tracking system, an inertial measurement unit (IMU) based tracking system, and a computer vision based tracking system.
  • An imaging system including an imaging device; a tracking system including: a transmission device; and a tracked device; a calibration phantom; a processor; and a memory storing data that, when processed by the processor, cause the processor to: generate a navigation space based on one or more tracking signals emitted by the transmission device; generate a virtual space including at least a portion of the calibration phantom based on one or more images generated by the imaging system, wherein the one or more images are generated in response to one or more signals transmitted by the imaging device; identify a set of coordinates in the virtual space in response to an event in which at least a portion of the tracked device is detected in the one or more images; and calibrate a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in response to the event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the set of coordinates and temporal information associated with the event.
  • calibrating the first coordinate system with respect to the second coordinate system is in response to one or more occurrences of the event; and calibrating the first coordinate system with respect to the second coordinate system is absent pausing a surgical procedure.
  • the calibration phantom includes: a water bath; or a tissue phantom included in a body of a subject, wherein the tissue phantom and the subject are associated with a surgical procedure.
  • calibrating the first coordinate system with respect to the second coordinate system is based on: beam thickness, beam shape, or both of the one or more signals transmitted by the imaging device; pose information of the portion of the tracked device in association with an intersection between the portion of the tracked device and a plane of the virtual space; and one or more properties of the portion of the tracked device.
  • a method including: generating a navigation space based on one or more tracking signals; generating a virtual space including at least a portion of a calibration phantom based on one or more images, wherein the one or more images are generated in response to transmitting one or more imaging signals; identifying a set of coordinates in the virtual space in response to an event in which at least a portion of a tracked device in the calibration phantom is detected in the one or more images; and calibrating a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in response to the event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the set of coordinates and temporal information associated with the event.
  • calibrating the first coordinate system with respect to the second coordinate system is in response to one or more occurrences of the event; and calibrating the first coordinate system with respect to the second coordinate system is absent pausing a surgical procedure.
  • the calibration phantom includes: a water bath; or a tissue phantom included in a body of a subject, wherein the tissue phantom and the subject are associated with a surgical procedure.
  • each of the expressions “at least one of A, B and C,” “at least one of A, B, or C,” “one or more of A, B, and C,” “one or more of A, B, or C,” “A, B, and/or C,” and “A, B, or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
  • aspects of the present disclosure may take the form of an implementation that is entirely hardware, an implementation that is entirely software (including firmware, resident software, micro-code, etc.) or an implementation combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module,” or “system.” Any combination of one or more computer-readable medium(s) may be utilized.
  • the computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
  • a computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
  • a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
  • a computer-readable signal medium may be any computer-readable medium that is not a computer-readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including, but not limited to, wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
  • Example 1 A system comprising: a processor; and a memory storing instructions that, when executed by the processor, cause the processor to: generate a navigation space based on one or more tracking signals; generate a virtual space comprising at least a portion of a calibration phantom based on one or more images, wherein the one or more images are generated in response to one or more signals transmitted by an imaging device; identify a set of coordinates in the virtual space in response to an event in which at least a portion of a tracked device in the calibration phantom is detected in the one or more images; and calibrate a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in response to the event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the set of coordinates and temporal information associated with the event.
  • Example 2. The system of example 1, wherein the set of coordinates are identified in the virtual space in response to at least the portion of the tracked device intersecting a surface of the virtual space at the set of coordinates.
  • Example 3 The system of example 1, wherein: calibrating the first coordinate system with respect to the second coordinate system is in response to one or more occurrences of the event; and calibrating the first coordinate system with respect to the second coordinate system is absent pausing a surgical procedure.
  • Example 4 The system of example 1 , wherein calibrating the first coordinate system with respect to the second coordinate system is based on: beam thickness, beam shape, or both of the one or more signals transmitted by the imaging device; pose information of the portion of the tracked device in association with an intersection between the portion of the tracked device and a plane of the virtual space; and one or more properties of the portion of the tracked device.
  • Example 5 The system of example 1, wherein the calibration phantom comprises: ultrasound conductive material; or a tissue phantom comprised in a body of a subject, wherein the tissue phantom and the subject are associated with a surgical procedure.
  • Example 6 The system of example 1 , wherein the instructions are further executable by the processor to: output guidance information associated with positioning the imaging device, the tracked device, or both in association with calibrating the first coordinate system with respect to the second coordinate system.
  • Example 7 The system of example 1, wherein the tracked device is comprised in at least a portion of an instrument, and the instructions are further executable by the processor to: detect, in the one or more images, one or more landmarks corresponding to at least a portion of the tracked device, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the one or more landmarks.
  • Example 8 The system of example 1, wherein calibrating the first coordinate system with respect to the second coordinate system comprises verifying a registration accuracy between the first coordinate system and the second coordinate system.
  • Example 9 The system of example 1, wherein the instructions are further executable by the processor to: detect one or more discrepancies between first tracking data corresponding to the tracked device in association with the navigation space and second tracking data corresponding to the tracked device in association with the virtual space; and generate a notification associated with the one or more discrepancies, perform one or more operations associated with compensating for the one or more discrepancies, or both.
  • Example 10 The system of example 1, wherein the virtual space corresponds to a field of view of the imaging device.
  • Example 11 The system of example 1 , wherein the navigation space and the tracked device are associated with at least one of: an optical tracking system, an acoustic tracking system, an electromagnetic tracking system, a magnetic tracking system, a radar tracking system, an inertial measurement unit (IMU) based tracking system, and a computer vision based tracking system.
  • Example 12 A system comprising: an imaging system comprising an imaging device; a tracking system comprising: a transmission device; and a tracked device; a calibration phantom; a processor; and a memory storing data that, when processed by the processor, cause the processor to: generate a navigation space based on one or more tracking signals emitted by the transmission device; generate a virtual space comprising at least a portion of the calibration phantom based on one or more images generated by the imaging system, wherein the one or more images are generated in response to one or more signals transmitted by the imaging device; identify a set of coordinates in the virtual space in response to an event in which at least a portion of the tracked device is detected in the one or more images; and calibrate a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in response to the event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the set of coordinates and temporal information associated with the event.
  • Example 13 The system of example 12, wherein the set of coordinates are identified in the virtual space in response to at least the portion of the tracked device intersecting a surface of the virtual space at the set of coordinates.
  • Example 14 The system of example 12, wherein: calibrating the first coordinate system with respect to the second coordinate system is in response to one or more occurrences of the event; and calibrating the first coordinate system with respect to the second coordinate system is absent pausing a surgical procedure.
  • Example 15 The system of example 12, wherein the calibration phantom comprises: ultrasound conductive material; or a tissue phantom included in a body of a subject, wherein the tissue phantom and the subject are associated with a surgical procedure.
  • Example 16 The system of example 12, wherein calibrating the first coordinate system with respect to the second coordinate system is based on: beam thickness, beam shape, or both of the one or more signals transmitted by the imaging device; pose information of the portion of the tracked device in association with an intersection between the portion of the tracked device and a plane of the virtual space; and one or more properties of the portion of the tracked device.
  • Example 17 A method comprising: generating a navigation space based on one or more tracking signals; generating a virtual space comprising at least a portion of a calibration phantom based on one or more images, wherein the one or more images are generated in response to transmitting one or more imaging signals; identifying a set of coordinates in the virtual space in response to an event in which at least a portion of a tracked device in the calibration phantom is detected in the one or more images; and calibrating a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in response to the event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the set of coordinates and temporal information associated with the event.
  • Example 18 The method of example 17, wherein the set of coordinates are identified in the virtual space in response to at least the portion of the tracked device intersecting a plane of the virtual space at the set of coordinates.
  • Example 19 The method of example 17, wherein: calibrating the first coordinate system with respect to the second coordinate system is in response to one or more occurrences of the event; and calibrating the first coordinate system with respect to the second coordinate system is absent pausing a surgical procedure.
  • Example 20 The method of example 17, wherein the calibration phantom comprises: ultrasound conductive material; or a tissue phantom comprised in a body of a subject, wherein the tissue phantom and the subject are associated with a surgical procedure.
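
Examples 12-20 describe an event-driven calibration: each event in which the tracked device is detected in an ultrasound image yields one point expressed in both spaces, namely the identified set of coordinates in the virtual (image) space and, via the tracker pose captured at the matching time, the same physical point in the navigation space. Accumulating several such point pairs reduces calibrating the first coordinate system with respect to the second coordinate system to a least-squares rigid registration. The Python sketch below is illustrative only and is not part of the disclosure; the function and variable names (estimate_rigid_transform, virtual_pts, nav_pts) are hypothetical, and a practical implementation would additionally account for the beam thickness, beam shape, and tracked-device properties noted in Example 16.

```python
import numpy as np

def estimate_rigid_transform(virtual_pts, nav_pts):
    """Least-squares rotation R and translation t mapping virtual-space points
    onto their navigation-space counterparts (Kabsch/Umeyama, no scaling)."""
    P = np.asarray(virtual_pts, dtype=float)   # N x 3 points in the virtual (image) space
    Q = np.asarray(nav_pts, dtype=float)       # N x 3 corresponding points in the navigation space
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p_mean).T @ (Q - q_mean)          # 3 x 3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t

def calibration_residual(R, t, virtual_pts, nav_pts):
    """Root-mean-square fit error, usable as a go/no-go check before accepting
    an updated calibration during a procedure."""
    P = np.asarray(virtual_pts, dtype=float)
    Q = np.asarray(nav_pts, dtype=float)
    return float(np.sqrt(np.mean(np.sum((P @ R.T + t - Q) ** 2, axis=1))))
```

With three or more non-collinear detection events the transform is fully determined; further events overconstrain the fit, and monitoring the residual allows the calibration to be refreshed as events accumulate, consistent with Examples 14 and 19, without pausing the procedure.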

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Robotics (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

The invention relates to a system comprising an imaging system and a tracking system. The system generates a navigation space based on one or more tracking signals emitted by a transmission device. The system generates a virtual space comprising at least a portion of a calibration phantom based on one or more images generated by the imaging system. The system identifies a set of coordinates in the virtual space in response to an event in which at least a portion of the tracked device is detected in the one or more images. The system calibrates a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in response to the event. Calibrating the first coordinate system with respect to the second coordinate system is based on the set of coordinates and temporal information associated with the event.
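
The temporal information associated with each event corresponds, in practice, to pairing an image-based detection with the tracking sample acquired closest in time, since the imaging and tracking streams generally run at different rates. The sketch below shows one minimal way to do this pairing; the names nearest_tracker_sample and max_skew, and the 20 ms tolerance, are assumptions for illustration rather than values from the disclosure.

```python
from bisect import bisect_left

def nearest_tracker_sample(event_time, tracker_times, tracker_poses, max_skew=0.02):
    """Return the tracker pose whose timestamp is closest to an ultrasound
    detection event, or None when the clocks disagree by more than max_skew
    seconds. tracker_times must be sorted in ascending order."""
    i = bisect_left(tracker_times, event_time)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(tracker_times)]
    if not candidates:
        return None
    j = min(candidates, key=lambda k: abs(tracker_times[k] - event_time))
    if abs(tracker_times[j] - event_time) > max_skew:
        return None  # stale sample: skip this event rather than bias the calibration
    return tracker_poses[j]
```

Each accepted pairing contributes one virtual-space/navigation-space point pair to the rigid registration sketched after Example 20 above.
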
PCT/IB2024/055835 2023-06-15 2024-06-14 System and method to register and calibrate ultrasound probe for navigation in real time Pending WO2024257035A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202363521144P 2023-06-15 2023-06-15
US63/521,144 2023-06-15
US18/676,263 US20240415496A1 (en) 2023-06-15 2024-05-28 System and method to register and calibrate ultrasound probe for navigation in real time
US18/676,263 2024-05-28

Publications (1)

Publication Number Publication Date
WO2024257035A1 true WO2024257035A1 (fr) 2024-12-19

Family

ID=91621102

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2024/055835 2023-06-15 2024-06-14 System and method to register and calibrate ultrasound probe for navigation in real time Pending WO2024257035A1 (fr)

Country Status (1)

Country Link
WO (1) WO2024257035A1 (fr)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090221908A1 (en) * 2008-03-01 2009-09-03 Neil David Glossop System and Method for Alignment of Instrumentation in Image-Guided Intervention
WO2013140315A1 * 2012-03-23 2013-09-26 Koninklijke Philips N.V. Localized ultrasound calibration for surgical intervention

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CHEN ELVIS C ET AL: "Guided ultrasound calibration: where, how, and how many calibration fiducials", INTERNATIONAL JOURNAL OF COMPUTER ASSISTED RADIOLOGY AND SURGERY, SPRINGER, DE, vol. 11, no. 6, 2 April 2016 (2016-04-02), pages 889 - 898, XP035942137, ISSN: 1861-6410, [retrieved on 20160402], DOI: 10.1007/S11548-016-1390-7 *
WEN TIEXIANG ET AL: "A Novel Ultrasound Probe Spatial Calibration Method Using a Combined Phantom and Stylus", ULTRASOUND IN MEDICINE AND BIOLOGY, NEW YORK, NY, US, vol. 46, no. 8, 21 May 2020 (2020-05-21), pages 2079 - 2089, XP086203044, ISSN: 0301-5629, [retrieved on 20200521], DOI: 10.1016/J.ULTRASMEDBIO.2020.03.018 *

Similar Documents

Publication Publication Date Title
US20220395342A1 (en) Multi-arm robotic systems and methods for monitoring a target or performing a surgical procedure
US20230389991A1 (en) Spinous process clamp registration and methods for using the same
US20220125526A1 (en) Systems and methods for segmental tracking
CN116437866A (zh) Method, apparatus, and system for generating an image based on a computed robotic arm position
CN118648066A (zh) Systems, methods, and devices for providing an augmented display
US20240415496A1 (en) System and method to register and calibrate ultrasound probe for navigation in real time
US12094128B2 (en) Robot integrated segmental tracking
US20230346492A1 (en) Robotic surgical system with floating patient mount
WO2024257035A1 (fr) System and method to register and calibrate ultrasound probe for navigation in real time
WO2025122631A1 (fr) System and method for automatic detection and selection of 3D ultrasound points for registration of an ultrasound probe for navigation
US12310676B2 (en) Navigation at ultra low to high frequencies
CN118102968A (zh) Systems, devices, and methods for robotic placement of electrodes for anatomical imaging
US20240156531A1 (en) Method for creating a surgical plan based on an ultrasound view
US12475987B2 (en) Robotically-assisted drug delivery
WO2024229651A1 (fr) Intelligent positioning of a robotic arm cart
US20230401766A1 (en) Systems, methods, and devices for generating a corrected image
US20240382169A1 (en) Long image multi-field of view preview
WO2025079075A1 (fr) Following navigation camera
WO2025141396A1 (fr) System and method for orienting the display of a probe for real-time navigation
WO2025146597A1 (fr) Methods and systems for adding annotations preoperatively and in real time in a navigation space
CN118647331A (zh) Systems and devices for generating a hybrid image
WO2023148586A1 (fr) Systems, methods, and devices for tracking one or more objects
CN118613830A (zh) Systems, methods, and devices for reconstructing a three-dimensional representation
EP4153085A1 (fr) System and method for independently positioning an ablation tool and an imaging device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24735336

Country of ref document: EP

Kind code of ref document: A1