WO2025122631A1 - System and method for automatic ultrasound 3D-point detection and selection for ultrasound probe registration for navigation
- Publication number
- WO2025122631A1 (PCT/US2024/058483)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- protrusion
- imaging device
- coordinate system
- navigation
- event
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Clinical applications
- A61B8/0833—Clinical applications involving detecting or locating foreign bodies or organic structures
- A61B8/0841—Clinical applications involving detecting or locating foreign bodies or organic structures for locating instruments
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/30—Surgical robots
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/42—Details of probe positioning or probe attachment to the patient
- A61B8/4245—Details of probe positioning or probe attachment to the patient involving determining the position of the probe, e.g. with respect to an external reference frame or to the patient
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/58—Testing, adjusting or calibrating the diagnostic device
- A61B8/587—Calibration phantoms
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2048—Tracking techniques using an accelerometer or inertia sensor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2051—Electromagnetic tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2055—Optical tracking systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/00—Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2046—Tracking techniques
- A61B2034/2063—Acoustic tracking systems, e.g. using ultrasound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/373—Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners
- A61B2090/3735—Optical coherence tomography [OCT]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/374—NMR or MRI
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/376—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/376—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy
- A61B2090/3762—Surgical systems with images on a monitor during operation using X-rays, e.g. fluoroscopy using computed tomography systems [CT]
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B90/00—Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
- A61B90/36—Image-producing devices or illumination devices not otherwise provided for
- A61B90/37—Surgical systems with images on a monitor during operation
- A61B2090/378—Surgical systems with images on a monitor during operation using ultrasound
Definitions
- the present disclosure is generally directed to navigation and ultrasound imaging and relates more particularly to calibrating an ultrasound probe for navigation.
- Imaging devices and navigation systems may assist a surgeon or other medical provider in carrying out a surgical procedure. Imaging may be used by a medical provider for visual guidance in association with diagnostic and/or therapeutic procedures.
- Navigation systems may be used for tracking objects (e.g., instruments, imaging devices, etc.) associated with carrying out the surgical procedure.
- Example aspects of the present disclosure include:
- a system including: a processor; and a memory storing instructions that, when executed by the processor, cause the processor to: generate a navigation space based on one or more tracking signals from a calibration phantom and an imaging device; generate a virtual space including at least a portion of the calibration phantom including a protrusion pair or a single protrusion based on one or more images, wherein the one or more images are generated by one or more signals transmitted by the imaging device; identify sets of coordinates in the virtual space based on an imaging device projection intersecting the protrusion pair or intersecting the single protrusion in the calibration phantom; determine an optimal set of coordinates in the virtual space based on a first event in which the imaging device projection intersects the midpoint of each protrusion of the protrusion pair or based on a second event in which the imaging device projection intersects a tip portion of the single protrusion; and calibrate a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in response to the first event, wherein
- the instructions are further executable by the processor to output guidance information associated with positioning the imaging device in association with calibrating the first coordinate system with respect to the second coordinate system.
- calibrating the first coordinate system with respect to the second coordinate system is in response to one occurrence of the first event or the second event, and calibrating the first coordinate system with respect to the second coordinate system is performed without pausing a surgical procedure.
- calibrating the first coordinate system with respect to the second coordinate system is based on beam thickness, beam shape and/or signal frequency.
- the calibration phantom comprises: ultrasound conductive material; or a tissue phantom comprised in a body of a subject, wherein the tissue phantom and the subject are associated with a surgical procedure.
- the protrusion pair is provided perpendicular to the imaging device projection and the single protrusion is provided parallel to the imaging device projection.
- a system including: an imaging system including an imaging device; a tracking system including a transmission device; a calibration phantom; a processor; and a memory storing data that, when processed by the processor, cause the processor to: generate a navigation space based on one or more tracking signals; generate a virtual space including at least a portion of the calibration phantom including a protrusion pair or a single protrusion based on one or more images, wherein the one or more images are generated by one or more signals transmitted by the imaging device; identify sets of coordinates in the virtual space based on an imaging device projection intersecting the protrusion pair or intersecting the single protrusion in the calibration phantom; determine an optimal set of coordinates in the virtual space based on a first event in which the imaging device projection intersects the midpoint of each protrusion of the protrusion pair or based on a second event in which the imaging device projection intersects a tip portion of the single protrusion; and calibrate a first coordinate system associated with the virtual space with respect to a second
- a method including: generating a navigation space based on one or more tracking signals from a calibration phantom and an imaging device; generating a virtual space including at least a portion of a calibration phantom including a protrusion pair or a single protrusion based on one or more images, wherein the one or more images are generated by one or more signals transmitted by the imaging device; identifying sets of coordinates in the virtual space based on an imaging device projection intersecting the protrusion pair or intersecting the single protrusion in the calibration phantom; determining an optimal set of coordinates in the virtual space based on a first event in which the imaging device projection intersects the midpoint of each protrusion of the protrusion pair or based on a second event in which the imaging device projection intersects a tip portion of the single protrusion; and calibrating a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in response to the first event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the
- any of the aspects herein further including: providing a plurality of protrusion pairs; and determining the optimal set of coordinates in the virtual space based on a third event in which the imaging device projection intersects each protrusion of the protrusion pair of more than one protrusion pair of the plurality of protrusion pairs.
- any of the aspects herein further comprising calibrating the first coordinate system associated with the virtual space with respect to the second coordinate system associated with the navigation space in response to the second event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the optimal set of coordinates and temporal information associated with the second event.
- calibrating the first coordinate system with respect to the second coordinate system is based on the optimal set of coordinates and temporal information associated with the second event.
- Fig. 1A illustrates an example of a system in accordance with aspects of the present disclosure.
- Fig. 1B illustrates an example of a system in accordance with aspects of the present disclosure.
- Fig. 1C illustrates an example of a system in accordance with aspects of the present disclosure.
- Fig. 2A illustrates an example implementation of a system in accordance with aspects of the present disclosure.
- Fig. 2B illustrates a view of an imaging device and a calibration phantom used in the system of Fig. 2A in accordance with aspects of the present disclosure.
- Fig. 2C illustrates a view of a calibration phantom used in the system of Fig. 2A in accordance with aspects of the present disclosure.
- Fig. 3A illustrates example views of an ultrasound beam in accordance with aspects of the present disclosure.
- Fig. 3B illustrates example views of an imaging device imaging a calibration phantom having a protrusion pair provided therein with accompanying images of the calibration phantom formed by imaging in accordance with aspects of the present disclosure.
- Fig. 3C illustrates an example view of an ultrasound beam projection in accordance with aspects of the present disclosure.
- Fig. 3D illustrates another example view of an ultrasound beam projection in accordance with aspects of the present disclosure.
- Fig. 3E illustrates example implementations for a protrusion for a protrusion pair in accordance with aspects of the present disclosure.
- Fig. 3F illustrates example implementations for a tip of a protrusion for a protrusion pair in accordance with aspects of the present disclosure.
- Fig. 3G illustrates an example implementation for a tip of a protrusion for a protrusion pair imaged with an ultrasound beam in accordance with aspects of the present disclosure.
- Fig. 3H illustrates example views of a protrusion within a protrusion holder in accordance with aspects of the present disclosure.
- Fig. 3I illustrates example views of an imaging device imaging a protrusion within a protrusion holder with accompanying images formed by imaging in accordance with aspects of the present disclosure.
- Fig. 4 illustrates an example method for calibrating coordinate systems in accordance with aspects of the present disclosure.
- Fig. 5 illustrates an example method for calibrating coordinate systems in accordance with aspects of the present disclosure.
- Fig. 6 illustrates an example method for calibrating coordinate systems in accordance with aspects of the present disclosure.
- the described methods, processes, and techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Alternatively or additionally, functions may be implemented using machine learning models, neural networks, artificial neural networks, or combinations thereof (alone or in combination with instructions).
- Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
- processors such as one or more digital signal processors (DSPs), general purpose microprocessors (e.g., Intel Core i3, i5, i7, or i9 processors; Intel Celeron processors; Intel Xeon processors; Intel Pentium processors; AMD Ryzen processors; AMD Athlon processors; AMD Phenom processors; Apple A10 or A10X Fusion processors; Apple A11, A12, A12X, A12Z, or A13 Bionic processors; or any other general purpose microprocessors), graphics processing units (e.g., Nvidia GeForce RTX 2000-series processors, Nvidia GeForce RTX 3000-series processors, AMD Radeon RX 5000-series processors, AMD Radeon RX 6000-series processors, or any other graphics processing units), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), or other equivalent integrated or discrete logic circuitry
- proximal and distal are used in this disclosure with their conventional medical meanings, proximal being closer to the operator or user of the system, and further from the region of surgical interest in or on the patient, and distal being closer to the region of surgical interest in or on the patient, and further from the operator or user of the system.
- an ultrasound probe may be calibrated/registered relative to a navigation means (e.g., a tracked sensor, etc.) in association with navigated image acquisition.
- some systems may establish a transformation matrix that maps the six-dimensional (6D) pose (e.g., position and orientation information) of the tracked sensor to the 6D pose of an ultrasound probe.
- Some systems may map the 6D pose of the tracked sensor to an image generated by the ultrasound probe or to the ultrasound beam of the ultrasound probe.
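- as a minimal illustrative sketch (not taken from this disclosure), the mapping described above can be expressed as a composition of homogeneous 4x4 transforms; the variable names (e.g., T_world_tracker, T_tracker_image) and numeric values below are assumptions used only for illustration.

```python
# Hypothetical sketch: composing the tracked sensor's pose with a fixed
# tracker-to-image calibration transform to place an ultrasound pixel in the
# navigation coordinate system. Names and values are illustrative assumptions.
import numpy as np

def pose_to_matrix(position, rotation):
    """Build a 4x4 homogeneous transform from a translation and a 3x3 rotation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = position
    return T

# Example 6D pose of the tracked sensor, as reported by the navigation system.
T_world_tracker = pose_to_matrix(np.array([10.0, 25.0, -40.0]), np.eye(3))

# Fixed calibration being estimated: tracker frame -> ultrasound image frame.
T_tracker_image = pose_to_matrix(np.array([0.0, 12.5, 3.0]), np.eye(3))

# Pose of the ultrasound image plane in the navigation coordinate system.
T_world_image = T_world_tracker @ T_tracker_image

def pixel_to_world(u, v, spacing_mm, T_world_image):
    """Map an image pixel (u, v) into navigation-space millimeters."""
    p_image = np.array([u * spacing_mm[0], v * spacing_mm[1], 0.0, 1.0])
    return (T_world_image @ p_image)[:3]

print(pixel_to_world(128, 256, (0.2, 0.2), T_world_image))
```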
- some calibration methods may be tedious, time consuming, and error prone.
- some calibration phantoms utilized for calibrating the ultrasound probe to the navigation system are costly and are prone to decay with time (e.g., due to the degradation of hydrogels implemented in some calibration phantoms).
- wires provided within the calibration phantom can only be detected with ultrasound from very specific angles, making it difficult to acquire sufficient independent images to use for calibration. Using such calibration phantoms requires multiple images taken from multiple sides of the calibration phantom and at very specific angles in order to detect the wires. This increases the complexity and the amount of time needed to perform the calibration and introduces sources of error.
- Instances may occur in which a surgical team is unaware that the ultrasound probe has lost calibration with the navigation system. In some cases, even if the surgical team is aware of the loss in calibration, the team may be unwilling to recalibrate the ultrasound probe (e.g., due to lack of time or resources). Undetected or unaddressed loss in calibration during a medical procedure (e.g., due to deformation, tool drop/hit, etc.) may result in surgical errors. In some other cases, metal and other materials present in the environment may cause distortion to an electromagnetic field generated by the navigation system in association with tracking an object, and such distortion may result in surgical errors.
- systems and techniques described herein may support dynamic initial calibration and dynamic recalibration of ultrasound probes (also referred to herein as ultrasonic probes) for navigation.
- the systems and techniques may incorporate an ultrasound probe connected to a main application/navigation system (also referred to herein as a navigated surgery system).
- the systems and techniques may include electromagnetic navigation of the ultrasound probe using trackers/sensors coupled to the ultrasound probe and an emitter capable of emitting electromagnetic signals.
- the systems and techniques may include an electromagnetic tracked system (e.g., a calibration phantom).
- the calibration phantom may be a phantom with a configuration of a plurality of paired, opposed, pointed protrusions (e.g., peg pairs or protrusion pairs) submerged in a water bath and arranged at known locations within the calibration phantom.
- the protrusion pairs have a cylindrical shape.
- the protrusion pairs are arranged in a geometric pattern. The tip of each protrusion has coordinates that are well established relative to a reference point (e.g., the calibration phantom’s Cartesian origin).
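- as a minimal illustrative sketch (coordinate values are hypothetical, not from this disclosure), the known tip coordinates of each protrusion pair can be stored relative to the phantom’s origin so that the gap midpoints are available for calibration:

```python
# Hypothetical sketch: known protrusion-pair tip coordinates expressed in the
# calibration phantom's coordinate system (millimeters). Values are made up.
import numpy as np

# Each pair: (tip of lower protrusion, tip of the opposed upper protrusion).
PROTRUSION_PAIRS_MM = {
    "pair_0": (np.array([20.0, 30.0, 15.0]), np.array([20.0, 30.0, 25.0])),
    "pair_1": (np.array([50.0, 30.0, 10.0]), np.array([50.0, 30.0, 30.0])),
    "pair_2": (np.array([80.0, 60.0, 18.0]), np.array([80.0, 60.0, 22.0])),
}

def pair_midpoint(pair_id):
    """Midpoint of the gap between the two opposed tips of a protrusion pair."""
    lower, upper = PROTRUSION_PAIRS_MM[pair_id]
    return (lower + upper) / 2.0

print(pair_midpoint("pair_0"))
```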
- the calibration phantom may be a water bath or a tissue phantom, but is not limited thereto.
- the example calibration phantoms described herein are stable, inexpensive compared to some other calibration phantoms, and electromagnetic friendly.
- the calibration phantom may be free of materials that may interfere with electromagnetic navigation and tracking.
- the calibration phantom may be a gel (e.g., hydrogel).
- the calibration phantom can be suitable for various types of navigation and tracking, including optical, acoustic, magnetic, radar, inertial, and computer vision-based tracking.
- Examples of the techniques described herein may include moving or positioning a tracked device (e.g., the ultrasound probe) adjacent to or on top of the (also tracked) calibration phantom and capturing multiple ultrasound images within the calibration phantom of the ultrasound beam of the ultrasound probe intersecting with one or a plurality of protrusion pairs. For each iteration of the tracked device moving within the calibration phantom, the captured ultrasound images of the protrusion pairs intersecting with the ultrasound beam are identified. After a number of samples meeting a predetermined threshold have been taken, a tracker-to-probe (or tracker-to-image) transformation matrix is calculated.
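- one possible (non-limiting) realization of the transformation-matrix calculation is a least-squares rigid fit over the paired samples; the sketch below assumes corresponding 3D points have already been collected in image space and tracker space, and the MIN_SAMPLES threshold is an illustrative assumption.

```python
# Hypothetical sketch: estimating a rigid tracker-to-image transform from
# corresponding point samples (e.g., protrusion-pair midpoints detected in the
# ultrasound image paired with their known tracker-space locations) using a
# standard Kabsch/SVD least-squares fit.
import numpy as np

def rigid_fit(src, dst):
    """Least-squares rigid transform mapping src (N, 3) onto dst (N, 3)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = dst_c - R @ src_c
    return T

MIN_SAMPLES = 6                         # assumed predetermined threshold

def maybe_calibrate(image_points, tracker_points):
    """Return a 4x4 transform once enough paired samples have been gathered."""
    if len(image_points) < MIN_SAMPLES:
        return None
    return rigid_fit(np.asarray(image_points), np.asarray(tracker_points))
```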
- the techniques may include recording a video file of the ultrasound imaging concurrently with the tracking data and processing the video file (e.g., using a software script). Based on the processing of the video file, the techniques may include identifying a temporal instance at which a portion (e.g., tip, or gap between two opposed protrusions’ tips, etc.) of the protrusion pair enters the ultrasound view. The systems and techniques may calibrate the ultrasound view with respect to the tracking space.
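- a minimal sketch of this video/tracking synchronization step is shown below; the frame detector, timestamps, and threshold are hypothetical placeholders rather than the specific processing described in this disclosure.

```python
# Hypothetical sketch: pairing video frames in which a protrusion tip (or the
# gap between opposed tips) becomes visible with the nearest-in-time tracking
# sample, so the calibration can use both the image and the probe pose.
import numpy as np

def tip_visible(frame):
    """Placeholder detector: a bright compact echo above a fixed threshold."""
    return frame.max() > 200            # assumes 8-bit grayscale frames

def find_tip_events(frames, frame_timestamps, tracking_samples):
    """frames: iterable of 2D arrays; tracking_samples: list of (time, pose)."""
    track_times = np.array([t for t, _ in tracking_samples])
    events = []
    for frame, t_frame in zip(frames, frame_timestamps):
        if tip_visible(frame):
            nearest = int(np.argmin(np.abs(track_times - t_frame)))
            events.append((t_frame, tracking_samples[nearest][1]))
    return events
```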
- Examples of the tracked device and the tracking space include an electromagnetic tracked device and an electromagnetic tracking space but are not limited thereto. Aspects of the present disclosure support any type of tracked devices (e.g., sensors) and tracking spaces that may be implemented by a navigation system.
- the transformation matrix is tested for validation. If the results are not fully satisfactory, the process is continued or repeated based on analytic results and/or user preferences. According to embodiments of the present disclosure, the process can be repeated by changing the ultrasound settings (e.g., the ultrasound frequency, the ultrasound depth, etc.) to optimize calibration and ultrasound probe characterization.
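- the validation step can be illustrated with a simple residual-error check; the 1.5 mm acceptance threshold below is an assumed value, not a value specified in this disclosure.

```python
# Hypothetical sketch: validating a candidate transformation matrix by the
# root-mean-square residual over the collected point pairs.
import numpy as np

def registration_error_mm(T, image_points, tracker_points):
    """RMS distance between transformed image points and tracker points."""
    src = np.hstack([image_points, np.ones((len(image_points), 1))])
    residuals = (T @ src.T).T[:, :3] - tracker_points
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))

def validate(T, image_points, tracker_points, max_error_mm=1.5):
    """Return (is_acceptable, error_mm); repeat sampling if not acceptable."""
    err = registration_error_mm(T, np.asarray(image_points),
                                np.asarray(tracker_points))
    return err <= max_error_mm, err
```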
- ultrasound beams can be characterized (e.g., addressing axial and lateral resolution as well as side-thickness or elevation resolution).
- protrusion pair distances may be varied.
- protrusion pair distances are used to characterize ultrasound beam thickness.
- protrusion pair distances at different depths provide for a more complete and accurate calibration and ultrasound beam shape characterization.
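- as an illustrative sketch, beam thickness at a given depth can be bounded from which protrusion-pair gaps are (and are not) simultaneously visible; the gap values and visibility flags below are hypothetical.

```python
# Hypothetical sketch: if both opposed tips of a pair separated by gap g are
# visible at the same time, the elevational beam thickness at that depth is at
# least g; if a larger gap is not bridged, the thickness is below that gap.
def beam_thickness_bounds_mm(gap_visibility):
    """gap_visibility: dict of {gap_mm: both_tips_visible (bool)}."""
    visible = [g for g, seen in gap_visibility.items() if seen]
    hidden = [g for g, seen in gap_visibility.items() if not seen]
    lower = max(visible) if visible else 0.0
    upper = min(hidden) if hidden else float("inf")
    return lower, upper

print(beam_thickness_bounds_mm({2.0: True, 4.0: True, 6.0: False}))
```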
- the calibration phantom can be used with fluids other than water to improve image quality due to variations in speed of sound for the different fluids or media.
- different phantom materials can be used.
- different protrusion pair geometry, shapes, angles, etc. can be used without departing from the spirit and scope of the present disclosure.
- the systems and techniques described herein may support autonomous or semi-autonomous calibration, calibration verification with reduced calibration time compared to some other calibration techniques, and providing or outputting guidance information (e.g., tutorials, user prompts, etc.) for users on how to move or position a device (e.g., ultrasound imaging device, electromagnetic tracked device, etc.) in association with calibration.
- the systems and techniques described herein support performing multiple calibrations (e.g., an initial calibration using a water bath, one or more subsequent calibrations using a tissue phantom, etc.), in which a system may perform the calibrations autonomously (or semi-autonomously).
- the systems and techniques may perform the calibrations continuously (or semi-continuously) and/or in response to trigger criteria, aspects of which are described herein. It is to be understood that aspects described herein with respect to calibration may be applied to recalibration.
- the terms calibration and recalibration may be used interchangeably herein.
- Techniques described herein may be implemented in hardware, software, firmware, or any combination thereof that may automatically detect instrument landmarks on ultrasound images during a medical procedure.
- the techniques may include detecting landmarks of an instrument (e.g., tip of a needle during placement, distinctive features of navigated catheters, tip of a registration stylus, etc.) during the medical procedure and, using the detected instrument landmarks, automatically calibrating (or adjusting the calibration of) the ultrasound imaging device to the navigation system.
- the ultrasound imaging device may be, for example, an ultrasound probe.
- aspects of the automatic calibration techniques may provide time savings for the surgical team, an improved user experience, and increased accuracy over longer portions of medical procedures.
- Other aspects of the calibration techniques provide cost savings through the use of, as a calibration phantom, an empty container (e.g., an empty box including an electromagnetic friendly material) with a plurality of protrusion pairs.
- the calibration phantom may be filled with water for the calibration procedure, thereby resulting in cost savings compared to other materials (e.g., gels, silicones, etc.).
- using water in the calibration phantom may provide increased durability.
- the electromagnetic tracking and calibration solutions described herein support directly using electromagnetic tools implemented in some existing medical procedures.
- direct use of such existing electromagnetic tools may provide increased accuracy due to accurate tracking of electromagnetic tools by some navigation systems.
- the calibration techniques described herein may be implemented using actual tissue of a subject as a calibration phantom.
- the calibration techniques described herein may be implemented during a medical procedure associated with the actual tissue.
- for example, a device (e.g., an ablation antenna, cardiac catheter, etc.) may be imaged by the ultrasound imaging device and tracked during the medical procedure.
- the calibration techniques and calibration software described herein may include using (e.g., automatically, or in response to a user request, etc.) the ultrasound images and corresponding electromagnetic tracking information to recalibrate the registration between the ultrasound space and the electromagnetic navigation space (also referred to herein as recalibrating the ultrasound tracking registration), without interrupting the medical procedure.
- the systems and techniques described herein support recalibrating the ultrasound imaging system in the background based on automatic detection of target objects (e.g., instruments, tools, etc.) in the ultrasound images using AI/machine learning computer vision algorithms, random forest classifiers, and object detection.
- the systems and techniques support automatic registration which may be implemented continuously, based on detection in the ultrasound view of each event in which an ultrasound beam intersects the midpoint of each protrusion of a protrusion pair, and/or periodically (e.g., based on a temporal trigger).
- the systems and techniques support automatic registration in response to other trigger criteria (e.g., in response to detection of a target instrument in an ultrasound image) at any point during a medical procedure.
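- a minimal sketch of such trigger logic is shown below; the periodic interval and the flag names are assumptions made only for illustration.

```python
# Hypothetical sketch: deciding when to run an automatic (re)registration
# pass, mirroring the triggers described above -- a detected
# midpoint-intersection event, detection of a target instrument in an
# ultrasound image, or a periodic timer.
import time

class RegistrationTrigger:
    def __init__(self, period_s=300.0):
        self.period_s = period_s        # assumed periodic interval
        self._last_run = 0.0

    def should_run(self, midpoint_event=False, instrument_detected=False,
                   now=None):
        now = time.monotonic() if now is None else now
        periodic_due = (now - self._last_run) >= self.period_s
        if midpoint_event or instrument_detected or periodic_due:
            self._last_run = now
            return True
        return False
```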
- the systems and techniques may include continuously verifying the registration accuracy between the ultrasound imaging system and the navigation system anytime the target instrument (e.g., surgical instrument, electromagnetic pointer, etc.) is detected in the ultrasound imaging.
- the systems and techniques support alerting the user and/or taking corrective actions in response to registration discrepancies.
- the system may provide the user with a list of corrective actions for improving calibration.
- corrective actions may include real-time actionable feedback for users to move the target instrument in association with registration.
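- one illustrative way to implement this verification is to compare the navigated instrument tip with the tip detected in the ultrasound image and alert when the discrepancy exceeds a limit; the 2.0 mm limit and the message text below are assumptions.

```python
# Hypothetical sketch: verifying registration accuracy whenever a tracked
# instrument tip is detected in the ultrasound image, and emitting an alert
# with a simple corrective suggestion when the discrepancy is too large.
import numpy as np

def verify_registration(tip_nav_mm, tip_img_mm, T_img_to_nav, limit_mm=2.0):
    """tip_nav_mm: tip position from the navigation system (3-vector, mm).
    tip_img_mm: tip position detected in the image (3-vector, z = 0 for 2D).
    T_img_to_nav: current 4x4 image-to-navigation calibration transform."""
    tip_img_nav = (T_img_to_nav @ np.append(tip_img_mm, 1.0))[:3]
    error_mm = float(np.linalg.norm(tip_img_nav - tip_nav_mm))
    if error_mm > limit_mm:
        return False, error_mm, ("Registration discrepancy detected; "
                                 "consider recalibrating or repositioning "
                                 "the instrument within the ultrasound view.")
    return True, error_mm, ""
```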
- the systems and techniques may support dynamically and automatically detecting distortion in the navigated volume due to discrepancies between expected navigation and imaging data.
- the discrepancies may be between pose information of a tracked object as indicated by the navigation system and pose information of the tracked object as indicated by the imaging data.
- the systems described herein may support techniques for alerting (e.g., providing a notification to) a user of the discrepancies and compensating for the discrepancies.
- the systems and techniques may include calibrating the navigation data to the imaging data (e.g., calibrating a navigation space to an ultrasound space) while compensating for the discrepancies.
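- a minimal sketch of one possible discrepancy-monitoring and compensation scheme is shown below; the windowed translational model is an assumption for illustration and is not the specific compensation described in this disclosure.

```python
# Hypothetical sketch: accumulating pose discrepancies between navigation data
# and imaging data, flagging probable distortion when the running discrepancy
# grows, and exposing a simple translational compensation for calibration.
import numpy as np

class DistortionMonitor:
    def __init__(self, window=20, flag_mm=3.0):
        self.window, self.flag_mm = window, flag_mm
        self.offsets = []               # navigation-minus-imaging offsets (mm)

    def add(self, nav_point_mm, img_point_mm):
        self.offsets.append(np.asarray(nav_point_mm) - np.asarray(img_point_mm))
        self.offsets = self.offsets[-self.window:]

    def distorted(self):
        if not self.offsets:
            return False
        return float(np.linalg.norm(np.mean(self.offsets, axis=0))) > self.flag_mm

    def compensation_mm(self):
        """Mean offset to subtract from navigation data when calibrating."""
        return np.mean(self.offsets, axis=0) if self.offsets else np.zeros(3)
```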
- aspects of the present disclosure support integration of a calibration phantom (e.g., water bath, hydrogel phantom, etc.) into the structure of a patient tracker.
- the integration of the calibration phantom may support user recalibration of a navigated ultrasound probe before or during a medical procedure.
- with the recalibration techniques described herein, the temporal duration associated with recalibration is reduced compared to some other recalibration techniques.
- Implementations of the present disclosure provide technical solutions to one or more of the problems associated with other navigation systems and calibration techniques.
- the systems and techniques described herein provide time savings, improved user experience, cost savings, and increased accuracy in comparison to other registration and calibration techniques.
- the systems and techniques described herein support continuous registration during a surgical procedure and continuous registration verification, in which registration and registration verification may be autonomous or semi-autonomous.
- aspects of the systems and techniques described herein support time efficient and cost-effective utilization of ultrasound images during surgery to navigate and display to medical personnel the locations of surgical devices (e.g., instruments, surgical tools, robotic end effectors, etc.) with respect to the patient anatomy.
- the systems and techniques described herein provide a reliable accuracy of the calibration between imaging devices (e.g., an ultrasound image probe, other imaging probes, etc.) and a navigation system, and the reliable accuracy may support accurate navigation of images (e.g., ultrasound images, etc.) that are generated based on data captured by the imaging devices.
- different imaging probes may be different based on manufacturer, configuration, probe type, and the like, and such imaging probes may require new calibration and can lose calibration during a medical procedure.
- Aspects of the calibration techniques described herein are relatively time efficient, cost effective, user friendly, and provide increased accuracy compared to other techniques for calibrating or recalibrating imaging probes.
- the time efficiency, cost effectiveness, user friendliness, and increased accuracy supported by the systems and techniques described herein may provide improved confidence for a surgeon in a navigated ultrasound space and support a reduction in surgical errors.
- the registration and calibration techniques described herein may be implemented autonomously (e.g., without input from medical personnel) or semi-autonomously (e.g., with partial input from medical personnel).
- aspects of the present disclosure relate to navigated and robotic surgery and to any type of surgery that may be associated with intra-surgical ultrasound imaging.
- Aspects of the present disclosure support implementing any of the techniques described herein to any medical procedure (e.g., cranial, spinal, thoracic, abdominal, cardiac, ablation, laparoscopic, minimally invasive surgery, robotic surgery, etc.) associated with the use of intra-surgical ultrasound imaging.
- the systems and techniques described herein may be implemented in association with initiatives related to data analytics, artificial intelligence, and machine learning, for example, with respect to data analytic scenarios for procedure and device optimization.
- Fig. 1A illustrates an example of a system 100 that supports aspects of the present disclosure.
- the system 100 includes a computing device 102, one or more imaging devices 112, a robot 114, a navigation system 118, a database 130, and/or a cloud network 134 (or other network).
- Systems according to other implementations of the present disclosure may include more or fewer components than the system 100.
- the system 100 may omit and/or include additional instances of one or more components of the computing device 102, the imaging device(s) 112, the robot 114, navigation system 118, the database 130, and/or the cloud network 134.
- the system 100 may omit any instance of the computing device 102, the imaging device(s) 112, the robot 114, navigation system 118, the database 130, and/or the cloud network 134.
- the system 100 may support the implementation of one or more other aspects of one or more of the methods disclosed herein.
- the computing device 102 includes a processor 104, a memory 106, a communication interface 108, and a user interface 110.
- Computing devices according to other implementations of the present disclosure may include more or fewer components than the computing device 102.
- the computing device 102 may be, for example, a control device including electronic circuitry associated with controlling any components of the system 100.
- the processor 104 of the computing device 102 may be any processor described herein or any similar processor.
- the processor 104 may be configured to execute instructions stored in the memory 106, which instructions may cause the processor 104 to carry out one or more computing steps utilizing or based on data received from the imaging devices 112, the robot 114, the navigation system 118, the database 130, and/or the cloud network 134.
- the memory 106 may be or include RAM, DRAM, SDRAM, other solid-state memory, any memory described herein, or any other tangible, non-transitory memory for storing computer-readable data and/or instructions.
- the memory 106 may store information or data associated with completing, for example, any step of the process flow 500 described herein, or of any other methods.
- the memory 106 may store, for example, instructions and/or machine learning models that support one or more functions of the imaging devices 112, the robot 114, and the navigation system 118.
- the memory 106 may store content (e.g., instructions and/or machine learning models) that, when executed by the processor 104, enable image processing 120, segmentation 122, transformation 124, and/or registration 128.
- Such content, if provided as instructions, may, in some implementations, be organized into one or more applications, modules, packages, layers, or engines.
- the memory 106 may store other types of content or data (e.g., machine learning models, artificial neural networks, deep neural networks, etc.) that can be processed by the processor 104 to carry out the various method and features described herein.
- the data, algorithms, and/or instructions may cause the processor 104 to manipulate data stored in the memory 106 and/or received from or via the imaging devices 112, the robot 114, the navigation system 118, the database 130, and/or the cloud network 134.
- the computing device 102 may also include a communication interface 108.
- the communication interface 108 may be used for receiving data or other information from an external source (e.g., the imaging devices 112, the robot 114, the navigation system 118, the database 130, the cloud network 134, and/or any other system or component separate from the system 100), and/or for transmitting instructions, data (e.g., image data, tracking data, navigation data, calibration data, registration data, etc.), or other information to an external system or device (e.g., another computing device 102, the imaging devices 112, the robot 114, the navigation system 118, the database 130, the cloud network 134, and/or any other system or component not part of the system 100).
- the communication interface 108 may include one or more wired interfaces (e.g., a USB port, an Ethernet port, a Firewire port) and/or one or more wireless transceivers or interfaces (configured, for example, to transmit and/or receive information via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, NFC, ZigBee, and so forth).
- the communication interface 108 may support communication between the device 102 and one or more other processors 104 or computing devices 102, whether to reduce the time needed to accomplish a computing-intensive task or for any other reason.
- the computing device 102 may also include one or more user interfaces 110.
- the user interface 110 may be or include a keyboard, mouse, trackball, monitor, television, screen, touchscreen, and/or any other device for receiving information from a user and/or for providing information to a user.
- the user interface 110 may be used, for example, to receive a user selection or other user input regarding any step of any method described herein. Notwithstanding the foregoing, any required input for any step of any method described herein may be generated automatically by the system 100 (e.g., by the processor 104 or another component of the system 100) or received by the system 100 from a source external to the system 100.
- the user interface 110 may support user modification (e.g., by a surgeon, medical personnel, a patient, etc.) of instructions to be executed by the processor 104 according to one or more implementations of the present disclosure, and/or to user modification or adjustment of a setting of other information displayed on the user interface 110 or corresponding thereto.
- the computing device 102 may utilize a user interface 110 that is housed separately from one or more remaining components of the computing device 102.
- the user interface 110 may be located proximate one or more other components of the computing device 102, while in other implementations, the user interface 110 may be located remotely from one or more other components of the computing device 102.
- the imaging device 112 may be operable to image anatomical feature(s) (e.g., a bone, veins, tissue, etc.) and/or other aspects of patient anatomy to yield image data (e.g., image data depicting or corresponding to a bone, veins, tissue, etc.).
- image data refers to the data generated or captured by an imaging device 112, including in a machine-readable form, a graphical/visual form, and in any other form.
- the image data may include data corresponding to an anatomical feature of a patient, or to a portion thereof.
- the image data may be or include a preoperative image, an intraoperative image, a postoperative image, or an image taken independently of any surgical procedure.
- a first imaging device 112 may be used to obtain first image data (e.g., a first image) at a first time, and a second imaging device 112 may be used to obtain second image data (e.g., a second image) at a second time after the first time.
- the imaging device 112 may be capable of taking a 2D image or a 3D image to yield the image data.
- the imaging device 112 may be or include, for example, an ultrasound scanner (which may include, for example, a physically separate transducer and receiver, or a single ultrasound transceiver), an O-arm, a C-arm, a G-arm, or any other device utilizing X-ray-based imaging (e.g., a fluoroscope, a CT scanner, or other X-ray machine), a magnetic resonance imaging (MRI) scanner, an optical coherence tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermographic camera (e.g., an infrared camera), a radar system (which may include, for example, a transmitter, a receiver, a processor, and one or more antennae), or any other imaging device 112 suitable for obtaining images of an anatomical feature of a patient.
- the imaging device 112 may be contained entirely within a single housing
- the imaging device 112 may include more than one imaging device 112.
- a first imaging device may provide first image data and/or a first image
- a second imaging device may provide second image data and/or a second image.
- the same imaging device may be used to provide both the first image data and the second image data, and/or any other image data described herein.
- the imaging device 112 may be operable to generate a stream of image data.
- the imaging device 112 may be configured to operate with an open shutter, or with a shutter that continuously alternates between open and shut so as to capture successive images.
- image data may be considered to be continuous and/or provided as an image data stream if the image data represents two or more frames per second.
- the robot 114 may be any surgical robot or surgical robotic system.
- the robot 114 may be or include, for example, the Mazor X™ Stealth Edition robotic guidance system.
- the robot 114 may be configured to position the imaging device 112 at one or more precise position(s) and orientation(s), and/or to return the imaging device 112 to the same position(s) and orientation(s) at a later point in time.
- the robot 114 may additionally or alternatively be configured to manipulate a surgical tool (whether based on guidance from the navigation system 118 or not) to accomplish or to assist with a surgical task.
- the robot 114 may be configured to hold and/or manipulate an anatomical element during or in connection with a surgical procedure.
- the robot 114 may include one or more robotic arms 116.
- the robotic arm 116 may include a first robotic arm and a second robotic arm, though the robot 114 may include more than two robotic arms.
- one or more of the robotic arms 116 may be used to hold and/or maneuver the imaging device 112.
- where the imaging device 112 includes two or more physically separate components (e.g., a transmitter and receiver), one robotic arm 116 may hold one such component, and another robotic arm 116 may hold another such component.
- Each robotic arm 116 may be positionable independently of the other robotic arm.
- the robotic arms 116 may be controlled in a single, shared coordinate space, or in separate coordinate spaces.
- the robot 114 together with the robotic arm 116, may have, for example, one, two, three, four, five, six, seven, or more degrees of freedom. Further, the robotic arm 116 may be positioned or positionable in any pose, plane, and/or focal point. The pose includes a position and an orientation. As a result, an imaging device 112, surgical tool, or other object held by the robot 114 (or, more specifically, by the robotic arm 116) may be precisely positionable in one or more needed and specific positions and orientations.
- the robotic arm(s) 116 may include one or more sensors that enable the processor 104 (or a processor of the robot 114) to determine a precise pose in space of the robotic arm (as well as any object or element held by or secured to the robotic arm).
- reference markers (e.g., navigation markers) may be placed on the robot 114, the robotic arm 116, and/or other objects to be tracked.
- the reference markers may be tracked by the navigation system 118, and the results of the tracking may be used by the robot 114 and/or by an operator of the system 100 or any component thereof.
- the navigation system 118 can be used to track other components (e.g., imaging device 112, surgical tools, instruments 145 (later described with reference to Fig. 1B), etc.).
- the system can operate without the use of the robot 114 (e.g., with the surgeon manually manipulating the imaging device 112 and/or one or more surgical tools, based on information and/or instructions generated by the navigation system 118, for example).
- the navigation system 118 may provide navigation for a surgeon and/or a surgical robot during an operation.
- the navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic Stealth StationTM S8 surgical navigation system or any successor thereof.
- the navigation system 118 may include one or more cameras or other sensor(s) for tracking one or more reference markers, navigated trackers (e.g., tracking devices 140, etc.) or other objects within the operating room or other room in which some or all of the system 100 is located.
- the one or more cameras may be optical cameras, infrared cameras, or other cameras.
- the navigation system 118 may include one or more tracking devices 140 (e.g., electromagnetic sensors, acoustic sensors, etc.).
- the navigation system 118 may include one or more of an optical tracking system, an acoustic tracking system, an electromagnetic tracking system, a radar tracking system, an inertial measurement unit (IMU) based tracking system, and a computer vision-based tracking system.
- the navigation system 118 may include a corresponding transmission device 136 capable of transmitting signals associated with the tracking type.
- the navigation system 118 may be capable of computer vision-based tracking of objects present in images captured by the imaging device(s) 112.
- the navigation system 118 may be used to track a position and orientation (e.g., a pose) of the imaging device 112, the robot 114 and/or robotic arm 116, and/or one or more surgical tools (e.g., instrument 145, etc.) (or, more particularly, to track a pose of a navigated tracker attached, directly or indirectly, in fixed relation to the one or more of the foregoing).
- the instrument 145 may be an electromagnetic pointer (or stylus).
- the navigation system 118 may include a display for displaying one or more images from an external source (e.g., the computing device 102, imaging device 112, or other source) or for displaying an image and/or video stream from the one or more cameras or other sensors of the navigation system 118.
- the system 100 can operate without the use of the navigation system 118.
- the navigation system 118 may be configured to provide guidance to a surgeon or other user of the system 100 or a component thereof, to the robot 114, or to any other element of the system 100 regarding, for example, a pose of one or more anatomical elements, whether or not a tool is in the proper trajectory, and/or how to move a tool into the proper trajectory to carry out a surgical task according to a preoperative or other surgical plan.
- the processor 104 may utilize data stored in memory 106 as a neural network.
- the neural network may include a machine learning architecture.
- the neural network may be or include one or more classifiers.
- the neural network may be or include any machine learning network such as, for example, a deep learning network, a convolutional neural network, a reconstructive neural network, a generative adversarial neural network, or any other neural network capable of accomplishing functions of the computing device 102 described herein.
- Some elements stored in memory 106 may be described as or referred to as instructions or instruction sets, and some functions of the computing device 102 may be implemented using machine learning techniques.
- the processor 104 may support machine learning model(s) 138 which may be trained and/or updated based on data (e.g., training data 144) provided or accessed by any of the computing device 102, the imaging device 112, the robot 114, the navigation system 118, the database 130, and/or the cloud network 134.
- the machine learning model(s) 138 may be built and updated based on the training data 144 (also referred to herein as training data and feedback).
- the neural network and machine learning model(s) 138 may support AI/machine learning computer vision algorithms and object detection in association with automatically detecting, identifying, and tracking target objects (e.g., instruments, tools, etc.) in one or more images 153 or a multimedia file 154.
- the database 130 may store information that correlates one coordinate system to another (e.g., one or more robotic coordinate systems, an ultrasound space coordinate system, a patient coordinate system, and/or a navigation coordinate system, etc.).
- the database 130 may additionally or alternatively store, for example, one or more surgical plans (including, for example, pose information about a target and/or image information about a patient’s anatomy at and/or proximate the surgical site, for use by the robot 114, the ultrasound space coordinate system, the navigation system 118, and/or a user of the computing device 102 or of the system 100); one or more images 153 useful in connection with a surgery to be completed by or with the assistance of one or more other components of the system 100; and/or any other useful information.
- the database 130 may be configured to provide any such information to the computing device 102 or to any other device of the system 100 or external to the system 100, whether directly or via the cloud network 134.
- the database 130 may include information associated with a calibration phantom 149 associated with a calibration procedure.
- the database 130 may be or include part of a hospital image storage system, such as a picture archiving and communication system (PACS), a health information system (HIS), and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
- the computing device 102 may communicate with a server(s) and/or a database (e.g., database 130) directly or indirectly over a communications network (e.g., the cloud network 134).
- the communications network may include any type of known communication medium or collection of communication media and may use any type of protocols to transport data between endpoints.
- the communications network may include wired communications technologies, wireless communications technologies, or any combination thereof.
- Wired communications technologies may include, for example, Ethernet-based wired local area network (LAN) connections using physical transmission mediums (e.g., coaxial cable, copper cable/wire, fiber-optic cable, etc.).
- Wireless communications technologies may include, for example, cellular or cellular data connections and protocols (e.g., digital cellular, personal communications service (PCS), cellular digital packet data (CDPD), general packet radio service (GPRS), enhanced data rates for global system for mobile communications (GSM) evolution (EDGE), code division multiple access (CDMA), single-carrier radio transmission technology (1×RTT), evolution-data optimized (EVDO), high speed packet access (HSPA), universal mobile telecommunications service (UMTS), 3G, long term evolution (LTE), 4G, and/or 5G, etc.), Bluetooth®, Bluetooth® low energy, Wi-Fi, radio, satellite, infrared connections, and/or ZigBee® communication protocols.
- the Internet is an example of the communications network that constitutes an Internet Protocol (IP) network consisting of multiple computers, computing networks, and other communication devices located in multiple locations, and components in the communications network (e.g., computers, computing networks, communication devices) may be connected through one or more telephone systems and other means.
- the communications network may include, without limitation, a standard Plain Old Telephone System (POTS), an Integrated Services Digital Network (ISDN), the Public Switched Telephone Network (PSTN), a Local Area Network (LAN), a Wide Area Network (WAN), a wireless LAN (WLAN), a Session Initiation Protocol (SIP) network, a Voice over Internet Protocol (VoIP) network, a cellular network, and any other type of packet-switched or circuit-switched network known in the art.
- the communications network may include any combination of networks or network types.
- the communications network may include any combination of communication mediums such as coaxial cable, copper cable/wire, fiber-optic cable, or antennas for communicating data (e.g., transmitting/receiving data).
- the computing device 102 may be connected to the cloud network 134 via the communication interface 108, using a wired connection, a wireless connection, or both.
- the computing device 102 may communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud network 134.
- the system 100 or similar systems may be used, for example, to carry out one or more aspects of any of the process flow 500 described herein.
- the system 100 or similar systems may also be used for other purposes.
- Fig. 1B illustrates an example of the system 100 that supports aspects of the present disclosure. Aspects of the example may be implemented by the computing device 102, imaging device(s) 112, robot 114 (e.g., a robotic system), and navigation system 118.
- the navigation system 118 may provide navigation information based on an electromagnetic field generated by a transmission device 136.
- the navigation information may include tracking information 167 (also referred to herein as tracking data) as described herein.
- the transmission device 136 may include an array of transmission coils capable of generating or forming the electromagnetic field in response to respective currents driven through the transmission coils.
- the navigation system 118 may include tracking devices 140 capable of sensing the electromagnetic field. Aspects of the navigation system 118 described herein may be implemented by navigation processing 129.
- the system 100 may support tracking objects (e.g., an instrument 145, imaging device 112, a calibration phantom 149, etc.) in a trackable volume 150 using an electromagnetic field produced by the transmission device 136.
- the transmission device 136 may include a transmitter antenna or transmitting coil array capable of producing the electromagnetic field.
- the system 100 may track the pose (e.g., position, coordinates, orientation, etc.) of the objects in the tracking volume 150 relative to a subject 141.
- the system 100 may display, via a user interface of the computing device 102, icons corresponding to any tracked objects. For example, the system 100 may superimpose such icons on and/or adjacent an image displayed on the user interface.
- the terms “tracking volume,” “trackable volume,” “navigation volume,” and “volume” may be used interchangeably herein.
- the transmission device 136 may be an electromagnetic localizer that is operable to generate electromagnetic fields.
- the transmission device 136 may drive current through the transmission coils, thereby powering the coils to generate or form the electromagnetic field.
- the electromagnetic field will extend away from the transmission coils and form a navigation domain (e.g., volume 150).
- the volume 150 may include any portion (e.g., the spine, one or more vertebrae, the brain, an anatomical element, or a portion thereof, etc.) of the subject 141 and/or any portion of a calibration phantom 149-a.
- the transmission coils may be powered through a controller device and/or power supply provided by the system 100.
- the tracking devices 140 may include or be provided as sensors (also referred to herein as tracking sensors).
- the sensors may sense a selected portion or component of the electromagnetic field(s) generated by the transmission device 136.
- the navigation system 118 may support registration (e.g., through registration 128) of the volume 150 to a virtual space 155.
- the navigation system 118 may support superimposing an icon representing a tracked object (e.g., an instrument 145, a tracking device 140-b, tracking device 140-c, a tracking device 146, etc.) on the image.
- the system 100 may support the delivery of tracking information associated with the tracking devices 140 and/or tracking device 146 to the navigation system 118.
- the tracking information may include, for example, data associated with magnetic fields sensed by the tracking devices 140.
- the tracking devices 140 may communicate sensor information to the navigation system 118 for determining a position of the tracked portions relative to each other and/or for localizing an object (e.g., instrument 145, tracking device 146, etc.) relative to an image 153.
- the navigation system 118 and/or transmission device 136 may include a controller that supports operating and powering the generation of electromagnetic fields.
- the system 100 may generate a navigation space 119 based on one or more tracking signals transmitted by the transmission device 136.
- the navigation space 119 may correspond to environment 142 or a portion thereof.
- the navigation space 119 may correspond to a subject 141 (e.g., a patient) included in the environment 142 or an anatomical element (e.g., an organ, bone, tissue, etc.) of the subject 141.
- the environment 142 may be, for example, an operating room, an exam room, or the like.
- the tracking signals are not limited to electromagnetic tracking signals, and it is to be understood that the example aspects described with reference to Fig. 1B may be implemented using other types of tracking signals (e.g., optical tracking signals, acoustic tracking signals, etc.).
- the system 100 may generate a virtual space 155 based on (e.g., in response to) signals transmitted by imaging device 112.
- the virtual space 155 may correspond to a field of view 159 of the imaging device 112.
- the system 100 may generate images 153 in response to signals transmitted by imaging device 112, and the images 153 may correspond to the field of view 159 of the imaging device 112.
- the imaging device 112 is an ultrasound probe transmitting ultrasound signals, and the images 153 may be ultrasound images.
- the images 153 may be static images or video images. In some aspects of the present disclosure, the images 153 may be stored as a multimedia file 154 that includes video (or video and sound).
- the imaging device 112 and the example signals transmitted by the imaging device 112 are not limited thereto, and it is to be understood that the example aspects described with reference to Fig. 1B may be implemented using other types of imaging devices 112 (e.g., X-ray, CT scanner, OCT scanner, etc.) and imaging systems described herein.
- the system 100 may support acquiring image data to generate or produce images (e.g., images 153, multimedia file 154, etc.) of the subject 141.
- the system 100 may detect or track the calibration phantom 149-a and other objects (e.g., tracking devices 140, instruments 145, tracking device 146, etc.) included in the volume 150 and the virtual space 155.
- the calibration phantom 149-a may be located in the volume 150 (as generated by the navigation system 118) and the virtual space 155 (as generated by the computing device 102).
- the calibration phantom 149-a is a tissue phantom, but is not limited thereto.
- the system 100 may register and calibrate the imaging device 112 with respect to the navigation system 118.
- the system 100 may identify a set of coordinates 157 in the virtual space 155 in response to an event in which the system 100 detects at least a portion of the instrument 145 in the image 153.
- the system 100 may detect that the portion of the instrument 145 is located in the calibration phantom 149-a and intersects a surface of the virtual space 155 at the set of coordinates 157.
- the system 100 may detect that the portion of the instrument 145 intersects the surface at an angle perpendicular to the surface.
- the portion of the instrument 145 may be a tracking device 146 (e.g., an electromagnetic antenna of the instrument 145).
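- As an illustration of the perpendicular-intersection check described above, the following sketch (function and parameter names, and the tolerance value, are illustrative and not taken from the disclosure) tests whether a tracked instrument direction is aligned with the image-plane normal within an angular tolerance.

```python
import numpy as np

def intersects_perpendicular(instrument_dir, plane_normal, tolerance_deg=5.0):
    """Return True when the instrument axis is perpendicular to the image
    plane, i.e., parallel to the plane normal within an angular tolerance.

    instrument_dir -- unit direction of the tracked portion of the instrument
    plane_normal   -- unit normal of the virtual-space surface
    The function name, arguments, and tolerance are illustrative only.
    """
    cos_angle = abs(float(np.dot(instrument_dir, plane_normal)))
    angle_from_normal = np.degrees(np.arccos(np.clip(cos_angle, 0.0, 1.0)))
    return angle_from_normal <= tolerance_deg

# Example: instrument aligned with the plane normal, so perpendicular to the plane
print(intersects_perpendicular(np.array([0.0, 0.0, 1.0]),
                               np.array([0.0, 0.0, 1.0])))  # True
```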
- the system 100 may detect when the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair of the calibration phantom 149-a, as discussed in greater detail below.
- the virtual space 155 may be a 2D virtual space generated based on 2D images (e.g., ultrasound images, CT images, etc.) captured by the imaging device 112, and the surface may be a plane of the virtual space 155.
- the virtual space 155 may be a 3D virtual space (e.g., a volume), and the surface of the virtual space 155 may be a planar surface or non-planar surface.
- the system 100 may calibrate the virtual space 155 to the navigation space 119 in response to the event in which the system 100 detects the instrument 145 (or at least a portion of the instrument 145) in the image 153 or in response to the event in which the system 100 detects the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair of the calibration phantom 149-a.
- the system 100 may calibrate a coordinate system 160 associated with the virtual space 155 with respect to a coordinate system 165 associated with the navigation space 119.
- the system 100 may calibrate the coordinate system 160 with respect to the coordinate system 165 based on the set of coordinates 157 and temporal information 156 associated with the event in which the instrument 145 (or portion of the instrument 145) intersects a surface (e.g., a plane, a volume, etc.) of the virtual space 155 at the set of coordinates 157.
- the system 100 may calibrate the coordinate system 160 with respect to the coordinate system 165 based on the set of coordinates 157 and temporal information 156 associated with the event in which the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair of the calibration phantom 149-a.
- the system 100 may calibrate the coordinate system 160 with respect to the coordinate system 165 based on tracking information associated with the tracking device 146, the tracking device 140-b, the tracking device 140-c, and the tracking device 140-a.
- the instrument 145 and/or the calibration phantom 149-a may be registered to the navigation system 118 such that the navigation system 118 may track and determine pose information 161 of the instrument 145 and/or the calibration phantom 149- a based on tracking information 167 associated with the tracking device 146, the tracking device 140-a, tracking device 140-b and/or the tracking device 140-c and temporal information 166 corresponding to the tracking information 167. Further, for example, the navigation system 118 may track and determine pose information 161 of the imaging device 112 based on tracking information 167 associated with the tracking device 140-a and temporal information 166 corresponding to the tracking information 167.
- the system 100 may calibrate the coordinate system 160 with respect to the coordinate system 165 based on the tracking information 167 (e.g., associated with the tracking device 146, the tracking device 140-b, the tracking device 140-c and the tracking device 140-a), the temporal information 166 associated with the tracking information 167, the temporal information 156 associated with the event, and the coordinates 157 associated with the event.
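- One way the temporal information 156 and the temporal information 166 can be combined is to estimate the tracked position at the instant of the event by interpolating between the navigation samples that bracket the event timestamp; the following sketch (hypothetical helper names, linear interpolation of position only) illustrates the idea.

```python
from bisect import bisect_left

def position_at_event(sample_times, sample_positions, event_time):
    """Estimate a tracked position at the event timestamp (temporal
    information 156) by linear interpolation between the two navigation
    samples (temporal information 166) that bracket it.

    sample_times     -- sorted timestamps of tracking samples, in seconds
    sample_positions -- (x, y, z) position per timestamp
    event_time       -- timestamp of the detected image-space event
    """
    i = bisect_left(sample_times, event_time)
    if i == 0:
        return sample_positions[0]
    if i >= len(sample_times):
        return sample_positions[-1]
    t0, t1 = sample_times[i - 1], sample_times[i]
    w = (event_time - t0) / (t1 - t0)
    p0, p1 = sample_positions[i - 1], sample_positions[i]
    return tuple(a + w * (b - a) for a, b in zip(p0, p1))

# Example: tracking samples at 10 Hz, event detected at t = 0.12 s
times = [0.0, 0.1, 0.2]
positions = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
print(position_at_event(times, positions, 0.12))  # approximately (1.2, 0.0, 0.0)
```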
- the system 100 may detect, in an image 153 (or multimedia file 154), one or more landmarks corresponding to the instrument 145, a portion of the instrument 145, or the tracking device 146.
- the landmarks may correspond to distinctive features (e.g., a tip, a shape of the tip, etc.) of the instrument 145 or the tracking device 146.
- the system 100 may calibrate the coordinate system 160 with respect to the coordinate system 165 based on the one or more landmarks.
- the system 100 may detect in an image 153 (or multimedia file 154), one or more landmarks corresponding to the phantom 149-a and/or the protrusion pair provided within the phantom 149-a.
- the landmarks may correspond to distinctive features (e.g., a tip, a shape of the tip, etc.) of the phantom 149-a and/or the protrusion pair provided within the phantom 149-a.
- the system 100 may calibrate the coordinate system 160 with respect to the coordinate system 165 based on the one or more landmarks.
- aspects of the present disclosure support calibrating the virtual space 155 to the navigation space 119 (e.g., calibrating the coordinate system 160 associated with the virtual space 155 to the coordinate system 165 associated with the navigation space 119) in response to one or more criteria.
- the system 100 may calibrate the virtual space 155 to the navigation space 119 in response to each occurrence of the event.
- the system 100 may calibrate the virtual space 155 to the navigation space 119 in response to each nth occurrence of the event (e.g., each third occurrence of the event, each fifth occurrence, etc.).
- the system 100 may calibrate the virtual space 155 to the navigation space 119 in response to each nth occurrence of the event within a temporal duration (e.g., each third occurrence of the event, in which the third occurrence is X seconds or less after a first occurrence of the event (where X is an integer value)).
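- A minimal sketch of this event-gating policy, assuming illustrative names and thresholds, could count detected events and trigger calibration on every nth occurrence, optionally requiring the n events to fall within a temporal window:

```python
def should_calibrate(event_times, n=3, window_seconds=None):
    """Return True when the latest detected event should trigger calibration.

    Calibrates on every n-th occurrence of the event; when window_seconds is
    given, the most recent n events must also fall within that duration (the
    X-second window described above). Names and defaults are illustrative.
    """
    if not event_times or len(event_times) % n != 0:
        return False
    if window_seconds is None:
        return True
    recent = event_times[-n:]
    return (recent[-1] - recent[0]) <= window_seconds

# Example: three events within a 5-second window trigger calibration
print(should_calibrate([0.0, 1.5, 4.0], n=3, window_seconds=5.0))  # True
print(should_calibrate([0.0, 1.5, 8.0], n=3, window_seconds=5.0))  # False
```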
- the system 100 may support calibrating the coordinate system 160 with respect to the coordinate system 165 without pausing a medical procedure (e.g., surgical procedure).
- the system 100 may calibrate the coordinate system 160 with reference to the coordinate system 165 (e.g., recalibrate the registration between the virtual space 155 and the navigation space 119) in the background while medical personnel perform a medical procedure on the subject 141, without interrupting the medical procedure.
- the system 100 may calibrate the coordinate system 160 with reference to the coordinate system 165 during the medical procedure, without prompting the medical personnel to pause the medical procedure, such that the medical personnel may proceed with the medical procedure without waiting for calibration to be completed.
- the system 100 may calibrate the coordinate system 160 with reference to the coordinate system 165 without prompting the medical personnel to participate in a separate calibration operation.
- the system 100 and techniques described herein may support calibrating the coordinate system 160 with respect to the coordinate system 165 based on any of: properties (e.g., beam thickness, beam shape, signal frequency, etc.) of signals transmitted by the imaging device 112; pose information of the instrument 145 or the phantom 149-a (or pose information of the tracking device 146 or the tracking device 140-c) in association with an intersection between the instrument 145 (or tracking device 146) and the surface of the virtual space 155, or in association with the intersection of the field of view 159 of the imaging device 112 and the protrusion pair; and properties (e.g., shape, etc.) of the tracking device 146, the protrusion pair, etc., example aspects of which are later described with reference to Figs. 3A-3G.
- calibrating the virtual space 155 to the navigation space 119 includes verifying a registration accuracy between the coordinate system 160 and the coordinate system 165.
- the system 100 may calculate a registration accuracy between the coordinate system 160 and the coordinate system 165 and compare the registration accuracy to a target accuracy value.
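- As an illustration (not the disclosure's prescribed method), the registration accuracy could be expressed as the root-mean-square residual after mapping virtual-space points into the navigation coordinate system and compared against the target accuracy value:

```python
import numpy as np

def registration_rms_error(points_virtual, points_navigation, R, t):
    """Root-mean-square residual after mapping virtual-space points into the
    navigation coordinate system with rotation R (3x3) and translation t (3,).
    A value above the target accuracy would indicate recalibration is needed.
    """
    mapped = (R @ np.asarray(points_virtual, dtype=float).T).T + t
    residuals = np.linalg.norm(mapped - np.asarray(points_navigation, dtype=float), axis=1)
    return float(np.sqrt(np.mean(residuals ** 2)))

# Example: identity registration, one point off by 2 units
pv = [[0, 0, 0], [10, 0, 0]]
pn = [[0, 0, 0], [10, 0, 2]]
rms = registration_rms_error(pv, pn, np.eye(3), np.zeros(3))
print(rms, rms > 1.0)  # ~1.414, True against an illustrative 1.0 target
```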
- the system 100 may perform one or more operations described herein in association with recalibrating the virtual space 155 to the navigation space 119.
- the system 100 may autonomously recalibrate the coordinate system 160 to the coordinate system 165.
- the system 100 may generate and output a notification including user guidance information 175 (e.g., tutorials, user prompts, corrective actions, real-time actionable feedback, etc.) regarding how to move or position a device (e.g., the imaging device 112, the instrument 145, etc.) in association with the calibration process.
- the notification may include a visual notification, an audible notification, a haptic notification, or a combination thereof.
- Other additional and/or alternative aspects of calibrating the virtual space 155 to the navigation space 119 include automatically detecting distortion in the navigated volume 150 due to discrepancies between navigation data (e.g., tracking information 167 provided by navigation system 118) and imaging data (e.g., images 153, multimedia file 154, etc.).
- the discrepancies may be between pose information 161 of a tracked object (e.g., tracking device 140-b, tracking device 140-c, instrument 145, tracking device 146, etc.) as indicated by the navigation system 118 and pose information 162 of the tracked object as determined by the computing device 102 from the imaging data.
- the system 100 may calculate the discrepancy and compare the discrepancy to a target discrepancy threshold value. In an example, in response to a comparison result in which the discrepancy is greater than the discrepancy threshold value, the system 100 may perform one or more operations described herein (e.g., autonomous recalibration, outputting a notification including user guidance information 175, etc.) in association with recalibrating the virtual space 155 to the navigation space 119. In some aspects of the present disclosure, the system 100 may calibrate the navigation data to the imaging data (e.g., calibrate the navigation space 119 to the virtual space 155) while compensating for the discrepancies.
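- A hedged sketch of the discrepancy check, using illustrative names: the translation and rotation differences between the pose reported by the navigation system and the pose derived from imaging data can each be compared against a threshold.

```python
import numpy as np

def pose_discrepancy(pos_nav, rot_nav, pos_img, rot_img):
    """Discrepancy between a tracked object's pose reported by the navigation
    system (pos_nav, rot_nav) and the pose determined from imaging data
    (pos_img, rot_img). Rotations are 3x3 matrices. Returns (translation
    error, rotation error in degrees). Names are illustrative only.
    """
    trans_err = float(np.linalg.norm(np.asarray(pos_nav, dtype=float) -
                                     np.asarray(pos_img, dtype=float)))
    r_rel = np.asarray(rot_nav).T @ np.asarray(rot_img)   # relative rotation
    cos_angle = np.clip((np.trace(r_rel) - 1.0) / 2.0, -1.0, 1.0)
    rot_err = float(np.degrees(np.arccos(cos_angle)))
    return trans_err, rot_err

# Example: 1.5-unit offset and a 2-degree rotation about Z
theta = np.radians(2.0)
rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
print(pose_discrepancy([0, 0, 0], np.eye(3), [1.5, 0, 0], rz))  # (1.5, ~2.0)
```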
- the techniques described herein may provide continuous automatic registration, continuous semi-automatic registration, or a combination thereof, and the registration techniques may be implemented during a medical procedure.
- Fig. 1C illustrates an example of the system 100 that supports aspects of the present disclosure. Aspects of the example in Fig. 1C include aspects like those described with reference to Fig. 1B.
- the calibration phantom 149-b may be an ultrasound transmitting volume (e.g., a water bath) implemented using an empty container (e.g., an empty box including an electromagnetic friendly material).
- the calibration phantom 149-b may include ultrasound conductive material.
- the container may be formed of low magnetic or non-magnetic materials so as to minimize distortion to electromagnetic fields.
- the container may be formed of a material having a magnetic permeability of about 1.0 to about 1.1 (relative), and the material may have a relatively low electrical conductivity (e.g., an electrical conductivity less than a threshold value).
- the material may be a stainless steel alloyed with different metallic elements associated with obtaining specific properties (e.g., temperature and corrosion resistance, fracture tolerance, etc.).
- Non-limiting examples of the material include Nickel/Chromium alloys (e.g., Series 300 alloys, type 304 stainless steel (annealed condition only), type 316 stainless steel), Cobalt/Chromium alloys (e.g., L605, MP35N), and Titanium alloys (e.g., Ti6Al4V), plastics, and wood.
- one or more surfaces of the container include protrusion pairs, as illustrated in Figs. 2B and 2C, and the container is filled with water.
- the calibration phantom 149-b may be integrated into the structure of a patient tracker. In some other aspects of the present disclosure, the calibration phantom 149-b may be included in the environment 142 as a standalone structure that is separate from an operating table associated with the subject 141.
- the system 100 may support calibrating the virtual space 155 to the navigation space 119 using the calibration phantom 149-b and the techniques as described herein, in which the calibration phantom 149-b is substituted for the calibration phantom 149-a described with reference to Fig. 1B.
- the system 100 may support calibrating the virtual space 155 to the navigation space 119 using both the calibration phantom 149-a and the calibration phantom 149-b.
- the system 100 may support calibration outside the subject 141 using the calibration phantom 149-b (e.g., water bath) and further calibration (e.g., recalibration, calibration adjustment, etc.) using the calibration phantom 149-a (e.g., tissue of the subject 141), and the combination may provide an increase in accuracy compared to other calibration techniques.
- Use of the calibration phantom 149-a and the calibration phantom 149-b in association with a calibration process is later described with reference to Fig. 4.
- FIG. 2A illustrates an example implementation 200 of the system used to find source-target registration points according to an embodiment of the present disclosure.
- The implementation 200 includes an imaging device 112 (e.g., an electromagnetically tracked ultrasound probe), the calibration phantom 149-b (e.g., a water bath), and a transmission device 136 (e.g., an electromagnetic emitter).
- the imaging device 112 includes tracking device 140-a, and the calibration phantom 149-b includes tracking device(s) 140-c, which are tracked relative to the transmission device 136.
- the field of view 159 (e.g., the ultrasound beam or projection) of the imaging device 112 is provided.
- an area 205 represents the image generated/provided by the imaging device 112.
- a navigation space 119 corresponding to the calibration phantom 149-b may be generated by the navigation system 118 as described herein, and the system may display a virtual representation of the navigation space 119 and the virtual space 155.
- the EM-phantom point 150 represents the image captured of the midpoint between the tips of one of the protrusion pairs provided within the phantom 149-b (or the image of any other registration object/mark), where the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair.
- EM-phantom point 150 represents the impression in the ultrasound image of an object in the real world (e.g., a phantom feature/protrusion or navigated stylus tip, etc.) whose 3D location in the tracker’s space is well known.
- Ts represents the ultrasound probe’s sensor (e.g., tracking device 140-a) transformation matrix and Tp represents the phantom’s sensor (e.g., tracking device 140-c) transformation matrix (adjusted for the specific feature/protrusions’ tip represented by the EM-phantom point 150) with respect to the transmission device 136.
- the point 150 is acquired simultaneously in two different coordinate spaces: the transmission device 136 localization system (through the calibration phantom 149-b) and the imaging device 112 (through image coordinates and tracking devices 140-a).
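- As a sketch of how the same point can be expressed in both spaces (assuming 4x4 homogeneous transforms and illustrative names; the disclosure does not fix this factorization), the EM-phantom point can be mapped once through the probe sensor pose Ts and an image-to-sensor calibration, and once through the phantom sensor pose Tp and the known feature location, yielding one source-target point pair:

```python
import numpy as np

def to_homogeneous(p):
    """Append 1 to a 3D point for use with 4x4 homogeneous transforms."""
    return np.append(np.asarray(p, dtype=float), 1.0)

def source_target_pair(T_s, T_calib, p_image, T_p, p_feature):
    """Form one source-target point pair for the EM-phantom point 150.

    T_s       -- 4x4 pose of the probe sensor (tracking device 140-a)
    T_calib   -- 4x4 image-to-sensor calibration being solved for or verified
    p_image   -- feature location in image coordinates
    T_p       -- 4x4 pose of the phantom sensor (tracking device 140-c),
                 adjusted for the specific protrusion tip
    p_feature -- known feature location in the phantom's local frame
    All poses are expressed with respect to the transmission device 136.
    """
    source = (T_s @ T_calib @ to_homogeneous(p_image))[:3]   # via imaging path
    target = (T_p @ to_homogeneous(p_feature))[:3]           # via navigation path
    return source, target

# Example with identity poses: both views of the point coincide
s, t = source_target_pair(np.eye(4), np.eye(4), [5, 0, 20], np.eye(4), [5, 0, 20])
print(np.allclose(s, t))  # True
```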
- Fig. 2A also illustrates coordinate systems 203 (e.g., coordinate system 203-a, coordinate system 203-b, coordinate system 203-c, and coordinate system 203-d). Each coordinate system 203 includes three dimensions including an X-axis, a Y-axis, and a Z-axis.
- coordinate system 203-a may be used to define surfaces, planes, or volume of the calibration phantom 149-b and the coordinate system 203-b may be used to define surfaces, planes, or volume of the navigation space 119 of the transmission device 136.
- the coordinate system 203-c corresponds to the imaging device 112 and the coordinate system 203-d corresponds to EM-phantom point 150.
- each coordinate system 203 may be disposed orthogonal, or at 90 degrees, to one another. While the origin of a coordinate system 203 may be placed at any point on or near the components of the navigation system 118, for the purposes of description, the axes of the coordinate system 203 are always disposed along the same directions from figure to figure, whether the coordinate system 203 is shown or not. In some examples, reference may be made to dimensions, angles, directions, relative positions, and/or movements associated with one or more components of the imaging device 112 and/or the navigation system 118 with respect to a coordinate system 203.
- the aspects described herein with reference to the plane of the ultrasound beam or projection support implementations applied to any surface of a virtual space 155 (e.g., a plane of the virtual space 155 for cases in which the virtual space 155 is a 2D virtual space, a planar surface or non-planar surface of the virtual space 155 for cases in which the virtual space 155 is a 3D virtual space, etc.).
- the aspects described herein may be applied to an electromagnetic antenna, a navigation stylus, a pointer, or any navigated tools having a geometry and location that is defined, known, and trusted by the system 100.
- the system 100 may record all navigation data (e.g., electromagnetic data), imaging data (e.g., ultrasound data), and corresponding temporal information (e.g., temporal information 156, temporal information 166) in a multimedia file 154.
- the multimedia file 154 may be a movie file
- the system 100 may record timestamps corresponding to the navigation data and the imaging data.
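- The recording of both streams with timestamps could be organized, for example, as a simple synchronized buffer; the structure and field names below are illustrative only.

```python
from dataclasses import dataclass, field

@dataclass
class SynchronizedRecording:
    """Minimal buffer pairing navigation samples and image frames with
    timestamps, in the spirit of recording both streams alongside a
    multimedia file 154. Structure and field names are illustrative only."""
    nav_samples: list = field(default_factory=list)   # (timestamp, tracking data)
    image_frames: list = field(default_factory=list)  # (timestamp, frame)

    def add_nav(self, timestamp, tracking_data):
        self.nav_samples.append((timestamp, tracking_data))

    def add_frame(self, timestamp, frame):
        self.image_frames.append((timestamp, frame))

    def nav_nearest(self, timestamp):
        """Navigation sample whose timestamp is closest to a given frame's."""
        return min(self.nav_samples, key=lambda s: abs(s[0] - timestamp))

# Example
rec = SynchronizedRecording()
rec.add_nav(0.00, "pose A")
rec.add_nav(0.05, "pose B")
rec.add_frame(0.04, "ultrasound frame 1")
print(rec.nav_nearest(0.04))  # (0.05, 'pose B')
```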
- the system 100 may identify when the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair or a plurality of protrusion pairs. Based on the identification of when the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair or a plurality of protrusion pairs, the system 100 may verify the calibration of the imaging device 112 with the electromagnetic navigation of the navigation system 118.
- Fig. 2B illustrates a view of an imaging device 112 and the calibration phantom 149-b used in the system of Fig. 2A in accordance with aspects of the present disclosure.
- the imaging device 112 has tracking device 140-a provided thereon.
- calibration phantom 149-b includes one or more tracking devices around its perimeter or other locations such that the calibration phantom 149-b and the one or more protrusion pairs are detected.
- Imaging device 112 includes a field of view 159 (e.g., ultrasound beam or projection).
- the calibration phantom 149-b also includes one or more protrusion pairs 247. Based on the identification of when the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair or a plurality of protrusion pairs, the system 100 may verify the calibration of the imaging device 112 with the electromagnetic navigation of the navigation system 118.
- Fig. 2C illustrates a view of a calibration phantom 149-b used in the system of Fig. 2A in accordance with aspects of the present disclosure.
- the calibration phantom 149-b includes at least one protrusion pair 247 (protrusion 247-a and protrusion 247-b).
- Protrusion 247-a is provided on a first surface of the calibration phantom 149-b
- protrusion 247-b is provided on a second surface, opposite the first surface of the calibration phantom 149-b, such that the tips of the protrusions 247-a and 247-b meet.
- Fig. 3A illustrates example views 300 and 301 of an ultrasound beam 305 transmitted by the imaging device 112 when viewed from different perspective views.
- the ultrasound beam 305 is relatively thick (or wide) with respect to the Y-axis.
- the ultrasound beam 305 is relatively narrow with respect to the Z-axis.
- the thickness varies in depth, and the shape and focal point of the ultrasound beam 305 may be based on parameters (e.g., power, frequency, etc.) of the ultrasound beam 305.
- the system 100 may calibrate the virtual space 155 to the navigation space 119 based on instances in which the ultrasound beam or projection (e.g., the cross-sectional area 310 of the ultrasound beam 305) intersects the midpoint of each protrusion (protrusion 247-a and protrusion 247-b) of the protrusion pair 247. That is, for example, the system 100 may perform calibration for instances in which the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair, while factoring in values of parameters (e.g., thickness, depth, shape, focal point, etc.) of the ultrasound beam 305.
- the system 100 may calibrate the virtual space 155 to the navigation space 119 for instances in which the ultrasound beam or projection 305 intersects the midpoint of each protrusion (protrusion 247-a and protrusion 247-b) of the protrusion pair 247 in the direction along the Z axis. That is, for example, the system 100 may perform calibration for instances in which the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair, while factoring in values of parameters (e.g., thickness, depth, shape, focal point, etc.) of the ultrasound beam 305.
- FIG. 3B illustrates example views 303, 304, 306 of an imaging device 112 imaging a calibration phantom 149-b having a protrusion pair 247 provided therein with accompanying example views 313, 314, 316 of images 333, 334, 336 of the calibration phantom formed by imaging in accordance with aspects of the present disclosure.
- the projection 310 of the ultrasound beam 305 intersects the protrusions 247-a, 247-b of a protrusion pair 247 at various locations.
- Example views 313, 314 and 316 illustrate images 333, 334 and 336 of representations 323, 324 and 326 for the tips of the protrusion pair 247.
- the projection 310 of the ultrasound beam 305 intersects the protrusion 247-a at a rear portion.
- the projection 310 of the ultrasound beam 305 passes in between protrusions 247-a and 247-b.
- the projection 310 of the ultrasound beam 305 intersects the protrusion 247-a at a front portion.
- image 334 shows representation 324 of the tips of the protrusion pair 247 having the smallest diameter as compared with representation 323 and representation 326 of example image 333 and example image 336, respectively.
- a representation with the smallest diameter or narrowest shadow is a good indication that the tips of the protrusion pair 247 are in the center of the projection of the ultrasound beam 305.
- Each of the example images 333 and 336 illustrates a representation 323 and 326 of the tips of the protrusion pair 247 illustrated with a larger shadow or a larger diameter than representation 324.
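- A minimal sketch of this selection rule, with illustrative names: given a measured width of the tip representation in each candidate frame, the frame with the narrowest representation is taken as the beam-center frame.

```python
def best_beam_center_frame(tip_width_by_frame):
    """Select the frame whose imaged tip representation is narrowest.

    tip_width_by_frame -- mapping of frame identifier to the measured diameter
    (e.g., in pixels) of the protrusion-tip representation in that frame. The
    narrowest representation indicates the tips lie closest to the center of
    the ultrasound beam projection. Names are illustrative only.
    """
    return min(tip_width_by_frame, key=tip_width_by_frame.get)

# Example: the measurement corresponding to image 334 is narrowest
widths = {333: 9.5, 334: 3.2, 336: 8.1}
print(best_beam_center_frame(widths))  # 334
```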
- the location for the protrusion pair 247 as described with example view 304 and the position of the imaging device 112, the ultrasound beam 305 and/or the parameters (e.g., thickness, depth, shape, focal point, etc.) of the ultrasound beam 305 are used to verify the calibration of the imaging device 112 with the electromagnetic navigation of the navigation system 118.
- at least three sets of saved coordinates for the tips of the protrusion pair 247 are required to verify the calibration of the imaging device 112 with the electromagnetic navigation of the navigation system 118.
- Fig. 3C illustrates an example of imaging device 112 emitting an ultrasound beam 305 having a projection 310 towards the tip of a single protrusion 247 in accordance with aspects of the present disclosure, as well as an alternative placement where the protrusion 247-c is parallel to the main ultrasound beam plane.
- Fig. 3D illustrates a view of imaging device 112 emitting an ultrasound beam 305 having a projection 310 in accordance with aspects of the present disclosure.
- the projection 310 of the ultrasound beam 305 is not exactly a plane, but has a thickness which varies based on the distance from the emitter of the imaging device 112 toward the tip of the protrusion 247-a.
- this thickness can introduce errors as illustrated in association with calibrating the virtual space 155 to the navigation space 119.
- while protrusion 247-x may be represented as being in the center of the ultrasonic beam projection 310, when viewed in the XY plane, protrusion 247-x may actually be offset with respect to the center of the ultrasonic beam projection 310 as indicated.
- Fig. 3E illustrates example elements 320 (e.g., element 320-a, element 320-b) that may be implemented at protrusions 247-a and 247-b (e.g., at a tip 325 of the protrusion) in association with calibrating the virtual space 155 to the navigation space 119.
- element 320-a may be a gel (or another material with acoustic properties different from those of the medium (e.g., water) and the protrusion) that is cylinder-shaped.
- At position “c” only the end portion of the element 320-a is considered and an unobstructed view of the end portion of element 320-a is represented in image “c”.
- position “b”, which produces image “b” having the representation with the smallest diameter or narrowest shadow of the tip of the protrusion 247-a, is a good indication that the tip of the protrusion 247-a is in the center of the projection 310 of the ultrasound beam 305.
- element 320-b may be a gel that is sphere shaped. At position “c” only the end portion of the element 320-b is considered and an unobstructed view of the end portion of element 320-b is represented in image “c”. At position “b” the tip of the protrusion 247-b and the end portion of the element 320-b are considered and a view of the end portion of the element 320-b with the tip 325 of the protrusion 247-b provided in the center is represented in image “b”.
- At position “a”, the shaft of the protrusion 247-b and the end portion of the element 320-b are considered, and a view of the end portion of the element 320-b with the shaft of the protrusion 247-b provided in the center is represented in image “a”.
- position “b”, which produces image “b” having the representation with the smallest diameter or narrowest shadow of the tip of the protrusion 247-b, is a good indication that the tip of the protrusion 247-b is in the center of the projection 310 of the ultrasound beam 305.
- elements 320 function more efficiently when the protrusions 247-a and 247-b are provided perpendicular to the ultrasound beam 305.
- Fig. 3F illustrates example shapes 330 (e.g., shapes 330-a through 330-c) that may be implemented at a tip 325 of the protrusion 247, or the like.
- aspects of the present disclosure may include implementing any of the shapes 330, and the shapes 330 may support tip-image center alignment. That is, for example, each shape 330 may be symmetrical with respect to a center of the shape 330, which may support alignment of the center of the shape 330 and a center 335 of an ultrasound beam 305 emitted by the imaging device 112, an example of which is illustrated at Fig. 3G.
- shapes 330 function more efficiently when the protrusions 247 are provided parallel to the ultrasound beam 305.
- Fig. 3H illustrates example views 371, 372, 373 of a protrusion 247-a within a protrusion holder 390 in accordance with aspects of the present disclosure.
- Example view 371 is a side view of the protrusion 247-a provided within the protrusion stand 390.
- Example view 372 is a front view of the protrusion 247-a provided within the protrusion stand 390.
- Example view 373 is a top view of the protrusion 247-a provided within the protrusion stand 390.
- the protrusion stand 390 has an hour-glass shape 391 at its center portion.
- FIG. 3I illustrates example views 360, 361, 362 of an imaging device 112 imaging a protrusion 247-a with accompanying images 363, 364, 365 formed by imaging in accordance with aspects of the present disclosure.
- imaging device 112 transmits an ultrasound beam 305-a towards protrusion 247-a.
- Imaging device 112 is provided at a first location in which the ultrasound beam 305-a only covers a bottom portion of the protrusion stand 390. Imaging the protrusion stand 390 and the protrusion 247-a produces image 363 with an ultrasound beam projection 317-a including only a representation 365-a of the imaged bottom portion of the protrusion stand 390.
- imaging device 112 transmits an ultrasound beam 305-b towards protrusion 247-a.
- Imaging device 112 is provided at a second location in which the ultrasound beam 305-b covers a tip 325 of the protrusion 247-a and part of the hour-glass shape 391 of the protrusion stand 390.
- Imaging the protrusion stand 390 and the protrusion 247-a produces image 364 with an ultrasound beam projection 317- b including a representation 365-b of the imaged protrusion stand 390 and a representation 377-b of the imaged tip of the protrusion 247-a.
- imaging device 112 transmits an ultrasound beam 305-c towards protrusion 247-a.
- Imaging device 112 is provided at a third location in which the ultrasound beam 305-c covers a shaft and a tip 325 of the protrusion 247-a and another part of the hour-glass shape 391 of the protrusion stand 390.
- Imaging the protrusion stand 390 and the protrusion 247-a produces image 365 with an ultrasound beam projection 317-c including a representation 365-c of the imaged protrusion stand 390 and a representation 377-c of the imaged shaft and tip of the protrusion 247-a.
- example view 362 provides the best position of the imaging device 112 since the tip 325 of the protrusion 247-a is provided in the center 335 of the ultrasound beam 305-c. This is shown in image 365 with the representation 365-c of the imaged protrusion stand 390 having the smallest or the narrowest shadow as compared to other representations (365-a and 365-b) of the imaged stand 390. Moreover, it is the hour-glass shape 391 that causes the representation 365-c to be smallest when the beam is at the tip/beam-center position.
- Fig. 4 illustrates an example of a method 400 in accordance with aspects of the present disclosure.
- method 400 may implement aspects of a computing device 102, an imaging device 112, a robot 114, and a navigation system 118 described with reference to Figs. 1 through 3I.
- method 400 may be performed by any device (e.g., a computing device 102, an imaging device 112, a robot 114, navigation system 118, etc.) of the system 100 described herein.
- method 400 starts with a START operation at step 404 and ends with an END operation at step 456.
- Method 400 can be executed as a set of computer-executable instructions executed by a computer system (e.g., computing device 102, etc.) and encoded or stored on a computer readable medium.
- method 400 shall be explained with reference to the systems, components, modules, applications, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-3I.
- Method 400 begins with the START operation at step 404 and proceeds to step 408, where the system 100 generates a navigation space 119 as described herein. After generating a navigation space 119 at step 408, method 400 proceeds to step 412, where the system 100 generates a virtual space 155 based on images captured by the imaging device 112 as described herein. For example, the system 100 may generate the virtual space 155 based on images representing the inside of calibration phantom 149-b (e.g., water bath). After generating a virtual space 155 based on images captured by the imaging device 112 at step 412, method 400 proceeds to step 416, where the system 100 initiates a calibration process 470 in accordance with aspects of the present disclosure.
- the system 100 may initiate calibration of the coordinate system 160 associated with the virtual space 155 (and imaging device 112) with respect to the coordinate system 165 associated with the navigation space 119 (and navigation system 118).
- After initiating the calibration process 470 at step 416, method 400 proceeds to step 420, where the system 100 provides user guidance information 175 (e.g., tutorials, user prompts, corrective actions, real-time actionable feedback, etc.) described herein regarding how to move or position a device (e.g., the imaging device 112, etc.) in association with the calibration process.
- the terms “guidance information” and “calibration guidance information” may be used interchangeably herein. Aspects of the present disclosure support implementations with or without providing user guidance information 175.
- At decision step 424, the system 100 determines whether an event has been detected.
- the system 100 may determine, from an image 153 (or multimedia file 154), whether an event has occurred in which the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair 247. If no event has been detected (NO) at decision step 424, the system 100 analyzes subsequent images 153 (or multimedia files 154) until the system 100 detects an event in which the field of view 159 of the imaging device 112 intersects a midpoint of a protrusion pair 247. In some aspects of the present disclosure, the system 100 may return to step 420 and provide additional user guidance information 175 that prompts a user to position or orient the imaging device 112 to trigger such an event.
- If an event has been detected (YES) at decision step 424, method 400 proceeds to step 428, where the system 100 identifies, from the image 153 (or multimedia file 154), the set of coordinates 157 at which the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair 247.
- the system 100 may identify the temporal information 156 associated with when the protrusion pair 247 intersected the ultrasound beam of the imaging device.
- the system 100 may identify pose information 162 (in the virtual space 155) of the protrusion pair 247 that corresponds to the temporal information 156.
- After identifying, from the image 153 (or multimedia file 154), the set of coordinates 157 at which the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair 247 at step 428, method 400 proceeds to step 432, where the system 100 calibrates the coordinate system 160 with respect to the coordinate system 165 based on the set of coordinates 157 and the temporal information 156 as described herein.
- the system 100 may calibrate the coordinate system 160 with respect to the coordinate system 165 based on the set of coordinates 157, the temporal information 156, the pose information 161 (of the instrument 145 in the virtual space 155) corresponding to the temporal information 156 and pose information 162 (of the instrument 145 in the navigation space 119) corresponding to the temporal information 156.
- After calibrating the coordinate system 160 with respect to the coordinate system 165 at step 432, method 400 proceeds to decision step 436, where it is determined whether to repeat the calibration process 470. For example, at decision step 436, the system 100 may determine whether a user input requesting recalibration has been received.
- If a user input requesting recalibration has not been received (NO) at decision step 436, method 400 proceeds to decision step 440, where the system 100 determines whether a temporal duration (e.g., recalibration every X hours, every day, etc.) associated with performing recalibration has elapsed. If the temporal duration has not elapsed (NO) at decision step 440, method 400 proceeds to decision step 444, where the system 100 determines whether any loss in calibration between the imaging device 112 and the navigation system 118 (e.g., the navigation system 118 is unable to track the imaging device 112) has occurred.
- If no loss in calibration has been detected (NO) at decision step 444, method 400 proceeds to decision step 448, where the system 100 detects an event.
- the system 100 may repeat the calibration process 470, beginning at any operation (e.g., generating the virtual space 155 at step 412, initiating calibration at step 416, etc.) of the calibration process 470.
- the system 100 may return to step 412 and generate the virtual space 155, but while navigating the calibration phantom 149-a (e.g., tissue phantom) with the imaging device 112.
- the system 100 may regenerate the virtual space 155 based on images captured by the imaging device 112 as described herein, but the images may be associated with or include calibration phantom 149-a (e.g., tissue phantom).
- the system 100 may again return to step 412 in response to a (YES) decision at any of step 436 through step 444 and generate the virtual space 155, while imaging the calibration phantom 149-a (e.g., tissue phantom) with the imaging device 112.
- the system 100 may continue to provide navigation information (e.g., tracking information 167, etc.).
- the system 100 may monitor for one or more events in which the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair 247 as described herein.
- At decision step 448, the system 100 detects an event in which the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair 247 (Event Detected).
- At decision step 452, the system 100 determines, from the event detected at decision step 448, whether recalibration is to be performed.
- the system 100 may detect the amount of distortion in the navigated volume 150 (e.g., discrepancies between navigation data associated with the protrusion pair 247 and imaging data associated with the protrusion pair 247) based on the event detected at 448. Based on the amount of distortion detected by the system 100, the system 100 may return to step 428 (e.g., for recalibration) or refrain from returning to step 428 (e.g., abstain from performing recalibration).
- the system 100 may determine that the amount of distortion is greater than a threshold distortion value (YES at decision step 452), and the system 100 may return to step 428 and repeat the calibration as described with reference to step 432.
- the system 100 may determine that the amount of distortion is less than the threshold distortion value (NO at decision step 452), and the system 100 may continue to provide imaging information and/or navigation information (e.g., tracking information 167, etc.) while monitoring for any of the events described with reference to step 436 through step 448.
- the system 100 may determine at step 452 whether to perform recalibration, at any occurrence of an event detected at step 448. For example, the system 100 may perform recalibration at each occurrence of an event detected at step 448, at each nth occurrence of the event, or at each nth occurrence of the event within a temporal duration. As illustrated and described herein, the example aspects of the method 400 described herein support automatic and continuous (or semi-continuous) recalibration by the system 100.
- Fig. 5 illustrates an example of a method 500 in accordance with aspects of the present disclosure.
- method 500 may implement aspects of a computing device 102, an imaging device 112, a robot 114, and a navigation system 118 described with reference to Figs. 1 through 3I.
- method 500 may be performed by any device (e.g., a computing device 102, an imaging device 112, a robot 114, navigation system 118, etc.) of the system 100 described herein.
- method 500 starts with a START operation at step 504 and ends with an END operation at step 532.
- Method 500 can be executed as a set of computer-executable instructions executed by a computer system (e.g., computing device 102, etc.) and encoded or stored on a computer readable medium.
- method 500 shall be explained with reference to the systems, components, modules, applications, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-3I.
- Method 500 begins with the START operation at step 504 and proceeds to step 508, where the system 100 generates a navigation space 119 based on one or more tracking signals. After generating a navigation space 119 based on one or more tracking signals at step 508, method 500 proceeds to step 512, where the system 100 generates a virtual space including at least a portion of a calibration phantom based on one or more images, wherein the one or more images are generated in response to transmitting one or more imaging signals.
- the calibration phantom includes an ultrasound conductive material.
- the calibration phantom may include an ultrasound transmitting volume (e.g., water bath).
- the calibration phantom includes a tissue phantom included in a body of a subject, wherein the tissue phantom and the subject are associated with a surgical procedure.
- the virtual space corresponds to a field of view of the imaging device.
- After generating a virtual space at step 512, method 500 proceeds to step 516, where the system 100 identifies a set of coordinates in the virtual space in response to an event in which the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair.
- the navigation space and the tracked device are associated with at least one of: an optical tracking system, an acoustic tracking system, an electromagnetic tracking system, a magnetic tracking system, a radar tracking system, an inertial measurement unit (IMU) based tracking system, and a computer vision based tracking system.
- After identifying coordinates in the virtual space at step 516, method 500 proceeds to step 520, where the system 100 calibrates a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in response to the event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the set of coordinates and temporal information associated with the event.
- calibrating the first coordinate system with respect to the second coordinate system is in response to one or more occurrences of the event. In some aspects, calibrating the first coordinate system with respect to the second coordinate system is absent pausing the surgical procedure.
- calibrating the first coordinate system with respect to the second coordinate system is based on: beam thickness (e.g., ultrasound beam thickness), beam shape (e.g., ultrasound beam shape), or both of the one or more signals transmitted by the imaging device; pose information of protrusion pair in association with an intersection between the portion of the tracked device and a plane of the virtual space; and one or more properties of the portion of the tracked device.
- calibrating the first coordinate system with respect to the second coordinate system includes verifying a registration accuracy between the first coordinate system and the second coordinate system.
- After calibrating the coordinate systems at step 520, method 500 proceeds to step 524, where the system 100 outputs guidance information associated with positioning the imaging device in association with calibrating the first coordinate system with respect to the second coordinate system.
- method 500 may include detecting one or more discrepancies between first tracking data corresponding to the protrusion pair and the point of view of the imaging device in association with the navigation space and second tracking data corresponding to the protrusion pair and the point of view in association with the virtual space.
- method 500 may include generating a notification associated with the one or more discrepancies, performing one or more operations associated with compensating for the one or more discrepancies, or both. After outputting guidance information at step 524, method 500 ends with END operation at step 532.
- Method 500 (and/or one or more operations thereof) may be carried out or otherwise performed, for example, by at least one processor.
- the at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above.
- the at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118).
- a processor other than any processor described herein may also be used to execute the process flow 500.
- the at least one processor may perform operations of the process flow 500 by executing elements stored in a memory such as the memory 106.
- the elements stored in memory and executed by the processor may cause the processor to execute one or more operations of a function as shown in the process flow 500.
- One or more portions of the process flow 500 may be performed by the processor executing any of the contents of memory, such as image processing 120, a segmentation 122, a transformation 124, and/or a registration 128.
- Fig. 6 illustrates an example of a method 600 in accordance with aspects of the present disclosure.
- method 600 may implement aspects of a computing device 102, an imaging device 112, a robot 114, and a navigation system 118 described with reference to Figs. 1 through 3I.
- the operations may be performed in a different order than the order shown, or the operations may be performed in different orders or at different times. Certain operations may also be left out of method 600, or other operations may be added to the method 600.
- method 600 may be performed by any device (e.g., a computing device 102, an imaging device 112, a robot 114, navigation system 118, etc.) of the system 100 described herein.
- method 600 starts with a START operation at step 604 and ends with an END operation at step 644.
- Method 600 can be executed as a set of computer-executable instructions executed by a computer system (e.g., computing device 102, etc.) and encoded or stored on a computer readable medium.
- method 600 shall be explained with reference to the systems, components, modules, applications, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-3I.
- Method 600 begins with the START operation at step 604 and proceeds to step 608, where the system 100 detects that the imaging device 112 (e.g., ultrasound probe) is placed in the calibration phantom 149-b.
- the imaging device 112 is wiggled within the calibration phantom 149-b such that the imaging device 112 is provided at various locations within the calibration phantom 149-b.
- At decision step 612, the system 100 determines if a protrusion pair has been detected.
- one protrusion pair is provided at a known location within the calibration phantom 149-b.
- a plurality of protrusion pairs is provided at known locations within the calibration phantom 149-b.
- the distance between protrusions of the plurality of protrusion pairs is the same for each protrusion pair.
- the distance between protrusions of the plurality of protrusion pairs is different for each protrusion pair.
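- When the spacings differ per pair, the measured tip-to-tip spacing could be matched against the known spacings to identify which protrusion pair is being imaged; the following sketch uses illustrative names, units, and tolerance values.

```python
def identify_protrusion_pair(measured_gap, known_gaps, tolerance=1.0):
    """Identify which protrusion pair is imaged by matching the measured
    tip-to-tip spacing against the known per-pair spacings (same units).

    Returns the identifier of the closest match within tolerance, or None.
    Names, units, and the tolerance value are illustrative only.
    """
    best_id, best_err = None, tolerance
    for pair_id, gap in known_gaps.items():
        err = abs(gap - measured_gap)
        if err <= best_err:
            best_id, best_err = pair_id, err
    return best_id

# Example: three pairs manufactured with distinct spacings
known = {"pair_1": 4.0, "pair_2": 6.0, "pair_3": 8.0}
print(identify_protrusion_pair(6.3, known))  # 'pair_2'
```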
- If a protrusion pair has not been detected (NO) at decision step 612, method 600 returns to step 608, where the system 100 detects that the imaging device 112 is placed in the calibration phantom 149-b. If a protrusion pair has been detected (YES) at decision step 612, method 600 proceeds to decision step 616, where the system 100 detects protrusion pair tips.
- the detection of the protrusion pair tips can be enhanced by including element 320-a which is cylinder shaped or including element 320-b which is sphere shaped. Element 320-a and element 320-b function more efficiently when the protrusions 247 are provided perpendicular to the ultrasound beam 305.
- the detection of the protrusion pair tips can be enhanced by including shapes 330-a through 330-c at a tip 325 of the protrusion 247, or the like.
- the shapes 330 function more efficiently when the protrusions 247 are provided parallel to the ultrasound beam 305.
- If a protrusion pair tip has not been detected (NO) at decision step 616, method 600 returns to step 608, where the system 100 detects that the imaging device 112 is placed in the calibration phantom 149-b. If a protrusion pair tip has been detected (YES) at decision step 616, method 600 proceeds to step 620, where the system 100 retrieves calibration phantom pose data.
- the calibration phantom pose data is known.
- the calibration phantom 149-b is provided with a tracking device 140-c.
- the calibration phantom 149-b generally remains stationary. Therefore, the location of the calibration phantom 149-b with respect to the transmission device 136 is a known value which is stored in memory 106.
- At step 624, the system 100 calculates the protrusion pair tip pose.
- the protrusion pair tip pose is calculated using the known pose of the calibration phantom 149-b.
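The disclosure states only that the tip pose is calculated from the known phantom pose; one way this can be expressed, sketched here under the assumption that poses are stored as 4x4 homogeneous matrices and that the tip coordinates are known in the phantom's local coordinate system (all names are illustrative), is:

```python
# Hedged sketch: express a known protrusion-tip point in the navigation (tracker) frame.
# T_phantom_in_nav: stored/tracked pose of the calibration phantom (4x4 homogeneous matrix).
# p_tip_in_phantom: manufactured tip location in the phantom's local coordinate system (3,).
import numpy as np

def tip_in_navigation_frame(T_phantom_in_nav: np.ndarray,
                            p_tip_in_phantom: np.ndarray) -> np.ndarray:
    p_h = np.append(p_tip_in_phantom, 1.0)   # homogeneous coordinates
    return (T_phantom_in_nav @ p_h)[:3]      # 3D tip position in the navigation space
```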
- At step 628, the system 100 retrieves the protrusion pair tip coordinates in the image space of the imaging device 112.
- At step 632, the system 100 calculates and saves target-source point pairs.
- method 600 proceeds to decision step 636, where the system 100 determines if there are enough target-source point pairs saved. According to one embodiment of the present disclosure, at least three target-source point pairs are required to initiate calibration of the coordinate system 160 associated with the virtual space 155 (and the imaging device 112) with respect to the coordinate system 165 associated with the navigation space 119 (and navigation system 118). If there are not enough target-source point pairs saved, (NO) at decision step 636, method 600 returns to step 608 where the system 100 detects that the imaging device 112 is placed in the calibration phantom 149-b. If there are enough target-source point pairs saved, (YES) at decision step 636, method 600 proceeds to step 640, where the system 100 calculates the registration matrix. After the registration matrix has been calculated at step 640, method 600 ends with END operation at step 644.
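The disclosure does not prescribe a particular solver for the registration matrix at step 640; a common choice, shown here only as a hedged sketch, is a least-squares rigid fit (rotation and translation) computed by singular value decomposition over the saved target-source point pairs, with at least three pairs as noted above. Array shapes and names are illustrative assumptions.

```python
# Hedged sketch: least-squares rigid registration from saved target-source point pairs,
# e.g., image-space tip points (source) to navigation-space tip points (target).
import numpy as np

def rigid_registration(source_pts: np.ndarray, target_pts: np.ndarray) -> np.ndarray:
    """source_pts, target_pts: (N, 3) arrays with N >= 3. Returns a 4x4 registration matrix."""
    src_c, tgt_c = source_pts.mean(axis=0), target_pts.mean(axis=0)
    H = (source_pts - src_c).T @ (target_pts - tgt_c)  # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = tgt_c - R @ src_c
    return T
```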
- the present disclosure encompasses methods with fewer than all of the steps identified herein (and the corresponding description of respective process flows), as well as methods that include additional steps beyond those identified in the figures and process flows described herein.
- the present disclosure also encompasses methods that include one or more steps from one method described herein, and one or more steps from another method described herein. Any correlation described herein may be or include a registration or any other correlation.
- each of the expressions “at least one of A, B and C,” “at least one of A, B, or C,” “one or more of A, B, and C,” “one or more of A, B, or C,” “A, B, and/or C,” and “A, B, or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
- aspects of the present disclosure may take the form of an implementation that is entirely hardware, an implementation that is entirely software (including firmware, resident software, micro-code, etc.) or an implementation combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module,” or “system.” Any combination of one or more computer-readable medium(s) may be utilized.
- the computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
- a computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
- a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
- a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof.
- a computer-readable signal medium may be any computer-readable medium that is not a computer-readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
- Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including, but not limited to, wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Surgery (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- General Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Veterinary Medicine (AREA)
- Public Health (AREA)
- Physics & Mathematics (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Robotics (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
Abstract
A system generates a navigation space and a virtual space including a calibration phantom including a protrusion pair or a single protrusion based on one or more images. The system also identifies sets of coordinates in the virtual space based on an imaging device projection intersecting the protrusion pair or the single protrusion and determines an optimal set of coordinates in the virtual space based on a first event in which the imaging device projection intersects the midpoint of each protrusion of the protrusion pair or based on a second event in which the imaging device projection intersects a tip portion of the single protrusion. The system further calibrates a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in response to the first event and based on the optimal set of coordinates.
Description
SYSTEM AND METHOD FOR AUTOMATIC ULTRASOUND 3D- POINT
DETECTION AND SELECTION FOR ULTRASOUND PROBE REGISTRATION FOR
NAVIGATION
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] The present application claims the benefit of and priority to U.S. Provisional Application No. 63/607,053 filed on December 6, 2023, entitled “SYSTEM AND METHOD FOR AUTOMATIC ULTRASOUND 3D- POINT DETECTION AND SELECTION FOR ULTRASOUND PROBE REGISTRATION FOR NAVIGATION”, the entirety of which is hereby incorporated herein by reference.
FIELD OF INVENTION
[0002] The present disclosure is generally directed to navigation and ultrasound imaging and relates more particularly to calibrating an ultrasound probe for navigation.
BACKGROUND
[0003] Imaging devices and navigation systems may assist a surgeon or other medical provider in carrying out a surgical procedure. Imaging may be used by a medical provider for visual guidance in association with diagnostic and/or therapeutic procedures.
Navigation systems may be used for tracking objects (e.g., instruments, imaging devices, etc.) associated with carrying out the surgical procedure.
BRIEF SUMMARY
[0004] Example aspects of the present disclosure include:
A system including: a processor; and a memory storing instructions that, when executed by the processor, cause the processor to: generate a navigation space based on one or more tracking signals from a calibration phantom and an imaging device; generate a virtual space including at least a portion of the calibration phantom including a protrusion pair or a single protrusion based on one or more images, wherein the one or more images are generated by one or more signals transmitted by the imaging device; identify sets of coordinates in the virtual space based on an imaging device projection intersecting the protrusion pair or intersecting the single protrusion in the calibration phantom; determine an optimal set of coordinates in the virtual space based on a first event in which the imaging device projection intersects the midpoint of each protrusion of the protrusion pair or based on a second event in which the imaging device projection intersects a tip portion of the single protrusion; and calibrate a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in
response to the first event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the optimal set of coordinates and temporal information associated with the first event.
[0005] Any of the aspects herein, further including a plurality of protrusion pairs, wherein the instructions are further executable by the processor to: determine the optimal set of coordinates in the virtual space based on a third event in which the imaging device projection intersects each protrusion of the protrusion pair of more than one protrusion pair of the plurality of protrusion pairs.
[0006] Any of the aspects herein, wherein a distance between protrusions of each protrusion pair of the plurality of protrusion pairs is different.
[0007] Any of the aspects herein, wherein a diameter and a shape of the protrusions of each protrusion pair of the plurality of protrusion pairs are different.
[0008] Any of the aspects herein, wherein the instructions are further executable by the processor to calibrate the first coordinate system associated with the virtual space with respect to the second coordinate system associated with the navigation space in response to the second event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the optimal set of coordinates and temporal information associated with the second event.
[0009] Any of the aspects herein, wherein the tip portion of the single protrusion has a concave shape.
[0010] Any of the aspects herein, wherein the concave shape is symmetrical with respect to a center of the concave shape.
[0011] Any of the aspects herein, further comprising a plurality of single protrusions, wherein the plurality of single protrusions includes a cylinder or a sphere at the tip portion.
[0012] Any of the aspects herein, wherein the instructions are further executable by the processor to output guidance information associated with positioning the imaging device in association with calibrating the first coordinate system with respect to the second coordinate system.
[0013] Any of the aspects herein, wherein calibrating the first coordinate system with respect to the second coordinate system is in response to one occurrence of the first event or the second event and calibrating the first coordinate system with respect to the second coordinate system is absent pausing a surgical procedure.
[0014] Any of the aspects herein, wherein calibrating the first coordinate system with respect to the second coordinate system is based on beam thickness, beam shape and/or
signal frequency.
[0015] Any of the aspects herein, wherein the calibration phantom comprises: ultrasound conductive material; or a tissue phantom comprised in a body of a subject, wherein the tissue phantom and the subject are associated with a surgical procedure.
[0016] Any of the aspects herein, wherein the protrusion pair is provided perpendicular to the imaging device projection and the single protrusion is provided parallel to the imaging device projection.
[0017] Any of the aspects herein, wherein the instructions are further executable by the processor to: detect one or more discrepancies between first tracking data corresponding to the imaging device in association with the navigation space and second tracking data corresponding to the imaging device in association with the virtual space; and generate a notification associated with the one or more discrepancies, perform one or more operations associated with compensating for the one or more discrepancies, or both.
[0018] A system including: an imaging system including an imaging device; a tracking system including a transmission device; a calibration phantom; a processor; and a memory storing data that, when processed by the processor, cause the processor to: generate a navigation space based on one or more tracking signals; generate a virtual space including at least a portion of the calibration phantom including a protrusion pair or a single protrusion based on one or more images, wherein the one or more images are generated by one or more signals transmitted by the imaging device; identify sets of coordinates in the virtual space based on an imaging device projection intersecting the protrusion pair or intersecting the single protrusion in the calibration phantom; determine an optimal set of coordinates in the virtual space based on a first event in which the imaging device projection intersects the midpoint of each protrusion of the protrusion pair or based on a second event in which the imaging device projection intersects a tip portion of the single protrusion; and calibrate a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in response to the first event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the optimal set of coordinates and temporal information associated with the first event.
[0019] Any of the aspects herein, further including a plurality of protrusion pairs, wherein the data is further executable by the processor to: determine the optimal set of coordinates in the virtual space based on a third event in which the imaging device projection intersects each protrusion of the protrusion pair of more than one protrusion
pair of the plurality of protrusion pairs.
[0020] Any of the aspects herein, wherein the instructions are further executable by the processor to calibrate the first coordinate system associated with the virtual space with respect to the second coordinate system associated with the navigation space in response to the second event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the optimal set of coordinates and temporal information associated with the second event.
[0021] A method including: generating a navigation space based on one or more tracking signals from a calibration phantom and an imaging device; generating a virtual space including at least a portion of a calibration phantom including a protrusion pair or a single protrusion based on one or more images, wherein the one or more images are generated by one or more signals transmitted by the imaging device; identifying sets of coordinates in the virtual space based on an imaging device projection intersecting the protrusion pair or intersecting the single protrusion in the calibration phantom; determining an optimal set of coordinates in the virtual space based on a first event in which the imaging device projection intersects the midpoint of each protrusion of the protrusion pair or based on a second event in which the imaging device projection intersects a tip portion of the single protrusion; and calibrating a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in response to the first event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the optimal set of coordinates and temporal information associated with the first event.
[0022] Any of the aspects herein, further including: providing a plurality of protrusion pairs; and determining the optimal set of coordinates in the virtual space based on a third event in which the imaging device projection intersects each protrusion of the protrusion pair of more than one protrusion pair of the plurality of protrusion pairs.
[0023] Any of the aspects herein, further comprising calibrating the first coordinate system associated with the virtual space with respect to the second coordinate system associated with the navigation space in response to the second event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the optimal set of coordinates and temporal information associated with the second event.
[0024] Any aspect in combination with any one or more other aspects.
[0025] Any one or more of the features disclosed herein.
[0026] Any one or more of the features as substantially disclosed herein.
[0027] Any one or more of the features as substantially disclosed herein in combination with any one or more other features as substantially disclosed herein.
[0028] Any one of the aspects/features/implementations in combination with any one or more other aspects/features/implementations.
[0029] Use of any one or more of the aspects or features as disclosed herein.
[0030] It is to be appreciated that any feature described herein can be claimed in combination with any other feature(s) as described herein, regardless of whether the features come from the same described implementation.
[0031] The details of one or more aspects of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the techniques described in this disclosure will be apparent from the description and drawings, and from the claims.
[0032] The preceding is a simplified summary of the disclosure to provide an understanding of some aspects of the disclosure. This summary is neither an extensive nor exhaustive overview of the disclosure and its various aspects, implementations, and configurations. It is intended neither to identify key or critical elements of the disclosure nor to delineate the scope of the disclosure but to present selected concepts of the disclosure in a simplified form as an introduction to the more detailed description presented below. As will be appreciated, other aspects, implementations, and configurations of the disclosure are possible utilizing, alone or in combination, one or more of the features set forth above or described in detail below.
[0033] Numerous additional features and advantages of the present disclosure will become apparent to those skilled in the art upon consideration of the implementation descriptions provided hereinbelow.
BRIEF DESCRIPTION OF SEVERAL VIEWS OF THE DRAWINGS
[0034] The accompanying drawings are incorporated into and form a part of the specification to illustrate several examples of the present disclosure. These drawings, together with the description, explain the principles of the present disclosure. The drawings simply illustrate preferred and alternative examples of how the disclosure can be made and used and are not to be construed as limiting the disclosure to only the illustrated and described examples. Further features and advantages will become apparent from the following, more detailed, description of the various aspects, implementations, and configurations of the disclosure, as illustrated by the drawings referenced below.
[0035] Fig. 1A illustrates an example of a system in accordance with aspects of the
present disclosure.
[0036] Fig. 1B illustrates an example of a system in accordance with aspects of the present disclosure.
[0037] Fig. 1C illustrates an example of a system in accordance with aspects of the present disclosure.
[0038] Fig. 2A illustrates an example implementation of a system in accordance with aspects of the present disclosure.
[0039] Fig. 2B illustrates a view of an imaging device and a calibration phantom used in the system of Fig. 2A in accordance with aspects of the present disclosure.
[0040] Fig. 2C illustrates a view of a calibration phantom used in the system of Fig. 2A in accordance with aspects of the present disclosure.
[0041] Fig. 3A illustrates example views of an ultrasound beam in accordance with aspects of the present disclosure.
[0042] Fig. 3B illustrates example views of an imaging device imaging a calibration phantom having a protrusion pair provided therein with accompanying images of the calibration phantom formed by imaging in accordance with aspects of the present disclosure.
[0043] Fig. 3C illustrates an example view of an ultrasound beam projection in accordance with aspects of the present disclosure.
[0044] Fig. 3D illustrates another example view of an ultrasound beam projection in accordance with aspects of the present disclosure.
[0045] Fig. 3E illustrates example implementations for a protrusion for a protrusion pair in accordance with aspects of the present disclosure.
[0046] Fig. 3F illustrates example implementations for a tip of a protrusion for a protrusion pair in accordance with aspects of the present disclosure.
[0047] Fig. 3G illustrates an example implementation for a tip of a protrusion for a protrusion pair imaged with an ultrasound beam in accordance with aspects of the present disclosure.
[0048] Fig. 3H illustrates example views of a protrusion within a protrusion holder in accordance with aspects of the present disclosure.
[0049] Fig. 3I illustrates example views of an imaging device imaging a protrusion within a protrusion holder with accompanying images formed by imaging in accordance with aspects of the present disclosure.
[0050] Fig. 3J illustrates an example view of an imaging device imaging a calibration phantom
having a protrusion provided therein with accompanying images of the calibration phantom formed by imaging in accordance with aspects of the present disclosure.
[0051] Fig. 4 illustrates an example method for calibrating coordinate systems in accordance with aspects of the present disclosure.
[0052] Fig. 5 illustrates an example method for calibrating coordinate systems in accordance with aspects of the present disclosure.
[0053] Fig. 6 illustrates an example method for calibrating coordinate systems in accordance with aspects of the present disclosure.
DETAILED DESCRIPTION
[0054] It should be understood that various aspects disclosed herein may be combined in different combinations than the combinations specifically presented in the description and accompanying drawings. It should also be understood that, depending on the example or implementation, certain acts or events of any of the processes or methods described herein may be performed in a different sequence, and/or may be added, merged, or left out altogether (e.g., all described acts or events may not be necessary to carry out the disclosed techniques according to different implementations of the present disclosure). In addition, while certain aspects of this disclosure are described as being performed by a single module or unit for purposes of clarity, it should be understood that the techniques of this disclosure may be performed by a combination of units or modules associated with, for example, a computing device and/or a medical device.
[0055] In one or more examples, the described methods, processes, and techniques may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored as one or more instructions or code on a computer-readable medium and executed by a hardware-based processing unit. Alternatively or additionally, functions may be implemented using machine learning models, neural networks, artificial neural networks, or combinations thereof (alone or in combination with instructions). Computer-readable media may include non-transitory computer-readable media, which corresponds to a tangible medium such as data storage media (e.g., RAM, ROM, EEPROM, flash memory, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer).
[0056] Instructions may be executed by one or more processors, such as one or more
digital signal processors (DSPs), general purpose microprocessors (e.g., Intel Core i3, i5, i7, or i9 processors; Intel Celeron processors; Intel Xeon processors; Intel Pentium processors; AMD Ryzen processors; AMD Athlon processors; AMD Phenom processors; Apple A10 or A10X Fusion processors; Apple A11, A12, A12X, A12Z, or A13 Bionic processors; or any other general purpose microprocessors), graphics processing units (e.g., Nvidia GeForce RTX 2000-series processors, Nvidia GeForce RTX 3000-series processors, AMD Radeon RX 5000-series processors, AMD Radeon RX 6000-series processors, or any other graphics processing units), application specific integrated circuits (ASICs), field programmable logic arrays (FPGAs), or other equivalent integrated or discrete logic circuitry. Accordingly, the term “processor” as used herein may refer to any of the foregoing structure or any other physical structure suitable for implementation of the described techniques. Also, the techniques could be fully implemented in one or more circuits or logic elements.
[0057] Before any implementations of the disclosure are explained in detail, it is to be understood that the disclosure is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the drawings. The disclosure is capable of other implementations and of being practiced or of being carried out in various ways. Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising,” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Further, the present disclosure may use examples to illustrate one or more aspects thereof. Unless explicitly stated otherwise, the use or listing of one or more examples (which may be denoted by “for example,” “by way of example,” “e.g.,” “such as,” or similar language) is not intended to and does not limit the scope of the present disclosure.
[0058] The terms proximal and distal are used in this disclosure with their conventional medical meanings, proximal being closer to the operator or user of the system, and further from the region of surgical interest in or on the patient, and distal being closer to the region of surgical interest in or on the patient, and further from the operator or user of the system.
[0059] In some systems, an ultrasound probe may be calibrated/registered relative to a navigation means (e.g., a tracked sensor, etc.) in association with navigated image acquisition. For example, some systems may establish a transformation matrix that maps
the six-dimensional (6D) pose (e.g., position and orientation information) of the tracked sensor to the 6D pose of an ultrasound probe. Some systems may map the 6D pose of the tracked sensor to an image generated by the ultrasound probe or to the ultrasound beam of the ultrasound probe.
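As a hedged illustration of the chained mapping described above, a pixel in the ultrasound image can be scaled to physical units, mapped into the tracked-sensor frame by the probe/image calibration matrix, and then into the navigation frame by the live sensor pose; the matrix names and pixel-spacing values below are assumptions, not quantities defined in the present disclosure.

```python
# Hedged sketch: map an ultrasound image pixel into the navigation (tracking) space.
# T_image_in_sensor: calibration matrix from image plane to tracked sensor (4x4).
# T_sensor_in_nav: live pose of the tracked sensor in the navigation space (4x4).
import numpy as np

def pixel_to_navigation(u: float, v: float, spacing_mm, T_image_in_sensor, T_sensor_in_nav):
    p_image = np.array([u * spacing_mm[0], v * spacing_mm[1], 0.0, 1.0])  # point on image plane
    return (T_sensor_in_nav @ T_image_in_sensor @ p_image)[:3]
```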
[0060] In some cases, some calibration methods may be tedious, time consuming, and error prone. In some other cases, some calibration phantoms utilized for calibrating the ultrasound probe to the navigation system are costly and are prone to decay with time (e.g., due to the degradation of hydrogels implemented in some calibration phantoms). Moreover, wires provided within the calibration phantom can only be detected with ultrasound from very specific angles, making it difficult to acquire sufficient independent images to use for calibration. Using such calibration phantoms requires multiple images taken from multiple sides of the calibration phantom and at very specific angles in order to detect the wires. This increases the complexity and the amount of time needed to perform the calibration and introduces sources of error.
[0061] Instances may occur in which a surgical team is unaware that the ultrasound probe has lost calibration with the navigation system. In some cases, even if the surgical team is aware of the loss in calibration, the team may be unwilling to recalibrate the ultrasound probe (e.g., due to lack of time or resources). Undetected or unaddressed loss in calibration during a medical procedure (e.g., due to deformation, tool drop/hit, etc.) may result in surgical errors. In some other cases, metal and other materials present in the environment may cause distortion to an electromagnetic field generated by the navigation system in association with tracking an object, and such distortion may result in surgical errors.
[0062] In accordance with aspects of the present disclosure, systems and techniques described herein may support dynamic initial calibration and dynamic recalibration of ultrasound probes (also referred to herein as ultrasonic probes) for navigation. The systems and techniques may incorporate an ultrasound probe connected to a main application/navigation system (also referred to herein as a navigated surgery system). The systems and techniques may include electromagnetic navigation of the ultrasound probe using trackers/sensors coupled to the ultrasound probe and an emitter capable of emitting electromagnetic signals. In some aspects of the present disclosure, the systems and techniques may include an electromagnetic tracked system (e.g., a calibration phantom). [0063] In some examples, the calibration phantom may be a phantom with a configuration of a plurality of paired, opposed, pointy protrusions (e.g., peg pairs or
protrusion pairs) submerged in a water bath arranged at known locations within the calibration phantom. According to one embodiment of the present disclosure, the protrusion pairs have a cylindrical shape. Moreover, the protrusion pairs are arranged in a geometric pattern. The tip of each protrusion has coordinates that are very well established based on a reference point (e.g., the calibration phantom’s Cartesian origin). In some other example embodiments of the present disclosure, the calibration phantom may be a water bath or a tissue phantom, but is not limited thereto. The example calibration phantoms described herein are stable, inexpensive compared to some other calibration phantoms, and electromagnetic friendly. For example, the calibration phantom may be free of materials that may interfere with electromagnetic navigation and tracking. In some example implementations, the calibration phantom may be a gel (e.g., hydrogel). Moreover, the calibration phantom can be suitable for various types of navigation and tracking, including optical, acoustic, magnetic, radar, inertial, and computer vision-based tracking.
[0064] Examples of the techniques described herein may include moving or positioning a tracked device (e.g., the ultrasound probe) adjacent to or on top of the (also tracked) calibration phantom and capturing multiple ultrasound images within the calibration phantom of the ultrasound beam of the ultrasound probe intersecting with one or a plurality of protrusion pairs. For each iteration of the tracked device moving within the calibration phantom, the captured ultrasound images of the protrusion pairs intersecting with the ultrasound beam are identified. After enough samples have been taken to meet a predetermined threshold, a transformation matrix of the tracker-probe (or tracker-image) is calculated. The techniques may include recording a video file of the ultrasound imaging concurrently with the tracking data and processing the video file (e.g., using a software script). Based on the processing of the video file, the techniques may include identifying a temporal instance at which a portion (e.g., tip, or gap between two opposed protrusions’ tips, etc.) of the protrusion pair enters the ultrasound view. The systems and techniques may calibrate the ultrasound view with respect to the tracking space.
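Because the video file and the tracking data are recorded concurrently, each identified image event must be associated with the probe pose at the same instant. A minimal sketch of one way to do this, assuming timestamped frames and tracking samples (field names and units are illustrative assumptions), is nearest-timestamp matching:

```python
# Hedged sketch: pair an ultrasound frame timestamp with the nearest tracking sample so
# that an image event (e.g., a protrusion tip entering the view) can be associated with
# the probe pose at that instant. Timestamp units and array layouts are assumptions.
import numpy as np

def nearest_tracking_sample(frame_ts: float, track_ts: np.ndarray, track_poses: np.ndarray):
    """track_ts: (M,) sorted timestamps; track_poses: (M, 4, 4) sensor poses."""
    i = int(np.searchsorted(track_ts, frame_ts))
    i = min(max(i, 1), len(track_ts) - 1)
    j = i - 1 if abs(track_ts[i - 1] - frame_ts) <= abs(track_ts[i] - frame_ts) else i
    return track_poses[j], float(track_ts[j] - frame_ts)  # matched pose and time residual
```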
[0065] Examples of the tracked device and the tracking space include an electromagnetic tracked device and an electromagnetic tracking space but are not limited thereto. Aspects of the present disclosure support any type of tracked devices (e.g., sensors) and tracking spaces that may be implemented by a navigation system.
[0066] According to an embodiment of the present disclosure, the transformation matrix is tested for validation. If the results are not fully satisfactory, the process is continued
or repeated based on analytic results and/or user preferences. According to embodiments of the present disclosure, the process can be repeated by changing the ultrasound settings (e.g., the ultrasound frequency, the ultrasound depth, etc.) to optimize calibration and ultrasound probe characterization.
[0067] According to embodiments of the present disclosure, ultrasound beams can be characterized (e.g., addressing axial and lateral resolution as well as side-thickness or elevation resolution). According to one embodiment of the present disclosure, protrusion pair distances may be varied. According to one embodiment of the present disclosure, protrusion pair distances are used to characterize ultrasound beam thickness. According to another embodiment of the present disclosure, protrusion pair distances at different depths provide for a more complete and accurate calibration and ultrasound beam shape characterization.
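The disclosure does not detail how the varied pair distances translate into a thickness estimate; one plausible reading, sketched below purely as an assumption, is that a pair whose tip-to-tip gap is smaller than the elevation (slice) thickness shows both opposed tips in a single frame, so the thickness can be bounded by the largest gap for which this still occurs.

```python
# Hedged sketch: bound the elevation (slice) thickness of the ultrasound beam from
# protrusion pairs with known tip-to-tip gaps. Inputs are illustrative assumptions.
def estimate_beam_thickness(pair_gaps_mm, both_tips_visible):
    """pair_gaps_mm: known gaps; both_tips_visible: parallel booleans from the images."""
    visible = [gap for gap, seen in zip(pair_gaps_mm, both_tips_visible) if seen]
    return max(visible) if visible else None  # approximate lower bound on thickness
```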
[0068] According to embodiments of the present disclosure, the calibration phantom can be used with fluids other than water to improve image quality due to variations in speed of sound for the different fluids or media. According to embodiments of the present disclosure, different phantom materials can be used. According to embodiments of the present disclosure, different protrusion pair geometry, shapes, angles, etc. can be used without departing from the spirit and scope of the present disclosure.
[0069] The systems and techniques described herein may support autonomous or semi-autonomous calibration, calibration verification with reduced calibration time compared to some other calibration techniques, and providing or outputting guidance information (e.g., tutorials, user prompts, etc.) for users on how to move or position a device (e.g., ultrasound imaging device, electromagnetic tracked device, etc.) in association with calibration. For example, the systems and techniques described herein support performing multiple calibrations (e.g., an initial calibration using a water bath, one or more subsequent calibrations using a tissue phantom, etc.), in which a system may perform the calibrations autonomously (or semi-autonomously). The systems and techniques may perform the calibrations continuously (or semi-continuously) and/or in response to trigger criteria, aspects of which are described herein. It is to be understood that the techniques described herein with respect to calibration may be applied to recalibration. The terms “calibration,” “recalibration,” “calibration verification,” and “reregistration” may be used interchangeably herein.
[0070] Techniques described herein may be implemented in hardware, software, firmware, or any combination thereof that may automatically detect instrument landmarks
on ultrasound images during a medical procedure. The techniques may include detecting landmarks of an instrument (e.g., tip of a needle during placement, distinctive features of navigated catheters, tip of a registration stylus, etc.) during the medical procedure and, using the detected instrument landmarks, automatically calibrating (or adjusting the calibration of) the ultrasound imaging device to the navigation system. The ultrasound imaging device may be, for example, an ultrasound probe.
[0071] Aspects of the automatic calibration techniques may provide time savings for the surgical team, an improved user experience, and increased accuracy over longer portions of medical procedures. Other aspects of the calibration techniques provide cost savings through the use of, as a calibration phantom, an empty container (e.g., an empty box including an electromagnetic friendly material) with a plurality of protrusion pairs. The calibration phantom may be filled with water for the calibration procedure, thereby resulting in cost savings compared to other materials (e.g., gels, silicones, etc.). In some additional aspects of the present disclosure, as water does not decay with time (e.g., compared to gels and silicones), using water in the calibration phantom may provide increased durability.
[0072] The electromagnetic tracking and calibration solutions described herein support directly using electromagnetic tools implemented in some existing medical procedures. In some aspects of the present disclosure, direct use of such existing electromagnetic tools may provide increased accuracy due to accurate tracking of electromagnetic tools by some navigation systems.
[0073] In some examples, the calibration techniques described herein may be implemented using actual tissue of a subject as a calibration phantom. For example, the calibration techniques described herein may be implemented during a medical procedure associated with the actual tissue. In an example, if medical personnel are inserting a device (e.g., an ablation antenna, cardiac catheter, etc.) using ultrasound guidance provided by an ultrasound imaging device, the device will be visible in the ultrasound view. The calibration techniques and calibration software described herein may include using (e.g., automatically, or in response to a user request, etc.) the ultrasound images and corresponding electromagnetic tracking information to recalibrate the registration between the ultrasound space and the electromagnetic navigation space (also referred to herein as recalibrating the ultrasound tracking registration), without interrupting the medical procedure.
[0074] The systems and techniques described herein support recalibrating the ultrasound
imaging system in the background based on automatic detection of target objects (e.g., instruments, tools, etc.) in the ultrasound images using AI/machine learning computer vision algorithms, random forest classifiers, and object detection. The systems and techniques support automatic registration which may be implemented continuously, based on each event in which an ultrasound beam intersecting the midpoint of each protrusion of a protrusion pair is detected in the ultrasound view, and/or periodically (e.g., based on a temporal trigger).
[0075] The systems and techniques support automatic registration in response to other trigger criteria (e.g., in response to detection of a target instrument in an ultrasound image) at any point during a medical procedure. In an example, the systems and techniques may include continuously verifying the registration accuracy between the ultrasound imaging system and the navigation system anytime the target instrument (e.g., surgical instrument, electromagnetic pointer, etc.) is detected in the ultrasound imaging. The systems and techniques support alerting the user and/or taking corrective actions in response to registration discrepancies. In an example, after outputting an alert (e.g., an audible alert, a visible alert, a haptic alert, etc.) to the user, the system may provide the user with a list of corrective actions for improving calibration. Non-limiting examples of corrective actions may include real-time actionable feedback for users to move the target instrument in association with registration.
[0076] In some other aspects of the present disclosure, the systems and techniques may support dynamically and automatically detecting distortion in the navigated volume due to discrepancies between expected navigation and imaging data. In an example, the discrepancies may be between pose information of a tracked object as indicated by the navigation system and pose information of the tracked object as indicated by the imaging data. The systems described herein may support techniques for alerting (e.g., providing a notification to) a user of the discrepancies and compensating for the discrepancies. The systems and techniques may include calibrating the navigation data to the imaging data (e.g., calibrating a navigation space to an ultrasound space) while compensating for the discrepancies.
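As a hedged sketch of the discrepancy check described above, the instrument-tip position predicted by navigation tracking can be compared with the position detected in the ultrasound image (after mapping it into the same coordinate system), and the registration flagged when the residual exceeds a tolerance; the tolerance value and notification mechanism below are illustrative assumptions only.

```python
# Hedged sketch: flag a registration discrepancy between navigation data and imaging data.
import numpy as np

def check_registration_discrepancy(p_tip_nav: np.ndarray,
                                   p_tip_image_in_nav: np.ndarray,
                                   tolerance_mm: float = 2.0) -> bool:
    residual = float(np.linalg.norm(p_tip_nav - p_tip_image_in_nav))
    if residual > tolerance_mm:
        print(f"Registration discrepancy of {residual:.1f} mm exceeds {tolerance_mm} mm")
        return True   # caller may alert the user and/or trigger recalibration
    return False
```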
[0077] Aspects of the present disclosure support integration of a calibration phantom (e.g., water bath, hydrogel phantom, etc.) into the structure of a patient tracker. The integration of the calibration phantom may support user recalibration of a navigated ultrasound probe before or during a medical procedure. According to example aspects of the recalibration techniques described herein, the temporal duration associated with
recalibration is reduced compared to some other recalibration techniques.
[0078] Implementations of the present disclosure provide technical solutions to one or more of the problems associated with other navigation systems and calibration techniques. For example, the systems and techniques described herein provide time savings, improved user experience, cost savings, and increased accuracy in comparison to other registration and calibration techniques. The systems and techniques described herein support continuous registration during a surgical procedure and continuous registration verification, in which registration and registration verification may be autonomous or semi-autonomous.
[0079] Aspects of the systems and techniques described herein support time efficient and cost-effective utilization of ultrasound images during surgery to navigate and display to medical personnel the locations of surgical devices (e.g., instruments, surgical tools, robotic end effectors, etc.) with respect to the patient anatomy. The systems and techniques described herein provide a reliable accuracy of the calibration between imaging devices (e.g., an ultrasound image probe, other imaging probes, etc.) and a navigation system, and the reliable accuracy may support accurate navigation of images (e.g., ultrasound images, etc.) that are generated based on data captured by the imaging devices.
[0080] In some aspects of the present disclosure, different imaging probes may differ based on manufacturer, configuration, probe type, and the like, and such imaging probes may require new calibration and can lose calibration during a medical procedure. Aspects of the calibration techniques described herein are relatively time efficient, cost effective, user friendly, and provide increased accuracy compared to other techniques for calibrating or recalibrating imaging probes. The time efficiency, cost effectiveness, user friendliness, and increased accuracy supported by the systems and techniques described herein may provide improved confidence for a surgeon in a navigated ultrasound space and support a reduction in surgical errors. In some alternative and/or additional aspects, the registration and calibration techniques described herein may be implemented autonomously (e.g., without input from medical personnel) or semi-autonomously (e.g., with partial input from medical personnel).
[0081] Aspects of the present disclosure relate to navigated and robotic surgery and to any type of surgery that may be associated with intra-surgical ultrasound imaging. Aspects of the present disclosure support implementing any of the techniques described herein to any medical procedure (e.g., cranial, spinal, thoracic, abdominal, cardiac, ablation, laparoscopic, minimally invasive surgery, robotic surgery, etc.) associated with the use of
intra-surgical ultrasound imaging. In some other aspects, the systems and techniques described herein may be implemented in association with initiatives related to data analytics, artificial intelligence, and machine learning, for example, with respect to data analytic scenarios for procedure and device optimization.
[0082] In some cases, the techniques described herein may be implemented as a standalone application that uses a calibration phantom (e.g., a water bath, etc.) and an imaging system (e.g., ultrasound imaging, optical, electromagnetic, or other type of tracking, 3D rendering software, and calibration software) or as an application integrated with an imaging system or navigation system. The examples described herein with reference to the following figures may support multiple types, geometries, configurations, and sizes of calibration phantoms other than the examples illustrated and described herein.
[0083] Fig. 1A illustrates an example of a system 100 that supports aspects of the present disclosure.
[0084] The system 100 includes a computing device 102, one or more imaging devices 112, a robot 114, a navigation system 118, a database 130, and/or a cloud network 134 (or other network). Systems according to other implementations of the present disclosure may include more or fewer components than the system 100. For example, the system 100 may omit and/or include additional instances of one or more components of the computing device 102, the imaging device(s) 112, the robot 114, navigation system 118, the database 130, and/or the cloud network 134. In an example, the system 100 may omit any instance of the computing device 102, the imaging device(s) 112, the robot 114, navigation system 118, the database 130, and/or the cloud network 134. The system 100 may support the implementation of one or more other aspects of one or more of the methods disclosed herein.
[0085] The computing device 102 includes a processor 104, a memory 106, a communication interface 108, and a user interface 110. Computing devices according to other implementations of the present disclosure may include more or fewer components than the computing device 102. The computing device 102 may be, for example, a control device including electronic circuitry associated with controlling any components of the system 100.
[0086] The processor 104 of the computing device 102 may be any processor described herein or any similar processor. The processor 104 may be configured to execute instructions stored in the memory 106, which instructions may cause the processor 104 to carry out one or more computing steps utilizing or based on data received from the
imaging devices 112, the robot 114, the navigation system 118, the database 130, and/or the cloud network 134.
[0087] The memory 106 may be or include RAM, DRAM, SDRAM, other solid-state memory, any memory described herein, or any other tangible, non-transitory memory for storing computer-readable data and/or instructions. The memory 106 may store information or data associated with completing, for example, any step of the process flow 500 described herein, or of any other methods. The memory 106 may store, for example, instructions and/or machine learning models that support one or more functions of the imaging devices 112, the robot 114, and the navigation system 118. For instance, the memory 106 may store content (e.g., instructions and/or machine learning models) that, when executed by the processor 104, enable image processing 120, segmentation 122, transformation 124, and/or registration 128. Such content, if provided as an instruction, may, in some implementations, be organized into one or more applications, modules, packages, layers, or engines.
[0088] Alternatively or additionally, the memory 106 may store other types of content or data (e.g., machine learning models, artificial neural networks, deep neural networks, etc.) that can be processed by the processor 104 to carry out the various methods and features described herein. Thus, although various contents of memory 106 may be described as instructions, it should be appreciated that functionality described herein can be achieved through use of instructions, algorithms, and/or machine learning models. The data, algorithms, and/or instructions may cause the processor 104 to manipulate data stored in the memory 106 and/or received from or via the imaging devices 112, the robot 114, the navigation system 118, the database 130, and/or the cloud network 134.
[0089] The computing device 102 may also include a communication interface 108. The communication interface 108 may be used for receiving data or other information from an external source (e.g., the imaging devices 112, the robot 114, the navigation system 118, the database 130, the cloud network 134, and/or any other system or component separate from the system 100), and/or for transmitting instructions, data (e.g., image data, tracking data, navigation data, calibration data, registration data, etc.), or other information to an external system or device (e.g., another computing device 102, the imaging devices 112, the robot 114, the navigation system 118, the database 130, the cloud network 134, and/or any other system or component not part of the system 100). The communication interface 108 may include one or more wired interfaces (e.g., a USB port, an Ethernet port, a Firewire port) and/or one or more wireless transceivers or interfaces (configured, for
example, to transmit and/or receive information via one or more wireless communication protocols such as 802.11a/b/g/n, Bluetooth, NFC, ZigBee, and so forth). In some implementations, the communication interface 108 may support communication between the device 102 and one or more other processors 104 or computing devices 102, whether to reduce the time needed to accomplish a computing-intensive task or for any other reason.
[0090] The computing device 102 may also include one or more user interfaces 110. The user interface 110 may be or include a keyboard, mouse, trackball, monitor, television, screen, touchscreen, and/or any other device for receiving information from a user and/or for providing information to a user. The user interface 110 may be used, for example, to receive a user selection or other user input regarding any step of any method described herein. Notwithstanding the foregoing, any required input for any step of any method described herein may be generated automatically by the system 100 (e.g., by the processor 104 or another component of the system 100) or received by the system 100 from a source external to the system 100. In some implementations, the user interface 110 may support user modification (e.g., by a surgeon, medical personnel, a patient, etc.) of instructions to be executed by the processor 104 according to one or more implementations of the present disclosure, and/or to user modification or adjustment of a setting of other information displayed on the user interface 110 or corresponding thereto.
[0091] In some implementations, the computing device 102 may utilize a user interface 110 that is housed separately from one or more remaining components of the computing device 102. In some implementations, the user interface 110 may be located proximate one or more other components of the computing device 102, while in other implementations, the user interface 110 may be located remotely from one or more other components of the computing device 102.
[0092] The imaging device 112 may be operable to image anatomical feature(s) (e.g., a bone, veins, tissue, etc.) and/or other aspects of patient anatomy to yield image data (e.g., image data depicting or corresponding to a bone, veins, tissue, etc.). “Image data” as used herein refers to the data generated or captured by an imaging device 112, including in a machine-readable form, a graphical/visual form, and in any other form. In various examples, the image data may include data corresponding to an anatomical feature of a patient, or to a portion thereof. The image data may be or include a preoperative image, an intraoperative image, a postoperative image, or an image taken independently of any surgical procedure. In some implementations, a first imaging device 112 may be used to
obtain first image data (e.g., a first image) at a first time, and a second imaging device 112 may be used to obtain second image data (e.g., a second image) at a second time after the first time.
[0093] The imaging device 112 may be capable of taking a 2D image or a 3D image to yield the image data. The imaging device 112 may be or include, for example, an ultrasound scanner (which may include, for example, a physically separate transducer and receiver, or a single ultrasound transceiver), an O-arm, a C-arm, a G-arm, or any other device utilizing X-ray-based imaging (e.g., a fluoroscope, a CT scanner, or other X-ray machine), a magnetic resonance imaging (MRI) scanner, an optical coherence tomography (OCT) scanner, an endoscope, a microscope, an optical camera, a thermographic camera (e.g., an infrared camera), a radar system (which may include, for example, a transmitter, a receiver, a processor, and one or more antennae), or any other imaging device 112 suitable for obtaining images of an anatomical feature of a patient. The imaging device 112 may be contained entirely within a single housing, or may include a transmitter/emitter and a receiver/detector that are in separate housings or are otherwise physically separated.
[0094] In some implementations, the imaging device 112 may include more than one imaging device 112. For example, a first imaging device may provide first image data and/or a first image, and a second imaging device may provide second image data and/or a second image. In still other implementations, the same imaging device may be used to provide both the first image data and the second image data, and/or any other image data described herein. The imaging device 112 may be operable to generate a stream of image data. For example, the imaging device 112 may be configured to operate with an open shutter, or with a shutter that continuously alternates between open and shut so as to capture successive images. For purposes of the present disclosure, unless specified otherwise, image data may be considered to be continuous and/or provided as an image data stream if the image data represents two or more frames per second.
[0095] The robot 114 may be any surgical robot or surgical robotic system. The robot 114 may be or include, for example, the Mazor X™ Stealth Edition robotic guidance system. The robot 114 may be configured to position the imaging device 112 at one or more precise position(s) and orientation(s), and/or to return the imaging device 112 to the same position(s) and orientation(s) at a later point in time. The robot 114 may additionally or alternatively be configured to manipulate a surgical tool (whether based on guidance from the navigation system 118 or not) to accomplish or to assist with a surgical task. In some implementations, the robot 114 may be configured to hold and/or manipulate an
anatomical element during or in connection with a surgical procedure.
[0096] The robot 114 may include one or more robotic arms 116. In some implementations, the robotic arm 116 may include a first robotic arm and a second robotic arm, though the robot 114 may include more than two robotic arms. In some implementations, one or more of the robotic arms 116 may be used to hold and/or maneuver the imaging device 112. In implementations where the imaging device 112 includes two or more physically separate components (e.g., a transmitter and receiver), one robotic arm 116 may hold one such component, and another robotic arm 116 may hold another such component. Each robotic arm 116 may be positionable independently of the other robotic arm. The robotic arms 116 may be controlled in a single, shared coordinate space, or in separate coordinate spaces.
[0097] The robot 114, together with the robotic arm 116, may have, for example, one, two, three, four, five, six, seven, or more degrees of freedom. Further, the robotic arm 116 may be positioned or positionable in any pose, plane, and/or focal point. The pose includes a position and an orientation. As a result, an imaging device 112, surgical tool, or other object held by the robot 114 (or, more specifically, by the robotic arm 116) may be precisely positionable in one or more needed and specific positions and orientations.
[0098] The robotic arm(s) 116 may include one or more sensors that enable the processor 104 (or a processor of the robot 114) to determine a precise pose in space of the robotic arm (as well as any object or element held by or secured to the robotic arm).
[0099] In some implementations, reference markers (e.g., navigation markers) may be placed on the robot 114 (including, e.g., on the robotic arm 116), the imaging device 112, or any other object in the surgical space. The reference markers may be tracked by the navigation system 118, and the results of the tracking may be used by the robot 114 and/or by an operator of the system 100 or any component thereof. In some implementations, the navigation system 118 can be used to track other components (e.g., imaging device 112, surgical tools, instruments 145 (later described with reference to Fig. 1B), etc.) of the system and the system can operate without the use of the robot 114 (e.g., with the surgeon manually manipulating the imaging device 112 and/or one or more surgical tools, based on information and/or instructions generated by the navigation system 118, for example).
[0100] The navigation system 118 may provide navigation for a surgeon and/or a surgical robot during an operation. The navigation system 118 may be any now-known or future-developed navigation system, including, for example, the Medtronic Stealth Station™ S8 surgical navigation system or any successor thereof. The navigation
system 118 may include one or more cameras or other sensor(s) for tracking one or more reference markers, navigated trackers (e.g., tracking devices 140, etc.) or other objects within the operating room or other room in which some or all of the system 100 is located. The one or more cameras may be optical cameras, infrared cameras, or other cameras. In some implementations, the navigation system 118 may include one or more tracking devices 140 (e.g., electromagnetic sensors, acoustic sensors, etc.).
[0101] In some aspects of the present disclosure, the navigation system 118 may include one or more of an optical tracking system, an acoustic tracking system, an electromagnetic tracking system, a radar tracking system, an inertial measurement unit (IMU) based tracking system, and a computer vision-based tracking system. The navigation system 118 may include a corresponding transmission device 136 capable of transmitting signals associated with the tracking type. In some aspects of the present disclosure, the navigation system 118 may be capable of computer vision-based tracking of objects present in images captured by the imaging device(s) 112.
[0102] In various implementations, the navigation system 118 may be used to track a position and orientation (e.g., a pose) of the imaging device 112, the robot 114 and/or robotic arm 116, and/or one or more surgical tools (e.g., instrument 145, etc.) (or, more particularly, to track a pose of a navigated tracker attached, directly or indirectly, in fixed relation to the one or more of the foregoing). In some examples, the instrument 145 may be an electromagnetic pointer (or stylus). The navigation system 118 may include a display for displaying one or more images from an external source (e.g., the computing device 102, imaging device 112, or other source) or for displaying an image and/or video stream from the one or more cameras or other sensors of the navigation system 118.
[0103] In some implementations, the system 100 can operate without the use of the navigation system 118. The navigation system 118 may be configured to provide guidance to a surgeon or other user of the system 100 or a component thereof, to the robot 114, or to any other element of the system 100 regarding, for example, a pose of one or more anatomical elements, whether or not a tool is in the proper trajectory, and/or how to move a tool into the proper trajectory to carry out a surgical task according to a preoperative or other surgical plan.
[0104] The processor 104 may utilize data stored in memory 106 as a neural network. The neural network may include a machine learning architecture. In some aspects of the present disclosure, the neural network may be or include one or more classifiers. In some other aspects of the present disclosure, the neural network may be or include any machine
learning network such as, for example, a deep learning network, a convolutional neural network, a reconstructive neural network, a generative adversarial neural network, or any other neural network capable of accomplishing functions of the computing device 102 described herein. Some elements stored in memory 106 may be described as or referred to as instructions or instruction sets, and some functions of the computing device 102 may be implemented using machine learning techniques.
[0105] For example, the processor 104 may support machine learning model(s) 138 which may be trained and/or updated based on data (e.g., training data 144) provided or accessed by any of the computing device 102, the imaging device 112, the robot 114, the navigation system 118, the database 130, and/or the cloud network 134. The machine learning model(s) 138 may be built and updated based on the training data 144 (also referred to herein as training data and feedback).
[0106] The neural network and machine learning model(s) 138 may support AI/machine learning computer vision algorithms and object detection in association with automatically detecting, identifying, and tracking target objects (e.g., instruments, tools, etc.) in one or more images 153 or a multimedia file 154.
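As a purely illustrative sketch (not part of the disclosed system), the use of a trained detector on a single ultrasound frame could be wrapped as shown below; the detector callable, its output format, and the confidence threshold are assumptions introduced only for this example.

```python
import numpy as np

def detect_targets(frame, detector, score_threshold=0.5):
    """Run a hypothetical pre-trained detector on one ultrasound frame.

    `detector` is assumed to return a list of (label, score, (x, y, w, h))
    tuples; only detections at or above the confidence threshold are kept."""
    detections = detector(np.asarray(frame))
    return [d for d in detections if d[1] >= score_threshold]
```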
[0107] The database 130 may store information that correlates one coordinate system to another (e.g., one or more robotic coordinate systems, an ultrasound space coordinate system, a patient coordinate system, and/or a navigation coordinate system, etc.). The database 130 may additionally or alternatively store, for example, one or more surgical plans (including, for example, pose information about a target and/or image information about a patient’s anatomy at and/or proximate the surgical site, for use by the robot 114, the ultrasound space coordinate system, the navigation system 118, and/or a user of the computing device 102 or of the system 100); one or more images 153 useful in connection with a surgery to be completed by or with the assistance of one or more other components of the system 100; and/or any other useful information.
[0108] The database 130 may be configured to provide any such information to the computing device 102 or to any other device of the system 100 or external to the system 100, whether directly or via the cloud network 134. In some implementations, the database 130 may include information associated with a calibration phantom 149 associated with a calibration procedure. In some implementations, the database 130 may be or include part of a hospital image storage system, such as a picture archiving and communication system (PACS), a health information system (HIS), and/or another system for collecting, storing, managing, and/or transmitting electronic medical records including image data.
[0109] In some aspects of the present disclosure, the computing device 102 may communicate with a server(s) and/or a database (e.g., database 130) directly or indirectly over a communications network (e.g., the cloud network 134). The communications network may include any type of known communication medium or collection of communication media and may use any type of protocols to transport data between endpoints. The communications network may include wired communications technologies, wireless communications technologies, or any combination thereof.
[0110] Wired communications technologies may include, for example, Ethernet-based wired local area network (LAN) connections using physical transmission mediums (e.g., coaxial cable, copper cable/wire, fiber-optic cable, etc.). Wireless communications technologies may include, for example, cellular or cellular data connections and protocols (e.g., digital cellular, personal communications service (PCS), cellular digital packet data (CDPD), general packet radio service (GPRS), enhanced data rates for global system for mobile communications (GSM) evolution (EDGE), code division multiple access (CDMA), single-carrier radio transmission technology (1xRTT), evolution-data optimized (EVDO), high speed packet access (HSPA), universal mobile telecommunications service (UMTS), 3G, long term evolution (LTE), 4G, and/or 5G, etc.), Bluetooth®, Bluetooth® low energy, Wi-Fi, radio, satellite, infrared connections, and/or ZigBee® communication protocols.
[0111] The Internet is an example of the communications network that constitutes an Internet Protocol (IP) network consisting of multiple computers, computing networks, and other communication devices located in multiple locations, and components in the communications network (e.g., computers, computing networks, communication devices) may be connected through one or more telephone systems and other means. Other examples of the communications network may include, without limitation, a standard Plain Old Telephone System (POTS), an Integrated Services Digital Network (ISDN), the Public Switched Telephone Network (PSTN), a Local Area Network (LAN), a Wide Area Network (WAN), a wireless LAN (WLAN), a Session Initiation Protocol (SIP) network, a Voice over Internet Protocol (VoIP) network, a cellular network, and any other type of packet-switched or circuit-switched network known in the art. In some cases, the communications network may include any combination of networks or network types. In some aspects of the present disclosure, the communications network may include any combination of communication mediums such as coaxial cable, copper cable/wire, fiber-optic cable, or antennas for communicating data (e.g., transmitting/receiving data).
[0112] The computing device 102 may be connected to the cloud network 134 via the communication interface 108, using a wired connection, a wireless connection, or both. In some implementations, the computing device 102 may communicate with the database 130 and/or an external device (e.g., a computing device) via the cloud network 134.
[0113] The system 100 or similar systems may be used, for example, to carry out one or more aspects of the process flow 500 described herein. The system 100 or similar systems may also be used for other purposes.
[0114] Fig. 1B illustrates an example of the system 100 that supports aspects of the present disclosure. Aspects of the example may be implemented by the computing device 102, imaging device(s) 112, robot 114 (e.g., a robotic system), and navigation system 118. [0115] In some aspects of the present disclosure, the navigation system 118 may provide navigation information based on an electromagnetic field generated by a transmission device 136. The navigation information may include tracking information 167 (also referred to herein as tracking data) as described herein. For example, the transmission device 136 may include an array of transmission coils capable of generating or forming the electromagnetic field in response to respective currents driven through the transmission coils. The navigation system 118 may include tracking devices 140 capable of sensing the electromagnetic field. Aspects of the navigation system 118 described herein may be implemented by navigation processing 129.
[0116] The system 100 may support tracking objects (e.g., an instrument 145, imaging device 112, a calibration phantom 149, etc.) in a trackable volume 150 using an electromagnetic field produced by the transmission device 136. For example, the transmission device 136 may include a transmitter antenna or transmitting coil array capable of producing the electromagnetic field. The system 100 may track the pose (e.g., position, coordinates, orientation, etc.) of the objects in the tracking volume 150 relative to a subject 141. In some aspects of the present disclosure, the system 100 may display, via a user interface of the computing device 102, icons corresponding to any tracked objects. For example, the system 100 may superimpose such icons on and/or adjacent an image displayed on the user interface. The terms “tracking volume,” “trackable volume,” “navigation volume,” and “volume” may be used interchangeably herein.
[0117] In some aspects of the present disclosure, the transmission device 136 may be an electromagnetic localizer that is operable to generate electromagnetic fields. The transmission device 136 may drive current through the transmission coils, thereby powering the coils to generate or form the electromagnetic field. As the current is driven
through the coils, the electromagnetic field will extend away from the transmission coils and form a navigation domain (e.g., volume 150). The volume 150 may include any portion (e.g., the spine, one or more vertebrae, the brain, an anatomical element, or a portion thereof, etc.) of the subject 141 and/or any portion of a calibration phantom 149-a. The transmission coils may be powered through a controller device and/or power supply provided by the system 100.
[0118] The tracking devices 140 may include or be provided as sensors (also referred to herein as tracking sensors). The sensors may sense a selected portion or component of the electromagnetic field(s) generated by the transmission device 136. The navigation system 118 may support registration (e.g., through registration 128) of the volume 150 to a virtual space 155. The navigation system 118 may support superimposing an icon representing a tracked object (e.g., an instrument 145, a tracking device 140-b, tracking device 140-c, a tracking device 146, etc.) on the image. The system 100 may support the delivery of tracking information associated with the tracking devices 140 and/or tracking device 146 to the navigation system 118. The tracking information may include, for example, data associated with magnetic fields sensed by the tracking devices 140.
[0119] The tracking devices 140 may communicate sensor information to the navigation system 118 for determining a position of the tracked portions relative to each other and/or for localizing an object (e.g., instrument 145, tracking device 146, etc.) relative to an image 153. The navigation system 118 and/or transmission device 136 may include a controller that supports operating and powering the generation of electromagnetic fields. [0120] In the example of Fig. 1B, the system 100 may generate a navigation space 119 based on one or more tracking signals transmitted by the transmission device 136. The navigation space 119 may correspond to environment 142 or a portion thereof. For example, the navigation space 119 may correspond to a subject 141 (e.g., a patient) included in the environment 142 or an anatomical element (e.g., an organ, bone, tissue, etc.) of the subject 141. The environment 142 may be, for example, an operating room, an exam room, or the like. The tracking signals are not limited to electromagnetic tracking signals, and it is to be understood that the example aspects described with reference to Fig. 1B may be implemented using other types of tracking signals (e.g., optical tracking signals, acoustic tracking signals, etc.).
[0121] The system 100 may generate a virtual space 155 based on (e.g., in response to) signals transmitted by imaging device 112. The virtual space 155 may correspond to a field of view 159 of the imaging device 112. In an example, the system 100 may generate
images 153 in response to signals transmitted by imaging device 112, and the images 153 may correspond to the field of view 159 of the imaging device 112. In the example of Fig. 1B, the imaging device 112 is an ultrasound probe transmitting ultrasound signals, and the images 153 may be ultrasound images.
[0122] The images 153 may be static images or video images. In some aspects of the present disclosure, the images 153 may be stored as a multimedia file 154 that includes video (or video and sound). The imaging device 112 and the example signals transmitted by the imaging device 112 are not limited thereto, and it is to be understood that the example aspects described with reference to Fig. 1B may be implemented using other types of imaging devices 112 (e.g., X-ray, CT scanner, OCT scanner, etc.) and imaging systems described herein. The system 100 may support acquiring image data to generate or produce images (e.g., images 153, multimedia file 154, etc.) of the subject 141.
[0123] In an example embodiment of the present disclosure, using the imaging device 112 and the transmission device 136, the system 100 may detect or track the calibration phantom 149-a and other objects (e.g., tracking devices 140, instruments 145, tracking device 146, etc.) included in the volume 150 and the virtual space 155. For example, at least a portion of the calibration phantom 149-a may be located in the volume 150 (as generated by the navigation system 118) and the virtual space 155 (as generated by the computing device 102). In the example of Fig. 1B, the calibration phantom 149-a is a tissue phantom, but is not limited thereto.
[0124] The system 100 may register and calibrate the imaging device 112 with respect to the navigation system 118. In the example of Fig. 1B, for an image 153 in which the calibration phantom 149-a is detected, the system 100 may identify a set of coordinates 157 in the virtual space 155 in response to an event in which the system 100 detects at least a portion of the instrument 145 in the image 153. In an example, the system 100 may detect that the portion of the instrument 145 is located in the calibration phantom 149-a and intersects a surface of the virtual space 155 at the set of coordinates 157. For example, the system 100 may detect that the portion of the instrument 145 intersects the surface at an angle perpendicular to the surface. In some examples, the portion of the instrument 145 may be a tracking device 146 (e.g., an electromagnetic antenna of the instrument 145). In another example, the system 100 may detect when the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair, as discussed in greater detail below.
[0125] In some example implementations, the virtual space 155 may be a 2D virtual space generated based on 2D images (e.g., ultrasound images, CT images, etc.) captured by the imaging device 112, and the surface may be a plane of the virtual space 155. In
some other example implementations, the virtual space 155 may be a 3D virtual space (e.g., a volume), and the surface of the virtual space 155 may be a planar surface or non-planar surface.
[0126] According to example aspects of the present disclosure, the system 100 may calibrate the virtual space 155 to the navigation space 119 in response to the event in which the system 100 detects the instrument 145 (or at least a portion of the instrument 145) in the image 153 or in response to the event in which the system 100 detects the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair of the calibration phantom 149-a. For example, in response to either one of the events, the system 100 may calibrate a coordinate system 160 associated with the virtual space 155 with respect to a coordinate system 165 associated with the navigation space 119. In an example, the system 100 may calibrate the coordinate system 160 with respect to the coordinate system 165 based on the set of coordinates 157 and temporal information 156 associated with the event in which the instrument 145 (or portion of the instrument 145) intersects a surface (e.g., a plane, a volume, etc.) of the virtual space 155 at the set of coordinates 157. Alternatively, in an example, the system 100 may calibrate the coordinate system 160 with respect to the coordinate system 165 based on the set of coordinates 157 and temporal information 156 associated with the event in which the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair of the calibration phantom 149-a. Further, for example, the system 100 may calibrate the coordinate system 160 with respect to the coordinate system 165 based on tracking information associated with the tracking device 146, the tracking device 140-b, the tracking device 140-c, and the tracking device 140-a.
[0127] For example, the instrument 145 and/or the calibration phantom 149-a may be registered to the navigation system 118 such that the navigation system 118 may track and determine pose information 161 of the instrument 145 and/or the calibration phantom 149-a based on tracking information 167 associated with the tracking device 146, the tracking device 140-a, tracking device 140-b and/or the tracking device 140-c and temporal information 166 corresponding to the tracking information 167. Further, for example, the navigation system 118 may track and determine pose information 161 of the imaging device 112 based on tracking information 167 associated with the tracking device 140-a and temporal information 166 corresponding to the tracking information 167. Accordingly, for example, the system 100 may calibrate the coordinate system 160 with respect to the coordinate system 165 based on the tracking information 167 (e.g.,
associated with the tracking device 146, the tracking device 140-b, the tracking device 140-c and the tracking device 140-a), the temporal information 166 associated with the tracking information 167, the temporal information 156 associated with the event, and the coordinates 157 associated with the event.
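The calibration described in this paragraph pairs coordinates observed in the virtual space 155 with positions reported in the navigation space 119 at matching times. A minimal sketch of one way to do this is shown below, assuming that each event contributes a (timestamp, 3D point) observation in both coordinate systems; the function names, the time-matching tolerance, and the use of an SVD-based (Kabsch) rigid fit are illustrative assumptions, not the claimed method.

```python
import numpy as np

def pair_points_by_time(image_events, nav_samples, max_dt=0.02):
    """Match each (timestamp, xyz) image event to the nearest-in-time
    navigation sample; drop pairs whose time gap exceeds max_dt seconds."""
    nav_times = np.array([t for t, _ in nav_samples])
    pairs = []
    for t_img, p_img in image_events:
        i = int(np.argmin(np.abs(nav_times - t_img)))
        if abs(nav_times[i] - t_img) <= max_dt:
            pairs.append((np.asarray(p_img, float), np.asarray(nav_samples[i][1], float)))
    return pairs

def rigid_registration(src, dst):
    """Least-squares rigid transform (rotation R, translation t) mapping
    src points onto dst points via the Kabsch/SVD method."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c_dst - R @ c_src
    return R, t
```

In use, the paired image-space points would form `src` and the corresponding navigation-space points would form `dst`, so that `R @ p + t` maps a point from coordinate system 160 into coordinate system 165.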
[0128] The system 100 may detect, in an image 153 (or multimedia file 154), one or more landmarks corresponding to the instrument 145, a portion of the instrument 145, or the tracking device 146. The landmarks may correspond to distinctive features (e.g., a tip, a shape of the tip, etc.) of the instrument 145 or the tracking device 146. The system 100 may calibrate the coordinate system 160 with respect to the coordinate system 165 based on the one or more landmarks. According to a further embodiment of the present disclosure, the system 100 may detect in an image 153 (or multimedia file 154), one or more landmarks corresponding to the phantom 149-a and/or the protrusion pair provided within the phantom 149-a. The landmarks may correspond to distinctive features (e.g., a tip, a shape of the tip, etc.) of the phantom 149-a and/or the protrusion pair provided within the phantom 149-a. The system 100 may calibrate the coordinate system 160 with respect to the coordinate system 165 based on the one or more landmarks.
[0129] Aspects of the present disclosure support calibrating the virtual space 155 to the navigation space 119 (e.g., calibrating the coordinate system 160 associated with the virtual space 155 to the coordinate system 165 associated with the navigation space 119) in response to one or more criteria. For example, the system 100 may calibrate the virtual space 155 to the navigation space 119 in response to each occurrence of the event. In another example, the system 100 may calibrate the virtual space 155 to the navigation space 119 in response to each nth occurrence of the event (e.g., each third occurrence of the event, each fifth occurrence, etc.). In some other examples, the system 100 may calibrate the virtual space 155 to the navigation space 119 in response to each nth occurrence of the event within a temporal duration (e.g., each third occurrence of the event, in which the third occurrence is X seconds or less after a first occurrence of the event (where X is an integer value)). Accordingly, for example, aspects of the present disclosure support automatic registration during a surgical procedure, in which the registration is continuous or semi-continuous.
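The "each nth occurrence within a temporal duration" criterion above can be expressed, for illustration only, as a small gating helper; the class name, window length, and clock source are assumptions made for this sketch.

```python
import time

class CalibrationTrigger:
    """Fires when n qualifying events have been observed within a sliding
    time window of max_span seconds (e.g., every third event within 10 s)."""
    def __init__(self, n=3, max_span=10.0):
        self.n = n
        self.max_span = max_span
        self.event_times = []

    def on_event(self, t=None):
        t = time.monotonic() if t is None else t
        self.event_times.append(t)
        # Keep only events inside the sliding window.
        self.event_times = [e for e in self.event_times if t - e <= self.max_span]
        if len(self.event_times) >= self.n:
            self.event_times.clear()   # reset after triggering a calibration
            return True                # caller should (re)calibrate now
        return False
```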
[0130] The system 100 may support calibrating the coordinate system 160 with respect to the coordinate system 165 without pausing a medical procedure (e.g., surgical procedure). In the example in which the calibration phantom 149-a is a tissue phantom inside the body of the subject 141, the system 100 may calibrate the coordinate system 160
with reference to the coordinate system 165 (e.g., recalibrate the registration between the virtual space 155 and the navigation space 119) in the background while medical personnel performs a medical procedure on the subject 141, without interrupting the medical procedure. That is, for example, the system 100 may calibrate the coordinate system 160 with reference to the coordinate system 165 during the medical procedure, without prompting the medical personnel to pause the medical procedure, such that the medical personnel may proceed with the medical procedure without waiting for calibration to be completed. In some aspects of the present disclosure, the system 100 may calibrate the coordinate system 160 with reference to the coordinate system 165 without prompting the medical personnel to participate in a separate calibration operation.
[0131] In some aspects of the present disclosure, the system 100 and techniques described herein may support calibrating the coordinate system 160 with respect to the coordinate system 165 based on any of: properties (e.g., beam thickness, beam shape, signal frequency, etc.) of signals transmitted by the imaging device 112, pose information of the instrument 145, the phantom 149-a (or pose information of the tracking device 146 or the tracking device 140-c) in association with an intersection between the instrument 145 (or tracking device 146) and the surface of the virtual space 155, or the intersection of the field of view 159 of the imaging device and the protrusion pair and properties (e.g., shape, etc.) of the tracking device 146, the protrusion pair, etc., example aspects of which are later described with reference to Figs. 3A-3G.
[0132] Aspects of calibrating the virtual space 155 to the navigation space 119 (e.g., calibrating the coordinate system 160 associated with the virtual space 155 to the coordinate system 165 associated with the navigation space 119) include verifying a registration accuracy between the coordinate system 160 and the coordinate system 165. For example, in response to an occurrence of the event as described herein, the system 100 may calculate a registration accuracy between the coordinate system 160 and the coordinate system 165 and compare the registration accuracy to a target accuracy value. In an example, in response to a comparison result in which the registration accuracy is less than the target accuracy value, the system 100 may perform one or more operations described herein in association with recalibrating the virtual space 155 to the navigation space 119. For example, the system 100 may autonomously recalibrate the coordinate system 160 to the coordinate system 165. In another example, the system 100 may generate and output a notification including user guidance information 175 (e.g., tutorials, user prompts, corrective actions, real-time actionable feedback, etc.) regarding how to move
or position a device (e.g., the imaging device 112, the instrument 145, etc.) in association with the calibration process. In some examples, the notification may include a visual notification, an audible notification, a haptic notification, or a combination thereof.
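As one hedged illustration of the accuracy check described above, a residual error can be computed from point correspondences and compared against a target value to decide between continuing, notifying the user, and recalibrating; the threshold values, return labels, and units below are assumptions for the sketch only.

```python
import numpy as np

def registration_error_mm(R, t, src, dst):
    """Root-mean-square distance (in the units of the inputs, e.g., mm)
    between transformed source points and their navigation-space targets."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    residual = (src @ R.T + t) - dst
    return float(np.sqrt((residual ** 2).sum(axis=1).mean()))

def check_accuracy(error_mm, target_mm=2.0):
    """Compare the measured registration error against a target accuracy
    and suggest a follow-up action (values are illustrative)."""
    if error_mm <= target_mm:
        return "ok"
    elif error_mm <= 2 * target_mm:
        return "notify_user"              # e.g., show user guidance information 175
    return "autonomous_recalibration"
```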
[0133] Other additional and/or alternative aspects of calibrating the virtual space 155 to the navigation space 119 include automatically detecting distortion in the navigated volume 150 due to discrepancies between navigation data (e.g., tracking information 167 provided by navigation system 118) and imaging data (e.g., images 153, multimedia file 154, etc.). In an example, the discrepancies may be between pose information 161 of a tracked object (e.g., tracking device 140-b, tracking device 140-c, instrument 145, tracking device 146, etc.) as indicated by the navigation system 118 and pose information 162 of the tracked object as determined by the computing device 102 from the imaging data. [0134] In an example implementation, in response to an occurrence of the event as described herein, the system 100 may calculate the discrepancy and compare the discrepancy to a target discrepancy threshold value. In an example, in response to a comparison result in which the discrepancy is greater than the discrepancy threshold value, the system 100 may perform one or more operations described herein (e.g., autonomous recalibration, outputting a notification including user guidance information 175, etc.) in association with recalibrating the virtual space 155 to the navigation space 119. In some aspects of the present disclosure, the system 100 may calibrate the navigation data to the imaging data (e.g., calibrate the navigation space 119 to the virtual space 155) while compensating for the discrepancies.
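A corresponding sketch for the discrepancy check in the preceding paragraphs is shown below; it assumes the current registration is available as a rotation and translation, and the threshold name is hypothetical.

```python
import numpy as np

def pose_discrepancy(p_nav, p_img, R, t):
    """Distance between a tracked object's position reported by the
    navigation system (p_nav, coordinate system 165) and the same object's
    position found in the image data (p_img, coordinate system 160) after
    mapping it into navigation space with the current registration (R, t)."""
    p_mapped = R @ np.asarray(p_img, float) + t
    return float(np.linalg.norm(np.asarray(p_nav, float) - p_mapped))

# Illustrative use: recalibrate if the discrepancy exceeds a threshold.
# if pose_discrepancy(p_nav, p_img, R, t) > DISCREPANCY_THRESHOLD_MM:
#     trigger_recalibration()
```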
[0135] The techniques described herein may provide continuous automatic registration, continuous semi-automatic registration, or a combination thereof, and the registration techniques may be implemented during a medical procedure.
[0136] Fig. 1C illustrates an example of the system 100 that supports aspects of the present disclosure. Aspects of the example in Fig. 1C include like aspects described with reference to Fig. 1B. Referring to the example of Fig. 1C, the calibration phantom 149-b may be an ultrasound transmitting volume (e.g., a water bath) implemented using an empty container (e.g., an empty box including an electromagnetically friendly material). In some aspects of the present disclosure, the calibration phantom 149-b may include ultrasound conductive material.
[0137] For example, the container may be formed of low magnetic or non-magnetic materials so as to minimize distortion to electromagnetic fields. In an example, the container may be formed of a material having a magnetic permeability of about 1.0 to
about 1.1 (relative), and the material may have a relatively low electrical conductivity (e.g., an electrical conductivity less than a threshold value). In some examples, the material may be a stainless steel alloyed with different metallic elements associated with obtaining specific properties (e.g., temperature and corrosion resistance, fracture tolerance, etc.). Non-limiting examples of the material include Nickel/Chromium alloys (e.g., Series 300 alloys, type 304 stainless steel (annealed condition only), type 316 stainless steel), Cobalt/Chromium alloys (e.g., L605, MP35N), and Titanium alloys (e.g., Ti6Al4V), as well as plastics and wood.
[0138] In the example illustrated, one or more surfaces of the container include protrusion pairs as illustrated in Figs. 2B and 2C, and the container is filled with water. In some examples, the calibration phantom 149-b may be integrated into the structure of a patient tracker. In some other aspects of the present disclosure, the calibration phantom 149-b may be included in the environment 142 as a standalone structure that is separate from an operating table associated with the subject 141. According to example aspects of the present disclosure, the system 100 may support calibrating the virtual space 155 to the navigation space 119 using the calibration phantom 149-b and the techniques as described herein, in which the calibration phantom 149-b is substituted for the calibration phantom 149-a described with reference to Fig. 1B.
[0139] In some example implementations, the system 100 may support calibrating the virtual space 155 to the navigation space 119 using both the calibration phantom 149-a and the calibration phantom 149-b. For example, the system 100 may support calibration outside the subject 141 using the calibration phantom 149-b (e.g., water bath) and further calibration (e.g., recalibration, calibration adjustment, etc.) using the calibration phantom 149-a (e.g., tissue of the subject 141), and the combination may provide an increase in accuracy compared to other calibration techniques. An example implementation of using both the calibration phantom 149-a and the calibration phantom 149-b in association with a calibration process is later described with reference to Fig. 4.
[0140] Fig. 2A illustrates an example implementation 200 of the system used to find source-target registration points according to an embodiment of the present disclosure. Referring to Fig. 2A, an imaging device 112 (e.g., an electromagnetic tracked ultrasound probe) may be inserted adjacent to or on top of the calibration phantom 149-b (e.g., a water bath). A transmission device 136 (e.g., an electromagnetic emitter) is positioned in the vicinity of the imaging device 112 and the calibration phantom 149-b. The imaging device 112 includes tracking device 140-a and the calibration phantom 149-b includes
tracking device(s) 140-c, which are used to track the calibration phantom 149-b relative to the transmission device 136. In example implementation 200, the field of view 159 (e.g., the ultrasound beam or projection) of the imaging device 112 is provided. Also, an area 205 represents the image generated/provided by the imaging device 112. A navigation space 119 corresponding to the calibration phantom 149-b may be generated by the navigation system 118 as described herein, and the system may display a virtual representation of the navigation space 119 and the virtual space 155.
[0141] As further illustrated in Fig. 2A, there are different transformations involved in finding source-target registration points. Rrtoi, the “sensor to image” registration, is the matrix that needs to be calculated. The EM-phantom point 150 represents the image captured of the midpoint between the tips of one of the protrusion pairs provided within the phantom 149-b (or the image of any other registration object/mark) where the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair. EM-phantom point 150 represents the impression in the ultrasound image of an object in the real world (e.g., a phantom feature/protrusion or navigated stylus tip, etc.) whose 3D location in the tracker’s space is well known. Ts represents the ultrasound probe’s sensor (e.g., tracking device 140-a) transformation matrix and Tp represents the phantom’s sensor (e.g., tracking device 140-c) transformation matrix (adjusted for the specific feature/protrusions’ tip represented by the EM-phantom point 150) with respect to the transmission device 136. The point 150 is acquired simultaneously in two different coordinate spaces: the transmission device 136 localization system (through the calibration phantom 149-b) and the imaging device 112 (through image coordinates and tracking device 140-a).
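Purely as an illustrative restatement (notation and direction conventions are assumptions of this sketch, not part of the disclosure), the simultaneity of the two observations of point 150 can be written in homogeneous coordinates as

\[
T_p \, \tilde{p}_{\mathrm{phantom}} \;=\; T_s \, R \, \tilde{p}_{\mathrm{image}},
\]

where \( \tilde{p}_{\mathrm{image}} \) is point 150 expressed in image coordinates of the ultrasound frame, \( \tilde{p}_{\mathrm{phantom}} \) is the same physical point expressed in the phantom sensor's coordinate frame (adjusted for the specific protrusion tip), and \( R \) denotes the unknown sensor-to-image registration matrix. Collecting several such correspondences allows \( R \) to be estimated, for example by a point-set alignment of the kind sketched earlier.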
[0142] Features of the system 200 may be described in conjunction with a coordinate system 203 (e.g., coordinate system 203-a, coordinate system 203-b, coordinate system 203-c, and coordinate system 203-d). Each coordinate system 203, as shown in Fig. 2A, includes three dimensions: an X-axis, a Y-axis, and a Z-axis. Additionally, or alternatively, coordinate system 203-a may be used to define surfaces, planes, or volumes of the calibration phantom 149-b, and the coordinate system 203-b may be used to define surfaces, planes, or volumes of the navigation space 119 of the transmission device 136. The coordinate system 203-c corresponds to the imaging device 112 and the coordinate system 203-d corresponds to EM-phantom point 150.
[0143] The planes of each coordinate system 203 (e.g., coordinate system 203-a, coordinate system 203-b, coordinate system 203-c, and coordinate system 203-d) may be disposed orthogonal, or at 90 degrees, to one another. While the origin of a coordinate system 203
may be placed at any point on or near the components of the navigation system 118, for the purposes of description, the axes of the coordinate system 203 are always disposed along the same directions from figure to figure, whether the coordinate system 203 is shown or not. In some examples, reference may be made to dimensions, angles, directions, relative positions, and/or movements associated with one or more components of the imaging device 112 and/or the navigation system 118 with respect to a coordinate system 203.
[0144] It is to be understood that the aspects described herein with reference to the plane of the ultrasound beam or projection (e.g. field of view 159) support implementations applied to any surface of a virtual space 155 (e.g., a plane of the virtual space 155 for cases in which the virtual space 155 is a 2D virtual space, a planar surface or non-planar surface of the virtual space 155 for cases in which the virtual space 155 is a 3D virtual space, etc.). It is to be understood that the aspects described herein may be applied to an electromagnetic antenna, a navigation stylus, a pointer, or any navigated tools having a geometry and location that is defined, known, and trusted by the system 100.
[0145] Referring back to Fig. 1, the system 100 may record all navigation data (e.g., electromagnetic data), imaging data (e.g., ultrasound data), and corresponding temporal information (e.g., temporal information 156, temporal information 166) in a multimedia file 154. In an example, the multimedia file 154 may be a movie file, and the system 100 may record timestamps corresponding to the navigation data and the imaging data. Based on the navigation data, the imaging data, and the temporal information, the system 100 may identify when the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair or a plurality of protrusion pairs. Based on the identification of when the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair or a plurality of protrusion pairs, the system 100 may verify the calibration of the imaging device 112 with the electromagnetic navigation of the navigation system 118.
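As a minimal sketch of the timestamp bookkeeping described above (assuming, for illustration, that navigation samples and image frames are stored with timestamps), the tracked position can be interpolated at each frame time so that every detected intersection event has a pose from the same instant; the function names and linear interpolation are assumptions of this sketch.

```python
import numpy as np

def interpolate_position(nav_times, nav_positions, t_frame):
    """Linearly interpolate a tracked 3D position at an image-frame
    timestamp from surrounding navigation samples (arrays sorted by time)."""
    nav_times = np.asarray(nav_times, float)
    nav_positions = np.asarray(nav_positions, float)
    x = np.empty(3)
    for axis in range(3):
        x[axis] = np.interp(t_frame, nav_times, nav_positions[:, axis])
    return x

def align_frames_to_tracking(frame_times, nav_times, nav_positions):
    """For every image frame, return the interpolated tracked position,
    so each detected event has a matching pose at the same instant."""
    return [interpolate_position(nav_times, nav_positions, t) for t in frame_times]
```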
[0146] Fig. 2B illustrates a view of an imaging device 112 and the calibration phantom 149-b used in the system of Fig. 2A in accordance with aspects of the present disclosure. As illustrated, the imaging device 112 has provided thereon tracking device 140-a. Although not illustrated, calibration phantom 149-b includes one or more tracking devices around its perimeter or other locations such that the calibration phantom 149-b and the one or more protrusion pairs are detected. Imaging device 112 includes a field of view 159 (e.g., ultrasound beam or projection). The calibration phantom 149-b also includes one or more protrusion pairs 247. Based on the identification of when the ultrasound beam
intersects the midpoint of each protrusion of a protrusion pair or a plurality of protrusion pairs, the system 100 may verify the calibration of the imaging device 112 with the electromagnetic navigation of the navigation system 118.
[0147] Fig. 2C illustrates a view of a calibration phantom 149-b used in the system of Fig. 2A in accordance with aspects of the present disclosure. As illustrated in Fig. 2C the calibration phantom 149-b includes at least one protrusion pair 247 (protrusion 247-a and protrusion 247-b). Protrusion 247-a is provided on a first surface of the calibration phantom 149-b and protrusion 247-b is provided on a second surface, opposite the first surface of the calibration phantom 149-b, such that the tips of the protrusions 247-a and 247-b meet.
[0148] Fig. 3A illustrates example views 300 and 301 of an ultrasound beam 305 transmitted by the imaging device 112 when viewed from different perspective views. Referring to the example view 300, the ultrasound beam 305 is relatively thick (or wide) with respect to the Y-axis. In contrast, referring to the example view 301, the ultrasound beam 305 is relatively narrow with respect to the Z-axis. In the example view 301 of the ultrasound beam 305, the thickness varies in depth, and the shape and focal point of the ultrasound beam 305 may be based on parameters (e.g., power, frequency, etc.) of the ultrasound beam 305.
[0149] According to example aspects of the present disclosure, referring to example view 300, the system 100 may calibrate the virtual space 155 to the navigation space 119 based on instances in which the ultrasound beam or projection (e.g., the cross-sectional area 310 of the ultrasound beam 305) intersects the midpoint of each protrusion (protrusion 247-a and protrusion 247-b) of the protrusion pair 247. That is, for example, the system 100 may perform calibration for instances in which the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair, while factoring in values of parameters (e.g., thickness, depth, shape, focal point, etc.) of the ultrasound beam 305.
[0150] In another example, referring to example view 301, the system 100 may calibrate the virtual space 155 to the navigation space 119 for instances in which the ultrasound beam or projection 305 intersects the midpoint of each protrusion (protrusion 247-a and protrusion 247-b) of the protrusion pair 247 in the direction along the Z axis. That is, for example, the system 100 may perform calibration for instances in which the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair, while factoring in values of parameters (e.g., thickness, depth, shape, focal point, etc.) of the ultrasound beam 305.
[0151] Fig. 3B illustrates example views 303, 304, 306 of an imaging device 112 imaging a calibration phantom 149-b having a protrusion pair 247 provided therein with accompanying example views 313, 314, 316 of images 333, 334, 336 of the calibration phantom formed by imaging in accordance with aspects of the present disclosure. As illustrated in Fig. 3B, the projection 310 of the ultrasound beam 305 intersects the protrusions 247-a, 247-b of a protrusion pair 247 at various locations. Example views 313, 314 and 316 illustrate images 333, 334 and 336 of representations 323, 324 and 326 of the tips of the protrusion pair 247. In example view 303, the projection 310 of the ultrasound beam 305 intersects the protrusion 247-a at a rear portion. In example view 304, the projection 310 of the ultrasound beam 305 passes in between protrusions 247-a and 247-b. In example view 306, the projection 310 of the ultrasound beam 305 intersects the protrusion 247-a at a front portion. In example view 314, image 334 shows representation 324 of the tips of the protrusion pair 247 having the smallest diameter as compared with representation 323 and representation 326 of example image 333 and example image 336, respectively. A representation with the smallest diameter or narrowest shadow is a good indication that the tips of the protrusion pair 247 are in the center of the projection of the ultrasound beam 305.
[0152] Example images 333 and 336 illustrate representations 323 and 326, respectively, of the tips of the protrusion pair 247, each with a larger shadow or larger diameter than representation 324. According to an embodiment of the present disclosure, the location for the protrusion pair 247 as described with example view 304 and the position of the imaging device 112, the ultrasound beam 305 and/or the parameters (e.g., thickness, depth, shape, focal point, etc.) of the ultrasound beam 305 are used to verify the calibration of the imaging device 112 with the electromagnetic navigation of the navigation system 118. According to at least one embodiment of the present disclosure, at least three sets of saved coordinates for the tips of the protrusion pair 247 are required to verify the calibration of the imaging device 112 with the electromagnetic navigation of the navigation system 118.
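The "smallest diameter / narrowest shadow" criterion can be illustrated with a rough sketch that thresholds each frame, measures the horizontal extent of the bright tip echo, and then picks the frame with the narrowest extent; the threshold value and the width measure are assumptions of this sketch and stand in for whatever detection approach the system actually uses.

```python
import numpy as np

def tip_width_px(frame, threshold=0.6):
    """Width (in pixels) of the above-threshold columns in a normalized
    2D ultrasound frame; a crude proxy for the diameter of the tip echo."""
    f = frame.astype(float)
    f = (f - f.min()) / (f.max() - f.min() + 1e-9)
    mask = f >= threshold
    cols = np.where(mask.any(axis=0))[0]
    return 0 if cols.size == 0 else int(cols.max() - cols.min() + 1)

def best_frame_index(frames):
    """Index of the frame whose tip representation is narrowest, i.e., the
    frame most likely acquired with the tip at the beam center."""
    widths = [tip_width_px(f) for f in frames]
    # Ignore frames with no detection (width 0).
    candidates = [(w, i) for i, w in enumerate(widths) if w > 0]
    return min(candidates)[1] if candidates else None
```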
[0153] Fig. 3C illustrates an example of imaging device 112 emitting an ultrasound beam 305 having a projection 310 towards the tip of a single protrusion 247 in accordance with aspects of the present disclosure, as well as an alternative placement where the protrusion 247-c is parallel to the main ultrasound beam plane, and Fig. 3D illustrates a view of imaging device 112 emitting an ultrasound beam 305 having a projection 310 in accordance with aspects of the present disclosure.
[0154] As illustrated, the projection 310 of the ultrasound beam 305 is not exactly a plane, but has a thickness which varies based on the distance from the emitter of the imaging device 112 towards the tip of the protrusion 247-a. Thus, this thickness can introduce errors, as illustrated, in association with calibrating the virtual space 155 to the navigation space 119. For example, as illustrated in Fig. 3C, even though protrusion 247-x may be represented as being in the center of the ultrasonic beam projection 310, when viewed in the XY plane, protrusion 247-x may actually be offset with respect to the center of the ultrasonic beam projection 310, as indicated.
[0155] Fig. 3E illustrates example elements 320 (e.g., element 320-a, element 320-b) that may be implemented at protrusions 247-a and 247-b (e.g., at a tip 325 of the protrusion) in association with calibrating the virtual space 155 to the navigation space 119. In an example implementation 370, element 320-a may be a gel (or another material with acoustic properties different from those of the medium, e.g., water, and from those of the protrusions) that is cylinder-shaped. At position “c” only the end portion of the element 320-a is considered and an unobstructed view of the end portion of element 320-a is represented in image “c”. At position “b” the tip of the protrusion 247-a and the end portion of the element 320-a are considered and a view of the end portion of the element 320-a with the tip 325 of the protrusion 247-a provided in the center is represented in image “b”. At position “a” the shaft of the protrusion 247-a and the end portion of the element 320-a are considered and a view of the end portion of the element 320-a with the shaft of the protrusion 247-a provided in the center is represented in image “a”.
According to embodiments of the present disclosure, position “b” which produces image “b” having a representation with the smallest diameter or narrowest shadow of the tip of the protrusion 247-a is a good indication that the tip of the protrusion 247-a is in the center of the projection 310 of the ultrasound beam 305.
[0156] In another example implementation 380, element 320-b may be a gel that is sphere shaped. At position “c” only the end portion of the element 320-b is considered and an unobstructed view of the end portion of element 320-b is represented in image “c”. At position “b” the tip of the protrusion 247-b and the end portion of the element 320-b are considered and a view of the end portion of the element 320-b with the tip 325 of the protrusion 247-b provided in the center is represented in image “b”. At position “a” the shaft of the protrusion 247-b and the end portion of the element 320-b are considered and a view of the end portion of the element 320-b with the shaft of the protrusion 247-b provided in the center is represented in image “a”. According to embodiments of the
present disclosure, position “b” which produces image “b” having a representation with the smallest diameter or narrowest shadow of the tip of the protrusion 247-b is a good indication that the tip of the protrusion 247-b is in the center of the projection 310 of the ultrasound beam 305.
According to an embodiment of the present disclosure, elements 320 function more efficiently when the protrusions 247-a and 247-b are provided perpendicular to the ultrasound beam 305.
[0157] Fig. 3F illustrates example shapes 330 (e.g., shapes 330-a through 330-c) that may be implemented at a tip 325 of the protrusion 247, or the like. Aspects of the present disclosure may include implementing any of the shapes 330, and the shapes 330 may support tip-image center alignment. That is, for example, each shape 330 may be symmetrical with respect to a center of the shape 330, which may support alignment of the center of the shape 330 and a center 335 of an ultrasound beam 305 emitted by the imaging device 112, an example of which is illustrated at Fig. 3G. According to one embodiment of the present disclosure, shapes 330 function more efficiently when the protrusions 247 are provided parallel to the ultrasound beam 305.
[0158] Fig. 3H illustrates example views 371, 372, 373 of a protrusion 247-a within a protrusion stand 390 in accordance with aspects of the present disclosure. Example view 371 is a side view of the protrusion 247-a provided within the protrusion stand 390. Example view 372 is a front view of the protrusion 247-a provided within the protrusion stand 390. Example view 373 is a top view of the protrusion 247-a provided within the protrusion stand 390. As illustrated in example view 372, the protrusion stand 390 has an hour-glass shape 391 at its center portion.
[0159] Fig. 3I illustrates example views 360, 361, 362 of an imaging device 112 imaging a protrusion 247-a with accompanying images 363, 364, 365 formed by imaging in accordance with aspects of the present disclosure. Referring to the example view 360, imaging device 112 transmits an ultrasound beam 305-a towards protrusion 247-a. Imaging device 112 is provided at a first location in which the ultrasound beam 305-a only covers a bottom portion of the protrusion stand 390. Imaging the protrusion stand 390 and the protrusion 247-a produces image 363 with an ultrasound beam projection 317-a including only a representation 365-a of the imaged bottom portion of the protrusion stand 390.
[0160] Referring to the example view 361, imaging device 112 transmits an ultrasound beam 305-b towards protrusion 247-a. Imaging device 112 is provided at a second location in which the ultrasound beam 305-b covers a tip 325 of the protrusion 247-a and part of the hour-glass shape 391 of the protrusion stand 390. Imaging the protrusion stand 390 and the protrusion 247-a produces image 364 with an ultrasound beam projection 317-b including a representation 365-b of the imaged protrusion stand 390 and a representation 377-b of the imaged tip of the protrusion 247-a.
[0161] Referring to the example view 362, imaging device 112 transmits an ultrasound beam 305-c towards protrusion 247-a. Imaging device 112 is provided at a third location in which the ultrasound beam 305-c covers a shaft and a tip 325 of the protrusion 247-a and another part of the hour-glass shape 391 of the protrusion stand 390. Imaging the protrusion stand 390 and the protrusion 247-a produces image 365 with an ultrasound beam projection 317-c including a representation 365-c of the imaged protrusion stand 390 and a representation 377-c of the imaged shaft and tip of the protrusion 247-a.
[0162] According to embodiments of the present disclosure, example view 362 provides the best position of the imaging device 112 since the tip 325 of the protrusion 247-a is provided in the center 335 of the ultrasound beam 305-c. This is shown in image 365 with the representation 365-c of the imaged protrusion stand 390 having the smallest or the narrowest shadow as compared to the other representations (365-a and 365-b) of the imaged stand 390. Moreover, it is the hour-glass shape 391 that causes representation 365-c to be the smallest at the correct tip/beam-center position.
[0163] Fig. 4 illustrates an example of a method 400 in accordance with aspects of the present disclosure. In some examples, method 400 may implement aspects of a computing device 102, an imaging device 112, a robot 114, and a navigation system 118 described with reference to Figs. 1 through 3I.
[0164] In the following description of method 400, the operations may be performed in a different order than the order shown, or at different times. Certain operations may also be left out of method 400, or other operations may be added to the method 400.
[0165] It is to be understood that any of the operations of method 400 may be performed by any device (e.g., a computing device 102, an imaging device 112, a robot 114, navigation system 118, etc.) of the system 100 described herein. Generally, method 400 starts with a START operation at step 404 and ends with an END operation at step 456. Method 400 can be executed as a set of computer-executable instructions executed by a computer system (e.g., computing device 102, etc.) and encoded or stored on a computer readable medium. Hereinafter, method 400 shall be explained with reference to the
systems, components, modules, applications, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-3I.
[0166] Method 400 begins with the START operation at step 404 and proceeds to step 408, where the system 100 generates a navigation space 119 as described herein. After generating a navigation space 119 at step 408, method 400 proceeds to step 412, where the system 100 generates a virtual space 155 based on images captured by the imaging device 112 as described herein. For example, the system 100 may generate the virtual space 155 based on images representing the inside of calibration phantom 149-b (e.g., water bath). After generating a virtual space 155 based on images captured by the imaging device 112 at step 412, method 400 proceeds to step 416, where the system 100 initiates a calibration process 470 in accordance with aspects of the present disclosure. For example, at step 416, the system 100 may initiate calibration of the coordinate system 160 associated with the virtual space 155 (and imaging device 112) with respect to the coordinate system 165 associated with the navigation space 119 (and navigation system 118). After initiating a calibration process 470 at step 416, method 400 proceeds to step 420, where the system 100 provides user guidance information 175 (e.g., tutorials, user prompts, corrective actions, real-time actionable feedback, etc.) described herein regarding how to move or position a device (e.g., the imaging device 112, etc.) in association with the calibration process. The terms “guidance information” and “calibration guidance information” may be used interchangeably herein. Aspects of the present disclosure support implementations with or without providing user guidance information 175. After providing user guidance information 175 at step 420, method 400 proceeds to decision step 424, where the system 100 determines if an event has been detected.
[0167] The system 100 may determine, from an image 153 (or multimedia file 154), whether an event has occurred in which the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair 247. If no event has been detected (NO at decision step 424), the system 100 analyzes subsequent images 153 (or multimedia files 154) until the system 100 detects an event in which the field of view 159 of the imaging device 112 intersects a mid-point of a protrusion pair 247. In some aspects of the present disclosure, the system 100 may return to step 420 and provide additional user guidance information 175 that prompts a user to position or orient the imaging device 112 to trigger such an event. If an event has been detected (YES at decision step 424), method 400 proceeds to step 428, where the system 100 identifies, from the image 153 (or multimedia file 154), the set of coordinates 157 at which the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair 247. In some aspects of the present disclosure, at step 428, the system 100 may identify the temporal information 156 associated with when the protrusion pair 247 intersected the ultrasound beam of the imaging device. In some cases, the system 100 may identify pose information 162 (in the virtual space 155) of the protrusion pair 247 that corresponds to the temporal information 156.
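For illustration, converting the detected pixel location of the protrusion-pair midpoint into in-plane coordinates of the coordinate system 160 could look like the following, assuming known per-axis pixel spacings for the ultrasound frame; the function name, spacing parameters, and the placement of the out-of-plane coordinate at zero are assumptions of this sketch.

```python
def pixel_to_image_mm(row, col, row_spacing_mm, col_spacing_mm, origin_mm=(0.0, 0.0)):
    """Map a detected pixel (row, col) in the ultrasound frame to in-plane
    millimetre coordinates; the out-of-plane coordinate is taken as 0 because
    the event implies the point lies in the beam's mid-plane."""
    x_mm = origin_mm[0] + col * col_spacing_mm
    y_mm = origin_mm[1] + row * row_spacing_mm
    return (x_mm, y_mm, 0.0)
```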
[0168] After identifying, from the image 153 (or multimedia file 154), the set of coordinates 157 at which the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair 247 at step 428, method 400 proceeds to step 432, where the system 100 calibrates the coordinate system 160 with respect to the coordinate system 165 based on the set of coordinates 157 and the temporal information 156 as described herein. In an example, the system 100 may calibrate the coordinate system 160 with respect to the coordinate system 165 based on the set of coordinates 157, the temporal information 156, the pose information 161 (of the instrument 145 in the navigation space 119) corresponding to the temporal information 156, and the pose information 162 (of the instrument 145 in the virtual space 155) corresponding to the temporal information 156. After calibrating the coordinate system 160 with respect to the coordinate system 165 based on the set of coordinates 157 and the temporal information 156 at step 432, method 400 proceeds to decision step 436, where it is determined whether to repeat the calibration process 470. For example, at step 436, the system 100 may determine whether a user input requesting recalibration has been received. If no request for recalibration is received (NO at decision step 436), method 400 proceeds to decision step 440, where the system 100 determines whether a temporal duration (e.g., recalibration every X hours, every day, etc.) associated with performing recalibration has elapsed. If the temporal duration has not elapsed (NO at decision step 440), method 400 proceeds to decision step 444, where the system 100 determines whether any loss in calibration between the imaging device 112 and the navigation system 118 (e.g., the navigation system 118 is unable to track the imaging device 112) has occurred. If no loss in calibration between the imaging device 112 and the navigation system 118 is detected (NO at decision step 444), method 400 proceeds to decision step 448, where the system 100 monitors for an event.
[0169] According to example aspects of the present disclosure, based on decisions by the system 100 at any of decision steps 436 through 444, the system 100 may repeat the calibration process 470, beginning at any operation (e.g., generating the virtual space 155 at step 412, initiating calibration at step 416, etc.) of the calibration process 470. In an example, in response to a (YES) decision at any of decision steps 436 through 444, the system 100 may return to step 412 and generate the virtual space 155, this time while imaging the calibration phantom 149-a (e.g., a tissue phantom) with the imaging device 112. For example, in repeating the calibration process 470, the system 100 may regenerate the virtual space 155 based on images captured by the imaging device 112 as described herein, but the images may be associated with or include the calibration phantom 149-a (e.g., tissue phantom).
[0170] In an example implementation, after repeating the calibration process 470, the system 100 may again return to step 412 in response to a (YES) decision at any of step 436 through step 444 and generate the virtual space 155, while imaging the calibration phantom 149-a (e.g., tissue phantom) with the imaging device 112.
[0171] In an alternative or additional example, in response to a (NO) decision at any of step 436 through step 444, the system 100 may continue to provide navigation information (e.g., tracking information 167, etc.).
[0172] While providing imaging information (e.g., images 153, etc.) and navigation information (e.g., tracking information 167, etc.), the system 100 may monitor for one or more events in which the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair 247 as described herein. In an example, at decision step 448, the system 100 detects an event in which the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair 247 (Event Detected). At decision step 452, the system 100 determines, from the event detected at decision step 448, whether recalibration is to be performed.
[0173] In an example implementation, the system 100 may detect the amount of distortion in the navigated volume 150 (e.g., discrepancies between navigation data associated with the protrusion pair 247 and imaging data associated with the protrusion pair 247) based on the event detected at decision step 448. Based on the amount of distortion detected by the system 100, the system 100 may return to step 428 (e.g., to perform recalibration) or refrain from returning to step 428 (e.g., abstain from performing recalibration).
[0174] For example, the system 100 may determine that the amount of distortion is greater than a threshold distortion value (YES) at decision step 452, and the system 100 may return to step 428 and repeat the calibration as described with reference to step 432. In another example, the system 100 may determine that the amount of distortion is less than the threshold distortion value (NO) at decision step 452, and the system 100 may continue to provide imaging information and/or navigation information (e.g., tracking information 167, etc.) while monitoring for any of the events described with reference to decision steps 436 through 448. After all of the information for recalibration has been processed, also indicated as (NO) at decision step 452, method 400 ends with the END operation at step 556.
[0175] As supported by aspects of the present disclosure, the system 100 may determine at decision step 452 whether to perform recalibration at any occurrence of an event detected at decision step 448. For example, the system 100 may perform recalibration at each occurrence of an event detected at decision step 448, at each nth occurrence of the event, or at each nth occurrence of the event within a temporal duration. As illustrated and described herein, the example aspects of the method 400 support automatic and continuous (or semi-continuous) recalibration by the system 100.
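One way such a recalibration policy could be expressed in software is sketched below. The threshold, event count, and window length are hypothetical defaults chosen only for illustration; the disclosure does not prescribe particular values.

```python
import time

class RecalibrationPolicy:
    """Decide, at each detected event, whether to trigger recalibration
    (the determination made at decision step 452)."""

    def __init__(self, distortion_threshold_mm=2.0, every_nth_event=5,
                 window_seconds=3600.0):
        self.distortion_threshold_mm = distortion_threshold_mm
        self.every_nth_event = every_nth_event
        self.window_seconds = window_seconds
        self.event_times = []

    def should_recalibrate(self, distortion_mm, now=None):
        now = time.time() if now is None else now
        # Keep only events inside the configured temporal duration.
        self.event_times = [t for t in self.event_times
                            if now - t <= self.window_seconds]
        self.event_times.append(now)

        # Recalibrate whenever the measured distortion exceeds the threshold.
        if distortion_mm > self.distortion_threshold_mm:
            return True
        # Otherwise recalibrate at every nth event within the window.
        return len(self.event_times) % self.every_nth_event == 0
```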
[0176] Fig. 5 illustrates an example of a method 500 in accordance with aspects of the present disclosure. In some examples, method 500 may implement aspects of a computing device 102, an imaging device 112, a robot 114, and a navigation system 118 described with reference to Figs. 1 through 31.
[0177] In the following description of method 500, the operations may be performed in a different order than the order shown, or at different times. Certain operations may also be left out of method 500, or other operations may be added to the method 500.
[0178] It is to be understood that any of the operations of method 500 may be performed by any device (e.g., a computing device 102, an imaging device 112, a robot 114, navigation system 118, etc.) of the system 100 described herein. Generally, method 500 starts with a START operation at step 504 and ends with an END operation at step 532. Method 500 can be executed as a set of computer-executable instructions executed by a computer system (e.g., computing device 102, etc.) and encoded or stored on a computer readable medium. Hereinafter, method 500 shall be explained with reference to the systems, components, modules, applications, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-31.
[0179] Method 500 begins with the START operation at step 504 and proceeds to step 508, where the system 100 generates a navigation space 119 based on one or more tracking signals. After generating the navigation space 119 based on the one or more tracking signals at step 508, method 500 proceeds to step 512, where the system 100 generates a virtual space including at least a portion of a calibration phantom based on one or more images, wherein the one or more images are generated in response to transmitting one or more imaging signals.
[0180] In some aspects of the present disclosure, the calibration phantom includes an ultrasound conductive material. For example, the calibration phantom may include an ultrasound transmitting volume (e.g., water bath). In some aspects of the present disclosure, the calibration phantom includes a tissue phantom included in a body of a subject, wherein the tissue phantom and the subject are associated with a surgical procedure. In some aspects of the present disclosure, the virtual space corresponds to a field of view of the imaging device.
[0181] After generating a virtual space at step 512, method 500 proceeds to step 516, where the system 100 identifies a set of coordinates in the virtual space in response to an event in which the ultrasound beam intersects the midpoint of each protrusion of a protrusion pair.
[0182] In some aspects of the present disclosure, the navigation space and the tracked device are associated with at least one of: an optical tracking system, an acoustic tracking system, an electromagnetic tracking system, a magnetic tracking system, a radar tracking system, an inertial measurement unit (IMU) based tracking system, and a computer vision based tracking system.
[0183] After identifying coordinates in the virtual space at step 516, method 500 proceeds to step 520, where the system 100 calibrates a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in response to the event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the set of coordinates and temporal information associated with the event.
[0184] In some aspects of the present disclosure, calibrating the first coordinate system with respect to the second coordinate system is in response to one or more occurrences of the event. In some aspects, calibrating the first coordinate system with respect to the second coordinate system is absent pausing the surgical procedure.
[0185] In some aspects, calibrating the first coordinate system with respect to the second coordinate system is based on: the beam thickness (e.g., ultrasound beam thickness), the beam shape (e.g., ultrasound beam shape), or both, of the one or more signals transmitted by the imaging device; pose information of the protrusion pair in association with an intersection between the portion of the tracked device and a plane of the virtual space; and one or more properties of the portion of the tracked device.
[0186] In some aspects of the present disclosure, calibrating the first coordinate system with respect to the second coordinate system includes verifying a registration accuracy between the first coordinate system and the second coordinate system.
[0187] After calibrating the first coordinate system with respect to the second coordinate system at step 520, method 500 proceeds to step 524, where the system 100 outputs guidance information associated with positioning the imaging device in association with calibrating the first coordinate system with respect to the second coordinate system.
[0188] In some aspects of the present disclosure, method 500 may include detecting one or more discrepancies between first tracking data corresponding to the protrusion pair and the point of view of the imaging device in association with the navigation space and second tracking data corresponding to the protrusion pair and the point of view in association with the virtual space. In some aspects of the present disclosure, method 500 may include generating a notification associated with the one or more discrepancies, performing one or more operations associated with compensating for the one or more discrepancies, or both. After outputting guidance information at step 524, method 500 ends with END operation at step 532.
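A hedged sketch of such a discrepancy check follows; the tolerance value and the simplification to a single 3-D point (rather than full tracking data) are illustrative assumptions, not a description of the claimed method.

```python
import numpy as np

def check_discrepancy(p_navigation_mm, p_virtual_mm, registration, tolerance_mm=1.5):
    """Compare where the navigation space places a protrusion tip with where
    the (registered) virtual space places it.

    registration: 4x4 matrix mapping virtual-space points into the navigation space.
    Returns (error_mm, notification_text_or_None)."""
    p_mapped = (registration @ np.append(np.asarray(p_virtual_mm, float), 1.0))[:3]
    error_mm = float(np.linalg.norm(p_mapped - np.asarray(p_navigation_mm, float)))
    if error_mm > tolerance_mm:
        return error_mm, f"Registration discrepancy of {error_mm:.1f} mm detected"
    return error_mm, None
```

A discrepancy above the tolerance could drive the notification, the compensation operation, or both.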
[0189] Method 500 (and/or one or more operations thereof) may be carried out or otherwise performed, for example, by at least one processor. The at least one processor may be the same as or similar to the processor(s) 104 of the computing device 102 described above. The at least one processor may be part of a robot (such as a robot 114) or part of a navigation system (such as a navigation system 118). A processor other than any processor described herein may also be used to execute the process flow 500. The at least one processor may perform operations of the process flow 500 by executing elements stored in a memory such as the memory 106. The elements stored in memory and executed by the processor may cause the processor to execute one or more operations of a function as shown in the process flow 500. One or more portions of the process flow 500 may be performed by the processor executing any of the contents of memory, such as image processing 120, a segmentation 122, a transformation 124, and/or a registration 128.
[0190] Fig. 6 illustrates an example of a method 600 in accordance with aspects of the present disclosure. In some examples, method 600 may implement aspects of a computing device 102, an imaging device 112, a robot 114, and a navigation system 118 described with reference to Figs. 1 through 31.
[0191] In the following description of method 600, the operations may be performed in a different order than the order shown, or at different times. Certain operations may also be left out of method 600, or other operations may be added to the method 600.
[0192] It is to be understood that any of the operations of method 600 may be performed by any device (e.g., a computing device 102, an imaging device 112, a robot 114, navigation system 118, etc.) of the system 100 described herein. Generally, method 600 starts with a START operation at step 604 and ends with an END operation at step 644. Method 600 can be executed as a set of computer-executable instructions executed by a computer system (e.g., computing device 102, etc.) and encoded or stored on a computer readable medium. Hereinafter, method 600 shall be explained with reference to the systems, components, modules, applications, software, data structures, user interfaces, etc. described in conjunction with Figs. 1-31.
[0193] Method 600 begins with the START operation at step 604 and proceeds to step 608, where the system 100 detects that the imaging device 112 (e.g., ultrasound probe) is placed in the calibration phantom 149-b. According to embodiments of the present disclosure, the imaging device 112 is wiggled within the calibration phantom 149-b such that the imaging device 112 is provided at various locations within the calibration phantom 149-b. After detecting that the imaging device 112 is placed and moved about within the calibration phantom 149-b at step 608, method 600 proceeds to decision step 612, where the system 100 determines whether a protrusion pair has been detected. According to embodiments of the present disclosure, one protrusion pair is provided at a known location within the calibration phantom 149-b. According to an alternative embodiment of the present disclosure, a plurality of protrusion pairs is provided at known locations within the calibration phantom 149-b. In one embodiment of the present disclosure, the distance between the protrusions is the same for each protrusion pair of the plurality of protrusion pairs. According to an alternative embodiment of the present disclosure, the distance between the protrusions is different for each protrusion pair.
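Where the pairs are distinguished by their spacing, the check at decision step 612 could look roughly like the sketch below. The pair names, nominal spacings, and tolerance are hypothetical illustrations, not values taken from the disclosure.

```python
import numpy as np

# Hypothetical spacings (mm) for a phantom with three protrusion pairs at
# different separations, so each detected pair is also identifiable.
PAIR_SPACINGS_MM = {"pair_a": 10.0, "pair_b": 15.0, "pair_c": 20.0}
SPACING_TOLERANCE_MM = 0.5

def identify_protrusion_pair(centroids_mm):
    """Given the (x, y) centroids (mm) of two cross-sections detected in a
    frame, return the name of the protrusion pair they match, or None."""
    if len(centroids_mm) != 2:
        return None
    (x0, y0), (x1, y1) = centroids_mm
    spacing = float(np.hypot(x1 - x0, y1 - y0))
    for name, nominal in PAIR_SPACINGS_MM.items():
        if abs(spacing - nominal) <= SPACING_TOLERANCE_MM:
            return name
    return None
```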
[0194] If a protrusion pair has not been detected (NO) at decision step 612, method 600 returns to step 608, where the system 100 detects that the imaging device 112 is placed in the calibration phantom 149-b. If a protrusion pair has been detected (YES) at decision step 612, method 600 proceeds to decision step 616, where the system 100 detects protrusion pair tips. According to an embodiment of the present disclosure, the detection of the protrusion pair tips can be enhanced by including element 320-a, which is cylinder shaped, or element 320-b, which is sphere shaped. Element 320-a and element 320-b function more efficiently when the protrusions 247 are provided perpendicular to the ultrasound beam 305. Alternatively, the detection of the protrusion pair tips can be enhanced by including shapes 330-a through 330-c at a tip 325 of the protrusion 247, or the like. The shapes 330 function more efficiently when the protrusions 247 are provided parallel to the ultrasound beam 305.
[0195] If a protrusion pair tip has not been detected (NO) at decision step 616, method 600 returns to step 608, where the system 100 detects that the imaging device 112 is placed in the calibration phantom 149-b. If a protrusion pair tip has been detected (YES) at decision step 616, method 600 proceeds to step 620, where the system 100 retrieves calibration phantom pose data. The calibration phantom pose data is known. According to an embodiment of the present disclosure, the calibration phantom 149-b is provided with a tracking device 140-c. According to an embodiment of the present disclosure, the calibration phantom 149-b generally remains stationary. Therefore, the location of the calibration phantom 149-b with respect to the transmission device 136 is a known value that is stored in the memory 106.
[0196] After retrieving the calibration phantom pose data at step 620, method 600 proceeds to step 624, where the system 100 calculates the protrusion pair tip pose. The protrusion pair tip pose is calculated using the known pose of the calibration phantom 149-b.
[0197] After calculating the protrusion pair tip pose at step 624, method 600 proceeds to step 628, where the system 100 retrieves the protrusion pair tip coordinates in the image space of the imaging device 112. After retrieving the protrusion pair tip coordinates in the image space of the imaging device 112 at step 628, method 600 proceeds to step 632, where the system 100 calculates and saves target-source point pairs. After calculating and saving target-source point pairs at step 632, method 600 proceeds to decision step 636, where the system 100 determines whether enough target-source point pairs have been saved. According to one embodiment of the present disclosure, at least three target-source point pairs are required to initiate calibration of the coordinate system 160 associated with the virtual space 155 (and the imaging device 112) with respect to the coordinate system 165 associated with the navigation space 119 (and the navigation system 118). If there are not enough target-source point pairs saved (NO) at decision step 636, method 600 returns to step 608, where the system 100 detects that the imaging device 112 is placed in the calibration phantom 149-b. If there are enough target-source point pairs saved (YES) at decision step 636, method 600 proceeds to step 640, where the system 100 calculates the registration matrix. After the registration matrix has been calculated at step 640, method 600 ends with the END operation at step 644.
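For illustration, the registration calculation at step 640 can be performed as a standard least-squares rigid fit over the saved target-source point pairs. The minimal NumPy sketch below (a Kabsch-style solution, assuming at least three non-collinear correspondences, with a hypothetical function name) is one way to obtain such a matrix; it is not a statement of the specific algorithm required by the disclosure.

```python
import numpy as np

def rigid_registration(source_points, target_points):
    """Least-squares rigid transform (rotation + translation) mapping source
    points (e.g., tip coordinates in the image space of the imaging device)
    onto target points (e.g., tip poses in the navigation space).

    Both inputs are (N, 3) arrays with N >= 3 non-collinear correspondences.
    Returns a 4x4 homogeneous registration matrix."""
    src = np.asarray(source_points, dtype=float)
    tgt = np.asarray(target_points, dtype=float)
    src_c, tgt_c = src.mean(axis=0), tgt.mean(axis=0)

    # Cross-covariance of the centered point sets, then SVD (Kabsch method).
    H = (src - src_c).T @ (tgt - tgt_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c

    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# Example with three synthetic target-source point pairs (pure translation):
# src = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0]], float)
# tgt = src + np.array([5.0, -2.0, 1.0])
# T = rigid_registration(src, tgt)
```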
[0198] As noted above, the present disclosure encompasses methods with fewer than all of the steps identified herein (and the corresponding description of the respective process flows), as well as methods that include additional steps beyond those identified in the figures and process flows described herein. The present disclosure also encompasses methods that include one or more steps from one method described herein, and one or more steps from another method described herein. Any correlation described herein may be or include a registration or any other correlation.
[0199] The foregoing is not intended to limit the disclosure to the form or forms disclosed herein. In the foregoing Detailed Description, for example, various features of the disclosure are grouped together in one or more aspects, implementations, and/or configurations for the purpose of streamlining the disclosure. The features of the aspects, implementations, and/or configurations of the disclosure may be combined in alternate aspects, implementations, and/or configurations other than those discussed above. This method of disclosure is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed aspect, implementation, and/or configuration. Thus, the following claims are hereby incorporated into this Detailed Description, with each claim standing on its own as a separate preferred implementation of the disclosure.
[0200] Moreover, though the foregoing has included description of one or more aspects, implementations, and/or configurations and certain variations and modifications, other variations, combinations, and modifications are within the scope of the disclosure, e.g., as may be within the skill and knowledge of those in the art, after understanding the present disclosure. It is intended to obtain rights which include alternative aspects, implementations, and/or configurations to the extent permitted, including alternate, interchangeable and/or equivalent structures, functions, ranges or steps to those claimed, whether or not such alternate, interchangeable and/or equivalent structures, functions, ranges or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.
[0201] Any aspect in combination with any one or more other aspects.
[0202] Any one or more of the features disclosed herein.
[0203] Any one or more of the features as substantially disclosed herein.
[0204] Any one or more of the features as substantially disclosed herein in combination
with any one or more other features as substantially disclosed herein.
[0205] Any one of the aspects/features/implementations in combination with any one or more other aspects/features/implementations.
[0206] Use of any one or more of the aspects or features as disclosed herein.
[0207] It is to be appreciated that any feature described herein can be claimed in combination with any other feature(s) as described herein, regardless of whether the features come from the same described implementation.
[0208] The phrases “at least one,” “one or more,” “or,” and “and/or” are open-ended expressions that are both conjunctive and disjunctive in operation. For example, each of the expressions “at least one of A, B and C,” “at least one of A, B, or C,” “one or more of A, B, and C,” “one or more of A, B, or C,” “A, B, and/or C,” and “A, B, or C” means A alone, B alone, C alone, A and B together, A and C together, B and C together, or A, B and C together.
[0209] The term “a” or “an” entity refers to one or more of that entity. As such, the terms “a” (or “an”), “one or more,” and “at least one” can be used interchangeably herein. It is also to be noted that the terms “comprising,” “including,” and “having” can be used interchangeably.
[0210] The term “automatic” and variations thereof, as used herein, refers to any process or operation, which is typically continuous or semi-continuous, done without material human input when the process or operation is performed. However, a process or operation can be automatic, even though performance of the process or operation uses material or immaterial human input, if the input is received before performance of the process or operation. Human input is deemed to be material if such input influences how the process or operation will be performed. Human input that consents to the performance of the process or operation is not deemed to be “material.”
[0211] Aspects of the present disclosure may take the form of an implementation that is entirely hardware, an implementation that is entirely software (including firmware, resident software, micro-code, etc.) or an implementation combining software and hardware aspects that may all generally be referred to herein as a “circuit,” “module,” or “system.” Any combination of one or more computer-readable medium(s) may be utilized. The computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium.
[0212] A computer-readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system,
apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer-readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer-readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
[0213] A computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer-readable signal medium may be any computer-readable medium that is not a computer-readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer-readable medium may be transmitted using any appropriate medium, including, but not limited to, wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
[0214] The terms “determine,” “calculate,” “compute,” and variations thereof, as used herein, are used interchangeably and include any type of methodology, process, mathematical operation or technique.
Claims
1. A system, comprising: a processor (104); and a memory (106) storing instructions that, when executed by the processor, cause the processor to: generate a navigation space (119) based on one or more tracking signals from a calibration phantom (149) and an imaging device (112); generate a virtual space (155) comprising at least a portion of the calibration phantom including a protrusion pair (247) or a single protrusion based on one or more images, wherein the one or more images are generated by one or more signals transmitted by the imaging device; identify sets of coordinates in the virtual space based on an imaging device projection intersecting the protrusion pair or intersecting the single protrusion in the calibration phantom; determine an optimal set of coordinates in the virtual space based on a first event in which the imaging device projection intersects a midpoint of each protrusion of the protrusion pair or based on a second event in which the imaging device projection intersects a tip portion of the single protrusion; and calibrate a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in response to the first event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the optimal set of coordinates and temporal information associated with the first event.
2. The system of claim 1, further comprising a plurality of protrusion pairs, wherein the instructions are further executable by the processor to: determine the optimal set of coordinates in the virtual space based on a third event in which the imaging device projection intersects each protrusion of the protrusion pair of more than one protrusion pair of the plurality of protrusion pairs.
3. The system of claim 2, wherein a distance between protrusions of each protrusion pair of the plurality of protrusion pairs is different.
4. The system of claim 2 or 3, wherein a diameter and a shape of the protrusions of each protrusion pair of the plurality of protrusion pairs is different.
5. The system of any preceding claim, wherein the instructions are further executable by the processor to calibrate the first coordinate system associated with the virtual space with respect to the second coordinate system associated with the navigation space in response to the second event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the optimal set of coordinates and temporal information associated with the second event.
6. The system of claim 1, wherein the tip portion of the single protrusion has a concave shape.
7. The system of claim 6, wherein the concave shape is symmetrical with respect to a center of the concave shape.
8. The system of claim 1, further comprising a plurality of single protrusions, wherein the plurality of single protrusions includes a cylinder or a sphere at the tip portion.
9. The system of any preceding claim, wherein the instructions are further executable by the processor to output guidance information associated with positioning the imaging device in association with calibrating the first coordinate system with respect to the second coordinate system.
10. The system of any preceding claim, wherein: calibrating the first coordinate system with respect to the second coordinate system is in response to one or more occurrences of the first event or the second event; and calibrating the first coordinate system with respect to the second coordinate system is absent pausing a surgical procedure.
11. The system of any preceding claim, wherein calibrating the first coordinate system with respect to the second coordinate system is based on beam thickness, beam shape and/or signal frequency.
12. The system of any preceding claim, wherein the calibration phantom comprises: ultrasound conductive material; or a tissue phantom comprised in a body of a subject, wherein the tissue phantom and the subject are associated with a surgical procedure.
13. The system of claim 1, wherein the protrusion pair is provided perpendicular to the imaging device projection and the single protrusion is provided parallel to the imaging device projection.
14. The system of any preceding claim, wherein the instructions are further executable by the processor to: detect one or more discrepancies between first tracking data corresponding to the imaging device in association with the navigation space and second tracking data corresponding to the imaging device in association with the virtual space; and generate a notification associated with the one or more discrepancies, perform one or more operations associated with compensating for the one or more discrepancies, or both.
15. A system, comprising: an imaging system comprising an imaging device; a tracking system comprising a transmission device; a calibration phantom; a processor; and a memory storing data that, when processed by the processor, cause the processor to: generate a navigation space based on one or more tracking signals; generate a virtual space comprising at least a portion of the calibration phantom including a protrusion pair or a single protrusion based on one or more images, wherein the one or more images are generated by one or more signals transmitted by the imaging device; identify sets of coordinates in the virtual space based on an imaging device projection intersecting the protrusion pair or intersecting the single protrusion in the calibration phantom;
determine an optimal set of coordinates in the virtual space based on a first event in which the imaging device projection intersects a midpoint of each protrusion of the protrusion pair or based on a second event in which the imaging device projection intersects a tip portion of the single protrusion; and calibrate a first coordinate system associated with the virtual space with respect to a second coordinate system associated with the navigation space in response to the first event, wherein calibrating the first coordinate system with respect to the second coordinate system is based on the optimal set of coordinates and temporal information associated with the first event.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363607053P | 2023-12-06 | 2023-12-06 | |
| US63/607,053 | 2023-12-06 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025122631A1 (en) | 2025-06-12 |
Family
ID=93924701
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2024/058483 (WO2025122631A1, pending) | System and method for automatic ultrasound 3d-point detection and selection for ultrasound probe registration for navigation | 2023-12-06 | 2024-12-04 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025122631A1 (en) |
Patent Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20140121501A1 (en) * | 2012-10-31 | 2014-05-01 | Queen's University At Kingston | Automated intraoperative ultrasound calibration |
Non-Patent Citations (2)
| Title |
|---|
| SURE U ET AL: "Intraoperative landmarking of vascular anatomy by integration of duplex and Doppler ultrasonography in image-guided surgery. Technical note", SURGICAL NEUROLOGY, LITTLE, BROWN AND CO., BOSTON, MA, US, vol. 63, no. 2, 2 February 2005 (2005-02-02), pages 133 - 141, XP004847601, ISSN: 0090-3019, DOI: 10.1016/J.SURNEU.2004.08.040 * |
| WANIS FREDERIC A ET AL: "Technical accuracy of the integration of an external ultrasonography system into a navigation platform: effects of ultrasonography probe registration and target detection", ACTA NEUROCHIRCA, SPRINGER VERLAG, AT, vol. 160, no. 2, 8 December 2017 (2017-12-08), pages 305 - 316, XP036394349, ISSN: 0001-6268, [retrieved on 20171208], DOI: 10.1007/S00701-017-3416-5 * |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24827625; Country of ref document: EP; Kind code of ref document: A1 |