WO2025240638A1 - Three-dimensional (3d) cadmium-zinc-telluride (czt) detector system for extended reality intrasurgical guidance - Google Patents

Info

Publication number
WO2025240638A1
WO2025240638A1 (PCT/US2025/029392)
Authority
WO
WIPO (PCT)
Prior art keywords
radiation
patient body
czt
detectors
radioactive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2025/029392
Other languages
French (fr)
Inventor
Steven Brown
Jerimy POLF
Willy Kaye
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
M3d Inc
Original Assignee
M3d Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by M3d Inc filed Critical M3d Inc
Publication of WO2025240638A1
Pending legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01T MEASUREMENT OF NUCLEAR OR X-RADIATION
    • G01T1/00 Measuring X-radiation, gamma radiation, corpuscular radiation, or cosmic radiation
    • G01T1/16 Measuring radiation intensity
    • G01T1/24 Measuring radiation intensity with semiconductor detectors
    • G01T1/249 Measuring radiation intensity with semiconductor detectors specially adapted for use in SPECT or PET
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/02 Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B6/03 Computed tomography [CT]
    • A61B6/037 Emission tomography
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/42 Arrangements for detecting radiation specially adapted for radiation diagnosis
    • A61B6/4208 Arrangements for detecting radiation specially adapted for radiation diagnosis characterised by using a particular type of detector
    • A61B6/4258 Arrangements for detecting radiation specially adapted for radiation diagnosis characterised by using a particular type of detector for detecting non x-ray radiation, e.g. gamma radiation
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/44 Constructional features of apparatus for radiation diagnosis
    • A61B6/4405 Constructional features of apparatus for radiation diagnosis the apparatus being movable or portable, e.g. handheld or mounted on a trolley
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/46 Arrangements for interfacing with the operator or the patient
    • A61B6/461 Displaying means of special interest

Definitions

  • Gamma rays are a form of electromagnetic radiation that is detectable through a semiconductor detector.
  • Gamma rays can interact with the semiconductor detector, resulting in the generation of charge carriers through electron ionization.
  • Negative charge carriers, such as electrons, can travel toward and be collected by an anode (a positively biased electrode), while positive charge carriers, such as holes in the semiconductor detector, can travel toward and be collected by a cathode (a negatively biased electrode).
  • The charge carriers can induce a signal in the electrodes, which can be measured to determine the amount of charge absorbed. Given that the charge carriers derive from interactions of the gamma rays with the semiconductor device, the induced signals in the electrodes can be used to measure the energy absorbed from the gamma-ray interactions in the semiconductor device.
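The charge-to-energy relationship described above can be sketched in a few lines. This is an illustrative aside, not part of the disclosed system: the function name is hypothetical, and the CZT electron-hole pair-creation energy (~4.6 eV) is an assumed approximate constant.

```python
# Illustrative sketch: converting charge collected at the anode into the
# gamma-ray energy deposited in a CZT detector. Constants are assumed
# approximate values, not calibrated detector parameters.

ELEMENTARY_CHARGE_C = 1.602e-19   # charge of one electron, in coulombs
PAIR_CREATION_ENERGY_EV = 4.6     # approx. energy per electron-hole pair in CZT

def deposited_energy_kev(collected_charge_c: float) -> float:
    """Estimate deposited gamma-ray energy (keV) from induced anode charge."""
    n_pairs = collected_charge_c / ELEMENTARY_CHARGE_C
    return n_pairs * PAIR_CREATION_ENERGY_EV / 1000.0  # eV -> keV
```

For example, a 662 keV gamma ray (Cs-137) would create roughly 662000 / 4.6, or about 144,000, electron-hole pairs.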
  • Figure 1 illustrates a simplified schematic view of a 3D-sensitive radiation detection system in accordance with embodiments of the present technology.
  • Figure 2 illustrates a simplified schematic view of a 3D-sensitive radiation detector in operation in accordance with embodiments of the present technology.
  • Figure 3 illustrates a simplified schematic view of a combined depth and radiation image generated by a 3D-sensitive radiation detector in accordance with embodiments of the present technology.
  • Figure 4 illustrates a simplified schematic view of a 3D-sensitive radiation detection system in operation in accordance with embodiments of the present technology.
  • Figures 5B-5C illustrate example arrangements of radiation detectors included in an intraoperative radiation detection system, in accordance with embodiments of the present technology.
  • Figure 7 illustrates a block diagram of an example computing system configured to implement the technical solutions disclosed herein.
  • Radiation is used in medical applications to illuminate anatomic systems.
  • Gamma rays can be emitted from areas in the body containing the injected radioisotopes (referred to as irradiated areas) and detected using a detection device (e.g., a semiconductor device, such as a cadmium-zinc-telluride (CdZnTe or CZT) detector), resulting in the irradiated areas being illuminated in the image of the body.
  • the image can be used to identify lymph nodes that are connected to, and likely infected by, the cancerous tumor to enable the removal of the infected lymph nodes.
  • a semiconductor detector can be pixelated to enable the gamma rays incident upon different x-y locations of the semiconductor detector to be detected through the measurement of charge carriers created from the gamma rays and collected at the pixelated electrodes.
  • One challenge associated with 2D position sensitivity is that the radiation measurement can be sensitive to parallax. That is, because gamma rays are emitted radially from irradiated areas, gamma rays originating from a particular x-y location within the body can be incident upon multiple x-y locations within the semiconductor detectors, often creating unacceptable uncertainty in medical applications.
  • the number of x-y locations within the detector upon which a single gamma ray can be incident increases, resulting in a corresponding increase in uncertainty of the x-y location of the radiated area within the body.
  • 3D position-sensitive detectors provide the ability to overcome some of the disadvantages of 2D position-sensitive detectors and address concerns related to parallax.
  • 3D position-sensitive detectors measure charge drift from the point at which the charge carriers originate within the semiconductor detector to the point at which they are absorbed by the electrodes.
  • the angle of incidence at which a gamma ray is incident upon the semiconductor detector can be determined, which in turn can be used to determine the location of a radiated area from which the gamma ray is emitted. Further details about the use of 3D position-sensitive detectors can be found in U.S. Patent No. 7,411,197 to He et al. and U.S. Patent Publication No. 2009/0114829A1 to He et al., each of which is incorporated by reference herein in its entirety.
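The 3D sensitivity described above rests on two ideas: the interaction depth follows from the charge drift, and a direction of travel follows from multiple interaction points. A minimal sketch, assuming a hypothetical constant drift velocity (real detectors calibrate this per device) and hypothetical function names:

```python
# Illustrative sketch of 3D position sensitivity: depth (z) is inferred
# from the electron drift time to the anode, and an incidence direction is
# approximated from two interaction points of the same gamma ray.

DRIFT_VELOCITY_CM_PER_US = 10.0   # assumed drift speed; illustrative only

def interaction_depth_cm(drift_time_us: float) -> float:
    """Depth of the gamma-ray interaction below the anode plane (cm)."""
    return DRIFT_VELOCITY_CM_PER_US * drift_time_us

def incidence_direction(p1, p2):
    """Unit vector from the first interaction (x, y, z) toward the second,
    approximating the gamma ray's direction of travel in the crystal."""
    dx, dy, dz = (b - a for a, b in zip(p1, p2))
    norm = (dx * dx + dy * dy + dz * dz) ** 0.5
    return (dx / norm, dy / norm, dz / norm)
```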
  • detecting radiation using 3D detectors can require complex calculations.
  • this complex detection of radiation from multiple sources and angles can make it difficult to render equally radiating sources with equal intensity, resulting in blurred or imprecise images.
  • the complexity of these calculations can decrease detection speed, which can make some 3D detectors suboptimal for time-sensitive applications, such as intrasurgical imaging. This complexity only increases when full-body imaging is performed due to the increased number of detections and the larger distances between the irradiated areas and the detector. In some applications, however, it may only be necessary to image a small target area.
  • a surgeon may wish to image a tumor and its surrounding area to determine the location of affected lymph nodes through which radiation from the tumor is drained. Accordingly, there is a need for a 3D position-sensitive radiation detector capable of providing a timely image of a target area.
  • the present technology provides such systems and techniques.
  • a 3D-sensitive radiation detector is disclosed.
  • the 3D-sensitive radiation detector includes a compact semiconductor detector (e.g., 5 cm x 5 cm x 1 cm) capable of providing a fast (e.g., less than one minute), accurate, and low-cost image of the target area.
  • Given its compact size, the semiconductor detector can be placed close (e.g., less than 20, 18, 15, 12, 10, 8, 5 cm, and so on) to the body to specifically image a smaller target area (e.g., within the footprint of the detector).
  • the 3D-sensitive radiation detector can have a larger thickness (e.g., greater than 5, 6, 7, 8, 9, 10, 12, 15, 20 cm, and so on) than other semiconductor detectors to enable a greater number of gamma rays to be detected. As a result, the radiation image can be generated more quickly in comparison to other semiconductor detectors.
  • a mask of radiation-blocking material (e.g., tungsten) can be placed over and around the semiconductor detector to control the locations and angles at which gamma rays can be incident upon the semiconductor detector.
  • openings can be located at select locations of the mask to enable gamma rays to pass from irradiated areas to the semiconductor detectors.
  • the openings can be located to ensure collection of sufficient information for the accurate reconstruction of the radiation source distribution.
  • different mask designs are discussed that include the use of pinhole openings and annular openings.
  • the openings can be designed with particular taper patterns to enable gamma rays from specific locations and angles to be detected.
  • the location and taper patterns of the openings can be considered when determining the location from which a detected gamma ray was emitted.
  • the back tracing of multiple detections of the gamma ray to a particular location from which it was emitted can be limited to locations from which the gamma ray could pass through the mask openings, given their location and taper patterns, and be incident upon the semiconductor detector at the locations at which the detections occurred.
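The back-tracing constraint described above can be sketched as a simple geometric test: a candidate source location is kept only if the ray from it to a detection point crosses the mask plane inside an opening. The planar-mask and circular-pinhole geometry here is an illustrative assumption (the disclosure also contemplates annular and tapered openings), and the function name is hypothetical.

```python
# Illustrative sketch of back-tracing through a pinhole mask: reject any
# candidate source whose ray to the detection point misses the opening.

def passes_through_pinhole(source, detection, pinhole_center, pinhole_radius,
                           mask_z=0.0):
    """Return True if the source->detection ray crosses the mask plane
    (z = mask_z) inside a circular pinhole opening."""
    sx, sy, sz = source
    dx, dy, dz = detection
    if (sz - mask_z) * (dz - mask_z) >= 0:   # both on one side: no crossing
        return False
    t = (mask_z - sz) / (dz - sz)            # parametric crossing location
    cx = sx + t * (dx - sx)
    cy = sy + t * (dy - sy)
    px, py = pinhole_center
    return (cx - px) ** 2 + (cy - py) ** 2 <= pinhole_radius ** 2
```

A source directly above the pinhole passes the test; a laterally offset source whose ray crosses the plane outside the opening is rejected, narrowing the set of locations consistent with a detection.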
  • the radiation image generated by a radiation system can be combined with other modalities for improved analysis, visualization, and the like.
  • the radiation image can be overlaid with an optical image of the patient to show where the radiated tumor is relative to an anatomical view of the patient.
  • a depth sensor can be used to reconstruct an outer surface of the patient, and the radiated area can be presented relative to the outer surface of the patient.
  • the combined images can enable a surgeon to determine where the radiated tumor is within the patient, which can guide the surgeon toward the appropriate place to make an incision to best facilitate removal of the tumor.
  • a laser can be used to point to a portion of the patient that corresponds to the radiated tumor.
  • the laser can provide an on-patient visual indicator that directs the surgeon toward the appropriate place for an incision.
  • the laser can point to a projection of the radiated tumor onto the outer surface of the patient.
  • an intrasurgical detection system includes an extended reality (XR) device that provides immersive visualizations of a patient body that include any radioactive bodies detected by multiple 3D CZT radiation detectors.
  • each 3D CZT radiation detector is configured for fast and accurate imaging based in part on its compact size and ability to be placed close to the body or target region.
  • the multiple radiation detectors are arranged to surround the patient body at various locations and orientations (while being up close to respective target regions), such that the intrasurgical detection system can locate and register a radioactive object to a patient body based on the respective detections of the radioactive object by the multiple radiation detectors.
  • a user (e.g., a surgeon, a medical professional, a technician) of an XR device can receive a visual indication of the detected object when viewing the patient body via the XR device.
  • a surgeon may wear an augmented reality (AR) headset that enhances a view of the patient body with a virtual or artificial representation of a detected radioactive object.
  • FIG. 1 illustrates a simplified schematic view of a 3D-sensitive radiation detection system 100 in accordance with an embodiment of the present technology.
  • the 3D-sensitive radiation detection system 100 includes a base 102, at least one processor 104, a display 106, an arm 108, at least one camera 110, and a radiation detector 112 (e.g., CZT detector).
  • the base 102 can house the at least one processor 104 and support the display 106 and the arm 108.
  • the base 102 can be portable such that it can be easily moved in and out of surgical rooms.
  • the base 102 can include wheels to allow the base to be rolled on the floor.
  • the base 102 can house or support one or more additional input/output devices, such as a keyboard, mouse, controller, printer, or other device.
  • the at least one processor 104 can include any number of processors that perform computations to enable any of the functionality of the 3D-sensitive radiation detection system 100.
  • the at least one processor 104 can include any one or more of a central processing unit (CPU), a graphics processing unit (GPU), a System-on-Chip (SoC), an application-specific integrated circuit (ASIC), and so on.
  • the 3D-sensitive radiation detection system 100 can include an ASIC capable of performing the detections of incident gamma rays at the radiation detector 112 and at least one separate processor 104 capable of generating a radiation image from the detected radiation or providing functionality to the display 106.
  • the display 106 can be supported by the base 102.
  • the display 106 can be housed within the base 102 or be attached to a stand that is connected to or supported by the base 102.
  • the display 106 can include any number of displays, such as a liquid crystal display (LCD), a light-emitting diode (LED) display, or an organic LED (OLED) display.
  • the display 106 can present an image of the patient provided by the camera 110, a radiation image provided by the radiation detector 112, a reconstruction of a surface measured by the depth sensor, an indication of the laser 114, or a combined image of the radiation image from the radiation detector 112 and at least one of the optical images from the camera 110 or the depth image from the depth sensor.
  • the display 106 can include a touch sensor such that input from a user can be received at the display 106. In this way, a particular radiated area can be selected from multiple areas displayed within the image, the displayed image can be adjusted or rotated, or one or more components of the 3D-sensitive radiation detection system 100 can be controlled.
  • the arm 108 can attach to the base 102 and support the camera 110, the depth sensor, the radiation detector 112, or the laser 114.
  • the arm 108 can be functionally coupled to actuators to position the arm 108 such that the camera 110, the depth sensor, or the radiation detector 112 is in a particular configuration for imaging.
  • the arm 108 can be positioned into a variety of configurations to enable local imaging at different locations on differently sized patients.
  • the camera 110 can be disposed at the arm 108 to enable imaging of the patient.
  • the camera 110 can provide an optical image of the patient that can be overlaid with the radiation image to provide greater detail about the specific location of a radiated area within the patient.
  • the radiation detector 112 can similarly be disposed at the arm 108. As illustrated, the radiation detector 112 is located at the distal portion of the arm 108 to enable positioning the radiation detector 112 close to a target area.
  • the radiation detector 112 can be relatively small (e.g., 5 cm x 5 cm x 1 cm) to enable the radiation detector 112 to detect radiation within a small target area.
  • the radiation detector 112 can be pixelated in the z-dimension to enable the detection of gamma rays at a particular x-y-z location.
  • the radiation detector 112 can include semiconductive material (e.g., CZT, 3D pixelated scintillators, and so on) and one or more biased electrodes (e.g., anode and cathode).
  • the semiconductive material in the radiation detector 112 can be thicker than in other radiation detectors (e.g., approximately 1 cm) to enable a greater number of gamma ray detections.
  • the radiation detector 112 can further include circuitry to provide functionality to the radiation detector 112.
  • the radiation detector 112 can include circuitry connected to the biased electrodes and usable to measure electrical properties at the electrode.
  • the circuitry (e.g., an ASIC) can perform one or more operations to trace multiple detections of a gamma ray to a single source.
  • One or more of the at least one processor 104 can co-register the radiation image from the radiation detector 112 with one or more other images.
  • the radiation image can be co-registered with the optical image collected by the camera 110 to display the radiated area overlaid on the patient.
  • the radiation image or the optical image can be further overlaid with the depth image collected by the depth sensor such that a depth of the radiated area is indicated.
  • the depth image can be used to determine the distance of the patient from the depth sensor (e.g., with the additional accuracy needed for surgical applications, for example, mm-level accuracy), which can be used with the depth data determined from the radiation image to determine the depth of the radiated area from an exposed surface of the patient.
  • the depth of the radiated area from the exposed surface of the patient can be indicated within the image.
  • the optical image and the depth image can be used to acquire different views of the patient.
  • the optical image can provide a view of the patient in a plane in which the optical image is taken, while the depth camera can provide a 3D view of the surface of the patient that can be rotated or seen from different angles.
  • the images can be co-registered based on the locations of the various sensors used to collect the images (e.g., the camera 110, the depth sensor, or the radiation detector 112).
  • the images can be co-registered by adjusting the images to match a single point of reference. As precision is paramount in surgery, the co-registration can be performed with mm-level accuracy.
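Adjusting images to a single point of reference can be sketched as applying each sensor's known pose to its measurements. This is an illustrative simplification with translation-only offsets and hypothetical names; a real system would apply full rigid transforms (rotation plus translation) calibrated to mm-level accuracy.

```python
# Illustrative sketch of co-registration: map each sensor's points into a
# shared reference frame using that sensor's known mounting offset.
# Translation-only is an assumed simplification; rotations are omitted.

def to_reference_frame(point, sensor_offset):
    """Translate a sensor-frame point (x, y, z) into the shared frame."""
    return tuple(p + o for p, o in zip(point, sensor_offset))

# Assumed example mounting positions (cm) relative to the shared frame:
RADIATION_DETECTOR_OFFSET = (0.0, 0.0, 12.0)
DEPTH_SENSOR_OFFSET = (2.0, 0.0, 12.0)
```

A radiated area from the radiation detector and a surface point from the depth sensor, both mapped through their respective offsets, can then be overlaid in one image.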
  • the radiation image can be co-registered with the depth image to create a representation of the radiated area relative to an outer surface of the patient.
  • the depth image can be used to generate a reconstruction of the outer surface of the patient (e.g., a mesh of the patient), and the radiation image can be used to reconstruct the radiated area.
  • the radiation detector 112 and the depth sensor can measure the depth of the radiated area and the outer surface of the patient, respectively.
  • the combined image from the radiation image and depth image can indicate the depth between the radiated area and the outer surface of the patient. In aspects, this can provide a surgeon information regarding the depth of incision needed to reach the radiated area.
  • An optical laser 114 can further be attached to the arm 108 or elsewhere on the 3D-sensitive radiation detection system 100.
  • the laser 114 can be positioned (e.g., by one or more of the at least one processor 104) to point at the outer surface of the patient that corresponds to the radiated area.
  • the laser 114 can point to a portion of the outer surface of the patient that corresponds to a projection of the radiated area on the outer surface of the patient.
  • the radiation detector 112 can detect multiple irradiated areas that are displayed within an image presented on the display 106.
  • the user can select between the multiple irradiated areas (e.g., using the display 106 or another input device), and the laser 114 can be positioned to point toward a portion of the outer surface of the patient that corresponds to the selected radiated area.
  • the laser 114 can point to a radioactive area with the greatest magnitude of radiation.
  • the positioning of the laser 114 can be determined based on the location of the camera 110, the depth sensor, the radiation detector 112, the laser 114, the selected radiated area (e.g., determined from the radiation image), or the outer surface of the patient (e.g., determined from the camera 110 or the depth sensor).
  • FIG. 2 illustrates a simplified schematic view of a 3D-sensitive radiation detector 112 in operation in accordance with an embodiment of the present technology.
  • the radiation detector 112 is used to image a target area 202 (e.g., phantom breasts).
  • the radiation detector 112 is placed in close proximity to the target area 202 such that a local radiation image 204 of the target area 202 can be created.
  • the configuration and position of the radiation detector 112 can enable generation of the local radiation image 204 in less than one minute.
  • the local radiation image 204 can include one or more irradiated areas 206 (e.g., radiated area 206-1 and radiated area 206-2).
  • the irradiated areas 206 can be presented with a particular color to indicate an amount of radiation detected.
  • the radiated area 206-1 can appear red to indicate a greater amount of radiation than detected at the radiated area 206-2, which appears blue.
  • different portions within the same radiated area can appear with different severity. As illustrated, a lesser-radiated portion of the radiated area 206-2 appears blue, and a greater-radiated portion of the radiated area 206-2 appears red.
  • the local radiation image 204 can be overlaid with an image from a camera (e.g., camera 110 described in Figure 1).
  • the local radiation image 204 includes irradiated areas 206 overlaid on an optical image of the target area 202.
  • the irradiated areas 206 can be displayed at locations that correspond to projections of the irradiated areas 206 on a plane on which the image from the camera is taken.
  • the irradiated areas 206 are tagged, labeled, flagged, and/or the like with indicators in the local radiation image 204, a camera image, and/or a combination/overlay of the local radiation image 204 and the camera image.
  • the indicators that correspond to the irradiated areas 206 in the images include numerical values that indicate a (3D) depth of the irradiated areas 206, and these depth values may be determined according to the techniques and aspects of the technology discussed herein.
  • radiated area 206-1 may be labeled in the image shown in Figure 2 with an indicator (e.g., a visual label, box, flag, or the like that is overlaid on the image) of “5 mm deep” and radiated area 206-2 may be labeled with an indicator of “3 mm deep.”
  • the indicators can indicate the depth of the irradiated areas 206 from the sensors (e.g., the radiation detector 112, the camera 110 of Figure 1, or the depth sensors) or from the outer surface of the target area 202.
  • the depth can be determined based on a difference in the depth of the irradiated areas 206 (e.g., determined using the radiation detector 112) and the depth of the outer surface of the target area 202 (e.g., determined from the camera 110 of Figure 1 or the depth sensor).
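The depth difference described above is simple arithmetic. A hedged sketch (function names are hypothetical) producing labels in the same "5 mm deep" indicator format discussed for Figure 2:

```python
# Illustrative sketch: depth of an irradiated area beneath the exposed
# surface, computed from the radiation detector's depth estimate and the
# depth sensor's surface estimate (both measured from the sensor plane).

def depth_below_surface_mm(area_depth_mm: float, surface_depth_mm: float) -> float:
    """Depth of the irradiated area beneath the exposed surface (mm)."""
    return area_depth_mm - surface_depth_mm

def depth_label(area_depth_mm: float, surface_depth_mm: float) -> str:
    """Indicator text like the '5 mm deep' labels overlaid on the image."""
    return f"{depth_below_surface_mm(area_depth_mm, surface_depth_mm):.0f} mm deep"
```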
  • a 3D-sensitive radiation detection system is configured to, as an alternative or in addition to using color grading to indicate relative radiation intensity, use color grading to indicate depth.
  • a 3D-sensitive radiation detection system is configured to enable dynamic toggling between relative radiation intensity being indicated by colors in the image and 3D depth being indicated by the colors in the image.
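The color-grading toggle could be sketched as follows. The blue-to-red ramp, dictionary layout, and function names are illustrative assumptions, not the disclosed implementation.

```python
# Illustrative sketch of the color-grading toggle: the same irradiated
# areas can be colored by relative radiation intensity or by 3D depth.

def color_for(value, vmin, vmax):
    """Map a value onto an (r, g, b) ramp: blue at vmin, red at vmax."""
    t = 0.0 if vmax == vmin else (value - vmin) / (vmax - vmin)
    t = min(max(t, 0.0), 1.0)
    return (t, 0.0, 1.0 - t)

def colorize(areas, mode="intensity"):
    """areas: list of dicts with 'intensity' and 'depth_mm' keys.
    mode selects which quantity drives the color grading."""
    key = "intensity" if mode == "intensity" else "depth_mm"
    values = [a[key] for a in areas]
    lo, hi = min(values), max(values)
    return [color_for(a[key], lo, hi) for a in areas]
```

Toggling simply re-runs `colorize` with the other mode, so the displayed colors switch meaning without re-acquiring any data.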
  • FIG. 3 illustrates a simplified schematic view of a combined depth and radiation image 300.
  • the combined depth and radiation image 300 includes a radiated area 302 (e.g., generated by a radiation detector, such as the radiation detector 112 of Figure 1) and a 3D mesh 304 representing an exposed surface of the patient (e.g., generated by a depth sensor, such as the depth sensor of Figure 1).
  • the radiated area 302 and the 3D mesh 304 can be co-registered such that the radiated area 302 and the 3D mesh 304 are located at the same relative positions as the radiated tumor and the exposed surface of the patient. This co-registration can be performed by adjusting the radiation image or the depth image to a common frame of reference.
  • the radiated area 302 can be represented in 3D.
  • the radiated area 302 can be located at X-Y-Z locations that correlate to detections at the radiation detector.
  • the 3D mesh 304 can include the X-Y-Z locations of an exposed surface of the patient.
  • the combined depth and radiation image 300 can be rotated (e.g., along the X, Y, or Z axes or any combination thereof) or translated (e.g., along the X, Y, or Z axes or any combination thereof) automatically or through user input. In doing so, a surgeon can gain additional information and perspective on the location of the radiated area 302 relative to the 3D mesh 304.
  • FIG. 4 illustrates a simplified schematic view of a 3D-sensitive radiation detection system 100 in operation in accordance with an embodiment of the present technology.
  • a combined image is presented on the display.
  • the combined image is generated from a radiation image of a patient 402 from the radiation detector 112 that has been overlaid with an optical image of the patient 402 from the camera 110 and, in some cases, a depth image of the patient 402 from the depth sensor.
  • the combined radiation and optical image is presented on the display 106.
  • a user can toggle the display (e.g., through user input) to present the combined radiation and optical image or the combined radiation and depth image.
  • the user can similarly alter or adjust the combined image presented on the display through user input. For example, the user can zoom in or out, translate the image, rotate the image, or make any other adjustment through user input.
  • the camera 110 and the laser 114 are located on a different portion of the arm 108 from the radiation detector 112; however, in other examples, the camera 110, the depth sensor, or the laser 114 can be located on a same portion of the arm 108 as the radiation detector 112.
  • the camera 110 can be used to take a single still image that is co-registered with the radiation images or real-time images that are repeatedly co-registered with radiation images from the radiation detector 112.
  • the laser 114 can be positioned toward a point on the surface of the patient 402 that corresponds to a radiated area (e.g., a radiated tumor).
  • the laser 114 can point to a portion of the surface of the patient 402 that corresponds to a projection of the radiated area onto the surface of the patient 402 (e.g., on a plane normal to gravity or on a plane on which the optical image is taken). In doing so, a surgeon can be provided a visual indicator of an appropriate place to make an incision to remove a targeted tumor.
  • the laser 114 can be positioned toward a projection of one or more of the irradiated areas 404 based on a co-registration of the laser with the radiation image, optical image, or depth image. For example, the position of the laser 114 relative to the positions of the camera 110, the depth sensor, and the radiation detector 112 when the respective images are taken can be used to determine the appropriate positioning of the laser 114 to direct the laser toward a portion of the patient 402 that corresponds to the one or more irradiated areas 404 in the combined image.
  • An XR device 502 that is an AR/VR headset comprises an XR display integrated within the headset, the XR display displaying visual content (e.g., an indication of a radiation source 506) provided by a processing subsystem of the AR/VR headset.
  • An XR device 502 implemented as AR eyewear or eyeglasses is configured to project virtual content (e.g., via a mini-projector) to combine the virtual content (e.g., an indication or representation of a radiation source 506) with the real optical view that a user has through at least one clear, transparent, semi-transparent, or translucent lens of the eyewear/eyeglasses.
  • the AR eyewear projects the virtual content onto a semi-transparent lens of the AR eyewear.
  • the AR eyewear projects the virtual content onto the eyes of the user/wearer.
  • the intraoperative radiation detection system 500 may provide the immersive visualization of the radiation source 506 detected within the patient body 504 on a dynamic basis. According to this dynamic basis, the radiation source 506 continues to be visualized while the patient body continues to be captured or visualized by the XR device 502 at different perspective angles and locations. As such, a user can enjoy the immersive and augmented visualization while moving around relative to the patient body 504, for example, when performing medical or surgical operations on the patient body 504.
  • the intraoperative radiation detection system 500 is configured to detect the radiation source 506 in order to virtually represent the radiation source 506 in an XR environment, such as an augmented reality (AR) environment or a virtual reality (VR) environment.
  • Detecting the radiation source 506 includes determining a location of the radiation source 506, and the location of the radiation source 506 can be determined via an image reconstruction technique or localization technique based on the different locations of multiple 3D CZT detectors.
  • a 3D image reconstruction process or technique is performed to detect the radiation source 506 and its location.
  • the 3D image reconstruction process can include generating a 3D image measuring radiation at different points throughout the 3D space of a patient body based on individual detection data from each of multiple 3D CZT detectors.
  • the 3D image may then be analyzed to detect one or more radioactive objects or sources based on the objects or sources expressing relatively more radiation signal than their surroundings.
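The analysis step described above, flagging voxels that express relatively more radiation signal than their surroundings, can be given as a hedged sketch. The mean-plus-spread threshold rule, the dictionary representation of the volume, and the function name are illustrative assumptions rather than the disclosed reconstruction method.

```python
# Illustrative sketch of source detection in a reconstructed 3D radiation
# image: flag voxels whose signal stands well above the image background.

def find_source_voxels(volume, k=3.0):
    """volume: dict mapping (x, y, z) voxel index -> radiation signal.
    Returns voxel indices whose signal exceeds mean + k * stdev,
    i.e., candidate radioactive sources."""
    values = list(volume.values())
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    threshold = mean + k * var ** 0.5
    return [idx for idx, v in volume.items() if v > threshold]
```

In practice a production system would likely cluster adjacent flagged voxels into distinct sources; this sketch only illustrates the thresholding idea.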
  • detection of the radiation source 506 and its location allows a virtual representation of the radiation source 506 to be combined with a real view of the patient body.
  • the radiation source 506 can be virtually placed in the AR experience relative to the patient body based on registering the detected location of the radiation source 506 with sensed or detected locations of the patient body.
  • determining the location of the radiation source 506 allows a virtual representation or avatar of the radiation source 506 to be placed within a 3D virtual environment, specifically within a virtual representation or avatar of the patient body in the 3D virtual environment. While the present disclosure may focus on aspects related to an AR experience, the disclosed embodiments are applicable also to VR experiences and other XR experiences.
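For illustration only (not part of the claimed subject matter), the source-detection step described above — analyzing a reconstructed 3D image to find regions expressing more radiation signal than their surroundings — can be sketched as follows. The voxel-map representation, threshold factor `k`, and function names are assumptions made for this sketch:

```python
def find_hot_voxels(radiation_map, k=3.0):
    """Flag voxels whose intensity exceeds k times the mean background level.

    radiation_map: dict mapping (x, y, z) voxel indices to measured intensity.
    Returns (voxel, intensity) pairs sorted by intensity, descending.
    """
    values = list(radiation_map.values())
    background = sum(values) / len(values)
    hot = [(v, i) for v, i in radiation_map.items() if i > k * background]
    return sorted(hot, key=lambda pair: pair[1], reverse=True)


def source_centroid(hot_voxels):
    """Estimate a point-like source location as the intensity-weighted
    centroid of the flagged voxels."""
    total = sum(i for _, i in hot_voxels)
    return tuple(
        sum(v[axis] * i for v, i in hot_voxels) / total for axis in range(3)
    )
```

A practical system would operate on calibrated reconstruction output rather than raw counts, but the thresholding-and-centroid pattern is the same.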
  • the intraoperative radiation detection system 500 includes a plurality of 3D CZT detectors 508 arranged around the patient body 504.
  • the location of the radiation source 506 can be determined (e.g., localized, triangulated) based on individual detection data generated by the plurality of 3D CZT detectors 508 and the known locations of the 3D CZT detectors 508.
  • Each of the 3D CZT detectors 508 is configured to detect the radiation source 506 with 3D sensitivity and accordingly can determine an object’s depth, or the object’s relative distance from the respective detector.
  • a location of the object can be estimated through triangulation.
  • Triangulation of a radiation source 506 may be effective for point-like radiation sources like lymph nodes; however, a 3D reconstruction of radiation source distribution may be more effective to detect radiation sources that may not be point-like.
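As a hedged illustration of the triangulation idea above — each detector contributes a known location, an estimated direction, and an estimated depth toward a point-like source — one simple combination rule is to average the per-detector position estimates. The data layout and function name here are assumptions of this sketch, not the disclosed method:

```python
def estimate_location(observations):
    """Combine per-detector estimates of a point-like source location.

    observations: list of (origin, direction, distance) tuples, where origin
    is the detector's known location, direction is a unit vector toward the
    source, and distance is the depth estimated by the 3D-sensitive detector.
    Returns the mean of the individual position estimates.
    """
    points = [
        tuple(o[a] + r * d[a] for a in range(3)) for o, d, r in observations
    ]
    n = len(points)
    return tuple(sum(p[a] for p in points) / n for a in range(3))
```

A more robust implementation might instead solve a least-squares intersection of the detector rays, which tolerates noisy depth estimates; as the text notes, non-point-like sources call for full 3D reconstruction rather than triangulation.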
  • the 3D CZT detectors 508 may be configured for 3D-sensitive radiation detection according to aspects of the present disclosure.
  • the 3D CZT detectors 508 may include thick semiconductive material and may be pixelated in three-dimensions.
  • the individual detection data generated by each 3D CZT detector 508 may include histograms of gamma ray hits that are detected according to the detector’s 3D-sensitive pixelation.
  • the distance of each detector’s known location from the patient body affects the speed at which the radiation source 506 can be detected.
  • each known location at which a 3D CZT detector 508 is positioned is within a distance of the patient body (e.g., 10 centimeters, 25 centimeters, 50 centimeters, one meter) that allows for sufficiently rapid detection of the radioactive object for visual indication thereof in a real-time XR application.
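The distance/speed trade-off noted above follows from the inverse-square fall-off of count rate with distance for an idealized point source. The sketch below (all numbers and names illustrative, not from the disclosure) estimates dwell time to collect a target number of counts:

```python
def acquisition_time(distance_m, reference_rate_hz, reference_distance_m=0.1,
                     counts_needed=1000):
    """Estimate how long a detector must dwell to collect counts_needed
    counts, assuming the count rate falls off with the inverse square of
    distance from an idealized point source.

    reference_rate_hz is the measured count rate at reference_distance_m.
    """
    rate = reference_rate_hz * (reference_distance_m / distance_m) ** 2
    return counts_needed / rate
```

Under this model, doubling the detector-to-source distance quadruples the acquisition time, which is why close placement (e.g., within tens of centimeters) supports real-time XR applications.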
  • the 3D CZT detectors 508 may be arranged at locations surrounding the patient body 504, for radiation detection from different points-of-view. In some embodiments, at least one of the 3D CZT detectors 508 is located at a height below the patient body 504; for example, a 3D CZT detector 508 may be located within a platform on which the patient body 504 is resting. In some embodiments, the system includes at least two 3D CZT detectors, with the two 3D CZT detectors being positioned and oriented orthogonal to one another.
  • one 3D CZT detector is positioned above the patient body and oriented downwards at the patient body, and another 3D CZT detector is positioned to the side of the patient body and oriented sideways at the patient body (orthogonal to the downwards orientation of the other detector).
  • a plurality of the 3D CZT detectors 508 are circumferentially arranged around a sectional portion of the patient body 504 intersecting a longitudinal axis 505 of the patient body 504.
  • In Figure 5B, an example arrangement of 3D CZT detectors 508 circumferentially surrounding a cross-section of the patient body 504 is illustrated.
  • radiation detection information can be provided via XR applications in quasi-real-time. Because certain XR applications involve intra-operative radiation detection, the detectors 508 mounted on a ring structure can be angled away from a plane of the ring structure that sections the body, so that the ring structure and the detectors do not obstruct a target region.
  • an optical camera may be co-located with or comprised within a 3D CZT detector.
  • the optical camera is oriented in a same or similar orientation as the 3D CZT detector.
  • the optical camera can generate optical image data capturing the patient body in a similar field-of-view as the 3D CZT detector. Edges of the patient body can be determined from the optical image data, such that a radioactive object’s two-dimensional location from the perspective of the detector-camera can be determined.
  • optical image data from optical cameras co-located with 3D CZT detectors can be used to detect or estimate a radioactive object’s location within the patient body.
  • the optical image data from optical cameras is used to generate XR content that combines a virtual representation of a detected radioactive object with a real view of the patient body.
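One way the co-located camera can support the overlay described above — again, an illustrative sketch rather than the disclosed implementation — is to project a detected 3D source location into the camera's image plane with a pinhole model, yielding pixel coordinates at which a marker can be drawn on the optical image:

```python
def project_to_image(point_cam, focal_px, cx, cy):
    """Project a 3D point (in the camera's coordinate frame, z pointing
    forward) onto the image plane of an ideal pinhole camera.

    focal_px is the focal length in pixels; (cx, cy) is the principal point.
    Returns (u, v) pixel coordinates, or None if the point is behind the
    camera and nothing should be drawn.
    """
    x, y, z = point_cam
    if z <= 0:
        return None  # behind the camera
    u = cx + focal_px * x / z
    v = cy + focal_px * y / z
    return (u, v)
```

A deployed system would also account for lens distortion and the camera's calibrated pose relative to the detector; those details are omitted here.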
  • the XR device 502 can then determine a triangulated location of a radioactive object detected by multiple of the 3D CZT detectors 508.
  • computational load is offloaded from the XR device 502 onto a computing station 514.
  • the XR device 502 receives a location of a radioactive object determined from a 3D image reconstruction performed by the computing station 514.
  • the computing station 514 may be a desktop or laptop computer including at least one processor (e.g., the at least one processor 104) for processing detection data generated by 3D CZT detectors 508, for example, by performing a 3D image reconstruction process.
  • the computing station 514 may be a mobile device paired with the XR device 502, a server or cloud computing platform associated with the XR device 502, and/or the like configured to execute computational workloads related to XR experiences provided via the XR device 502. According to various examples, the computing station 514 may process and analyze the individual detection data to determine a location of a radioactive object and may provide the determined location to the XR device 502. In some embodiments, the computing station computes a 3D image of a radiation source distribution and receives queries from the XR device 502 for the 3D image whenever the XR device 502 requires the 3D image to provide XR (e.g., AR, VR) content.
  • the XR device 502 provides an XR experience, such as an AR experience, using the determined location of a radioactive object detected by multiple of the 3D CZT detectors 508.
  • the XR device 502 includes sensors (e.g., optical cameras, light detection and ranging (LiDAR) sensors, depth sensors, infrared cameras) to provide a real capture and sensing of the patient body 504 (e.g., a video captured by optical cameras, a visualization based on LiDAR dot readings).
  • the XR device 502 can then provide, to its user, an enhanced capture of the patient body 504 that includes a virtual or artificial representation of the radioactive object.
  • the XR device 502 may register the determined location of radioactive object with the patient body 504.
  • the XR device 502 may be operated with physical reference markers that can be placed on or around the patient body, on or around the intraoperative detection system (e.g., by a 3D CZT detector), and/or the like, to allow for the XR device 502 to calibrate and register locations within the environment.
  • the XR device 502 is configured to use optical image data captured by optical cameras comprised in or co-located with the 3D CZT detectors, the optical image data including edges of the patient body, for example.
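For illustration of the registration step, the sketch below estimates the offset between the detector frame and the XR frame from matched reference-marker positions observed in both frames. It is deliberately simplified — it assumes the two frames share an orientation, whereas a full registration would also solve for rotation (e.g., with the Kabsch algorithm); all names are assumptions of this sketch:

```python
def estimate_translation(markers_xr, markers_detector):
    """Estimate the translation aligning the detector frame to the XR frame
    from matched marker positions (same order in both lists), as the
    difference of the two marker centroids."""
    n = len(markers_xr)
    centroid = lambda pts: tuple(sum(p[a] for p in pts) / n for a in range(3))
    c_xr = centroid(markers_xr)
    c_det = centroid(markers_detector)
    return tuple(c_xr[a] - c_det[a] for a in range(3))


def to_xr_frame(point_detector, translation):
    """Map a detected source location from the detector frame into the XR
    device's frame using the estimated translation."""
    return tuple(point_detector[a] + translation[a] for a in range(3))
```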
  • the XR experience may incorporate co-registration with computed tomography (CT) imaging for overlaying radiation imaging with internal anatomy, co-registration with pre-operative single-photon emission computed tomography (SPECT), positron emission tomography (PET), and/or magnetic resonance imaging (MRI), visual guidance (e.g., arrows) showing where to move one or more 3D CZT detectors for a better view or image, virtual labels for each detector with its count rate (to aid in positioning of the instrument), overlays of patient vitals (e.g., heart rate, oxygen levels), and rotation/manipulation of various virtual objects in 3D based on user input.
  • Figure 6 illustrates a process 600 that includes example operations for intraoperative and immersive visualization of radioactive objects detected within a patient body.
  • the example operations of process 600 are implemented by an XR device, such as a virtual reality or augmented reality headset, a mobile device with a camera and an augmented reality user application, and/or the like.
  • the XR device includes at least one memory storing executable instructions that, when executed by at least one processor of the XR device, cause the XR device to perform the example operations of process 600.
  • the XR device detects a radiation source within a patient body based on individual detection data from multiple 3D-sensitive radiation detectors.
  • the radiation source may be an anatomical structure (or a portion thereof) inside the patient body, the anatomical structure emitting gamma rays based on its uptake of radiotracers.
  • the radiation source is detected via a 3D image reconstruction from the individual detection data.
  • the 3D image reconstruction can be performed to produce a three-dimensional radiation map, and one or more radiation sources can be identified by processing the three-dimensional radiation map.
  • the radiation source is detected in the individual detection data from the detectors, and the source’s location is localized (e.g., triangulated) based on the known locations of the detectors.
  • each of the 3D-sensitive radiation detectors can determine a relative x-y-z location of gamma rays incident thereupon, and the location is determined based on this detection data from multiple of the 3D-sensitive radiation detectors and known locations associated with the 3D-sensitive radiation detectors.
  • the 3D-sensitive radiation detectors are movable, and the known location associated with a 3D-sensitive radiation detector is estimated by a position sensor co-located with the 3D-sensitive radiation detector.
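Combining the two ideas above — detector-local x-y-z detections and a known (possibly sensor-estimated) detector pose — requires converting local interaction positions into world coordinates. A minimal sketch, with the pose representation chosen for illustration only:

```python
def to_world(local_xyz, detector_pose):
    """Convert a position measured in a detector's local x-y-z grid into
    world coordinates.

    detector_pose is (origin, axes): origin is the detector's known world
    location (e.g., from a co-located position sensor), and axes are the
    detector's three local basis vectors expressed in world coordinates.
    """
    origin, axes = detector_pose
    return tuple(
        origin[a] + sum(local_xyz[i] * axes[i][a] for i in range(3))
        for a in range(3)
    )
```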
  • the XR device provides a view of the patient body with a virtual representation of the radiation source based on the detected location of the radiation source.
  • the XR device augments an optical view of the patient body with the virtual representation of the radioactive object.
  • the XR device may sense and map the patient body in order to register the detected location of the radiation source to the patient body, or to virtually position and align the detected location relative to (and within) the patient body.
  • the XR device visualizes the radiation source within a virtual environment in which the patient body is virtually represented.
  • the XR device may continuously show the virtual representation of the radiation source as long as the view of the patient body is provided.
  • FIG. 7 is a block diagram that illustrates an example of a computer system 700 in which at least some operations described herein can be implemented.
  • the computer system 700 is an XR device (e.g., a VR/AR headset), a computing station coupled to the XR device (e.g., for offloading some computational load from the XR device), a computing system coupled with a plurality of 3D-sensitive radiation detectors, and/or the like.
  • the computer system 700 can include: one or more processors 702, main memory 706, non-volatile memory 710, a network interface device 712, a video display device 718, an input/output device 720, a control device 722 (e.g., keyboard and pointing device), a drive unit 724 that includes a machine-readable (storage) medium 726, and a signal generation device 730 that are communicatively connected to a bus 716.
  • the bus 716 represents one or more physical buses and/or point-to-point connections that are connected by appropriate bridges, adapters, or controllers.
  • Various common components (e.g., cache memory) are omitted for illustrative simplicity.
  • the computer system 700 is intended to illustrate a hardware device on which components illustrated or described relative to the examples of the figures and any other components described in this specification can be implemented.
  • the computer system 700 can take any suitable physical form.
  • the computing system 700 can have an architecture similar to that of a server computer, personal computer (PC), tablet computer, mobile telephone, game console, music player, wearable electronic device, network-connected (“smart”) device (e.g., a television or home assistant device), AR/VR systems (e.g., head-mounted display), or any electronic device capable of executing a set of instructions that specify action(s) to be taken by the computing system 700.
  • the computer system 700 can be an embedded computer system, a system-on-chip (SOC), a single-board computer system (SBC), or a distributed system such as a mesh of computer systems, or it can include one or more cloud components in one or more networks.
  • one or more computer systems 700 can perform operations in real time, in near real time, or in batch mode.
  • the network interface device 712 enables the computing system 700 to mediate data in a network 714 with an entity that is external to the computing system 700 through any communication protocol supported by the computing system 700 and the external entity.
  • Examples of the network interface device 712 include a network adapter card, a wireless network interface card, a router, an access point, a wireless router, a switch, a multilayer switch, a protocol converter, a gateway, a bridge, a bridge router, a hub, a digital media receiver, and/or a repeater, as well as all wireless elements noted herein.
  • the memory (e.g., main memory 706, non-volatile memory 710, machine-readable medium 726) can be local, remote, or distributed. Although shown as a single medium, the machine-readable medium 726 can include multiple media (e.g., a centralized/distributed database and/or associated caches and servers) that store one or more sets of instructions 728.
  • the machine-readable medium 726 can include any medium that is capable of storing, encoding, or carrying a set of instructions for execution by the computing system 700.
  • the machine-readable medium 726 can be non-transitory or comprise a non-transitory device.
  • a non-transitory storage medium can include a device that is tangible, meaning that the device has a concrete physical form, although the device can change its physical state.
  • non-transitory refers to a device remaining tangible despite this change in state.
  • Other examples of machine-readable storage media include volatile and non-volatile memory 710, removable flash memory, hard disk drives, optical disks, and transmission-type media such as digital and analog communication links.
  • routines executed to implement examples herein can be implemented as part of an operating system or a specific application, component, program, object, module, or sequence of instructions (collectively referred to as “computer programs”).
  • the computer programs typically comprise one or more instructions (e.g., instructions 704, 708, 728) set at various times in various memory and storage devices in computing device(s).
  • the instruction(s) When read and executed by the processor 702, the instruction(s) cause the computing system 700 to perform operations to execute elements involving the various aspects of the disclosure.
  • Solution 1 An intraoperative radiation detection system comprising: a plurality of cadmium-zinc-telluride (CZT) radiation detectors configured for 3D-position-sensitive detections of incident radiation emitted from radioactive sources inside a patient body, wherein the plurality of CZT radiation detectors are located at a plurality of known locations surrounding the patient body; and a processing subsystem configured to: detect a radioactive source within the patient body based on performing an image reconstruction from the 3D-sensitive detections of at least a subset of the plurality of CZT radiation detectors; and provide, via an extended reality (XR) display, a view of the patient body combined with a virtual representation of the radioactive source.
  • Solution 2 The intraoperative radiation detection system of any one or more of the solutions disclosed herein, further comprising: a mounting structure to which the plurality of CZT radiation detectors are attached, the mounting structure positioning the plurality of CZT radiation detectors at the plurality of locations.
  • Solution 3 The intraoperative radiation detection system of solution 2, wherein the mounting structure is attached to a platform on which the patient body rests while the radiation source is detected.
  • Solution 4 The intraoperative radiation detection system of any one or more of the solutions disclosed herein, wherein the mounting structure includes at least a partial arc at a radius from a sectional portion of the patient body, the plurality of CZT radiation detectors being positioned along the partial arc.
  • Solution 5 The intraoperative radiation detection system of any one or more of the solutions disclosed herein, wherein the mounting structure includes an articulated member to which a particular radiation detector is attached, the particular radiation detector being movable relative to the patient body via the articulated member.
  • Solution 6 The intraoperative radiation detection system of any one or more of the solutions disclosed herein, further comprising: a position sensor co-located with a given CZT radiation detector, wherein the position sensor is configured to estimate a known location associated with the given radiation detector.
  • Solution 7 The intraoperative radiation detection system of any one or more of the solutions disclosed herein, further comprising: a position marker co-located with a given radiation detector, wherein the position marker is detectable by a position sensor for estimating a known location associated with the given radiation detector.
  • Solution 8 The intraoperative radiation detection system of any one or more of the solutions disclosed herein, further comprising: an optical camera co-located with a given radiation detector, wherein the optical camera provides optical image data capturing the patient body for combining the virtual representation of the radiation source with the view of the patient body.
  • Solution 9 The intraoperative radiation detection system of any one or more of the solutions disclosed herein, wherein the processing subsystem is integrated with the XR display in a virtual reality (VR) headset that is wearable by a user.
  • Solution 10 The intraoperative radiation detection system of any one or more of the solutions disclosed herein, wherein the XR display comprises a clear lens of an augmented reality (AR) eyewear that is wearable by a user.
  • Solution 11 The intraoperative radiation detection system of any one or more of the solutions disclosed herein, wherein the plurality of CZT radiation detectors are mounted on a motorized structure configured to cause the plurality of CZT radiation detectors to revolve around the patient body.
  • Solution 12 A computing device comprising at least one hardware processor and at least one memory storing executable instructions that, when executed by the at least one hardware processor, cause the computing device to implement the processing subsystem of any one or more of the solutions disclosed herein.
  • Solution 13 A method implemented by a processing subsystem of any one or more of the solutions disclosed herein.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Surgery (AREA)
  • Pathology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Optics & Photonics (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • General Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Nuclear Medicine (AREA)

Abstract

Three-dimensional and quick radiation detection via cadmium-zinc-telluride (CZT) detectors enables intraoperative/intrasurgical imaging and guidance with respect to radioactive objects detected within a patient body. Multiple CZT detectors are located around the patient body. An image reconstruction can be performed based on 3D-sensitive detections by the multiple CZT detectors to detect (e.g., and triangulate) a radioactive object within the patient body. Using a detected location of the radioactive object, a virtual representation of the radioactive object is provided in an extended reality (XR) experience that captures the patient body. A user wearing an XR headset may be provided with the XR experience while interacting with the patient body, for example, during a surgery.

Description

THREE-DIMENSIONAL (3D) CADMIUM-ZINC-TELLURIDE (CZT) DETECTOR SYSTEM FOR EXTENDED REALITY INTRASURGICAL GUIDANCE
CROSS-REFERENCE TO RELATED APPLICATIONS
[1] This application claims priority to and the benefit of U.S. Provisional Application No. 63/647,815, filed on May 15, 2024. The contents of the aforementioned application are incorporated herein in their entirety.
TECHNICAL FIELD
[2] The present disclosure generally relates to radiation detection and, more particularly, to intraoperative and immersive imaging of detected radioactive objects.
BACKGROUND
[3] Gamma rays are a form of electromagnetic radiation that is detectable through a semiconductor detector. Gamma rays can interact with the semiconductor detector, resulting in the generation of charge carriers through electron ionization. Negative charge carriers, such as electrons, can travel toward and be collected by an anode (a positively biased electrode), while positive charge carriers, such as holes in the semiconductor detector, can travel toward and be collected by a cathode (a negatively biased electrode). The charge carriers can induce a signal in the electrodes, which can be measured to determine the amount of charge absorbed. Given that the charge carriers derive from interactions of the gamma rays with the semiconductor device, the induced signals in the electrodes can be used to measure the energy absorbed from the gamma ray interactions in the semiconductor device.
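As an aside not drawn from the disclosure itself, one well-known way such induced electrode signals are used in pixelated semiconductor detectors is to estimate the depth of an interaction from the cathode-to-anode signal ratio, since the cathode signal scales roughly linearly with the electron drift distance. The sketch below is a simplified, uncalibrated model (function name and clamping behavior are assumptions of this sketch):

```python
def interaction_depth(cathode_amplitude, anode_amplitude, thickness_cm):
    """Estimate the depth of a gamma-ray interaction in a planar detector
    from the cathode-to-anode signal ratio.

    The ratio approximates the fractional depth measured from the anode in
    an idealized detector; real systems apply calibrated corrections. The
    ratio is clamped to [0, 1] so noisy signals stay inside the detector.
    """
    ratio = cathode_amplitude / anode_amplitude
    return max(0.0, min(1.0, ratio)) * thickness_cm
```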
[4] Detection of gamma rays or other radiation forms can be applied in medical settings for imaging purposes. For example, radiation emitted from radioactive anatomical structures within a patient body can be detected via a semiconductor device, allowing some imaging of the anatomical structures. Technical challenges related to limited dimensionality and slow data acquisition are obstacles to radiation being used as a medical imaging modality in certain settings.
BRIEF DESCRIPTION OF THE DRAWINGS
[5] Figure 1 illustrates a simplified schematic view of a 3D-sensitive radiation detection system in accordance with embodiments of the present technology.
[6] Figure 2 illustrates a simplified schematic view of a 3D-sensitive radiation detector in operation in accordance with embodiments of the present technology.
[7] Figure 3 illustrates a simplified schematic view of a combined depth and radiation image generated by a 3D-sensitive radiation detector in accordance with embodiments of the present technology.
[8] Figure 4 illustrates a simplified schematic view of a 3D-sensitive radiation detection system in operation in accordance with embodiments of the present technology.
[9] Figure 5A illustrates a simplified diagram of an intraoperative radiation detection system configured for immersive visualization of 3D-sensed radioactive objects, in accordance with embodiments of the present technology.
[10] Figures 5B-5C illustrate example arrangements of radiation detectors included in an intraoperative radiation detection system, in accordance with embodiments of the present technology.
[11] Figure 6 illustrates a flow diagram that includes example operations implemented by an intraoperative radiation detection system for immersive visualization of 3D-sensed radioactive objects, in accordance with embodiments of the present technology.
[12] Figure 7 illustrates a block diagram of an example computing system configured to implement the technical solutions disclosed herein.
DETAILED DESCRIPTION
[13] Radiation is used in medical applications to illuminate anatomic systems. In one application, radiation, in the form of radioisotopes, can be injected into a cancerous tumor or tumor bed (e.g., post resection of the tumor) to enable the radiation to drain through attached lymph nodes. Gamma rays can be emitted from areas in the body containing the injected radioisotopes (referred to as irradiated areas) and detected using a detection device (e.g., a semiconductor device, such as a cadmium-zinc-telluride (CdZnTe or CZT) detector), resulting in the irradiated areas being illuminated in the image of the body. As a result, the image can be used to identify lymph nodes that are connected to, and likely infected by, the cancerous tumor to enable the removal of the infected lymph nodes for further testing.
[14] Often, radiation imaging is performed using two-dimensional (2D) position-sensitive semiconductor detectors. For example, a semiconductor detector can be pixelated to enable the gamma rays incident upon different x-y locations of the semiconductor detector to be detected through the measurement of charge carriers created from the gamma rays and collected at the pixelated electrodes. One challenge associated with 2D position sensitivity is that the radiation measurement can be sensitive to parallax. That is, because gamma rays are emitted radially from irradiated areas, gamma rays originating from a particular x-y location within the body can be incident upon multiple x-y locations within the semiconductor detectors, often creating unacceptable uncertainty in medical applications. When the thickness of the semiconductor detectors is increased or the detectors are placed closer to the imaged area, the number of x-y locations within the detector upon which a single gamma ray can be incident increases, resulting in a corresponding increase in uncertainty of the x-y location of the radiated area within the body.
[15] One possible technique to address these challenges includes placing semiconductor detectors far from the imaged area (e.g., between 30 to 60 centimeters (cm)) behind a collimator that filters gamma rays that are not normal or near normal to the semiconductor detector (e.g., an angle of incidence less than 0.1 degrees, 0.5 degrees, 1 degree, or 5 degrees). Thus, the detected gamma rays can be traced to a single x-y location in the body with lesser uncertainty. But, given that a large number of the gamma rays are filtered out and that the gamma rays must travel a larger distance before detection, it can take long periods of time (e.g., between 30 to 45 minutes) to detect sufficient radiation to generate the image, thus rendering this technique unfavorable for some applications where time is paramount, such as intraoperative or intrasurgical imaging. Moreover, the 2D position sensitivity of these detectors makes them useless for providing information in the z-dimension, which can be useful for identifying the location of irradiated areas (e.g., infected lymph nodes) within the imaged area.
[16] 3D position-sensitive detectors provide the ability to overcome some of the disadvantages of 2D position-sensitive detectors and address concerns related to parallax. 3D position-sensitive detectors measure charge drift from the point at which the charge carriers originate within the semiconductor detector to the point at which they are absorbed by the electrodes. By analyzing the multiple detections of a single gamma ray at different x-y-z positions within the semiconductor detector, the angle of incidence at which a gamma ray is incident upon the semiconductor detector can be determined, which in turn can be used to determine the location of a radiated area from which the gamma ray is emitted. Further details about the use of 3D position-sensitive detectors can be found in U.S. Patent No. 7,411,197 to He et al. and U.S. Patent Publication No. 2009/0114829A1 to He et al., each of which is incorporated by reference herein in its entirety.
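The core geometric step just described — recovering an incidence direction from multiple x-y-z interaction positions of the same gamma ray — reduces, in the simplest two-interaction case, to normalizing the vector between the two positions. This is an illustrative sketch, not the disclosed algorithm:

```python
def incidence_direction(first_hit, second_hit):
    """Compute a unit vector along the line through two x-y-z interaction
    positions of the same gamma ray inside the detector. Back-projecting
    along this direction constrains the location of the emitting area.
    """
    d = tuple(second_hit[a] - first_hit[a] for a in range(3))
    norm = sum(c * c for c in d) ** 0.5
    return tuple(c / norm for c in d)
```

Real event reconstruction must also order the interactions (e.g., using deposited energies and Compton kinematics), which this sketch leaves out.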
[17] Given that 3D position detectors trace the detections of gamma rays through the different points in the detectors, detecting radiation using 3D detectors can require complex calculations. In some cases, this complex detection of radiation from multiple sources and angles can make it difficult to reflect equally radiating sources with equal intensity, resulting in blurred or imprecise images. Moreover, the complexity of these calculations can decrease detection speed, which can make some 3D detectors suboptimal for time-sensitive applications, such as intrasurgical imaging. This complexity only increases when full-body imaging is performed due to the increased number of detections and the larger distances between the irradiated areas and the detector. In some applications, however, it may only be necessary to image a small target area. For example, a surgeon may wish to image a tumor and its surrounding area to determine the location of affected lymph nodes through which radiation from the tumor is drained. Accordingly, there is a need for a 3D position-sensitive radiation detector capable of providing a timely image of a target area.
[18] The present technology provides such systems and techniques. A 3D-sensitive radiation detector is disclosed. In aspects, the 3D-sensitive radiation detector includes a compact semiconductor detector (e.g., 5 cm x 5 cm x 1 cm) capable of providing a fast (e.g., less than one minute), accurate, and low-cost image of the target area. Given its compact size, the semiconductor detector can be placed close (e.g., less than 20, 18, 15, 12, 10, 8, 5 cm, and so on) to the body to specifically image a smaller target area (e.g., within the footprint of the detector). The 3D-sensitive radiation detector can have a larger thickness (e.g., greater than 5, 6, 7, 8, 9, 10, 12, 15, 20 cm, and so on) than other semiconductor detectors to enable a greater number of gamma rays to be detected. As a result, the radiation image can be generated more quickly in comparison to other semiconductor detectors.
[19] In some embodiments, a mask of radiation-blocking material (e.g., tungsten) can be placed over and around the semiconductor detector to control the locations and angles at which gamma rays can be incident upon the semiconductor detector. Specifically, openings can be located at select locations of the mask to enable gamma rays to pass from irradiated areas to the semiconductor detectors. The openings can be located to ensure collection of sufficient information for the accurate reconstruction of the radiation source distribution. In aspects, different mask designs are discussed that include the use of pinhole openings and annular openings. The openings can be designed with particular taper patterns to enable gamma rays from specific locations and angles to be detected. The location and taper patterns of the openings can be considered when determining the location from which a detected gamma ray was emitted. For example, the back tracing of multiple detections of the gamma ray to a particular location from which it was emitted can be limited to locations from which the gamma ray could pass through the mask openings, given their location and taper patterns, and be incident upon the semiconductor detector at the locations at which the detections occurred.
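The back-tracing constraint described above — a candidate emission location is only plausible if a straight path from it through a mask opening reaches the detection point — can be sketched with simplified geometry. The flat-mask model, circular pinhole, and function name are assumptions of this sketch; taper patterns and mask thickness are not modeled:

```python
def passes_through_pinhole(source, detection, pinhole_center, pinhole_radius,
                           mask_z):
    """Test whether the straight-line path from a candidate source location
    to a detection point inside the detector crosses a circular pinhole
    opening in a flat mask plane at height mask_z. Back tracing can discard
    candidate source locations that fail this test.
    """
    sx, sy, sz = source
    dx, dy, dz = detection
    if dz == sz:
        return False  # path parallel to the mask plane; never crosses it
    t = (mask_z - sz) / (dz - sz)
    if not 0.0 < t < 1.0:
        return False  # mask plane is not between source and detection
    # Intersection of the path with the mask plane.
    ix = sx + t * (dx - sx)
    iy = sy + t * (dy - sy)
    r2 = (ix - pinhole_center[0]) ** 2 + (iy - pinhole_center[1]) ** 2
    return r2 <= pinhole_radius ** 2
```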
[20] The radiation image generated by a radiation system can be combined with other modalities for improved analysis, visualization, and the like. For example, the radiation image can be overlaid with an optical image of the patient to show where the radiated tumor is relative to an anatomical view of the patient. Alternatively or additionally, a depth sensor can be used to reconstruct an outer surface of the patient, and the radiated area can be presented relative to the outer surface of the patient. The combined images can enable a surgeon to determine where the radiated tumor is within the patient, which can guide the surgeon toward the appropriate place to make an incision to best facilitate removal of the tumor. As a further example, a laser can be used to point to a portion of the patient that corresponds to the radiated tumor. The laser can provide an on-patient visual indicator that directs the surgeon toward the appropriate place for an incision. In some embodiments, the laser can point to a projection of the radiated tumor onto the outer surface of the patient.
[21] Because aspects of the disclosed technology enable quicker generation of radiation images that also include three-dimensional information, example embodiments of systems discussed herein are suited for providing intraoperative or intrasurgical guidance with respect to radioactively illuminated anatomical systems. In some aspects of the disclosed technology, an intrasurgical detection system includes an extended reality (XR) device that provides immersive visualizations of a patient body that include any radioactive bodies detected by multiple 3D CZT radiation detectors. As mentioned above, each 3D CZT radiation detector is configured for fast and accurate imaging based in part on its compact size and ability to be placed close to the body or target region. The multiple radiation detectors are arranged to surround the patient body at various locations and orientations (while being up close to respective target regions), such that the intrasurgical detection system can locate and register a radioactive object to a patient body based on the respective detections of the radioactive object by the multiple radiation detectors. A user (e.g., a surgeon, a medical professional, a technician) using an XR device can receive a visual indication of the detected object when viewing the patient body via the XR device. For example, while operating on a patient body, a surgeon may wear an augmented reality (AR) headset that enhances a view of the patient body with a virtual or artificial representation of a detected radioactive object.
[22] Figure 1 illustrates a simplified schematic view of a 3D-sensitive radiation detection system 100 in accordance with an embodiment of the present technology. The 3D-sensitive radiation detection system 100 includes a base 102, at least one processor 104, a display 106, an arm 108, at least one camera 110, and a radiation detector 112 (e.g., CZT detector). The base 102 can house the at least one processor 104 and support the display 106 and the arm 108. The base 102 can be portable such that it can be easily moved in and out of surgical rooms. For example, the base 102 can include wheels to allow the base to be rolled on the floor. Although not illustrated, the base 102 can house or support one or more additional input/output devices, such as a keyboard, mouse, controller, printer, or other device.
[23] The at least one processor 104 can include any number of processors that perform computations to enable any of the functionality of the 3D-sensitive radiation detection system 100. For example, the at least one processor 104 can include any one or more of a central processing unit (CPU), a graphics processing unit (GPU), a System-on-Chip (SoC), an application-specific integrated circuit (ASIC), and so on. In some cases, the 3D-sensitive radiation detection system 100 can include an ASIC capable of performing the detections of incident gamma rays at the radiation detector 112 and at least one separate processor 104 capable of generating a radiation image from the detected radiation or providing functionality to the display 106.
[24] The display 106 can be supported by the base 102. For example, the display 106 can be housed within the base 102 or be attached to a stand that is connected to or supported by the base 102. The display 106 can include any number of displays, such as a liquid crystal display (LCD), a light-emitting diode (LED) display, or an organic LED (OLED) display. The display 106 can present an image of the patient provided by the camera 110, a radiation image provided by the radiation detector 112, a reconstruction of a surface measured by the depth sensor, an indication of the laser 114, or a combined image of the radiation image from the radiation detector 112 and at least one of the optical image from the camera 110 or the depth image from the depth sensor. In some cases, the display 106 can include a touch sensor such that input from a user can be received at the display 106. In this way, a particular radiated area can be selected from multiple areas displayed within the image, the displayed image can be adjusted or rotated, or one or more components of the 3D-sensitive radiation detection system 100 can be controlled.
[25] The arm 108 can attach to the base 102 and support the camera 110, the depth sensor, the radiation detector 112, or the laser 114. The arm 108 can be functionally coupled to actuators to position the arm 108 such that the camera 110, the depth sensor, or the radiation detector 112 is in a particular configuration for imaging. The arm 108 can be positioned into a variety of configurations to enable local imaging at different locations on differently sized patients.

[26] The camera 110 can be disposed at the arm 108 to enable imaging of the patient. For example, the camera 110 can provide an optical image of the patient that can be overlaid with the radiation image to provide greater detail about the specific location of a radiated area within the patient. In some cases, the camera 110 can include a depth lens that can be used to collect depth data of an exposed surface of the patient. Alternatively or additionally, the depth sensor can include separate sensors, such as radar or lidar sensors. Although illustrated at a specific portion of the arm 108, the camera 110 (or the depth sensor) can instead be located at a different portion of the arm 108. By implementing the camera 110, the depth sensor, and the radiation detector 112 on the same member of the arm 108, however, the relative position between the camera 110, the depth sensor, and the radiation detector 112 can remain constant, which can reduce the complexity of image co-registration.
[27] The radiation detector 112 can similarly be disposed at the arm 108. As illustrated, the radiation detector 112 is located at the distal portion of the arm 108 to enable positioning the radiation detector 112 close to a target area. The radiation detector 112 can be relatively small (e.g., 5 cm x 5 cm x 1 cm) to enable the radiation detector 112 to detect radiation within a small target area. The radiation detector 112 can be pixelated in the z-dimension to enable the detection of gamma rays at a particular x-y-z location. The radiation detector 112 can include semiconductive material (e.g., CZT, 3D pixelated scintillators, and so on) and one or more biased electrodes (e.g., anode and cathode). In aspects, the semiconductive material in the radiation detector 112 can be thicker than in other radiation detectors (e.g., approximately 1 cm) to enable a greater number of gamma ray detections. The radiation detector 112 can further include circuitry to provide functionality to the radiation detector 112. For example, the radiation detector 112 can include circuitry connected to the biased electrodes and usable to measure electrical properties at the electrodes. Alternatively or additionally, the circuitry (e.g., an ASIC) can perform one or more operations to trace multiple detections of a gamma ray to a single source.
[28] One or more of the at least one processor 104 can co-register the radiation image from the radiation detector 112 with one or more other images. For example, the radiation image can be co-registered with the optical image collected by the camera 110 to display the radiated area overlaid on the patient. In some cases, the radiation image or the optical image can be further overlaid with the depth image collected by the depth sensor such that a depth of the radiated area is indicated. For example, the depth image can be used to determine the distance of the patient from the depth sensor (e.g., with the accuracy needed for surgical applications, such as mm-level accuracy), which can be used with the depth data determined from the radiation image to determine the depth of the radiated area from an exposed surface of the patient. The depth of the radiated area from the exposed surface of the patient can be indicated within the image. In some cases, the optical image and the depth image can be used to acquire different views of the patient. For example, the optical image can provide a view of the patient in a plane in which the optical image is taken, while the depth camera can provide a 3D view of the surface of the patient that can be rotated or seen from different angles. In general, the images can be co-registered based on the locations of the various sensors used to collect the images (e.g., the camera 110, the depth sensor, or the radiation detector 112). In aspects, the images can be co-registered by adjusting the images to match a single point of reference. As precision is paramount in surgery, the co-registration can be performed with mm-level accuracy.
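The depth determination described above amounts to a difference between two co-registered range measurements: the radiated area's distance from the sensor head and the patient surface's distance from the same head. The sketch below is illustrative only; the function name and millimeter units are assumptions, not part of the disclosed system:

```python
# Hedged sketch of combining depth data from the radiation image and the
# depth sensor, assuming both distances are measured from a common sensor
# origin along the same axis. Names and units are illustrative assumptions.

def depth_below_surface(area_depth_mm, surface_depth_mm):
    """Depth of the radiated area beneath the exposed patient surface."""
    if area_depth_mm < surface_depth_mm:
        raise ValueError("radiated area cannot lie above the patient surface")
    return area_depth_mm - surface_depth_mm

# e.g., area 152 mm from the sensor head, surface 147 mm from the sensor head
incision_depth = depth_below_surface(152.0, 147.0)  # 5.0 mm
```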
[29] In yet another example, the radiation image can be co-registered with the depth image to create a representation of the radiated area relative to an outer surface of the patient. For example, the depth image can be used to generate a reconstruction of the outer surface of the patient (e.g., a mesh of the patient), and the radiation image can be used to reconstruct the radiated area. Given that the radiation detector 112 and the depth sensor can measure the depth of the radiated area and the outer surface of the patient, respectively, the combined image from the radiation image and depth image can indicate the depth between the radiated area and the outer surface of the patient. In aspects, this can provide a surgeon with information regarding the depth of incision needed to reach the radiated area.
[30] An optical laser 114 can further be attached to the arm 108 or elsewhere on the 3D-sensitive radiation detection system 100. The laser 114 can be positioned (e.g., by one or more of the at least one processor 104) to point at the outer surface of the patient that corresponds to the radiated area. For example, the laser 114 can point to a portion of the outer surface of the patient that corresponds to a projection of the radiated area on the outer surface of the patient. In some cases, the radiation detector 112 can detect multiple irradiated areas that are displayed within an image presented on the display 106. The user can select between the multiple irradiated areas (e.g., using the display 106 or another input device), and the laser 114 can be positioned to point toward a portion of the outer surface of the patient that corresponds to the selected radiated area. In other cases, the laser 114 can point to a radioactive area with the greatest magnitude of radiation. The positioning of the laser 114 can be determined based on the location of the camera 110, the depth sensor, the radiation detector 112, the laser 114, the selected radiated area (e.g., determined from the radiation image), or the outer surface of the patient (e.g., determined from the camera 110 or the depth sensor).
[31] Figure 2 illustrates a simplified schematic view of a 3D-sensitive radiation detector 112 in operation in accordance with an embodiment of the present technology. As illustrated, the radiation detector 112 is used to image a target area 202 (e.g., phantom breasts). The radiation detector 112 is placed in close proximity to the target area 202 such that a local radiation image 204 of the target area 202 can be created. In aspects, the configuration and position of the radiation detector 112 can enable generation of the local radiation image 204 in less than one minute.
[32] The local radiation image 204 can include one or more irradiated areas 206 (e.g., radiated area 206-1 and radiated area 206-2). The irradiated areas 206 can be presented with a particular color to indicate an amount of radiation detected. For example, the radiated area 206-1 can appear red to indicate a greater amount of radiation than detected at the radiated area 206-2, which appears blue. Moreover, different portions within the same radiated area can appear with different severity. As illustrated, a lesser-radiated portion of the radiated area 206-2 appears blue, and a greater-radiated portion of the radiated area 206-2 appears red. In some embodiments, the local radiation image 204 can be overlaid with an image from a camera (e.g., camera 110 described in Figure 1). For example, the local radiation image 204 includes irradiated areas 206 overlaid on an optical image of the target area 202. The irradiated areas 206 can be displayed at locations that correspond to projections of the irradiated areas 206 on a plane on which the image from the camera is taken.

[33] In some embodiments, the irradiated areas 206 are tagged, labeled, flagged, and/or the like with indicators in the local radiation image 204, a camera image, and/or a combination/overlay of the local radiation image 204 and the camera image. The indicators that correspond to the irradiated areas 206 in the images include numerical values that indicate a (3D) depth of the irradiated areas 206, and these depth values may be determined according to the techniques and aspects of the technology discussed herein.
As one non-limiting example, radiated area 206-1 may be labeled in the image shown in Figure 2 with an indicator (e.g., a visual label, box, flag, or the like that is overlaid on the image) of “5 mm deep” and radiated area 206-2 may be labeled with an indicator of “3 mm deep.” The indicators can indicate the depth of the irradiated areas 206 from the sensors (e.g., the radiation detector 112, the camera 110 of Figure 1, or the depth sensors) or from the outer surface of the target area 202. The depth can be determined based on a difference in the depth of the irradiated areas 206 (e.g., determined using the radiation detector 112) and the depth of the outer surface of the target area 202 (e.g., determined from the camera 110 of Figure 1 or the depth sensor). In some embodiments, a 3D-sensitive radiation detection system is configured to, as an alternative or in addition to using color grading to indicate relative radiation intensity, use color grading to indicate depth. In some embodiments, a 3D-sensitive radiation detection system is configured to enable dynamic toggling between relative radiation intensity being indicated by colors in the image and 3D depth being indicated by the colors in the image.
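The color-grading toggle described above can be sketched as a mode switch over a single scalar-to-color mapping. This is a hypothetical illustration only; the blue-to-red gradient, the value ranges, and the dictionary keys are assumptions rather than the disclosed implementation:

```python
# Hypothetical sketch of toggling between coloring irradiated areas by
# relative radiation intensity and coloring them by 3D depth. The gradient,
# ranges, and key names are illustrative assumptions.

def color_for(value, vmin, vmax):
    """Map a scalar onto a simple blue (low) to red (high) RGB gradient."""
    t = (value - vmin) / (vmax - vmin)
    t = min(max(t, 0.0), 1.0)  # clamp out-of-range values
    return (int(255 * t), 0, int(255 * (1.0 - t)))

def area_color(area, mode):
    """Color an irradiated area by 'intensity' or 'depth' per the toggle."""
    if mode == "intensity":
        return color_for(area["intensity"], 0.0, 1.0)  # normalized intensity
    if mode == "depth":
        return color_for(area["depth_mm"], 0.0, 20.0)  # assumed 0-20 mm range
    raise ValueError("mode must be 'intensity' or 'depth'")
```

A display toggle would simply re-render the same areas with the other `mode` value.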
[34] Figure 3 illustrates a simplified schematic view of a combined depth and radiation image 300. The combined depth and radiation image 300 includes a radiated area 302 (e.g., generated by a radiation detector, such as the radiation detector 112 of Figure 1) and a 3D mesh 304 representing an exposed surface of the patient (e.g., generated by a depth sensor, such as the depth sensor of Figure 1). The radiated area 302 and the 3D mesh 304 can be co-registered such that the radiated area 302 and the 3D mesh 304 are located at the same relative positions as the radiated tumor and the exposed surface of the patient. This co-registration can be performed by adjusting the radiation image or the depth image to a common frame of reference.

[35] The radiated area 302 can be represented in 3D. For example, the radiated area 302 can be located at X-Y-Z locations that correlate to detections at the radiation detector. Similarly, the 3D mesh 304 can include the X-Y-Z locations of an exposed surface of the patient. The combined depth and radiation image 300 can be rotated (e.g., along the X, Y, or Z axes or any combination thereof) or translated (e.g., along the X, Y, or Z axes or any combination thereof) automatically or through user input. In doing so, a surgeon can gain additional information and perspective on the location of the radiated area 302 relative to the 3D mesh 304.
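The adjustment of images to a common frame of reference described above can be illustrated with a simplified rigid transform. Only a planar rotation about the Z axis plus a translation is shown here as an assumption; a real system would calibrate a full 3D rigid transform with mm-level accuracy:

```python
import math

# Hedged sketch of co-registration: points from the radiation image and the
# depth image are each passed through their own calibrated (yaw, offset) so
# that both land in one common frame. The planar-rotation simplification is
# an assumption for illustration.

def to_common_frame(point, yaw_rad, offset):
    """Map a sensor-frame (x, y, z) point into the common frame by
    rotating about the Z axis and translating by a calibrated offset."""
    x, y, z = point
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    ox, oy, oz = offset
    return (c * x - s * y + ox, s * x + c * y + oy, z + oz)
```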
[36] Figure 4 illustrates a simplified schematic view of a 3D-sensitive radiation detection system 100 in operation in accordance with an embodiment of the present technology. A combined image, generated from a radiation image of a patient 402 from the radiation detector 112 overlaid with an optical image of the patient 402 from the camera 110 and, in some cases, a depth image of the patient 402 from the depth sensor, is presented on the display 106. In some cases, a user can toggle the display (e.g., through user input) to present the combined radiation and optical image or the combined radiation and depth image. The user can similarly alter or adjust the combined image presented on the display through user input. For example, the user can zoom in or out, translate the image, rotate the image, or make any other adjustment through user input.
[37] In the illustrated example, the camera 110 and the laser 114 are located on a different portion of the arm 108 from the radiation detector 112; however, in other examples, the camera 110, the depth sensor, or the laser 114 can be located on the same portion of the arm 108 as the radiation detector 112. The camera 110 can be used to take a single still image that is co-registered with the radiation images or real-time images that are repeatedly co-registered with radiation images from the radiation detector 112. The laser 114 can be positioned toward a point on the surface of the patient 402 that corresponds to a radiated area (e.g., a radiated tumor). For example, the laser 114 can point to a portion of the surface of the patient 402 that corresponds to a projection of the radiated area onto the surface of the patient 402 (e.g., on a plane normal to gravity or on a plane on which the optical image is taken). In doing so, a surgeon can be provided a visual indicator of an appropriate place to make an incision to remove a targeted tumor.
[38] As illustrated in Figure 4, the combined image presented on the display 106 includes multiple radiation sources 404 (e.g., radiation source 404-1 and radiation source 404-2). In this case, the user can select a radiated area to direct the laser 114 (e.g., using a touch screen on the display 106, a mouse and keyboard, a controller, or any other input device). As a non-limiting example, the user can select the radiation source 404-1, and in response, the laser 114 can be directed to a portion of the surface of the patient that corresponds to a projection of the radiation source 404-1 onto the surface of the patient. The user can then select the radiation source 404-2, at which point the laser 114 can be positioned to point toward a projection of the radiation source 404-2 onto the surface of the patient. Alternatively or additionally, the laser 114 can point to the radiation source with the greatest detected radiation intensity (e.g., without intervention from the user). In some cases, the combined image presented on the display 106 can include a tag, marker, or other indicator corresponding to the location to which the laser 114 is pointing. For example, if the laser 114 is pointing toward the radiation source 404-1, the combined image presented on the display 106 can include an indication of the laser 114 (e.g., a crosshair) at the radiation source 404-1.
[39] The laser 114 can be positioned toward a projection of one or more of the irradiated areas 404 based on a co-registration of the laser with the radiation image, optical image, or depth image. For example, the position of the laser 114 relative to the positions of the camera 110, the depth sensor, and the radiation detector 112 when the respective images are taken can be used to determine the appropriate positioning of the laser 114 to direct the laser toward a portion of the patient 402 that corresponds to the one or more irradiated areas 404 in the combined image. In some cases, the depth of the irradiated areas 404 from the surface of the patient 402 (e.g., determined from the radiation image and the depth image or the optical image) can be used to determine the location of the projections of the irradiated areas 404 onto the surface of the patient 402.
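The laser positioning described above can be sketched in two steps: project the selected radiated area onto the patient surface, then compute the pointing angles from the laser's pose to that projected point. The projection rule (straight up along Z), the pan/tilt convention, and all coordinates are assumptions for illustration only:

```python
import math

# Illustrative sketch of aiming the laser at the projection of a selected
# radiated area onto the patient surface. The laser pose, the projection
# rule, and the angle conventions are hypothetical assumptions.

def project_to_surface(source_xyz, surface_z):
    """Project the radiated area straight up onto the plane z = surface_z."""
    x, y, _ = source_xyz
    return (x, y, surface_z)

def aim_angles(laser_xyz, target_xyz):
    """Pan (about Z) and tilt (from horizontal) angles, in degrees,
    that point the laser at the target."""
    dx = target_xyz[0] - laser_xyz[0]
    dy = target_xyz[1] - laser_xyz[1]
    dz = target_xyz[2] - laser_xyz[2]
    pan = math.degrees(math.atan2(dy, dx))
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return pan, tilt
```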
[40] Figure 5A illustrates an intraoperative radiation detection system 500 configured for immersive visualization of radiation emission distributions, or detected radioactive bodies or objects within a patient body. The intraoperative radiation detection system 500 includes an extended reality (XR) device 502 via which a user can receive an immersive visualization that enhances a view of a patient body 504 with an indication of a radiation source 506 detected therewithin. For example, the XR device 502 is an augmented reality or virtual reality (AR/VR) headset, augmented reality (AR) eyewear or eyeglasses, a handheld device with an AR or mixed reality (MR) user application, and/or the like that is configured to overlay the patient body 504 with a visualization of the radiation source 506. An XR device 502 that is an AR/VR headset comprises an XR display integrated within the headset, the XR display displaying visual content (e.g., an indication of a radiation source 506) provided by a processing subsystem of the AR/VR headset. An XR device 502 that is AR eyewear or eyeglasses is configured to project virtual content (e.g., via a mini-projector) to combine the virtual content (e.g., an indication or representation of a radiation source 506) with a real optical view that a user enjoys through at least one clear, transparent, semi-transparent, or translucent lens of the eyewear/eyeglasses. In some examples, the AR eyewear projects the virtual content onto a semi-transparent lens of the AR eyewear. In some examples, the AR eyewear projects the virtual content onto the eyes of the user/wearer.
[41] The intraoperative radiation detection system 500 may provide the immersive visualization of the radiation source 506 detected within the patient body 504 on a dynamic basis. According to this dynamic basis, the radiation source 506 continues to be visualized while the patient body continues to be captured or visualized by the XR device 502 at different perspective angles and locations. As such, a user can enjoy the immersive and augmented visualization while moving around relative to the patient body 504, for example, when performing medical or surgical operations on the patient body 504.
[42] The intraoperative radiation detection system 500 is configured to detect the radiation source 506 in order to virtually represent the radiation source 506 in an XR environment, such as an augmented reality (AR) environment or a virtual reality (VR) environment. Detecting the radiation source 506 includes determining a location of the radiation source 506, and the location of the radiation source 506 can be determined via an image reconstruction technique or localization technique based on the different locations of multiple 3D CZT detectors. In some embodiments, a 3D image reconstruction process or technique is performed to detect the radiation source 506 and its location. The 3D image reconstruction process can include generating a 3D image measuring radiation at different points throughout the 3D space of a patient body based on individual detection data from each of multiple 3D CZT detectors. The 3D image may then be analyzed to detect one or more radioactive objects or sources based on the objects or sources expressing relatively more radiation signal than their surroundings.
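The analysis step described above, detecting objects that express relatively more radiation signal than their surroundings, can be sketched as a threshold scan over a reconstructed 3D activity grid. The grid layout and the mean-based threshold below are illustrative assumptions, not the disclosed reconstruction technique:

```python
# Toy sketch of scanning a reconstructed 3D activity grid for voxels whose
# signal stands out against the rest of the grid. The nested-list grid and
# the mean-ratio threshold are illustrative assumptions.

def find_hot_voxels(grid, ratio=3.0):
    """Return (x, y, z) indices of voxels whose value exceeds `ratio`
    times the grid-wide mean value."""
    values = [v for plane in grid for row in plane for v in row]
    mean = sum(values) / len(values)
    hot = []
    for x, plane in enumerate(grid):
        for y, row in enumerate(plane):
            for z, v in enumerate(row):
                if v > ratio * mean:
                    hot.append((x, y, z))
    return hot
```

Each returned index would then be mapped back to patient-frame coordinates for registration with the XR view.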
[43] In accordance with an AR experience, detection of the radiation source 506 and its location allows a virtual representation of the radiation source 506 to be combined with a real view of the patient body. For instance, the radiation source 506 can be virtually placed in the AR experience relative to the patient body based on registering the detected location of the radiation source 506 with sensed or detected locations of the patient body. In accordance with a VR experience, determining the location of the radiation source 506 allows a virtual representation or avatar of the radiation source 506 to be placed within a 3D virtual environment, specifically within a virtual representation or avatar of the patient body in the 3D virtual environment. While the present disclosure may focus on aspects related to an AR experience, the disclosed embodiments are applicable also to VR experiences and other XR experiences.
[44] The intraoperative radiation detection system 500 includes a plurality of 3D CZT detectors 508 arranged around the patient body 504. In some embodiments, alternatively or additionally to a 3D image reconstruction, the location of the radiation source 506 can be determined (e.g., localized, triangulated) based on individual detection data generated by the plurality of 3D CZT detectors 508 and the known locations of the 3D CZT detectors 508. Each of the 3D CZT detectors 508 is configured to detect the radiation source 506 with 3D sensitivity and accordingly can determine an object’s depth, or the object’s relative distance away from the respective detector. Based on known locations of the 3D CZT detectors 508 and respective 3D-sensitive detections of an object by multiple of the 3D CZT detectors 508, a location of the object can be estimated through triangulation. Triangulation of a radiation source 506 may be effective for point-like radiation sources like lymph nodes; however, a 3D reconstruction of the radiation source distribution may be more effective for detecting radiation sources that may not be point-like.

[45] The 3D CZT detectors 508 may be configured for 3D-sensitive radiation detection according to aspects of the present disclosure. For example, the 3D CZT detectors 508 may include thick semiconductive material and may be pixelated in three dimensions. The individual detection data generated by each 3D CZT detector 508 may include histograms of gamma ray hits that are detected according to the detector’s 3D-sensitive pixelation. In some embodiments, the distances of the detectors’ known locations from the patient body correspond to a speed at which the radiation source 506 is detected.
In some embodiments, each known location at which a 3D CZT detector 508 is positioned is within a distance of the patient body (e.g., 10 centimeters, 25 centimeters, 50 centimeters, one meter) that allows for sufficiently rapid detection of the radioactive object for visual indication thereof in a real-time XR application. Because multiple 3D CZT detectors 508 are used to detect a radioactive object, the 3D CZT detectors 508 can be positioned farther away from the patient body than in radioactive imaging using only one 3D CZT detector. In some embodiments, the 3D sensitivity of the 3D CZT detectors 508 is effective at a certain range away from the radiation source 506, and the 3D CZT detectors 508 are movable to be positioned close to a region of interest (ROI) in which the radiation source 506 is located for effective 3D sensing.
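The triangulation from known detector locations and per-detector depth estimates described above can be sketched in two dimensions as trilateration. The three-detector linear solve below is a hypothetical illustration; the detector coordinates and range values are assumed, and a real system would use a full 3D least-squares fit over noisy ranges:

```python
# Hedged 2D sketch of localizing a point-like source from three detectors'
# depth (range) estimates at known positions. Subtracting pairs of circle
# equations |p - pi|^2 = ri^2 yields a 2x2 linear system in (x, y).
# All coordinates and ranges here are illustrative assumptions.

def trilaterate_2d(p1, r1, p2, r2, p3, r3):
    """Solve for the source (x, y) from three (position, range) pairs."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # nonzero when detectors are not collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)
```

The non-collinearity requirement is one reason the detectors 508 are arranged at different locations and orientations around the patient body.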
[46] In order to improve localization of a radiation source 506 detected by multiple of the 3D CZT detectors 508, the 3D CZT detectors 508 may be arranged at locations surrounding the patient body 504, for radiation detection from different points of view. In some embodiments, at least one of the 3D CZT detectors 508 is located at a height below the patient body 504; for example, a 3D CZT detector 508 may be located within a platform on which the patient body 504 is resting. In some embodiments, the system includes at least two 3D CZT detectors, with the two 3D CZT detectors being positioned and oriented orthogonal to one another. For example, one 3D CZT detector is positioned above the patient body and oriented downwards at the patient body, and another 3D CZT detector is positioned to the side of the patient body and oriented sideways at the patient body (orthogonal to the downwards orientation of the other detector). In some embodiments, a plurality of the 3D CZT detectors 508 are circumferentially arranged around a sectional portion of the patient body 504 intersecting a longitudinal axis 505 of the patient body 504. Turning to Figure 5B, an example arrangement of 3D CZT detectors 508 circumferentially surrounding a cross-section of the patient body 504 is illustrated. In the example arrangement, two pairs of 3D CZT detectors 508 are positioned on two lateral sides of the cross-section of the patient body 504, and each pair includes one detector above the patient body 504 and one detector below the patient body 504. Each of the detectors is oriented towards the patient body 504 for detection of radioactive bodies located inside the patient body 504.
[47] The 3D CZT detectors 508 may be positioned at their respective locations surrounding the patient body 504 via a mounting structure 510. In some embodiments, the mounting structure 510 comprises frames, arms, members, and/or the like to which the 3D CZT detectors 508 are attached. In some embodiments, the 3D CZT detectors 508 are distributed along the mounting structure 510 to ensure multiple different views of a region of interest. In some embodiments, at least some of the 3D CZT detectors 508 are clustered or packed closely together on the mounting structure 510 in order to obtain higher sensitivity from a particular viewpoint of the region of interest. In some embodiments, the system includes multiple 3D CZT detectors 508 attached to a plurality of mounting structures 510.
[48] In some embodiments, the mounting structure 510 may be attached to a platform 512 on which the patient body 504 is resting, such as an operating table or hospital bed. For example, the mounting structure 510 is an arm that is attached to the platform. In some embodiments, the mounting structure 510 is not attached to the platform 512 but may be movable to be positioned adjacent to the platform 512. For example, the mounting structure 510 is an arm or a pole atop a base that can be moved (e.g., wheeled) next to the platform 512. In some embodiments, the mounting structure 510 is integrated with or comprises the platform 512 on which the patient body 504 is resting. For example, the mounting structure is a tube or tunnel within which the patient body 504 is disposed, and the 3D CZT detectors 508 are positioned along the circumference of the tube or tunnel to surround a sectional portion of the patient body 504 (e.g., intersecting the longitudinal axis 505). In some examples, the tube or tunnel may circumferentially surround the patient body 504 completely, and in some examples, the tube or tunnel is also movable in and out of place around the patient body 504. In other examples, the tube or tunnel may have a half-pipe-like geometry with a (partial) circular or elliptical arc cross-section (rather than a completely circular or elliptical cross-section), thus allowing an outside individual access to the patient body 504 partially surrounded therewithin. For instance, Figure 5B shows two pairs of 3D CZT detectors 508 positioned adjacent to lateral sides of the patient body 504 via two vertical structures. Figure 5C illustrates another example in which a mounting structure 510 may be configured with a curvature that, when concave relative to a patient body 504, allows the detectors to be positioned closer to the patient body 504.
In some embodiments, the mounting structure 510 is also rotatable around the longitudinal axis 505, so that the 3D CZT detectors 508 attached thereto move in an arc path while scanning the region of interest. In doing so, the number of 3D CZT detectors 508 used in the system may be reduced.
[49] The 3D CZT detectors 508 may be movable relative to the patient body 504. In some embodiments, at least one of the 3D CZT detectors 508 may be a handheld detector, which may be operated by a user wearing the XR device 502 or by another user. In some embodiments, the 3D CZT detectors 508 are attached to a movable mounting structure, or movable (e.g., articulated) portions thereof. For example, a tube-shaped mounting structure that positions the 3D CZT detectors 508 circumferentially around a sectional portion of the patient body 504 may be movable along the longitudinal axis 505 to thereby align with other sectional portions of the patient body. Conversely, the patient body 504 may be movable or slidable through or by a mounting structure to likewise align the 3D CZT detectors 508 around other sectional portions of the patient body. In other examples, a 3D CZT detector 508 is mounted on an articulated arm that can be manipulated by a user in order to position the 3D CZT detector 508 at a desired location.
[50] In a further example, the 3D CZT detectors 508 can be mounted on a circular or elliptical ring structure surrounding a sectional portion of the patient body, and the ring structure can provide a motorized movement so that the 3D CZT detectors 508 revolve around the patient body. The revolution of the 3D CZT detectors 508 can be at an angular velocity that is related to or proportional to an image acquisition time or a reconstruction processing time for the detectors. As the detectors 508 revolve, each subsequent detector passing over a region can collect data used to improve the data collected by a prior detector passing over that region. Accordingly, revolution of the detectors 508 circumferentially around a section of a body can even further improve data quality and emission reconstruction accuracy. Thus, radiation detection information can be provided via XR applications in quasi-real-time. Because certain XR applications involve intra-operative radiation detection, the detectors 508 mounted on a ring structure can be angled away from a plane of the ring structure that sections the body, so that the ring structure and the detectors do not obstruct a target region.
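The pacing between the ring's revolution rate and the acquisition time described above can be sketched as follows; the helper name and the one-detector-pitch-per-acquisition-window pacing rule are illustrative assumptions rather than details taken from this disclosure:

```python
import math

def ring_angular_velocity(num_detectors: int, acquisition_time_s: float) -> float:
    """Angular velocity (rad/s) at which each detector on the ring advances by
    one detector pitch per acquisition window, so that each subsequent detector
    passes over the region just imaged by its predecessor."""
    pitch_rad = 2.0 * math.pi / num_detectors  # angular spacing between adjacent detectors
    return pitch_rad / acquisition_time_s

# For example, eight detectors on the ring with a 2-second acquisition window:
omega = ring_angular_velocity(8, 2.0)  # one pitch (pi/4 rad) every 2 seconds
```

Under this pacing, a longer reconstruction processing time simply lowers the angular velocity, keeping each detector's dwell over a region matched to the processing pipeline.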
[51] In order to obtain the known location or reference location of 3D CZT detectors 508, position sensors may be co-located with or located adjacent to the 3D CZT detectors 508. For example, position sensors may also be attached to the mounting structure 510 co-located with or adjacent to the 3D CZT detectors 508 thereon. In some embodiments, the known location of 3D CZT detectors 508 may be determined or sensed by the XR device 502. For example, the 3D CZT detectors 508 may be configured with reference markers (e.g., visual markers, signal-emitting markers) that allow the XR device 502 to determine the known locations associated with the 3D CZT detectors 508. These known locations associated with the detectors can then be used to locate a detected radioactive object relative to the detectors.
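As a minimal sketch of how such known locations could be used, a gamma-ray hit reported in a detector's local x-y-z frame can be mapped into a shared room frame given the detector's sensed pose. The function name and the rotation-matrix pose representation are assumptions for illustration:

```python
import numpy as np

def hit_to_world(hit_xyz, detector_position, detector_rotation):
    """Map a 3D-position-sensitive hit from a detector's local frame into the
    room frame, using the detector pose reported by a co-located position
    sensor. detector_rotation is a 3x3 local-to-world rotation matrix."""
    R = np.asarray(detector_rotation, dtype=float)
    t = np.asarray(detector_position, dtype=float)
    return R @ np.asarray(hit_xyz, dtype=float) + t

# A detector at (1, 0, 0) m rotated 90 degrees about z: local +x maps to room +y.
R90 = np.array([[0.0, -1.0, 0.0],
                [1.0,  0.0, 0.0],
                [0.0,  0.0, 1.0]])
world_hit = hit_to_world([0.02, 0.0, 0.0], [1.0, 0.0, 0.0], R90)
```

Expressing every detector's hits in one shared frame is what lets data from multiple viewpoints be combined downstream.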
[52] Similarly, in some embodiments, an optical camera may be co-located with or comprised within a 3D CZT detector. The optical camera is oriented in a same or similar orientation as the 3D CZT detector. The optical camera can generate optical image data capturing the patient body in a similar field-of-view as the 3D CZT detector. Edges of the patient body can be determined from the optical image data, such that a radioactive object’s two-dimensional location from the perspective of the detector-camera can be determined. Thus, optical image data from optical cameras co-located with 3D CZT detectors can be used to detect or estimate a radioactive object’s location within the patient body. In some embodiments, the optical image data from optical cameras is used to generate XR content that combines a virtual representation of a detected radioactive object with a real view of the patient body.
[53] Individual detection data, image reconstruction data, and/or detector location data may be communicated within the intraoperative radiation detection system 500 between the detectors, position sensors, the XR device 502, and/or a computing station 514. In particular, in an intraoperative radiation detection system 500 with multiple 3D CZT detectors mounted on different mounting structures (e.g., moveable arms), detector location data is communicated to track the position of each of the 3D CZT detectors and/or each of the mounting structures. In some embodiments, the XR device 502 receives the individual detection data (e.g., x-y-z locations of gamma ray hits detected by a given detector) and detector location data for each of the 3D CZT detectors 508. Using the received data, the XR device 502 can then determine a triangulated location of a radioactive object detected by multiple of the 3D CZT detectors 508. In some embodiments, computational load is offloaded from the XR device 502 onto a computing station 514. For example, the XR device 502 receives a location of a radioactive object determined from a 3D image reconstruction performed by the computing station 514. In some examples, the computing station 514 may be a desktop or laptop computer including at least one processor (e.g., the at least one processor 104) for processing detection data generated by 3D CZT detectors 508, for example, by performing a 3D image reconstruction process. In some examples, the computing station 514 may be a mobile device paired with the XR device 502, a server or cloud computing platform associated with the XR device 502, and/or the like configured to execute computational workloads related to XR experiences provided via the XR device 502. According to various examples, the computing station 514 may process and analyze the individual detection data to determine a location of a radioactive object and may provide the determined location to the XR device 502.
In some embodiments, the computing station computes a 3D image of a radiation source distribution and receives queries from the XR device 502 for the 3D image whenever the XR device 502 requires the 3D image to provide XR (e.g., AR, VR) content.
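One way the triangulation step mentioned above could be realized is a least-squares intersection of bearing rays, each anchored at a detector's known location and pointing toward the source that detector observed. This is an illustrative sketch, not the prescribed reconstruction algorithm:

```python
import numpy as np

def triangulate(origins, directions):
    """Point minimizing the summed squared distance to a set of rays, one ray
    per detector: each ray starts at a detector's known location and points
    along that detector's estimated bearing toward the radioactive source."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = np.asarray(d, dtype=float)
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)  # projects onto the plane orthogonal to the ray
        A += P
        b += P @ np.asarray(o, dtype=float)
    return np.linalg.solve(A, b)

# Two detectors viewing a source at (0, 0, 0.5) from orthogonal directions:
source = triangulate(
    origins=[[1.0, 0.0, 0.5], [0.0, 1.0, 0.5]],
    directions=[[-1.0, 0.0, 0.0], [0.0, -1.0, 0.0]],
)
```

With noisy bearings the rays no longer intersect exactly, and the least-squares point is the natural compromise; adding views from more detectors simply accumulates more terms into A and b.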
[54] The XR device 502 provides an XR experience, such as an AR experience, using the determined location of a radioactive object detected by multiple of the 3D CZT detectors 508. In some embodiments, the XR device 502 includes sensors (e.g., optical cameras, light detection and ranging (LiDAR) sensors, depth sensors, infrared cameras) to provide a real capture and sensing of the patient body 504 (e.g., a video captured by optical cameras, a visualization based on LiDAR dot readings). The XR device 502 can then provide, to its user, an enhanced capture of the patient body 504 that includes a virtual or artificial representation of the radioactive object. In order to accurately simulate or represent the radioactive object within the patient body 504, the XR device 502 may register the determined location of the radioactive object with the patient body 504. In some embodiments, the XR device 502 may be operated with physical reference markers that can be placed on or around the patient body, on or around the intraoperative detection system (e.g., by a 3D CZT detector), and/or the like, to allow for the XR device 502 to calibrate and register locations within the environment. In some embodiments, the XR device 502 is configured to use optical image data captured by optical cameras comprised in or co-located with the 3D CZT detectors, the optical image data including edges of the patient body, for example. In some embodiments, the system includes various simultaneous localization and mapping (SLAM) devices, markers, or subsystems used for calibrating locations of the XR device 502 relative to other objects and for providing XR content. For example, Optitrack optical-based sensors for tracking small infrared markers located on the 3D CZT detectors and/or the XR device may be used to determine relative locations of the detectors and/or the XR device.
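The marker-based registration described above can be sketched as a rigid-body fit between marker positions sensed by the XR device and their known positions in the room. The Kabsch algorithm below is one common way to compute such a fit, offered as an assumption rather than the specified method; marker correspondence is assumed to be known:

```python
import numpy as np

def register_markers(markers_device, markers_world):
    """Rigid transform (R, t) mapping marker positions in the XR device frame
    onto their known room-frame positions, via the Kabsch algorithm."""
    P = np.asarray(markers_device, dtype=float)
    Q = np.asarray(markers_world, dtype=float)
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)  # center both point sets
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    sign = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, sign])  # guard against a reflection solution
    R = Vt.T @ D @ U.T
    t = Q.mean(axis=0) - R @ P.mean(axis=0)
    return R, t

# Markers translated by (1, 2, 3) with no rotation:
device_pts = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 3.0]]
world_pts = [[1.0, 2.0, 3.0], [2.0, 2.0, 3.0], [1.0, 4.0, 3.0], [1.0, 2.0, 6.0]]
R, t = register_markers(device_pts, world_pts)
```

Once (R, t) is known, any location the XR device perceives in its own frame can be placed consistently relative to the patient body, and vice versa.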
[55] The XR experience can virtually represent or simulate the radioactive object in a manner that indicates its three-dimensional location or depth within the patient body 504. In some examples, the virtual representation or avatar of the radioactive object is sized and scaled relative to features observed on the patient body 504 in order to visually indicate scale and depth. In some examples, the virtual representation or avatar of the radioactive object is color coded to convey emission rate or activity for each voxel. In some examples, the XR experience includes a virtual overlay of a depth map or image, such as the image 300 with a 3D mesh 304 as shown in Figure 3. Similarly, the XR experience may incorporate co-registration with computed tomography (CT) imaging for overlaying radiation imaging with internal anatomy, co-registration with pre-operative single-photon emission computed tomography (SPECT), positron emission tomography (PET), and/or magnetic resonance imaging (MRI), visual guidance (e.g., arrows) showing where to move one or more 3D CZT detectors for a better view or image, virtual labels for each detector with its count rate (to aid in positioning of the instrument), overlays of patient vitals (e.g., heart rate, oxygen levels), and rotation/manipulation of various virtual objects in 3D based on user input. In some embodiments, a depth or three-dimensional location of the radioactive object can at least be implicitly indicated based on a persistent visualization thereof while the perspective of the XR device 502 moves around.
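As an illustration of the per-voxel color coding mentioned above, emission activity can be mapped onto a simple blue-to-red ramp for the virtual overlay. The colormap and transparency value are arbitrary choices for this sketch, not part of this disclosure:

```python
def activity_to_rgba(activity, max_activity, alpha=0.6):
    """Map a voxel's emission activity onto a blue-to-red color ramp.
    Returns (r, g, b, a) with all components in [0, 1]."""
    x = max(0.0, min(1.0, activity / max_activity))  # normalize and clamp
    return (x, 0.0, 1.0 - x, alpha)

# Low-activity voxels render blue-ish, the hottest voxels render red:
cool = activity_to_rgba(10.0, 100.0)
hot = activity_to_rgba(100.0, 100.0)  # (1.0, 0.0, 0.0, 0.6)
```

A partially transparent alpha lets the underlying optical view of the patient body remain visible beneath the radiation overlay.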
[56] Figure 6 illustrates a process 600 that includes example operations for intraoperative and immersive visualization of radioactive objects detected within a patient body. In some embodiments, the example operations of process 600 are implemented by an XR device, such as a virtual reality or augmented reality headset, a mobile device with a camera and an augmented reality user application, and/or the like. For example, the XR device includes at least one memory storing executable instructions that, when executed by at least one processor of the XR device, cause the XR device to perform the example operations of process 600.
[57] At block 602, the XR device detects a radiation source within a patient body based on individual detection data from multiple 3D-sensitive radiation detectors. For example, the radiation source may be an anatomical structure (or a portion thereof) inside the patient body, the anatomical structure emitting gamma rays based on its uptake of radiotracers. In some embodiments, the radiation source is detected via a 3D image reconstruction from the individual detection data. For example, the 3D image reconstruction can be performed to produce a three-dimensional radiation map, and one or more radiation sources can be identified from processing the three-dimensional radiation map. In some embodiments, the radiation source is detected in the individual detection data from the detectors, and the source’s location is localized (e.g., triangulated) based on the known locations of the detectors. In particular, each of the 3D-sensitive radiation detectors can determine a relative x-y-z location of gamma rays incident thereupon, and the location is determined based on this detection data from multiple of the 3D-sensitive radiation detectors and known locations associated with the 3D-sensitive radiation detectors. In some embodiments, the 3D-sensitive radiation detectors are movable, and the known location associated with a 3D-sensitive radiation detector is estimated by a position sensor co-located with the 3D-sensitive radiation detector.
[58] At block 604, the XR device provides a view of the patient body with a virtual representation of the radiation source based on the detected location of the radiation source. In some embodiments, the XR device augments an optical view of the patient body with the virtual representation of the radioactive object. To do so, the XR device may sense and map the patient body in order to register the detected location of the radiation source to the patient body, or to virtually position and align the detected location relative to (and within) the patient body. In some embodiments, the XR device visualizes the radiation source within a virtual environment in which the patient body is virtually represented. The XR device may continuously show the virtual representation of the radiation source as long as the view of the patient body is provided. With these augmented reality, virtual reality, or mixed reality experiences, a user of the XR device can enjoy immersive and real-time guidance with respect to the radiation source, for example, when performing surgery on the patient body.
[59] Figure 7 is a block diagram that illustrates an example of a computer system 700 in which at least some operations described herein can be implemented. For example, the computer system 700 is an XR device (e.g., a VR/AR headset), a computing station coupled to the XR device (e.g., for offloading some computational load from the XR device), a computing system coupled with a plurality of 3D-sensitive radiation detectors, and/or the like.
[60] As shown, the computer system 700 can include: one or more processors 702, main memory 706, non-volatile memory 710, a network interface device 712, a video display device 718, an input/output device 720, a control device 722 (e.g., keyboard and pointing device), a drive unit 724 that includes a machine-readable (storage) medium 726, and a signal generation device 730 that are communicatively connected to a bus 716. The bus 716 represents one or more physical buses and/or point-to-point connections that are connected by appropriate bridges, adapters, or controllers. Various common components (e.g., cache memory) are omitted from Figure 7 for brevity. Instead, the computer system 700 is intended to illustrate a hardware device on which components illustrated or described relative to the examples of the figures and any other components described in this specification can be implemented.
[61] The computer system 700 can take any suitable physical form. For example, the computing system 700 can share a similar architecture as that of a server computer, personal computer (PC), tablet computer, mobile telephone, game console, music player, wearable electronic device, network-connected (“smart”) device (e.g., a television or home assistant device), AR/VR systems (e.g., head-mounted display), or any electronic device capable of executing a set of instructions that specify action(s) to be taken by the computing system 700. In some implementations, the computer system 700 can be an embedded computer system, a system-on-chip (SOC), a single-board computer system (SBC), or a distributed system such as a mesh of computer systems, or it can include one or more cloud components in one or more networks. Where appropriate, one or more computer systems 700 can perform operations in real time, in near real time, or in batch mode.
[62] The network interface device 712 enables the computing system 700 to mediate data in a network 714 with an entity that is external to the computing system 700 through any communication protocol supported by the computing system 700 and the external entity. Examples of the network interface device 712 include a network adapter card, a wireless network interface card, a router, an access point, a wireless router, a switch, a multilayer switch, a protocol converter, a gateway, a bridge, a bridge router, a hub, a digital media receiver, and/or a repeater, as well as all wireless elements noted herein.
[63] The memory (e.g., main memory 706, non-volatile memory 710, machine- readable medium 726) can be local, remote, or distributed. Although shown as a single medium, the machine-readable medium 726 can include multiple media (e.g., a centralized/distributed database and/or associated caches and servers) that store one or more sets of instructions 728. The machine-readable medium 726 can include any medium that is capable of storing, encoding, or carrying a set of instructions for execution by the computing system 700. The machine-readable medium 726 can be non-transitory or comprise a non-transitory device. In this context, a non-transitory storage medium can include a device that is tangible, meaning that the device has a concrete physical form, although the device can change its physical state. Thus, for example, non-transitory refers to a device remaining tangible despite this change in state.
[64] Although implementations have been described in the context of fully functioning computing devices, the various examples are capable of being distributed as a program product in a variety of forms. Examples of machine-readable storage media, machine-readable media, or computer-readable media include recordable-type media such as volatile and non-volatile memory 710, removable flash memory, hard disk drives, optical disks, and transmission-type media such as digital and analog communication links.
[65] In general, the routines executed to implement examples herein can be implemented as part of an operating system or a specific application, component, program, object, module, or sequence of instructions (collectively referred to as “computer programs”). The computer programs typically comprise one or more instructions (e.g., instructions 704, 708, 728) set at various times in various memory and storage devices in computing device(s). When read and executed by the processor 702, the instruction(s) cause the computing system 700 to perform operations to execute elements involving the various aspects of the disclosure.
[66] Some embodiments may implement one or more of the following solutions, listed in clause-format. The following clauses are supported and further described in the embodiments above and throughout this document. The following listing of solutions may be implemented by some preferred embodiments:
[67] Solution 1. An intraoperative radiation detection system comprising: a plurality of cadmium-zinc-telluride (CZT) radiation detectors configured for 3D-position-sensitive detections of incident radiation emitted from radioactive sources inside a patient body, wherein the plurality of CZT radiation detectors are located at a plurality of known locations surrounding the patient body; and a processing subsystem configured to: detect a radioactive source within the patient body based on performing an image reconstruction from the 3D-sensitive detections of at least a subset of the plurality of CZT radiation detectors; and provide, via an extended reality (XR) display, a view of the patient body combined with a virtual representation of the radioactive source.
[68] Solution 2. The intraoperative radiation detection system of any one or more of the solutions disclosed herein, further comprising: a mounting structure to which the plurality of CZT radiation detectors are attached, the mounting structure positioning the plurality of CZT radiation detectors at the plurality of locations.
[69] Solution 3. The intraoperative radiation detection system of solution 2, wherein the mounting structure is attached to a platform on which the patient body rests while the radiation source is detected.
[70] Solution 4. The intraoperative radiation detection system of any one or more of the solutions disclosed herein, wherein the mounting structure includes at least a partial arc at a radius from a sectional portion of the patient body, the plurality of CZT radiation detectors being positioned along the partial arc.
[71] Solution 5. The intraoperative radiation detection system of any one or more of the solutions disclosed herein, wherein the mounting structure includes an articulated member to which a particular radiation detector is attached, the particular radiation detector being movable relative to the patient body via the articulated member.
[72] Solution 6. The intraoperative radiation detection system of any one or more of the solutions disclosed herein, further comprising: a position sensor co-located with a given CZT radiation detector, wherein the position sensor is configured to estimate a known location associated with the given radiation detector.
[73] Solution 7. The intraoperative radiation detection system of any one or more of the solutions disclosed herein, further comprising: a position marker co-located with a given radiation detector, wherein the position marker is detectable by a position sensor for estimating a known location associated with the given radiation detector.
[74] Solution 8. The intraoperative radiation detection system of any one or more of the solutions disclosed herein, further comprising: an optical camera co-located with a given radiation detector, wherein the optical camera provides optical image data capturing the patient body for combining the virtual representation of the radiation source with the view of the patient body.
[75] Solution 9. The intraoperative radiation detection system of any one or more of the solutions disclosed herein, wherein the processing subsystem is integrated with the XR display in a virtual reality (VR) headset that is wearable by a user.
[76] Solution 10. The intraoperative radiation detection system of any one or more of the solutions disclosed herein, wherein the XR display comprises a clear lens of an augmented reality (AR) eyewear that is wearable by a user.
[77] Solution 11. The intraoperative radiation detection system of any one or more of the solutions disclosed herein, wherein the plurality of CZT radiation detectors are mounted on a motorized structure configured to cause the plurality of CZT radiation detectors to revolve around the patient body.
[78] Solution 12. A computing device comprising at least one hardware processor and at least one memory storing executable instructions that, when executed by the at least one hardware processor, cause the computing device to implement the processing subsystem of any one or more of the solutions disclosed herein.
[79] Solution 13. A method implemented by a processing subsystem of any one or more of the solutions disclosed herein.
[80] Although described with respect to particular embodiments, the functions described herein may be implemented in hardware, software executed by a processor, firmware, or any combination thereof. Other examples and implementations are within the scope of the disclosure and appended claims. Features implementing functions may also be physically located at various positions, including being distributed such that portions of functions are implemented at different physical locations.
[81] As used herein, including in the claims, “or” as used in a list of items (for example, a list of items prefaced by a phrase such as “at least one of” or “one or more of”) indicates an inclusive list such that, for example, a list of at least one of A, B, or C means A or B or C or AB or AC or BC or ABC (i.e., A and B and C). Also, as used herein, the phrase “based on” shall not be construed as a reference to a closed set of conditions. For example, an exemplary step that is described as “based on condition A” may be based on both a condition A and a condition B without departing from the scope of the present disclosure. In other words, as used herein, the phrase “based on” shall be construed in the same manner as the phrase “based at least in part on.”
[82] From the foregoing, it will be appreciated that specific embodiments of the invention have been described herein for purposes of illustration but that various modifications may be made without deviating from the scope of the invention. Rather, in the foregoing description, numerous specific details are discussed to provide a thorough and enabling description for embodiments of the present technology. One skilled in the relevant art, however, will recognize that the disclosure can be practiced without one or more of the specific details. In other instances, well-known structures or operations often associated with memory systems and devices are not shown, or are not described in detail, to avoid obscuring other aspects of the technology. In general, it should be understood that various other devices, systems, and methods, in addition to those specific embodiments disclosed herein, may be within the scope of the present technology.

Claims

1. An intraoperative radiation detection system comprising: a plurality of cadmium-zinc-telluride (CZT) radiation detectors configured for 3D- position-sensitive detections of incident radiation emitted from radioactive sources inside a patient body, wherein the plurality of CZT radiation detectors are located at a plurality of known locations surrounding the patient body; and a processing subsystem configured to: detect a radioactive source within the patient body based on performing an image reconstruction from the 3D-sensitive detections of at least a subset of the plurality of CZT radiation detectors; and provide, via an extended reality (XR) display, a view of the patient body combined with a virtual representation of the radioactive source.
2. The intraoperative radiation detection system of claim 1, further comprising: a mounting structure to which the plurality of CZT radiation detectors are attached, the mounting structure positioning the plurality of CZT radiation detectors at the plurality of locations.
3. The intraoperative radiation detection system of claim 2, wherein the mounting structure is attached to a platform on which the patient body rests while the radiation source is detected.
4. The intraoperative radiation detection system of any of claims 2-3, wherein the mounting structure includes at least a partial arc at a radius from a sectional portion of the patient body, the plurality of CZT radiation detectors being positioned along the partial arc.
5. The intraoperative radiation detection system of any of claims 2-4, wherein the mounting structure includes an articulated member to which a particular radiation detector is attached, the particular radiation detector being movable relative to the patient body via the articulated member.
6. The intraoperative radiation detection system of any of claims 1-5, further comprising: a position sensor co-located with a given CZT radiation detector, wherein the position sensor is configured to estimate a known location associated with the given radiation detector.
7. The intraoperative radiation detection system of any of claims 1-6, further comprising: a position marker co-located with a given radiation detector, wherein the position marker is detectable by a position sensor for estimating a known location associated with the given radiation detector.
8. The intraoperative radiation detection system of any of claims 1-7, further comprising: an optical camera co-located with a given radiation detector, wherein the optical camera provides optical image data capturing the patient body for combining the virtual representation of the radiation source with the view of the patient body.
9. The intraoperative radiation detection system of any of claims 1-8, wherein the processing subsystem is integrated with the XR display in a virtual reality (VR) headset that is wearable by a user.
10. The intraoperative radiation detection system of any of claims 1-8, wherein the XR display comprises a clear lens of an augmented reality (AR) eyewear that is wearable by a user.
11. The intraoperative radiation detection system of any of claims 1-10, wherein the plurality of CZT radiation detectors are mounted on a motorized structure configured to cause the plurality of CZT radiation detectors to revolve around the patient body.
12. A computing device comprising: at least one hardware processor; and at least one memory storing executable instructions that, when executed by the at least one hardware processor, cause the computing device to: detect a radioactive object within a patient body based on performing an image reconstruction from detection data of a plurality of 3D-sensitive CZT radiation detectors; and provide, via an XR display, a view of the patient body combined with a virtual representation of the radioactive object.
13. The computing device of claim 12, wherein detecting the radioactive object comprises determining a location of the radioactive object from 3D-pixelated detection data respectively received from the plurality of 3D-sensitive radiation detectors and from known locations associated with the plurality of 3D-sensitive radiation detectors.
14. The computing device of claim 12, wherein detecting the radioactive object comprises receiving a triangulated location of the radioactive object from a detection system communicably coupled with the plurality of 3D-sensitive radiation detectors and configured to determine the triangulated location of the radioactive object.
15. The computing device of any of claims 12-14, wherein providing the view of the patient body combined with the virtual representation of the radioactive object comprises mapping the patient body in order to register a detected location of the radioactive object to the patient body.
16. The computing device of any of claims 12-15, wherein providing the view of the patient body combined with the virtual representation of the radioactive object comprises overlaying a video stream that is captured by an optical camera and that includes the view of the patient body with the virtual representation of the radioactive object.
17. The computing device of any of claims 12-16, wherein the XR display is integrated with the computing device into an augmented reality or virtual reality (AR/VR) headset.
18. The computing device of any of claims 12-16, wherein the XR display comprises a clear lens of an augmented reality (AR) eyewear.
19. The computing device of any of claims 12-18, wherein the instructions further cause the computing device to: cause a revolution of the plurality of 3D-sensitive CZT radiation detectors circumferentially around the patient body.
20. A method for immersive and intraoperative radiation imaging, the method comprising: detecting a radioactive object within a patient body based on an image reconstruction performed with detection data of a plurality of 3D-sensitive radiation detectors positioned around the patient body; and providing, via an XR device, a view of the patient body combined with a virtual representation of the radioactive object.
21. The method of claim 20, wherein detecting the radioactive object comprises determining a location of the radioactive object from 3D-pixelated detection data respectively received from the plurality of 3D-sensitive radiation detectors and from known locations associated with the plurality of 3D-sensitive radiation detectors.
22. The method of any of claims 20-21, wherein providing the view of the patient body combined with the virtual representation of the radioactive object comprises mapping, via sensors included in the XR device or an optical camera co-located with one of the plurality of 3D-sensitive radiation detectors, the patient body in order to register a detected location of the radioactive object to the patient body.
23. The method of any of claims 20-22, wherein providing the view of the patient body combined with the virtual representation of the radioactive object comprises overlaying portions of a video stream that includes the view of the patient body with the virtual representation of the radioactive object, the video stream being captured by an optical camera included in the XR device.
24. The method of any of claims 20-23, further comprising: causing a revolution of the plurality of 3D-sensitive CZT radiation detectors circumferentially around the patient body.
PCT/US2025/029392 2024-05-15 2025-05-14 Three-dimensional (3d) cadmium-zinc-telluride (czt) detector system for extended reality intrasurgical guidance Pending WO2025240638A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202463647815P 2024-05-15 2024-05-15
US63/647,815 2024-05-15

Publications (1)

Publication Number Publication Date
WO2025240638A1 (en) 2025-11-20

Family

ID=97720822

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2025/029392 Pending WO2025240638A1 (en) 2024-05-15 2025-05-14 Three-dimensional (3d) cadmium-zinc-telluride (czt) detector system for extended reality intrasurgical guidance

Country Status (1)

Country Link
WO (1) WO2025240638A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130060134A1 (en) * 2011-09-07 2013-03-07 Cardinal Health 414, Llc Czt sensor for tumor detection and treatment
US20140369462A1 (en) * 2013-06-12 2014-12-18 General Electric Company Straddle mount detector assembly
US20210030381A1 (en) * 2018-01-25 2021-02-04 Universität Basel Imaging device, process of manufacturing such a device and visualization method


Similar Documents

Publication Publication Date Title
US11464503B2 (en) Methods and systems for localization of targets inside a body
US11350965B2 (en) Methods and systems for performing navigation-assisted medical procedures
RU2464931C2 (en) Device for determining position of first object inside second object
CN109452947B (en) Method for generating a positioning image and for imaging a patient, X-ray imaging system
JP6732807B2 (en) Biopsy specimen fluorescence imaging apparatus and method
US9522045B2 (en) Distortion fingerprinting for EM tracking compensation, detection and error correction
US8705817B2 (en) Measurement of geometric quantities intrinsic to an anatomical system
CN111479509B (en) Determination of object distribution using camera
US20190001155A1 (en) Radiotherapy system and treatment support apparatus
US7715606B2 (en) Marker system and method of using the same
US20120259204A1 (en) Device and method for determining the position of an instrument in relation to medical images
FR2908628A1 (en) METHOD AND SYSTEM FOR CONTROLLING A MEDICAL INSTRUMENT
JP2016510410A (en) Scintigraphic imaging of automated 3D patient shapes with synthetic radiation-free.
JP2015522371A (en) Imaging system and method enabling instrument guidance
KR20190021027A (en) X-ray imaging apparatus and control method for the same
Rodas et al. See it with your own eyes: Markerless mobile augmented reality for radiation awareness in the hybrid room
EP3193765A1 (en) Processing system arranged to cooperate with an optical-shape-sensing-enabled interventional device
WO2010026785A1 (en) Radiation imaging apparatus
Schaller et al. Time-of-flight sensor for patient positioning
Cloutier et al. Deformable scintillation dosimeter: II. Real-time simultaneous measurements of dose and tracking of deformation vector fields
WO2025240638A1 (en) Three-dimensional (3d) cadmium-zinc-telluride (czt) detector system for extended reality intrasurgical guidance
KR102479266B1 (en) Treatment system, calibration method, and program
Duan et al. Localization of sentinel lymph nodes using augmented-reality system: a cadaveric feasibility study
CN102429675A (en) Radiographic imaging apparatus, radiographic imaging method, and program
US20230210478A1 (en) Moiré marker for x-ray imaging